bug: moved the run commands from dockerfile to python file
limcheekin committed Sep 7, 2023
1 parent 1ad97bf commit 884666c
Showing 2 changed files with 4 additions and 5 deletions.
4 changes: 0 additions & 4 deletions orca-mini-v3-7b/Dockerfile
@@ -16,7 +16,3 @@ RUN pip install -U pip setuptools wheel && \
 # Download model
 RUN mkdir model && \
     curl -L https://huggingface.co/TheBloke/orca_mini_v3_7B-GGML/resolve/main/orca_mini_v3_7b.ggmlv3.q4_0.bin -o model/ggml-model.bin
-
-# Fix: Cannot allocate memory. Try increasing RLIMIT_MLOCK ('ulimit -l' as root).
-RUN echo "* soft memlock unlimited" >> /etc/security/limits.conf && \
-    echo "* hard memlock unlimited" >> /etc/security/limits.conf
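The lines deleted above were meant to raise the max-locked-memory limit inside the image. A quick way to inspect that limit from a shell in the running container (a minimal check, not part of this commit):

```shell
# Print the current soft limit on locked-in-memory size for this shell.
# After a working memlock fix it should report "unlimited"; note that
# limits.conf entries are applied by PAM at login, which a plain
# container entrypoint may never go through.
ulimit -l
```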
5 changes: 4 additions & 1 deletion orca-mini-v3-7b/fastapi_app.py
@@ -5,7 +5,10 @@
 
 image = Image.from_dockerfile(
     "Dockerfile", force_build=True
-).pip_install("pydantic_settings").pip_install("fastapi==0.100.1")
+).pip_install("pydantic_settings").pip_install("fastapi==0.100.1").run_commands(
+    # Fix: Cannot allocate memory. Try increasing RLIMIT_MLOCK ('ulimit -l' as root).
+    'echo "* soft memlock unlimited" >> /etc/security/limits.conf && echo "* hard memlock unlimited" >> /etc/security/limits.conf',
+)


@stub.function(image=image, cpu=14, memory=5120, keep_warm=1, timeout=600)
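The error the commit works around refers to RLIMIT_MLOCK. Independent of Modal, the effective memlock limits can be read from Python with the standard-library `resource` module; `memlock_limits` and `is_unlimited` below are hypothetical helper names used only for illustration:

```python
import resource


def memlock_limits():
    """Return the current (soft, hard) RLIMIT_MEMLOCK values in bytes."""
    return resource.getrlimit(resource.RLIMIT_MEMLOCK)


def is_unlimited(limit):
    # RLIM_INFINITY is the sentinel meaning "unlimited", the value the
    # limits.conf entries in this commit are trying to set.
    return limit == resource.RLIM_INFINITY
```

Checking these values at container startup is a cheap way to confirm whether the `run_commands` change actually took effect in the runtime environment.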
