commit change
Chuck Tang committed May 3, 2024
1 parent 4926b66 commit 766da09
Showing 2 changed files with 4 additions and 3 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/docker.yaml
@@ -19,10 +19,10 @@ jobs:
      include:
        - name: "2.3.0_cu121_flash2"
          base_image: mosaicml/pytorch:2.3.0_cu121-python3.11-ubuntu20.04
-          dep_groups: "[gpu-flash2,te]"
+          dep_groups: "[gpu-flash2]"
        - name: "2.3.0_cu121_flash2_aws"
          base_image: mosaicml/pytorch:2.3.0_cu121-python3.11-ubuntu20.04-aws
-          dep_groups: "[gpu-flash2,te]"
+          dep_groups: "[gpu-flash2]"
      steps:
        - name: Maximize Build Space on Worker
          uses: easimon/maximize-build-space@v4
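For context, the matrix's `dep_groups` value is a pip "extras" specifier that gets spliced into the install command inside the Dockerfile. A minimal sketch of that splicing, assuming `DEP_GROUPS` is the build argument the workflow passes through (the final command string is echoed here rather than run):

```shell
# Hypothetical illustration of how a matrix value like dep_groups: "[gpu-flash2]"
# becomes a pip extras install. DEP_GROUPS is assumed to be the Docker build arg
# that the workflow forwards; we only construct and print the command.
set -eu
DEP_GROUPS="[gpu-flash2]"
cmd="pip install --verbose --no-cache-dir \"./llm-foundry${DEP_GROUPS}\""
echo "$cmd"    # prints: pip install --verbose --no-cache-dir "./llm-foundry[gpu-flash2]"
```

With the `te` extra dropped from `dep_groups`, TransformerEngine is no longer pulled in through this extras mechanism.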
3 changes: 2 additions & 1 deletion Dockerfile
@@ -13,7 +13,8 @@ ADD https://raw.githubusercontent.com/mosaicml/llm-foundry/$BRANCH_NAME/setup.py
 RUN rm setup.py
 
 # Install and uninstall foundry to cache foundry requirements
+RUN MAX_JOBS=1 pip install --verbose --no-cache-dir git+https://github.com/NVIDIA/TransformerEngine.git@stable
 RUN git clone -b $BRANCH_NAME https://github.com/mosaicml/llm-foundry.git
-RUN MAX_JOBS=1 pip install --ignore-installed --verbose --no-cache-dir "./llm-foundry${DEP_GROUPS}"
+RUN MAX_JOBS=1 pip install --verbose --no-cache-dir "./llm-foundry${DEP_GROUPS}"
 RUN pip uninstall -y llm-foundry
 RUN rm -rf llm-foundry
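The install-then-uninstall sequence in this Dockerfile is a dependency pre-caching pattern: installing the package resolves and installs all of its dependencies into the image layer, and removing only the top-level package afterwards leaves those dependencies behind, so a later install of the same package is fast. A sketch of the assumed intent, simulated with plain files instead of real pip packages:

```shell
# Sketch (assumed intent) of the install/uninstall caching pattern above,
# simulated with files in a temp directory rather than a real pip environment.
set -eu
site=$(mktemp -d)

install_with_deps() {      # stand-in for: pip install "./llm-foundry${DEP_GROUPS}"
  touch "$site/llm-foundry" "$site/torch-dep"
}
uninstall_top_level() {    # stand-in for: pip uninstall -y llm-foundry
  rm -f "$site/llm-foundry"
}

install_with_deps
uninstall_top_level
ls "$site"                 # the dependency survives; prints: torch-dep
```

In the real Dockerfile, the surviving dependencies live in the image's site-packages, so the layer acts as a warm cache for whichever `llm-foundry` checkout is installed later.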
