Merge pull request #23 from Naomiusearch/flash_attention_for_rocm
Make installation steps look better
dejay-vu authored Dec 5, 2023
2 parents 3d2b6f5 + 820b2b1 commit 68aac13
Showing 1 changed file (README.md) with 4 additions and 4 deletions.

@@ -217,15 +217,15 @@ pytest -q -s tests/test_flash_attn.py
- PyTorch 1.12.1+
- MI200 & MI300 GPUs
## Method 1: Build from Source
-### i. Launch a ROCm PyTorch docker (recommended): E.g.
+### I. Launch a ROCm PyTorch Docker container (recommended), e.g.:
```bash
docker run -it --device /dev/dri --device /dev/kfd --network host --ipc host --privileged --cap-add SYS_PTRACE --group-add video --security-opt seccomp=unconfined rocm/pytorch:rocm5.7_ubuntu22.04_py3.10_pytorch_2.0.1
```
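A quick way to confirm the container actually sees the GPUs: ROCm builds of PyTorch expose HIP devices through the standard `torch.cuda` API, so a one-liner along these lines (a sanity-check sketch, not part of this commit) should report at least one device:

```bash
# Inside the container: ROCm PyTorch reports HIP devices via the torch.cuda API.
python -c 'import torch; print(torch.cuda.is_available(), torch.cuda.device_count())'
```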
-### ii. Clone the repo with submodules
+### II. Clone the repo with submodules
```bash
git clone --recursive https://github.com/ROCmSoftwarePlatform/flash-attention.git
```
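If the repository was cloned without `--recursive`, the submodules can still be fetched afterwards with the standard git command (noted here for convenience, not part of this commit):

```bash
# Fetch and initialize all submodules in an existing clone.
git submodule update --init --recursive
```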
-### iii. (optional): Build for the desired GPU architecture(s) by setting the enviroment variable (semicolon seperated). We currently only support the following options. If you do not specify, defaultly it will build for your native device architecture:
+### III. (optional) Build for the desired GPU architecture(s) by setting the environment variable (semicolon-separated). We currently support only the following options; if unset, the build targets your native device architecture:
To manually target for MI200 series:
```bash
export GPU_ARCHS="gfx90a"
```
@@ -234,7 +234,7 @@
To manually target for MI300 series:
```bash
export GPU_ARCHS="gfx940;gfx941;gfx942"
```
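To find out which gfx target the local device reports before choosing a value for `GPU_ARCHS`, the `rocminfo` tool from the ROCm stack can be queried; this is a sketch, and the exact output format may vary across ROCm versions:

```bash
# List the gfx architecture names the ROCm runtime reports for the installed GPUs.
rocminfo | grep -o 'gfx[0-9a-f]*' | sort -u
```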
-### iii. Build from source
+### IV. Build from source
```bash
$ cd flash-attention
$ export PYTHON_SITE_PACKAGES=$(python -c 'import site; print(site.getsitepackages()[0])')
```
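The diff view cuts this block off here; the remaining commands are in the full README. For orientation only, a from-source Python build of this kind typically ends with an install of the checked-out tree, e.g. (an assumption, not shown in this diff):

```bash
# Assumed final step of a typical from-source build; see the full README for the actual commands.
pip install .
```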
