From 7bc3dedcae2b70524ca2c0e58cab32a806395c25 Mon Sep 17 00:00:00 2001
From: Mohit Sharma
Date: Tue, 5 Dec 2023 04:55:30 +0000
Subject: [PATCH 1/5] updated doc

---
 .../onnxruntime/usage_guides/amdgpu.mdx | 20 ++++++++++---------
 1 file changed, 11 insertions(+), 9 deletions(-)

diff --git a/docs/source/onnxruntime/usage_guides/amdgpu.mdx b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
index 1859637464a..5ad7e7df435 100644
--- a/docs/source/onnxruntime/usage_guides/amdgpu.mdx
+++ b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
@@ -11,20 +11,22 @@ The following setup installs the ONNX Runtime support with ROCM Execution Provid

 #### 1. ROCm Installation

-To install ROCM 5.7, please follow the [ROCm installation guide](https://rocm.docs.amd.com/en/latest/deploy/linux/index.html).
+Refer to the [ROCm installation guide](https://rocm.docs.amd.com/en/latest/deploy/linux/index.html) to install ROCm 5.7.

-#### 2. PyTorch Installation with ROCm Support
+#### 2. Installing `onnxruntime-rocm`
+
+Utilize either the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blob/main/docker/onnx-runtime-amd-gpu/Dockerfile) example or perform a local installation from the source since pip wheels are currently unavailable.
+
+##### Local Installation Steps:
+
+##### 2.1 Installing PyTorch with ROCm Support

 Optimum ONNX Runtime integration relies on some functionalities of Transformers that require PyTorch. For now, we recommend using PyTorch compiled against ROCm 5.7, which can be installed by following the [PyTorch installation guide](https://pytorch.org/get-started/locally/):

 ```bash
 pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm5.7
 ```
-
-For docker installation, the following base image is recommended: `rocm/pytorch:rocm5.7_ubuntu22.04_py3.10_pytorch_2.0.1`
-
-
-### 3. ONNX Runtime installation with ROCm Execution Provider
+### 2.2 ONNX Runtime installation with ROCm Execution Provider

 ```bash
 # pre-requisites
 pip install -U pip
@@ -34,8 +36,8 @@ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

 # Install ONNXRuntime from source
 git clone --recursive https://github.com/ROCmSoftwarePlatform/onnxruntime.git
-git checkout rocm5.7_internal_testing_eigen-3.4.zip_hash
 cd onnxruntime
+git checkout rocm5.7_internal_testing_eigen-3.4.zip_hash

 ./build.sh --config Release --build_wheel --update --build --parallel --cmake_extra_defines ONNXRUNTIME_VERSION=$(cat ./VERSION_NUMBER) --use_rocm --rocm_home=/opt/rocm
 pip install build/Linux/Release/dist/*
@@ -66,7 +68,7 @@ Before going further, run the following sample code to check whether the install
 >>> assert ort_model.providers == ["ROCMExecutionProvider", "CPUExecutionProvider"]
 ```

-In case this code runs gracefully, congratulations, the installation is successfull! If you encounter the following error or similar,
+In case this code runs gracefully, congratulations, the installation is successful! If you encounter the following error or similar,
 ```
 ValueError: Asked to use ROCMExecutionProvider as an ONNX Runtime execution provider, but the available execution providers are ['CPUExecutionProvider'].
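The provider check that patch 1 adds can be exercised without a GPU. Below is a minimal sketch of the same selection logic in plain Python; `select_providers` is an illustrative helper, not part of Optimum's API, and its error message mirrors the `ValueError` quoted above:

```python
def select_providers(available):
    """Order providers the way the guide expects: ROCm first, CPU fallback."""
    preferred = ["ROCMExecutionProvider", "CPUExecutionProvider"]
    chosen = [p for p in preferred if p in available]
    if "ROCMExecutionProvider" not in chosen:
        # Mirrors the error the guide shows when only the CPU provider is built in.
        raise ValueError(
            "Asked to use ROCMExecutionProvider as an ONNX Runtime execution "
            f"provider, but the available execution providers are {available}."
        )
    return chosen

# On a working ROCm build, onnxruntime.get_available_providers() would feed this:
print(select_providers(["ROCMExecutionProvider", "CPUExecutionProvider"]))
# → ['ROCMExecutionProvider', 'CPUExecutionProvider']
```

If the function raises, the build only exposes the CPU provider, which is exactly the failure mode the sample code above is meant to surface early.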
From fdeb0f0848edc5616085a373c003b23776ef6a89 Mon Sep 17 00:00:00 2001
From: Mohit Sharma
Date: Tue, 5 Dec 2023 05:19:18 +0000
Subject: [PATCH 2/5] update doc

---
 docs/source/onnxruntime/usage_guides/amdgpu.mdx | 15 +++++++++++----
 1 file changed, 11 insertions(+), 4 deletions(-)

diff --git a/docs/source/onnxruntime/usage_guides/amdgpu.mdx b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
index 5ad7e7df435..6e3bb5acc11 100644
--- a/docs/source/onnxruntime/usage_guides/amdgpu.mdx
+++ b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
@@ -9,21 +9,28 @@ This guide will show you how to run inference on the `ROCMExecutionProvider` exe

 ## Installation

 The following setup installs ONNX Runtime support with the ROCm Execution Provider on ROCm 5.7.

-#### 1. ROCm Installation
+#### ROCm Installation

 Refer to the [ROCm installation guide](https://rocm.docs.amd.com/en/latest/deploy/linux/index.html) to install ROCm 5.7.

-#### 2. Installing `onnxruntime-rocm`
+#### Installing `onnxruntime-rocm`

-Utilize either the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blob/main/docker/onnx-runtime-amd-gpu/Dockerfile) example or perform a local installation from the source since pip wheels are currently unavailable.
+Utilize the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blob/main/docker/onnx-runtime-amd-gpu/Dockerfile) example or perform a local installation from the source since pip wheels are currently unavailable.

-##### Local Installation Steps:
+**Docker Installation:**
+
+```bash
+docker build -f Dockerfile -t ort/rocm .
+```
+
+**Local Installation Steps**

 ##### 2.1 Installing PyTorch with ROCm Support

 Optimum ONNX Runtime integration relies on some functionalities of Transformers that require PyTorch.
For now, we recommend using PyTorch compiled against ROCm 5.7, which can be installed by following the [PyTorch installation guide](https://pytorch.org/get-started/locally/):

 ```bash
 pip3 install --pre torch torchvision torchaudio --index-url https://download.pytorch.org/whl/nightly/rocm5.7
+# Use 'rocm/pytorch:latest' as the preferred base image when using Docker for PyTorch installation.
 ```

 ### 2.2 ONNX Runtime installation with ROCm Execution Provider

From 5265b113ddb4beb8fd5d6944d27dfdb697c91939 Mon Sep 17 00:00:00 2001
From: Mohit Sharma
Date: Tue, 5 Dec 2023 06:11:06 +0000
Subject: [PATCH 3/5] fix format

---
 docs/source/onnxruntime/usage_guides/amdgpu.mdx | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/source/onnxruntime/usage_guides/amdgpu.mdx b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
index 6e3bb5acc11..a95ab737f0e 100644
--- a/docs/source/onnxruntime/usage_guides/amdgpu.mdx
+++ b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
@@ -9,11 +9,11 @@ This guide will show you how to run inference on the `ROCMExecutionProvider` exe

 ## Installation

 The following setup installs ONNX Runtime support with the ROCm Execution Provider on ROCm 5.7.

-#### ROCm Installation
+#### 1 ROCm Installation

 Refer to the [ROCm installation guide](https://rocm.docs.amd.com/en/latest/deploy/linux/index.html) to install ROCm 5.7.

-#### Installing `onnxruntime-rocm`
+#### 2 Installing `onnxruntime-rocm`

 Utilize the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blob/main/docker/onnx-runtime-amd-gpu/Dockerfile) example or perform a local installation from the source since pip wheels are currently unavailable.
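The multi-flag `./build.sh` invocation from patch 1 is easy to mistype when scripted. As a sketch only, the command can be assembled programmatically; `build_command` is a hypothetical helper, and in the real flow the version string comes from the repository's `VERSION_NUMBER` file rather than a literal:

```python
def build_command(version, rocm_home="/opt/rocm"):
    """Assemble the ONNX Runtime ROCm source-build invocation from the guide."""
    return [
        "./build.sh",
        "--config", "Release",
        "--build_wheel",
        "--update", "--build", "--parallel",
        "--cmake_extra_defines", f"ONNXRUNTIME_VERSION={version}",
        "--use_rocm", f"--rocm_home={rocm_home}",
    ]

# "1.16.0" is a placeholder; substitute the contents of ./VERSION_NUMBER.
print(" ".join(build_command("1.16.0")))
# → ./build.sh --config Release --build_wheel --update --build --parallel --cmake_extra_defines ONNXRUNTIME_VERSION=1.16.0 --use_rocm --rocm_home=/opt/rocm
```

Building the argument list once and reusing it keeps the ROCm-specific flags (`--use_rocm`, `--rocm_home`) from silently drifting between CI scripts and local runs.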
From 1fb1dead0d14ac6eac505d3cec99a231f33b90b4 Mon Sep 17 00:00:00 2001
From: Mohit Sharma
Date: Tue, 5 Dec 2023 07:21:44 +0000
Subject: [PATCH 4/5] fix doc

---
 docs/source/onnxruntime/usage_guides/amdgpu.mdx | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/docs/source/onnxruntime/usage_guides/amdgpu.mdx b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
index a95ab737f0e..97913ebc7e3 100644
--- a/docs/source/onnxruntime/usage_guides/amdgpu.mdx
+++ b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
@@ -23,9 +23,9 @@ Utilize the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blo
 docker build -f Dockerfile -t ort/rocm .
 ```

-**Local Installation Steps**
+**Local Installation Steps:**

-##### 2.1 Installing PyTorch with ROCm Support
+##### 2.1 PyTorch with ROCm Support

 Optimum ONNX Runtime integration relies on some functionalities of Transformers that require PyTorch. For now, we recommend using PyTorch compiled against ROCm 5.7, which can be installed by following the [PyTorch installation guide](https://pytorch.org/get-started/locally/):

 ```bash
@@ -33,7 +33,7 @@ pip3 install --pre torch torchvision torchaudio --index-url https://download.pyt
 # Use 'rocm/pytorch:latest' as the preferred base image when using Docker for PyTorch installation.
 ```

-### 2.2 ONNX Runtime installation with ROCm Execution Provider
+### 2.2 ONNX Runtime with ROCm Execution Provider

 ```bash
 # pre-requisites

From aab64d432b70de4d52f00b838df7eab99ce55d93 Mon Sep 17 00:00:00 2001
From: fxmarty <9808326+fxmarty@users.noreply.github.com>
Date: Tue, 5 Dec 2023 18:19:53 +0900
Subject: [PATCH 5/5] Update docs/source/onnxruntime/usage_guides/amdgpu.mdx

---
 docs/source/onnxruntime/usage_guides/amdgpu.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/source/onnxruntime/usage_guides/amdgpu.mdx b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
index 97913ebc7e3..381168a337f 100644
--- a/docs/source/onnxruntime/usage_guides/amdgpu.mdx
+++ b/docs/source/onnxruntime/usage_guides/amdgpu.mdx
@@ -15,7 +15,7 @@ Refer to the [ROCm installation guide](https://rocm.docs.amd.com/en/latest/deplo

 #### 2 Installing `onnxruntime-rocm`

-Utilize the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blob/main/docker/onnx-runtime-amd-gpu/Dockerfile) example or perform a local installation from the source since pip wheels are currently unavailable.
+Please use the provided [Dockerfile](https://github.com/huggingface/optimum-amd/blob/main/docker/onnx-runtime-amd-gpu/Dockerfile) example or do a local installation from source since pip wheels are currently unavailable.

 **Docker Installation:**
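Across patches 2 through 5 the Docker route reduces to the single `docker build -f Dockerfile -t ort/rocm .` command shown above. If you drive that step from Python, a minimal sketch might look like the following; the `docker_build_cmd` helper is hypothetical, and it assumes the Dockerfile from the optimum-amd repository sits in the build context:

```python
import subprocess

def docker_build_cmd(tag="ort/rocm", dockerfile="Dockerfile", context="."):
    """Compose the image-build command from the guide: docker build -f Dockerfile -t ort/rocm ."""
    return ["docker", "build", "-f", dockerfile, "-t", tag, context]

print(" ".join(docker_build_cmd()))
# → docker build -f Dockerfile -t ort/rocm .

# On a machine with Docker installed and the Dockerfile checked out, run it with:
# subprocess.run(docker_build_cmd(), check=True)
```

Passing the command as a list (rather than a shell string) avoids quoting surprises if the tag or context path ever contains spaces.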