Merge branch 'update_hugectr_version_23.9.0' into 'main'
Update new version: 23.9.0

See merge request dl/hugectr/hugectr!1478
minseokl committed Sep 26, 2023
2 parents 9dcbe75 + 335b39a commit 065b738
Showing 29 changed files with 54 additions and 54 deletions.
2 changes: 1 addition & 1 deletion HugeCTR/include/common.hpp
@@ -59,7 +59,7 @@
namespace HugeCTR {

#define HUGECTR_VERSION_MAJOR 23
-#define HUGECTR_VERSION_MINOR 8
+#define HUGECTR_VERSION_MINOR 9
#define HUGECTR_VERSION_PATCH 0

#define WARP_SIZE 32
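The three macros above are the single source of the release number this commit bumps. As a quick sketch (Python used purely for illustration; the f-string composition rule is an assumption, not code from the repository), the components assemble into the `23.9.0` string from the commit title:

```python
# Mirrors the HUGECTR_VERSION_* macros in HugeCTR/include/common.hpp
# after this commit. The composition below is illustrative only.
MAJOR, MINOR, PATCH = 23, 9, 0

version = f"{MAJOR}.{MINOR}.{PATCH}"
print(version)  # -> 23.9.0
```

Note that the container tags elsewhere in this diff use the zero-padded calendar form (`23.09`), while the macros compose to the unpadded `23.9.0`.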
2 changes: 1 addition & 1 deletion README.md
@@ -42,7 +42,7 @@ If you'd like to quickly train a model using the Python interface, do the follow

1. Start a NGC container with your local host directory (/your/host/dir mounted) by running the following command:
```
-docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+docker run --gpus=all --rm -it --cap-add SYS_NICE -v /your/host/dir:/your/container/dir -w /your/container/dir -it -u $(id -u):$(id -g) nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

**NOTE**: The **/your/host/dir** directory is just as visible as the **/your/container/dir** directory. The **/your/host/dir** directory is also your starting directory.
4 changes: 2 additions & 2 deletions docs/source/hierarchical_parameter_server/profiling_hps.md
@@ -67,13 +67,13 @@ To build HPS profiler from source, do the following:
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

3. Here is an example of how you can build HPS Profiler using the build options:
2 changes: 1 addition & 1 deletion docs/source/hugectr_user_guide.md
@@ -83,7 +83,7 @@ The following sample command pulls and starts the Merlin Training container:

```shell
# Run the container in interactive mode
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

### Building HugeCTR from Scratch
2 changes: 1 addition & 1 deletion hps_tf/hps_cc/config.hpp
@@ -15,7 +15,7 @@
*/
#pragma once

-// TODO: The configurations are not needed anymore in merlin-base:23.08
+// TODO: The configurations are not needed anymore in merlin-base:23.09
// #include <absl/base/options.h>
// #undef ABSL_OPTION_USE_STD_STRING_VIEW
// #define ABSL_OPTION_USE_STD_STRING_VIEW 0
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hierarchical_parameter_server_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"The HPS Python module is preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
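The notebook text above refers to Python code (elided in this diff) for verifying that the required libraries exist inside the container. A minimal sketch of such a check using only the standard library; the module names passed in below are placeholders, not the actual HPS module names from the notebook:

```python
import importlib.util

def libraries_available(names):
    """Map each top-level module name to whether it can be imported."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Placeholder names; substitute the modules the notebook actually requires.
print(libraries_available(["json", "some_missing_module"]))
```

`find_spec` returns `None` for an absent top-level module rather than raising, which keeps the check side-effect free (the module is located but not imported).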
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hps_multi_table_sparse_input_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"The HPS Python module is preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hps_pretrained_model_training_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"The HPS Python module is preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
2 changes: 1 addition & 1 deletion hps_tf/notebooks/hps_table_fusion_demo.ipynb
@@ -57,7 +57,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"The HPS Python module is preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
14 changes: 7 additions & 7 deletions hps_tf/notebooks/hps_tensorflow_triton_deployment_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get HPS from NGC\n",
"\n",
-"The HPS Python module is preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"The HPS Python module is preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
@@ -854,9 +854,9 @@
"INFO:tensorflow:Automatic mixed precision has been deactivated.\n",
"2022-11-23 01:37:23.028482: I tensorflow/core/grappler/devices.cc:66] Number of eligible GPUs (core count >= 8, compute capability >= 0.0): 1\n",
"2022-11-23 01:37:23.028568: I tensorflow/core/grappler/clusters/single_machine.cc:358] Starting new session\n",
-"2022-11-23 01:37:23.081909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
-"2022-11-23 01:37:23.088593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
-"2022-11-23 01:37:23.089761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
+"2022-11-23 01:37:23.091909: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1532] Created device /job:localhost/replica:0/task:0/device:GPU:0 with 30991 MB memory: -> device: 0, name: Tesla V100-SXM2-32GB, pci bus id: 0000:06:00.0, compute capability: 7.0\n",
+"2022-11-23 01:37:23.098593: W tensorflow/compiler/tf2tensorrt/convert/trt_optimization_pass.cc:198] Calibration with FP32 or FP16 is not implemented. Falling back to use_calibration = False.Note that the default value of use_calibration is True.\n",
+"2022-11-23 01:37:23.099761: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:952] \n",
"\n",
"################################################################################\n",
"TensorRT unsupported/non-converted OP Report:\n",
@@ -872,9 +872,9 @@
"For more information see https://docs.nvidia.com/deeplearning/frameworks/tf-trt-user-guide/index.html#supported-ops.\n",
"################################################################################\n",
"\n",
-"2022-11-23 01:37:23.089860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
-"2022-11-23 01:37:23.089893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
-"2022-11-23 01:37:23.080667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
+"2022-11-23 01:37:23.099860: W tensorflow/compiler/tf2tensorrt/segment/segment.cc:1280] The environment variable TF_TRT_MAX_ALLOWED_ENGINES=20 has no effect since there are only 1 TRT Engines with at least minimum_segment_size=3 nodes.\n",
+"2022-11-23 01:37:23.099893: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:799] Number of TensorRT candidate segments: 1\n",
+"2022-11-23 01:37:23.090667: I tensorflow/compiler/tf2tensorrt/convert/convert_graph.cc:916] Replaced segment 0 consisting of 9 nodes by TRTEngineOp_000_000.\n"
]
},
{
2 changes: 1 addition & 1 deletion hps_tf/notebooks/sok_to_hps_dlrm_demo.ipynb
@@ -58,7 +58,7 @@
"\n",
"### Get SOK from NGC\n",
"\n",
-"Both SOK and HPS Python modules are preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"Both SOK and HPS Python modules are preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container.\n",
"\n",
10 changes: 5 additions & 5 deletions hps_trt/notebooks/benchmark_tf_trained_large_model.ipynb
@@ -1279,17 +1279,17 @@
" ```shell\n",
" git clone https://github.com/NVIDIA-Merlin/Merlin.git\n",
" cd Merlin/docker\n",
-"  docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.08 -f dockerfile.merlin .\n",
-"  docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.08 -f dockerfile.tf .\n",
+"  docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.09 -f dockerfile.merlin .\n",
+"  docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.09 -f dockerfile.tf .\n",
" cd ../..\n",
" ```\n",
"- **Option B (G+H optimized HugeCTR)**:\n",
" ```shell\n",
" git clone https://github.com/NVIDIA-Merlin/Merlin.git\n",
" cd Merlin/docker\n",
" sed -i -e 's/\" -DENABLE_INFERENCE=ON/\" -DUSE_HUGE_PAGES=ON -DENABLE_INFERENCE=ON/g' dockerfile.merlin\n",
-"  docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.08 -f dockerfile.merlin .\n",
-"  docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.08 -f dockerfile.tf .\n",
+"  docker build -t nvcr.io/nvstaging/merlin/merlin-base:23.09 -f dockerfile.merlin .\n",
+"  docker build -t nvcr.io/nvstaging/merlin/merlin-tensorflow:23.09 -f dockerfile.tf .\n",
" cd ../..\n",
" ````"
]
@@ -1325,7 +1325,7 @@
"\n",
"Your filesystem or system environment might impose constraints. The following command just serves as an example. It assumes HugeCTR was downloaded from GitHub into the current working directory (`git clone https://github.com/NVIDIA-Merlin/HugeCTR.git`). To allow writing files, we first give root user (inside the docker image you are root) to access to the notebook folder (this folder), and then startup a suitable Jupyter server.\n",
"```shell\n",
-"export HCTR_SRC=\"${PWD}/HugeCTR\" && chmod -R 777 \"${HCTR_SRC}/hps_trt/notebooks\" && docker run -it --rm --gpus all --network=host -v ${HCTR_SRC}:/hugectr nvcr.io/nvstaging/merlin/merlin-tensorflow:23.08 jupyter-lab --allow-root --ip 0.0.0.0 --port 8888 --no-browser --notebook-dir=/hugectr/hps_trt/notebooks\n",
+"export HCTR_SRC=\"${PWD}/HugeCTR\" && chmod -R 777 \"${HCTR_SRC}/hps_trt/notebooks\" && docker run -it --rm --gpus all --network=host -v ${HCTR_SRC}:/hugectr nvcr.io/nvstaging/merlin/merlin-tensorflow:23.09 jupyter-lab --allow-root --ip 0.0.0.0 --port 8888 --no-browser --notebook-dir=/hugectr/hps_trt/notebooks\n",
"``` "
]
},
2 changes: 1 addition & 1 deletion hps_trt/notebooks/demo_for_hugectr_trained_model.ipynb
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
-"The HPS TensorRT plugin is preinstalled in the 23.08 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:23.08`.\n",
+"The HPS TensorRT plugin is preinstalled in the 23.09 and later [Merlin HugeCTR Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-hugectr): `nvcr.io/nvidia/merlin/merlin-hugectr:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
2 changes: 1 addition & 1 deletion hps_trt/notebooks/demo_for_pytorch_trained_model.ipynb
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
-"The HPS TensorRT plugin is preinstalled in the 23.08 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.08`.\n",
+"The HPS TensorRT plugin is preinstalled in the 23.09 and later [Merlin PyTorch Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-pytorch): `nvcr.io/nvidia/merlin/merlin-pytorch:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
2 changes: 1 addition & 1 deletion hps_trt/notebooks/demo_for_tf_trained_model.ipynb
@@ -31,7 +31,7 @@
"\n",
"### Use NGC\n",
"\n",
-"The HPS TensorRT plugin is preinstalled in the 23.08 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08`.\n",
+"The HPS TensorRT plugin is preinstalled in the 23.09 and later [Merlin TensorFlow Container](https://catalog.ngc.nvidia.com/orgs/nvidia/teams/merlin/containers/merlin-tensorflow): `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09`.\n",
"\n",
"You can check the existence of the required libraries by running the following Python code after launching this container."
]
6 changes: 3 additions & 3 deletions notebooks/README.md
@@ -19,16 +19,16 @@ git clone https://github.com/NVIDIA/HugeCTR
Pull the container using the following command:

```shell
-docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

Launch the container in interactive mode (mount the HugeCTR root directory into the container for your convenience) by running this command:

```shell
-docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+docker run --gpus all --rm -it --cap-add SYS_NICE --ipc=host --ulimit memlock=-1 --ulimit stack=67108864 -u root -v $(pwd):/HugeCTR -w /HugeCTR -p 8888:8888 nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

-> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:23.08` container.
+> To run the Sparse Operation Kit notebooks, specify the `nvcr.io/nvidia/merlin/merlin-tensorflow:23.09` container.

## 3. Customized Building (Optional)

6 changes: 3 additions & 3 deletions release_notes.md
@@ -1,6 +1,6 @@
# Release Notes

-## What's New in Version 23.08
+## What's New in Version 23.09

+ **Hierarchical Parameter Server**:
+ Support static EC fp8 quantization
@@ -181,7 +181,7 @@ In this release, we have fixed issues and enhanced the code.
```{important}
In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.3`.
-Afterward, the library will use calendar versioning only, such as `v23.08`.
+Afterward, the library will use calendar versioning only, such as `v23.09`.
```
+ **Support for BERT and Variants**:
@@ -263,7 +263,7 @@ The [HugeCTR Training and Inference with Remote File System Example](https://nvi
```{important}
In January 2023, the HugeCTR team plans to deprecate semantic versioning, such as `v4.2`.
-Afterward, the library will use calendar versioning only, such as `v23.08`.
+Afterward, the library will use calendar versioning only, such as `v23.09`.
```
+ **Change to HPS with Redis or Kafka**:
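The release notes above describe the move from semantic versioning (e.g. `v4.3`) to calendar versioning (e.g. `v23.09`). A small sketch of how such calendar tags order chronologically; the parsing rule is an illustrative assumption, not code from the repository:

```python
def parse_calver(tag):
    """Parse a calendar-version tag like 'v23.09' into (year, month)."""
    year, month = tag.lstrip("v").split(".")
    return int(year), int(month)

# Tuple comparison gives chronological ordering for same-scheme tags.
assert parse_calver("v23.08") < parse_calver("v23.09")
print(parse_calver("v23.09"))  # -> (23, 9)
```

Note that comparing a calendar tag against a legacy semantic tag this way is not meaningful; the ordering only holds within one scheme.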
4 changes: 2 additions & 2 deletions samples/criteo/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/criteo_multi_slots/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running this command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/dcn/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

### Build the HugeCTR Docker Container on Your Own ###
4 changes: 2 additions & 2 deletions samples/deepfm/README.md
@@ -11,11 +11,11 @@ HugeCTR is available as buildable source code, but the easiest way to install an

1. Pull the HugeCTR NGC Docker by running the following command:
```bash
-$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker pull nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```
2. Launch the container in interactive mode with the HugeCTR root directory mounted into the container by running the following command:
```bash
-$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.08
+$ docker run --gpus=all --rm -it --cap-add SYS_NICE -u $(id -u):$(id -g) -v $(pwd):/hugectr -w /hugectr nvcr.io/nvidia/merlin/merlin-hugectr:23.09
```

### Build the HugeCTR Docker Container on Your Own ###