From 1ad85a367fdff355435b4c9c79d600d802574a96 Mon Sep 17 00:00:00 2001 From: Dustin Franklin Date: Mon, 15 Apr 2024 10:54:31 -0400 Subject: [PATCH 1/2] updated package list --- packages/README.md | 83 +++++++++++++++++++++++++++++++++++----------- 1 file changed, 64 insertions(+), 19 deletions(-) diff --git a/packages/README.md b/packages/README.md index 7e5b727c0..0cb6bc11c 100644 --- a/packages/README.md +++ b/packages/README.md @@ -1,5 +1,5 @@ # Packages -> [`AUDIO`](#user-content-audio) [`BUILD`](#user-content-build) [`CORE`](#user-content-core) [`CUDA`](#user-content-cuda) [`DIFFUSION`](#user-content-diffusion) [`LLM`](#user-content-llm) [`ML`](#user-content-ml) [`OTHER`](#user-content-other) [`PYTORCH`](#user-content-pytorch) [`RAPIDS`](#user-content-rapids) [`ROS`](#user-content-ros) [`SENSORS`](#user-content-sensors) [`TRANSFORMER`](#user-content-transformer) [`VECTORDB`](#user-content-vectordb) [`VIT`](#user-content-vit) +> [`AUDIO`](#user-content-audio) [`BUILD`](#user-content-build) [`CORE`](#user-content-core) [`CUDA`](#user-content-cuda) [`DIFFUSION`](#user-content-diffusion) [`LLM`](#user-content-llm) [`ML`](#user-content-ml) [`OTHER`](#user-content-other) [`PYTORCH`](#user-content-pytorch) [`RAPIDS`](#user-content-rapids) [`ROS`](#user-content-ros) [`SENSORS`](#user-content-sensors) [`SMART-HOME`](#user-content-smart-home) [`TRANSFORMER`](#user-content-transformer) [`VECTORDB`](#user-content-vectordb) [`VIT`](#user-content-vit) [`WYOMING`](#user-content-wyoming) | | | |------------|------------| @@ -13,35 +13,44 @@ |     [`whisperx`](/packages/audio/whisperx) | [![`whisperx_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/whisperx_jp51.yml?label=whisperx:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/whisperx_jp51.yml) 
[![`whisperx_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/whisperx_jp60.yml?label=whisperx:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/whisperx_jp60.yml) | |     [`xtts`](/packages/audio/xtts) | | | **`BUILD`** | | -|     [`bazel`](/packages/bazel) | [![`bazel_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bazel_jp46.yml?label=bazel:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bazel_jp46.yml) [![`bazel_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bazel_jp51.yml?label=bazel:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bazel_jp51.yml) | -|     [`build-essential`](/packages/build-essential) | [![`build-essential_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/build-essential_jp46.yml?label=build-essential:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/build-essential_jp46.yml) [![`build-essential_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/build-essential_jp51.yml?label=build-essential:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/build-essential_jp51.yml) | -|     [`cmake:apt`](/packages/cmake/cmake_apt) | [![`cmake-apt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-apt_jp46.yml?label=cmake-apt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-apt_jp46.yml) [![`cmake-apt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-apt_jp51.yml?label=cmake-apt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-apt_jp51.yml) | -|     [`cmake:pip`](/packages/cmake/cmake_pip) | 
[![`cmake-pip_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-pip_jp46.yml?label=cmake-pip:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-pip_jp46.yml) [![`cmake-pip_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-pip_jp51.yml?label=cmake-pip:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-pip_jp51.yml) | -|     [`nodejs`](/packages/nodejs) | | -|     [`protobuf:apt`](/packages/protobuf/protobuf_apt) | [![`protobuf-apt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-apt_jp46.yml?label=protobuf-apt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-apt_jp46.yml) [![`protobuf-apt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-apt_jp51.yml?label=protobuf-apt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-apt_jp51.yml) | -|     [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) | [![`protobuf-cpp_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-cpp_jp46.yml?label=protobuf-cpp:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-cpp_jp46.yml) [![`protobuf-cpp_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-cpp_jp51.yml?label=protobuf-cpp:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-cpp_jp51.yml) | -|     [`python:3.10`](/packages/python) | | -|     [`python:3.11`](/packages/python) | | -|     [`python:3.12`](/packages/python) | | -|     [`rust`](/packages/rust) | [![`rust_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/rust_jp46.yml?label=rust:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/rust_jp46.yml) 
[![`rust_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/rust_jp51.yml?label=rust:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/rust_jp51.yml) | +|     [`bazel`](/packages/build/bazel) | [![`bazel_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bazel_jp46.yml?label=bazel:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bazel_jp46.yml) [![`bazel_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bazel_jp51.yml?label=bazel:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bazel_jp51.yml) | +|     [`build-essential`](/packages/build/build-essential) | [![`build-essential_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/build-essential_jp46.yml?label=build-essential:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/build-essential_jp46.yml) [![`build-essential_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/build-essential_jp51.yml?label=build-essential:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/build-essential_jp51.yml) | +|     [`cmake:apt`](/packages/build/cmake/cmake_apt) | [![`cmake-apt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-apt_jp46.yml?label=cmake-apt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-apt_jp46.yml) [![`cmake-apt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-apt_jp51.yml?label=cmake-apt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-apt_jp51.yml) | +|     [`cmake:pip`](/packages/build/cmake/cmake_pip) | 
[![`cmake-pip_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-pip_jp46.yml?label=cmake-pip:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-pip_jp46.yml) [![`cmake-pip_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-pip_jp51.yml?label=cmake-pip:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-pip_jp51.yml) | +|     [`nodejs`](/packages/build/nodejs) | | +|     [`protobuf:apt`](/packages/build/protobuf/protobuf_apt) | [![`protobuf-apt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-apt_jp46.yml?label=protobuf-apt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-apt_jp46.yml) [![`protobuf-apt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-apt_jp51.yml?label=protobuf-apt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-apt_jp51.yml) | +|     [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) | [![`protobuf-cpp_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-cpp_jp46.yml?label=protobuf-cpp:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-cpp_jp46.yml) [![`protobuf-cpp_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-cpp_jp51.yml?label=protobuf-cpp:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-cpp_jp51.yml) | +|     [`python:3.10`](/packages/build/python) | | +|     [`python:3.11`](/packages/build/python) | | +|     [`python:3.12`](/packages/build/python) | | +|     [`python:3.6`](/packages/build/python) | | +|     [`python:3.8`](/packages/build/python) | | +|     [`rust`](/packages/build/rust) | 
[![`rust_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/rust_jp46.yml?label=rust:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/rust_jp46.yml) [![`rust_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/rust_jp51.yml?label=rust:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/rust_jp51.yml) | | **`CORE`** | | |     [`arrow:12.0.1`](/packages/arrow) | | |     [`arrow:14.0.1`](/packages/arrow) | | |     [`arrow:5.0.0`](/packages/arrow) | | -|     [`docker`](/packages/docker) | | +|     [`docker`](/packages/build/docker) | | +|     [`ffmpeg`](/packages/ffmpeg) | | |     [`gstreamer`](/packages/gstreamer) | [![`gstreamer_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gstreamer_jp46.yml?label=gstreamer:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gstreamer_jp46.yml) [![`gstreamer_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gstreamer_jp51.yml?label=gstreamer:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gstreamer_jp51.yml) [![`gstreamer_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gstreamer_jp60.yml?label=gstreamer:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gstreamer_jp60.yml) | |     [`jupyterlab`](/packages/jupyterlab) | [![`jupyterlab_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jupyterlab_jp46.yml?label=jupyterlab:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jupyterlab_jp46.yml) [![`jupyterlab_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jupyterlab_jp51.yml?label=jupyterlab:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jupyterlab_jp51.yml) 
[![`jupyterlab_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jupyterlab_jp60.yml?label=jupyterlab:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jupyterlab_jp60.yml) | |     [`numpy`](/packages/numpy) | [![`numpy_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numpy_jp46.yml?label=numpy:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numpy_jp46.yml) [![`numpy_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numpy_jp51.yml?label=numpy:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numpy_jp51.yml) [![`numpy_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numpy_jp60.yml?label=numpy:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numpy_jp60.yml) | +|     [`opencv:4.5.0`](/packages/opencv) | | +|     [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) | | |     [`opencv:4.8.1`](/packages/opencv) | [![`opencv-481_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/opencv-481_jp60.yml?label=opencv-481:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/opencv-481_jp60.yml) | |     [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) | | |     [`opencv:4.9.0`](/packages/opencv) | | | **`CUDA`** | | -|     [`cuda-python:12.2`](/packages/cuda/cuda-python) | | -|     [`cuda-python:12.4`](/packages/cuda/cuda-python) | | +|     [`cuda-python:11.4`](/packages/cuda/cuda-python) | | +|     [`cuda:11.4`](/packages/cuda/cuda) | | +|     [`cuda:11.4-samples`](/packages/cuda/cuda) | | +|     [`cuda:11.8`](/packages/cuda/cuda) | | +|     [`cuda:11.8-samples`](/packages/cuda/cuda) | | |     [`cuda:12.2`](/packages/cuda/cuda) | 
[![`cuda-122_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuda-122_jp60.yml?label=cuda-122:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuda-122_jp60.yml) | |     [`cuda:12.2-samples`](/packages/cuda/cuda) | [![`cuda-122-samples_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuda-122-samples_jp60.yml?label=cuda-122-samples:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuda-122-samples_jp60.yml) | |     [`cuda:12.4`](/packages/cuda/cuda) | | |     [`cuda:12.4-samples`](/packages/cuda/cuda) | | +|     [`cudnn`](/packages/cuda/cudnn) | | |     [`cudnn:8.9`](/packages/cuda/cudnn) | [![`cudnn-89_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cudnn-89_jp60.yml?label=cudnn-89:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cudnn-89_jp60.yml) | |     [`cudnn:9.0`](/packages/cuda/cudnn) | | |     [`cupy`](/packages/cuda/cupy) | [![`cupy_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cupy_jp46.yml?label=cupy:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cupy_jp46.yml) [![`cupy_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cupy_jp51.yml?label=cupy:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cupy_jp51.yml) [![`cupy_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cupy_jp60.yml?label=cupy:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cupy_jp60.yml) | @@ -56,7 +65,11 @@ |     [`auto_awq:0.2.4`](/packages/llm/auto_awq) | | |     [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) | | |     [`awq:0.1.0`](/packages/llm/awq) | | +|     [`bitsandbytes`](/packages/llm/bitsandbytes) | 
[![`bitsandbytes_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bitsandbytes_jp51.yml?label=bitsandbytes:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bitsandbytes_jp51.yml) | +|     [`bitsandbytes:builder`](/packages/llm/bitsandbytes) | | +|     [`exllama:0.0.14`](/packages/llm/exllama) | | |     [`exllama:0.0.15`](/packages/llm/exllama) | | +|     [`flash-attention`](/packages/llm/flash-attention) | | |     [`gptq-for-llama`](/packages/llm/gptq-for-llama) | [![`gptq-for-llama_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gptq-for-llama_jp51.yml?label=gptq-for-llama:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gptq-for-llama_jp51.yml) [![`gptq-for-llama_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gptq-for-llama_jp60.yml?label=gptq-for-llama:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gptq-for-llama_jp60.yml) | |     [`huggingface_hub`](/packages/llm/huggingface_hub) | [![`huggingface_hub_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/huggingface_hub_jp46.yml?label=huggingface_hub:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/huggingface_hub_jp46.yml) [![`huggingface_hub_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/huggingface_hub_jp51.yml?label=huggingface_hub:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/huggingface_hub_jp51.yml) | |     [`l4t-text-generation`](/packages/l4t/l4t-text-generation) | [![`l4t-text-generation_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-text-generation_jp51.yml?label=l4t-text-generation:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-text-generation_jp51.yml) 
[![`l4t-text-generation_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-text-generation_jp60.yml?label=l4t-text-generation:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-text-generation_jp60.yml) | @@ -69,11 +82,16 @@ |     [`minigpt4`](/packages/llm/minigpt4) | [![`minigpt4_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/minigpt4_jp51.yml?label=minigpt4:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/minigpt4_jp51.yml) [![`minigpt4_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/minigpt4_jp60.yml?label=minigpt4:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/minigpt4_jp60.yml) | |     [`mlc:51fb0f4`](/packages/llm/mlc) | | |     [`mlc:607dc5a`](/packages/llm/mlc) | | +|     [`nano_llm:24.4`](/packages/llm/nano_llm) | | +|     [`nano_llm:main`](/packages/llm/nano_llm) | | |     [`ollama`](/packages/llm/ollama) | | |     [`openai`](/packages/llm/openai) | | |     [`optimum`](/packages/llm/optimum) | [![`optimum_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/optimum_jp46.yml?label=optimum:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/optimum_jp46.yml) [![`optimum_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/optimum_jp51.yml?label=optimum:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/optimum_jp51.yml) | +|     [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) | | +|     [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) | | |     [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) | | |     [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) | | +|     [`text-generation-inference`](/packages/llm/text-generation-inference) | 
[![`text-generation-inference_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/text-generation-inference_jp51.yml?label=text-generation-inference:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/text-generation-inference_jp51.yml) | |     [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) | | |     [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) | | |     [`text-generation-webui:main`](/packages/llm/text-generation-webui) | | @@ -86,30 +104,48 @@ |     [`jetson-inference`](/packages/jetson-inference) | | |     [`l4t-ml`](/packages/l4t/l4t-ml) | [![`l4t-ml_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-ml_jp46.yml?label=l4t-ml:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-ml_jp46.yml) [![`l4t-ml_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-ml_jp51.yml?label=l4t-ml:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-ml_jp51.yml) [![`l4t-ml_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-ml_jp60.yml?label=l4t-ml:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-ml_jp60.yml) | |     [`l4t-pytorch`](/packages/l4t/l4t-pytorch) | [![`l4t-pytorch_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-pytorch_jp46.yml?label=l4t-pytorch:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-pytorch_jp46.yml) [![`l4t-pytorch_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-pytorch_jp51.yml?label=l4t-pytorch:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-pytorch_jp51.yml) 
[![`l4t-pytorch_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-pytorch_jp60.yml?label=l4t-pytorch:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-pytorch_jp60.yml) | +|     [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) | [![`l4t-tensorflow-tf1_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf1_jp46.yml?label=l4t-tensorflow-tf1:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf1_jp46.yml) [![`l4t-tensorflow-tf1_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf1_jp51.yml?label=l4t-tensorflow-tf1:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf1_jp51.yml) | |     [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) | [![`l4t-tensorflow-tf2_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf2_jp46.yml?label=l4t-tensorflow-tf2:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf2_jp46.yml) [![`l4t-tensorflow-tf2_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf2_jp51.yml?label=l4t-tensorflow-tf2:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf2_jp51.yml) [![`l4t-tensorflow-tf2_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf2_jp60.yml?label=l4t-tensorflow-tf2:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf2_jp60.yml) | |     [`nemo`](/packages/nemo) | [![`nemo_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nemo_jp46.yml?label=nemo:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nemo_jp46.yml) 
[![`nemo_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nemo_jp51.yml?label=nemo:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nemo_jp51.yml) [![`nemo_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nemo_jp60.yml?label=nemo:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nemo_jp60.yml) | |     [`onnx`](/packages/onnx) | [![`onnx_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnx_jp46.yml?label=onnx:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnx_jp46.yml) [![`onnx_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnx_jp51.yml?label=onnx:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnx_jp51.yml) | +|     [`onnxruntime:1.11`](/packages/onnxruntime) | | +|     [`onnxruntime:1.11-builder`](/packages/onnxruntime) | | +|     [`onnxruntime:1.16.3`](/packages/onnxruntime) | | +|     [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) | | |     [`onnxruntime:1.17`](/packages/onnxruntime) | | |     [`onnxruntime:1.17-builder`](/packages/onnxruntime) | | |     [`openai-triton`](/packages/openai-triton) | | |     [`openai-triton:builder`](/packages/openai-triton) | | +|     [`tensorflow`](/packages/tensorflow) | [![`tensorflow_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow_jp46.yml?label=tensorflow:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow_jp46.yml) [![`tensorflow_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow_jp51.yml?label=tensorflow:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow_jp51.yml) | |     [`tensorflow2`](/packages/tensorflow) | 
[![`tensorflow2_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow2_jp46.yml?label=tensorflow2:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow2_jp46.yml) [![`tensorflow2_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow2_jp51.yml?label=tensorflow2:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow2_jp51.yml) [![`tensorflow2_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow2_jp60.yml?label=tensorflow2:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow2_jp60.yml) | |     [`tritonserver`](/packages/tritonserver) | [![`tritonserver_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tritonserver_jp46.yml?label=tritonserver:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tritonserver_jp46.yml) [![`tritonserver_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tritonserver_jp51.yml?label=tritonserver:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tritonserver_jp51.yml) [![`tritonserver_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tritonserver_jp60.yml?label=tritonserver:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tritonserver_jp60.yml) | |     [`tvm`](/packages/tvm) | [![`tvm_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tvm_jp51.yml?label=tvm:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tvm_jp51.yml) | | **`OTHER`** | | +|     [`tensorrt`](/packages/tensorrt) | | +|     [`tensorrt:10.0`](/packages/tensorrt) | | |     [`tensorrt:8.6`](/packages/tensorrt) | 
[![`tensorrt-86_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorrt-86_jp60.yml?label=tensorrt-86:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorrt-86_jp60.yml) | | **`PYTORCH`** | | +|     [`pytorch:1.10`](/packages/pytorch) | [![`pytorch-110_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-110_jp46.yml?label=pytorch-110:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-110_jp46.yml) | +|     [`pytorch:1.9`](/packages/pytorch) | [![`pytorch-19_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-19_jp46.yml?label=pytorch-19:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-19_jp46.yml) | +|     [`pytorch:2.0`](/packages/pytorch) | [![`pytorch-20_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-20_jp51.yml?label=pytorch-20:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-20_jp51.yml) | |     [`pytorch:2.1`](/packages/pytorch) | [![`pytorch-21_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-21_jp51.yml?label=pytorch-21:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-21_jp51.yml) [![`pytorch-21_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-21_jp60.yml?label=pytorch-21:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-21_jp60.yml) | |     [`pytorch:2.2`](/packages/pytorch) | | |     [`pytorch:2.3`](/packages/pytorch) | | |     [`torch2trt`](/packages/pytorch/torch2trt) | [![`torch2trt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch2trt_jp46.yml?label=torch2trt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch2trt_jp46.yml) 
[![`torch2trt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch2trt_jp51.yml?label=torch2trt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch2trt_jp51.yml) | |     [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) | [![`torch_tensorrt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch_tensorrt_jp46.yml?label=torch_tensorrt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch_tensorrt_jp46.yml) [![`torch_tensorrt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch_tensorrt_jp51.yml?label=torch_tensorrt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch_tensorrt_jp51.yml) | +|     [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) | | +|     [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) | | +|     [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) | | |     [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) | | |     [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) | | |     [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) | | +|     [`torchvision:0.10.0`](/packages/pytorch/torchvision) | | +|     [`torchvision:0.11.1`](/packages/pytorch/torchvision) | | +|     [`torchvision:0.15.1`](/packages/pytorch/torchvision) | | |     [`torchvision:0.16.2`](/packages/pytorch/torchvision) | | |     [`torchvision:0.17.2`](/packages/pytorch/torchvision) | | | **`RAPIDS`** | | +|     [`cudf:21.10.02`](/packages/rapids/cudf) | | |     [`cudf:23.10.03`](/packages/rapids/cudf) | [![`cudf-231003_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cudf-231003_jp60.yml?label=cudf-231003:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cudf-231003_jp60.yml) | |     [`cuml`](/packages/rapids/cuml) | 
[![`cuml_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuml_jp51.yml?label=cuml:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuml_jp51.yml) | | **`ROS`** | | @@ -125,19 +161,26 @@ |     [`ros:iron-desktop`](/packages/ros) | [![`ros-iron-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-desktop_jp46.yml?label=ros-iron-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-desktop_jp46.yml) [![`ros-iron-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-desktop_jp51.yml?label=ros-iron-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-desktop_jp51.yml) | |     [`ros:iron-ros-base`](/packages/ros) | [![`ros-iron-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-base_jp46.yml?label=ros-iron-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-base_jp46.yml) [![`ros-iron-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-base_jp51.yml?label=ros-iron-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-base_jp51.yml) | |     [`ros:iron-ros-core`](/packages/ros) | [![`ros-iron-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-core_jp46.yml?label=ros-iron-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-core_jp46.yml) [![`ros-iron-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-core_jp51.yml?label=ros-iron-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-core_jp51.yml) | +|     [`ros:melodic-desktop`](/packages/ros) | 
[![`ros-melodic-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-melodic-desktop_jp46.yml?label=ros-melodic-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-melodic-desktop_jp46.yml) | +|     [`ros:melodic-ros-base`](/packages/ros) | [![`ros-melodic-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-melodic-ros-base_jp46.yml?label=ros-melodic-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-melodic-ros-base_jp46.yml) | +|     [`ros:melodic-ros-core`](/packages/ros) | [![`ros-melodic-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-melodic-ros-core_jp46.yml?label=ros-melodic-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-melodic-ros-core_jp46.yml) | |     [`ros:noetic-desktop`](/packages/ros) | [![`ros-noetic-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-desktop_jp46.yml?label=ros-noetic-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-desktop_jp46.yml) [![`ros-noetic-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-desktop_jp51.yml?label=ros-noetic-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-desktop_jp51.yml) | |     [`ros:noetic-ros-base`](/packages/ros) | [![`ros-noetic-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-base_jp46.yml?label=ros-noetic-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-base_jp46.yml) 
[![`ros-noetic-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-base_jp51.yml?label=ros-noetic-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-base_jp51.yml) | |     [`ros:noetic-ros-core`](/packages/ros) | [![`ros-noetic-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-core_jp46.yml?label=ros-noetic-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-core_jp46.yml) [![`ros-noetic-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-core_jp51.yml?label=ros-noetic-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-core_jp51.yml) | | **`SENSORS`** | | |     [`realsense`](/packages/realsense) | [![`realsense_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/realsense_jp46.yml?label=realsense:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/realsense_jp46.yml) [![`realsense_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/realsense_jp51.yml?label=realsense:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/realsense_jp51.yml) [![`realsense_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/realsense_jp60.yml?label=realsense:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/realsense_jp60.yml) | |     [`zed`](/packages/zed) | [![`zed_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/zed_jp46.yml?label=zed:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/zed_jp46.yml) 
[![`zed_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/zed_jp51.yml?label=zed:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/zed_jp51.yml) | +| **`SMART-HOME`** | | +|     [`homeassistant-base`](/packages/smart-home/homeassistant-base) | | +|     [`homeassistant-core:2024.4.2`](/packages/smart-home/homeassistant-core) | | +|     [`homeassistant-core:latest`](/packages/smart-home/homeassistant-core) | | | **`TRANSFORMER`** | | |     [`ctranslate2`](/packages/ctranslate2) | | | **`VECTORDB`** | | -|     [`faiss:be12427`](/packages/vectordb/faiss) | | -|     [`faiss:be12427-builder`](/packages/vectordb/faiss) | | -|     [`faiss:v1.7.3`](/packages/vectordb/faiss) | | -|     [`faiss:v1.7.3-builder`](/packages/vectordb/faiss) | | +|     [`faiss:1.7.3`](/packages/vectordb/faiss) | | +|     [`faiss:1.7.3-builder`](/packages/vectordb/faiss) | | +|     [`faiss:1.7.4`](/packages/vectordb/faiss) | | +|     [`faiss:1.7.4-builder`](/packages/vectordb/faiss) | | |     [`faiss_lite`](/packages/vectordb/faiss_lite) | | |     [`nanodb`](/packages/vectordb/nanodb) | [![`nanodb_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanodb_jp51.yml?label=nanodb:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanodb_jp51.yml) [![`nanodb_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanodb_jp60.yml?label=nanodb:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanodb_jp60.yml) | |     [`raft`](/packages/rapids/raft) | | @@ -147,3 +190,5 @@ |     [`nanosam`](/packages/vit/nanosam) | [![`nanosam_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanosam_jp51.yml?label=nanosam:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanosam_jp51.yml) 
[![`nanosam_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanosam_jp60.yml?label=nanosam:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanosam_jp60.yml) | |     [`sam`](/packages/vit/sam) | [![`sam_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/sam_jp51.yml?label=sam:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/sam_jp51.yml) [![`sam_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/sam_jp60.yml?label=sam:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/sam_jp60.yml) | |     [`tam`](/packages/vit/tam) | [![`tam_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tam_jp51.yml?label=tam:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tam_jp51.yml) [![`tam_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tam_jp60.yml?label=tam:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tam_jp60.yml) | +| **`WYOMING`** | | +|     [`wyoming-openwakeword:latest`](/packages/smart-home/wyoming/openwakeword) | | From d1573a3e8d7ba3fef36ebb23a7391e60eaf64db7 Mon Sep 17 00:00:00 2001 From: Dustin Franklin Date: Mon, 15 Apr 2024 11:13:56 -0400 Subject: [PATCH 2/2] updated docs --- packages/arrow/README.md | 28 ++-- packages/audio/audiocraft/README.md | 22 +-- packages/audio/faster-whisper/README.md | 20 +-- packages/audio/piper-tts/README.md | 20 +-- packages/audio/riva-client/README.md | 34 ++--- packages/audio/whisper/README.md | 22 +-- packages/audio/whisperx/README.md | 22 +-- packages/audio/xtts/README.md | 53 +++++++ packages/build/bazel/README.md | 22 +-- packages/build/build-essential/README.md | 24 +-- packages/build/cmake/cmake_apt/README.md | 20 +-- packages/build/cmake/cmake_pip/README.md | 24 +-- packages/build/docker/README.md | 20 +-- packages/build/nodejs/README.md 
| 20 +-- .../build/protobuf/protobuf_apt/README.md | 20 +-- .../build/protobuf/protobuf_cpp/README.md | 20 +-- packages/build/python/README.md | 62 +++++--- packages/build/rust/README.md | 24 +-- packages/ctranslate2/README.md | 20 +-- packages/cuda/cuda-python/README.md | 39 +++-- packages/cuda/cuda/README.md | 60 +++++--- packages/cuda/cudnn/README.md | 38 +++-- packages/cuda/cupy/README.md | 22 +-- packages/cuda/pycuda/README.md | 22 +-- packages/deepstream/README.md | 22 +-- .../stable-diffusion-webui/README.md | 22 +-- packages/diffusion/stable-diffusion/README.md | 22 +-- packages/ffmpeg/README.md | 54 +++++++ packages/gstreamer/README.md | 24 +-- packages/jetson-inference/README.md | 24 +-- packages/jetson-utils/README.md | 22 +-- packages/jupyterlab/README.md | 22 +-- packages/l4t/l4t-diffusion/README.md | 22 +-- packages/l4t/l4t-ml/README.md | 22 +-- packages/l4t/l4t-pytorch/README.md | 22 +-- packages/l4t/l4t-tensorflow/README.md | 26 ++-- packages/l4t/l4t-text-generation/README.md | 26 ++-- packages/llm/auto_awq/README.md | 23 +-- packages/llm/auto_gptq/README.md | 27 ++-- packages/llm/awq/README.md | 33 ++--- packages/llm/bitsandbytes/README.md | 31 ++-- packages/llm/exllama/README.md | 39 +++-- packages/llm/flash-attention/README.md | 52 +++++++ packages/llm/gptq-for-llama/README.md | 23 ++- packages/llm/huggingface_hub/README.md | 24 +-- packages/llm/langchain/README.md | 26 ++-- packages/llm/llama_cpp/README.md | 34 ++--- packages/llm/llamaspeak/README.md | 22 +-- packages/llm/llava/README.md | 22 +-- packages/llm/local_llm/README.md | 28 ++-- packages/llm/minigpt4/README.md | 22 +-- packages/llm/mlc/README.md | 137 ++---------------- packages/llm/nano_llm/README.md | 9 +- packages/llm/ollama/README.md | 20 +-- packages/llm/openai/README.md | 54 +++++++ packages/llm/optimum/README.md | 22 +-- packages/llm/tensorrt_llm/README.md | 78 ++++++++++ .../llm/text-generation-inference/README.md | 22 +-- packages/llm/text-generation-webui/README.md | 33 +++-- 
packages/llm/transformers/README.md | 32 ++-- packages/llm/xformers/README.md | 22 +-- packages/nemo/README.md | 22 +-- packages/numba/README.md | 22 +-- packages/numpy/README.md | 24 +-- packages/onnx/README.md | 26 ++-- packages/onnxruntime/README.md | 71 +++++++-- packages/openai-triton/README.md | 29 ++-- packages/opencv/README.md | 43 +++--- packages/opencv/opencv_builder/README.md | 26 ++-- packages/pytorch/README.md | 114 ++++++--------- packages/pytorch/torch2trt/README.md | 24 +-- packages/pytorch/torch_tensorrt/README.md | 22 +-- packages/pytorch/torchaudio/README.md | 59 ++++++-- packages/pytorch/torchvision/README.md | 53 +++++-- packages/rapids/cudf/README.md | 26 ++-- packages/rapids/cuml/README.md | 22 +-- packages/rapids/raft/README.md | 20 +-- packages/realsense/README.md | 22 +-- packages/ros/README.md | 90 ++++++------ .../smart-home/homeassistant-base/README.md | 20 +-- .../smart-home/homeassistant-core/README.md | 28 ++-- .../smart-home/wyoming/openwakeword/README.md | 20 +-- packages/tensorflow/README.md | 26 ++-- packages/tensorrt/README.md | 38 ++--- packages/tritonserver/README.md | 22 +-- packages/tvm/README.md | 20 +-- packages/vectordb/faiss/README.md | 58 ++++---- packages/vectordb/faiss_lite/README.md | 22 +-- packages/vectordb/nanodb/README.md | 24 +-- packages/vit/efficientvit/README.md | 22 +-- packages/vit/nanoowl/README.md | 22 +-- packages/vit/nanosam/README.md | 22 +-- packages/vit/sam/README.md | 22 +-- packages/vit/tam/README.md | 22 +-- packages/zed/README.md | 22 +-- 95 files changed, 1638 insertions(+), 1330 deletions(-) create mode 100644 packages/audio/xtts/README.md create mode 100644 packages/ffmpeg/README.md create mode 100644 packages/llm/flash-attention/README.md create mode 100644 packages/llm/openai/README.md create mode 100644 packages/llm/tensorrt_llm/README.md diff --git a/packages/arrow/README.md b/packages/arrow/README.md index 16064d079..fdc6bcc42 100644 --- a/packages/arrow/README.md +++ 
b/packages/arrow/README.md @@ -9,22 +9,22 @@ | **`arrow:14.0.1`** | | | :-- | :-- | |    Aliases | `arrow` | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | installed under `/usr/local` | | **`arrow:12.0.1`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | installed under `/usr/local` | | **`arrow:5.0.0`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | installed under `/usr/local` | @@ -34,27 +34,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag arrow) +jetson-containers run $(autotag arrow) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host arrow:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (it sets `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag arrow) +jetson-containers run -v /path/on/host:/path/in/container $(autotag arrow) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag arrow) my_app --abc xyz +jetson-containers run $(autotag arrow) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -62,7 +62,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh arrow +jetson-containers build arrow ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/audio/audiocraft/README.md b/packages/audio/audiocraft/README.md index e0f68c991..1ff702719 100644 --- a/packages/audio/audiocraft/README.md +++ b/packages/audio/audiocraft/README.md @@ -10,8 +10,8 @@ docs.md | **`audiocraft`** | | | :-- | :-- | |    Builds | [![`audiocraft_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/audiocraft_jp51.yml?label=audiocraft:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/audiocraft_jp51.yml) [![`audiocraft_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/audiocraft_jp60.yml?label=audiocraft:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/audiocraft_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:distributed`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torchaudio`](/packages/pytorch/torchaudio) [`opencv`](/packages/opencv) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`xformers`](/packages/llm/xformers) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) [`jupyterlab`](/packages/jupyterlab) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torchaudio`](/packages/pytorch/torchaudio) [`opencv`](/packages/opencv) [`huggingface_hub`](/packages/llm/huggingface_hub) 
[`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`xformers`](/packages/llm/xformers) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) [`jupyterlab`](/packages/jupyterlab) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/audiocraft:r35.2.1`](https://hub.docker.com/r/dustynv/audiocraft/tags) `(2023-11-05, 10.7GB)`
[`dustynv/audiocraft:r35.3.1`](https://hub.docker.com/r/dustynv/audiocraft/tags) `(2024-03-07, 7.1GB)`
[`dustynv/audiocraft:r35.4.1`](https://hub.docker.com/r/dustynv/audiocraft/tags) `(2024-01-09, 7.0GB)`
[`dustynv/audiocraft:r36.2.0`](https://hub.docker.com/r/dustynv/audiocraft/tags) `(2024-03-07, 8.6GB)` | @@ -37,29 +37,29 @@ docs.md
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag audiocraft) +jetson-containers run $(autotag audiocraft) # or explicitly specify one of the container images above -./run.sh dustynv/audiocraft:r35.3.1 +jetson-containers run dustynv/audiocraft:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/audiocraft:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (it sets `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag audiocraft) +jetson-containers run -v /path/on/host:/path/in/container $(autotag audiocraft) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag audiocraft) my_app --abc xyz +jetson-containers run $(autotag audiocraft) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh audiocraft +jetson-containers build audiocraft ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/audio/faster-whisper/README.md b/packages/audio/faster-whisper/README.md index 46d3d7ec1..9e16dab52 100644 --- a/packages/audio/faster-whisper/README.md +++ b/packages/audio/faster-whisper/README.md @@ -9,8 +9,8 @@ docs.md | **`faster-whisper`** | | | :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`ctranslate2`](/packages/ctranslate2) [`huggingface_hub`](/packages/llm/huggingface_hub) [`numpy`](/packages/numpy) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`ctranslate2`](/packages/ctranslate2) [`huggingface_hub`](/packages/llm/huggingface_hub) [`numpy`](/packages/numpy) | |    Dependants | [`whisperx`](/packages/audio/whisperx) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -20,27 +20,27 @@ docs.md RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag faster-whisper) +jetson-containers run $(autotag faster-whisper) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host faster-whisper:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (it sets `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag faster-whisper) +jetson-containers run -v /path/on/host:/path/in/container $(autotag faster-whisper) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag faster-whisper) my_app --abc xyz +jetson-containers run $(autotag faster-whisper) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -48,7 +48,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh faster-whisper +jetson-containers build faster-whisper ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/audio/piper-tts/README.md b/packages/audio/piper-tts/README.md index 60127772f..29470e91c 100644 --- a/packages/audio/piper-tts/README.md +++ b/packages/audio/piper-tts/README.md @@ -9,7 +9,7 @@ | **`piper-tts`** | | | :-- | :-- | |    Requires | `L4T ['>=32.6']` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda:12.2`](/packages/cuda/cuda) [`cudnn:8.9`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/piper-tts:r35.4.1`](https://hub.docker.com/r/dustynv/piper-tts/tags) `(2024-04-07, 5.5GB)`
[`dustynv/piper-tts:r36.2.0`](https://hub.docker.com/r/dustynv/piper-tts/tags) `(2024-04-07, 6.7GB)` | @@ -33,29 +33,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag piper-tts) +jetson-containers run $(autotag piper-tts) # or explicitly specify one of the container images above -./run.sh dustynv/piper-tts:r36.2.0 +jetson-containers run dustynv/piper-tts:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/piper-tts:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (it sets `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag piper-tts) +jetson-containers run -v /path/on/host:/path/in/container $(autotag piper-tts) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag piper-tts) my_app --abc xyz +jetson-containers run $(autotag piper-tts) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -63,7 +63,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh piper-tts +jetson-containers build piper-tts ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/audio/riva-client/README.md b/packages/audio/riva-client/README.md index 3aa5059dc..9d535d405 100644 --- a/packages/audio/riva-client/README.md +++ b/packages/audio/riva-client/README.md @@ -62,8 +62,8 @@ To feed the live ASR transcript into the TTS and have it speak your words back t | **`riva-client:cpp`** | | | :-- | :-- | |    Builds | [![`riva-client-cpp_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/riva-client-cpp_jp51.yml?label=riva-client-cpp:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/riva-client-cpp_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`bazel`](/packages/bazel) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`bazel`](/packages/build/bazel) | |    Dockerfile | [`Dockerfile.cpp`](Dockerfile.cpp) | |    Images | [`dustynv/riva-client:cpp-r35.2.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2023-08-29, 6.3GB)`
[`dustynv/riva-client:cpp-r35.3.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2024-02-24, 6.3GB)`
[`dustynv/riva-client:cpp-r35.4.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2023-10-07, 6.3GB)` | |    Notes | https://github.com/nvidia-riva/cpp-clients | @@ -71,11 +71,11 @@ To feed the live ASR transcript into the TTS and have it speak your words back t | **`riva-client:python`** | | | :-- | :-- | |    Builds | [![`riva-client-python_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/riva-client-python_jp60.yml?label=riva-client-python:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/riva-client-python_jp60.yml) [![`riva-client-python_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/riva-client-python_jp51.yml?label=riva-client-python:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/riva-client-python_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | -|    Dependants | [`llamaspeak`](/packages/llm/llamaspeak) [`local_llm`](/packages/llm/local_llm) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | +|    Dependants | [`llamaspeak`](/packages/llm/llamaspeak) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) | |    Dockerfile | [`Dockerfile.python`](Dockerfile.python) | -|    Images | [`dustynv/riva-client:python-r35.2.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2023-09-07, 5.0GB)`
[`dustynv/riva-client:python-r35.3.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2024-02-24, 5.0GB)`
[`dustynv/riva-client:python-r35.4.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2023-10-07, 5.0GB)`
[`dustynv/riva-client:python-r36.2.0`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2024-02-24, 0.2GB)` | +|    Images | [`dustynv/riva-client:python-r35.2.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2023-09-07, 5.0GB)`
[`dustynv/riva-client:python-r35.3.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2024-02-24, 5.0GB)`
[`dustynv/riva-client:python-r35.4.1`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2023-10-07, 5.0GB)`
[`dustynv/riva-client:python-r36.2.0`](https://hub.docker.com/r/dustynv/riva-client/tags) `(2024-03-11, 0.3GB)` | |    Notes | https://github.com/nvidia-riva/python-clients | @@ -92,7 +92,7 @@ To feed the live ASR transcript into the TTS and have it speak your words back t |   [`dustynv/riva-client:python-r35.2.1`](https://hub.docker.com/r/dustynv/riva-client/tags) | `2023-09-07` | `arm64` | `5.0GB` | |   [`dustynv/riva-client:python-r35.3.1`](https://hub.docker.com/r/dustynv/riva-client/tags) | `2024-02-24` | `arm64` | `5.0GB` | |   [`dustynv/riva-client:python-r35.4.1`](https://hub.docker.com/r/dustynv/riva-client/tags) | `2023-10-07` | `arm64` | `5.0GB` | -|   [`dustynv/riva-client:python-r36.2.0`](https://hub.docker.com/r/dustynv/riva-client/tags) | `2024-02-24` | `arm64` | `0.2GB` | +|   [`dustynv/riva-client:python-r36.2.0`](https://hub.docker.com/r/dustynv/riva-client/tags) | `2024-03-11` | `arm64` | `0.3GB` | |   [`dustynv/riva-client:r35.2.1`](https://hub.docker.com/r/dustynv/riva-client/tags) | `2023-08-10` | `arm64` | `6.3GB` | > Container images are compatible with other minor versions of JetPack/L4T:
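That compatibility note can be pictured as a version comparison: an image is usable when it shares the host's L4T major version and is not newer than the host. This matching logic is a simplified assumption for illustration, not the actual check `autotag` performs:

```bash
# hypothetical sketch of the minor-version compatibility check (logic assumed)
compatible() {
  local host="$1" image="$2"
  [ "${host%%.*}" = "${image%%.*}" ] || return 1   # must share the L4T major version
  # the image version must not exceed the host version (sort -V = version sort)
  [ "$(printf '%s\n%s\n' "$image" "$host" | sort -V | head -n1)" = "$image" ]
}

compatible 35.4.1 35.3.1 && echo "an r35.3.1 image runs on an r35.4.1 host"
compatible 36.2.0 35.3.1 || echo "an r35.3.1 image is not used on an r36.2.0 host"
```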
@@ -104,29 +104,29 @@ To feed the live ASR transcript into the TTS and have it speak your words back t
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag riva-client) +jetson-containers run $(autotag riva-client) # or explicitly specify one of the container images above -./run.sh dustynv/riva-client:cpp-r35.3.1 +jetson-containers run dustynv/riva-client:python-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host dustynv/riva-client:cpp-r35.3.1 +sudo docker run --runtime nvidia -it --rm --network=host dustynv/riva-client:python-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag riva-client) +jetson-containers run -v /path/on/host:/path/in/container $(autotag riva-client) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag riva-client) my_app --abc xyz +jetson-containers run $(autotag riva-client) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -134,7 +134,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh riva-client +jetson-containers build riva-client ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/audio/whisper/README.md b/packages/audio/whisper/README.md index 4d5838b40..ca8f0648e 100644 --- a/packages/audio/whisper/README.md +++ b/packages/audio/whisper/README.md @@ -39,8 +39,8 @@ HTTPS (SSL) connection is needed to allow `ipywebrtc` widget to have access to t | **`whisper`** | | | :-- | :-- | |    Builds | [![`whisper_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/whisper_jp60.yml?label=whisper:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/whisper_jp60.yml) [![`whisper_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/whisper_jp51.yml?label=whisper:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/whisper_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) [`numba`](/packages/numba) [`cudnn`](/packages/cuda/cudnn) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchaudio`](/packages/pytorch/torchaudio) [`rust`](/packages/rust) [`jupyterlab`](/packages/jupyterlab) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`numba`](/packages/numba) [`cudnn`](/packages/cuda/cudnn) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchaudio`](/packages/pytorch/torchaudio) [`rust`](/packages/build/rust) [`jupyterlab`](/packages/jupyterlab) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/whisper:r35.3.1`](https://hub.docker.com/r/dustynv/whisper/tags) `(2024-03-07, 6.0GB)`
[`dustynv/whisper:r35.4.1`](https://hub.docker.com/r/dustynv/whisper/tags) `(2023-12-14, 6.1GB)`
[`dustynv/whisper:r36.2.0`](https://hub.docker.com/r/dustynv/whisper/tags) `(2024-03-03, 7.9GB)` | @@ -65,29 +65,29 @@ HTTPS (SSL) connection is needed to allow `ipywebrtc` widget to have access to t
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag whisper) +jetson-containers run $(autotag whisper) # or explicitly specify one of the container images above -./run.sh dustynv/whisper:r35.3.1 +jetson-containers run dustynv/whisper:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/whisper:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag whisper) +jetson-containers run -v /path/on/host:/path/in/container $(autotag whisper) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag whisper) my_app --abc xyz +jetson-containers run $(autotag whisper) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -95,7 +95,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh whisper +jetson-containers build whisper ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
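The Dependencies rows in the tables above form a chain that the build walks in order; under that reading, the ordering can be sketched with `tsort`. The package pairs below are a hand-picked subset of the whisper chain, chosen for illustration:

```bash
# hypothetical sketch: linearize "dependency dependant" pairs into a build order
printf '%s\n' \
  "build-essential python" \
  "python pytorch" \
  "pytorch torchaudio" \
  "torchaudio whisper" | tsort
# prints: build-essential, python, pytorch, torchaudio, whisper (one per line)
```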
diff --git a/packages/audio/whisperx/README.md b/packages/audio/whisperx/README.md index 483d77e64..d45873e1f 100644 --- a/packages/audio/whisperx/README.md +++ b/packages/audio/whisperx/README.md @@ -10,8 +10,8 @@ docs.md | **`whisperx`** | | | :-- | :-- | |    Builds | [![`whisperx_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/whisperx_jp60.yml?label=whisperx:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/whisperx_jp60.yml) [![`whisperx_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/whisperx_jp51.yml?label=whisperx:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/whisperx_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchaudio`](/packages/pytorch/torchaudio) [`ctranslate2`](/packages/ctranslate2) [`huggingface_hub`](/packages/llm/huggingface_hub) [`faster-whisper`](/packages/audio/faster-whisper) [`torchvision`](/packages/pytorch/torchvision) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchaudio`](/packages/pytorch/torchaudio) [`ctranslate2`](/packages/ctranslate2) [`huggingface_hub`](/packages/llm/huggingface_hub) [`faster-whisper`](/packages/audio/faster-whisper) [`torchvision`](/packages/pytorch/torchvision) [`rust`](/packages/build/rust) 
[`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/whisperx:r35.3.1`](https://hub.docker.com/r/dustynv/whisperx/tags) `(2024-01-19, 6.4GB)`
[`dustynv/whisperx:r36.2.0`](https://hub.docker.com/r/dustynv/whisperx/tags) `(2024-01-19, 8.1GB)` | @@ -35,29 +35,29 @@ docs.md
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag whisperx) +jetson-containers run $(autotag whisperx) # or explicitly specify one of the container images above -./run.sh dustynv/whisperx:r35.3.1 +jetson-containers run dustynv/whisperx:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/whisperx:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag whisperx) +jetson-containers run -v /path/on/host:/path/in/container $(autotag whisperx) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag whisperx) my_app --abc xyz +jetson-containers run $(autotag whisperx) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -65,7 +65,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh whisperx +jetson-containers build whisperx ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/audio/xtts/README.md b/packages/audio/xtts/README.md new file mode 100644 index 000000000..4c4b8c680 --- /dev/null +++ b/packages/audio/xtts/README.md @@ -0,0 +1,53 @@ +# xtts + +> [`CONTAINERS`](#user-content-containers) [`IMAGES`](#user-content-images) [`RUN`](#user-content-run) [`BUILD`](#user-content-build) + +
+CONTAINERS +
+ +| **`xtts`** | | +| :-- | :-- | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`torchaudio`](/packages/pytorch/torchaudio) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | Fork of coqui-ai/TTS with support for quantization and TensorRT | + +
+ +
+RUN CONTAINER +
+
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +```bash +# automatically pull or build a compatible container image +jetson-containers run $(autotag xtts) + +# or if using 'docker run' (specify image and mounts/etc) +sudo docker run --runtime nvidia -it --rm --network=host xtts:35.2.1 + +``` +> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. + +To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: +```bash +jetson-containers run -v /path/on/host:/path/in/container $(autotag xtts) +``` +To launch the container running a command, as opposed to an interactive shell: +```bash +jetson-containers run $(autotag xtts) my_app --abc xyz +``` +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +
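The defaults mentioned in the note above can be pictured as a wrapper that prepends a fixed flag set before forwarding to docker. The exact flags below, including the `/data` mount path, are assumptions for illustration rather than the tool's literal flag list:

```bash
# hypothetical sketch: prepend default flags, then forward the user's arguments
run_with_defaults() {
  local defaults="--runtime nvidia -it --rm --network=host -v /data:/data"   # assumed flag set
  echo "sudo docker run ${defaults} $*"   # print the command instead of executing it
}

run_with_defaults some-image:tag
```
Printing the composed command before executing it, as the real tool does, makes it easy to copy the full `docker run` invocation for debugging.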
+
+BUILD CONTAINER +
+
+If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: +```bash +jetson-containers build xtts +``` +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options. +
diff --git a/packages/build/bazel/README.md b/packages/build/bazel/README.md index ac05e68c2..6e455a9e7 100644 --- a/packages/build/bazel/README.md +++ b/packages/build/bazel/README.md @@ -9,8 +9,8 @@ | **`bazel`** | | | :-- | :-- | |    Builds | [![`bazel_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bazel_jp46.yml?label=bazel:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bazel_jp46.yml) [![`bazel_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bazel_jp51.yml?label=bazel:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bazel_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | |    Dependants | [`riva-client:cpp`](/packages/audio/riva-client) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/bazel:r32.7.1`](https://hub.docker.com/r/dustynv/bazel/tags) `(2023-09-07, 0.4GB)`
[`dustynv/bazel:r35.2.1`](https://hub.docker.com/r/dustynv/bazel/tags) `(2023-09-07, 5.1GB)`
[`dustynv/bazel:r35.3.1`](https://hub.docker.com/r/dustynv/bazel/tags) `(2023-08-29, 5.1GB)`
[`dustynv/bazel:r35.4.1`](https://hub.docker.com/r/dustynv/bazel/tags) `(2023-10-07, 5.1GB)` | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag bazel) +jetson-containers run $(autotag bazel) # or explicitly specify one of the container images above -./run.sh dustynv/bazel:r35.4.1 +jetson-containers run dustynv/bazel:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/bazel:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag bazel) +jetson-containers run -v /path/on/host:/path/in/container $(autotag bazel) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag bazel) my_app --abc xyz +jetson-containers run $(autotag bazel) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh bazel +jetson-containers build bazel ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/build/build-essential/README.md b/packages/build/build-essential/README.md index 21991e6a6..b8c40bda3 100644 --- a/packages/build/build-essential/README.md +++ b/packages/build/build-essential/README.md @@ -9,11 +9,11 @@ | **`build-essential`** | | | :-- | :-- | |    Builds | [![`build-essential_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/build-essential_jp46.yml?label=build-essential:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/build-essential_jp46.yml) [![`build-essential_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/build-essential_jp51.yml?label=build-essential:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/build-essential_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bazel`](/packages/bazel) [`bitsandbytes`](/packages/llm/bitsandbytes) [`cmake:apt`](/packages/cmake/cmake_apt) [`cmake:pip`](/packages/cmake/cmake_pip) [`ctranslate2`](/packages/ctranslate2) [`cuda-python`](/packages/cuda/cuda-python) [`cuda-python:builder`](/packages/cuda/cuda-python) [`cuda:11.4`](/packages/cuda/cuda) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cuda:11.8`](/packages/cuda/cuda) [`cuda:11.8-samples`](/packages/cuda/cuda) [`cuda:12.2`](/packages/cuda/cuda) [`cuda:12.2-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cudnn`](/packages/cuda/cudnn) [`cudnn:8.9`](/packages/cuda/cudnn) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`docker`](/packages/docker) [`efficientvit`](/packages/vit/efficientvit) 
[`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss:be12427`](/packages/vectordb/faiss) [`faiss:be12427-builder`](/packages/vectordb/faiss) [`faiss:v1.7.3`](/packages/vectordb/faiss) [`faiss:v1.7.3-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`huggingface_hub`](/packages/llm/huggingface_hub) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llamaspeak`](/packages/llm/llamaspeak) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) 
[`nodejs`](/packages/nodejs) [`numba`](/packages/numba) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) [`openai`](/packages/llm/openai) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`optimum`](/packages/llm/optimum) [`protobuf:apt`](/packages/protobuf/protobuf_apt) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) [`pycuda`](/packages/cuda/pycuda) [`python`](/packages/python) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`riva-client:cpp`](/packages/audio/riva-client) [`riva-client:python`](/packages/audio/riva-client) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`rust`](/packages/rust) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) 
[`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Requires | `L4T ['>=32.6']` | +|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bazel`](/packages/build/bazel) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`cmake:apt`](/packages/build/cmake/cmake_apt) [`cmake:pip`](/packages/build/cmake/cmake_pip) [`ctranslate2`](/packages/ctranslate2) [`cuda-python:11.4`](/packages/cuda/cuda-python) [`cuda:11.4`](/packages/cuda/cuda) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cuda:11.8`](/packages/cuda/cuda) [`cuda:11.8-samples`](/packages/cuda/cuda) [`cuda:12.2`](/packages/cuda/cuda) [`cuda:12.2-samples`](/packages/cuda/cuda) [`cuda:12.4`](/packages/cuda/cuda) 
[`cuda:12.4-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cudnn`](/packages/cuda/cudnn) [`cudnn:8.9`](/packages/cuda/cudnn) [`cudnn:9.0`](/packages/cuda/cudnn) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`docker`](/packages/build/docker) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss:1.7.3`](/packages/vectordb/faiss) [`faiss:1.7.3-builder`](/packages/vectordb/faiss) [`faiss:1.7.4`](/packages/vectordb/faiss) [`faiss:1.7.4-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`ffmpeg`](/packages/ffmpeg) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`homeassistant-base`](/packages/smart-home/homeassistant-base) [`homeassistant-core:2024.4.2`](/packages/smart-home/homeassistant-core) [`homeassistant-core:latest`](/packages/smart-home/homeassistant-core) [`huggingface_hub`](/packages/llm/huggingface_hub) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) [`llamaspeak`](/packages/llm/llamaspeak) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) 
[`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`nodejs`](/packages/build/nodejs) [`numba`](/packages/numba) [`numpy`](/packages/numpy) [`ollama`](/packages/llm/ollama) [`onnx`](/packages/onnx) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai`](/packages/llm/openai) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`opencv:4.9.0`](/packages/opencv) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`protobuf:apt`](/packages/build/protobuf/protobuf_apt) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) [`pycuda`](/packages/cuda/pycuda) [`python:3.10`](/packages/build/python) [`python:3.11`](/packages/build/python) [`python:3.12`](/packages/build/python) [`python:3.6`](/packages/build/python) [`python:3.8`](/packages/build/python) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`riva-client:cpp`](/packages/audio/riva-client) [`riva-client:python`](/packages/audio/riva-client) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) 
[`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`rust`](/packages/build/rust) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:10.0`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) 
[`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`wyoming-openwakeword:latest`](/packages/smart-home/wyoming/openwakeword) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/build-essential:r32.7.1`](https://hub.docker.com/r/dustynv/build-essential/tags) `(2023-09-07, 0.3GB)`
[`dustynv/build-essential:r35.2.1`](https://hub.docker.com/r/dustynv/build-essential/tags) `(2023-09-07, 4.9GB)`
[`dustynv/build-essential:r35.3.1`](https://hub.docker.com/r/dustynv/build-essential/tags) `(2023-08-29, 4.9GB)`
[`dustynv/build-essential:r35.4.1`](https://hub.docker.com/r/dustynv/build-essential/tags) `(2023-10-07, 4.9GB)` | -|    Notes | installs compilers and build tools | +|    Notes | installs compilers, build tools & configures the default locale | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag build-essential)
+jetson-containers run $(autotag build-essential)

# or explicitly specify one of the container images above
-./run.sh dustynv/build-essential:r35.4.1
+jetson-containers run dustynv/build-essential:r35.4.1

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host dustynv/build-essential:r35.4.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag build-essential) +jetson-containers run -v /path/on/host:/path/in/container $(autotag build-essential) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag build-essential) my_app --abc xyz +jetson-containers run $(autotag build-essential) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
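The compatibility matching that `autotag` performs against your L4T version can be illustrated with a toy shell function (the function name `pick_compatible_tag`, the tag list, and the major-release matching rule are all hypothetical simplifications; the real tool also checks the local image cache and the registry, and can fall back to building):

```bash
# Toy sketch of L4T-compatible tag selection (hypothetical rule):
# a tag like r35.4.1 is treated as compatible when its major L4T
# release matches the version reported by the local system.
pick_compatible_tag() {
  l4t_version="$1"; shift
  major="${l4t_version%%.*}"
  for tag in "$@"; do
    case "$tag" in
      "r${major}."*) echo "$tag"; return 0 ;;
    esac
  done
  return 1   # nothing compatible found
}

# e.g. on a JetPack 5 device reporting L4T R35.4.1:
pick_compatible_tag 35.4.1 r32.7.1 r35.2.1 r35.4.1   # prints r35.2.1 (first match)
```

In this sketch the first matching tag wins; the actual tool applies finer-grained version ordering than a major-release prefix check.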
BUILD CONTAINER

@@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh build-essential
+jetson-containers build build-essential
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
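How the dependencies get layered can be sketched roughly in shell (the three-package graph and the helper names `deps_of`/`build_chain` are hypothetical; the real tool derives the chain from each package's config, building every stage's Dockerfile on top of the previous image):

```bash
# Rough sketch of dependency-chained builds (hypothetical package graph).
# Each package is built FROM the image of its dependency, so requesting
# cmake_pip first builds build-essential, then python, then cmake_pip.
deps_of() {
  case "$1" in
    cmake_pip) echo "python" ;;
    python)    echo "build-essential" ;;
    *)         echo "" ;;
  esac
}

build_chain() {   # prints packages in build order, dependencies first
  for dep in $(deps_of "$1"); do
    build_chain "$dep"
  done
  echo "$1"
}

build_chain cmake_pip   # build-essential, then python, then cmake_pip
```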
diff --git a/packages/build/cmake/cmake_apt/README.md b/packages/build/cmake/cmake_apt/README.md index 1720d65da..c9d217c31 100644 --- a/packages/build/cmake/cmake_apt/README.md +++ b/packages/build/cmake/cmake_apt/README.md @@ -9,8 +9,8 @@ | **`cmake:apt`** | | | :-- | :-- | |    Builds | [![`cmake-apt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-apt_jp51.yml?label=cmake-apt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-apt_jp51.yml) [![`cmake-apt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-apt_jp46.yml?label=cmake-apt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-apt_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | |    Dependants | [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/cmake:apt-r32.7.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-09-07, 0.3GB)`
[`dustynv/cmake:apt-r35.2.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-08-29, 4.9GB)`
[`dustynv/cmake:apt-r35.3.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-09-07, 5.0GB)`
[`dustynv/cmake:apt-r35.4.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-10-07, 4.9GB)` | @@ -22,27 +22,27 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag cmake_apt)
+jetson-containers run $(autotag cmake_apt)

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host cmake_apt:35.2.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cmake_apt) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cmake_apt) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cmake_apt) my_app --abc xyz +jetson-containers run $(autotag cmake_apt) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -50,7 +50,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh cmake_apt
+jetson-containers build cmake_apt
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/build/cmake/cmake_pip/README.md b/packages/build/cmake/cmake_pip/README.md index 4a74a066e..d4da06c44 100644 --- a/packages/build/cmake/cmake_pip/README.md +++ b/packages/build/cmake/cmake_pip/README.md @@ -10,12 +10,12 @@ | :-- | :-- | |    Aliases | `cmake` | |    Builds | [![`cmake-pip_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-pip_jp51.yml?label=cmake-pip:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-pip_jp51.yml) [![`cmake-pip_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cmake-pip_jp46.yml?label=cmake-pip:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cmake-pip_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | -|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`ctranslate2`](/packages/ctranslate2) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cuda:11.8-samples`](/packages/cuda/cuda) [`cuda:12.2-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss:be12427`](/packages/vectordb/faiss) [`faiss:be12427-builder`](/packages/vectordb/faiss) [`faiss:v1.7.3`](/packages/vectordb/faiss) [`faiss:v1.7.3-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) 
[`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`optimum`](/packages/llm/optimum) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) 
[`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | +|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) 
[`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`ctranslate2`](/packages/ctranslate2) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cuda:11.8-samples`](/packages/cuda/cuda) [`cuda:12.2-samples`](/packages/cuda/cuda) [`cuda:12.4-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss:1.7.3`](/packages/vectordb/faiss) [`faiss:1.7.3-builder`](/packages/vectordb/faiss) [`faiss:1.7.4`](/packages/vectordb/faiss) [`faiss:1.7.4-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnx`](/packages/onnx) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) 
[`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) 
[`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) [`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/cmake:pip-r32.7.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-12-06, 0.4GB)`
[`dustynv/cmake:pip-r35.2.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-09-07, 5.0GB)`
[`dustynv/cmake:pip-r35.3.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-12-05, 5.0GB)`
[`dustynv/cmake:pip-r35.4.1`](https://hub.docker.com/r/dustynv/cmake/tags) `(2023-10-07, 5.0GB)` | -|    Notes | upgrade cmake with pip | +|    Notes | upgrade `cmake` with `pip` | @@ -23,27 +23,27 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag cmake_pip)
+jetson-containers run $(autotag cmake_pip)

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host cmake_pip:35.2.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cmake_pip) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cmake_pip) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cmake_pip) my_app --abc xyz +jetson-containers run $(autotag cmake_pip) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -51,7 +51,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh cmake_pip
+jetson-containers build cmake_pip
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/build/docker/README.md b/packages/build/docker/README.md index d0011c027..6eab3846b 100644 --- a/packages/build/docker/README.md +++ b/packages/build/docker/README.md @@ -13,8 +13,8 @@ This approach works with `--runtime nvidia` and access to the GPU. Note that if | **`docker`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | |    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -24,27 +24,27 @@ This approach works with `--runtime nvidia` and access to the GPU. Note that if RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag docker)
+jetson-containers run $(autotag docker)

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host docker:35.2.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag docker) +jetson-containers run -v /path/on/host:/path/in/container $(autotag docker) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag docker) my_app --abc xyz +jetson-containers run $(autotag docker) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -52,7 +52,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed.  To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh docker
+jetson-containers build docker
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
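To make the notes above concrete, here is an illustrative sketch (not output generated by the tool) of the kind of `docker run` command that `jetson-containers run` prints before executing; the host-side `/data` path is a placeholder, and the detected devices vary per system:

```shell
# Illustrative only: approximates the command `jetson-containers run` constructs,
# per the defaults described above (--runtime nvidia, /data cache mount).
# The host-side data path below is a placeholder, not an actual repo path.
IMAGE="docker:35.2.1"
CMD="sudo docker run --runtime nvidia -it --rm --network=host \
--volume /path/to/jetson-containers/data:/data \
$IMAGE"
echo "$CMD"
```

Running the real helper prints the exact command it assembled, so you can copy and adapt it by hand if needed.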
diff --git a/packages/build/nodejs/README.md b/packages/build/nodejs/README.md index d39e983eb..408733865 100644 --- a/packages/build/nodejs/README.md +++ b/packages/build/nodejs/README.md @@ -8,8 +8,8 @@ | **`nodejs`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | installs `nodejs`, `npm` | @@ -19,27 +19,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag nodejs)
+jetson-containers run $(autotag nodejs)

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host nodejs:35.2.1
```

-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`), mounts a `/data` cache, and detects devices
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag nodejs)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag nodejs)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag nodejs) my_app --abc xyz
+jetson-containers run $(autotag nodejs) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -47,7 +47,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed.  To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh nodejs
+jetson-containers build nodejs
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/build/protobuf/protobuf_apt/README.md b/packages/build/protobuf/protobuf_apt/README.md index 345cecc84..f4160300a 100644 --- a/packages/build/protobuf/protobuf_apt/README.md +++ b/packages/build/protobuf/protobuf_apt/README.md @@ -9,8 +9,8 @@ | **`protobuf:apt`** | | | :-- | :-- | |    Builds | [![`protobuf-apt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-apt_jp46.yml?label=protobuf-apt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-apt_jp46.yml) [![`protobuf-apt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-apt_jp51.yml?label=protobuf-apt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-apt_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | |    Dependants | [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/protobuf:apt-r32.7.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-12-06, 0.4GB)`
[`dustynv/protobuf:apt-r35.2.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-12-06, 5.0GB)`
[`dustynv/protobuf:apt-r35.3.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-08-29, 5.0GB)`
[`dustynv/protobuf:apt-r35.4.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-10-07, 5.0GB)` | @@ -22,27 +22,27 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag protobuf_apt)
+jetson-containers run $(autotag protobuf_apt)

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host protobuf_apt:35.2.1
```

-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`), mounts a `/data` cache, and detects devices
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag protobuf_apt)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag protobuf_apt)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag protobuf_apt) my_app --abc xyz
+jetson-containers run $(autotag protobuf_apt) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -50,7 +50,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed.  To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh protobuf_apt
+jetson-containers build protobuf_apt
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/build/protobuf/protobuf_cpp/README.md b/packages/build/protobuf/protobuf_cpp/README.md index 9a142a853..1a4460555 100644 --- a/packages/build/protobuf/protobuf_cpp/README.md +++ b/packages/build/protobuf/protobuf_cpp/README.md @@ -10,8 +10,8 @@ | :-- | :-- | |    Aliases | `protobuf` | |    Builds | [![`protobuf-cpp_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-cpp_jp51.yml?label=protobuf-cpp:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-cpp_jp51.yml) [![`protobuf-cpp_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/protobuf-cpp_jp46.yml?label=protobuf-cpp:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/protobuf-cpp_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | |    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/protobuf:cpp-r32.7.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-12-06, 0.5GB)`
[`dustynv/protobuf:cpp-r35.2.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-12-06, 5.1GB)`
[`dustynv/protobuf:cpp-r35.3.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-08-29, 5.1GB)`
[`dustynv/protobuf:cpp-r35.4.1`](https://hub.docker.com/r/dustynv/protobuf/tags) `(2023-10-07, 5.1GB)` | @@ -23,27 +23,27 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag protobuf_cpp)
+jetson-containers run $(autotag protobuf_cpp)

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host protobuf_cpp:35.2.1
```

-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`), mounts a `/data` cache, and detects devices
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag protobuf_cpp)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag protobuf_cpp)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag protobuf_cpp) my_app --abc xyz
+jetson-containers run $(autotag protobuf_cpp) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -51,7 +51,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed.  To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh protobuf_cpp
+jetson-containers build protobuf_cpp
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/build/python/README.md b/packages/build/python/README.md index 50016a5b8..1ecfa8bff 100644 --- a/packages/build/python/README.md +++ b/packages/build/python/README.md @@ -6,16 +6,44 @@ CONTAINERS
-| **`python`** | | +| **`python:3.6`** | | | :-- | :-- | -|    Aliases | `python3` | -|    Builds | [![`python_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/python_jp51.yml?label=python:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/python_jp51.yml) [![`python_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/python_jp46.yml?label=python:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/python_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) | -|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`cmake:pip`](/packages/cmake/cmake_pip) [`ctranslate2`](/packages/ctranslate2) [`cuda-python`](/packages/cuda/cuda-python) [`cuda-python:builder`](/packages/cuda/cuda-python) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cuda:11.8-samples`](/packages/cuda/cuda) [`cuda:12.2-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`docker`](/packages/docker) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss:be12427`](/packages/vectordb/faiss) [`faiss:be12427-builder`](/packages/vectordb/faiss) [`faiss:v1.7.3`](/packages/vectordb/faiss) [`faiss:v1.7.3-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) 
[`huggingface_hub`](/packages/llm/huggingface_hub) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llamaspeak`](/packages/llm/llamaspeak) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`numba`](/packages/numba) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) [`openai`](/packages/llm/openai) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`optimum`](/packages/llm/optimum) 
[`protobuf:apt`](/packages/protobuf/protobuf_apt) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) [`pycuda`](/packages/cuda/pycuda) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`riva-client:python`](/packages/audio/riva-client) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`rust`](/packages/rust) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) 
[`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/python:r32.7.1`](https://hub.docker.com/r/dustynv/python/tags) `(2023-12-05, 0.4GB)`
[`dustynv/python:r35.2.1`](https://hub.docker.com/r/dustynv/python/tags) `(2023-12-05, 5.0GB)`
[`dustynv/python:r35.3.1`](https://hub.docker.com/r/dustynv/python/tags) `(2023-09-07, 5.0GB)`
[`dustynv/python:r35.4.1`](https://hub.docker.com/r/dustynv/python/tags) `(2023-10-07, 4.9GB)` | -|    Notes | installs core python3 packages and pip | +|    Notes | installs core `python3` packages and `pip` | + +| **`python:3.8`** | | +| :-- | :-- | +|    Aliases | `python` | +|    Requires | `L4T ['<36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`cmake:pip`](/packages/build/cmake/cmake_pip) [`ctranslate2`](/packages/ctranslate2) [`cuda-python:11.4`](/packages/cuda/cuda-python) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cuda:11.8-samples`](/packages/cuda/cuda) [`cuda:12.2-samples`](/packages/cuda/cuda) [`cuda:12.4-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`docker`](/packages/build/docker) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss:1.7.3`](/packages/vectordb/faiss) [`faiss:1.7.3-builder`](/packages/vectordb/faiss) [`faiss:1.7.4`](/packages/vectordb/faiss) [`faiss:1.7.4-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`huggingface_hub`](/packages/llm/huggingface_hub) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) 
[`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) [`llamaspeak`](/packages/llm/llamaspeak) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`numba`](/packages/numba) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai`](/packages/llm/openai) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`opencv:4.9.0`](/packages/opencv) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`protobuf:apt`](/packages/build/protobuf/protobuf_apt) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) [`pycuda`](/packages/cuda/pycuda) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) 
[`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`riva-client:python`](/packages/audio/riva-client) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`rust`](/packages/build/rust) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:10.0`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) 
[`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) [`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | installs core `python3` packages and `pip` | + +| **`python:3.10`** | | +| :-- | :-- | +|    Requires | `L4T ['>=34']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | installs core `python3` packages and `pip` | + +| **`python:3.11`** | | +| :-- | :-- | +|    Requires | `L4T ['>=34']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`wyoming-openwakeword:latest`](/packages/smart-home/wyoming/openwakeword) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | installs core `python3` packages and `pip` | + +| **`python:3.12`** | | +| :-- | :-- | +|    Requires | `L4T ['>=34']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`homeassistant-core:2024.4.2`](/packages/smart-home/homeassistant-core) [`homeassistant-core:latest`](/packages/smart-home/homeassistant-core) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | installs core `python3` packages and `pip` | @@ 
-39,29 +67,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag python)
+jetson-containers run $(autotag python)

# or explicitly specify one of the container images above
-./run.sh dustynv/python:r32.7.1
+jetson-containers run dustynv/python:r32.7.1

# or if using 'docker run' (specify image and mounts/etc)
sudo docker run --runtime nvidia -it --rm --network=host dustynv/python:r32.7.1
```

-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`), mounts a `/data` cache, and detects devices
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag python)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag python)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag python) my_app --abc xyz
+jetson-containers run $(autotag python) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -69,7 +97,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed.  To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh python
+jetson-containers build python
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
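The `Requires` rows in the `python` tables above partition the variants by L4T version. As a rough sketch (this is not the actual jetson-containers resolver, and `l4t_major` would really come from the device's L4T version), selecting the compatible variants for a given L4T major release looks like:

```shell
# Illustrative sketch of the L4T compatibility rules from the tables above;
# not the real package resolver. l4t_major=35 stands in for a JetPack 5 device.
l4t_major=35
variants=""
if [ "$l4t_major" -eq 32 ]; then variants="$variants python:3.6"; fi                           # L4T ==32.*
if [ "$l4t_major" -lt 36 ]; then variants="$variants python:3.8"; fi                           # L4T <36
if [ "$l4t_major" -ge 34 ]; then variants="$variants python:3.10 python:3.11 python:3.12"; fi  # L4T >=34
echo "compatible:$variants"
```

On an L4T r35 device this would report `python:3.8` (the default, aliased to `python`) plus `python:3.10`, `python:3.11`, and `python:3.12`, while `python:3.6` is limited to L4T r32.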
diff --git a/packages/build/rust/README.md b/packages/build/rust/README.md index 32b79f842..a1cc24b4d 100644 --- a/packages/build/rust/README.md +++ b/packages/build/rust/README.md @@ -9,9 +9,9 @@ | **`rust`** | | | :-- | :-- | |    Builds | [![`rust_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/rust_jp51.yml?label=rust:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/rust_jp51.yml) [![`rust_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/rust_jp46.yml?label=rust:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/rust_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) 
[`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) 
[`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/rust:r32.7.1`](https://hub.docker.com/r/dustynv/rust/tags) `(2023-12-05, 0.7GB)`
[`dustynv/rust:r35.2.1`](https://hub.docker.com/r/dustynv/rust/tags) `(2023-12-06, 5.3GB)`
[`dustynv/rust:r35.3.1`](https://hub.docker.com/r/dustynv/rust/tags) `(2023-08-29, 5.3GB)`
[`dustynv/rust:r35.4.1`](https://hub.docker.com/r/dustynv/rust/tags) `(2023-10-07, 5.2GB)` | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag rust) +jetson-containers run $(autotag rust) # or explicitly specify one of the container images above -./run.sh dustynv/rust:r35.2.1 +jetson-containers run dustynv/rust:r35.2.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/rust:r35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag rust)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag rust)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag rust) my_app --abc xyz
+jetson-containers run $(autotag rust) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh rust
+jetson-containers build rust
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/ctranslate2/README.md b/packages/ctranslate2/README.md index 2b2e28a64..966d6ff37 100644 --- a/packages/ctranslate2/README.md +++ b/packages/ctranslate2/README.md @@ -9,8 +9,8 @@ docs.md | **`ctranslate2`** | | | :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dependants | [`faster-whisper`](/packages/audio/faster-whisper) [`whisperx`](/packages/audio/whisperx) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -20,27 +20,27 @@ docs.md RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag ctranslate2) +jetson-containers run $(autotag ctranslate2) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host ctranslate2:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag ctranslate2)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag ctranslate2)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag ctranslate2) my_app --abc xyz
+jetson-containers run $(autotag ctranslate2) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -48,7 +48,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh ctranslate2
+jetson-containers build ctranslate2
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/cuda/cuda-python/README.md b/packages/cuda/cuda-python/README.md index 9b9f3a623..36f1a53c9 100644 --- a/packages/cuda/cuda-python/README.md +++ b/packages/cuda/cuda-python/README.md @@ -6,20 +6,13 @@ CONTAINERS
-| **`cuda-python:builder`** | | +| **`cuda-python:11.4`** | | | :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | - -| **`cuda-python`** | | -| :-- | :-- | -|    Builds | [![`cuda-python_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuda-python_jp60.yml?label=cuda-python:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuda-python_jp60.yml) [![`cuda-python_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuda-python_jp51.yml?label=cuda-python:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuda-python_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) | -|    Dependants | [`faiss_lite`](/packages/vectordb/faiss_lite) [`local_llm`](/packages/llm/local_llm) [`nanodb`](/packages/vectordb/nanodb) [`raft`](/packages/rapids/raft) | +|    Aliases | `cuda-python` | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | +|    Dependants | [`faiss_lite`](/packages/vectordb/faiss_lite) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`raft`](/packages/rapids/raft) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) | |    Dockerfile | [`Dockerfile`](Dockerfile) 
| -|    Images | [`dustynv/cuda-python:r35.2.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) `(2023-12-06, 5.0GB)`
[`dustynv/cuda-python:r35.3.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) `(2023-08-29, 5.0GB)`
[`dustynv/cuda-python:r35.4.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) `(2023-12-06, 5.0GB)`
[`dustynv/cuda-python:r36.2.0`](https://hub.docker.com/r/dustynv/cuda-python/tags) `(2023-12-06, 3.5GB)` | @@ -29,6 +22,8 @@ | Repository/Tag | Date | Arch | Size | | :-- | :--: | :--: | :--: | +|   [`dustynv/cuda-python:builder-r35.4.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) | `2024-03-26` | `arm64` | `5.1GB` | +|   [`dustynv/cuda-python:builder-r36.2.0`](https://hub.docker.com/r/dustynv/cuda-python/tags) | `2024-03-11` | `arm64` | `3.6GB` | |   [`dustynv/cuda-python:r35.2.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) | `2023-12-06` | `arm64` | `5.0GB` | |   [`dustynv/cuda-python:r35.3.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) | `2023-08-29` | `arm64` | `5.0GB` | |   [`dustynv/cuda-python:r35.4.1`](https://hub.docker.com/r/dustynv/cuda-python/tags) | `2023-12-06` | `arm64` | `5.0GB` | @@ -43,29 +38,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag cuda-python) +jetson-containers run $(autotag cuda-python) # or explicitly specify one of the container images above -./run.sh dustynv/cuda-python:r36.2.0 +jetson-containers run dustynv/cuda-python:builder-r35.4.1 # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host dustynv/cuda-python:r36.2.0 +sudo docker run --runtime nvidia -it --rm --network=host dustynv/cuda-python:builder-r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag cuda-python)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag cuda-python)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag cuda-python) my_app --abc xyz
+jetson-containers run $(autotag cuda-python) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -73,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh cuda-python
+jetson-containers build cuda-python
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run `jetson-containers build` with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/cuda/cuda/README.md b/packages/cuda/cuda/README.md index dfe91042c..8d36810f7 100644 --- a/packages/cuda/cuda/README.md +++ b/packages/cuda/cuda/README.md @@ -9,48 +9,62 @@ | **`cuda:12.2`** | | | :-- | :-- | |    Builds | [![`cuda-122_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuda-122_jp60.yml?label=cuda-122:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuda-122_jp60.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) | -|    Dependants | [`cuda:12.2-samples`](/packages/cuda/cuda) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`cuda:12.2-samples`](/packages/cuda/cuda) [`cudnn:8.9`](/packages/cuda/cudnn) [`tensorrt:8.6`](/packages/tensorrt) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/cuda:12.2-r36.2.0`](https://hub.docker.com/r/dustynv/cuda/tags) `(2023-12-05, 3.4GB)`
[`dustynv/cuda:12.2-samples-r36.2.0`](https://hub.docker.com/r/dustynv/cuda/tags) `(2023-12-07, 4.8GB)` | +| **`cuda:12.4`** | | +| :-- | :-- | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`cuda:12.4-samples`](/packages/cuda/cuda) [`cudnn:9.0`](/packages/cuda/cudnn) [`tensorrt:10.0`](/packages/tensorrt) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + | **`cuda:12.2-samples`** | | | :-- | :-- | |    Builds | [![`cuda-122-samples_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuda-122-samples_jp60.yml?label=cuda-122-samples:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuda-122-samples_jp60.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda:12.2`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.2`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.samples`](Dockerfile.samples) | |    Images | [`dustynv/cuda:12.2-samples-r36.2.0`](https://hub.docker.com/r/dustynv/cuda/tags) `(2023-12-07, 4.8GB)` | |    Notes | CUDA samples from https://github.com/NVIDIA/cuda-samples installed under /opt/cuda-samples | +| **`cuda:12.4-samples`** | | +| :-- | :-- | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.4`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | +|    Dockerfile | [`Dockerfile.samples`](Dockerfile.samples) | +|    Notes | CUDA samples from https://github.com/NVIDIA/cuda-samples installed under /opt/cuda-samples | + | **`cuda:11.8`** | | | :-- | :-- | -|    Requires | `L4T ==35.*` | -|    
Dependencies | [`build-essential`](/packages/build-essential) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | |    Dependants | [`cuda:11.8-samples`](/packages/cuda/cuda) | |    Dockerfile | [`Dockerfile`](Dockerfile) | | **`cuda:11.8-samples`** | | | :-- | :-- | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda:11.8`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.8`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.samples`](Dockerfile.samples) | |    Notes | CUDA samples from https://github.com/NVIDIA/cuda-samples installed under /opt/cuda-samples | | **`cuda:11.4`** | | | :-- | :-- | |    Aliases | `cuda` | -|    Requires | `L4T <36` | -|    Dependencies | [`build-essential`](/packages/build-essential) | -|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`cuda-python`](/packages/cuda/cuda-python) [`cuda-python:builder`](/packages/cuda/cuda-python) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cudnn`](/packages/cuda/cudnn) [`cudnn:8.9`](/packages/cuda/cudnn) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss:be12427`](/packages/vectordb/faiss) 
[`faiss:be12427-builder`](/packages/vectordb/faiss) [`faiss:v1.7.3`](/packages/vectordb/faiss) [`faiss:v1.7.3-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`numba`](/packages/numba) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) 
[`optimum`](/packages/llm/optimum) [`pycuda`](/packages/cuda/pycuda) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) 
[`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Requires | `L4T ['<36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`arrow:12.0.1`](/packages/arrow) [`arrow:14.0.1`](/packages/arrow) [`arrow:5.0.0`](/packages/arrow) [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`cuda-python:11.4`](/packages/cuda/cuda-python) [`cuda:11.4-samples`](/packages/cuda/cuda) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cudnn`](/packages/cuda/cudnn) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss:1.7.3`](/packages/vectordb/faiss) [`faiss:1.7.3-builder`](/packages/vectordb/faiss) [`faiss:1.7.4`](/packages/vectordb/faiss) [`faiss:1.7.4-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) 
[`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`numba`](/packages/numba) [`ollama`](/packages/llm/ollama) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`opencv:4.9.0`](/packages/opencv) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`pycuda`](/packages/cuda/pycuda) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`realsense`](/packages/realsense) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) 
[`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) [`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) 
[`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | |    Dockerfile | [`Dockerfile.builtin`](Dockerfile.builtin) | | **`cuda:11.4-samples`** | | | :-- | :-- | |    Aliases | `cuda:samples` | -|    Requires | `L4T <36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['<36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.samples`](Dockerfile.samples) | |    Notes | CUDA samples from https://github.com/NVIDIA/cuda-samples installed under /opt/cuda-samples | @@ -74,29 +88,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag cuda) +jetson-containers run $(autotag cuda) # or explicitly specify one of the container images above -./run.sh dustynv/cuda:12.2-samples-r36.2.0 +jetson-containers run dustynv/cuda:12.2-samples-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/cuda:12.2-samples-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cuda) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cuda) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cuda) my_app --abc xyz +jetson-containers run $(autotag cuda) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -104,7 +118,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh cuda +jetson-containers build cuda ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/cuda/cudnn/README.md b/packages/cuda/cudnn/README.md index f883f4491..6995fdb59 100644 --- a/packages/cuda/cudnn/README.md +++ b/packages/cuda/cudnn/README.md @@ -8,19 +8,25 @@ | **`cudnn:8.9`** | | | :-- | :-- | -|    Aliases | `cudnn` | |    Builds | [![`cudnn-89_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cudnn-89_jp60.yml?label=cudnn-89:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cudnn-89_jp60.yml) | -|    Requires | `L4T ==36.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`cudnn`](/packages/cuda/cudnn) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) 
[`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`optimum`](/packages/llm/optimum) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) 
[`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.2`](/packages/cuda/cuda) | +|    Dependants | [`tensorrt:8.6`](/packages/tensorrt) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/cudnn:8.9-r36.2.0`](https://hub.docker.com/r/dustynv/cudnn/tags) `(2023-12-05, 4.9GB)` | +| **`cudnn:9.0`** | | +| :-- | :-- | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.4`](/packages/cuda/cuda) | +|    Dependants | [`tensorrt:10.0`](/packages/tensorrt) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + | **`cudnn`** | | | :-- | :-- | -|    Requires | `L4T <36` | -|    Dependencies | 
[`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`cudnn`](/packages/cuda/cudnn) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) 
[`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`optimum`](/packages/llm/optimum) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt:8.6`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) 
[`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Requires | `L4T ['<36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) 
[`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.5.0-builder`](/packages/opencv/opencv_builder) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.8.1-builder`](/packages/opencv/opencv_builder) [`opencv:4.9.0`](/packages/opencv) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) 
[`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) [`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | |    Images | [`dustynv/cudnn:8.9-r36.2.0`](https://hub.docker.com/r/dustynv/cudnn/tags) `(2023-12-05, 4.9GB)` | @@ 
-42,29 +48,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag cudnn) +jetson-containers run $(autotag cudnn) # or explicitly specify one of the container images above -./run.sh dustynv/cudnn:8.9-r36.2.0 +jetson-containers run dustynv/cudnn:8.9-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/cudnn:8.9-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cudnn) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cudnn) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cudnn) my_app --abc xyz +jetson-containers run $(autotag cudnn) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -72,7 +78,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh cudnn +jetson-containers build cudnn ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/cuda/cupy/README.md b/packages/cuda/cupy/README.md index 70d88908a..9306d14dd 100644 --- a/packages/cuda/cupy/README.md +++ b/packages/cuda/cupy/README.md @@ -9,8 +9,8 @@ | **`cupy`** | | | :-- | :-- | |    Builds | [![`cupy_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cupy_jp60.yml?label=cupy:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cupy_jp60.yml) [![`cupy_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cupy_jp51.yml?label=cupy:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cupy_jp51.yml) [![`cupy_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cupy_jp46.yml?label=cupy:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cupy_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | |    Dependants | [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`l4t-ml`](/packages/l4t/l4t-ml) [`raft`](/packages/rapids/raft) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/cupy:r32.7.1`](https://hub.docker.com/r/dustynv/cupy/tags) `(2023-12-06, 0.5GB)`
[`dustynv/cupy:r35.2.1`](https://hub.docker.com/r/dustynv/cupy/tags) `(2023-12-05, 5.1GB)`
[`dustynv/cupy:r35.3.1`](https://hub.docker.com/r/dustynv/cupy/tags) `(2023-09-07, 5.1GB)`
[`dustynv/cupy:r35.4.1`](https://hub.docker.com/r/dustynv/cupy/tags) `(2023-12-06, 5.1GB)`
[`dustynv/cupy:r36.2.0`](https://hub.docker.com/r/dustynv/cupy/tags) `(2023-12-06, 3.5GB)` | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag cupy) +jetson-containers run $(autotag cupy) # or explicitly specify one of the container images above -./run.sh dustynv/cupy:r35.4.1 +jetson-containers run dustynv/cupy:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/cupy:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cupy) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cupy) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cupy) my_app --abc xyz +jetson-containers run $(autotag cupy) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh cupy +jetson-containers build cupy ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/cuda/pycuda/README.md b/packages/cuda/pycuda/README.md index 409bec621..e59f53f4b 100644 --- a/packages/cuda/pycuda/README.md +++ b/packages/cuda/pycuda/README.md @@ -9,8 +9,8 @@ | **`pycuda`** | | | :-- | :-- | |    Builds | [![`pycuda_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pycuda_jp46.yml?label=pycuda:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pycuda_jp46.yml) [![`pycuda_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pycuda_jp51.yml?label=pycuda:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pycuda_jp51.yml) [![`pycuda_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pycuda_jp60.yml?label=pycuda:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pycuda_jp60.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | |    Dependants | [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/pycuda:r32.7.1`](https://hub.docker.com/r/dustynv/pycuda/tags) `(2023-12-06, 0.4GB)`
[`dustynv/pycuda:r35.2.1`](https://hub.docker.com/r/dustynv/pycuda/tags) `(2023-09-07, 5.0GB)`
[`dustynv/pycuda:r35.3.1`](https://hub.docker.com/r/dustynv/pycuda/tags) `(2023-08-29, 5.0GB)`
[`dustynv/pycuda:r35.4.1`](https://hub.docker.com/r/dustynv/pycuda/tags) `(2023-12-06, 5.0GB)`
[`dustynv/pycuda:r36.2.0`](https://hub.docker.com/r/dustynv/pycuda/tags) `(2023-12-06, 3.5GB)` | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag pycuda) +jetson-containers run $(autotag pycuda) # or explicitly specify one of the container images above -./run.sh dustynv/pycuda:r32.7.1 +jetson-containers run dustynv/pycuda:r32.7.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/pycuda:r32.7.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag pycuda) +jetson-containers run -v /path/on/host:/path/in/container $(autotag pycuda) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag pycuda) my_app --abc xyz +jetson-containers run $(autotag pycuda) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh pycuda +jetson-containers build pycuda ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
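The `Requires | L4T >=...` rows above are matched against the host's L4T release. As an illustration of where that release string comes from, here is a sketch of parsing the release line that Jetson devices expose in `/etc/nv_tegra_release` — the sample line and the `sed` patterns are assumptions for demonstration, not the repo's actual parser.

```shell
# Sample contents of /etc/nv_tegra_release (format assumed for illustration)
release="# R35 (release), REVISION: 4.1, GCID: 33958178, BOARD: t186ref"

# Pull the major release (after "# R") and the revision (after "REVISION: ")
major=$(echo "$release" | sed -n 's/^# R\([0-9]*\) .*/\1/p')
revision=$(echo "$release" | sed -n 's/.*REVISION: \([0-9.]*\),.*/\1/p')
echo "L4T r${major}.${revision}"    # -> L4T r35.4.1
```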
diff --git a/packages/deepstream/README.md b/packages/deepstream/README.md index 34980579b..e18e362db 100644 --- a/packages/deepstream/README.md +++ b/packages/deepstream/README.md @@ -9,8 +9,8 @@ | **`deepstream`** | | | :-- | :-- | |    Builds | [![`deepstream_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/deepstream_jp60.yml?label=deepstream:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/deepstream_jp60.yml) [![`deepstream_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/deepstream_jp46.yml?label=deepstream:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/deepstream_jp46.yml) [![`deepstream_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/deepstream_jp51.yml?label=deepstream:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/deepstream_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`tritonserver`](/packages/tritonserver) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`tritonserver`](/packages/tritonserver) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/deepstream:r32.7.1`](https://hub.docker.com/r/dustynv/deepstream/tags) `(2024-03-06, 2.3GB)`
[`dustynv/deepstream:r35.2.1`](https://hub.docker.com/r/dustynv/deepstream/tags) `(2023-12-22, 6.8GB)`
[`dustynv/deepstream:r35.3.1`](https://hub.docker.com/r/dustynv/deepstream/tags) `(2024-03-05, 6.8GB)`
[`dustynv/deepstream:r35.4.1`](https://hub.docker.com/r/dustynv/deepstream/tags) `(2023-12-06, 6.7GB)`
[`dustynv/deepstream:r36.2.0`](https://hub.docker.com/r/dustynv/deepstream/tags) `(2024-03-05, 9.8GB)` | |    Notes | https://docs.nvidia.com/metropolis/deepstream/dev-guide/text/DS_Quickstart.html | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag deepstream) +jetson-containers run $(autotag deepstream) # or explicitly specify one of the container images above -./run.sh dustynv/deepstream:r32.7.1 +jetson-containers run dustynv/deepstream:r32.7.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/deepstream:r32.7.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag deepstream) +jetson-containers run -v /path/on/host:/path/in/container $(autotag deepstream) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag deepstream) my_app --abc xyz +jetson-containers run $(autotag deepstream) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh deepstream +jetson-containers build deepstream ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/diffusion/stable-diffusion-webui/README.md b/packages/diffusion/stable-diffusion-webui/README.md index f9dd99c93..e2e9496bb 100644 --- a/packages/diffusion/stable-diffusion-webui/README.md +++ b/packages/diffusion/stable-diffusion-webui/README.md @@ -45,8 +45,8 @@ Stable Diffusion XL | **`stable-diffusion-webui`** | | | :-- | :-- | |    Builds | [![`stable-diffusion-webui_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/stable-diffusion-webui_jp51.yml?label=stable-diffusion-webui:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/stable-diffusion-webui_jp51.yml) [![`stable-diffusion-webui_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/stable-diffusion-webui_jp60.yml?label=stable-diffusion-webui:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/stable-diffusion-webui_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:distributed`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`xformers`](/packages/llm/xformers) [`pycuda`](/packages/cuda/pycuda) [`opencv`](/packages/opencv) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) 
[`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`xformers`](/packages/llm/xformers) [`pycuda`](/packages/cuda/pycuda) [`opencv`](/packages/opencv) [`tensorrt`](/packages/tensorrt) [`onnxruntime`](/packages/onnxruntime) | |    Dependants | [`l4t-diffusion`](/packages/l4t/l4t-diffusion) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/stable-diffusion-webui:r35.2.1`](https://hub.docker.com/r/dustynv/stable-diffusion-webui/tags) `(2024-02-02, 7.3GB)`
[`dustynv/stable-diffusion-webui:r35.3.1`](https://hub.docker.com/r/dustynv/stable-diffusion-webui/tags) `(2024-02-02, 7.3GB)`
[`dustynv/stable-diffusion-webui:r35.4.1`](https://hub.docker.com/r/dustynv/stable-diffusion-webui/tags) `(2024-02-02, 7.3GB)`
[`dustynv/stable-diffusion-webui:r36.2.0`](https://hub.docker.com/r/dustynv/stable-diffusion-webui/tags) `(2024-02-02, 8.9GB)` | @@ -74,29 +74,29 @@ Stable Diffusion XL
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag stable-diffusion-webui) +jetson-containers run $(autotag stable-diffusion-webui) # or explicitly specify one of the container images above -./run.sh dustynv/stable-diffusion-webui:r35.3.1 +jetson-containers run dustynv/stable-diffusion-webui:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/stable-diffusion-webui:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag stable-diffusion-webui) +jetson-containers run -v /path/on/host:/path/in/container $(autotag stable-diffusion-webui) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag stable-diffusion-webui) my_app --abc xyz +jetson-containers run $(autotag stable-diffusion-webui) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -104,7 +104,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh stable-diffusion-webui +jetson-containers build stable-diffusion-webui ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/diffusion/stable-diffusion/README.md b/packages/diffusion/stable-diffusion/README.md index e800080bd..d33475747 100644 --- a/packages/diffusion/stable-diffusion/README.md +++ b/packages/diffusion/stable-diffusion/README.md @@ -64,8 +64,8 @@ To run all these steps from a script, see [`stable-diffusion/test.sh`](/packages | **`stable-diffusion`** | | | :-- | :-- | |    Builds | [![`stable-diffusion_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/stable-diffusion_jp51.yml?label=stable-diffusion:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/stable-diffusion_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dependants | [`l4t-diffusion`](/packages/l4t/l4t-diffusion) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/stable-diffusion:r35.2.1`](https://hub.docker.com/r/dustynv/stable-diffusion/tags) `(2023-12-14, 6.1GB)`
[`dustynv/stable-diffusion:r35.3.1`](https://hub.docker.com/r/dustynv/stable-diffusion/tags) `(2023-12-12, 6.1GB)`
[`dustynv/stable-diffusion:r35.4.1`](https://hub.docker.com/r/dustynv/stable-diffusion/tags) `(2023-12-15, 6.1GB)` | @@ -92,29 +92,29 @@ To run all these steps from a script, see [`stable-diffusion/test.sh`](/packages
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag stable-diffusion) +jetson-containers run $(autotag stable-diffusion) # or explicitly specify one of the container images above -./run.sh dustynv/stable-diffusion:r35.4.1 +jetson-containers run dustynv/stable-diffusion:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/stable-diffusion:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag stable-diffusion) +jetson-containers run -v /path/on/host:/path/in/container $(autotag stable-diffusion) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag stable-diffusion) my_app --abc xyz +jetson-containers run $(autotag stable-diffusion) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -122,7 +122,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh stable-diffusion +jetson-containers build stable-diffusion ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/ffmpeg/README.md b/packages/ffmpeg/README.md new file mode 100644 index 000000000..6167df821 --- /dev/null +++ b/packages/ffmpeg/README.md @@ -0,0 +1,54 @@ +# ffmpeg + +> [`CONTAINERS`](#user-content-containers) [`IMAGES`](#user-content-images) [`RUN`](#user-content-run) [`BUILD`](#user-content-build) + +
+CONTAINERS +
+ +| **`ffmpeg`** | | +| :-- | :-- | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | +|    Dependants | [`homeassistant-core:2024.4.2`](/packages/smart-home/homeassistant-core) [`homeassistant-core:latest`](/packages/smart-home/homeassistant-core) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | installs ffmpeg | + +
+ +
+RUN CONTAINER +
+ +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +```bash +# automatically pull or build a compatible container image +jetson-containers run $(autotag ffmpeg) + +# or if using 'docker run' (specify image and mounts/ect) +sudo docker run --runtime nvidia -it --rm --network=host ffmpeg:35.2.1 + +``` +> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. + +To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: +```bash +jetson-containers run -v /path/on/host:/path/in/container $(autotag ffmpeg) +``` +To launch the container running a command, as opposed to an interactive shell: +```bash +jetson-containers run $(autotag ffmpeg) my_app --abc xyz +``` +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +
+
+BUILD CONTAINER +
+ +If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: +```bash +jetson-containers build ffmpeg +``` +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options. +
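The `-v /path/on/host:/path/in/container` syntax shown in the run examples can also be composed programmatically when scripting container launches. A minimal sketch, with placeholder paths:

```shell
# Compose the bind-mount flag used by `jetson-containers run -v ...`
# (host/container paths are placeholders for illustration)
HOST_DIR=/tmp/demo
CONT_DIR=/workspace
VOLUME_ARG="-v ${HOST_DIR}:${CONT_DIR}"
echo "$VOLUME_ARG"    # -> -v /tmp/demo:/workspace
```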
diff --git a/packages/gstreamer/README.md b/packages/gstreamer/README.md index ab8fde2b6..4a3b9fb5c 100644 --- a/packages/gstreamer/README.md +++ b/packages/gstreamer/README.md @@ -9,9 +9,9 @@ | **`gstreamer`** | | | :-- | :-- | |    Builds | [![`gstreamer_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gstreamer_jp51.yml?label=gstreamer:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gstreamer_jp51.yml) [![`gstreamer_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gstreamer_jp46.yml?label=gstreamer:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gstreamer_jp46.yml) [![`gstreamer_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gstreamer_jp60.yml?label=gstreamer:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gstreamer_jp60.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) | -|    Dependants | [`deepstream`](/packages/deepstream) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-ml`](/packages/l4t/l4t-ml) [`local_llm`](/packages/llm/local_llm) [`nanoowl`](/packages/vit/nanoowl) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) | +|    Dependants | [`deepstream`](/packages/deepstream) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-ml`](/packages/l4t/l4t-ml) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) 
[`nanoowl`](/packages/vit/nanoowl) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/gstreamer:r32.7.1`](https://hub.docker.com/r/dustynv/gstreamer/tags) `(2023-12-06, 0.7GB)`
[`dustynv/gstreamer:r35.2.1`](https://hub.docker.com/r/dustynv/gstreamer/tags) `(2023-09-07, 5.1GB)`
[`dustynv/gstreamer:r35.3.1`](https://hub.docker.com/r/dustynv/gstreamer/tags) `(2023-12-06, 5.1GB)`
[`dustynv/gstreamer:r35.4.1`](https://hub.docker.com/r/dustynv/gstreamer/tags) `(2023-10-07, 5.1GB)`
[`dustynv/gstreamer:r36.2.0`](https://hub.docker.com/r/dustynv/gstreamer/tags) `(2023-12-07, 5.4GB)` | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag gstreamer) +jetson-containers run $(autotag gstreamer) # or explicitly specify one of the container images above -./run.sh dustynv/gstreamer:r36.2.0 +jetson-containers run dustynv/gstreamer:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/gstreamer:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag gstreamer) +jetson-containers run -v /path/on/host:/path/in/container $(autotag gstreamer) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag gstreamer) my_app --abc xyz +jetson-containers run $(autotag gstreamer) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh gstreamer +jetson-containers build gstreamer ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/jetson-inference/README.md b/packages/jetson-inference/README.md index 8d9896349..cf163ea7a 100644 --- a/packages/jetson-inference/README.md +++ b/packages/jetson-inference/README.md @@ -8,9 +8,9 @@ | **`jetson-inference`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) | -|    Dependants | [`local_llm`](/packages/llm/local_llm) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) | +|    Dependants | [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/jetson-inference:22.06`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2022-09-30, 6.5GB)`
[`dustynv/jetson-inference:r32.4.3`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2020-10-27, 0.9GB)`
[`dustynv/jetson-inference:r32.4.4`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2021-11-16, 0.9GB)`
[`dustynv/jetson-inference:r32.5.0`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2021-08-09, 0.9GB)`
[`dustynv/jetson-inference:r32.6.1`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2021-08-24, 0.9GB)`
[`dustynv/jetson-inference:r32.7.1`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-05-15, 1.1GB)`
[`dustynv/jetson-inference:r34.1.0`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2022-04-08, 5.9GB)`
[`dustynv/jetson-inference:r34.1.1`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-03-18, 6.1GB)`
[`dustynv/jetson-inference:r35.1.0`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-05-15, 6.1GB)`
[`dustynv/jetson-inference:r35.2.1`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-05-15, 6.0GB)`
[`dustynv/jetson-inference:r35.3.1`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-05-15, 5.6GB)`
[`dustynv/jetson-inference:r35.4.1`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-08-30, 5.7GB)`
[`dustynv/jetson-inference:r36.2.0`](https://hub.docker.com/r/dustynv/jetson-inference/tags) `(2023-12-19, 7.9GB)` | @@ -45,29 +45,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag jetson-inference) +jetson-containers run $(autotag jetson-inference) # or explicitly specify one of the container images above -./run.sh dustynv/jetson-inference:r36.2.0 +jetson-containers run dustynv/jetson-inference:r36.2.0 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/jetson-inference:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)&#10;
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)&#10;
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag jetson-inference) +jetson-containers run -v /path/on/host:/path/in/container $(autotag jetson-inference) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag jetson-inference) my_app --abc xyz +jetson-containers run $(autotag jetson-inference) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -75,7 +75,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh jetson-inference +jetson-containers build jetson-inference ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.&#10;
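The `Requires` rows above (e.g. `L4T ['>=32.6']`) are what `autotag` weighs against your device's L4T release when choosing an image. A minimal sketch of that comparison, assuming GNU `sort -V` is available; the version strings are illustrative placeholders, not read from a device:

```shell
# compare a device's L4T release against a package's minimum-version requirement,
# roughly the check autotag performs when picking a compatible container tag
l4t_version="35.4.1"   # illustrative; on a Jetson this comes from the L4T release info
requires="32.6"        # from a package's "L4T >=32.6" requirement
# version-sort the two strings; if the device release sorts highest, it satisfies the minimum
highest=$(printf '%s\n%s\n' "$requires" "$l4t_version" | sort -V | tail -n 1)
if [ "$highest" = "$l4t_version" ]; then
  echo "compatible"
else
  echo "incompatible"
fi
```

This is only the minimum-version half of the story; `autotag` also falls back to pulling from a registry or building locally when no matching image exists.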
diff --git a/packages/jetson-utils/README.md b/packages/jetson-utils/README.md index 042008fb6..9c15eb8f4 100644 --- a/packages/jetson-utils/README.md +++ b/packages/jetson-utils/README.md @@ -9,8 +9,8 @@ | **`jetson-utils`** | | | :-- | :-- | |    Builds | [![`jetson-utils_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jetson-utils_jp60.yml?label=jetson-utils:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jetson-utils_jp60.yml) [![`jetson-utils_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jetson-utils_jp46.yml?label=jetson-utils:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jetson-utils_jp46.yml) [![`jetson-utils_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jetson-utils_jp51.yml?label=jetson-utils:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jetson-utils_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/jetson-utils:r32.7.1`](https://hub.docker.com/r/dustynv/jetson-utils/tags) `(2024-02-24, 0.7GB)`
[`dustynv/jetson-utils:r35.2.1`](https://hub.docker.com/r/dustynv/jetson-utils/tags) `(2023-12-05, 5.2GB)`
[`dustynv/jetson-utils:r35.3.1`](https://hub.docker.com/r/dustynv/jetson-utils/tags) `(2024-02-24, 5.2GB)`
[`dustynv/jetson-utils:r36.2.0`](https://hub.docker.com/r/dustynv/jetson-utils/tags) `(2024-02-24, 7.1GB)` | @@ -36,29 +36,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag jetson-utils) +jetson-containers run $(autotag jetson-utils) # or explicitly specify one of the container images above -./run.sh dustynv/jetson-utils:r35.3.1 +jetson-containers run dustynv/jetson-utils:r35.3.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/jetson-utils:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)&#10;
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)&#10;
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag jetson-utils) +jetson-containers run -v /path/on/host:/path/in/container $(autotag jetson-utils) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag jetson-utils) my_app --abc xyz +jetson-containers run $(autotag jetson-utils) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -66,7 +66,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh jetson-utils +jetson-containers build jetson-utils ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.&#10;
diff --git a/packages/jupyterlab/README.md b/packages/jupyterlab/README.md index 8b1ed2443..71610d6a6 100644 --- a/packages/jupyterlab/README.md +++ b/packages/jupyterlab/README.md @@ -9,8 +9,8 @@ | **`jupyterlab`** | | | :-- | :-- | |    Builds | [![`jupyterlab_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jupyterlab_jp46.yml?label=jupyterlab:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jupyterlab_jp46.yml) [![`jupyterlab_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jupyterlab_jp51.yml?label=jupyterlab:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jupyterlab_jp51.yml) [![`jupyterlab_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/jupyterlab_jp60.yml?label=jupyterlab:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/jupyterlab_jp60.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) [`numpy`](/packages/numpy) [`rust`](/packages/rust) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`rust`](/packages/build/rust) | |    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`efficientvit`](/packages/vit/efficientvit) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain:samples`](/packages/llm/langchain) [`sam`](/packages/vit/sam) [`tam`](/packages/vit/tam) [`whisper`](/packages/audio/whisper) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/jupyterlab:r32.7.1`](https://hub.docker.com/r/dustynv/jupyterlab/tags) `(2024-03-07, 0.7GB)`
[`dustynv/jupyterlab:r35.2.1`](https://hub.docker.com/r/dustynv/jupyterlab/tags) `(2023-12-06, 5.3GB)`
[`dustynv/jupyterlab:r35.3.1`](https://hub.docker.com/r/dustynv/jupyterlab/tags) `(2024-03-07, 5.4GB)`
[`dustynv/jupyterlab:r35.4.1`](https://hub.docker.com/r/dustynv/jupyterlab/tags) `(2023-10-07, 5.3GB)`
[`dustynv/jupyterlab:r36.2.0`](https://hub.docker.com/r/dustynv/jupyterlab/tags) `(2024-03-07, 0.6GB)` | @@ -39,29 +39,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag jupyterlab) +jetson-containers run $(autotag jupyterlab) # or explicitly specify one of the container images above -./run.sh dustynv/jupyterlab:r35.3.1 +jetson-containers run dustynv/jupyterlab:r35.3.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/jupyterlab:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)&#10;
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)&#10;
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag jupyterlab) +jetson-containers run -v /path/on/host:/path/in/container $(autotag jupyterlab) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag jupyterlab) my_app --abc xyz +jetson-containers run $(autotag jupyterlab) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -69,7 +69,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh jupyterlab +jetson-containers build jupyterlab ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.&#10;
diff --git a/packages/l4t/l4t-diffusion/README.md b/packages/l4t/l4t-diffusion/README.md index a61d0baa3..ff53d5f53 100644 --- a/packages/l4t/l4t-diffusion/README.md +++ b/packages/l4t/l4t-diffusion/README.md @@ -19,8 +19,8 @@ By default, this container will automatically start the [`stable-diffusion-webui | **`l4t-diffusion`** | | | :-- | :-- | |    Builds | [![`l4t-diffusion_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-diffusion_jp51.yml?label=l4t-diffusion:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-diffusion_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:distributed`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`xformers`](/packages/llm/xformers) [`pycuda`](/packages/cuda/pycuda) [`opencv`](/packages/opencv) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`xformers`](/packages/llm/xformers) 
[`pycuda`](/packages/cuda/pycuda) [`opencv`](/packages/opencv) [`tensorrt`](/packages/tensorrt) [`onnxruntime`](/packages/onnxruntime) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) | |    Images | [`dustynv/l4t-diffusion:r35.2.1`](https://hub.docker.com/r/dustynv/l4t-diffusion/tags) `(2024-01-09, 7.3GB)`
[`dustynv/l4t-diffusion:r35.3.1`](https://hub.docker.com/r/dustynv/l4t-diffusion/tags) `(2023-09-24, 6.9GB)`
[`dustynv/l4t-diffusion:r35.4.1`](https://hub.docker.com/r/dustynv/l4t-diffusion/tags) `(2024-02-02, 7.3GB)` | @@ -44,29 +44,29 @@ By default, this container will automatically start the [`stable-diffusion-webui
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag l4t-diffusion) +jetson-containers run $(autotag l4t-diffusion) # or explicitly specify one of the container images above -./run.sh dustynv/l4t-diffusion:r35.4.1 +jetson-containers run dustynv/l4t-diffusion:r35.4.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/l4t-diffusion:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)&#10;
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)&#10;
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag l4t-diffusion) +jetson-containers run -v /path/on/host:/path/in/container $(autotag l4t-diffusion) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag l4t-diffusion) my_app --abc xyz +jetson-containers run $(autotag l4t-diffusion) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -74,7 +74,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh l4t-diffusion +jetson-containers build l4t-diffusion ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.&#10;
diff --git a/packages/l4t/l4t-ml/README.md b/packages/l4t/l4t-ml/README.md index ed278f2b5..d325928ea 100644 --- a/packages/l4t/l4t-ml/README.md +++ b/packages/l4t/l4t-ml/README.md @@ -9,8 +9,8 @@ | **`l4t-ml`** | | | :-- | :-- | |    Builds | [![`l4t-ml_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-ml_jp60.yml?label=l4t-ml:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-ml_jp60.yml) [![`l4t-ml_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-ml_jp46.yml?label=l4t-ml:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-ml_jp46.yml) [![`l4t-ml_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-ml_jp51.yml?label=l4t-ml:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-ml_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torchaudio`](/packages/pytorch/torchaudio) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) [`tensorflow2`](/packages/tensorflow) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) [`cupy`](/packages/cuda/cupy) [`onnxruntime`](/packages/onnxruntime) [`numba`](/packages/numba) [`gstreamer`](/packages/gstreamer) [`rust`](/packages/rust) [`jupyterlab`](/packages/jupyterlab) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) 
[`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torchaudio`](/packages/pytorch/torchaudio) [`tensorrt`](/packages/tensorrt) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) [`tensorflow2`](/packages/tensorflow) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) [`cupy`](/packages/cuda/cupy) [`onnxruntime`](/packages/onnxruntime) [`numba`](/packages/numba) [`gstreamer`](/packages/gstreamer) [`rust`](/packages/build/rust) [`jupyterlab`](/packages/jupyterlab) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/l4t-ml:r32.7.1`](https://hub.docker.com/r/dustynv/l4t-ml/tags) `(2023-11-13, 2.4GB)`
[`dustynv/l4t-ml:r35.2.1`](https://hub.docker.com/r/dustynv/l4t-ml/tags) `(2024-01-04, 7.1GB)`
[`dustynv/l4t-ml:r35.3.1`](https://hub.docker.com/r/dustynv/l4t-ml/tags) `(2023-12-11, 7.0GB)`
[`dustynv/l4t-ml:r35.4.1`](https://hub.docker.com/r/dustynv/l4t-ml/tags) `(2024-03-07, 7.1GB)`
[`dustynv/l4t-ml:r36.2.0`](https://hub.docker.com/r/dustynv/l4t-ml/tags) `(2024-03-07, 8.9GB)` | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag l4t-ml) +jetson-containers run $(autotag l4t-ml) # or explicitly specify one of the container images above -./run.sh dustynv/l4t-ml:r36.2.0 +jetson-containers run dustynv/l4t-ml:r36.2.0 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/l4t-ml:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)&#10;
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)&#10;
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag l4t-ml) +jetson-containers run -v /path/on/host:/path/in/container $(autotag l4t-ml) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag l4t-ml) my_app --abc xyz +jetson-containers run $(autotag l4t-ml) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh l4t-ml +jetson-containers build l4t-ml ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.&#10;
diff --git a/packages/l4t/l4t-pytorch/README.md b/packages/l4t/l4t-pytorch/README.md index 8d98680f0..d93e33f8d 100644 --- a/packages/l4t/l4t-pytorch/README.md +++ b/packages/l4t/l4t-pytorch/README.md @@ -9,8 +9,8 @@ | **`l4t-pytorch`** | | | :-- | :-- | |    Builds | [![`l4t-pytorch_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-pytorch_jp46.yml?label=l4t-pytorch:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-pytorch_jp46.yml) [![`l4t-pytorch_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-pytorch_jp60.yml?label=l4t-pytorch:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-pytorch_jp60.yml) [![`l4t-pytorch_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-pytorch_jp51.yml?label=l4t-pytorch:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-pytorch_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torchaudio`](/packages/pytorch/torchaudio) [`torch2trt`](/packages/pytorch/torch2trt) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torchaudio`](/packages/pytorch/torchaudio) [`tensorrt`](/packages/tensorrt) 
[`torch2trt`](/packages/pytorch/torch2trt) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) | |    Images | [`dustynv/l4t-pytorch:r32.7.1`](https://hub.docker.com/r/dustynv/l4t-pytorch/tags) `(2023-12-14, 1.2GB)`
[`dustynv/l4t-pytorch:r35.2.1`](https://hub.docker.com/r/dustynv/l4t-pytorch/tags) `(2023-12-11, 5.6GB)`
[`dustynv/l4t-pytorch:r35.3.1`](https://hub.docker.com/r/dustynv/l4t-pytorch/tags) `(2023-12-14, 5.6GB)`
[`dustynv/l4t-pytorch:r35.4.1`](https://hub.docker.com/r/dustynv/l4t-pytorch/tags) `(2023-12-12, 5.6GB)`
[`dustynv/l4t-pytorch:r36.2.0`](https://hub.docker.com/r/dustynv/l4t-pytorch/tags) `(2023-12-14, 7.3GB)` | @@ -36,29 +36,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag l4t-pytorch) +jetson-containers run $(autotag l4t-pytorch) # or explicitly specify one of the container images above -./run.sh dustynv/l4t-pytorch:r36.2.0 +jetson-containers run dustynv/l4t-pytorch:r36.2.0 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/l4t-pytorch:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)&#10;
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)&#10;
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag l4t-pytorch) +jetson-containers run -v /path/on/host:/path/in/container $(autotag l4t-pytorch) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag l4t-pytorch) my_app --abc xyz +jetson-containers run $(autotag l4t-pytorch) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -66,7 +66,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh l4t-pytorch +jetson-containers build l4t-pytorch ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during. Run it with [`--help`](/jetson_containers/build.py) for build options.
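The `Requires` rows in these READMEs (e.g. `L4T ['>=32.6']`, `L4T ['==35.*']`) are version constraints that `autotag` checks against the device's L4T release before picking an image. A minimal sketch of that kind of check, as a hypothetical illustration only (the real logic lives in the repo's `jetson_containers` Python package):

```python
import operator
import re

# Comparison operators that may appear in a requirement string.
OPS = {'>=': operator.ge, '<=': operator.le, '==': operator.eq,
       '>': operator.gt, '<': operator.lt}

def parse(version):
    """Split an L4T version string like '35.4.1' into a comparable tuple."""
    return tuple(int(x) for x in version.split('.'))

def satisfies(l4t_version, requirement):
    """Check an L4T version against one requirement like '>=34.1.0' or '==35.*'."""
    op, spec = re.match(r'(>=|<=|==|>|<)(.+)', requirement).groups()
    if op == '==' and spec.endswith('.*'):
        # Wildcard: only the leading version fields need to match.
        prefix = parse(spec[:-2])
        return parse(l4t_version)[:len(prefix)] == prefix
    return OPS[op](parse(l4t_version), parse(spec))
```

For example, `satisfies('35.2.1', '>=34.1.0')` holds, while `satisfies('36.2.0', '==35.*')` does not, which is why the JetPack 6 images above carry separate tags.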
diff --git a/packages/l4t/l4t-tensorflow/README.md b/packages/l4t/l4t-tensorflow/README.md index 61d3bc5a1..b0a9fe8ee 100644 --- a/packages/l4t/l4t-tensorflow/README.md +++ b/packages/l4t/l4t-tensorflow/README.md @@ -9,15 +9,15 @@ | **`l4t-tensorflow:tf1`** | | | :-- | :-- | |    Builds | [![`l4t-tensorflow-tf1_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf1_jp46.yml?label=l4t-tensorflow-tf1:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf1_jp46.yml) [![`l4t-tensorflow-tf1_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf1_jp51.yml?label=l4t-tensorflow-tf1:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf1_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) [`tensorflow`](/packages/tensorflow) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) [`tensorflow`](/packages/tensorflow) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) | |    Images | [`dustynv/l4t-tensorflow:tf1-r32.7.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-12-06, 0.9GB)`
[`dustynv/l4t-tensorflow:tf1-r35.2.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-12-06, 5.5GB)`
[`dustynv/l4t-tensorflow:tf1-r35.3.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-08-29, 5.6GB)`
[`dustynv/l4t-tensorflow:tf1-r35.4.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-10-07, 5.5GB)` | | **`l4t-tensorflow:tf2`** | | | :-- | :-- | |    Builds | [![`l4t-tensorflow-tf2_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf2_jp46.yml?label=l4t-tensorflow-tf2:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf2_jp46.yml) [![`l4t-tensorflow-tf2_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf2_jp60.yml?label=l4t-tensorflow-tf2:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf2_jp60.yml) [![`l4t-tensorflow-tf2_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-tensorflow-tf2_jp51.yml?label=l4t-tensorflow-tf2:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-tensorflow-tf2_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) [`tensorflow2`](/packages/tensorflow) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) [`tensorflow2`](/packages/tensorflow) [`opencv`](/packages/opencv) [`pycuda`](/packages/cuda/pycuda) | |    Images | [`dustynv/l4t-tensorflow:tf2-r32.7.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-12-06, 1.0GB)`
[`dustynv/l4t-tensorflow:tf2-r35.2.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-09-07, 5.7GB)`
[`dustynv/l4t-tensorflow:tf2-r35.3.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-12-06, 5.7GB)`
[`dustynv/l4t-tensorflow:tf2-r35.4.1`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-10-07, 5.7GB)`
[`dustynv/l4t-tensorflow:tf2-r36.2.0`](https://hub.docker.com/r/dustynv/l4t-tensorflow/tags) `(2023-12-06, 7.3GB)` | @@ -47,29 +47,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag l4t-tensorflow) +jetson-containers run $(autotag l4t-tensorflow) # or explicitly specify one of the container images above -./run.sh dustynv/l4t-tensorflow:tf2-r32.7.1 +jetson-containers run dustynv/l4t-tensorflow:tf2-r32.7.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/l4t-tensorflow:tf2-r32.7.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag l4t-tensorflow) +jetson-containers run -v /path/on/host:/path/in/container $(autotag l4t-tensorflow) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag l4t-tensorflow) my_app --abc xyz +jetson-containers run $(autotag l4t-tensorflow) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -77,7 +77,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh l4t-tensorflow +jetson-containers build l4t-tensorflow ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/l4t/l4t-text-generation/README.md b/packages/l4t/l4t-text-generation/README.md index 9cbcaf71a..e4a62e406 100644 --- a/packages/l4t/l4t-text-generation/README.md +++ b/packages/l4t/l4t-text-generation/README.md @@ -9,9 +9,9 @@ | **`l4t-text-generation`** | | | :-- | :-- | |    Builds | [![`l4t-text-generation_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-text-generation_jp51.yml?label=l4t-text-generation:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-text-generation_jp51.yml) [![`l4t-text-generation_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/l4t-text-generation_jp60.yml?label=l4t-text-generation:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/l4t-text-generation_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`mlc`](/packages/llm/mlc) [`auto_gptq`](/packages/llm/auto_gptq) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`exllama`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`jupyterlab`](/packages/jupyterlab) [`docker`](/packages/docker) [`openai`](/packages/llm/openai) | -|    Images | [`dustynv/l4t-text-generation:r35.2.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2023-12-12, 6.4GB)`
[`dustynv/l4t-text-generation:r35.3.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2024-02-22, 6.7GB)`
[`dustynv/l4t-text-generation:r35.4.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2023-12-19, 6.4GB)`
[`dustynv/l4t-text-generation:r36.2.0`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2024-03-08, 10.3GB)` | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`mlc`](/packages/llm/mlc) [`auto_gptq`](/packages/llm/auto_gptq) [`exllama`](/packages/llm/exllama) [`llama_cpp`](/packages/llm/llama_cpp) [`jupyterlab`](/packages/jupyterlab) [`docker`](/packages/build/docker) [`openai`](/packages/llm/openai) | +|    Images | [`dustynv/l4t-text-generation:r35.2.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2023-12-12, 6.4GB)`
[`dustynv/l4t-text-generation:r35.3.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2024-02-22, 6.7GB)`
[`dustynv/l4t-text-generation:r35.4.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2023-12-19, 6.4GB)`
[`dustynv/l4t-text-generation:r36.2.0`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) `(2024-03-12, 10.4GB)` | @@ -24,7 +24,7 @@ |   [`dustynv/l4t-text-generation:r35.2.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) | `2023-12-12` | `arm64` | `6.4GB` | |   [`dustynv/l4t-text-generation:r35.3.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) | `2024-02-22` | `arm64` | `6.7GB` | |   [`dustynv/l4t-text-generation:r35.4.1`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) | `2023-12-19` | `arm64` | `6.4GB` | -|   [`dustynv/l4t-text-generation:r36.2.0`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) | `2024-03-08` | `arm64` | `10.3GB` | +|   [`dustynv/l4t-text-generation:r36.2.0`](https://hub.docker.com/r/dustynv/l4t-text-generation/tags) | `2024-03-12` | `arm64` | `10.4GB` | > Container images are compatible with other minor versions of JetPack/L4T:
>     • L4T R32.7 containers can run on other versions of L4T R32.7 (JetPack 4.6+)
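That compatibility note amounts to comparing the major.minor "family" of the host's L4T release against the image tag's. A small sketch of such a check, as a hypothetical helper rather than code from this repo:

```python
def l4t_family(version):
    """Reduce an L4T version string like '32.7.4' to its (major, minor) family."""
    major, minor = version.split('.')[:2]
    return (int(major), int(minor))

def compatible(host_l4t, image_l4t):
    """Containers built for one L4T patch release can run on other patch
    releases of the same major.minor family (per the note above)."""
    return l4t_family(host_l4t) == l4t_family(image_l4t)
```

So a host on L4T 32.7.4 can run the `r32.7.1` images listed here, while an `r32.7.x` image would not be selected for an R35 host.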
@@ -35,29 +35,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag l4t-text-generation) +jetson-containers run $(autotag l4t-text-generation) # or explicitly specify one of the container images above -./run.sh dustynv/l4t-text-generation:r36.2.0 +jetson-containers run dustynv/l4t-text-generation:r36.2.0 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/l4t-text-generation:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag l4t-text-generation) +jetson-containers run -v /path/on/host:/path/in/container $(autotag l4t-text-generation) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag l4t-text-generation) my_app --abc xyz +jetson-containers run $(autotag l4t-text-generation) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -65,7 +65,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh l4t-text-generation +jetson-containers build l4t-text-generation ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/auto_awq/README.md b/packages/llm/auto_awq/README.md index 3ce695b6a..c0f16b7b9 100644 --- a/packages/llm/auto_awq/README.md +++ b/packages/llm/auto_awq/README.md @@ -6,10 +6,11 @@ CONTAINERS
-| **`auto_awq`** | | +| **`auto_awq:0.2.4`** | | | :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Aliases | `auto_awq` | +|    Requires | `L4T ['>=36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -18,27 +19,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag auto_awq) +jetson-containers run $(autotag auto_awq) # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host auto_awq:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag auto_awq) +jetson-containers run -v /path/on/host:/path/in/container $(autotag auto_awq) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag auto_awq) my_app --abc xyz +jetson-containers run $(autotag auto_awq) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -46,7 +47,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh auto_awq +jetson-containers build auto_awq ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/auto_gptq/README.md b/packages/llm/auto_gptq/README.md index 5bf3a8312..60e70601b 100644 --- a/packages/llm/auto_gptq/README.md +++ b/packages/llm/auto_gptq/README.md @@ -20,14 +20,13 @@ If you get the error `Exllama kernel does not support query/key/value fusion wit CONTAINERS
-| **`auto_gptq`** | | +| **`auto_gptq:0.7.1`** | | | :-- | :-- | -|    Builds | [![`auto_gptq_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/auto_gptq_jp60.yml?label=auto_gptq:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/auto_gptq_jp60.yml) [![`auto_gptq_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/auto_gptq_jp51.yml?label=auto_gptq:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/auto_gptq_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Aliases | `auto_gptq` | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | 
[`dustynv/auto_gptq:r35.2.1`](https://hub.docker.com/r/dustynv/auto_gptq/tags) `(2023-12-15, 6.0GB)`
[`dustynv/auto_gptq:r35.3.1`](https://hub.docker.com/r/dustynv/auto_gptq/tags) `(2023-12-11, 6.0GB)`
[`dustynv/auto_gptq:r35.4.1`](https://hub.docker.com/r/dustynv/auto_gptq/tags) `(2023-12-14, 6.0GB)`
[`dustynv/auto_gptq:r36.2.0`](https://hub.docker.com/r/dustynv/auto_gptq/tags) `(2023-12-15, 7.7GB)` | @@ -51,29 +50,29 @@ If you get the error `Exllama kernel does not support query/key/value fusion wit RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag auto_gptq) +jetson-containers run $(autotag auto_gptq) # or explicitly specify one of the container images above -./run.sh dustynv/auto_gptq:r36.2.0 +jetson-containers run dustynv/auto_gptq:r36.2.0 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/auto_gptq:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag auto_gptq) +jetson-containers run -v /path/on/host:/path/in/container $(autotag auto_gptq) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag auto_gptq) my_app --abc xyz +jetson-containers run $(autotag auto_gptq) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -81,7 +80,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh auto_gptq +jetson-containers build auto_gptq ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/awq/README.md b/packages/llm/awq/README.md index ecc52a032..9005bf6e4 100644 --- a/packages/llm/awq/README.md +++ b/packages/llm/awq/README.md @@ -42,18 +42,11 @@ Make sure that you load the output from the quantization steps above with `--qua CONTAINERS
-| **`awq`** | | +| **`awq:0.1.0`** | | | :-- | :-- | -|    Builds | [![`awq_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/awq_jp60.yml?label=awq:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/awq_jp60.yml) [![`awq_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/awq_jp51.yml?label=awq:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/awq_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/awq:r35.2.1`](https://hub.docker.com/r/dustynv/awq/tags) `(2023-12-14, 6.1GB)`
[`dustynv/awq:r35.3.1`](https://hub.docker.com/r/dustynv/awq/tags) `(2023-12-15, 6.1GB)`
[`dustynv/awq:r35.4.1`](https://hub.docker.com/r/dustynv/awq/tags) `(2023-12-12, 6.1GB)`
[`dustynv/awq:r36.2.0`](https://hub.docker.com/r/dustynv/awq/tags) `(2023-12-15, 7.8GB)` | - -| **`awq:dev`** | | -| :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Aliases | `awq` | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -78,29 +71,29 @@ Make sure that you load the output from the quantization steps above with `--qua RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag awq) +jetson-containers run $(autotag awq) # or explicitly specify one of the container images above -./run.sh dustynv/awq:r35.3.1 +jetson-containers run dustynv/awq:r35.3.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/awq:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag awq) +jetson-containers run -v /path/on/host:/path/in/container $(autotag awq) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag awq) my_app --abc xyz +jetson-containers run $(autotag awq) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -108,7 +101,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh awq +jetson-containers build awq ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/bitsandbytes/README.md b/packages/llm/bitsandbytes/README.md
index 4ce0f9810..bed203284 100644
--- a/packages/llm/bitsandbytes/README.md
+++ b/packages/llm/bitsandbytes/README.md
@@ -6,12 +6,19 @@
 CONTAINERS
+| **`bitsandbytes:builder`** | |
+| :-- | :-- |
+|    Requires | `L4T ['==35.*']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) |
+|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) |
+|    Notes | fork of https://github.com/TimDettmers/bitsandbytes for Jetson |
+
 | **`bitsandbytes`** | |
 | :-- | :-- |
 |    Builds | [![`bitsandbytes_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/bitsandbytes_jp51.yml?label=bitsandbytes:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/bitsandbytes_jp51.yml) |
-|    Requires | `L4T ==35.*` |
-|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) |
-|    Dependants | [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) |
+|    Requires | `L4T ['==35.*']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) |
+|    Dependants | [`text-generation-inference`](/packages/llm/text-generation-inference) |
 |    Dockerfile | [`Dockerfile`](Dockerfile) |
 |    Images | [`dustynv/bitsandbytes:r35.2.1`](https://hub.docker.com/r/dustynv/bitsandbytes/tags) `(2023-12-15, 5.9GB)`<br/>[`dustynv/bitsandbytes:r35.3.1`](https://hub.docker.com/r/dustynv/bitsandbytes/tags) `(2023-12-11, 6.0GB)`<br/>[`dustynv/bitsandbytes:r35.4.1`](https://hub.docker.com/r/dustynv/bitsandbytes/tags) `(2023-12-14, 5.9GB)` |
 |    Notes | fork of https://github.com/TimDettmers/bitsandbytes for Jetson |
@@ -37,29 +44,29 @@
 RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
 ```bash
 # automatically pull or build a compatible container image
-./run.sh $(./autotag bitsandbytes)
+jetson-containers run $(autotag bitsandbytes)
 
 # or explicitly specify one of the container images above
-./run.sh dustynv/bitsandbytes:r35.2.1
+jetson-containers run dustynv/bitsandbytes:r35.2.1
 
 # or if using 'docker run' (specify image and mounts/ect)
 sudo docker run --runtime nvidia -it --rm --network=host dustynv/bitsandbytes:r35.2.1
 ```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
 > [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.
 
 To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
 ```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag bitsandbytes)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag bitsandbytes)
 ```
 To launch the container running a command, as opposed to an interactive shell:
 ```bash
-./run.sh $(./autotag bitsandbytes) my_app --abc xyz
+jetson-containers run $(autotag bitsandbytes) my_app --abc xyz
 ```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
 BUILD CONTAINER
@@ -67,7 +74,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
 If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
 ```bash
-./build.sh bitsandbytes
+jetson-containers build bitsandbytes
 ```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/exllama/README.md b/packages/llm/exllama/README.md
index a3f18539e..b84c268a8 100644
--- a/packages/llm/exllama/README.md
+++ b/packages/llm/exllama/README.md
@@ -34,25 +34,20 @@ Substitute the GPTQ model from [HuggingFace Hub](https://huggingface.co/models?s
 CONTAINERS
-| **`exllama:v1`** | |
+| **`exllama:0.0.15`** | |
 | :-- | :-- |
 |    Aliases | `exllama` |
-|    Builds | [![`exllama-v1_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/exllama-v1_jp60.yml?label=exllama-v1:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/exllama-v1_jp60.yml) |
-|    Requires | `L4T >=34.1.0` |
-|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) |
-|    Dependants | [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) |
+|    Requires | `L4T ['>=36']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) |
+|    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) |
 |    Dockerfile | [`Dockerfile`](Dockerfile) |
-|    Images | [`dustynv/exllama:v1-r36.2.0`](https://hub.docker.com/r/dustynv/exllama/tags) `(2023-12-15, 7.2GB)` |
-| **`exllama:v2`** | |
+| **`exllama:0.0.14`** | |
 | :-- | :-- |
 |    Aliases | `exllama` |
-|    Builds | [![`exllama-v2_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/exllama-v2_jp51.yml?label=exllama-v2:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/exllama-v2_jp51.yml) [![`exllama-v2_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/exllama-v2_jp60.yml?label=exllama-v2:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/exllama-v2_jp60.yml) |
-|    Requires | `L4T >=34.1.0` |
-|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) |
-|    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) |
-|    Dockerfile | [`Dockerfile.v2`](Dockerfile.v2) |
-|    Images | [`dustynv/exllama:v2-r35.2.1`](https://hub.docker.com/r/dustynv/exllama/tags) `(2023-12-15, 5.5GB)`<br/>[`dustynv/exllama:v2-r35.3.1`](https://hub.docker.com/r/dustynv/exllama/tags) `(2023-12-14, 5.5GB)`<br/>[`dustynv/exllama:v2-r35.4.1`](https://hub.docker.com/r/dustynv/exllama/tags) `(2023-12-12, 5.5GB)`<br/>[`dustynv/exllama:v2-r36.2.0`](https://hub.docker.com/r/dustynv/exllama/tags) `(2023-12-15, 7.2GB)` |
+|    Requires | `L4T ['==35.*']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) |
+|    Dockerfile | [`Dockerfile`](Dockerfile) |
@@ -80,29 +75,29 @@ Substitute the GPTQ model from [HuggingFace Hub](https://huggingface.co/models?s
 RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
 ```bash
 # automatically pull or build a compatible container image
-./run.sh $(./autotag exllama)
+jetson-containers run $(autotag exllama)
 
 # or explicitly specify one of the container images above
-./run.sh dustynv/exllama:v1-r36.2.0
+jetson-containers run dustynv/exllama:v1-r36.2.0
 
 # or if using 'docker run' (specify image and mounts/ect)
 sudo docker run --runtime nvidia -it --rm --network=host dustynv/exllama:v1-r36.2.0
 ```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
 > [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.
 
 To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
 ```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag exllama)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag exllama)
 ```
 To launch the container running a command, as opposed to an interactive shell:
 ```bash
-./run.sh $(./autotag exllama) my_app --abc xyz
+jetson-containers run $(autotag exllama) my_app --abc xyz
 ```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
 BUILD CONTAINER
@@ -110,7 +105,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
 If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
 ```bash
-./build.sh exllama
+jetson-containers build exllama
 ```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/flash-attention/README.md b/packages/llm/flash-attention/README.md
new file mode 100644
index 000000000..dcd8464d4
--- /dev/null
+++ b/packages/llm/flash-attention/README.md
@@ -0,0 +1,52 @@
+# flash-attention
+
+> [`CONTAINERS`](#user-content-containers) [`IMAGES`](#user-content-images) [`RUN`](#user-content-run) [`BUILD`](#user-content-build)
+
+
+CONTAINERS
+
+
+| **`flash-attention`** | |
+| :-- | :-- |
+|    Requires | `L4T ['>=35']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) |
+|    Dockerfile | [`Dockerfile`](Dockerfile) |
+
+
+
+RUN CONTAINER
+
+
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+```bash
+# automatically pull or build a compatible container image
+jetson-containers run $(autotag flash-attention)
+
+# or if using 'docker run' (specify image and mounts/etc)
+sudo docker run --runtime nvidia -it --rm --network=host flash-attention:35.2.1
+
+```
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
+> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.
+
+To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
+```bash
+jetson-containers run -v /path/on/host:/path/in/container $(autotag flash-attention)
+```
+To launch the container running a command, as opposed to an interactive shell:
+```bash
+jetson-containers run $(autotag flash-attention) my_app --abc xyz
+```
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+
+
+BUILD CONTAINER
+
+
+If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
+```bash
+jetson-containers build flash-attention
+```
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
+
diff --git a/packages/llm/gptq-for-llama/README.md b/packages/llm/gptq-for-llama/README.md
index fe6cb612a..f28ce3dd4 100644
--- a/packages/llm/gptq-for-llama/README.md
+++ b/packages/llm/gptq-for-llama/README.md
@@ -9,9 +9,8 @@
 | **`gptq-for-llama`** | |
 | :-- | :-- |
 |    Builds | [![`gptq-for-llama_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gptq-for-llama_jp60.yml?label=gptq-for-llama:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gptq-for-llama_jp60.yml) [![`gptq-for-llama_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/gptq-for-llama_jp51.yml?label=gptq-for-llama:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/gptq-for-llama_jp51.yml) |
-|    Requires | `L4T >=32.6` |
-|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) |
-|    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) |
+|    Requires | `L4T ['>=32.6']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) |
 |    Dockerfile | [`Dockerfile`](Dockerfile) |
 |    Images | [`dustynv/gptq-for-llama:r35.2.1`](https://hub.docker.com/r/dustynv/gptq-for-llama/tags) `(2023-12-06, 5.9GB)`<br/>[`dustynv/gptq-for-llama:r35.3.1`](https://hub.docker.com/r/dustynv/gptq-for-llama/tags) `(2023-12-14, 5.9GB)`<br/>[`dustynv/gptq-for-llama:r35.4.1`](https://hub.docker.com/r/dustynv/gptq-for-llama/tags) `(2023-12-15, 5.9GB)`<br/>[`dustynv/gptq-for-llama:r36.2.0`](https://hub.docker.com/r/dustynv/gptq-for-llama/tags) `(2024-01-02, 7.6GB)` |
 |    Notes | https://github.com/oobabooga/GPTQ-for-LLaMa |
@@ -38,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
 ```bash
 # automatically pull or build a compatible container image
-./run.sh $(./autotag gptq-for-llama)
+jetson-containers run $(autotag gptq-for-llama)
 
 # or explicitly specify one of the container images above
-./run.sh dustynv/gptq-for-llama:r36.2.0
+jetson-containers run dustynv/gptq-for-llama:r36.2.0
 
 # or if using 'docker run' (specify image and mounts/ect)
 sudo docker run --runtime nvidia -it --rm --network=host dustynv/gptq-for-llama:r36.2.0
 ```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
 > [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.
 
 To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
 ```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag gptq-for-llama)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag gptq-for-llama)
 ```
 To launch the container running a command, as opposed to an interactive shell:
 ```bash
-./run.sh $(./autotag gptq-for-llama) my_app --abc xyz
+jetson-containers run $(autotag gptq-for-llama) my_app --abc xyz
 ```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
 BUILD CONTAINER
@@ -68,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
 If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
 ```bash
-./build.sh gptq-for-llama
+jetson-containers build gptq-for-llama
 ```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/huggingface_hub/README.md b/packages/llm/huggingface_hub/README.md
index 2a48c9a1f..0ecf5957f 100644
--- a/packages/llm/huggingface_hub/README.md
+++ b/packages/llm/huggingface_hub/README.md
@@ -9,9 +9,9 @@
 | **`huggingface_hub`** | |
 | :-- | :-- |
 |    Builds | [![`huggingface_hub_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/huggingface_hub_jp46.yml?label=huggingface_hub:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/huggingface_hub_jp46.yml) [![`huggingface_hub_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/huggingface_hub_jp51.yml?label=huggingface_hub:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/huggingface_hub_jp51.yml) |
-|    Requires | `L4T >=32.6` |
-|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) |
-|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faster-whisper`](/packages/audio/faster-whisper) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) |
+|    Requires | `L4T ['>=32.6']` |
+|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) |
+|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faster-whisper`](/packages/audio/faster-whisper) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) |
 |    Dockerfile | [`Dockerfile`](Dockerfile) |
 |    Images | [`dustynv/huggingface_hub:r32.7.1`](https://hub.docker.com/r/dustynv/huggingface_hub/tags) `(2023-12-15, 0.4GB)`<br/>[`dustynv/huggingface_hub:r35.2.1`](https://hub.docker.com/r/dustynv/huggingface_hub/tags) `(2023-12-15, 5.0GB)`<br/>[`dustynv/huggingface_hub:r35.3.1`](https://hub.docker.com/r/dustynv/huggingface_hub/tags) `(2023-09-07, 5.0GB)`<br/>[`dustynv/huggingface_hub:r35.4.1`](https://hub.docker.com/r/dustynv/huggingface_hub/tags) `(2023-10-07, 5.0GB)` |
 |    Notes | provides `huggingface-cli` and `huggingface-downloader` tools |
@@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
 ```bash
 # automatically pull or build a compatible container image
-./run.sh $(./autotag huggingface_hub)
+jetson-containers run $(autotag huggingface_hub)
 
 # or explicitly specify one of the container images above
-./run.sh dustynv/huggingface_hub:r35.2.1
+jetson-containers run dustynv/huggingface_hub:r35.2.1
 
 # or if using 'docker run' (specify image and mounts/ect)
 sudo docker run --runtime nvidia -it --rm --network=host dustynv/huggingface_hub:r35.2.1
 ```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
 > [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.
 
 To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
 ```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag huggingface_hub)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag huggingface_hub)
 ```
 To launch the container running a command, as opposed to an interactive shell:
 ```bash
-./run.sh $(./autotag huggingface_hub) my_app --abc xyz
+jetson-containers run $(autotag huggingface_hub) my_app --abc xyz
 ```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
 BUILD CONTAINER
@@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
 If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
 ```bash
-./build.sh huggingface_hub
+jetson-containers build huggingface_hub
 ```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/langchain/README.md b/packages/llm/langchain/README.md index b002a6d68..8d9754866 100644 --- a/packages/llm/langchain/README.md +++ b/packages/llm/langchain/README.md @@ -16,8 +16,8 @@ Use your web browser to access `http://HOSTNAME:8888` | :-- | :-- | |    Aliases | `langchain:main` | |    Builds | [![`langchain_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/langchain_jp51.yml?label=langchain:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/langchain_jp51.yml) [![`langchain_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/langchain_jp60.yml?label=langchain:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/langchain_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) [`llama_cpp`](/packages/llm/llama_cpp) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) [`llama_cpp`](/packages/llm/llama_cpp) | |    Dependants | [`langchain:samples`](/packages/llm/langchain) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/langchain:r35.2.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2023-12-06, 5.6GB)`
[`dustynv/langchain:r35.3.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2023-12-19, 5.6GB)`
[`dustynv/langchain:r35.4.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-01-24, 5.6GB)`
[`dustynv/langchain:r36.2.0`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-01-24, 7.3GB)`
[`dustynv/langchain:samples-r35.2.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-01-24, 6.0GB)`
[`dustynv/langchain:samples-r35.3.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-01-24, 6.0GB)`
[`dustynv/langchain:samples-r35.4.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-03-07, 6.2GB)`
[`dustynv/langchain:samples-r36.2.0`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-03-07, 7.8GB)` | @@ -25,8 +25,8 @@ Use your web browser to access `http://HOSTNAME:8888` | **`langchain:samples`** | | | :-- | :-- | |    Builds | [![`langchain-samples_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/langchain-samples_jp60.yml?label=langchain-samples:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/langchain-samples_jp60.yml) [![`langchain-samples_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/langchain-samples_jp51.yml?label=langchain-samples:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/langchain-samples_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) [`llama_cpp`](/packages/llm/llama_cpp) [`langchain:main`](/packages/llm/langchain) [`rust`](/packages/rust) [`jupyterlab`](/packages/jupyterlab) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`huggingface_hub`](/packages/llm/huggingface_hub) [`llama_cpp`](/packages/llm/llama_cpp) [`langchain:main`](/packages/llm/langchain) [`rust`](/packages/build/rust) [`jupyterlab`](/packages/jupyterlab) | |    Dockerfile | [`Dockerfile.samples`](Dockerfile.samples) | |    Images | 
[`dustynv/langchain:samples-r35.2.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-01-24, 6.0GB)`
[`dustynv/langchain:samples-r35.3.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-01-24, 6.0GB)`
[`dustynv/langchain:samples-r35.4.1`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-03-07, 6.2GB)`
[`dustynv/langchain:samples-r36.2.0`](https://hub.docker.com/r/dustynv/langchain/tags) `(2024-03-07, 7.8GB)` | @@ -56,29 +56,29 @@ Use your web browser to access `http://HOSTNAME:8888`
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag langchain) +jetson-containers run $(autotag langchain) # or explicitly specify one of the container images above -./run.sh dustynv/langchain:samples-r36.2.0 +jetson-containers run dustynv/langchain:samples-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/langchain:samples-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag langchain) +jetson-containers run -v /path/on/host:/path/in/container $(autotag langchain) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag langchain) my_app --abc xyz +jetson-containers run $(autotag langchain) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -86,7 +86,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh langchain +jetson-containers build langchain ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/llama_cpp/README.md b/packages/llm/llama_cpp/README.md index 1aff1aa39..aebee1206 100644 --- a/packages/llm/llama_cpp/README.md +++ b/packages/llm/llama_cpp/README.md @@ -57,23 +57,13 @@ To use the Python API and [`benchmark.py`](/packages/llm/llama_cpp/benchmark.py) CONTAINERS
-| **`llama_cpp:ggml`** | | -| :-- | :-- | -|    Builds | [![`llama_cpp-ggml_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llama_cpp-ggml_jp51.yml?label=llama_cpp-ggml:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llama_cpp-ggml_jp51.yml) [![`llama_cpp-ggml_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llama_cpp-ggml_jp60.yml?label=llama_cpp-ggml:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llama_cpp-ggml_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`huggingface_hub`](/packages/llm/huggingface_hub) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/llama_cpp:ggml-r35.2.1`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-05, 5.2GB)`
[`dustynv/llama_cpp:ggml-r35.3.1`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-06, 5.2GB)`
[`dustynv/llama_cpp:ggml-r35.4.1`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-19, 5.2GB)`
[`dustynv/llama_cpp:ggml-r36.2.0`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-19, 5.1GB)` | - -| **`llama_cpp:gguf`** | | +| **`llama_cpp:0.2.57`** | | | :-- | :-- | |    Aliases | `llama_cpp` | -|    Builds | [![`llama_cpp-gguf_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llama_cpp-gguf_jp60.yml?label=llama_cpp-gguf:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llama_cpp-gguf_jp60.yml) [![`llama_cpp-gguf_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llama_cpp-gguf_jp51.yml?label=llama_cpp-gguf:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llama_cpp-gguf_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`huggingface_hub`](/packages/llm/huggingface_hub) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`huggingface_hub`](/packages/llm/huggingface_hub) | |    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/llama_cpp:gguf-r35.2.1`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-15, 5.1GB)`
[`dustynv/llama_cpp:gguf-r35.3.1`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-19, 5.2GB)`
[`dustynv/llama_cpp:gguf-r35.4.1`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-15, 5.1GB)`
[`dustynv/llama_cpp:gguf-r36.2.0`](https://hub.docker.com/r/dustynv/llama_cpp/tags) `(2023-12-19, 5.1GB)` | @@ -105,29 +95,29 @@ To use the Python API and [`benchmark.py`](/packages/llm/llama_cpp/benchmark.py) RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag llama_cpp) +jetson-containers run $(autotag llama_cpp) # or explicitly specify one of the container images above -./run.sh dustynv/llama_cpp:r36.2.0 +jetson-containers run dustynv/llama_cpp:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/llama_cpp:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag llama_cpp) +jetson-containers run -v /path/on/host:/path/in/container $(autotag llama_cpp) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag llama_cpp) my_app --abc xyz +jetson-containers run $(autotag llama_cpp) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -135,7 +125,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh llama_cpp +jetson-containers build llama_cpp ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/llamaspeak/README.md b/packages/llm/llamaspeak/README.md index 124b9b40d..b97a620c5 100644 --- a/packages/llm/llamaspeak/README.md +++ b/packages/llm/llamaspeak/README.md @@ -93,8 +93,8 @@ The default port is `8050`, but that can be changed with the `--port` argument. | **`llamaspeak`** | | | :-- | :-- | |    Builds | [![`llamaspeak_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llamaspeak_jp51.yml?label=llamaspeak:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llamaspeak_jp51.yml) [![`llamaspeak_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llamaspeak_jp60.yml?label=llamaspeak:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llamaspeak_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) [`riva-client:python`](/packages/audio/riva-client) [`numpy`](/packages/numpy) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`riva-client:python`](/packages/audio/riva-client) [`numpy`](/packages/numpy) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/llamaspeak:r35.2.1`](https://hub.docker.com/r/dustynv/llamaspeak/tags) `(2023-09-07, 5.0GB)`
[`dustynv/llamaspeak:r35.3.1`](https://hub.docker.com/r/dustynv/llamaspeak/tags) `(2023-08-29, 5.0GB)`
[`dustynv/llamaspeak:r35.4.1`](https://hub.docker.com/r/dustynv/llamaspeak/tags) `(2023-12-05, 5.0GB)` | @@ -119,29 +119,29 @@ The default port is `8050`, but that can be changed with the `--port` argument.
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag llamaspeak) +jetson-containers run $(autotag llamaspeak) # or explicitly specify one of the container images above -./run.sh dustynv/llamaspeak:r35.4.1 +jetson-containers run dustynv/llamaspeak:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/llamaspeak:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag llamaspeak) +jetson-containers run -v /path/on/host:/path/in/container $(autotag llamaspeak) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag llamaspeak) my_app --abc xyz +jetson-containers run $(autotag llamaspeak) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -149,7 +149,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh llamaspeak +jetson-containers build llamaspeak ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/llava/README.md b/packages/llm/llava/README.md index da6ef4685..3aff6277f 100644 --- a/packages/llm/llava/README.md +++ b/packages/llm/llava/README.md @@ -60,8 +60,8 @@ ASSISTANT: The environment is a desert setting, with a mountain in the backgroun | **`llava`** | | | :-- | :-- | |    Builds | [![`llava_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llava_jp60.yml?label=llava:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llava_jp60.yml) [![`llava_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/llava_jp51.yml?label=llava:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/llava_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/llava:r35.2.1`](https://hub.docker.com/r/dustynv/llava/tags) `(2023-12-15, 6.3GB)`
[`dustynv/llava:r35.3.1`](https://hub.docker.com/r/dustynv/llava/tags) `(2023-12-12, 6.3GB)`
[`dustynv/llava:r35.4.1`](https://hub.docker.com/r/dustynv/llava/tags) `(2023-12-14, 6.3GB)`
[`dustynv/llava:r36.2.0`](https://hub.docker.com/r/dustynv/llava/tags) `(2023-12-18, 8.0GB)` | @@ -87,29 +87,29 @@ ASSISTANT: The environment is a desert setting, with a mountain in the backgroun
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag llava) +jetson-containers run $(autotag llava) # or explicitly specify one of the container images above -./run.sh dustynv/llava:r36.2.0 +jetson-containers run dustynv/llava:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/llava:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag llava) +jetson-containers run -v /path/on/host:/path/in/container $(autotag llava) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag llava) my_app --abc xyz +jetson-containers run $(autotag llava) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
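`autotag` selects an image whose `rXX.Y.Z` tag matches the host's L4T release. A toy sketch of that matching idea, with a hypothetical `pick_tag` helper (the real tool also checks local images and registries, and can fall back to building):

```shell
# Toy stand-in for `autotag`: given the host L4T version, print the first
# candidate tag built against the same L4T major release line.
pick_tag() {
  local l4t_major="${1%%.*}"; shift   # e.g. "35.4.1" -> "35"
  for tag in "$@"; do
    case "$tag" in
      *":r${l4t_major}."*) echo "$tag"; return 0 ;;
    esac
  done
  return 1   # no compatible tag; the real tool would offer to build one
}

pick_tag 35.4.1 dustynv/llava:r36.2.0 dustynv/llava:r35.4.1
```

This is why a JetPack 5 (L4T r35.x) host picks up the `r35.4.1` images listed above rather than the `r36.2.0` ones.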
BUILD CONTAINER @@ -117,7 +117,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh llava +jetson-containers build llava ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/local_llm/README.md b/packages/llm/local_llm/README.md index df87d52d2..092037f35 100644 --- a/packages/llm/local_llm/README.md +++ b/packages/llm/local_llm/README.md @@ -240,9 +240,9 @@ You can also tag incoming images and add them to the database using the panel in | :-- | :-- | |    Builds | [![`local_llm_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/local_llm_jp60.yml?label=local_llm:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/local_llm_jp60.yml) [![`local_llm_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/local_llm_jp51.yml?label=local_llm:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/local_llm_jp51.yml) | |    Requires | `L4T ['>=34.1.0']` | -|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.2`](/packages/cuda/cuda) [`cudnn:8.9`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) [`nanodb`](/packages/vectordb/nanodb) [`mlc`](/packages/llm/mlc) [`riva-client:python`](/packages/audio/riva-client) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`onnxruntime`](/packages/onnxruntime) [`torchaudio`](/packages/pytorch/torchaudio) [`xtts`](/packages/audio/xtts) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) 
[`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) [`nanodb`](/packages/vectordb/nanodb) [`mlc`](/packages/llm/mlc) [`riva-client:python`](/packages/audio/riva-client) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`onnxruntime`](/packages/onnxruntime) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/local_llm:dev-r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-03, 10.3GB)`
[`dustynv/local_llm:r35.2.1`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-02-22, 8.8GB)`
[`dustynv/local_llm:r35.3.1`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-02-22, 8.8GB)`
[`dustynv/local_llm:r35.4.1`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-02-22, 8.8GB)`
[`dustynv/local_llm:r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-17, 11.3GB)`
[`dustynv/local_llm:r36.2.0-20240127`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-07, 11.3GB)`
[`dustynv/local_llm:r36.2.0-20240303`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-07, 10.3GB)`
[`dustynv/local_llm:r36.2.0-20240309`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-11, 10.5GB)`
[`dustynv/local_llm:r36.2.0-20240315`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-16, 11.3GB)` | +|    Images | [`dustynv/local_llm:dev-r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-03, 10.3GB)`
[`dustynv/local_llm:r35.2.1`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-02-22, 8.8GB)`
[`dustynv/local_llm:r35.3.1`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-02-22, 8.8GB)`
[`dustynv/local_llm:r35.4.1`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-04-10, 8.3GB)`
[`dustynv/local_llm:r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-04-10, 9.7GB)`
[`dustynv/local_llm:r36.2.0-20240127`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-07, 11.3GB)`
[`dustynv/local_llm:r36.2.0-20240303`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-07, 10.3GB)`
[`dustynv/local_llm:r36.2.0-20240309`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-11, 10.5GB)`
[`dustynv/local_llm:r36.2.0-20240315`](https://hub.docker.com/r/dustynv/local_llm/tags) `(2024-03-16, 11.3GB)` | @@ -255,8 +255,8 @@ You can also tag incoming images and add them to the database using the panel in |   [`dustynv/local_llm:dev-r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-03-03` | `arm64` | `10.3GB` | |   [`dustynv/local_llm:r35.2.1`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-02-22` | `arm64` | `8.8GB` | |   [`dustynv/local_llm:r35.3.1`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-02-22` | `arm64` | `8.8GB` | -|   [`dustynv/local_llm:r35.4.1`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-02-22` | `arm64` | `8.8GB` | -|   [`dustynv/local_llm:r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-03-17` | `arm64` | `11.3GB` | +|   [`dustynv/local_llm:r35.4.1`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-04-10` | `arm64` | `8.3GB` | +|   [`dustynv/local_llm:r36.2.0`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-04-10` | `arm64` | `9.7GB` | |   [`dustynv/local_llm:r36.2.0-20240127`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-03-07` | `arm64` | `11.3GB` | |   [`dustynv/local_llm:r36.2.0-20240303`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-03-07` | `arm64` | `10.3GB` | |   [`dustynv/local_llm:r36.2.0-20240309`](https://hub.docker.com/r/dustynv/local_llm/tags) | `2024-03-11` | `arm64` | `10.5GB` | @@ -271,29 +271,29 @@ You can also tag incoming images and add them to the database using the panel in
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag local_llm) +jetson-containers run $(autotag local_llm) # or explicitly specify one of the container images above -./run.sh dustynv/local_llm:r36.2.0 +jetson-containers run dustynv/local_llm:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host dustynv/local_llm:r36.2.0 +sudo docker run --runtime nvidia -it --rm --network=host dustynv/local_llm:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag local_llm) +jetson-containers run -v /path/on/host:/path/in/container $(autotag local_llm) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag local_llm) my_app --abc xyz +jetson-containers run $(autotag local_llm) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -301,7 +301,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh local_llm +jetson-containers build local_llm ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/minigpt4/README.md b/packages/llm/minigpt4/README.md index 32307613f..c8153c215 100644 --- a/packages/llm/minigpt4/README.md +++ b/packages/llm/minigpt4/README.md @@ -38,8 +38,8 @@ Then navigate your browser to `http://HOSTNAME:7860` | **`minigpt4`** | | | :-- | :-- | |    Builds | [![`minigpt4_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/minigpt4_jp51.yml?label=minigpt4:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/minigpt4_jp51.yml) [![`minigpt4_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/minigpt4_jp60.yml?label=minigpt4:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/minigpt4_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/minigpt4:r35.2.1`](https://hub.docker.com/r/dustynv/minigpt4/tags) `(2023-12-11, 5.9GB)`
[`dustynv/minigpt4:r35.3.1`](https://hub.docker.com/r/dustynv/minigpt4/tags) `(2023-12-15, 5.9GB)`
[`dustynv/minigpt4:r35.4.1`](https://hub.docker.com/r/dustynv/minigpt4/tags) `(2023-12-14, 5.9GB)`
[`dustynv/minigpt4:r36.2.0`](https://hub.docker.com/r/dustynv/minigpt4/tags) `(2023-12-15, 7.6GB)` | @@ -65,29 +65,29 @@ Then navigate your browser to `http://HOSTNAME:7860`
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag minigpt4) +jetson-containers run $(autotag minigpt4) # or explicitly specify one of the container images above -./run.sh dustynv/minigpt4:r35.3.1 +jetson-containers run dustynv/minigpt4:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/minigpt4:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag minigpt4) +jetson-containers run -v /path/on/host:/path/in/container $(autotag minigpt4) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag minigpt4) my_app --abc xyz +jetson-containers run $(autotag minigpt4) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -95,7 +95,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh minigpt4 +jetson-containers build minigpt4 ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/mlc/README.md b/packages/llm/mlc/README.md index 1cb34b0de..96f274b1c 100644 --- a/packages/llm/mlc/README.md +++ b/packages/llm/mlc/README.md @@ -57,129 +57,24 @@ The prefill time is how long the model takes to process the input context before CONTAINERS
-| **`mlc:dev-builder`** | | -| :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/731616e) commit SHA [`731616e`](https://github.com/mlc-ai/mlc-llm/tree/731616e) | - -| **`mlc:dev`** | | -| :-- | :-- | -|    Builds | [![`mlc-dev_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/mlc-dev_jp60.yml?label=mlc-dev:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/mlc-dev_jp60.yml) [![`mlc-dev_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/mlc-dev_jp51.yml?label=mlc-dev:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/mlc-dev_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/mlc:dev-r35.3.1`](https://hub.docker.com/r/dustynv/mlc/tags) `(2023-10-30, 9.0GB)`
[`dustynv/mlc:dev-r35.4.1`](https://hub.docker.com/r/dustynv/mlc/tags) `(2023-12-16, 9.4GB)`
[`dustynv/mlc:dev-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2023-12-16, 10.6GB)` | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/731616e) commit SHA [`731616e`](https://github.com/mlc-ai/mlc-llm/tree/731616e) | - -| **`mlc:9bf5723-builder`** | | -| :-- | :-- | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/9bf5723) commit SHA [`9bf5723`](https://github.com/mlc-ai/mlc-llm/tree/9bf5723) | - -| **`mlc:9bf5723`** | | -| :-- | :-- | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/9bf5723) commit SHA [`9bf5723`](https://github.com/mlc-ai/mlc-llm/tree/9bf5723) | - -| **`mlc:51fb0f4-builder`** | | -| :-- | :-- | -|    Aliases | `mlc:builder` | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) 
[`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Images | [`dustynv/mlc:51fb0f4-builder-r35.4.1`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 9.5GB)`
[`dustynv/mlc:51fb0f4-builder-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 10.6GB)` | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/51fb0f4) commit SHA [`51fb0f4`](https://github.com/mlc-ai/mlc-llm/tree/51fb0f4) | - | **`mlc:51fb0f4`** | | | :-- | :-- | |    Aliases | `mlc` | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`local_llm`](/packages/llm/local_llm) | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | +|    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/mlc:51fb0f4-builder-r35.4.1`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 9.5GB)`
[`dustynv/mlc:51fb0f4-builder-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 10.6GB)` | |    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/51fb0f4) commit SHA [`51fb0f4`](https://github.com/mlc-ai/mlc-llm/tree/51fb0f4) | -| **`mlc:3feed05-builder`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Images | [`dustynv/mlc:3feed05-builder-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 10.8GB)` | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/3feed05) commit SHA [`3feed05`](https://github.com/mlc-ai/mlc-llm/tree/3feed05) | - -| **`mlc:3feed05`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/mlc:3feed05-builder-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 10.8GB)`
[`dustynv/mlc:3feed05-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-16, 9.6GB)` | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/3feed05) commit SHA [`3feed05`](https://github.com/mlc-ai/mlc-llm/tree/3feed05) | - -| **`mlc:5584cac-builder`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/5584cac) commit SHA [`5584cac`](https://github.com/mlc-ai/mlc-llm/tree/5584cac) | - -| **`mlc:5584cac`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/mlc:5584cac-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-22, 9.6GB)` | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/5584cac) commit SHA [`5584cac`](https://github.com/mlc-ai/mlc-llm/tree/5584cac) | - -| **`mlc:607dc5a-builder`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | 
[`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/607dc5a) commit SHA [`607dc5a`](https://github.com/mlc-ai/mlc-llm/tree/607dc5a) | - | **`mlc:607dc5a`** | | | :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Requires | `L4T ['>=36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/mlc:607dc5a-r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) `(2024-02-27, 9.6GB)` | |    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/607dc5a) commit SHA 
[`607dc5a`](https://github.com/mlc-ai/mlc-llm/tree/607dc5a) | -| **`mlc:1f70d71-builder`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/1f70d71) commit SHA [`1f70d71`](https://github.com/mlc-ai/mlc-llm/tree/1f70d71) | - -| **`mlc:1f70d71`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/1f70d71) commit SHA [`1f70d71`](https://github.com/mlc-ai/mlc-llm/tree/1f70d71) | - -| **`mlc:731616e-builder`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) 
[`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/731616e) commit SHA [`731616e`](https://github.com/mlc-ai/mlc-llm/tree/731616e) | - -| **`mlc:731616e`** | | -| :-- | :-- | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Notes | [mlc-ai/mlc-llm](https://github.com/mlc-ai/mlc-llm/tree/731616e) commit SHA [`731616e`](https://github.com/mlc-ai/mlc-llm/tree/731616e) | -
@@ -201,7 +96,7 @@ The prefill time is how long the model takes to process the input context before |   [`dustynv/mlc:r35.2.1`](https://hub.docker.com/r/dustynv/mlc/tags) | `2023-12-16` | `arm64` | `9.4GB` | |   [`dustynv/mlc:r35.3.1`](https://hub.docker.com/r/dustynv/mlc/tags) | `2023-11-05` | `arm64` | `8.9GB` | |   [`dustynv/mlc:r35.4.1`](https://hub.docker.com/r/dustynv/mlc/tags) | `2024-01-27` | `arm64` | `9.4GB` | -|   [`dustynv/mlc:r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) | `2024-01-27` | `arm64` | `10.6GB` | +|   [`dustynv/mlc:r36.2.0`](https://hub.docker.com/r/dustynv/mlc/tags) | `2024-03-09` | `arm64` | `9.6GB` | > Container images are compatible with other minor versions of JetPack/L4T:
>     • L4T R32.7 containers can run on other versions of L4T R32.7 (JetPack 4.6+)
@@ -212,29 +107,29 @@ The prefill time is how long the model takes to process the input context before RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag mlc) +jetson-containers run $(autotag mlc) # or explicitly specify one of the container images above -./run.sh dustynv/mlc:607dc5a-r36.2.0 +jetson-containers run dustynv/mlc:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host dustynv/mlc:607dc5a-r36.2.0 +sudo docker run --runtime nvidia -it --rm --network=host dustynv/mlc:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag mlc) +jetson-containers run -v /path/on/host:/path/in/container $(autotag mlc) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag mlc) my_app --abc xyz +jetson-containers run $(autotag mlc) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -242,7 +137,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh mlc +jetson-containers build mlc ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/nano_llm/README.md b/packages/llm/nano_llm/README.md index 2d1f1ae83..c62eda2ca 100644 --- a/packages/llm/nano_llm/README.md +++ b/packages/llm/nano_llm/README.md @@ -17,9 +17,16 @@ | :-- | :-- | |    Aliases | `nano_llm` | |    Requires | `L4T ['>=35']` | -|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.2`](/packages/cuda/cuda) [`cudnn:8.9`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) [`nanodb`](/packages/vectordb/nanodb) [`mlc`](/packages/llm/mlc) [`riva-client:python`](/packages/audio/riva-client) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`torchaudio`](/packages/pytorch/torchaudio) [`onnxruntime`](/packages/onnxruntime) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) 
[`nanodb`](/packages/vectordb/nanodb) [`mlc`](/packages/llm/mlc) [`riva-client:python`](/packages/audio/riva-client) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`torchaudio`](/packages/pytorch/torchaudio) [`onnxruntime`](/packages/onnxruntime) | |    Dockerfile | [`Dockerfile`](Dockerfile) | +| **`nano_llm:24.4`** | | +| :-- | :-- | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) [`nanodb`](/packages/vectordb/nanodb) [`mlc`](/packages/llm/mlc) [`riva-client:python`](/packages/audio/riva-client) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`torchaudio`](/packages/pytorch/torchaudio) [`onnxruntime`](/packages/onnxruntime) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Images | [`dustynv/nano_llm:24.4-r35.4.1`](https://hub.docker.com/r/dustynv/nano_llm/tags) `(2024-04-15, 8.5GB)`
[`dustynv/nano_llm:24.4-r36.2.0`](https://hub.docker.com/r/dustynv/nano_llm/tags) `(2024-04-15, 9.7GB)` | +
diff --git a/packages/llm/ollama/README.md b/packages/llm/ollama/README.md index 132d1c962..1ca7f0687 100644 --- a/packages/llm/ollama/README.md +++ b/packages/llm/ollama/README.md @@ -29,7 +29,7 @@ Start the Ollama front-end with your desired model (for example: mistral 7b) | **`ollama`** | | | :-- | :-- | |    Requires | `L4T ['>=34.1.0']` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/ollama:r35.4.1`](https://hub.docker.com/r/dustynv/ollama/tags) `(2024-04-05, 5.4GB)`
[`dustynv/ollama:r36.2.0`](https://hub.docker.com/r/dustynv/ollama/tags) `(2024-04-05, 3.9GB)` | @@ -53,29 +53,29 @@ Start the Ollama front-end with your desired model (for example: mistral 7b)
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag ollama) +jetson-containers run $(autotag ollama) # or explicitly specify one of the container images above -./run.sh dustynv/ollama:r35.4.1 +jetson-containers run dustynv/ollama:r35.4.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/ollama:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag ollama) +jetson-containers run -v /path/on/host:/path/in/container $(autotag ollama) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag ollama) my_app --abc xyz +jetson-containers run $(autotag ollama) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -83,7 +83,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh ollama +jetson-containers build ollama ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
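The `autotag` behavior described in these READMEs (finding an image whose L4T version is compatible with the host) can be sketched roughly as follows. This is a toy illustration with made-up helper names, not the actual `autotag` implementation:

```python
# Toy sketch of L4T-version compatibility matching (NOT the real autotag code).
# Assumes requirements like '>=34.1.0' and version strings like 'r35.4.1'.
import operator
import re

OPS = {'>=': operator.ge, '<=': operator.le, '==': operator.eq,
       '>': operator.gt, '<': operator.lt}

def parse(version):
    """'r35.4.1' -> (35, 4, 1)"""
    return tuple(int(x) for x in version.lstrip('rR').split('.'))

def satisfies(version, requirement):
    """Check one requirement string like '>=34.1.0' against an L4T version."""
    op, _, req = re.match(r'([<>=!]+)?\s*(r)?([\d.]+)', requirement).groups()
    a, b = parse(version), parse(req)
    n = max(len(a), len(b))           # pad so '34.1' compares against '34.1.0'
    a += (0,) * (n - len(a))
    b += (0,) * (n - len(b))
    return OPS.get(op or '==', operator.eq)(a, b)

def pick_image(host_l4t, images):
    """Return the first tag whose rXX.Y.Z suffix shares the host's L4T major release."""
    for tag in images:
        tag_l4t = tag.rsplit(':r', 1)[-1]
        if parse(tag_l4t)[0] == parse(host_l4t)[0]:
            return tag
    return None

print(satisfies('r35.4.1', '>=34.1.0'))   # True
print(pick_image('r35.3.1', ['dustynv/ollama:r36.2.0', 'dustynv/ollama:r35.4.1']))
```

The real tool also falls back to pulling from a registry or building locally when no local image matches; this sketch only shows the version check.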
diff --git a/packages/llm/openai/README.md b/packages/llm/openai/README.md new file mode 100644 index 000000000..4147bbd37 --- /dev/null +++ b/packages/llm/openai/README.md @@ -0,0 +1,54 @@ +# openai + +> [`CONTAINERS`](#user-content-containers) [`IMAGES`](#user-content-images) [`RUN`](#user-content-run) [`BUILD`](#user-content-build) + +
+CONTAINERS +
+ +| **`openai`** | | +| :-- | :-- | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | +|    Dependants | [`l4t-text-generation`](/packages/l4t/l4t-text-generation) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | OpenAI API Python client from https://github.com/openai/openai-python | + +
+ +
+RUN CONTAINER +
+ +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +```bash +# automatically pull or build a compatible container image +jetson-containers run $(autotag openai) + +# or if using 'docker run' (specify image and mounts/etc) +sudo docker run --runtime nvidia -it --rm --network=host openai:35.2.1 + +``` +> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. + +To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: +```bash +jetson-containers run -v /path/on/host:/path/in/container $(autotag openai) +``` +To launch the container running a command, as opposed to an interactive shell: +```bash +jetson-containers run $(autotag openai) my_app --abc xyz +``` +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +
+
+BUILD CONTAINER +
+ +If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: +```bash +jetson-containers build openai +``` +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options. +
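The `openai` container above only ships the OpenAI Python client, which can point at any OpenAI-compatible endpoint. As a rough stdlib-only sketch of the HTTP request such a client ultimately sends (the localhost URL, port, and model name below are hypothetical placeholders, not values from these READMEs):

```python
# Sketch of the JSON payload behind an OpenAI-style chat completion,
# built with the stdlib only (no network call is made here).
import json
from urllib.request import Request

def chat_request(base_url, model, prompt):
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": "Bearer not-needed"},  # local servers often ignore the key
        method="POST",
    )

req = chat_request("http://localhost:8000/v1", "my-local-model", "Hello!")
print(req.full_url)      # http://localhost:8000/v1/chat/completions
print(req.get_method())  # POST
```

Sending the request (e.g. with `urllib.request.urlopen`) of course requires a server actually listening on that endpoint.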
diff --git a/packages/llm/optimum/README.md b/packages/llm/optimum/README.md index e1251f104..ab971a329 100644 --- a/packages/llm/optimum/README.md +++ b/packages/llm/optimum/README.md @@ -9,8 +9,8 @@ | **`optimum`** | | | :-- | :-- | |    Builds | [![`optimum_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/optimum_jp46.yml?label=optimum:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/optimum_jp46.yml) [![`optimum_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/optimum_jp51.yml?label=optimum:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/optimum_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/optimum:r32.7.1`](https://hub.docker.com/r/dustynv/optimum/tags) `(2023-12-15, 1.7GB)`
[`dustynv/optimum:r35.2.1`](https://hub.docker.com/r/dustynv/optimum/tags) `(2023-12-15, 6.1GB)`
[`dustynv/optimum:r35.3.1`](https://hub.docker.com/r/dustynv/optimum/tags) `(2023-12-14, 6.1GB)`
[`dustynv/optimum:r35.4.1`](https://hub.docker.com/r/dustynv/optimum/tags) `(2023-11-05, 6.1GB)` | |    Notes | https://github.com/huggingface/optimum | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag optimum) +jetson-containers run $(autotag optimum) # or explicitly specify one of the container images above -./run.sh dustynv/optimum:r35.2.1 +jetson-containers run dustynv/optimum:r35.2.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/optimum:r35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag optimum) +jetson-containers run -v /path/on/host:/path/in/container $(autotag optimum) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag optimum) my_app --abc xyz +jetson-containers run $(autotag optimum) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh optimum +jetson-containers build optimum ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/tensorrt_llm/README.md b/packages/llm/tensorrt_llm/README.md new file mode 100644 index 000000000..6f0f64ac6 --- /dev/null +++ b/packages/llm/tensorrt_llm/README.md @@ -0,0 +1,78 @@ +# tensorrt_llm + +> [`CONTAINERS`](#user-content-containers) [`IMAGES`](#user-content-images) [`RUN`](#user-content-run) [`BUILD`](#user-content-build) + +
+CONTAINERS +
+ +| **`tensorrt_llm:0.10.dev0`** | | +| :-- | :-- | +|    Aliases | `tensorrt_llm` | +|    Requires | `L4T ['==r36.*', '>=cu124']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`cuda-python`](/packages/cuda/cuda-python) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | The `tensorrt-llm:builder` container includes the C++ binaries under `/opt` | + +| **`tensorrt_llm:0.10.dev0-builder`** | | +| :-- | :-- | +|    Aliases | `tensorrt_llm:builder` | +|    Requires | `L4T ['==r36.*', '>=cu124']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`cuda-python`](/packages/cuda/cuda-python) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | The `tensorrt-llm:builder` container includes the C++ binaries under `/opt` | + +| **`tensorrt_llm:0.5`** | | +| :-- | :-- | +|    Aliases | `tensorrt_llm` | +|    Requires | `L4T ['==r36.*', '==cu122']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) 
[`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`cuda-python`](/packages/cuda/cuda-python) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | The `tensorrt-llm:builder` container includes the C++ binaries under `/opt` | + +| **`tensorrt_llm:0.5-builder`** | | +| :-- | :-- | +|    Aliases | `tensorrt_llm:builder` | +|    Requires | `L4T ['==r36.*', '==cu122']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`cuda-python`](/packages/cuda/cuda-python) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | The `tensorrt-llm:builder` container includes the C++ binaries under `/opt` | + +
+ +
+RUN CONTAINER +
+ +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +```bash +# automatically pull or build a compatible container image +jetson-containers run $(autotag tensorrt_llm) + +# or if using 'docker run' (specify image and mounts/etc) +sudo docker run --runtime nvidia -it --rm --network=host tensorrt_llm:35.2.1 + +``` +> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. + +To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: +```bash +jetson-containers run -v /path/on/host:/path/in/container $(autotag tensorrt_llm) +``` +To launch the container running a command, as opposed to an interactive shell: +```bash +jetson-containers run $(autotag tensorrt_llm) my_app --abc xyz +``` +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +
+
+BUILD CONTAINER +
+ +If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: +```bash +jetson-containers build tensorrt_llm +``` +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options. +
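The `tensorrt_llm` tables above carry requirements like `L4T ['==r36.*', '>=cu124']`, where the `==r36.*` form is a wildcard match on the L4T release. A toy check of that wildcard style (illustrative only; jetson-containers has its own requirement parser):

```python
# Toy check of a '==rXX.*' style requirement against an L4T version string
# (NOT the real jetson-containers parser).
from fnmatch import fnmatch

def matches_release(l4t_version, requirement):
    """True if e.g. 'r36.2.0' matches a requirement like '==r36.*'."""
    assert requirement.startswith('=='), "sketch only handles '==' wildcards"
    pattern = requirement[2:].lstrip('r')
    return fnmatch(l4t_version.lstrip('r'), pattern)

print(matches_release('r36.2.0', '==r36.*'))  # True
print(matches_release('r35.4.1', '==r36.*'))  # False
```

The `>=cu124` half of such requirements constrains the CUDA version instead and would need a separate check.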
diff --git a/packages/llm/text-generation-inference/README.md b/packages/llm/text-generation-inference/README.md index a3ef4b779..f0fadf1d3 100644 --- a/packages/llm/text-generation-inference/README.md +++ b/packages/llm/text-generation-inference/README.md @@ -9,8 +9,8 @@ | **`text-generation-inference`** | | | :-- | :-- | |    Builds | [![`text-generation-inference_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/text-generation-inference_jp51.yml?label=text-generation-inference:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/text-generation-inference_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`bitsandbytes`](/packages/llm/bitsandbytes) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`bitsandbytes`](/packages/llm/bitsandbytes) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/text-generation-inference:r35.2.1`](https://hub.docker.com/r/dustynv/text-generation-inference/tags) `(2023-11-04, 7.0GB)`
[`dustynv/text-generation-inference:r35.3.1`](https://hub.docker.com/r/dustynv/text-generation-inference/tags) `(2023-11-05, 7.0GB)`
[`dustynv/text-generation-inference:r35.4.1`](https://hub.docker.com/r/dustynv/text-generation-inference/tags) `(2023-11-05, 7.0GB)` | |    Notes | https://github.com/huggingface/text-generation-inference | @@ -36,29 +36,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag text-generation-inference) +jetson-containers run $(autotag text-generation-inference) # or explicitly specify one of the container images above -./run.sh dustynv/text-generation-inference:r35.3.1 +jetson-containers run dustynv/text-generation-inference:r35.3.1 # or if using 'docker run' (specify image and mounts/etc) sudo docker run --runtime nvidia -it --rm --network=host dustynv/text-generation-inference:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag text-generation-inference) +jetson-containers run -v /path/on/host:/path/in/container $(autotag text-generation-inference) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag text-generation-inference) my_app --abc xyz +jetson-containers run $(autotag text-generation-inference) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -66,7 +66,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh text-generation-inference +jetson-containers build text-generation-inference ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/llm/text-generation-webui/README.md b/packages/llm/text-generation-webui/README.md index 5ccd9b73e..2d7da128c 100644 --- a/packages/llm/text-generation-webui/README.md +++ b/packages/llm/text-generation-webui/README.md @@ -89,22 +89,22 @@ I'm a large language model, so I can play text-based games and answer questions | **`text-generation-webui:main`** | | | :-- | :-- | |    Aliases | `text-generation-webui` | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`auto_gptq`](/packages/llm/auto_gptq) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`bitsandbytes`](/packages/llm/bitsandbytes) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`auto_gptq`](/packages/llm/auto_gptq) [`exllama`](/packages/llm/exllama) [`llama_cpp`](/packages/llm/llama_cpp) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | 
[`dustynv/text-generation-webui:main-r36.2.0`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) `(2023-12-18, 8.1GB)` | | **`text-generation-webui:1.7`** | | | :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`auto_gptq`](/packages/llm/auto_gptq) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`bitsandbytes`](/packages/llm/bitsandbytes) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`auto_gptq`](/packages/llm/auto_gptq) [`exllama`](/packages/llm/exllama) [`llama_cpp`](/packages/llm/llama_cpp) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/text-generation-webui:1.7-r35.4.1`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) `(2023-12-05, 6.4GB)` | | **`text-generation-webui:6a7cd01`** | | | :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) 
[`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`auto_gptq`](/packages/llm/auto_gptq) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`bitsandbytes`](/packages/llm/bitsandbytes) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`auto_gptq`](/packages/llm/auto_gptq) [`exllama`](/packages/llm/exllama) [`llama_cpp`](/packages/llm/llama_cpp) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -120,6 +120,7 @@ I'm a large language model, so I can play text-based games and answer questions |   [`dustynv/text-generation-webui:r35.2.1`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) | `2024-02-01` | `arm64` | `6.6GB` | |   [`dustynv/text-generation-webui:r35.3.1`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) | `2024-02-03` | `arm64` | `6.6GB` | |   [`dustynv/text-generation-webui:r35.4.1`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) | `2024-02-01` | `arm64` | `6.6GB` | +|   [`dustynv/text-generation-webui:r35.4.1-cp310`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) | `2024-04-12` | `arm64` | `6.4GB` | |   
[`dustynv/text-generation-webui:r36.2.0`](https://hub.docker.com/r/dustynv/text-generation-webui/tags) | `2024-02-03` | `arm64` | `8.3GB` | > Container images are compatible with other minor versions of JetPack/L4T:
@@ -131,29 +132,29 @@ I'm a large language model, so I can play text-based games and answer questions
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag text-generation-webui) +jetson-containers run $(autotag text-generation-webui) # or explicitly specify one of the container images above -./run.sh dustynv/text-generation-webui:r35.3.1 +jetson-containers run dustynv/text-generation-webui:r35.4.1-cp310 # or if using 'docker run' (specify image and mounts/etc) -sudo docker run --runtime nvidia -it --rm --network=host dustynv/text-generation-webui:r35.3.1 +sudo docker run --runtime nvidia -it --rm --network=host dustynv/text-generation-webui:r35.4.1-cp310 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag text-generation-webui) +jetson-containers run -v /path/on/host:/path/in/container $(autotag text-generation-webui) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag text-generation-webui) my_app --abc xyz +jetson-containers run $(autotag text-generation-webui) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -161,7 +162,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh text-generation-webui
+jetson-containers build text-generation-webui
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
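The `$(autotag text-generation-webui)` idiom in the commands above is plain shell command substitution: `autotag` prints the selected image name on stdout, and the shell splices it into the `jetson-containers run` argument list. A minimal sketch of the mechanism, using a hypothetical `pick_tag` stand-in (not part of jetson-containers) instead of the real `autotag`:

```shell
#!/usr/bin/env bash
# pick_tag is a hypothetical stand-in for autotag: it prints an image name
# for a fixed mock L4T version instead of probing the device.
pick_tag() {
    echo "dustynv/$1:r35.4.1"
}

# command substitution: pick_tag's stdout becomes the final argument,
# just as $(autotag ...) feeds its result to jetson-containers run
echo "would run: jetson-containers run $(pick_tag text-generation-webui)"
```

With the real tools the flow is identical - whatever `autotag` prints lands on the `jetson-containers run` command line.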
diff --git a/packages/llm/transformers/README.md b/packages/llm/transformers/README.md index dcb4ca583..f8c91e926 100644 --- a/packages/llm/transformers/README.md +++ b/packages/llm/transformers/README.md @@ -50,9 +50,9 @@ Other libraries like [`exllama`](/packages/llm/exllama), [`awq`](/packages/llm/a | **`transformers`** | | | :-- | :-- | |    Builds | [![`transformers_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/transformers_jp60.yml?label=transformers:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/transformers_jp60.yml) [![`transformers_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/transformers_jp51.yml?label=transformers:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/transformers_jp51.yml) [![`transformers_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/transformers_jp46.yml?label=transformers:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/transformers_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) 
[`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) 
[`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/transformers:git-r35.2.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 5.9GB)`
[`dustynv/transformers:git-r35.3.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-12, 5.9GB)`
[`dustynv/transformers:git-r35.4.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-11, 5.9GB)`
[`dustynv/transformers:nvgpt-r35.2.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-05, 5.9GB)`
[`dustynv/transformers:nvgpt-r35.3.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 5.9GB)`
[`dustynv/transformers:nvgpt-r35.4.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-14, 5.9GB)`
[`dustynv/transformers:r32.7.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 1.5GB)`
[`dustynv/transformers:r35.2.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-11, 5.9GB)`
[`dustynv/transformers:r35.3.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-12, 5.9GB)`
[`dustynv/transformers:r35.4.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 5.9GB)`
[`dustynv/transformers:r36.2.0`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 7.6GB)` | |    Notes | bitsandbytes and auto_gptq dependencies added on JetPack5 for 4-bit/8-bit quantization | @@ -60,8 +60,8 @@ Other libraries like [`exllama`](/packages/llm/exllama), [`awq`](/packages/llm/a | **`transformers:git`** | | | :-- | :-- | |    Builds | [![`transformers-git_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/transformers-git_jp51.yml?label=transformers-git:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/transformers-git_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/transformers:git-r35.2.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 5.9GB)`
[`dustynv/transformers:git-r35.3.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-12, 5.9GB)`
[`dustynv/transformers:git-r35.4.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-11, 5.9GB)` | |    Notes | bitsandbytes and auto_gptq dependencies added on JetPack5 for 4-bit/8-bit quantization | @@ -69,8 +69,8 @@ Other libraries like [`exllama`](/packages/llm/exllama), [`awq`](/packages/llm/a | **`transformers:nvgpt`** | | | :-- | :-- | |    Builds | [![`transformers-nvgpt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/transformers-nvgpt_jp51.yml?label=transformers-nvgpt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/transformers-nvgpt_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/transformers:nvgpt-r35.2.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-05, 5.9GB)`
[`dustynv/transformers:nvgpt-r35.3.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-15, 5.9GB)`
[`dustynv/transformers:nvgpt-r35.4.1`](https://hub.docker.com/r/dustynv/transformers/tags) `(2023-12-14, 5.9GB)` | |    Notes | bitsandbytes and auto_gptq dependencies added on JetPack5 for 4-bit/8-bit quantization | @@ -104,29 +104,29 @@ Other libraries like [`exllama`](/packages/llm/exllama), [`awq`](/packages/llm/a
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag transformers) +jetson-containers run $(autotag transformers) # or explicitly specify one of the container images above -./run.sh dustynv/transformers:nvgpt-r35.3.1 +jetson-containers run dustynv/transformers:nvgpt-r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/transformers:nvgpt-r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)<br/>
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag transformers)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag transformers)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag transformers) my_app --abc xyz
+jetson-containers run $(autotag transformers) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -134,7 +134,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh transformers
+jetson-containers build transformers
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
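The dependency rows above (`pytorch:2.2`, `torchvision`, `huggingface_hub`, etc.) can be sanity-checked from inside the container with a small version probe. This is only an illustrative sketch, not a jetson-containers tool; outside the container the modules simply report as missing:

```shell
# probe MODULE: print its version if importable, else "missing"
probe() {
    python3 - "$1" 2>/dev/null <<'EOF' || echo missing
import importlib, sys
mod = importlib.import_module(sys.argv[1])
print(getattr(mod, "__version__", "unknown"))
EOF
}

# inside the transformers container these should all print real versions
for mod in torch torchvision transformers; do
    echo "$mod: $(probe "$mod")"
done
```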
diff --git a/packages/llm/xformers/README.md b/packages/llm/xformers/README.md index ad3166f7b..b471a50b7 100644 --- a/packages/llm/xformers/README.md +++ b/packages/llm/xformers/README.md @@ -9,8 +9,8 @@ | **`xformers`** | | | :-- | :-- | |    Builds | [![`xformers_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/xformers_jp51.yml?label=xformers:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/xformers_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:distributed`](/packages/pytorch) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) | |    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/xformers:r35.2.1`](https://hub.docker.com/r/dustynv/xformers/tags) `(2023-12-06, 5.8GB)`
[`dustynv/xformers:r35.3.1`](https://hub.docker.com/r/dustynv/xformers/tags) `(2023-12-14, 5.8GB)`
[`dustynv/xformers:r35.4.1`](https://hub.docker.com/r/dustynv/xformers/tags) `(2024-01-09, 5.9GB)` | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag xformers) +jetson-containers run $(autotag xformers) # or explicitly specify one of the container images above -./run.sh dustynv/xformers:r35.4.1 +jetson-containers run dustynv/xformers:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/xformers:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)<br/>
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag xformers)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag xformers)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag xformers) my_app --abc xyz
+jetson-containers run $(autotag xformers) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh xformers
+jetson-containers build xformers
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/nemo/README.md b/packages/nemo/README.md index 295e9554c..ef8bb1db1 100644 --- a/packages/nemo/README.md +++ b/packages/nemo/README.md @@ -10,8 +10,8 @@ NVIDIA NeMo for ASR/NLP/TTS https://nvidia.github.io/NeMo/ | **`nemo`** | | | :-- | :-- | |    Builds | [![`nemo_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nemo_jp46.yml?label=nemo:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nemo_jp46.yml) [![`nemo_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nemo_jp60.yml?label=nemo:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nemo_jp60.yml) [![`nemo_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nemo_jp51.yml?label=nemo:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nemo_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`torchaudio`](/packages/pytorch/torchaudio) [`numba`](/packages/numba) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) 
[`transformers`](/packages/llm/transformers) [`torchaudio`](/packages/pytorch/torchaudio) [`numba`](/packages/numba) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/nemo:r32.7.1`](https://hub.docker.com/r/dustynv/nemo/tags) `(2023-11-05, 1.9GB)`
[`dustynv/nemo:r35.2.1`](https://hub.docker.com/r/dustynv/nemo/tags) `(2023-09-11, 7.1GB)`
[`dustynv/nemo:r35.3.1`](https://hub.docker.com/r/dustynv/nemo/tags) `(2023-09-24, 7.1GB)`
[`dustynv/nemo:r35.4.1`](https://hub.docker.com/r/dustynv/nemo/tags) `(2023-08-29, 6.9GB)`
[`dustynv/nemo:r36.2.0`](https://hub.docker.com/r/dustynv/nemo/tags) `(2023-12-15, 9.2GB)` | |    Notes | this Dockerfile gets switched out for `Dockerfile.jp4` on JetPack 4 | @@ -39,29 +39,29 @@ NVIDIA NeMo for ASR/NLP/TTS https://nvidia.github.io/NeMo/
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag nemo) +jetson-containers run $(autotag nemo) # or explicitly specify one of the container images above -./run.sh dustynv/nemo:r36.2.0 +jetson-containers run dustynv/nemo:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/nemo:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)<br/>
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag nemo)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag nemo)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag nemo) my_app --abc xyz
+jetson-containers run $(autotag nemo) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -69,7 +69,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh nemo
+jetson-containers build nemo
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/numba/README.md b/packages/numba/README.md index 14b885af1..ce9f5dcc6 100644 --- a/packages/numba/README.md +++ b/packages/numba/README.md @@ -9,8 +9,8 @@ | **`numba`** | | | :-- | :-- | |    Builds | [![`numba_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numba_jp46.yml?label=numba:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numba_jp46.yml) [![`numba_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numba_jp60.yml?label=numba:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numba_jp60.yml) [![`numba_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numba_jp51.yml?label=numba:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numba_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | |    Dependants | [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`l4t-ml`](/packages/l4t/l4t-ml) [`nemo`](/packages/nemo) [`raft`](/packages/rapids/raft) [`whisper`](/packages/audio/whisper) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/numba:r32.7.1`](https://hub.docker.com/r/dustynv/numba/tags) `(2023-09-07, 0.5GB)`
[`dustynv/numba:r35.2.1`](https://hub.docker.com/r/dustynv/numba/tags) `(2023-12-05, 5.1GB)`
[`dustynv/numba:r35.3.1`](https://hub.docker.com/r/dustynv/numba/tags) `(2023-12-06, 5.1GB)`
[`dustynv/numba:r35.4.1`](https://hub.docker.com/r/dustynv/numba/tags) `(2023-10-07, 5.1GB)`
[`dustynv/numba:r36.2.0`](https://hub.docker.com/r/dustynv/numba/tags) `(2023-12-06, 3.6GB)` | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag numba) +jetson-containers run $(autotag numba) # or explicitly specify one of the container images above -./run.sh dustynv/numba:r35.3.1 +jetson-containers run dustynv/numba:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/numba:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)<br/>
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it.

To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags:
```bash
-./run.sh -v /path/on/host:/path/in/container $(./autotag numba)
+jetson-containers run -v /path/on/host:/path/in/container $(autotag numba)
```
To launch the container running a command, as opposed to an interactive shell:
```bash
-./run.sh $(./autotag numba) my_app --abc xyz
+jetson-containers run $(autotag numba) my_app --abc xyz
```
-You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
+You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh numba
+jetson-containers build numba
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
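The `Requires: L4T >=32.6` row above is the constraint `autotag` evaluates when choosing among the image tags listed. A simplified, hypothetical sketch of that selection (the real logic lives in the `jetson_containers` Python package and does considerably more):

```shell
#!/usr/bin/env bash
# Hypothetical sketch of autotag-style selection: choose the newest image
# tag whose L4T version satisfies a minimum, e.g. ">=32.6".
newest_compatible() {
    local min=$1; shift
    for tag in "$@"; do
        ver=${tag##*:r}                    # e.g. dustynv/numba:r35.3.1 -> 35.3.1
        # keep tags whose version sorts at or above the minimum
        if [ "$(printf '%s\n' "$min" "$ver" | sort -V | head -n1)" = "$min" ]; then
            echo "$tag"
        fi
    done | sort -t: -k2 -V | tail -n1      # newest of the survivors
}

newest_compatible 32.6 \
    dustynv/numba:r32.5.0 dustynv/numba:r32.7.1 dustynv/numba:r36.2.0
# prints: dustynv/numba:r36.2.0
```

`sort -V` (GNU version sort) does the dotted-version comparison, so `32.7.1` correctly outranks `32.6` without any manual field parsing.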
diff --git a/packages/numpy/README.md b/packages/numpy/README.md index 94403e40a..87299d496 100644 --- a/packages/numpy/README.md +++ b/packages/numpy/README.md @@ -9,9 +9,9 @@ | **`numpy`** | | | :-- | :-- | |    Builds | [![`numpy_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numpy_jp46.yml?label=numpy:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numpy_jp46.yml) [![`numpy_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numpy_jp51.yml?label=numpy:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numpy_jp51.yml) [![`numpy_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/numpy_jp60.yml?label=numpy:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/numpy_jp60.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`cuda-python`](/packages/cuda/cuda-python) [`cuda-python:builder`](/packages/cuda/cuda-python) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss:be12427`](/packages/vectordb/faiss) [`faiss:be12427-builder`](/packages/vectordb/faiss) [`faiss:v1.7.3`](/packages/vectordb/faiss) [`faiss:v1.7.3-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`gptq-for-llama`](/packages/llm/gptq-for-llama) 
[`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:ggml`](/packages/llm/llama_cpp) [`llama_cpp:gguf`](/packages/llm/llama_cpp) [`llamaspeak`](/packages/llm/llamaspeak) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`numba`](/packages/numba) [`onnx`](/packages/onnx) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.8.1`](/packages/opencv) [`optimum`](/packages/llm/optimum) [`pycuda`](/packages/cuda/pycuda) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) 
[`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) | 
+|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`cuda-python:11.4`](/packages/cuda/cuda-python) [`cudf:21.10.02`](/packages/rapids/cudf) [`cudf:23.10.03`](/packages/rapids/cudf) [`cuml`](/packages/rapids/cuml) [`cupy`](/packages/cuda/cupy) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss:1.7.3`](/packages/vectordb/faiss) [`faiss:1.7.3-builder`](/packages/vectordb/faiss) [`faiss:1.7.4`](/packages/vectordb/faiss) [`faiss:1.7.4-builder`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`faster-whisper`](/packages/audio/faster-whisper) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`jupyterlab`](/packages/jupyterlab) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llama_cpp:0.2.57`](/packages/llm/llama_cpp) [`llamaspeak`](/packages/llm/llamaspeak) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) 
[`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`numba`](/packages/numba) [`onnx`](/packages/onnx) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`opencv:4.5.0`](/packages/opencv) [`opencv:4.8.1`](/packages/opencv) [`opencv:4.9.0`](/packages/opencv) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`pycuda`](/packages/cuda/pycuda) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) 
[`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) [`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/numpy:r32.7.1`](https://hub.docker.com/r/dustynv/numpy/tags) `(2023-12-05, 0.4GB)`
[`dustynv/numpy:r35.2.1`](https://hub.docker.com/r/dustynv/numpy/tags) `(2023-09-07, 5.0GB)`
[`dustynv/numpy:r35.3.1`](https://hub.docker.com/r/dustynv/numpy/tags) `(2023-12-05, 5.0GB)`
[`dustynv/numpy:r35.4.1`](https://hub.docker.com/r/dustynv/numpy/tags) `(2023-10-07, 5.0GB)`
[`dustynv/numpy:r36.2.0`](https://hub.docker.com/r/dustynv/numpy/tags) `(2023-12-06, 0.2GB)` | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag numpy) +jetson-containers run $(autotag numpy) # or explicitly specify one of the container images above -./run.sh dustynv/numpy:r36.2.0 +jetson-containers run dustynv/numpy:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/numpy:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag numpy) +jetson-containers run -v /path/on/host:/path/in/container $(autotag numpy) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag numpy) my_app --abc xyz +jetson-containers run $(autotag numpy) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh numpy +jetson-containers build numpy ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies listed above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
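These tables now render the `Requires` constraint as a Python-style list, e.g. `L4T ['>=32.6']`. If you ever need to scrape the constraint back out of a README row, a quick one-liner sketch (the row text below is a trimmed stand-in, and this is not a jetson-containers tool) might be:

```shell
# Pull the version constraint out of a README "Requires" table row.
# The row below is an abbreviated stand-in for the real table line.
row="|    Requires | \`L4T ['>=32.6']\` |"

# Capture whatever sits between ['...'] in the row.
req=$(printf '%s\n' "$row" | sed -E "s/.*\['([^']+)'\].*/\1/")
echo "$req"   # → >=32.6
```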
diff --git a/packages/onnx/README.md b/packages/onnx/README.md index b5e053bab..0e65e1c7a 100644 --- a/packages/onnx/README.md +++ b/packages/onnx/README.md @@ -9,12 +9,12 @@ | **`onnx`** | | | :-- | :-- | |    Builds | [![`onnx_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnx_jp46.yml?label=onnx:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnx_jp46.yml) [![`onnx_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnx_jp51.yml?label=onnx:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnx_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) 
[`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`optimum`](/packages/llm/optimum) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T 
['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) 
[`pytorch:2.0`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.2`](/packages/pytorch) [`pytorch:2.3`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) [`torchvision:0.11.1`](/packages/pytorch/torchvision) [`torchvision:0.15.1`](/packages/pytorch/torchvision) [`torchvision:0.16.2`](/packages/pytorch/torchvision) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/onnx:r32.7.1`](https://hub.docker.com/r/dustynv/onnx/tags) `(2023-12-11, 0.4GB)`
[`dustynv/onnx:r35.2.1`](https://hub.docker.com/r/dustynv/onnx/tags) `(2023-12-12, 5.0GB)`
[`dustynv/onnx:r35.3.1`](https://hub.docker.com/r/dustynv/onnx/tags) `(2023-12-11, 5.0GB)`
[`dustynv/onnx:r35.4.1`](https://hub.docker.com/r/dustynv/onnx/tags) `(2023-10-07, 5.0GB)` | -|    Notes | protobuf_apt is added as a dependency on JetPack 4 (newer versions of onnx build it in-tree) | +|    Notes | `protobuf_apt` is added as a dependency on JetPack 4 (newer versions of onnx build it in-tree) | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag onnx) +jetson-containers run $(autotag onnx) # or explicitly specify one of the container images above -./run.sh dustynv/onnx:r35.2.1 +jetson-containers run dustynv/onnx:r35.2.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/onnx:r35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag onnx) +jetson-containers run -v /path/on/host:/path/in/container $(autotag onnx) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag onnx) my_app --abc xyz +jetson-containers run $(autotag onnx) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh onnx +jetson-containers build onnx ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies listed above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
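As the run instructions above note, `jetson-containers run` forwards its arguments to `docker run` with some defaults added (like `--runtime nvidia` and a `/data` cache mount). A hypothetical sketch of composing such a command (the real tool's exact defaults, mount paths, and flag order may differ) could be:

```shell
# Hypothetical sketch: compose a docker run command with the defaults
# described above, then append any user-supplied flags and the image.
# The mount path and flag set here are assumptions, not the tool's actual ones.
compose_run_cmd() {
  local image="$1"; shift
  echo sudo docker run --runtime nvidia -it --rm --network=host \
       --volume "$PWD/data:/data" "$@" "$image"
}

# prints the full command rather than executing it, like the tool does
compose_run_cmd dustynv/onnx:r35.2.1 -v /path/on/host:/path/in/container
```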
diff --git a/packages/onnxruntime/README.md b/packages/onnxruntime/README.md index 7cc4c2d66..d40d0365d 100644 --- a/packages/onnxruntime/README.md +++ b/packages/onnxruntime/README.md @@ -6,15 +6,54 @@ CONTAINERS
-| **`onnxruntime`** | | +| **`onnxruntime:1.17`** | | | :-- | :-- | -|    Builds | [![`onnxruntime_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnxruntime_jp51.yml?label=onnxruntime:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnxruntime_jp51.yml) [![`onnxruntime_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnxruntime_jp60.yml?label=onnxruntime:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnxruntime_jp60.yml) [![`onnxruntime_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/onnxruntime_jp46.yml?label=onnxruntime:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/onnxruntime_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | -|    Dependants | [`efficientvit`](/packages/vit/efficientvit) [`l4t-ml`](/packages/l4t/l4t-ml) [`local_llm`](/packages/llm/local_llm) [`optimum`](/packages/llm/optimum) [`sam`](/packages/vit/sam) [`tam`](/packages/vit/tam) | +|    Aliases | `onnxruntime` | +|    Requires | `L4T ['>=36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | +|    Dependants | [`efficientvit`](/packages/vit/efficientvit) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) 
[`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`sam`](/packages/vit/sam) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/onnxruntime:r32.7.1`](https://hub.docker.com/r/dustynv/onnxruntime/tags) `(2023-12-11, 0.5GB)`
[`dustynv/onnxruntime:r35.2.1`](https://hub.docker.com/r/dustynv/onnxruntime/tags) `(2023-12-12, 5.2GB)`
[`dustynv/onnxruntime:r35.3.1`](https://hub.docker.com/r/dustynv/onnxruntime/tags) `(2023-11-13, 5.2GB)`
[`dustynv/onnxruntime:r35.4.1`](https://hub.docker.com/r/dustynv/onnxruntime/tags) `(2023-11-08, 5.1GB)`
[`dustynv/onnxruntime:r36.2.0`](https://hub.docker.com/r/dustynv/onnxruntime/tags) `(2023-12-12, 6.9GB)` | -|    Notes | the onnxruntime-gpu wheel that's built is saved in the container under /opt | +|    Notes | the `onnxruntime-gpu` wheel that's built is saved in the container under `/opt` | + +| **`onnxruntime:1.17-builder`** | | +| :-- | :-- | +|    Aliases | `onnxruntime:builder` | +|    Requires | `L4T ['>=36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | the `onnxruntime-gpu` wheel that's built is saved in the container under `/opt` | + +| **`onnxruntime:1.16.3`** | | +| :-- | :-- | +|    Aliases | `onnxruntime` | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | the `onnxruntime-gpu` wheel that's built is saved in the container under `/opt` | + +| **`onnxruntime:1.16.3-builder`** | | +| :-- | :-- | +|    Aliases | `onnxruntime:builder` | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | the `onnxruntime-gpu` wheel that's built is saved in the container under 
`/opt` | + +| **`onnxruntime:1.11`** | | +| :-- | :-- | +|    Aliases | `onnxruntime` | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | the `onnxruntime-gpu` wheel that's built is saved in the container under `/opt` | + +| **`onnxruntime:1.11-builder`** | | +| :-- | :-- | +|    Aliases | `onnxruntime:builder` | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Notes | the `onnxruntime-gpu` wheel that's built is saved in the container under `/opt` | @@ -39,29 +78,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag onnxruntime) +jetson-containers run $(autotag onnxruntime) # or explicitly specify one of the container images above -./run.sh dustynv/onnxruntime:r36.2.0 +jetson-containers run dustynv/onnxruntime:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/onnxruntime:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag onnxruntime) +jetson-containers run -v /path/on/host:/path/in/container $(autotag onnxruntime) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag onnxruntime) my_app --abc xyz +jetson-containers run $(autotag onnxruntime) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -69,7 +108,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh onnxruntime +jetson-containers build onnxruntime ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
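The run helper described in these READMEs adds defaults and prints the fully composed command before executing it. As a rough illustration of that pattern (a hypothetical simplification, not the tool's actual implementation — the default list and function name here are invented for the example):

```python
# Simplified sketch of a 'run' wrapper that forwards user arguments to
# 'docker run' with some defaults prepended (hypothetical, for illustration).
import shlex

# Example defaults, loosely mirroring the ones the docs mention
# (--runtime nvidia and a shared /data cache mount).
DEFAULTS = ["--runtime", "nvidia", "-it", "--rm", "--network=host",
            "--volume", "/data:/data"]

def build_run_command(image, user_args=None):
    """Compose the full 'docker run' command: defaults first, then the
    user's own options, then the image."""
    cmd = ["docker", "run"] + DEFAULTS + (user_args or []) + [image]
    # print the full command before executing it, like the helper does
    print(" ".join(shlex.quote(c) for c in cmd))
    return cmd

cmd = build_run_command("dustynv/onnxruntime:r36.2.0",
                        ["-v", "/path/on/host:/path/in/container"])
```

Putting the defaults before the user's arguments lets later flags override earlier ones where `docker run` permits it.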
diff --git a/packages/openai-triton/README.md b/packages/openai-triton/README.md index d6dce1b72..e32b5d367 100644 --- a/packages/openai-triton/README.md +++ b/packages/openai-triton/README.md @@ -6,12 +6,19 @@ CONTAINERS
+| **`openai-triton:builder`** | | +| :-- | :-- | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | +|    Notes | The OpenAI `triton` (https://github.com/openai/triton) wheel that's built is saved in the container under `/opt`. Based on https://cloud.tencent.com/developer/article/2317398, https://zhuanlan.zhihu.com/p/681714973, https://zhuanlan.zhihu.com/p/673525339 | + | **`openai-triton`** | | | :-- | :-- | -|    Requires | `L4T >=35` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`numpy`](/packages/numpy) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Notes | The openai-triton wheel that's built is saved in the container under /opt. Based on https://cloud.tencent.com/developer/article/2317398, https://zhuanlan.zhihu.com/p/681714973, https://zhuanlan.zhihu.com/p/673525339 | +|    Notes | The OpenAI `triton` (https://github.com/openai/triton) wheel that's built is saved in the container under `/opt`. 
Based on https://cloud.tencent.com/developer/article/2317398, https://zhuanlan.zhihu.com/p/681714973, https://zhuanlan.zhihu.com/p/673525339 | @@ -19,27 +26,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag openai-triton) +jetson-containers run $(autotag openai-triton) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host openai-triton:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag openai-triton) +jetson-containers run -v /path/on/host:/path/in/container $(autotag openai-triton) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag openai-triton) my_app --abc xyz +jetson-containers run $(autotag openai-triton) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -47,7 +54,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh openai-triton +jetson-containers build openai-triton ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
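The `Requires` rows in these tables express L4T version constraints like `==35.*` or `>=35`, and `autotag` picks images whose constraint matches the host. A toy matcher for those spec strings might look like this — a simplification for illustration, not autotag's actual code:

```python
# Toy matcher for L4T 'Requires' specs such as '==35.*', '>=35', '<=35'
# (illustrative only; the real autotag logic lives in jetson_containers).
import fnmatch
import re

def l4t_matches(version, spec):
    """Check an L4T version string (e.g. '35.4.1') against one spec."""
    op, target = re.match(r'(==|>=|<=)(.+)', spec).groups()
    if '*' in target:
        # wildcard specs like '35.*' only make sense with equality
        return op == '==' and fnmatch.fnmatch(version, target)
    v = [int(x) for x in version.split('.')]
    t = [int(x) for x in target.split('.')]
    v = v[:len(t)]  # compare only as many components as the spec gives
    if op == '>=':
        return v >= t
    if op == '<=':
        return v <= t
    return v == t

print(l4t_matches('35.4.1', '>=35'))    # matches the openai-triton requirement
print(l4t_matches('32.7.1', '==32.*'))  # matches e.g. the JetPack 4 packages
```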
diff --git a/packages/opencv/README.md b/packages/opencv/README.md index 449ebab6a..b26933407 100644 --- a/packages/opencv/README.md +++ b/packages/opencv/README.md @@ -10,20 +10,27 @@ | :-- | :-- | |    Aliases | `opencv` | |    Builds | [![`opencv-481_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/opencv-481_jp60.yml?label=opencv-481:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/opencv-481_jp60.yml) | -|    Requires | `L4T ==36.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`numpy`](/packages/numpy) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`local_llm`](/packages/llm/local_llm) [`nanoowl`](/packages/vit/nanoowl) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) | -|    Dockerfile | 
[`Dockerfile`](Dockerfile) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`gstreamer`](/packages/gstreamer) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanoowl`](/packages/vit/nanoowl) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | |    Images | [`dustynv/opencv:4.8.1-r36.2.0`](https://hub.docker.com/r/dustynv/opencv/tags) `(2023-12-07, 5.1GB)` | -|    Notes | install OpenCV (with CUDA) from binaries built by opencv_builder | +|    Notes | install OpenCV (with CUDA) from binaries built by `opencv_builder` | + +| **`opencv:4.9.0`** | | +| :-- | :-- 
| +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | +|    Notes | install OpenCV (with CUDA) from binaries built by `opencv_builder` | | **`opencv:4.5.0`** | | | :-- | :-- | |    Aliases | `opencv` | -|    Requires | `L4T ==32.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`numpy`](/packages/numpy) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Notes | install OpenCV (with CUDA) from binaries built by opencv_builder | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | +|    Notes | install OpenCV (with CUDA) from binaries built by `opencv_builder` | @@ -48,29 +55,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag opencv) +jetson-containers run $(autotag opencv) # or explicitly specify one of the container images above -./run.sh dustynv/opencv:4.8.1-r36.2.0 +jetson-containers run dustynv/opencv:4.8.1-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/opencv:4.8.1-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag opencv) +jetson-containers run -v /path/on/host:/path/in/container $(autotag opencv) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag opencv) my_app --abc xyz +jetson-containers run $(autotag opencv) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -78,7 +85,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh opencv +jetson-containers build opencv ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
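The image tags listed in the `Images` rows encode both the package version and the L4T release they were built against (e.g. `dustynv/opencv:4.8.1-r36.2.0`). A small helper can split a tag apart — the function name and regex here are hypothetical, just to show the tag structure:

```python
# Split 'dustynv/opencv:4.8.1-r36.2.0' style image names into their parts
# (hypothetical helper, for illustration of the tag convention only).
import re

def parse_image_tag(image):
    repo, tag = image.rsplit(':', 1)
    # tags look like '<version>-r<l4t_release>' or just 'r<l4t_release>'
    m = re.fullmatch(r'(?:(.+)-)?r([\d.]+)', tag)
    if not m:
        return repo, tag, None
    version, l4t = m.groups()
    return repo, version, l4t

print(parse_image_tag('dustynv/opencv:4.8.1-r36.2.0'))
# -> ('dustynv/opencv', '4.8.1', '36.2.0')
```

The L4T component is what gets matched against your host's JetPack/L4T release when choosing a compatible image.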
diff --git a/packages/opencv/opencv_builder/README.md b/packages/opencv/opencv_builder/README.md index 4dcdf70a3..fcac31e93 100644 --- a/packages/opencv/opencv_builder/README.md +++ b/packages/opencv/opencv_builder/README.md @@ -8,15 +8,15 @@ | **`opencv:4.8.1-builder`** | | | :-- | :-- | -|    Requires | `L4T ==36.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | the built packages are bundled into a .tar.gz under /opt | | **`opencv:4.5.0-builder`** | | | :-- | :-- | -|    Requires | `L4T <=35` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['<=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | the built packages are bundled into a .tar.gz under /opt | @@ -42,29 +42,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag opencv_builder) +jetson-containers run $(autotag opencv_builder) # or explicitly specify one of the container images above -./run.sh dustynv/opencv_builder:r35.4.1 +jetson-containers run dustynv/opencv_builder:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/opencv_builder:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag opencv_builder) +jetson-containers run -v /path/on/host:/path/in/container $(autotag opencv_builder) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag opencv_builder) my_app --abc xyz +jetson-containers run $(autotag opencv_builder) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to [`jetson-containers run`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -72,7 +72,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh opencv_builder +jetson-containers build opencv_builder ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
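The `Dependencies` rows in these tables form a build chain (for `opencv:4.8.1-builder`: `build-essential`, `cuda`, `cudnn`, `python`, `cmake`), and each package is layered on top of its dependencies. A toy topological sort shows the order such a chain would be built in — an illustration only, with a made-up graph, not how jetson-containers actually schedules builds:

```python
# Toy build-order computation over a dependency graph like the ones in the
# package tables above (illustrative; edges here are assumed, not exact).
from graphlib import TopologicalSorter  # Python 3.9+

deps = {
    'opencv_builder':  {'build-essential', 'cuda', 'cudnn', 'python', 'cmake'},
    'cmake':           {'python'},
    'python':          {'build-essential'},
    'cudnn':           {'cuda'},
    'cuda':            {'build-essential'},
    'build-essential': set(),
}

# static_order() yields each package only after all of its dependencies
order = list(TopologicalSorter(deps).static_order())
print(order)
```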
diff --git a/packages/pytorch/README.md b/packages/pytorch/README.md index 6b98cef6d..145937007 100644 --- a/packages/pytorch/README.md +++ b/packages/pytorch/README.md @@ -3,64 +3,55 @@ > [`CONTAINERS`](#user-content-containers) [`IMAGES`](#user-content-images) [`RUN`](#user-content-run) [`BUILD`](#user-content-build) Containers for PyTorch with CUDA support. -Note that the [`l4t-pytorch`](/packages/l4t/l4t-pytorch) containers also include PyTorch, torchvision, and torchaudio. +Note that the [`l4t-pytorch`](/packages/l4t/l4t-pytorch) containers also include PyTorch, `torchvision`, and `torchaudio`.
CONTAINERS
-| **`pytorch:2.1`** | | -| :-- | :-- | -|    Aliases | `torch:2.1` | -|    Builds | [![`pytorch-21_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-21_jp60.yml?label=pytorch-21:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-21_jp60.yml) [![`pytorch-21_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-21_jp51.yml?label=pytorch-21:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-21_jp51.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/pytorch:2.1-r35.2.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-11, 5.4GB)`
[`dustynv/pytorch:2.1-r35.3.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 5.4GB)`
[`dustynv/pytorch:2.1-r35.4.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-11-05, 5.4GB)`
[`dustynv/pytorch:2.1-r36.2.0`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 7.2GB)` | - | **`pytorch:2.0`** | | | :-- | :-- | -|    Aliases | `torch:2.0` `pytorch` `torch` | +|    Aliases | `torch:2.0` | |    Builds | [![`pytorch-20_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-20_jp51.yml?label=pytorch-20:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-20_jp51.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dependants | [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) 
[`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`openai-triton`](/packages/openai-triton) [`optimum`](/packages/llm/optimum) [`raft`](/packages/rapids/raft) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Dependants | [`torchaudio:2.0.1`](/packages/pytorch/torchaudio) [`torchvision:0.15.1`](/packages/pytorch/torchvision) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | |    Images | [`dustynv/pytorch:2.0-r35.2.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-06, 5.4GB)`
[`dustynv/pytorch:2.0-r35.3.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 5.4GB)`
[`dustynv/pytorch:2.0-r35.4.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-10-07, 5.4GB)` | -| **`pytorch:1.13`** | | +| **`pytorch:2.1`** | | | :-- | :-- | -|    Aliases | `torch:1.13` | -|    Builds | [![`pytorch-113_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-113_jp51.yml?label=pytorch-113:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-113_jp51.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/pytorch:1.13-r35.2.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-08-29, 5.5GB)`
[`dustynv/pytorch:1.13-r35.3.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-12, 5.5GB)`
[`dustynv/pytorch:1.13-r35.4.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 5.5GB)` | +|    Aliases | `torch:2.1` | +|    Builds | [![`pytorch-21_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-21_jp60.yml?label=pytorch-21:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-21_jp60.yml) [![`pytorch-21_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-21_jp51.yml?label=pytorch-21:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-21_jp51.yml) | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Dependants | [`torchaudio:2.1.0`](/packages/pytorch/torchaudio) [`torchvision:0.16.2`](/packages/pytorch/torchvision) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | +|    Images | [`dustynv/pytorch:2.1-r35.2.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-11, 5.4GB)`
[`dustynv/pytorch:2.1-r35.3.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 5.4GB)`
[`dustynv/pytorch:2.1-r35.4.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-11-05, 5.4GB)`
[`dustynv/pytorch:2.1-r36.2.0`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 7.2GB)` | -| **`pytorch:1.12`** | | +| **`pytorch:2.2`** | | | :-- | :-- | -|    Aliases | `torch:1.12` | -|    Builds | [![`pytorch-112_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-112_jp51.yml?label=pytorch-112:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-112_jp51.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/pytorch:1.12-r35.2.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 5.5GB)`
[`dustynv/pytorch:1.12-r35.3.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-08-29, 5.5GB)`
[`dustynv/pytorch:1.12-r35.4.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-11-03, 5.5GB)` | +|    Aliases | `torch:2.2` `pytorch` `torch` | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`exllama:0.0.14`](/packages/llm/exllama) [`exllama:0.0.15`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`flash-attention`](/packages/llm/flash-attention) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`openai-triton`](/packages/openai-triton) [`openai-triton:builder`](/packages/openai-triton) [`optimum`](/packages/llm/optimum) [`raft`](/packages/rapids/raft) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) 
[`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio:2.2.2`](/packages/pytorch/torchaudio) [`torchvision:0.17.2`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | -| **`pytorch:1.11`** | | +| **`pytorch:2.3`** | | | :-- | :-- | -|    Aliases | `torch:1.11` | -|    Builds | [![`pytorch-111_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-111_jp51.yml?label=pytorch-111:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-111_jp51.yml) | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/pytorch:1.11-r35.2.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-11-05, 5.4GB)`
[`dustynv/pytorch:1.11-r35.3.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 5.4GB)`
[`dustynv/pytorch:1.11-r35.4.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-11, 5.4GB)` | +|    Aliases | `torch:2.3` | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Dependants | [`torchaudio:2.3.0`](/packages/pytorch/torchaudio) | +|    Dockerfile | [`Dockerfile.pip`](Dockerfile.pip) | | **`pytorch:1.10`** | | | :-- | :-- | -|    Aliases | `torch:1.10` `pytorch` `torch` | +|    Aliases | `torch:1.10` | |    Builds | [![`pytorch-110_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-110_jp46.yml?label=pytorch-110:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-110_jp46.yml) | -|    Requires | `L4T ==32.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Dependants | [`torchaudio:0.10.0`](/packages/pytorch/torchaudio) [`torchvision:0.11.1`](/packages/pytorch/torchvision) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/pytorch:1.10-r32.7.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 1.1GB)` | @@ -68,33 +59,12 @@ Note that the [`l4t-pytorch`](/packages/l4t/l4t-pytorch) containers also include | :-- | :-- | |    Aliases | `torch:1.9` | |    
Builds | [![`pytorch-19_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/pytorch-19_jp46.yml?label=pytorch-19:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/pytorch-19_jp46.yml) | -|    Requires | `L4T ==32.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) | +|    Dependants | [`torchaudio:0.9.0`](/packages/pytorch/torchaudio) [`torchvision:0.10.0`](/packages/pytorch/torchvision) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/pytorch:1.9-r32.7.1`](https://hub.docker.com/r/dustynv/pytorch/tags) `(2023-12-14, 1.0GB)` | -| **`pytorch:2.0-distributed`** | | -| :-- | :-- | -|    Aliases | `torch:2.0-distributed` | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | - -| **`pytorch:2.1-distributed`** | | -| :-- | :-- | -|    Aliases | `torch:2.1-distributed` `pytorch:distributed` | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) 
[`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`efficientvit`](/packages/vit/efficientvit) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`xformers`](/packages/llm/xformers) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | - -| **`pytorch:2.1-builder`** | | -| :-- | :-- | -|    Aliases | `torch:2.1-builder` | -|    Requires | `L4T ==36.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -
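The `Requires` rows above pin each package variant to an L4T range (L4T R32 ships with JetPack 4, R35 with JetPack 5, R36 with JetPack 6). To see which spec your device satisfies, you can read the release string from `/etc/nv_tegra_release`. A minimal sketch, with a sample line inlined so it runs anywhere (the header format is assumed from typical L4T releases; on a Jetson you would read the file itself):

```bash
# Sample /etc/nv_tegra_release header, inlined here for illustration;
# on a Jetson: release=$(head -n 1 /etc/nv_tegra_release)
release='# R35 (release), REVISION: 4.1, GCID: 33958178, BOARD: t186ref, EABI: aarch64, DATE: Tue Aug  1 19:57:35 UTC 2023'

# Pull out the major release and the revision
major=$(echo "$release" | sed -n 's/^# R\([0-9]*\) (release).*/\1/p')
revision=$(echo "$release" | sed -n 's/.*REVISION: \([0-9.]*\),.*/\1/p')

echo "L4T ${major}.${revision}"   # e.g. L4T 35.4.1, which satisfies `L4T ==35.*`
```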
@@ -131,29 +101,29 @@ Note that the [`l4t-pytorch`](/packages/l4t/l4t-pytorch) containers also include RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag pytorch)
+jetson-containers run $(autotag pytorch)

# or explicitly specify one of the container images above
-./run.sh dustynv/pytorch:2.1-r36.2.0
+jetson-containers run dustynv/pytorch:2.1-r36.2.0

# or if using 'docker run' (specify image and mounts, etc.)
sudo docker run --runtime nvidia -it --rm --network=host dustynv/pytorch:2.1-r36.2.0
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (such as `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag pytorch) +jetson-containers run -v /path/on/host:/path/in/container $(autotag pytorch) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag pytorch) my_app --abc xyz +jetson-containers run $(autotag pytorch) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
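The `autotag` note above can be made concrete with a toy example. This is not autotag's actual logic, only an illustration of the idea: filter the `r`-tags from the Images tables down to those matching the host's L4T major release, then take the newest by version sort (`pick_tag` is a hypothetical helper name):

```bash
# Toy tag selection: keep r-tags sharing the host's L4T major release,
# then pick the highest with a natural version sort.
pick_tag() {
  local host=$1; shift               # host L4T version, e.g. 35.4.1
  local major=${host%%.*}
  printf '%s\n' "$@" | grep "^r${major}\." | sort -V | tail -n 1
}

pick_tag 35.4.1 r32.7.1 r35.2.1 r35.3.1 r35.4.1   # → r35.4.1
```

The real `autotag` also handles upper bounds (specs like `==35.*`), locally built images, and falling back to a build, so treat this purely as a sketch.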
BUILD CONTAINER

@@ -161,7 +131,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:

```bash
-./build.sh pytorch
+jetson-containers build pytorch
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
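The run helper's behavior described above (add default flags, print the full command, then execute it) can be sketched as a small wrapper. `run_container` is a hypothetical name and the flag set only approximates the real defaults documented in [`docs/run.md`](/docs/run.md):

```bash
# Hypothetical sketch of a run wrapper: add default flags, append the
# user's image and options, and print the full command before running it.
run_container() {
  local cmd=(docker run --runtime nvidia -it --rm --network=host
             --volume "$PWD/data:/data")
  cmd+=("$@")
  echo "${cmd[@]}"                   # show the constructed command first
  # "${cmd[@]}"                      # execution disabled in this sketch
}

run_container dustynv/pytorch:2.1-r36.2.0
```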
diff --git a/packages/pytorch/torch2trt/README.md b/packages/pytorch/torch2trt/README.md index 55bc43b85..b6a679bfd 100644 --- a/packages/pytorch/torch2trt/README.md +++ b/packages/pytorch/torch2trt/README.md @@ -9,9 +9,9 @@ | **`torch2trt`** | | | :-- | :-- | |    Builds | [![`torch2trt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch2trt_jp51.yml?label=torch2trt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch2trt_jp51.yml) [![`torch2trt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch2trt_jp46.yml?label=torch2trt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch2trt_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) | -|    Dependants | [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`local_llm`](/packages/llm/local_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) | +|    Dependants | [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) 
[`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`xtts`](/packages/audio/xtts) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/torch2trt:r32.7.1`](https://hub.docker.com/r/dustynv/torch2trt/tags) `(2023-12-14, 1.1GB)`
[`dustynv/torch2trt:r35.2.1`](https://hub.docker.com/r/dustynv/torch2trt/tags) `(2023-12-14, 5.5GB)`
[`dustynv/torch2trt:r35.3.1`](https://hub.docker.com/r/dustynv/torch2trt/tags) `(2023-08-29, 5.5GB)`
[`dustynv/torch2trt:r35.4.1`](https://hub.docker.com/r/dustynv/torch2trt/tags) `(2023-12-05, 5.5GB)` | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag torch2trt)
+jetson-containers run $(autotag torch2trt)

# or explicitly specify one of the container images above
-./run.sh dustynv/torch2trt:r35.2.1
+jetson-containers run dustynv/torch2trt:r35.2.1

# or if using 'docker run' (specify image and mounts, etc.)
sudo docker run --runtime nvidia -it --rm --network=host dustynv/torch2trt:r35.2.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (such as `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag torch2trt) +jetson-containers run -v /path/on/host:/path/in/container $(autotag torch2trt) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag torch2trt) my_app --abc xyz +jetson-containers run $(autotag torch2trt) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:

```bash
-./build.sh torch2trt
+jetson-containers build torch2trt
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
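The Images rows above follow a `dustynv/<package>:[<version>-]r<L4T release>` tag convention (inferred from the tables, not an official API). A hypothetical one-liner to compose such a reference for your release:

```bash
# Compose an image reference following the tag convention seen above
# (illustrative helper; the naming scheme is inferred, not guaranteed).
image_for() {
  if [ -n "${3-}" ]; then
    printf 'dustynv/%s:%s-r%s\n' "$1" "$3" "$2"
  else
    printf 'dustynv/%s:r%s\n' "$1" "$2"
  fi
}

image_for torch2trt 35.4.1     # → dustynv/torch2trt:r35.4.1
image_for pytorch 36.2.0 2.1   # → dustynv/pytorch:2.1-r36.2.0
```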
diff --git a/packages/pytorch/torch_tensorrt/README.md b/packages/pytorch/torch_tensorrt/README.md index 31a3d68c6..420ba23aa 100644 --- a/packages/pytorch/torch_tensorrt/README.md +++ b/packages/pytorch/torch_tensorrt/README.md @@ -9,8 +9,8 @@ | **`torch_tensorrt`** | | | :-- | :-- | |    Builds | [![`torch_tensorrt_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch_tensorrt_jp46.yml?label=torch_tensorrt:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch_tensorrt_jp46.yml) [![`torch_tensorrt_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torch_tensorrt_jp51.yml?label=torch_tensorrt:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torch_tensorrt_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`bazel`](/packages/bazel) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) [`bazel`](/packages/build/bazel) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/torch_tensorrt:r32.7.1`](https://hub.docker.com/r/dustynv/torch_tensorrt/tags) `(2023-12-14, 1.5GB)`
[`dustynv/torch_tensorrt:r35.2.1`](https://hub.docker.com/r/dustynv/torch_tensorrt/tags) `(2023-12-11, 5.9GB)`
[`dustynv/torch_tensorrt:r35.3.1`](https://hub.docker.com/r/dustynv/torch_tensorrt/tags) `(2023-12-14, 5.9GB)`
[`dustynv/torch_tensorrt:r35.4.1`](https://hub.docker.com/r/dustynv/torch_tensorrt/tags) `(2023-10-07, 5.9GB)` | |    Notes | https://pytorch.org/TensorRT/getting_started/installation.html#installation | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag torch_tensorrt)
+jetson-containers run $(autotag torch_tensorrt)

# or explicitly specify one of the container images above
-./run.sh dustynv/torch_tensorrt:r35.3.1
+jetson-containers run dustynv/torch_tensorrt:r35.3.1

# or if using 'docker run' (specify image and mounts, etc.)
sudo docker run --runtime nvidia -it --rm --network=host dustynv/torch_tensorrt:r35.3.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (such as `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag torch_tensorrt) +jetson-containers run -v /path/on/host:/path/in/container $(autotag torch_tensorrt) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag torch_tensorrt) my_app --abc xyz +jetson-containers run $(autotag torch_tensorrt) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:

```bash
-./build.sh torch_tensorrt
+jetson-containers build torch_tensorrt
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/pytorch/torchaudio/README.md b/packages/pytorch/torchaudio/README.md index e0d09db0f..6c8fc5a2b 100644 --- a/packages/pytorch/torchaudio/README.md +++ b/packages/pytorch/torchaudio/README.md @@ -6,14 +6,43 @@ CONTAINERS
-| **`torchaudio`** | | +| **`torchaudio:2.0.1`** | | | :-- | :-- | -|    Builds | [![`torchaudio_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torchaudio_jp51.yml?label=torchaudio:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torchaudio_jp51.yml) [![`torchaudio_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torchaudio_jp46.yml?label=torchaudio:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torchaudio_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`local_llm`](/packages/llm/local_llm) [`nemo`](/packages/nemo) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.0`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchaudio:2.1.0`** | | +| :-- | :-- | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.1`](/packages/pytorch) | +|    Dockerfile | 
[`Dockerfile`](Dockerfile) | + +| **`torchaudio:2.2.2`** | | +| :-- | :-- | +|    Aliases | `torchaudio` | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nemo`](/packages/nemo) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchaudio:2.3.0`** | | +| :-- | :-- | +|    Requires | `L4T ['==36.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.3`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchaudio:0.10.0`** | | +| :-- | :-- | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:1.10`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchaudio:0.9.0`** | | +| :-- | :-- | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) 
[`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:1.9`](/packages/pytorch) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/torchaudio:r32.7.1`](https://hub.docker.com/r/dustynv/torchaudio/tags) `(2023-12-14, 1.1GB)`
[`dustynv/torchaudio:r35.2.1`](https://hub.docker.com/r/dustynv/torchaudio/tags) `(2023-12-14, 5.4GB)`
[`dustynv/torchaudio:r35.3.1`](https://hub.docker.com/r/dustynv/torchaudio/tags) `(2023-12-11, 5.5GB)`
[`dustynv/torchaudio:r35.4.1`](https://hub.docker.com/r/dustynv/torchaudio/tags) `(2023-12-12, 5.4GB)` | @@ -37,29 +66,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
+To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command:
```bash
# automatically pull or build a compatible container image
-./run.sh $(./autotag torchaudio)
+jetson-containers run $(autotag torchaudio)

# or explicitly specify one of the container images above
-./run.sh dustynv/torchaudio:r35.2.1
+jetson-containers run dustynv/torchaudio:r35.2.1

# or if using 'docker run' (specify image and mounts, etc.)
sudo docker run --runtime nvidia -it --rm --network=host dustynv/torchaudio:r35.2.1
```
-> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (such as `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag torchaudio) +jetson-containers run -v /path/on/host:/path/in/container $(autotag torchaudio) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag torchaudio) my_app --abc xyz +jetson-containers run $(autotag torchaudio) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER

@@ -67,7 +96,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:

```bash
-./build.sh torchaudio
+jetson-containers build torchaudio
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/pytorch/torchvision/README.md b/packages/pytorch/torchvision/README.md index ab1a80509..3d8e8a5c2 100644 --- a/packages/pytorch/torchvision/README.md +++ b/packages/pytorch/torchvision/README.md @@ -6,14 +6,37 @@ CONTAINERS
-| **`torchvision`** | | +| **`torchvision:0.15.1`** | | | :-- | :-- | -|    Builds | [![`torchvision_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torchvision_jp46.yml?label=torchvision:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torchvision_jp46.yml) [![`torchvision_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/torchvision_jp51.yml?label=torchvision:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/torchvision_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) 
[`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.0`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchvision:0.16.2`** | | +| :-- | :-- | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.1`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| 
**`torchvision:0.17.2`** | | +| :-- | :-- | +|    Aliases | `torchvision` | +|    Requires | `L4T ['>=35']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) | +|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq:0.2.4`](/packages/llm/auto_awq) [`auto_gptq:0.7.1`](/packages/llm/auto_gptq) [`awq:0.1.0`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`bitsandbytes:builder`](/packages/llm/bitsandbytes) [`efficientvit`](/packages/vit/efficientvit) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`optimum`](/packages/llm/optimum) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) 
[`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`whisperx`](/packages/audio/whisperx) [`xtts`](/packages/audio/xtts) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchvision:0.11.1`** | | +| :-- | :-- | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:1.10`](/packages/pytorch) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | + +| **`torchvision:0.10.0`** | | +| :-- | :-- | +|    Requires | `L4T ['==32.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:1.9`](/packages/pytorch) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/torchvision:r32.7.1`](https://hub.docker.com/r/dustynv/torchvision/tags) `(2023-12-14, 1.1GB)`
[`dustynv/torchvision:r35.2.1`](https://hub.docker.com/r/dustynv/torchvision/tags) `(2023-12-11, 5.5GB)`
[`dustynv/torchvision:r35.3.1`](https://hub.docker.com/r/dustynv/torchvision/tags) `(2023-12-14, 5.5GB)`
[`dustynv/torchvision:r35.4.1`](https://hub.docker.com/r/dustynv/torchvision/tags) `(2023-11-05, 5.4GB)` | @@ -37,29 +60,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag torchvision) +jetson-containers run $(autotag torchvision) # or explicitly specify one of the container images above -./run.sh dustynv/torchvision:r35.3.1 +jetson-containers run dustynv/torchvision:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/torchvision:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag torchvision) +jetson-containers run -v /path/on/host:/path/in/container $(autotag torchvision) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag torchvision) my_app --abc xyz +jetson-containers run $(autotag torchvision) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER


@@ -67,7 +90,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh torchvision
+jetson-containers build torchvision
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
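The `Requires` rows in the tables above (e.g. `L4T ['>=35']`) are what [`autotag`](/docs/run.md#autotag) checks against the device's L4T version when selecting a compatible image. A minimal sketch of how such a `>=` constraint can be tested with `sort -V` — illustrative only, not the actual jetson-containers logic, and the version strings here are assumed:

```bash
# Hypothetical check (not the real autotag code): does the local L4T release
# satisfy a ">=35" style requirement from a package's Requires row?
L4T_VERSION="35.3.1"   # assumed local release
REQUIRED="35"

# sort -V orders version strings numerically; if REQUIRED sorts first
# (or ties), then L4T_VERSION >= REQUIRED holds
lowest=$(printf '%s\n' "$L4T_VERSION" "$REQUIRED" | sort -V | head -n1)
if [ "$lowest" = "$REQUIRED" ]; then
    echo "compatible"
else
    echo "incompatible"
fi
```

The `==35.*` and `<34` forms seen elsewhere in these tables would each need their own handling.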
diff --git a/packages/rapids/cudf/README.md b/packages/rapids/cudf/README.md index 9d57e266e..25fb79c09 100644 --- a/packages/rapids/cudf/README.md +++ b/packages/rapids/cudf/README.md @@ -10,8 +10,8 @@ | :-- | :-- | |    Aliases | `cudf` | |    Builds | [![`cudf-231003_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cudf-231003_jp60.yml?label=cudf-231003:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cudf-231003_jp60.yml) | -|    Requires | `L4T >=36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`cupy`](/packages/cuda/cupy) [`numba`](/packages/numba) [`protobuf:apt`](/packages/protobuf/protobuf_apt) | +|    Requires | `L4T ['>=36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`cupy`](/packages/cuda/cupy) [`numba`](/packages/numba) [`protobuf:apt`](/packages/build/protobuf/protobuf_apt) | |    Dependants | [`cuml`](/packages/rapids/cuml) | |    Dockerfile | [`Dockerfile.jp5`](Dockerfile.jp5) | |    Images | [`dustynv/cudf:23.10.03-r36.2.0`](https://hub.docker.com/r/dustynv/cudf/tags) `(2023-12-06, 5.2GB)` | @@ -20,8 +20,8 @@ | **`cudf:21.10.02`** | | | :-- | :-- | |    Aliases | `cudf` | -|    Requires | `L4T ==35.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`cupy`](/packages/cuda/cupy) [`numba`](/packages/numba) [`protobuf:apt`](/packages/protobuf/protobuf_apt) | +|    Requires | `L4T ['==35.*']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) 
[`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`cupy`](/packages/cuda/cupy) [`numba`](/packages/numba) [`protobuf:apt`](/packages/build/protobuf/protobuf_apt) | |    Dockerfile | [`Dockerfile.jp5`](Dockerfile.jp5) | |    Notes | installed under `/usr/local` | @@ -47,29 +47,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag cudf) +jetson-containers run $(autotag cudf) # or explicitly specify one of the container images above -./run.sh dustynv/cudf:23.10.03-r36.2.0 +jetson-containers run dustynv/cudf:23.10.03-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/cudf:23.10.03-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cudf) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cudf) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cudf) my_app --abc xyz +jetson-containers run $(autotag cudf) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER


@@ -77,7 +77,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh cudf
+jetson-containers build cudf
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
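The `Requires | L4T ['==35.*']` row for `cudf:21.10.02` above is a wildcard constraint. As a hedged sketch (again, not the real jetson-containers implementation), it reduces to a glob match on the release string:

```bash
# Hypothetical "==35.*" check -- any 35.x release qualifies; version assumed
L4T_VERSION="35.2.1"

case "$L4T_VERSION" in
    35.*) result="match" ;;
    *)    result="no match" ;;
esac
echo "$result"
```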
diff --git a/packages/rapids/cuml/README.md b/packages/rapids/cuml/README.md index 69708f3d4..a7548a40b 100644 --- a/packages/rapids/cuml/README.md +++ b/packages/rapids/cuml/README.md @@ -9,8 +9,8 @@ | **`cuml`** | | | :-- | :-- | |    Builds | [![`cuml_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/cuml_jp51.yml?label=cuml:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/cuml_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) [`numpy`](/packages/numpy) [`cupy`](/packages/cuda/cupy) [`numba`](/packages/numba) [`protobuf:apt`](/packages/protobuf/protobuf_apt) [`cudf`](/packages/rapids/cudf) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) [`numpy`](/packages/numpy) [`cupy`](/packages/cuda/cupy) [`numba`](/packages/numba) [`protobuf:apt`](/packages/build/protobuf/protobuf_apt) [`cudf`](/packages/rapids/cudf) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/cuml:r35.2.1`](https://hub.docker.com/r/dustynv/cuml/tags) `(2023-09-07, 7.9GB)`
[`dustynv/cuml:r35.3.1`](https://hub.docker.com/r/dustynv/cuml/tags) `(2023-08-29, 8.0GB)`
[`dustynv/cuml:r35.4.1`](https://hub.docker.com/r/dustynv/cuml/tags) `(2023-10-07, 7.9GB)` | |    Notes | installed under `/usr/local` | @@ -36,29 +36,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag cuml) +jetson-containers run $(autotag cuml) # or explicitly specify one of the container images above -./run.sh dustynv/cuml:r35.4.1 +jetson-containers run dustynv/cuml:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/cuml:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag cuml) +jetson-containers run -v /path/on/host:/path/in/container $(autotag cuml) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag cuml) my_app --abc xyz +jetson-containers run $(autotag cuml) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER


@@ -66,7 +66,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh cuml
+jetson-containers build cuml
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
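The notes above say `jetson-containers run` forwards to `docker run` with some defaults added and prints the full command before executing it. A rough sketch of that composition — the flags come from the snippets above, while the cache path is an assumption, not the tool's actual location:

```bash
# Sketch of layering the documented defaults onto docker run; DATA is an
# illustrative host path for the /data cache mount
IMAGE="dustynv/cuml:r35.4.1"
DATA="/tmp/jetson-data"

CMD="docker run --runtime nvidia -it --rm --network=host --volume $DATA:/data $IMAGE"
echo "$CMD"   # printed before it would be executed
```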
diff --git a/packages/rapids/raft/README.md b/packages/rapids/raft/README.md index c4475a291..f1b02b582 100644 --- a/packages/rapids/raft/README.md +++ b/packages/rapids/raft/README.md @@ -8,8 +8,8 @@ | **`raft`** | | | :-- | :-- | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`numba`](/packages/numba) [`cuda-python`](/packages/cuda/cuda-python) [`cupy`](/packages/cuda/cupy) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`numba`](/packages/numba) [`cuda-python`](/packages/cuda/cuda-python) [`cupy`](/packages/cuda/cupy) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -18,27 +18,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag raft) +jetson-containers run $(autotag raft) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host raft:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag raft) +jetson-containers run -v /path/on/host:/path/in/container $(autotag raft) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag raft) my_app --abc xyz +jetson-containers run $(autotag raft) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER


@@ -46,7 +46,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh raft
+jetson-containers build raft
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/realsense/README.md b/packages/realsense/README.md index 8613972b7..71be32a80 100644 --- a/packages/realsense/README.md +++ b/packages/realsense/README.md @@ -9,8 +9,8 @@ | **`realsense`** | | | :-- | :-- | |    Builds | [![`realsense_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/realsense_jp60.yml?label=realsense:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/realsense_jp60.yml) [![`realsense_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/realsense_jp51.yml?label=realsense:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/realsense_jp51.yml) [![`realsense_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/realsense_jp46.yml?label=realsense:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/realsense_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/realsense:r32.7.1`](https://hub.docker.com/r/dustynv/realsense/tags) `(2024-02-22, 0.8GB)`
[`dustynv/realsense:r35.2.1`](https://hub.docker.com/r/dustynv/realsense/tags) `(2023-08-29, 5.5GB)`
[`dustynv/realsense:r35.3.1`](https://hub.docker.com/r/dustynv/realsense/tags) `(2024-02-22, 5.4GB)`
[`dustynv/realsense:r35.4.1`](https://hub.docker.com/r/dustynv/realsense/tags) `(2023-10-07, 5.5GB)`
[`dustynv/realsense:r36.2.0`](https://hub.docker.com/r/dustynv/realsense/tags) `(2024-02-22, 4.0GB)` | |    Notes | https://github.com/IntelRealSense/librealsense/blob/master/doc/installation_jetson.md | @@ -38,29 +38,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag realsense) +jetson-containers run $(autotag realsense) # or explicitly specify one of the container images above -./run.sh dustynv/realsense:r36.2.0 +jetson-containers run dustynv/realsense:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/realsense:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, a mounted `/data` cache, and device detection)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag realsense) +jetson-containers run -v /path/on/host:/path/in/container $(autotag realsense) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag realsense) my_app --abc xyz +jetson-containers run $(autotag realsense) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER


@@ -68,7 +68,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker 

If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh realsense
+jetson-containers build realsense
```

-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
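The image tags listed above encode the L4T release they were built for (e.g. `dustynv/realsense:r36.2.0`), which is what a compatibility check can key off. A small, assumed helper for pulling that version back out of a tag in the `:rXX` form:

```bash
# Hypothetical tag parsing: strip everything up to ":r" to recover the
# L4T release an image targets (works for tags like dustynv/realsense:r36.2.0)
TAG="dustynv/realsense:r36.2.0"
L4T="${TAG##*:r}"
echo "$L4T"
```

Tags in the `-l4t-rXX` form (like the ROS ones below) would need `${TAG##*-l4t-r}` instead.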
diff --git a/packages/ros/README.md b/packages/ros/README.md index 5f3b43028..c812901be 100644 --- a/packages/ros/README.md +++ b/packages/ros/README.md @@ -18,8 +18,8 @@ Supported ROS packages: `ros_base` `ros_core` `desktop` | **`ros:melodic-ros-base`** | | | :-- | :-- | |    Builds | [![`ros-melodic-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-melodic-ros-base_jp46.yml?label=ros-melodic-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-melodic-ros-base_jp46.yml) | -|    Requires | `L4T <34` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cmake:apt`](/packages/cmake/cmake_apt) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['<34']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cmake:apt`](/packages/build/cmake/cmake_apt) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros.melodic`](Dockerfile.ros.melodic) | |    Images | [`dustynv/ros:melodic-ros-base-l4t-r32.4.4`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-08-06, 0.5GB)`
[`dustynv/ros:melodic-ros-base-l4t-r32.5.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-09-23, 0.5GB)`
[`dustynv/ros:melodic-ros-base-l4t-r32.6.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-03-02, 0.5GB)`
[`dustynv/ros:melodic-ros-base-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.5GB)` | |    Notes | ROS Melodic is for JetPack 4 only | @@ -27,8 +27,8 @@ Supported ROS packages: `ros_base` `ros_core` `desktop` | **`ros:melodic-ros-core`** | | | :-- | :-- | |    Builds | [![`ros-melodic-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-melodic-ros-core_jp46.yml?label=ros-melodic-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-melodic-ros-core_jp46.yml) | -|    Requires | `L4T <34` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cmake:apt`](/packages/cmake/cmake_apt) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['<34']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cmake:apt`](/packages/build/cmake/cmake_apt) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros.melodic`](Dockerfile.ros.melodic) | |    Images | [`dustynv/ros:melodic-ros-core-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.5GB)` | |    Notes | ROS Melodic is for JetPack 4 only | @@ -36,8 +36,8 @@ Supported ROS packages: `ros_base` `ros_core` `desktop` | **`ros:melodic-desktop`** | | | :-- | :-- | |    Builds | [![`ros-melodic-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-melodic-desktop_jp46.yml?label=ros-melodic-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-melodic-desktop_jp46.yml) | -|    Requires | `L4T <34` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cmake:apt`](/packages/cmake/cmake_apt) 
[`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['<34']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cmake:apt`](/packages/build/cmake/cmake_apt) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros.melodic`](Dockerfile.ros.melodic) | |    Images | [`dustynv/ros:melodic-desktop-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.7GB)` | |    Notes | ROS Melodic is for JetPack 4 only | @@ -45,120 +45,120 @@ Supported ROS packages: `ros_base` `ros_core` `desktop` | **`ros:noetic-ros-base`** | | | :-- | :-- | |    Builds | [![`ros-noetic-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-base_jp46.yml?label=ros-noetic-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-base_jp46.yml) [![`ros-noetic-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-base_jp51.yml?label=ros-noetic-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-base_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) 
[`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros.noetic`](Dockerfile.ros.noetic) | |    Images | [`dustynv/ros:noetic-ros-base-l4t-r32.4.4`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-08-06, 0.5GB)`
[`dustynv/ros:noetic-ros-base-l4t-r32.5.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-09-23, 0.5GB)`
[`dustynv/ros:noetic-ros-base-l4t-r32.6.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-03-02, 0.5GB)`
[`dustynv/ros:noetic-ros-base-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.6GB)`
[`dustynv/ros:noetic-ros-base-l4t-r34.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-04-18, 5.6GB)`
[`dustynv/ros:noetic-ros-base-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 5.6GB)`
[`dustynv/ros:noetic-ros-base-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 5.6GB)`
[`dustynv/ros:noetic-ros-base-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.2GB)`
[`dustynv/ros:noetic-ros-base-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.2GB)`
[`dustynv/ros:noetic-ros-base-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.2GB)` | | **`ros:noetic-ros-core`** | | | :-- | :-- | |    Builds | [![`ros-noetic-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-core_jp51.yml?label=ros-noetic-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-core_jp51.yml) [![`ros-noetic-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-ros-core_jp46.yml?label=ros-noetic-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-ros-core_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros.noetic`](Dockerfile.ros.noetic) | |    Images | [`dustynv/ros:noetic-ros-core-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.6GB)`
[`dustynv/ros:noetic-ros-core-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.2GB)`
[`dustynv/ros:noetic-ros-core-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.2GB)`
[`dustynv/ros:noetic-ros-core-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-05, 5.2GB)` | | **`ros:noetic-desktop`** | | | :-- | :-- | |    Builds | [![`ros-noetic-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-desktop_jp51.yml?label=ros-noetic-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-desktop_jp51.yml) [![`ros-noetic-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-noetic-desktop_jp46.yml?label=ros-noetic-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-noetic-desktop_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros.noetic`](Dockerfile.ros.noetic) | |    Images | [`dustynv/ros:noetic-desktop-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.6GB)`
[`dustynv/ros:noetic-desktop-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 5.2GB)`
[`dustynv/ros:noetic-desktop-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-08-29, 5.2GB)`
[`dustynv/ros:noetic-desktop-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.2GB)` | | **`ros:foxy-ros-base`** | | | :-- | :-- | |    Builds | [![`ros-foxy-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-foxy-ros-base_jp51.yml?label=ros-foxy-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-foxy-ros-base_jp51.yml) [![`ros-foxy-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-foxy-ros-base_jp46.yml?label=ros-foxy-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-foxy-ros-base_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:foxy-ros-base-l4t-r32.4.4`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-08-06, 1.1GB)`
[`dustynv/ros:foxy-ros-base-l4t-r32.5.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-09-23, 1.1GB)`
[`dustynv/ros:foxy-ros-base-l4t-r32.6.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-03-02, 1.1GB)`
[`dustynv/ros:foxy-ros-base-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.8GB)`
[`dustynv/ros:foxy-ros-base-l4t-r34.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-04-18, 5.9GB)`
[`dustynv/ros:foxy-ros-base-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 5.9GB)`
[`dustynv/ros:foxy-ros-base-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 5.9GB)`
[`dustynv/ros:foxy-ros-base-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-09-07, 5.3GB)`
[`dustynv/ros:foxy-ros-base-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.4GB)`
[`dustynv/ros:foxy-ros-base-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.3GB)` | | **`ros:foxy-ros-core`** | | | :-- | :-- | |    Builds | [![`ros-foxy-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-foxy-ros-core_jp51.yml?label=ros-foxy-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-foxy-ros-core_jp51.yml) [![`ros-foxy-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-foxy-ros-core_jp46.yml?label=ros-foxy-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-foxy-ros-core_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:foxy-ros-core-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.8GB)`
[`dustynv/ros:foxy-ros-core-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-08-29, 5.3GB)`
[`dustynv/ros:foxy-ros-core-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.3GB)`
[`dustynv/ros:foxy-ros-core-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.3GB)` | | **`ros:foxy-desktop`** | | | :-- | :-- | |    Builds | [![`ros-foxy-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-foxy-desktop_jp51.yml?label=ros-foxy-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-foxy-desktop_jp51.yml) [![`ros-foxy-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-foxy-desktop_jp46.yml?label=ros-foxy-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-foxy-desktop_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:foxy-desktop-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 1.1GB)`
[`dustynv/ros:foxy-desktop-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 6.5GB)`
[`dustynv/ros:foxy-desktop-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 6.4GB)`
[`dustynv/ros:foxy-desktop-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 5.9GB)`
[`dustynv/ros:foxy-desktop-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.9GB)`
[`dustynv/ros:foxy-desktop-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-05, 5.9GB)` | | **`ros:galactic-ros-base`** | | | :-- | :-- | |    Builds | [![`ros-galactic-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-galactic-ros-base_jp46.yml?label=ros-galactic-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-galactic-ros-base_jp46.yml) [![`ros-galactic-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-galactic-ros-base_jp51.yml?label=ros-galactic-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-galactic-ros-base_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:galactic-ros-base-l4t-r32.4.4`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-08-06, 0.8GB)`
[`dustynv/ros:galactic-ros-base-l4t-r32.5.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2021-09-23, 0.8GB)`
[`dustynv/ros:galactic-ros-base-l4t-r32.6.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-03-02, 0.8GB)`
[`dustynv/ros:galactic-ros-base-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.6GB)`
[`dustynv/ros:galactic-ros-base-l4t-r34.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-04-18, 5.6GB)`
[`dustynv/ros:galactic-ros-base-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 5.6GB)`
[`dustynv/ros:galactic-ros-base-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 5.6GB)`
[`dustynv/ros:galactic-ros-base-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-08-29, 5.2GB)`
[`dustynv/ros:galactic-ros-base-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-05, 5.2GB)`
[`dustynv/ros:galactic-ros-base-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.2GB)` | | **`ros:galactic-ros-core`** | | | :-- | :-- | |    Builds | [![`ros-galactic-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-galactic-ros-core_jp51.yml?label=ros-galactic-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-galactic-ros-core_jp51.yml) [![`ros-galactic-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-galactic-ros-core_jp46.yml?label=ros-galactic-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-galactic-ros-core_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:galactic-ros-core-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.6GB)`
[`dustynv/ros:galactic-ros-core-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-08-29, 5.1GB)`
[`dustynv/ros:galactic-ros-core-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 5.2GB)`
[`dustynv/ros:galactic-ros-core-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.2GB)` | | **`ros:galactic-desktop`** | | | :-- | :-- | |    Builds | [![`ros-galactic-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-galactic-desktop_jp46.yml?label=ros-galactic-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-galactic-desktop_jp46.yml) [![`ros-galactic-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-galactic-desktop_jp51.yml?label=ros-galactic-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-galactic-desktop_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:galactic-desktop-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 1.0GB)`
[`dustynv/ros:galactic-desktop-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 6.2GB)`
[`dustynv/ros:galactic-desktop-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 6.1GB)`
[`dustynv/ros:galactic-desktop-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-08-29, 5.7GB)`
[`dustynv/ros:galactic-desktop-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.8GB)`
[`dustynv/ros:galactic-desktop-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.8GB)` | | **`ros:humble-ros-base`** | | | :-- | :-- | |    Builds | [![`ros-humble-ros-base_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-ros-base_jp60.yml?label=ros-humble-ros-base:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-ros-base_jp60.yml) [![`ros-humble-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-ros-base_jp46.yml?label=ros-humble-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-ros-base_jp46.yml) [![`ros-humble-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-ros-base_jp51.yml?label=ros-humble-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-ros-base_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:humble-ros-base-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.7GB)`
[`dustynv/ros:humble-ros-base-l4t-r34.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-05-26, 5.6GB)`
[`dustynv/ros:humble-ros-base-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 5.6GB)`
[`dustynv/ros:humble-ros-base-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 5.6GB)`
[`dustynv/ros:humble-ros-base-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-05, 5.2GB)`
[`dustynv/ros:humble-ros-base-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.2GB)`
[`dustynv/ros:humble-ros-base-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.2GB)`
[`dustynv/ros:humble-ros-base-l4t-r36.2.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 6.9GB)` | | **`ros:humble-ros-core`** | | | :-- | :-- | |    Builds | [![`ros-humble-ros-core_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-ros-core_jp60.yml?label=ros-humble-ros-core:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-ros-core_jp60.yml) [![`ros-humble-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-ros-core_jp46.yml?label=ros-humble-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-ros-core_jp46.yml) [![`ros-humble-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-ros-core_jp51.yml?label=ros-humble-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-ros-core_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:humble-ros-core-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.7GB)`
[`dustynv/ros:humble-ros-core-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.2GB)`
[`dustynv/ros:humble-ros-core-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.2GB)`
[`dustynv/ros:humble-ros-core-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-05, 5.2GB)`
[`dustynv/ros:humble-ros-core-l4t-r36.2.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 6.9GB)` | | **`ros:humble-desktop`** | | | :-- | :-- | |    Builds | [![`ros-humble-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-desktop_jp46.yml?label=ros-humble-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-desktop_jp46.yml) [![`ros-humble-desktop_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-desktop_jp60.yml?label=ros-humble-desktop:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-desktop_jp60.yml) [![`ros-humble-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-humble-desktop_jp51.yml?label=ros-humble-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-humble-desktop_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:humble-desktop-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 1.0GB)`
[`dustynv/ros:humble-desktop-l4t-r34.1.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2022-09-23, 6.2GB)`
[`dustynv/ros:humble-desktop-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-04-29, 6.2GB)`
[`dustynv/ros:humble-desktop-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-09-07, 5.8GB)`
[`dustynv/ros:humble-desktop-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.8GB)`
[`dustynv/ros:humble-desktop-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.8GB)`
[`dustynv/ros:humble-desktop-l4t-r36.2.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 7.6GB)`
[`dustynv/ros:humble-desktop-pytorch-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-11-14, 6.1GB)` | | **`ros:iron-ros-base`** | | | :-- | :-- | |    Builds | [![`ros-iron-ros-base_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-base_jp51.yml?label=ros-iron-ros-base:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-base_jp51.yml) [![`ros-iron-ros-base_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-base_jp46.yml?label=ros-iron-ros-base:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-base_jp46.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:iron-ros-base-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.7GB)`
[`dustynv/ros:iron-ros-base-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-05-26, 5.6GB)`
[`dustynv/ros:iron-ros-base-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-09-07, 5.2GB)`
[`dustynv/ros:iron-ros-base-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-05, 5.2GB)`
[`dustynv/ros:iron-ros-base-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 5.2GB)` | | **`ros:iron-ros-core`** | | | :-- | :-- | |    Builds | [![`ros-iron-ros-core_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-core_jp46.yml?label=ros-iron-ros-core:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-core_jp46.yml) [![`ros-iron-ros-core_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-ros-core_jp51.yml?label=ros-iron-ros-core:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-ros-core_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:iron-ros-core-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-06, 0.7GB)`
[`dustynv/ros:iron-ros-core-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-08-29, 5.2GB)`
[`dustynv/ros:iron-ros-core-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.2GB)`
[`dustynv/ros:iron-ros-core-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-07, 5.2GB)` | | **`ros:iron-desktop`** | | | :-- | :-- | |    Builds | [![`ros-iron-desktop_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-desktop_jp46.yml?label=ros-iron-desktop:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-desktop_jp46.yml) [![`ros-iron-desktop_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/ros-iron-desktop_jp51.yml?label=ros-iron-desktop:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/ros-iron-desktop_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`opencv`](/packages/opencv) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile.ros2`](Dockerfile.ros2) | |    Images | [`dustynv/ros:iron-desktop-l4t-r32.7.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 1.0GB)`
[`dustynv/ros:iron-desktop-l4t-r35.1.0`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-05-26, 6.2GB)`
[`dustynv/ros:iron-desktop-l4t-r35.2.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-09-07, 5.8GB)`
[`dustynv/ros:iron-desktop-l4t-r35.3.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-10-24, 5.8GB)`
[`dustynv/ros:iron-desktop-l4t-r35.4.1`](https://hub.docker.com/r/dustynv/ros/tags) `(2023-12-07, 5.8GB)` | @@ -323,29 +323,29 @@ Supported ROS packages: `ros_base` `ros_core` `desktop`
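Every image tag in the tables above follows the same convention, `<repo>:<distro>-<package>-l4t-r<version>`, so a tag can be decomposed with plain shell parameter expansion. A minimal sketch (the tag is just one example picked from the tables):

```bash
# Sketch: decompose one of the image tags listed above into its parts.
tag="dustynv/ros:foxy-ros-base-l4t-r32.7.1"

repo="${tag%%:*}"          # registry/repository -> dustynv/ros
rest="${tag#*:}"           # foxy-ros-base-l4t-r32.7.1
l4t="${rest##*-l4t-r}"     # L4T version the image was built for -> 32.7.1
flavor="${rest%-l4t-r*}"   # ROS distro + package -> foxy-ros-base
distro="${flavor%%-*}"     # -> foxy

echo "$repo $distro $flavor $l4t"
```

The trailing `l4t-rXX.Y.Z` component is the part that matters for compatibility with your JetPack/L4T version.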
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag ros) +jetson-containers run $(autotag ros) # or explicitly specify one of the container images above -./run.sh dustynv/ros:humble-ros-base-l4t-r36.2.0 +jetson-containers run dustynv/ros:humble-ros-base-l4t-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/ros:humble-ros-base-l4t-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`), mounts a `/data` cache, and detects devices<br/>
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag ros) +jetson-containers run -v /path/on/host:/path/in/container $(autotag ros) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag ros) my_app --abc xyz +jetson-containers run $(autotag ros) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
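The version matching that `autotag` performs can be approximated in plain shell: each tag embeds the L4T version it was built for, and the newest image whose version does not exceed the host's is selected. This is only a sketch of the comparison step — the real `autotag` also checks local images and registries, can fall back to building, and applies extra rules (such as JetPack/L4T major-version compatibility). `HOST_L4T` and the tag list are hardcoded here for illustration:

```bash
# Sketch: keep the newest listed image whose embedded L4T version
# does not exceed the host's L4T version.
HOST_L4T="35.4.1"

# a few of the humble-ros-base tags from above, in ascending L4T order
tags="dustynv/ros:humble-ros-base-l4t-r32.7.1
dustynv/ros:humble-ros-base-l4t-r35.2.1
dustynv/ros:humble-ros-base-l4t-r35.4.1
dustynv/ros:humble-ros-base-l4t-r36.2.0"

best=""
for tag in $tags; do
    ver="${tag##*l4t-r}"
    # version-aware "ver <= HOST_L4T" using sort -V
    if [ "$(printf '%s\n' "$ver" "$HOST_L4T" | sort -V | tail -n1)" = "$HOST_L4T" ]; then
        best="$tag"
    fi
done
echo "selected: $best"
```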
BUILD CONTAINER @@ -353,7 +353,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh ros +jetson-containers build ros ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during. Run it with [`--help`](/jetson_containers/build.py) for build options.
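The name handed to the build step combines a ROS distro with one of the supported packages (`ros_base` `ros_core` `desktop`). A small sketch of that naming — pure string handling mirroring the container names in the tables above, not a real CLI:

```bash
# Sketch: compose a buildable container name from a distro + package pair,
# rejecting combinations outside the lists documented above.
distro="humble"
package="ros_base"

case " noetic foxy galactic humble iron " in
    *" $distro "*) ;;
    *) echo "unsupported distro: $distro" >&2; exit 1 ;;
esac
case " ros_base ros_core desktop " in
    *" $package "*) ;;
    *) echo "unsupported package: $package" >&2; exit 1 ;;
esac

# container names use '-' where the package list uses '_'
name="ros:${distro}-$(printf '%s' "$package" | tr '_' '-')"
echo "$name"
```

The resulting name (e.g. `ros:humble-ros-base`) is the form the tables above use and what you would pass to the build command.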
diff --git a/packages/smart-home/homeassistant-base/README.md b/packages/smart-home/homeassistant-base/README.md index 043ca2f84..e693f7b06 100644 --- a/packages/smart-home/homeassistant-base/README.md +++ b/packages/smart-home/homeassistant-base/README.md @@ -9,7 +9,7 @@ | **`homeassistant-base`** | | | :-- | :-- | |    Requires | `L4T ['>=34.1.0']` | -|    Dependencies | [`build-essential`](/packages/build-essential) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) | |    Dependants | [`homeassistant-core:2024.4.2`](/packages/smart-home/homeassistant-core) [`homeassistant-core:latest`](/packages/smart-home/homeassistant-core) [`wyoming-openwakeword:latest`](/packages/smart-home/wyoming/openwakeword) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | The `homeassistant` base ubuntu image with pre-installed dependencies based on `https://github.com/home-assistant/docker-base/blob/master/ubuntu/Dockerfile` | @@ -20,27 +20,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag homeassistant-base) +jetson-containers run $(autotag homeassistant-base) # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host homeassistant-base:36.2.0 +sudo docker run --runtime nvidia -it --rm --network=host homeassistant-base:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag homeassistant-base) +jetson-containers run -v /path/on/host:/path/in/container $(autotag homeassistant-base) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag homeassistant-base) my_app --abc xyz +jetson-containers run $(autotag homeassistant-base) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER
@@ -48,7 +48,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh homeassistant-base
+jetson-containers build homeassistant-base
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
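The `jetson-containers run` wrapper that these READMEs now reference adds defaults before delegating to `docker run`. A minimal sketch of that expansion, assuming only the defaults the note above lists (`--runtime nvidia`, a `/data` cache mount, device/network access) — the wrapper internals and the image tag here are illustrative, and the command is echoed rather than executed:

```shell
# Sketch (not the actual implementation) of the defaults the note above
# describes `jetson-containers run` adding before it hands off to `docker run`.
# The image tag is illustrative; we print the constructed command, the way
# the wrapper does before executing it.
IMAGE="homeassistant-base:r35.2.1"
DEFAULTS="--runtime nvidia -it --rm --network=host --volume /data:/data"

echo "docker run $DEFAULTS $IMAGE"
# → docker run --runtime nvidia -it --rm --network=host --volume /data:/data homeassistant-base:r35.2.1
```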
diff --git a/packages/smart-home/homeassistant-core/README.md b/packages/smart-home/homeassistant-core/README.md index 78d663ae5..354aa6e7b 100644 --- a/packages/smart-home/homeassistant-core/README.md +++ b/packages/smart-home/homeassistant-core/README.md @@ -75,10 +75,18 @@ In order to provide `HA` with access to the host's `Bluetooth` device, Home Assi | :-- | :-- | |    Aliases | `homeassistant-core` | |    Requires | `L4T ['>=34.1.0']` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`homeassistant-base`](/packages/smart-home/homeassistant-base) [`ffmpeg`](/packages/ffmpeg) [`python:3.12`](/packages/python) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`homeassistant-base`](/packages/smart-home/homeassistant-base) [`ffmpeg`](/packages/ffmpeg) [`python:3.12`](/packages/build/python) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | The `homeassistant-core` wheel that's build is saved in `/usr/src/homeassistant` | +| **`homeassistant-core:2024.4.2`** | | +| :-- | :-- | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`homeassistant-base`](/packages/smart-home/homeassistant-base) [`ffmpeg`](/packages/ffmpeg) [`python:3.12`](/packages/build/python) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Images | [`dustynv/homeassistant-core:2024.4.2-r35.4.1`](https://hub.docker.com/r/dustynv/homeassistant-core/tags) `(2024-04-09, 6.0GB)`
[`dustynv/homeassistant-core:2024.4.2-r36.2.0`](https://hub.docker.com/r/dustynv/homeassistant-core/tags) `(2024-04-09, 1.4GB)` |
+|    Notes | The `homeassistant-core` wheel that's built is saved in `/usr/src/homeassistant` |
+
@@ -101,29 +109,29 @@ In order to provide `HA` with access to the host's `Bluetooth` device, Home Assi RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag homeassistant-core) +jetson-containers run $(autotag homeassistant-core) # or explicitly specify one of the container images above -./run.sh dustynv/homeassistant-core:r35.4.1 +jetson-containers run dustynv/homeassistant-core:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/homeassistant-core:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag homeassistant-core) +jetson-containers run -v /path/on/host:/path/in/container $(autotag homeassistant-core) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag homeassistant-core) my_app --abc xyz +jetson-containers run $(autotag homeassistant-core) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER
@@ -131,7 +139,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh homeassistant-core
+jetson-containers build homeassistant-core
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/smart-home/wyoming/openwakeword/README.md b/packages/smart-home/wyoming/openwakeword/README.md index 137e0fb1c..71deadc7b 100644 --- a/packages/smart-home/wyoming/openwakeword/README.md +++ b/packages/smart-home/wyoming/openwakeword/README.md @@ -104,7 +104,7 @@ Read more how to configure `wyoming-openwakeword` in the [official documentation | :-- | :-- | |    Aliases | `wyoming-openwakeword` | |    Requires | `L4T ['>=34.1.0']` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`homeassistant-base`](/packages/smart-home/homeassistant-base) [`python:3.11`](/packages/python) | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`homeassistant-base`](/packages/smart-home/homeassistant-base) [`python:3.11`](/packages/build/python) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Notes | The `openWakeWord` using the `wyoming` protocol for usage with Home Assistant. Based on `https://github.com/home-assistant/addons/blob/master/openwakeword/Dockerfile` | @@ -114,27 +114,27 @@ Read more how to configure `wyoming-openwakeword` in the [official documentation RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag openwakeword) +jetson-containers run $(autotag openwakeword) # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host openwakeword:36.2.0 +sudo docker run --runtime nvidia -it --rm --network=host openwakeword:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag openwakeword) +jetson-containers run -v /path/on/host:/path/in/container $(autotag openwakeword) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag openwakeword) my_app --abc xyz +jetson-containers run $(autotag openwakeword) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER
@@ -142,7 +142,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh openwakeword
+jetson-containers build openwakeword
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/tensorflow/README.md b/packages/tensorflow/README.md index 0d2ff3114..f54af7808 100644 --- a/packages/tensorflow/README.md +++ b/packages/tensorflow/README.md @@ -14,8 +14,8 @@ The TensorFlow wheels used in these are from https://docs.nvidia.com/deeplearnin | **`tensorflow`** | | | :-- | :-- | |    Builds | [![`tensorflow_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow_jp46.yml?label=tensorflow:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow_jp46.yml) [![`tensorflow_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow_jp51.yml?label=tensorflow:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) | |    Dependants | [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/tensorflow:r32.7.1`](https://hub.docker.com/r/dustynv/tensorflow/tags) `(2023-12-06, 0.8GB)`
[`dustynv/tensorflow:r35.2.1`](https://hub.docker.com/r/dustynv/tensorflow/tags) `(2023-12-05, 5.5GB)`
[`dustynv/tensorflow:r35.3.1`](https://hub.docker.com/r/dustynv/tensorflow/tags) `(2023-08-29, 5.5GB)`
[`dustynv/tensorflow:r35.4.1`](https://hub.docker.com/r/dustynv/tensorflow/tags) `(2023-12-06, 5.5GB)` | @@ -23,8 +23,8 @@ The TensorFlow wheels used in these are from https://docs.nvidia.com/deeplearnin | **`tensorflow2`** | | | :-- | :-- | |    Builds | [![`tensorflow2_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow2_jp46.yml?label=tensorflow2:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow2_jp46.yml) [![`tensorflow2_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow2_jp60.yml?label=tensorflow2:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow2_jp60.yml) [![`tensorflow2_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorflow2_jp51.yml?label=tensorflow2:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorflow2_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/protobuf/protobuf_cpp) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`protobuf:cpp`](/packages/build/protobuf/protobuf_cpp) | |    Dependants | [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/tensorflow2:r32.7.1`](https://hub.docker.com/r/dustynv/tensorflow2/tags) `(2023-12-06, 0.9GB)`
[`dustynv/tensorflow2:r35.2.1`](https://hub.docker.com/r/dustynv/tensorflow2/tags) `(2023-12-06, 5.6GB)`
[`dustynv/tensorflow2:r35.3.1`](https://hub.docker.com/r/dustynv/tensorflow2/tags) `(2023-12-05, 5.6GB)`
[`dustynv/tensorflow2:r35.4.1`](https://hub.docker.com/r/dustynv/tensorflow2/tags) `(2023-10-07, 5.6GB)`
[`dustynv/tensorflow2:r36.2.0`](https://hub.docker.com/r/dustynv/tensorflow2/tags) `(2023-12-05, 7.2GB)` | @@ -51,29 +51,29 @@ The TensorFlow wheels used in these are from https://docs.nvidia.com/deeplearnin
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag tensorflow) +jetson-containers run $(autotag tensorflow) # or explicitly specify one of the container images above -./run.sh dustynv/tensorflow:r32.7.1 +jetson-containers run dustynv/tensorflow:r32.7.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/tensorflow:r32.7.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag tensorflow) +jetson-containers run -v /path/on/host:/path/in/container $(autotag tensorflow) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag tensorflow) my_app --abc xyz +jetson-containers run $(autotag tensorflow) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER
@@ -81,7 +81,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker
If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run:
```bash
-./build.sh tensorflow
+jetson-containers build tensorflow
```
-The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options.
+The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
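The `Requires` rows being updated above (`L4T ['>=32.6']`, `L4T ['<36']`, and so on) are version constraints that `autotag` checks against the host's L4T release when picking a compatible image. A rough sketch of the minimum-version half of that check, using GNU `sort -V` for version ordering — the helper name is hypothetical and the real logic lives in `jetson_containers`:

```shell
# Hypothetical helper sketching the `>=` check autotag performs for a
# "Requires" row like `L4T ['>=32.6']`; not the actual implementation.
l4t_at_least() {
    # succeeds when $1 (host release, e.g. r35.4.1) >= $2 (minimum, e.g. 32.6)
    host="${1#r}"; min="${2#r}"
    # sort -C -V exits 0 iff the two lines are already in version order
    printf '%s\n%s\n' "$min" "$host" | sort -C -V
}

l4t_at_least r35.4.1 32.6 && echo "compatible"   # JetPack 5 host passes >=32.6
l4t_at_least r32.5.0 32.6 || echo "too old"      # JetPack 4.5 host fails it
```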
diff --git a/packages/tensorrt/README.md b/packages/tensorrt/README.md index 7e0f6a381..2a308a454 100644 --- a/packages/tensorrt/README.md +++ b/packages/tensorrt/README.md @@ -8,19 +8,23 @@ | **`tensorrt:8.6`** | | | :-- | :-- | -|    Aliases | `tensorrt` | |    Builds | [![`tensorrt-86_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tensorrt-86_jp60.yml?label=tensorrt-86:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tensorrt-86_jp60.yml) | -|    Requires | `L4T ==36.*` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) | -|    Dependants | [`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) 
[`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`optimum`](/packages/llm/optimum) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) [`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) 
[`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) [`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | -|    Dockerfile | [`Dockerfile`](Dockerfile) | +|    Requires | `L4T ['==r36.*', '==cu122']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.2`](/packages/cuda/cuda) [`cudnn:8.9`](/packages/cuda/cudnn) [`python`](/packages/build/python) | +|    Dockerfile | [`Dockerfile.deb`](Dockerfile.deb) | |    Images | [`dustynv/tensorrt:8.6-r36.2.0`](https://hub.docker.com/r/dustynv/tensorrt/tags) `(2023-12-05, 6.7GB)` | +| **`tensorrt:10.0`** | | +| :-- | :-- | +|    Requires | `L4T ['==r36.*', '==cu124']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:12.4`](/packages/cuda/cuda) [`cudnn:9.0`](/packages/cuda/cudnn) [`python`](/packages/build/python) | +|    Dockerfile | [`Dockerfile.tar`](Dockerfile.tar) | + | **`tensorrt`** | | | :-- | :-- | -|    Requires | `L4T <36` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) | -|    Dependants | 
[`audiocraft`](/packages/audio/audiocraft) [`auto_awq`](/packages/llm/auto_awq) [`auto_gptq`](/packages/llm/auto_gptq) [`awq`](/packages/llm/awq) [`awq:dev`](/packages/llm/awq) [`bitsandbytes`](/packages/llm/bitsandbytes) [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`exllama:v1`](/packages/llm/exllama) [`exllama:v2`](/packages/llm/exllama) [`faiss_lite`](/packages/vectordb/faiss_lite) [`gptq-for-llama`](/packages/llm/gptq-for-llama) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`l4t-text-generation`](/packages/l4t/l4t-text-generation) [`langchain`](/packages/llm/langchain) [`langchain:samples`](/packages/llm/langchain) [`llava`](/packages/llm/llava) [`local_llm`](/packages/llm/local_llm) [`minigpt4`](/packages/llm/minigpt4) [`mlc:1f70d71`](/packages/llm/mlc) [`mlc:1f70d71-builder`](/packages/llm/mlc) [`mlc:3feed05`](/packages/llm/mlc) [`mlc:3feed05-builder`](/packages/llm/mlc) [`mlc:51fb0f4`](/packages/llm/mlc) [`mlc:51fb0f4-builder`](/packages/llm/mlc) [`mlc:5584cac`](/packages/llm/mlc) [`mlc:5584cac-builder`](/packages/llm/mlc) [`mlc:607dc5a`](/packages/llm/mlc) [`mlc:607dc5a-builder`](/packages/llm/mlc) [`mlc:731616e`](/packages/llm/mlc) [`mlc:731616e-builder`](/packages/llm/mlc) [`mlc:9bf5723`](/packages/llm/mlc) [`mlc:9bf5723-builder`](/packages/llm/mlc) [`mlc:dev`](/packages/llm/mlc) [`mlc:dev-builder`](/packages/llm/mlc) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`nemo`](/packages/nemo) [`onnxruntime`](/packages/onnxruntime) [`openai-triton`](/packages/openai-triton) [`optimum`](/packages/llm/optimum) [`pytorch:1.10`](/packages/pytorch) [`pytorch:1.11`](/packages/pytorch) 
[`pytorch:1.12`](/packages/pytorch) [`pytorch:1.13`](/packages/pytorch) [`pytorch:1.9`](/packages/pytorch) [`pytorch:2.0`](/packages/pytorch) [`pytorch:2.0-distributed`](/packages/pytorch) [`pytorch:2.1`](/packages/pytorch) [`pytorch:2.1-builder`](/packages/pytorch) [`pytorch:2.1-distributed`](/packages/pytorch) [`raft`](/packages/rapids/raft) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) [`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion`](/packages/diffusion/stable-diffusion) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt`](/packages/tensorrt) [`text-generation-inference`](/packages/llm/text-generation-inference) [`text-generation-webui:1.7`](/packages/llm/text-generation-webui) [`text-generation-webui:6a7cd01`](/packages/llm/text-generation-webui) [`text-generation-webui:main`](/packages/llm/text-generation-webui) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`torchaudio`](/packages/pytorch/torchaudio) [`torchvision`](/packages/pytorch/torchvision) [`transformers`](/packages/llm/transformers) [`transformers:git`](/packages/llm/transformers) [`transformers:nvgpt`](/packages/llm/transformers) [`tritonserver`](/packages/tritonserver) 
[`tvm`](/packages/tvm) [`whisper`](/packages/audio/whisper) [`whisperx`](/packages/audio/whisperx) [`xformers`](/packages/llm/xformers) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | +|    Requires | `L4T ['<36']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) | +|    Dependants | [`deepstream`](/packages/deepstream) [`efficientvit`](/packages/vit/efficientvit) [`jetson-inference`](/packages/jetson-inference) [`jetson-utils`](/packages/jetson-utils) [`l4t-diffusion`](/packages/l4t/l4t-diffusion) [`l4t-ml`](/packages/l4t/l4t-ml) [`l4t-pytorch`](/packages/l4t/l4t-pytorch) [`l4t-tensorflow:tf1`](/packages/l4t/l4t-tensorflow) [`l4t-tensorflow:tf2`](/packages/l4t/l4t-tensorflow) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) [`nanoowl`](/packages/vit/nanoowl) [`nanosam`](/packages/vit/nanosam) [`onnxruntime:1.11`](/packages/onnxruntime) [`onnxruntime:1.11-builder`](/packages/onnxruntime) [`onnxruntime:1.16.3`](/packages/onnxruntime) [`onnxruntime:1.16.3-builder`](/packages/onnxruntime) [`onnxruntime:1.17`](/packages/onnxruntime) [`onnxruntime:1.17-builder`](/packages/onnxruntime) [`optimum`](/packages/llm/optimum) [`piper-tts`](/packages/audio/piper-tts) [`ros:foxy-desktop`](/packages/ros) [`ros:foxy-ros-base`](/packages/ros) [`ros:foxy-ros-core`](/packages/ros) [`ros:galactic-desktop`](/packages/ros) [`ros:galactic-ros-base`](/packages/ros) [`ros:galactic-ros-core`](/packages/ros) [`ros:humble-desktop`](/packages/ros) [`ros:humble-ros-base`](/packages/ros) [`ros:humble-ros-core`](/packages/ros) [`ros:iron-desktop`](/packages/ros) [`ros:iron-ros-base`](/packages/ros) [`ros:iron-ros-core`](/packages/ros) [`ros:melodic-desktop`](/packages/ros) [`ros:melodic-ros-base`](/packages/ros) [`ros:melodic-ros-core`](/packages/ros) 
[`ros:noetic-desktop`](/packages/ros) [`ros:noetic-ros-base`](/packages/ros) [`ros:noetic-ros-core`](/packages/ros) [`sam`](/packages/vit/sam) [`stable-diffusion-webui`](/packages/diffusion/stable-diffusion-webui) [`tam`](/packages/vit/tam) [`tensorflow`](/packages/tensorflow) [`tensorflow2`](/packages/tensorflow) [`tensorrt_llm:0.10.dev0`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.10.dev0-builder`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5`](/packages/llm/tensorrt_llm) [`tensorrt_llm:0.5-builder`](/packages/llm/tensorrt_llm) [`torch2trt`](/packages/pytorch/torch2trt) [`torch_tensorrt`](/packages/pytorch/torch_tensorrt) [`tritonserver`](/packages/tritonserver) [`xtts`](/packages/audio/xtts) [`zed`](/packages/zed) | |    Images | [`dustynv/tensorrt:8.6-r36.2.0`](https://hub.docker.com/r/dustynv/tensorrt/tags) `(2023-12-05, 6.7GB)` | @@ -42,29 +46,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag tensorrt) +jetson-containers run $(autotag tensorrt) # or explicitly specify one of the container images above -./run.sh dustynv/tensorrt:8.6-r36.2.0 +jetson-containers run dustynv/tensorrt:8.6-r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/tensorrt:8.6-r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like setting `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag tensorrt) +jetson-containers run -v /path/on/host:/path/in/container $(autotag tensorrt) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag tensorrt) my_app --abc xyz +jetson-containers run $(autotag tensorrt) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to `jetson-containers run` that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -72,7 +76,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh tensorrt +jetson-containers build tensorrt ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/tritonserver/README.md b/packages/tritonserver/README.md index 86d686039..db1c0fa5c 100644 --- a/packages/tritonserver/README.md +++ b/packages/tritonserver/README.md @@ -9,8 +9,8 @@ | **`tritonserver`** | | | :-- | :-- | |    Builds | [![`tritonserver_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tritonserver_jp46.yml?label=tritonserver:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tritonserver_jp46.yml) [![`tritonserver_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tritonserver_jp51.yml?label=tritonserver:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tritonserver_jp51.yml) [![`tritonserver_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tritonserver_jp60.yml?label=tritonserver:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tritonserver_jp60.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) | |    Dependants | [`deepstream`](/packages/deepstream) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/tritonserver:r32.7.1`](https://hub.docker.com/r/dustynv/tritonserver/tags) `(2024-02-27, 1.3GB)`
[`dustynv/tritonserver:r35.2.1`](https://hub.docker.com/r/dustynv/tritonserver/tags) `(2023-09-07, 5.9GB)`
[`dustynv/tritonserver:r35.3.1`](https://hub.docker.com/r/dustynv/tritonserver/tags) `(2024-02-27, 5.9GB)`
[`dustynv/tritonserver:r35.4.1`](https://hub.docker.com/r/dustynv/tritonserver/tags) `(2024-02-27, 5.9GB)`
[`dustynv/tritonserver:r36.2.0`](https://hub.docker.com/r/dustynv/tritonserver/tags) `(2024-02-27, 8.4GB)` | @@ -39,29 +39,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag tritonserver) +jetson-containers run $(autotag tritonserver) # or explicitly specify one of the container images above -./run.sh dustynv/tritonserver:r35.4.1 +jetson-containers run dustynv/tritonserver:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/tritonserver:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like setting `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag tritonserver) +jetson-containers run -v /path/on/host:/path/in/container $(autotag tritonserver) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag tritonserver) my_app --abc xyz +jetson-containers run $(autotag tritonserver) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to `jetson-containers run` that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -69,7 +69,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh tritonserver +jetson-containers build tritonserver ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/tvm/README.md b/packages/tvm/README.md index 05d6ac561..6364e5f5a 100644 --- a/packages/tvm/README.md +++ b/packages/tvm/README.md @@ -9,8 +9,8 @@ | **`tvm`** | | | :-- | :-- | |    Builds | [![`tvm_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tvm_jp51.yml?label=tvm:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tvm_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`rust`](/packages/rust) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`rust`](/packages/build/rust) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -19,27 +19,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag tvm) +jetson-containers run $(autotag tvm) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host tvm:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like setting `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag tvm) +jetson-containers run -v /path/on/host:/path/in/container $(autotag tvm) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag tvm) my_app --abc xyz +jetson-containers run $(autotag tvm) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to `jetson-containers run` that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -47,7 +47,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh tvm +jetson-containers build tvm ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vectordb/faiss/README.md b/packages/vectordb/faiss/README.md index 2197d860b..5c638a126 100644 --- a/packages/vectordb/faiss/README.md +++ b/packages/vectordb/faiss/README.md @@ -6,34 +6,32 @@ CONTAINERS
-| **`faiss:v1.7.3-builder`** | | +| **`faiss:1.7.3`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | -| **`faiss:v1.7.3`** | | +| **`faiss:1.7.3-builder`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -| **`faiss:be12427-builder`** | | +| **`faiss:1.7.4`** | | | :-- | :-- | -|    Aliases | `faiss:builder` | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) | -|    Dockerfile | [`Dockerfile.builder`](Dockerfile.builder) | -|    Images | [`dustynv/faiss:be12427-builder-r36.2.0`](https://hub.docker.com/r/dustynv/faiss/tags) `(2024-03-09, 4.2GB)` | +|    Aliases | `faiss` | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) 
[`cmake`](/packages/build/cmake/cmake_pip) | +|    Dependants | [`faiss_lite`](/packages/vectordb/faiss_lite) [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) | +|    Dockerfile | [`Dockerfile`](Dockerfile) | -| **`faiss:be12427`** | | +| **`faiss:1.7.4-builder`** | | | :-- | :-- | -|    Aliases | `faiss` | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/python) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) | -|    Dependants | [`faiss_lite`](/packages/vectordb/faiss_lite) [`local_llm`](/packages/llm/local_llm) [`nanodb`](/packages/vectordb/nanodb) | +|    Aliases | `faiss:builder` | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) | |    Dockerfile | [`Dockerfile`](Dockerfile) | -|    Images | [`dustynv/faiss:be12427-builder-r36.2.0`](https://hub.docker.com/r/dustynv/faiss/tags) `(2024-03-09, 4.2GB)` | @@ -43,7 +41,9 @@ | Repository/Tag | Date | Arch | Size | | :-- | :--: | :--: | :--: | +|   [`dustynv/faiss:be12427-builder-r35.4.1`](https://hub.docker.com/r/dustynv/faiss/tags) | `2024-03-26` | `arm64` | `6.2GB` | |   [`dustynv/faiss:be12427-builder-r36.2.0`](https://hub.docker.com/r/dustynv/faiss/tags) | `2024-03-09` | `arm64` | `4.2GB` | +|   [`dustynv/faiss:builder-r35.4.1`](https://hub.docker.com/r/dustynv/faiss/tags) | `2024-03-26` | `arm64` | `6.2GB` | |   [`dustynv/faiss:lite-r35.2.1`](https://hub.docker.com/r/dustynv/faiss/tags) | `2023-12-11` | `arm64` | `6.4GB` | |   [`dustynv/faiss:lite-r35.3.1`](https://hub.docker.com/r/dustynv/faiss/tags) | `2023-11-05` | `arm64` | `6.4GB` | |   
[`dustynv/faiss:lite-r35.4.1`](https://hub.docker.com/r/dustynv/faiss/tags) | `2023-12-14` | `arm64` | `6.4GB` | @@ -58,29 +58,29 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag faiss) +jetson-containers run $(autotag faiss) # or explicitly specify one of the container images above -./run.sh dustynv/faiss:be12427-builder-r36.2.0 +jetson-containers run dustynv/faiss:be12427-builder-r35.4.1 # or if using 'docker run' (specify image and mounts/ect) -sudo docker run --runtime nvidia -it --rm --network=host dustynv/faiss:be12427-builder-r36.2.0 +sudo docker run --runtime nvidia -it --rm --network=host dustynv/faiss:be12427-builder-r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like setting `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag faiss) +jetson-containers run -v /path/on/host:/path/in/container $(autotag faiss) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag faiss) my_app --abc xyz +jetson-containers run $(autotag faiss) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to `jetson-containers run` that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -88,7 +88,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh faiss +jetson-containers build faiss ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vectordb/faiss_lite/README.md b/packages/vectordb/faiss_lite/README.md index d5427ebd3..85a825b43 100644 --- a/packages/vectordb/faiss_lite/README.md +++ b/packages/vectordb/faiss_lite/README.md @@ -8,9 +8,9 @@ | **`faiss_lite`** | | | :-- | :-- | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) | -|    Dependants | [`local_llm`](/packages/llm/local_llm) [`nanodb`](/packages/vectordb/nanodb) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) | +|    Dependants | [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) [`nanodb`](/packages/vectordb/nanodb) | |    Dockerfile | [`Dockerfile`](Dockerfile) | @@ -19,27 +19,27 @@ RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag faiss_lite) +jetson-containers run $(autotag faiss_lite) # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host faiss_lite:35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like setting `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag faiss_lite) +jetson-containers run -v /path/on/host:/path/in/container $(autotag faiss_lite) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag faiss_lite) my_app --abc xyz +jetson-containers run $(autotag faiss_lite) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to `jetson-containers run` that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -47,7 +47,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh faiss_lite +jetson-containers build faiss_lite ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vectordb/nanodb/README.md b/packages/vectordb/nanodb/README.md index f1ff886c8..0bc456405 100644 --- a/packages/vectordb/nanodb/README.md +++ b/packages/vectordb/nanodb/README.md @@ -96,9 +96,9 @@ Then navigate your browser to `http://HOSTNAME:7860?__theme=dark`, and you can e | **`nanodb`** | | | :-- | :-- | |    Builds | [![`nanodb_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanodb_jp51.yml?label=nanodb:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanodb_jp51.yml) [![`nanodb_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanodb_jp60.yml?label=nanodb:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanodb_jp60.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) [`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`torch2trt`](/packages/pytorch/torch2trt) | -|    Dependants | [`local_llm`](/packages/llm/local_llm) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda:11.4`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`cuda-python`](/packages/cuda/cuda-python) [`faiss`](/packages/vectordb/faiss) [`faiss_lite`](/packages/vectordb/faiss_lite) 
[`torchvision`](/packages/pytorch/torchvision) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) | +|    Dependants | [`local_llm`](/packages/llm/local_llm) [`nano_llm:24.4`](/packages/llm/nano_llm) [`nano_llm:main`](/packages/llm/nano_llm) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/nanodb:r35.2.1`](https://hub.docker.com/r/dustynv/nanodb/tags) `(2023-12-14, 6.9GB)`
[`dustynv/nanodb:r35.3.1`](https://hub.docker.com/r/dustynv/nanodb/tags) `(2023-12-15, 7.0GB)`
[`dustynv/nanodb:r35.4.1`](https://hub.docker.com/r/dustynv/nanodb/tags) `(2023-12-12, 6.9GB)`
[`dustynv/nanodb:r36.2.0`](https://hub.docker.com/r/dustynv/nanodb/tags) `(2024-03-08, 7.8GB)` | @@ -124,29 +124,29 @@ Then navigate your browser to `http://HOSTNAME:7860?__theme=dark`, and you can e
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag nanodb) +jetson-containers run $(autotag nanodb) # or explicitly specify one of the container images above -./run.sh dustynv/nanodb:r36.2.0 +jetson-containers run dustynv/nanodb:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/nanodb:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like setting `--runtime nvidia`, mounting a `/data` cache, and detecting devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag nanodb) +jetson-containers run -v /path/on/host:/path/in/container $(autotag nanodb) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag nanodb) my_app --abc xyz +jetson-containers run $(autotag nanodb) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to `jetson-containers run` that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -154,7 +154,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh nanodb +jetson-containers build nanodb ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vit/efficientvit/README.md b/packages/vit/efficientvit/README.md index 33fc30156..854001546 100644 --- a/packages/vit/efficientvit/README.md +++ b/packages/vit/efficientvit/README.md @@ -10,8 +10,8 @@ docs.md | **`efficientvit`** | | | :-- | :-- | |    Builds | [![`efficientvit_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/efficientvit_jp60.yml?label=efficientvit:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/efficientvit_jp60.yml) [![`efficientvit_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/efficientvit_jp51.yml?label=efficientvit:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/efficientvit_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:distributed`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`opencv`](/packages/opencv) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`onnxruntime`](/packages/onnxruntime) [`jupyterlab`](/packages/jupyterlab) [`sam`](/packages/vit/sam) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`opencv`](/packages/opencv) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) 
[`tensorrt`](/packages/tensorrt) [`onnxruntime`](/packages/onnxruntime) [`jupyterlab`](/packages/jupyterlab) [`sam`](/packages/vit/sam) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/efficientvit:r35.3.1`](https://hub.docker.com/r/dustynv/efficientvit/tags) `(2024-03-07, 6.5GB)`
[`dustynv/efficientvit:r35.4.1`](https://hub.docker.com/r/dustynv/efficientvit/tags) `(2024-01-13, 6.5GB)`
[`dustynv/efficientvit:r36.2.0`](https://hub.docker.com/r/dustynv/efficientvit/tags) `(2024-01-13, 8.1GB)` | @@ -36,29 +36,29 @@ docs.md
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag efficientvit) +jetson-containers run $(autotag efficientvit) # or explicitly specify one of the container images above -./run.sh dustynv/efficientvit:r35.3.1 +jetson-containers run dustynv/efficientvit:r35.3.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/efficientvit:r35.3.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag efficientvit) +jetson-containers run -v /path/on/host:/path/in/container $(autotag efficientvit) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag efficientvit) my_app --abc xyz +jetson-containers run $(autotag efficientvit) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
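As noted above, the helper prints the full `docker run` command it constructs before executing it. The snippet below is only a rough sketch of how those defaults and your extra flags compose into that final command; the real tool also detects devices and mounts its `/data` cache, and the exact flags vary by version:

```bash
# Sketch: emulate how the run helper assembles its docker command.
# Illustrative only -- the actual defaults differ by version and system.
IMAGE="dustynv/efficientvit:r35.3.1"
EXTRA_ARGS="-v /path/on/host:/path/in/container"
CMD="docker run --runtime nvidia -it --rm --network=host $EXTRA_ARGS $IMAGE"
echo "$CMD"
```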
BUILD CONTAINER @@ -66,7 +66,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh efficientvit +jetson-containers build efficientvit ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vit/nanoowl/README.md b/packages/vit/nanoowl/README.md index 115aeb351..62755e5e2 100644 --- a/packages/vit/nanoowl/README.md +++ b/packages/vit/nanoowl/README.md @@ -44,8 +44,8 @@ | **`nanoowl`** | | | :-- | :-- | |    Builds | [![`nanoowl_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanoowl_jp51.yml?label=nanoowl:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanoowl_jp51.yml) [![`nanoowl_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanoowl_jp60.yml?label=nanoowl:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanoowl_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torch2trt`](/packages/pytorch/torch2trt) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) [`opencv`](/packages/opencv) [`gstreamer`](/packages/gstreamer) | |    Dockerfile | 
[`Dockerfile`](Dockerfile) | |    Images | [`dustynv/nanoowl:r35.2.1`](https://hub.docker.com/r/dustynv/nanoowl/tags) `(2023-12-14, 7.1GB)`
[`dustynv/nanoowl:r35.3.1`](https://hub.docker.com/r/dustynv/nanoowl/tags) `(2024-02-22, 7.1GB)`
[`dustynv/nanoowl:r35.4.1`](https://hub.docker.com/r/dustynv/nanoowl/tags) `(2023-12-11, 7.1GB)`
[`dustynv/nanoowl:r36.2.0`](https://hub.docker.com/r/dustynv/nanoowl/tags) `(2024-02-22, 9.0GB)` | @@ -71,29 +71,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag nanoowl) +jetson-containers run $(autotag nanoowl) # or explicitly specify one of the container images above -./run.sh dustynv/nanoowl:r36.2.0 +jetson-containers run dustynv/nanoowl:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/nanoowl:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag nanoowl) +jetson-containers run -v /path/on/host:/path/in/container $(autotag nanoowl) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag nanoowl) my_app --abc xyz +jetson-containers run $(autotag nanoowl) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -101,7 +101,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh nanoowl +jetson-containers build nanoowl ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
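The `$(autotag …)` pattern used throughout these examples is ordinary shell command substitution: `autotag` prints a resolved image name, which the outer command then receives as an argument. The stub below imitates that flow so the pattern can be seen end-to-end; the real `autotag` picks an image compatible with your JetPack/L4T version, while this stand-in just returns a fixed tag for illustration:

```bash
# Stub of autotag (NOT the real tool) to show the substitution pattern.
autotag() { echo "dustynv/$1:r36.2.0"; }

# Command substitution feeds the resolved image name to the run command.
IMAGE=$(autotag nanoowl)
echo "jetson-containers run $IMAGE"
```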
diff --git a/packages/vit/nanosam/README.md b/packages/vit/nanosam/README.md index 527bda31c..6a20e9f25 100644 --- a/packages/vit/nanosam/README.md +++ b/packages/vit/nanosam/README.md @@ -33,8 +33,8 @@ | **`nanosam`** | | | :-- | :-- | |    Builds | [![`nanosam_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanosam_jp60.yml?label=nanosam:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanosam_jp60.yml) [![`nanosam_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/nanosam_jp51.yml?label=nanosam:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/nanosam_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`torch2trt`](/packages/pytorch/torch2trt) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/rust) [`transformers`](/packages/llm/transformers) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) [`torch2trt`](/packages/pytorch/torch2trt) [`huggingface_hub`](/packages/llm/huggingface_hub) [`rust`](/packages/build/rust) [`transformers`](/packages/llm/transformers) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/nanosam:r35.2.1`](https://hub.docker.com/r/dustynv/nanosam/tags) `(2023-12-15, 6.2GB)`
[`dustynv/nanosam:r35.3.1`](https://hub.docker.com/r/dustynv/nanosam/tags) `(2023-12-14, 6.2GB)`
[`dustynv/nanosam:r35.4.1`](https://hub.docker.com/r/dustynv/nanosam/tags) `(2023-11-05, 6.2GB)`
[`dustynv/nanosam:r36.2.0`](https://hub.docker.com/r/dustynv/nanosam/tags) `(2023-12-15, 7.9GB)` | @@ -60,29 +60,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag nanosam) +jetson-containers run $(autotag nanosam) # or explicitly specify one of the container images above -./run.sh dustynv/nanosam:r35.2.1 +jetson-containers run dustynv/nanosam:r35.2.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/nanosam:r35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag nanosam) +jetson-containers run -v /path/on/host:/path/in/container $(autotag nanosam) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag nanosam) my_app --abc xyz +jetson-containers run $(autotag nanosam) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -90,7 +90,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh nanosam +jetson-containers build nanosam ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vit/sam/README.md b/packages/vit/sam/README.md index 967f6233e..2e962cf40 100644 --- a/packages/vit/sam/README.md +++ b/packages/vit/sam/README.md @@ -62,8 +62,8 @@ Outputs are: | **`sam`** | | | :-- | :-- | |    Builds | [![`sam_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/sam_jp51.yml?label=sam:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/sam_jp51.yml) [![`sam_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/sam_jp60.yml?label=sam:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/sam_jp60.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`onnxruntime`](/packages/onnxruntime) [`opencv`](/packages/opencv) [`rust`](/packages/rust) [`jupyterlab`](/packages/jupyterlab) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) [`onnxruntime`](/packages/onnxruntime) [`opencv`](/packages/opencv) [`rust`](/packages/build/rust) [`jupyterlab`](/packages/jupyterlab) | |    Dependants | [`efficientvit`](/packages/vit/efficientvit) [`tam`](/packages/vit/tam) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/sam:r35.2.1`](https://hub.docker.com/r/dustynv/sam/tags) `(2023-11-05, 6.1GB)`
[`dustynv/sam:r35.3.1`](https://hub.docker.com/r/dustynv/sam/tags) `(2024-03-07, 6.1GB)`
[`dustynv/sam:r35.4.1`](https://hub.docker.com/r/dustynv/sam/tags) `(2024-01-13, 6.1GB)`
[`dustynv/sam:r36.2.0`](https://hub.docker.com/r/dustynv/sam/tags) `(2024-03-07, 7.9GB)` | @@ -90,29 +90,29 @@ Outputs are:
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag sam) +jetson-containers run $(autotag sam) # or explicitly specify one of the container images above -./run.sh dustynv/sam:r36.2.0 +jetson-containers run dustynv/sam:r36.2.0 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/sam:r36.2.0 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag sam) +jetson-containers run -v /path/on/host:/path/in/container $(autotag sam) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag sam) my_app --abc xyz +jetson-containers run $(autotag sam) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -120,7 +120,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh sam +jetson-containers build sam ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/vit/tam/README.md b/packages/vit/tam/README.md index 07f73059f..a348bbd71 100644 --- a/packages/vit/tam/README.md +++ b/packages/vit/tam/README.md @@ -17,8 +17,8 @@ Use your web browser to access `http://HOSTNAME:12212` | **`tam`** | | | :-- | :-- | |    Builds | [![`tam_jp60`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tam_jp60.yml?label=tam:jp60)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tam_jp60.yml) [![`tam_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/tam_jp51.yml?label=tam:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/tam_jp51.yml) | -|    Requires | `L4T >=34.1.0` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) [`numpy`](/packages/numpy) [`cmake`](/packages/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:distributed`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`onnxruntime`](/packages/onnxruntime) [`opencv`](/packages/opencv) [`rust`](/packages/rust) [`jupyterlab`](/packages/jupyterlab) [`sam`](/packages/vit/sam) | +|    Requires | `L4T ['>=34.1.0']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`numpy`](/packages/numpy) [`cmake`](/packages/build/cmake/cmake_pip) [`onnx`](/packages/onnx) [`pytorch:2.2`](/packages/pytorch) [`torchvision`](/packages/pytorch/torchvision) [`tensorrt`](/packages/tensorrt) [`onnxruntime`](/packages/onnxruntime) [`opencv`](/packages/opencv) [`rust`](/packages/build/rust) [`jupyterlab`](/packages/jupyterlab) [`sam`](/packages/vit/sam) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/tam:r35.2.1`](https://hub.docker.com/r/dustynv/tam/tags) `(2023-12-12, 6.9GB)`
[`dustynv/tam:r35.3.1`](https://hub.docker.com/r/dustynv/tam/tags) `(2024-01-13, 7.0GB)`
[`dustynv/tam:r35.4.1`](https://hub.docker.com/r/dustynv/tam/tags) `(2024-03-07, 7.0GB)`
[`dustynv/tam:r36.2.0`](https://hub.docker.com/r/dustynv/tam/tags) `(2024-01-13, 8.6GB)` | @@ -44,29 +44,29 @@ Use your web browser to access `http://HOSTNAME:12212`
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag tam) +jetson-containers run $(autotag tam) # or explicitly specify one of the container images above -./run.sh dustynv/tam:r35.4.1 +jetson-containers run dustynv/tam:r35.4.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/tam:r35.4.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag tam) +jetson-containers run -v /path/on/host:/path/in/container $(autotag tam) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag tam) my_app --abc xyz +jetson-containers run $(autotag tam) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -74,7 +74,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh tam +jetson-containers build tam ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
diff --git a/packages/zed/README.md b/packages/zed/README.md index 64d89c4e2..84307b927 100644 --- a/packages/zed/README.md +++ b/packages/zed/README.md @@ -9,8 +9,8 @@ | **`zed`** | | | :-- | :-- | |    Builds | [![`zed_jp46`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/zed_jp46.yml?label=zed:jp46)](https://github.com/dusty-nv/jetson-containers/actions/workflows/zed_jp46.yml) [![`zed_jp51`](https://img.shields.io/github/actions/workflow/status/dusty-nv/jetson-containers/zed_jp51.yml?label=zed:jp51)](https://github.com/dusty-nv/jetson-containers/actions/workflows/zed_jp51.yml) | -|    Requires | `L4T >=32.6` | -|    Dependencies | [`build-essential`](/packages/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/python) [`tensorrt`](/packages/tensorrt) | +|    Requires | `L4T ['>=32.6']` | +|    Dependencies | [`build-essential`](/packages/build/build-essential) [`cuda`](/packages/cuda/cuda) [`cudnn`](/packages/cuda/cudnn) [`python`](/packages/build/python) [`tensorrt`](/packages/tensorrt) | |    Dockerfile | [`Dockerfile`](Dockerfile) | |    Images | [`dustynv/zed:r32.7.1`](https://hub.docker.com/r/dustynv/zed/tags) `(2023-09-07, 0.6GB)`
[`dustynv/zed:r35.2.1`](https://hub.docker.com/r/dustynv/zed/tags) `(2023-12-11, 5.2GB)`
[`dustynv/zed:r35.3.1`](https://hub.docker.com/r/dustynv/zed/tags) `(2023-08-29, 5.2GB)`
[`dustynv/zed:r35.4.1`](https://hub.docker.com/r/dustynv/zed/tags) `(2023-10-07, 5.1GB)` | |    Notes | https://github.com/stereolabs/zed-docker/blob/master/4.X/l4t/py-devel/Dockerfile | @@ -37,29 +37,29 @@
RUN CONTAINER
-To start the container, you can use the [`run.sh`](/docs/run.md)/[`autotag`](/docs/run.md#autotag) helpers or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: +To start the container, you can use [`jetson-containers run`](/docs/run.md) and [`autotag`](/docs/run.md#autotag), or manually put together a [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) command: ```bash # automatically pull or build a compatible container image -./run.sh $(./autotag zed) +jetson-containers run $(autotag zed) # or explicitly specify one of the container images above -./run.sh dustynv/zed:r35.2.1 +jetson-containers run dustynv/zed:r35.2.1 # or if using 'docker run' (specify image and mounts/ect) sudo docker run --runtime nvidia -it --rm --network=host dustynv/zed:r35.2.1 ``` -> [`run.sh`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
+> [`jetson-containers run`](/docs/run.md) forwards arguments to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/) with some defaults added (like `--runtime nvidia`, mounts a `/data` cache, and detects devices)
> [`autotag`](/docs/run.md#autotag) finds a container image that's compatible with your version of JetPack/L4T - either locally, pulled from a registry, or by building it. To mount your own directories into the container, use the [`-v`](https://docs.docker.com/engine/reference/commandline/run/#volume) or [`--volume`](https://docs.docker.com/engine/reference/commandline/run/#volume) flags: ```bash -./run.sh -v /path/on/host:/path/in/container $(./autotag zed) +jetson-containers run -v /path/on/host:/path/in/container $(autotag zed) ``` To launch the container running a command, as opposed to an interactive shell: ```bash -./run.sh $(./autotag zed) my_app --abc xyz +jetson-containers run $(autotag zed) my_app --abc xyz ``` -You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it. +You can pass any options to it that you would to [`docker run`](https://docs.docker.com/engine/reference/commandline/run/), and it'll print out the full command that it constructs before executing it.
BUILD CONTAINER @@ -67,7 +67,7 @@ You can pass any options to [`run.sh`](/docs/run.md) that you would to [`docker If you use [`autotag`](/docs/run.md#autotag) as shown above, it'll ask to build the container for you if needed. To manually build it, first do the [system setup](/docs/setup.md), then run: ```bash -./build.sh zed +jetson-containers build zed ``` -The dependencies from above will be built into the container, and it'll be tested during. See [`./build.sh --help`](/jetson_containers/build.py) for build options. +The dependencies from above will be built into the container, and it'll be tested during the build. Run it with [`--help`](/jetson_containers/build.py) for build options.
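Since `jetson-containers build` takes a single package name per invocation in these examples, building several of the packages above is easily scripted with a plain shell loop. The sketch below is a dry run: the `echo` only prints each command, and dropping it would actually kick off the builds on a Jetson with the system setup done:

```bash
# Sketch: queue up several package builds (names from this repo's packages/).
# 'echo' keeps this a dry run; remove it to actually build.
count=0
for pkg in zed sam nanosam; do
  echo "jetson-containers build $pkg"
  count=$((count + 1))
done
```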