This is a RunPod serverless worker implementation for FastHunyuan, a fast-inference version of the Hunyuan text-to-video model.
Prerequisites:
- A RunPod account with API access
- An NVIDIA GPU with CUDA support
- (Optional) VSCode for remote development
- Create a development pod on RunPod:
  - Go to the RunPod Console
  - Click "Deploy"
  - Select the base image `runpod/pytorch:2.1.2-py3.10-cuda12.1.0` (includes SSH and JupyterLab)
  - Choose your GPU type
  - Name your pod (e.g., "fasthunyuan-dev")
  - Deploy
- Connect to your pod (choose one method):
  a. Using the RunPod web terminal:
     - Click "Connect" on your pod in the RunPod console
     - Select "SSH Terminal" or use the web terminal
  b. Using VSCode (recommended for development):
     - Follow the official RunPod guide for VSCode setup
     - This allows you to develop directly on the pod with full IDE support
- Install dependencies:
  cd /workspace
  chmod +x builder/install.sh
  ./builder/install.sh
- Test the handler:
  python src/handler.py
- The handler is located in `src/handler.py` (see the sketch after these notes)
- Python dependencies are managed in `builder/requirements.txt`
- System dependencies and installation steps are in `builder/install.sh`
- Using VSCode remote development provides a better development experience with:
  - Integrated terminal
  - Code completion
  - Debugging capabilities
  - Git integration
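
For orientation, a minimal RunPod serverless handler for this kind of worker typically looks like the sketch below. This is an illustrative outline only (the model loading and inference call are elided, and the helper names are hypothetical); the actual implementation is in `src/handler.py`.

```python
# Minimal sketch of a RunPod serverless handler (illustrative only;
# the real implementation lives in src/handler.py).
import runpod


def handler(job):
    """Receive a job payload, run inference, and return the output path."""
    params = job["input"]
    prompt = params["prompt"]
    seed = params.get("seed", 1024)

    # ... load the FastHunyuan pipeline, render a video for `prompt`,
    # and write it to disk here ...

    video_path = f"/tmp/output_{seed}.mp4"
    return {"video_path": video_path}


# Register the handler with the RunPod serverless runtime.
runpod.serverless.start({"handler": handler})
```

Running `python src/handler.py` on the pod, as in the "Test the handler" step above, starts this worker loop locally.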
- Build the image:
  docker build -t your-registry/fasthunyuan-worker:version .
  Note: For production, we use a minimal base image (`nvidia/cuda:12.1.0-cudnn8-devel-ubuntu22.04`) without development tools.
- Push to your registry:
  docker push your-registry/fasthunyuan-worker:version
- Create a template on RunPod:
  - Base image: your-registry/fasthunyuan-worker:version
  - Container disk: at least 10 GB
  - HTTP endpoints: enabled
  - Environment variables: HUGGING_FACE_HUB_TOKEN=your_token (see the sketch after this list)
- Deploy serverless endpoints using the template
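
The `HUGGING_FACE_HUB_TOKEN` variable is typically used to authenticate model downloads from the Hugging Face Hub. As a rough illustration (not the exact code in this repo), the worker can pick the token up from the environment like this:

```python
# Illustrative only: authenticate to the Hugging Face Hub using the
# HUGGING_FACE_HUB_TOKEN environment variable set on the template.
import os

from huggingface_hub import login

token = os.environ.get("HUGGING_FACE_HUB_TOKEN")
if token:
    login(token=token)  # needed for gated or private model repositories
```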
Example request:

```json
{
  "input": {
    "prompt": "A cinematic video of a beautiful landscape",
    "height": 720,
    "width": 1280,
    "num_frames": 45,
    "num_inference_steps": 6,
    "seed": 1024,
    "fps": 24
  }
}
```

Example response:

```json
{
  "video_path": "/tmp/output_1024.mp4"
}
```
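
As an example of invoking a deployed endpoint, the request above can be sent through RunPod's serverless HTTP API; the endpoint ID and API key below are placeholders for your own values.

```python
# Sketch: call the deployed serverless endpoint synchronously over HTTP.
import os

import requests

ENDPOINT_ID = "your_endpoint_id"
API_KEY = os.environ["RUNPOD_API_KEY"]

payload = {
    "input": {
        "prompt": "A cinematic video of a beautiful landscape",
        "height": 720,
        "width": 1280,
        "num_frames": 45,
        "num_inference_steps": 6,
        "seed": 1024,
        "fps": 24,
    }
}

response = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=600,
)
print(response.json())  # the handler's return value appears under "output"
```

For long renders, the asynchronous `/run` endpoint with `/status` polling may be preferable to `/runsync`.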
```
.
├── builder/
│   ├── install.sh        # Development installation script
│   └── requirements.txt  # Python dependencies
├── src/
│   └── handler.py        # RunPod handler implementation
├── Dockerfile            # Production container definition
└── README.md             # This file
```
This project is licensed under the Apache License 2.0 - see the LICENSE file for details.