This folder contains the following examples for the Code Llama 34B models:
File | Description | Model Used | GPU Minimum Requirement |
---|---|---|---|
01_load_inference | Environment setup and suggested configurations for running inference with Code Llama 34B models on Databricks. | CodeLlama-34b-hf <br> CodeLlama-34b-Instruct-hf <br> CodeLlama-34b-Python-hf | 4xA10-24GB / 1xA100-80GB |
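
As a quick orientation before opening the notebook, the sketch below shows the shape of an inference call against one of these checkpoints. The `[INST]` prompt wrapper is Code Llama Instruct's documented chat format; the dtype and `device_map` settings in the commented section are illustrative assumptions, not the notebook's exact configuration.

```python
# Sketch of prompting CodeLlama-34b-Instruct-hf via Hugging Face transformers.
# Prompt formatting is shown runnable; the load-and-generate calls are left as
# comments since they require a GPU meeting the table's minimum
# (4xA10-24GB / 1xA100-80GB).

def build_instruct_prompt(user_message: str) -> str:
    """Wrap a user message in Code Llama Instruct's [INST] template."""
    return f"<s>[INST] {user_message.strip()} [/INST]"

# Illustrative load-and-generate flow (settings are assumptions):
#
#   import torch
#   from transformers import AutoTokenizer, AutoModelForCausalLM
#
#   model_id = "codellama/CodeLlama-34b-Instruct-hf"
#   tok = AutoTokenizer.from_pretrained(model_id)
#   model = AutoModelForCausalLM.from_pretrained(
#       model_id, torch_dtype=torch.bfloat16, device_map="auto")
#   inputs = tok(build_instruct_prompt("Write a hello-world in Python"),
#                return_tensors="pt").to(model.device)
#   out = model.generate(inputs.input_ids, max_new_tokens=128)
#   print(tok.decode(out[0], skip_special_tokens=True))

prompt = build_instruct_prompt("Write a function that reverses a string")
print(prompt)
```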