# nosana-llm-benchmarking

Nosana is a decentralized computing network that offers cloud compute at a fraction of the cost of traditional providers, leveraging the Solana blockchain for secure, fast, and scalable access to computing resources. It enables developers to run CI/CD pipelines, perform compute-intensive tasks, and more, all within a decentralized ecosystem.
This project benchmarks various AI models on the Nosana network, providing insights into their performance, efficiency, and scalability. By running everything inside a Docker container, we ensure consistency, reproducibility, and ease of deployment across Nosana's decentralized nodes.
The following models are included in the benchmark:

| Model Name | Description |
|---|---|
| mistralai/Mistral-7B-Instruct-v0.2 | 7B-parameter instruction-tuned model from Mistral AI |
| Qwen/Qwen1.5-72B-Chat | 72B-parameter chat model from Alibaba's Qwen team |
| meta-llama/Llama-2-7b | 7B-parameter foundation model from Meta |
| databricks/dbrx-instruct | Mixture-of-experts instruction-tuned model from Databricks |
| 01-ai/Yi-34B-200K | 34B-parameter model from 01.AI with a 200K-token context window |
| xai-org/grok-1 | Large mixture-of-experts model from xAI |
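The sketch below illustrates, under stated assumptions, how a model from the table could be loaded with Hugging Face Transformers and timed for generation throughput. The prompt, generation length, dtype, and the `device_map="auto"` loading (which requires the `accelerate` package) are assumptions for illustration, not the project's actual harness.

```python
# Minimal benchmarking sketch (illustrative): load a model with Hugging Face
# Transformers and measure generation throughput in tokens per second.
import time

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODELS = [
    "mistralai/Mistral-7B-Instruct-v0.2",
    "meta-llama/Llama-2-7b",
    # Larger models from the table (Qwen1.5-72B, DBRX, Yi-34B-200K, Grok-1)
    # need multi-GPU or quantized loading and are omitted from this sketch.
]

PROMPT = "Explain what a decentralized GPU network is in one paragraph."
MAX_NEW_TOKENS = 128


def benchmark(model_id: str) -> float:
    """Return generation throughput (new tokens per second) for one model."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.float16, device_map="auto"
    )
    inputs = tokenizer(PROMPT, return_tensors="pt").to(model.device)

    start = time.perf_counter()
    output = model.generate(**inputs, max_new_tokens=MAX_NEW_TOKENS)
    elapsed = time.perf_counter() - start

    new_tokens = output.shape[-1] - inputs["input_ids"].shape[-1]
    return new_tokens / elapsed


if __name__ == "__main__":
    for model_id in MODELS:
        print(f"{model_id}: {benchmark(model_id):.1f} tokens/s")
```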
The Docker container is built on an Ubuntu 22.04 base image with CUDA support, ensuring compatibility with GPU-accelerated workloads. It includes Python 3, pip, PyTorch, and the Hugging Face Transformers library, which are required to run the models.
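A minimal Dockerfile sketch along those lines is shown below; the exact CUDA base-image tag, package versions, and the `benchmark.py` entrypoint are assumptions, not the project's actual build.

```dockerfile
# Illustrative sketch: CUDA-enabled Ubuntu 22.04 base with Python 3, pip,
# PyTorch, and Transformers. Tags, versions, and entrypoint are assumptions.
FROM nvidia/cuda:12.1.1-runtime-ubuntu22.04

# System dependencies: Python 3 and pip.
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-pip && \
    rm -rf /var/lib/apt/lists/*

# Python dependencies for loading and running the models.
RUN pip3 install --no-cache-dir torch transformers accelerate

WORKDIR /app
COPY benchmark.py .

# Run the benchmark script when the container starts.
CMD ["python3", "benchmark.py"]
```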