# Dolly 2.0 on IPUs


Dolly 2.0 is a 12B-parameter language model from Databricks. The large language model (LLM) has been trained and instruction fine-tuned, making it better suited for human interaction. Crucially, Databricks released all of the code, the model weights, and the fine-tuning dataset under an open-source license that permits commercial use. This makes Dolly 2.0 the world's first truly open-source instruction-tuned LLM, ready for you to run and test with your own prompts via a Paperspace notebook.

## Dolly notebooks powered by IPUs

| Notebook | Framework | Type | Try for free |
| --- | --- | --- | --- |
| Dolly 2.0 – The World's First, Truly Open Instruction-Tuned LLM on IPUs – Inference | Hugging Face | Inference | Gradient |

In this Paperspace Gradient notebook, you will learn how to run Dolly 2.0 with your own prompts on IPUs. The IPU (Intelligence Processing Unit) is a new kind of massively parallel processor designed to accelerate machine intelligence. In the notebook, you create and configure a Dolly inference pipeline, then run inference on a text prompt to generate answers to user-specified questions.
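
For orientation, here is a minimal sketch of the equivalent inference flow using the plain Hugging Face `transformers` pipeline API with the public `databricks/dolly-v2-12b` checkpoint. The notebook itself uses Graphcore's IPU-specific tooling, so the setup details there will differ; the prompt below is illustrative only.

```python
# Sketch: load Dolly 2.0 and generate text with the Hugging Face pipeline API.
# The IPU notebook configures an IPU-specific pipeline instead; this shows the
# same create-pipeline-then-prompt flow on a generic backend.
import torch
from transformers import pipeline

# trust_remote_code=True lets transformers use Dolly's custom
# instruction-following text-generation pipeline from the model repo.
generate_text = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

# Example prompt (any instruction-style question works here).
result = generate_text("Explain the difference between nuclear fission and fusion.")
print(result[0]["generated_text"])
```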

## Dolly resources

To do more with Dolly on IPUs, or to speak to an expert, please feel free to contact us.

## IPU community

Join our growing community and interact with AI experts, IPU developers and researchers. Hear the latest IPU news and get early access to our newest models.

Join our Slack Community

## License

The contents of this repository are made available under the terms of the MIT license. See the included LICENSE file for details.