chore: Incremental Docs Reorg Pt. 1 (#2262)
* part 1
* part 2
* part 2 - undo a lot
* part 2 - undo a lot
* part 2 - undo a lot
* tone down the tools section
* small adjustments
* small adjustments
* updated
* lint
* Apply suggestions from code review (Andrew changes)

Co-authored-by: Andrew Truong <[email protected]>
1 parent 144ea88 · commit 8c1b193
Showing 26 changed files with 789 additions and 673 deletions.
@@ -1,18 +1,37 @@
# Weave Integrations

Weave contains automatic logging integrations for popular LLMs and orchestration frameworks. Weave will automatically trace calls made via the following libraries:

- **[OpenAI](/guides/integrations/openai)**
- **[Anthropic](/guides/integrations/anthropic)**
- **[Cohere](/guides/integrations/cohere)**
- **[MistralAI](/guides/integrations/mistral)**
- **[LangChain](/guides/integrations/langchain)**
- **[LlamaIndex](/guides/integrations/llamaindex)**
- **[DSPy](/guides/integrations/dspy)**
- **[Google Gemini](/guides/integrations/google-gemini)**
- **[Together AI](/guides/integrations/together_ai)**
- **[Groq](/guides/integrations/groq)**
- **[Open Router](/guides/integrations/openrouter)**
- **[Local Models](/guides/integrations/local_models)**
- **[LiteLLM](/guides/integrations/litellm)**
# Integrations

:::success[Automatic Tracking]
In most cases, all you need to do is call `weave.init()` at the top of your script or program in order for Weave to automatically patch and track any of these libraries!
:::

Weave provides automatic logging integrations for popular LLM providers and orchestration frameworks. These integrations allow you to seamlessly trace calls made through various libraries, enhancing your ability to monitor and analyze your AI applications.
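As a minimal sketch of that setup (the project name below is an illustrative placeholder, not something from this commit):

```python
import weave

# A single init call at the top of the program is typically all that is
# needed; "my-project" is an illustrative project name.
weave.init("my-project")

# From here on, calls made through supported libraries (OpenAI, Anthropic,
# LangChain, ...) are patched and traced automatically.
```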

## LLM Providers

LLM providers are the vendors that offer access to large language models for generating predictions. Weave integrates with these providers to log and trace the interactions with their APIs:

- **[OpenAI](/guides/integrations/openai)**
- **[Anthropic](/guides/integrations/anthropic)**
- **[Cerebras](/guides/integrations/cerebras)**
- **[Cohere](/guides/integrations/cohere)**
- **[MistralAI](/guides/integrations/mistral)**
- **[Google Gemini](/guides/integrations/google-gemini)**
- **[Together AI](/guides/integrations/together_ai)**
- **[Groq](/guides/integrations/groq)**
- **[Open Router](/guides/integrations/openrouter)**
- **[LiteLLM](/guides/integrations/litellm)**

**[Local Models](/guides/integrations/local_models)**: For when you're running models on your own infrastructure.
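For instance, a hedged sketch of tracing a provider call through the OpenAI client (the model and project names are illustrative assumptions):

```python
import weave
from openai import OpenAI

weave.init("my-project")  # illustrative project name

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Because the OpenAI client is patched after weave.init(), this call is
# logged as a trace with its inputs, outputs, and latency.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": "Give me one fun fact."}],
)
print(response.choices[0].message.content)
```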

## Frameworks

Frameworks help orchestrate the actual execution pipelines in AI applications. They provide tools and abstractions for building complex workflows. Weave integrates with these frameworks to trace the entire pipeline:

- **[LangChain](/guides/integrations/langchain)**
- **[LlamaIndex](/guides/integrations/llamaindex)**
- **[DSPy](/guides/integrations/dspy)**
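As one hedged sketch of what pipeline-level tracing looks like (assuming the `langchain-openai` and `langchain-core` packages are installed; project and model names are illustrative):

```python
import weave
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

weave.init("my-project")  # illustrative project name

prompt = ChatPromptTemplate.from_messages(
    [("user", "Summarize in one sentence: {text}")]
)
llm = ChatOpenAI(model="gpt-4o-mini")  # illustrative model choice
chain = prompt | llm

# The whole chain run (prompt formatting, model call, final output) appears
# as one nested trace rather than as isolated LLM calls.
result = chain.invoke({"text": "Weave traces orchestration frameworks end to end."})
print(result.content)
```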

Choose an integration from the lists above to learn more about how to use Weave with your preferred LLM provider or framework. Whether you're directly accessing LLM APIs or building complex pipelines with orchestration frameworks, Weave provides the tools to trace and analyze your AI applications effectively.
@@ -1,4 +1,21 @@
# Tools
# Tools & Utilities

- [serve](/guides/tools/serve): Serve Weave ops and models
- [deploy](/guides/tools/deploy): Deploy Weave ops and models to various targets
Weave is developing a set of tools and utilities to help with your workflow and deployment process for AI applications. These are currently in early alpha stages and subject to change. Here's an overview of what we're working on:

## Serve (experimental)

[Serve](/guides/tools/serve) is a feature to expose your Weave ops and models as API endpoints. We're exploring possibilities such as:

- Creating web services for your Weave components
- Integrating Weave components into existing applications
- Providing a way to test models in a more production-like setting

## Deploy (experimental)

[Deploy](/guides/tools/deploy) is another alpha-stage utility we're developing to help with deploying Weave ops and models. Some potential features we're considering include:

- Pushing models to cloud platforms
- Managing different deployment environments
- Exploring ways to automate parts of the deployment process

Please note that these tools are still in very early stages of development. They may not be fully functional, could change significantly, or might be discontinued. We recommend using them for experimental purposes only at this time.
@@ -1,8 +1,30 @@
# Tracking
# Tracing

Weave track and versions objects and function calls.
Weave provides powerful tracing capabilities to track and version objects and function calls in your applications. This comprehensive system enables better monitoring, debugging, and iterative development of AI-powered applications, allowing you to "track insights between commits."

- [objects](/guides/tracking/objects): Weave's extensible serialization layer versions Python objects.
- [ops](/guides/tracking/ops): A Weave op is a function that is automatically versioned and tracked
- [tracing](/guides/tracking/tracing)
- [feedback](/guides/tracking/feedback): Annotate calls with emojis and notes
## Key Tracing Features

Weave's tracing functionality comprises three main components:

### Calls

[Calls](/guides/tracking/tracing) trace function calls, inputs, and outputs, enabling you to:
- Analyze data flow through your application
- Debug complex interactions between components
- Optimize application performance based on call patterns
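For example, a minimal hedged sketch of a traced call (the function, its logic, and the project name are purely illustrative):

```python
import weave

weave.init("my-project")  # illustrative project name

@weave.op()
def extract_keywords(text: str) -> list[str]:
    # Toy logic, used only to have something to trace.
    return [word for word in text.lower().split() if len(word) > 4]

# This invocation is recorded as a Call: the input text, the returned list,
# timing, and any raised exception are captured for inspection in the UI.
print(extract_keywords("Weave records calls so you can inspect them later"))
```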

### Ops

[Ops](/guides/tracking/ops) are automatically versioned and tracked functions (which produce Calls) that allow you to:
- Monitor function performance and behavior
- Maintain a record of function modifications
- Ensure experiment reproducibility
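A hedged sketch of how versioning might play out in practice (the function name and scoring logic are invented for illustration): when an op's code changes, Weave records the new code as a new version of the same op rather than overwriting the old one.

```python
import weave

weave.init("my-project")  # illustrative project name

@weave.op()
def score_answer(answer: str) -> float:
    # First version of the op: a naive length-based score.
    return min(len(answer) / 100, 1.0)

score_answer("a short answer")

@weave.op()
def score_answer(answer: str) -> float:
    # Redefined with different code: Weave tracks this as a new version
    # of score_answer, keeping the earlier version and its calls intact.
    bonus = 0.2 if "because" in answer else 0.0
    return min(len(answer) / 100 + bonus, 1.0)

score_answer("a short answer, because reasons")
```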

### Objects

[Objects](/guides/tracking/objects) form Weave's extensible serialization layer, automatically versioning runtime objects (often the inputs and outputs of Calls). This feature allows you to:
- Track changes in data structures over time
- Maintain a clear history of object modifications
- Easily revert to previous versions when needed
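As a small sketch (the object contents and names are illustrative), `weave.publish` stores a runtime object and hands back a versioned reference; publishing a changed object under the same name creates a new version:

```python
import weave

weave.init("my-project")  # illustrative project name

system_prompt = {
    "role": "system",
    "content": "You are a terse assistant. Answer in one sentence.",
}

# Publish the object; the returned reference points at this exact version.
ref = weave.publish(system_prompt, name="system-prompt")
print(ref)

# Publishing a modified object under the same name records a new version,
# so earlier versions remain retrievable.
system_prompt["content"] = "You are a verbose assistant."
weave.publish(system_prompt, name="system-prompt")
```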

By leveraging these tracing capabilities, you can gain deeper insights into your application's behavior, streamline your development process, and build more robust AI-powered systems.