From eb4258932c7214f9d0911576a7d1ffbafcce98f7 Mon Sep 17 00:00:00 2001
From: Rohan Sharma
Date: Mon, 11 Nov 2024 13:57:07 +0000
Subject: [PATCH] added new integration to cookbook

---
 .../integration_anthropic_joke_example.ipynb | 172 ++++++++++++++++++
 1 file changed, 172 insertions(+)
 create mode 100644 cookbook/integration_anthropic_joke_example.ipynb

diff --git a/cookbook/integration_anthropic_joke_example.ipynb b/cookbook/integration_anthropic_joke_example.ipynb
new file mode 100644
index 000000000..5bf0f47e0
--- /dev/null
+++ b/cookbook/integration_anthropic_joke_example.ipynb
@@ -0,0 +1,172 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "---\n",
+    "description: Cookbook with examples of the Langfuse Integration for Traceloop SDK and Anthropic (Python).\n",
+    "category: Integrations\n",
+    "---\n",
+    "\n",
+    "# Cookbook: Traceloop SDK Integration with Anthropic (Python)\n",
+    "\n",
+    "This cookbook provides step-by-step examples of integrating Langfuse with the Traceloop SDK and the Anthropic API in Python. By following these examples, you'll learn how to log and trace interactions with Anthropic's language models using Traceloop, enhancing transparency, debuggability, and performance monitoring of your AI-driven applications.\n",
+    "\n",
+    "---\n",
+    "\n",
+    "Note: Langfuse also offers native integrations with [LangChain](https://langfuse.com/docs/integrations/langchain/tracing), [LlamaIndex](https://langfuse.com/docs/integrations/llama-index/get-started), [LiteLLM](https://langfuse.com/docs/integrations/litellm/tracing), and [other frameworks](https://langfuse.com/docs/integrations/overview). If you use any of these frameworks, Langfuse instrumentation is immediately available.\n",
+    "\n",
+    "---"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Overview\n",
+    "\n",
+    "In this notebook, we will explore various use cases where Langfuse can be integrated with the Traceloop SDK and the Anthropic API, including:\n",
+    "\n",
+    "- **Basic Workflow Creation:** Learn how to wrap standard Anthropic model interactions with the Traceloop SDK to create workflows with logging and tracing.\n",
+    "- **Chained Function Calls:** See how to manage and observe workflows in which multiple model interactions are linked together to produce a final result (a short sketch of this pattern is included at the end of this notebook).\n",
+    "- **Asynchronous Support:** Discover how to use Langfuse and Traceloop with asynchronous and real-time model responses, ensuring that concurrent interactions are fully traceable.\n",
+    "- **External Function Calling:** Understand how to implement and observe external tool integrations, allowing the model to interact with custom functions and APIs.\n",
+    "\n",
+    "For more detailed guidance on the Traceloop SDK or the Anthropic API, please refer to the [Traceloop SDK documentation](https://traceloop.com/docs) and the [Anthropic Documentation](https://anthropic.com/docs).\n",
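+    "\n",
+    "## Environment Setup\n",
+    "\n",
+    "Before running the cells below, configure your credentials as environment variables. The snippet below is a minimal sketch: it assumes that the Traceloop SDK exports traces to Langfuse's OpenTelemetry endpoint via the `TRACELOOP_BASE_URL` and `TRACELOOP_HEADERS` variables. Please check the [Langfuse documentation](https://langfuse.com/docs) for the exact endpoint and header format for your region, and replace the placeholder keys with your own.\n",
+    "\n",
+    "```python\n",
+    "import os\n",
+    "import base64\n",
+    "\n",
+    "# Langfuse project keys (placeholders; copy yours from the Langfuse project settings)\n",
+    "LANGFUSE_PUBLIC_KEY = \"pk-lf-...\"\n",
+    "LANGFUSE_SECRET_KEY = \"sk-lf-...\"\n",
+    "LANGFUSE_AUTH = base64.b64encode(\n",
+    "    f\"{LANGFUSE_PUBLIC_KEY}:{LANGFUSE_SECRET_KEY}\".encode()\n",
+    ").decode()\n",
+    "\n",
+    "# Assumption: point the Traceloop SDK at Langfuse's OpenTelemetry endpoint (EU Cloud shown; adjust the host if needed)\n",
+    "os.environ[\"TRACELOOP_BASE_URL\"] = \"https://cloud.langfuse.com/api/public/otel\"\n",
+    "os.environ[\"TRACELOOP_HEADERS\"] = f\"Authorization=Basic {LANGFUSE_AUTH}\"\n",
+    "\n",
+    "# API key for the Anthropic client used in the workflow below\n",
+    "os.environ[\"ANTHROPIC_API_KEY\"] = \"sk-ant-...\"\n",
+    "```"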
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Dependencies\n",
+    "\n",
+    "To use this code, ensure you have the following packages installed:\n",
+    "\n",
+    "- `traceloop-sdk`: For creating workflows and managing trace data.\n",
+    "- `anthropic`: For interacting with Anthropic's Claude models.\n",
+    "\n",
+    "Install these packages using:\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "metadata": {
+    "id": "install-code-1"
+   },
+   "execution_count": null,
+   "outputs": [],
+   "source": [
+    "# Install required packages\n",
+    "%pip install traceloop-sdk anthropic\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Code Explanation\n",
+    "\n",
+    "The code in the next two cells performs the following actions:\n",
+    "\n",
+    "1. **Imports Required Libraries**: Imports the necessary components from Traceloop and Anthropic.\n",
+    "2. **Initializes Traceloop**: Prepares Traceloop for handling workflows by calling `Traceloop.init()`, which picks up exporter settings (such as the Langfuse endpoint configured earlier) from the environment.\n",
+    "3. **Defines a Workflow**: The `@workflow` decorator is used to define a workflow function (`joke_workflow`). This function initializes an Anthropic client and sends a request to Claude to generate a joke about OpenTelemetry.\n",
+    "4. **Executes the Workflow**: The `joke_workflow()` function is called to run the workflow and display the joke returned by Claude.\n"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "from anthropic import Anthropic\n",
+    "from traceloop.sdk import Traceloop\n",
+    "from traceloop.sdk.decorators import workflow\n",
+    "\n",
+    "# Initialize Traceloop; exporter settings such as the Langfuse endpoint are read from the environment\n",
+    "Traceloop.init()\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Define and Execute the Workflow\n",
+    "\n",
+    "The code below defines the `joke_workflow` function and uses Anthropic's API to generate a joke."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "@workflow(name=\"pirate_joke_generator\")\n",
+    "def joke_workflow():\n",
+    "    # Create an instance of the Anthropic client\n",
+    "    anthropic = Anthropic()\n",
+    "\n",
+    "    # Request a joke about OpenTelemetry\n",
+    "    response = anthropic.messages.create(\n",
+    "        max_tokens=1024,\n",
+    "        messages=[\n",
+    "            {\n",
+    "                \"role\": \"user\",\n",
+    "                \"content\": \"Tell me a joke about OpenTelemetry\",\n",
+    "            }\n",
+    "        ],\n",
+    "        model=\"claude-3-opus-20240229\",\n",
+    "    )\n",
+    "\n",
+    "    # Print the response (a list of content blocks)\n",
+    "    print(response.content)\n",
+    "    return response\n",
+    "\n",
+    "# Run the workflow\n",
+    "joke_workflow()"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Output\n",
+    "\n",
+    "When you run `joke_workflow()`, it prints `response.content`, a list of content blocks whose text contains the joke about OpenTelemetry generated by the Claude model. To print only the joke text, use `response.content[0].text`."
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Explanation of Key Components\n",
+    "\n",
+    "- **Traceloop SDK**: Used to create and manage workflows and to record traces for better observability.\n",
+    "- **Anthropic API**: Communicates with Anthropic's language models to generate responses based on input prompts.\n",
+    "- **Claude Model**: The `claude-3-opus-20240229` model from Anthropic, specified in the `messages.create` request, is a large language model optimized for conversation.\n",
+    "- **`max_tokens`**: This parameter sets an upper bound on the number of tokens in the generated response.\n",
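+    "\n",
+    "## Next Step: Chaining Multiple Calls\n",
+    "\n",
+    "The chained-workflow pattern mentioned in the overview builds on the same decorators. The sketch below is illustrative rather than part of the example above: it assumes the `@task` decorator from `traceloop.sdk.decorators` for the individual steps, and the function names and prompts are made up for this example.\n",
+    "\n",
+    "```python\n",
+    "from anthropic import Anthropic\n",
+    "from traceloop.sdk.decorators import task, workflow\n",
+    "\n",
+    "anthropic = Anthropic()\n",
+    "\n",
+    "@task(name=\"joke_creation\")\n",
+    "def create_joke() -> str:\n",
+    "    # Step 1: ask Claude for a joke about OpenTelemetry\n",
+    "    response = anthropic.messages.create(\n",
+    "        max_tokens=1024,\n",
+    "        messages=[{\"role\": \"user\", \"content\": \"Tell me a joke about OpenTelemetry\"}],\n",
+    "        model=\"claude-3-opus-20240229\",\n",
+    "    )\n",
+    "    return response.content[0].text\n",
+    "\n",
+    "@task(name=\"pirate_translation\")\n",
+    "def translate_to_pirate(joke: str) -> str:\n",
+    "    # Step 2: rewrite the joke in pirate speak\n",
+    "    response = anthropic.messages.create(\n",
+    "        max_tokens=1024,\n",
+    "        messages=[{\"role\": \"user\", \"content\": f\"Rewrite this joke in pirate speak: {joke}\"}],\n",
+    "        model=\"claude-3-opus-20240229\",\n",
+    "    )\n",
+    "    return response.content[0].text\n",
+    "\n",
+    "@workflow(name=\"chained_pirate_joke\")\n",
+    "def chained_joke_workflow() -> str:\n",
+    "    # Both tasks appear as nested spans under this workflow's trace\n",
+    "    joke = create_joke()\n",
+    "    return translate_to_pirate(joke)\n",
+    "\n",
+    "print(chained_joke_workflow())\n",
+    "```\n",
+    "\n",
+    "Asynchronous usage and external function (tool) calling follow the same decorator-based approach; refer to the Traceloop SDK documentation for the decorators and options available in your SDK version."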
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Questions?\n",
+    "\n",
+    "Join our [Discord community](https://langfuse.com/discord) in case you have any questions."
+   ]
+  }
+ ],
+ "metadata": {
+  "colab": {
+   "provenance": []
+  },
+  "kernelspec": {
+   "display_name": "Python 3",
+   "name": "python3"
+  },
+  "language_info": {
+   "name": "python"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 0
+}
\ No newline at end of file