
Commit

wip
Raunak Chowdhuri committed Aug 8, 2023
1 parent 9a558d8 commit 0b593f7
Showing 10 changed files with 1,019 additions and 444 deletions.
15 changes: 0 additions & 15 deletions docs/pages/_meta.json

This file was deleted.

3 changes: 0 additions & 3 deletions docs/pages/about.mdx

This file was deleted.

3 changes: 0 additions & 3 deletions docs/pages/advanced.mdx

This file was deleted.

31 changes: 0 additions & 31 deletions docs/pages/another.mdx

This file was deleted.

74 changes: 67 additions & 7 deletions docs/pages/index.mdx
@@ -1,11 +1,71 @@
# XML AI

Forget Elon's xAI. Introducing XML AI, the fastest and most ergonomic way to get structured input and output with your large language model.

Just write your prompt in JSON and we'll automatically convert it to XML that the model can work with. The best part? You can [stream the response](/streaming) back as a JSON object in real time. No more sacrificing streaming for function-calling or schema-following capabilities. Built and optimized for Anthropic's Claude models, with OpenAI support as well.

## Getting Started

### Python

```python
import anthropic

from xmlai.llm import anthropic_prompt

# Reads ANTHROPIC_API_KEY from the environment.
client = anthropic.Anthropic()

prompt = anthropic_prompt(
    {
        "question": "what is the answer to the ultimate question of life?",
        "reference": "The Hitchhiker's Guide to the Galaxy",
    },
    response_root_tag="answer",
)

completion = client.completions.create(
    model="claude-instant-1",
    max_tokens_to_sample=300,
    temperature=0.1,
    **prompt,
)

completion.completion  # 42
```

The generated prompt looks like this:
```json
{
  "prompt": "\n\nHuman:<question>what is the answer to the ultimate question of life?</question>\n<reference>The Hitchhiker's Guide to the Galaxy</reference>\n\nAssistant:<answer>",
  "stop_sequences": [
    "</answer>"
  ]
}
```

Note that we feed the opening tag to the beginning of the assistant's response! This, combined with using the closing tag as a stop sequence, almost always ensures that the response is valid XML.
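
To make the mechanics concrete, here is a minimal sketch (not part of the library; the `parse_completion` helper is hypothetical) of how a caller could recover a well-formed document: because generation starts right after the injected `<answer>` tag and stops at the `</answer>` stop sequence, the completion contains only the inner text, and re-wrapping it in the root tag yields parseable XML.

```python
import xml.etree.ElementTree as ET

def parse_completion(inner_text: str, root_tag: str = "answer") -> str:
    """Hypothetical helper: re-wrap Claude's completion in the root tag and parse it.

    The completion holds only the text between the tags, because the prompt
    ended with "<answer>" and generation stopped at the "</answer>" stop sequence.
    """
    xml_doc = f"<{root_tag}>{inner_text}</{root_tag}>"
    return ET.fromstring(xml_doc).text or ""

parse_completion("42")  # "42", now known to come from a well-formed document
```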

## TypeScript

```typescript
import Anthropic from "@anthropic-ai/sdk";
import { anthropicPrompt } from "xmlai";

// Reads ANTHROPIC_API_KEY from the environment.
const anthropic = new Anthropic();

const prompt = anthropicPrompt(
  {
    question: "what is the answer to the ultimate question of life?",
    reference: "The Hitchhiker's Guide to the Galaxy",
  },
  "answer"
);

const completion = await anthropic.completions.create({
  model: "claude-instant-1",
  max_tokens_to_sample: 300,
  temperature: 0.1,
  ...prompt,
});

completion.completion; // 42
```


## Why another prompting library?
Anthropic's LLM Claude is trained on lots and lots of XML data, and it is quite good at following XML schemas. In fact, at the [Anthropic Hackathon](https://twitter.com/sauhaarda/status/1685892051043508224?s=20), the prompting workshop presented some extra tips on how to get the best out of Claude when it comes to XML. I incorporated those tricks into this library to make it easier for others to take advantage of them.

Also, [the regex for dealing with XML streams](https://github.com/sauhaardac/xmlai/blob/9a558d855e6b4e64f933599a249a0864c41eb273/python/src/xmlai/__init__.py#L17C41-L17C41) is surprisingly grotesque. I figured I'd limit the monstrosity to one codebase where it can be tested and maintained.
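
For a sense of the problem, here is a deliberately simplified sketch (not the library's actual implementation) of turning a partial XML stream into a dict as chunks arrive: the parser has to tolerate tags that are still open, or even cut off mid-tag, at every step.

```python
import re

# Simplified illustration only -- the real xmlai regex handles many more edge cases.
PARTIAL_TAG = re.compile(r"<(\w+)>(.*?)(?:</\1>|$)", re.DOTALL)

def partial_xml_to_dict(buffer: str) -> dict:
    """Parse whatever complete or still-streaming top-level tags are in the buffer."""
    # Drop a trailing, incomplete tag fragment such as "<refer" or "</answ".
    buffer = re.sub(r"<[^>]*$", "", buffer)
    return {tag: text for tag, text in PARTIAL_TAG.findall(buffer)}

# As chunks arrive, the same buffer re-parses into progressively fuller JSON:
partial_xml_to_dict("<answer>The answer is")              # {'answer': 'The answer is'}
partial_xml_to_dict("<answer>The answer is 42</a")        # {'answer': 'The answer is 42'}
partial_xml_to_dict("<answer>The answer is 42</answer>")  # {'answer': 'The answer is 42'}
```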

The library is designed to be as lightweight as possible.
2 changes: 2 additions & 0 deletions docs/pages/python.mdx
@@ -0,0 +1,2 @@
# Python

Empty file added docs/pages/streaming.mdx
Empty file.
Empty file added docs/pages/typescript.mdx
Empty file.