"
-```
-
-:::info Version compatibility
-
-`xef-scala` is currently only available for Scala 3.
-
-:::
-
-The Scala module depends on project [Loom](https://openjdk.org/projects/loom/),
-so you will need at least Java 19 to use the library. Furthermore, you need to pass
-the `--enable-preview` flag.
-
-When running through sbt, pass the flag on the command line:
-
-```shell
-sbt -J--enable-preview
-```
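-
-Alternatively, you can set the flag once in your build so that a plain `sbt run`
-works. A minimal sketch for `build.sbt` (assuming you fork the JVM for `run`):
-
-```scala
-// Fork a separate JVM for `run` so JVM flags can be passed to it
-Compile / run / fork := true
-// Enable Loom preview features on the forked JVM
-Compile / run / javaOptions += "--enable-preview"
-```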
-
-When running from IntelliJ IDEA:
-
-- Set the Java version to at least 19.
-- Set VM options to `--enable-preview`.
-
-By default, the `conversation` block connects to [OpenAI](https://platform.openai.com/).
-To use their services, you should provide the corresponding API key in the `OPENAI_TOKEN`
-environment variable, and make sure your account has enough credits.
-
-When running through sbt:
-
-```shell
-env OPENAI_TOKEN=<your-token> sbt -J--enable-preview
-```
-
-When running from IntelliJ IDEA, set the environment variable `OPENAI_TOKEN=xxx`
-in the run configuration.
-
-:::caution
-
-This library may transmit source code and potentially user input data to third-party services as part of its functionality.
-Developers integrating this library into their applications should be aware of this behavior and take necessary precautions to ensure that sensitive data is not inadvertently transmitted.
-Read our [_Data Transmission Disclosure_](https://github.com/xebia-functional/xef#%EF%B8%8F-data-transmission-disclosure) for further information.
-
-:::
-
-## Your first prompt
-
-After adding the library to your project,
-you get access to the `conversation` function, which is your gateway to the modern AI world.
-Inside it, you can _prompt_ for information, which means posing a question to an LLM
-(Large Language Model). The easiest way is to get the information back as a string.
-
-```scala
-import com.xebia.functional.xef.scala.conversation.*
-
-@main def runBook: Unit = conversation {
- val topic: String = "functional programming"
- val result = promptMessage(s"Give me a selection of books about $topic")
- println(result)
-}
-```
-
-## Structure
-
-The output from the prompt above may be hard to parse back from the
-string we obtain. Fortunately, you can also ask xef.ai to give you back the information
-using a _custom type_. The library takes care of instructing the LLM on building such
-a structure, and deserializes the result back for you.
-
-```scala
-import com.xebia.functional.xef.scala.conversation.*
-import io.circe.Decoder
-import com.xebia.functional.xef.prompt.Prompt
-
-case class Book(name: String, author: String, summary: String) derives SerialDescriptor, Decoder
-
-def summarizeBook(title: String, author: String)(using conversation: ScalaConversation): Book =
- prompt(Prompt(s"$title by $author summary."))
-
-@main def runBook: Unit =
- conversation {
- val toKillAMockingBird = summarizeBook("To Kill a Mockingbird", "Harper Lee")
- println(s"${toKillAMockingBird.name} by ${toKillAMockingBird.author} summary:\n ${toKillAMockingBird.summary}")
- }
-```
-
-xef.ai for Scala uses `xef-core`, which is written in Kotlin. Hence, the core
-reuses [Kotlin's serialization](https://kotlinlang.org/docs/serialization.html), while the
-Scala module uses [circe](https://github.com/circe/circe) to derive the required serialization
-instances. The LLM is usually able to detect which kind of information should
-go in each field based on its name (like `name` and `author` above).
-
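-You can also pass the target type explicitly instead of relying on the expected
-return type; a small sketch reusing the `Book` class above:
-
-```scala
-// Explicit type parameter, as an alternative to an annotated return type
-def summarizeBookExplicit(title: String, author: String)(using ScalaConversation): Book =
-  prompt[Book](Prompt(s"$title by $author summary."))
-```
-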
-## @Description annotations
-
-For those cases where the field names alone do not give the LLM enough guidance,
-you can add hints with the `@Description` annotation:
-
-```scala
-import com.xebia.functional.xef.scala.conversation.*
-import io.circe.Decoder
-import com.xebia.functional.xef.prompt.Prompt
-
-@Description("A book")
-case class Book(
- @Description("the name of the book") name: String,
- @Description("the author of the book") author: String,
- @Description("A 50 word paragraph with a summary of this book") summary: String
- ) derives SerialDescriptor, Decoder
-
-def summarizeBook(title: String, author: String)(using conversation: ScalaConversation): Book =
- prompt(Prompt(s"$title by $author summary."))
-
-@main def runBook: Unit =
- conversation {
- val toKillAMockingBird = summarizeBook("To Kill a Mockingbird", "Harper Lee")
- println(s"${toKillAMockingBird.name} by ${toKillAMockingBird.author} summary:\n ${toKillAMockingBird.summary}")
- }
-```
-
-
-## Context
-
-LLMs have knowledge about a broad variety of topics. But by construction they are not able
-to respond to questions about information not available in their training set. However, you
-often want to supplement the LLM with more data:
-- Transient information referring to the current moment, like the current weather, or
- the trends in the stock market in the past 10 days.
-- Non-public information, for example for summarizing a piece of text you're creating
-  within your organization.
-
-These additional pieces of information are called the _context_ in xef.ai, and they are
-attached to every question to the LLM. Although you can add arbitrary strings to the
-context at any point, the most common usage pattern is to use an _agent_ to consult an
-external service and make its response part of the context. One such agent is `search`,
-which uses a web search service to enrich that context.
-
-```scala
-import com.xebia.functional.xef.reasoning.serpapi.Search
-import com.xebia.functional.xef.scala.conversation.*
-import com.xebia.functional.xef.conversation.llm.openai.OpenAI
-import io.circe.Decoder
-import com.xebia.functional.xef.prompt.Prompt
-
-private final case class MealPlanRecipe(name: String, ingredients: List[String]) derives SerialDescriptor, Decoder
-
-private final case class MealPlan(name: String, recipes: List[MealPlanRecipe]) derives SerialDescriptor, Decoder
-
-@main def runMealPlan: Unit =
- conversation {
- val search = Search(OpenAI.FromEnvironment.DEFAULT_CHAT, summon[ScalaConversation], 3)
- addContext(search.search("gall bladder stones meals").get())
- val mealPlan = prompt[MealPlan](Prompt("Meal plan for the week for a person with gall bladder stones that includes 5 recipes."))
- println(mealPlan)
- }
-```
-
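-If you already have the extra information at hand, you can also add it as plain
-strings without going through an agent. A minimal sketch, assuming `addContext`
-also accepts a raw string (the fact below is made up for illustration):
-
-```scala
-import com.xebia.functional.xef.scala.conversation.*
-
-@main def runWithContext: Unit =
-  conversation {
-    // Hypothetical internal knowledge that the model cannot know about
-    addContext("The office cafeteria serves a low-fat menu on Mondays and Thursdays.")
-    println(promptMessage("On which days does the cafeteria serve a low-fat menu?"))
-  }
-```
-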
-:::note Better vector stores
-
-The underlying mechanism of the context is a _vector store_, a data structure which
-saves a set of strings and can find the ones most similar to a given query.
-By default, xef.ai uses an _in-memory_ vector store, since it provides maximum
-compatibility across platforms. However, if you foresee your context growing beyond
-a few hundred elements, you may consider switching to an alternative such as
-Lucene or PostgreSQL.
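-
-Conceptually, a vector store offers a contract roughly like the following trait
-(an illustrative sketch only, not the actual xef.ai API):
-
-```scala
-trait VectorStore:
-  /** Embed the given texts and store them for later retrieval. */
-  def add(texts: List[String]): Unit
-  /** Return the `k` stored texts most similar to the query. */
-  def similar(query: String, k: Int): List[String]
-```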
-
-:::
diff --git a/docusaurus.config.js b/docusaurus.config.js
index d9877af..45f1368 100644
--- a/docusaurus.config.js
+++ b/docusaurus.config.js
@@ -110,28 +110,9 @@ const createConfig = async () => {
position: 'right',
},
{
- type: 'dropdown',
label: 'Quickstart',
- position: 'right',
to: '/learn/quickstart',
- items: [
- {
- label: 'Kotlin',
- to: '/learn/quickstart/kotlin',
- },
- {
- label: 'Scala',
- to: '/learn/quickstart/scala',
- },
- {
- label: 'Java',
- to: '/learn/quickstart/java',
- },
- {
- label: 'Examples',
- to: '/learn/examples',
- },
- ],
+ position: 'right',
},
{
type: 'dropdown',
@@ -171,23 +152,6 @@ const createConfig = async () => {
},
],
},
- {
- title: 'Quickstart',
- items: [
- {
- label: 'Kotlin',
- to: '/learn/quickstart/kotlin',
- },
- {
- label: 'Scala',
- to: '/learn/quickstart/scala',
- },
- {
- label: 'Examples',
- to: '/learn/examples',
- },
- ],
- },
{
title: 'Integrations',
items: [
diff --git a/src/pages/index.tsx b/src/pages/index.tsx
index 822ccd1..a0d5df0 100644
--- a/src/pages/index.tsx
+++ b/src/pages/index.tsx
@@ -39,28 +39,6 @@ export default function Home(): JSX.Element {
Discover its potential
-
-
- {`package examples
-
-import com.xebia.functional.xef.scala.conversation.*
-import io.circe.Decoder
-import com.xebia.functional.xef.prompt.Prompt
-
-private final case class TouristAttraction(name: String, location: String, history: String) derives SerialDescriptor, Decoder
-
-@main def runTouristAttraction: Unit = conversation {
- val statueOfLiberty: TouristAttraction = prompt(Prompt("Statue of Liberty location and history."))
- println(
- s"""
- |\${statueOfLiberty.name} is located in \${statueOfLiberty.location} and has the following history:
- |\${statueOfLiberty.history}
- """.stripMargin
- )
-}
-`}
-
-
{`package examples
@@ -81,36 +59,6 @@ suspend fun main() =
.trimMargin()
)
}
-`}
-
-
-
-
- {`package example;
-
-import com.xebia.functional.xef.conversation.*;
-import com.xebia.functional.xef.conversation.llm.openai.OpenAI;
-import com.xebia.functional.xef.prompt.Prompt;
-
-import java.util.concurrent.ExecutionException;
-
-public class TouristAttractions {
-
- public record TouristAttraction(String name, String location, String history) {}
-
- public static void main(String[] args) throws ExecutionException, InterruptedException {
- try (var scope = OpenAI.conversation()) {
- var statueOfLiberty = scope.prompt(
- OpenAI.FromEnvironment.DEFAULT_SERIALIZATION,
- new Prompt("Statue of Liberty location and history."),
- TouristAttraction.class
-      ).get();
- System.out.println(
-          statueOfLiberty.name() + " is located in " + statueOfLiberty.location() +
-          " and has the following history: " + statueOfLiberty.history()
- );
- }
- }
`}