
Releases: iansinnott/prompta

Prompta v5.1.0

30 Oct 08:13

Image Support

[Screenshot: an image attached to a chat thread]

You can now interact with multi-modal models by adding images to your chat threads.


Full Changelog: v4.1.7...v5.1.0

Prompta v5.0.1

29 Oct 13:59

See the assets to download and install this version.

v5.0.0

29 Oct 13:47

What's Changed

  • fix cmd+n not working in tauri by @nikvdp in #44
  • Add HOST and BODYLIMIT env vars for sync server by @struanb in #47
  • fix all instances of $page.url.pathname checks for tauri by @nikvdp in #45
  • Make sync server endpoints path-relative by @struanb in #48
  • tauri 2 by @iansinnott in #51
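The new HOST and BODYLIMIT environment variables from PR #47 let you configure where the sync server binds and how large request bodies it accepts. A minimal sketch of reading them, assuming the variable names from the PR; the default values shown here are hypothetical placeholders, not Prompta's actual defaults:

```python
# Sketch: reading the sync server's HOST and BODYLIMIT settings from the
# environment. Variable names are from PR #47; the fallback defaults below
# are hypothetical, not taken from the Prompta source.
import os

def sync_server_config(env=os.environ):
    return {
        "host": env.get("HOST", "127.0.0.1"),       # hypothetical default
        "body_limit": env.get("BODYLIMIT", "1mb"),  # hypothetical default
    }

cfg = sync_server_config({"HOST": "0.0.0.0", "BODYLIMIT": "10mb"})
```

You would set these in the sync server's environment (e.g. `HOST=0.0.0.0 BODYLIMIT=10mb` before launching it) to listen on all interfaces and allow larger sync payloads.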

What broke?

This is a major version bump, but only because I bumped the major versions of Tauri, Svelte, and SvelteKit. There may be breakage I haven't found yet, and I don't want anyone to be caught off guard.


Full Changelog: v4.1.7...v5.0.0

Prompta v4.1.7

04 Oct 14:50

See the assets to download and install this version.

v4.1.6

04 Oct 14:41

What's Changed

  • make sure model picker uses dark mode by @nikvdp in #42
  • updated dependencies
  • removed experimental vector features

Full Changelog: v4.1.5...v4.1.6

Prompta v4.1.1

29 Jan 13:01

See the assets to download and install this version.

Prompta v4.1.0

29 Jan 06:49

Feature Flags

The app now includes feature flags. The main motivation is to be able to ship experimental, potentially breaking changes without breaking the app for anyone. These experiments will be opt-in until they seem stable enough to roll out without a flag.

Experiments

Along with the feature flags, this release adds a new experiments page with various things that might be of interest. The first experiment is browser-based vector search over chat history.

UI Improvements

Small UI improvements. Most notably the controls in the chat window are more compact.

[Screenshot: the more compact chat window controls]


Full Changelog: v4.0.2...v4.1.0

Prompta v4.0.2

05 Jan 02:29

This fixes a CORS issue related to using 3rd-party APIs (custom providers). See the prior release notes for more details: https://github.com/iansinnott/prompta/releases/tag/v4.0.0

See the assets to download and install this version.

v4.0.0

05 Jan 01:48

More LLMs

This release brings two main features:

  • Improved support for 3rd party LLM providers (Fireworks, Mistral, LiteLLM, etc)
  • Prompta-provided LLMs to help users get started

What's new

[Video: demo of the new features]

[Video: using Mistral as an LLM provider]

3rd party LLMs

There are currently many LLM providers offering OpenAI-compatible APIs, and Prompta should be able to use any of them easily. Previously the experience was not great: you could use 3rd-party LLM APIs, but you had to fiddle with settings to use a custom provider and then undo those settings to get back to OpenAI. Now LLM providers can all coexist, and you can add as many as you like.
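What makes this interchangeability possible is that OpenAI-compatible providers all accept the same chat-completions request shape; only the base URL, API key, and model name change. A minimal sketch of that idea (the base URL and model name below are illustrative examples, not values taken from Prompta):

```python
# Sketch: the same OpenAI-style chat-completions request works against any
# compatible provider. Only the base URL, API key, and model name differ.
# The Mistral URL and model name below are illustrative, not from Prompta.
def build_chat_request(base_url, api_key, model, messages):
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": {"model": model, "messages": messages},
    }

req = build_chat_request(
    "https://api.mistral.ai/v1",   # swap in any OpenAI-compatible base URL
    "sk-...",                      # the provider's API key
    "mistral-small",
    [{"role": "user", "content": "Hello"}],
)
```

Because the request shape is shared, supporting a new provider is mostly a matter of storing another base URL and key, which is why multiple providers can now coexist in the app.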

Prompta LLMs

For some time I've wanted to improve the first-time user experience of Prompta by making it possible to chat with an LLM immediately, without having to bring your own key. For existing users this is no big deal: we have OpenAI accounts and know how to generate API keys. But it's quite a lot of hassle for some people. I want to be able to introduce friends and family to AI by sending them to Prompta; if they have to sign up for OpenAI at the same time, it's a non-starter.

There are lots of competing LLM providers now, so I've set up an endpoint that provides free LLM access to users without requiring sign-up. Presumably almost no one will use the app, so it won't be cost-prohibitive on my end, but I may have to revisit this if the API starts getting hammered.

For now this just means that all you need to use Prompta is to open it in a browser tab.

I recognize existing users probably don't care about this, so the Prompta-provided LLMs can also be disabled in the settings. That gives the same experience as before: access to OpenAI via your own key.

What's Changed

  • Allow arbitrary LLM providers, Prompta-provided LLMs by @iansinnott in #29

Full Changelog: v3.3.0...v4.0.0

Prompta v3.3.0

20 Dec 06:43

See the assets to download and install this version.