
Add combobox to switch model #1

Open
samy80 opened this issue Sep 18, 2024 · 9 comments

@samy80

samy80 commented Sep 18, 2024

Hi! Very nice work you did with this plugin. Haven't you managed to package it for easy install?

I think it could be interesting to allow the user to switch models in the chat, or to have the same query re-evaluated by another model for alternatives.

@jukofyork
Owner

Yeah, I am planning to allow the models to be changed via a drop-down selection just above the "user chat area", as I too have found myself changing models mid-discussion to take advantage of the strengths of different models (particularly now we have GPT-o1).

I should also point out that I completely ditched the Ollama stuff after finding so many bugs that I wasn't really sure what was getting sent back to the models :/ So the images you'll see in the readme are out of date and it currently just takes an OpenAI endpoint and key again... You can still use it with Ollama via its OpenAI-compatible API, but it doesn't get the model list or rely on the Ollama API anymore (I currently use it via: llama.cpp server for local models, and openai and/or openrouter for non-local models).
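
To give a concrete idea of what "OpenAI-compatible" means here, a minimal sketch (assuming the openai Python package, Ollama on its default port, and a placeholder model name; a llama.cpp server works the same way with a different base URL):

```python
# Minimal sketch: talking to Ollama through its OpenAI-compatible endpoint
# instead of the native Ollama API. Assumes openai>=1.0 and that the model
# named below has already been pulled; adjust to your own setup.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # llama.cpp server would be e.g. http://localhost:8080/v1
    api_key="ollama",                      # local servers ignore the key, but it must be non-empty
)

response = client.chat.completions.create(
    model="llama3",  # placeholder model name
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```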

The reason I never packaged it is that I didn't really want to step on the toes of the original author :) I really just wrote this for myself and in the process stripped away a lot of his Java-specific stuff so I could use it with other languages (mainly C++, but I have recently used it a lot with Python). I just got carried away and added lots of other stuff: templated prompts, compare editor, spell checking and so on :D

It looks like he may have abandoned his project now, so if that is the case I will try and polish it up more and make a proper release version people can use.

@samy80
Author

samy80 commented Sep 18, 2024

Don't abandon Ollama. It's a very viable alternative to commercial models. Plus lots of new models popping up every day. Definitely worth it. Also confidentiality... Maybe have a parallel dev branch? (indeed, I couldn't find the model list as shown in your screenshots...)
And definitely go for packaging the plugin, please!

@jukofyork
Owner

Don't abandon Ollama. It's a very viable alternative to commercial models. Plus lots of new models popping up every day. Definitely worth it. Also confidentiality... Maybe have a parallel dev branch? (indeed, I couldn't find the model list as shown in your screenshots...) And definitely go for packaging the plugin, please!

Don't worry, it will still work with Ollama via Ollama's "OpenAI-compatible API" endpoint; it just doesn't use the old/buggy Ollama API endpoint any more! :)

@samy80
Author

samy80 commented Sep 18, 2024

Sure, but what about the list of models?

@jukofyork
Owner

Sure, but what about the list of models?

Most of the OpenAI-API-compatible endpoints have a way to get the model list (usually v1/models), so it shouldn't be a problem.
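
A minimal sketch of what that lookup would look like (assuming the requests Python package; the base URL and key are placeholders for whatever endpoint is configured):

```python
# Minimal sketch: fetching the model list from an OpenAI-compatible
# endpoint via GET /v1/models and printing the model ids.
import requests

base_url = "https://openrouter.ai/api/v1"  # or e.g. http://localhost:11434/v1 for Ollama
api_key = "YOUR_API_KEY"                   # local servers usually accept any value

resp = requests.get(
    f"{base_url}/models",
    headers={"Authorization": f"Bearer {api_key}"},
    timeout=10,
)
resp.raise_for_status()

for model in resp.json()["data"]:
    print(model["id"])
```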

@jukofyork
Owner

I've got a couple of other things on the go but being able to select models is definitely my next priority: I'm currently using Claude Sonnet 3.5 via openrouter, GPT-4-Turbo via OpenAI and a couple of other models via llama.cpp, and having to switch these is a lot of hassle!

@jukofyork
Owner

jukofyork commented Nov 7, 2024

I've added back the ability to get the model list, but I've only tested it on openrouter and openai, so I'm not sure if it works with ollama or llama.cpp's server...

Beware: the settings page is very buggy unless you quit and reopen it, and Apply and Restore Defaults don't work properly a lot of the time (I'll try to get to the bottom of this later).

@Sam-candice

I'll test it here on ollama and let you know.

Are you making any progress on packaging as a plugin? It's just cumbersome having two instances of Eclipse always running...

@jukofyork
Owner

I'll test it here on ollama and let you know.

I'd hold off as the current method isn't really usable at all:

#4

Are you making any progress on packaging as a plugin? It's just cumbersome having two instances of Eclipse always running...

I'll look into it after I've done the last few things on my ToDo list.
