Ramalama Container needs updating on the quay.io to use new llama-simple-chat #458
Comments
Yeah, we really need to automate our container builds somehow. I'll be away for the next while... @rhatdan can we hold off on new pip releases, etc. until we have rebuilt and pushed all our containers? The llama-simple-chat binary will probably change again, to ramalama-core or some other name. I'm discussing with the llama.cpp folk how we are going to do that.
Long-term we probably have to version all our containers: ramalama:latest (basically rawhide, expect breakages). The other thing to worry about is cleaning up the old container images when one upgrades.
We have been doing this for a while for the built-in Arch Linux and Ubuntu images for Toolbx: we rebuild the images every Monday, even if their sources haven't changed, and we rebuild them whenever their sources do change.
Care to open a new PR to add support for this on RamaLama?
It was @travier who did all the heavy lifting, and I still have one Podman pull request that I should wrap up before I can get to this one. In short, you can start by copy-pasting one of the YAML files from https://github.com/containers/toolbox/tree/main/.github/workflows and replacing the specifics, like the paths to the image sources, etc. It's simple, but one needs to devote some time to iron out the annoying details, and I am struggling to find the time to own that work. :/
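The Toolbx approach described above (a weekly rebuild plus rebuilds on source changes) could be sketched as a scheduled GitHub Actions workflow roughly like the following. Note this is a minimal illustration, not one of the actual Toolbx workflow files: the registry path `quay.io/example/ramalama`, the `container-images/**` source path, and the job details are placeholder assumptions.

```yaml
# Hypothetical sketch of a scheduled image rebuild, modeled on the
# Toolbx workflows; image name and paths are placeholders.
name: Rebuild container images
on:
  schedule:
    - cron: "0 0 * * 1"        # every Monday, even if sources are unchanged
  push:
    paths:
      - "container-images/**"  # also rebuild when the image sources change
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build image
        run: podman build -t quay.io/example/ramalama:latest container-images/
      - name: Push image
        run: podman push quay.io/example/ramalama:latest
```

The cron trigger covers the "rebuild every Monday regardless" case, while the `push` path filter covers rebuilds when the image sources themselves change.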
The original problem reported here went away because the change to
We will move to llama-run next time, I guess. That was recently upstreamed to llama.cpp. We need to version our images at some point; having only "latest" images really restricts our ability to make breaking changes.
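As a minimal sketch of what versioned tags could look like, one option is date-based tags pushed alongside `:latest`. The registry path `quay.io/example/ramalama` and the `YYYY.MM.DD` scheme below are assumptions for illustration, not a decided convention:

```shell
# Hypothetical: derive a date-based tag so consumers can pin a build
# instead of tracking :latest (which may contain breaking changes).
tag="$(date -u +%Y.%m.%d)"
image="quay.io/example/ramalama:${tag}"
echo "${image}"

# Then the CI job would retag and push, e.g.:
#   podman tag quay.io/example/ramalama:latest "${image}"
#   podman push "${image}"
```

Pinning to a dated tag lets users upgrade deliberately, and old dated images can later be pruned on a schedule.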
Hey @ericcurtin, I was testing out the new changes and I noticed the ramalama container file on quay.io needs to be updated to have llama-simple-chat.
Everything worked when I built the container from scratch. Thanks!