
[Discussion] Inference Endpoints #13

Open
lucasvanmol opened this issue Dec 4, 2023 · 0 comments

Comments

@lucasvanmol
Owner

Currently, the project uses Hugging Face's "Inference API". It's a great free service, but it comes at the cost of waiting for servers to be provisioned before a model can respond.
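As a hypothetical illustration of what that provisioning wait means for callers: while a model is still loading, an Inference API-style endpoint answers HTTP 503 (often with an `estimated_time` hint), so clients typically poll with a short backoff. A minimal sketch, with the request function injected so the retry logic stands alone (the callable, status codes, and timings here are assumptions for illustration, not this project's actual client code):

```python
import time

def query_with_retry(send_request, max_retries=5, default_wait=2.0):
    """Poll an Inference API-style endpoint until the model is provisioned.

    `send_request` is any callable returning (status_code, payload).
    While the model is still loading, the endpoint is assumed to answer
    503 with an optional "estimated_time" hint; we sleep and retry.
    """
    status, payload = None, None
    for _ in range(max_retries):
        status, payload = send_request()
        if status != 503:
            return status, payload  # loaded (or a non-retryable error)
        wait = payload.get("estimated_time", default_wait)
        time.sleep(min(wait, default_wait))  # cap so retries stay bounded
    return status, payload
```

With a paid, always-on endpoint this loop becomes unnecessary, which is the main appeal of the move discussed below.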

An option would be to move to https://huggingface.co/inference-endpoints, a paid service, which would allow high uptime and potentially faster inference. This could be offered through some sort of low-cost subscription (I'm hoping less than a few euros a month), if there is enough interest.

If you would be interested in this, please react to this post. Feel free to also use this issue as a place to discuss the idea or ask questions.
