LLM (like Ollama) integration to generate summaries automatically with a demo #3333
easyselfhost started this conversation in Ideas
Hi,
I hope you're doing well! I'm a YouTuber covering self-hosting topics on my channel. I'm reaching out with a feature proposal: integrating self-hosted (or, if users prefer, online) large language models (LLMs) into Memos, specifically for generating summaries of documents and notes.
Here is the video demo: https://youtu.be/VcJCUMpp_hw?si=sMwxvsq1UBfMmuUZ&t=378
I've implemented this feature on my fork for the video. It currently talks to an Ollama endpoint, but the approach could support other models and endpoints too.
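To illustrate the idea, here is a minimal sketch of calling Ollama's `/api/generate` endpoint to summarize a note. It assumes a local Ollama server on the default port (11434); the model name and prompt wording are illustrative, not what my fork necessarily uses.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumption: standard install, port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(note_text: str, model: str = "llama3") -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        # Illustrative prompt; the exact wording would be configurable.
        "prompt": f"Summarize the following note in one short paragraph:\n\n{note_text}",
        "stream": False,  # ask for the full summary in a single response
    }


def summarize(note_text: str, model: str = "llama3") -> str:
    """POST the note to Ollama and return the generated summary text."""
    data = json.dumps(build_payload(note_text, model)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # Ollama returns the generated text in the "response" field.
        return json.loads(resp.read())["response"]
```

Swapping in another backend (e.g. an OpenAI-compatible endpoint) would mostly mean changing `build_payload` and the URL, which is why I kept the request-building separate from the HTTP call.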
I'd love to discuss the potential for integrating this feature into the main project, and I'm more than happy to help refine or adapt it. Let me know your thoughts, and we can work together to bring this to the broader Memos community.