Suggestions: latest highly efficient open-source LLMs
I would like to suggest some of the latest cheap and very powerful LLMs, so that you can replace the current models with better ones, improving consistency and keeping costs manageable.
hf.co/Qwen/Qwen3-Next-80B-A3B-Instruct
hf.co/Qwen/Qwen3-Next-80B-A3B-Thinking
hf.co/openai/gpt-oss-120b
hf.co/zai-org/GLM-4.5V
Please consider using them in Lumo instead of the current models.
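For context on the self-hosting angle: all four models above are published on Hugging Face, so they could in principle be served behind an OpenAI-compatible endpoint with an inference server such as vLLM. The sketch below is purely illustrative, not a tested deployment; the GPU count, parallelism, and context-length flags are assumptions that would need tuning to Lumo's actual hardware.

```shell
# Hypothetical self-hosting sketch using vLLM (an OpenAI-compatible
# inference server). Flag values are assumptions, not tested settings.
pip install vllm

# Serve the instruct variant; --tensor-parallel-size should match the
# number of GPUs available, and --max-model-len bounds the context window.
vllm serve Qwen/Qwen3-Next-80B-A3B-Instruct \
    --tensor-parallel-size 4 \
    --max-model-len 32768

# Any OpenAI-compatible client can then point at http://localhost:8000/v1
```

Self-hosting along these lines would keep prompts and completions on Proton-controlled infrastructure, which is the security property the comments below are asking about.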
10 votes
-
Enrique Matta-Rodriguez commented
Now that GLM 5, MiniMax 2.5, Kimi K2.5, and Qwen 3 VL are strong options, can we get an update on the intake decision? Perhaps y'all could self-host the LLMs to keep them secure?
-
Sam Irmscher commented
Hope this idea gets approved.