Lumo Developer API
I would love to be able to programmatically call Lumo to perform advanced queries against the LLM, including the ability to run tools and functions. I'm an avid Home Assistant user who uses the OpenAI API to interface with local devices. I would prefer to use a secure and private API to do this, and I am willing to pay for this capability under a cost structure similar to the OpenAI API's.
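To make the request concrete, here is a sketch of what an OpenAI-style tool-calling request to a hypothetical Lumo endpoint could look like. The URL and model name are invented for illustration; Proton has not published any such API. The tool schema follows the OpenAI function-calling format that Home Assistant's OpenAI integration already understands.

```python
import json

# Hypothetical endpoint and model name -- assumptions, not a real Proton API.
LUMO_URL = "https://lumo.proton.me/v1/chat/completions"

payload = {
    "model": "lumo",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Turn on the living-room lights."}
    ],
    # Tool schema in the OpenAI function-calling format; the tool name
    # and parameters are illustrative.
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "light_turn_on",
                "description": "Turn on a light entity in Home Assistant",
                "parameters": {
                    "type": "object",
                    "properties": {
                        "entity_id": {"type": "string"}
                    },
                    "required": ["entity_id"],
                },
            },
        }
    ],
}

# The request body a client would POST to the endpoint.
body = json.dumps(payload)
```

If Lumo spoke this wire format, existing Home Assistant configurations could switch over by changing only the base URL and API key.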
-
Radiance
commented
One more thing: I've found technical documentation *sorely* lacking for Lumo's actual capabilities. Perhaps if I had the time I could dig through the GitHub repository and make sense of it all, but I have found critical pieces of information, such as which models are used, how the switching works, token limits, and upload limits, to all be very obscure. I haven't found any user-made documentation either (which, ultimately, we shouldn't even have to write). I can't even find the day of the week on which token limits reset, leaving me stranded with a project I can't advance. I love Proton, I am with the mission 100%, the company has been making amazing products, and I am honored to support them; but the community that loves you so much deserves better treatment than this. Thank you again.
-
Radiance
commented
Lumo is one of vanishingly few AI systems that I trust to actually respect my privacy, which means that for certain use cases it is the only acceptable option. I can only imagine the security and privacy work needed to make a robust developer API that doesn't massively expand the attack surface, but it would be an extremely desirable resource all the same.
-
Georgi Kamaliev
commented
Being able to use the Lumo AI API for multi-agent systems and business process automation would be a great feature. I hope it will be available soon.
-
nova
commented
I agree with most people here. I am a security researcher and have been using mainstream AI assistants in the terminal to help with my research. The only reason I use them is that I can't use Lumo fully in its current state.
-
Ambrozi
commented
Hello Proton team,
I am an active Lumo Plus subscriber and use Lumo regularly in my technical workflows. I would very much welcome an official Lumo API for secure and automated integration into my own systems (DevOps scripts, automation tools, and internal services).
Having an API would allow me to use Lumo in production environments while fully respecting Proton’s privacy and security standards. I would also be interested in a paid or business API plan if available.
Thank you for considering this request.
-
Cedric
commented
The simplest is probably to implement an OpenAI-compatible API. See https://bentoml.com/llm/llm-inference-basics/openai-compatible-api
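What "OpenAI-compatible" means in practice is that the server accepts a `POST /v1/chat/completions` request and returns a response in the OpenAI chat-completion shape, so every existing OpenAI client and SDK works unchanged. The sketch below builds a minimal response of that shape server-side; the field values are illustrative.

```python
import json
import time

def make_chat_completion(answer: str, model: str = "lumo") -> dict:
    """Build a minimal OpenAI-compatible chat completion response.

    The model name "lumo" is an assumption for illustration.
    """
    return {
        "id": "chatcmpl-example",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": answer},
                "finish_reason": "stop",
            }
        ],
    }

resp = make_chat_completion("Hello from a compatible server.")

# A stock OpenAI client reads the reply the same way from any
# compatible backend:
reply = resp["choices"][0]["message"]["content"]
```

Because the contract is just this JSON shape, adopting it would let Lumo plug into the large ecosystem of tools that already speak the OpenAI protocol.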
-
Nicola Jelmorini
commented
Office tools increasingly integrate AI. For example, OnlyOffice allows many AI models to be integrated through its API interfaces. It would be nice to be able to use the Lumo model with an API key, instead of using OpenAI, Gemini, etc., or outdated locally downloaded models.
-
Max Max
commented
Honestly, this is the only feature blocking full end-to-end adoption of Lumo.
-
llewdis
commented
I would love to have access to Lumo via API on the command line. I currently use aichat, Gemini, OpenAI Codex, and a cobbled-together version of Perplexity. A CLI offers flexibility and customization that you just can't get with a web GUI. Please consider providing this access, as it would position you well against competitors that already do. For example, being able to use a private, secure, end-to-end encrypted interface with the CLI that allowed me to maintain context and launch multiple research queries would be excellent and much preferred to my Gemini or Codex interactions.
-
MedCol
commented
I typically use OpenAI ChatGPT via the Continue VS Code plugin to implement an MCP-based agent that assists me with my development projects. I use ChatGPT to orchestrate the execution of my MCP tools. OpenAI therefore sees all the MCP tools I have implemented and can deduce the intentions and use cases of my projects. In this specific case, it is essential to use Lumo rather than OpenAI ChatGPT, and I would find it perfectly normal to pay for such a feature.
-
Szymon Filipiak
commented
I’d love to see a developer‑focused API for Proton Lumo that lets me call the model programmatically and run advanced queries, including the ability to invoke tools and custom functions. I’m looking for a secure, privacy‑first endpoint that fits seamlessly into my existing automations.
-
Anonymous
commented
I like the idea, but I don't think Proton will do it, at least not anytime soon; it's unrealistic.
Proton isn't an AI provider; they just offer AI. Just like DuckDuckGo isn't an AI platform but does offer one.
Note: if you actually need an AI API, self-host any model you like on a VPS with a GPU. It's cheaper if you don't exhaust the resources, and it's private. You'll also be able to host far better models than Proton's.
-
Maar73n
commented
I would love to have a private LLM for Home Assistant. Running one locally on my own hardware is just wasteful, since it would be idle 95% of the time!
-
Denix
commented
I am running OpenVoiceOS and I would love to use Lumo as an AI provider to replace my super-slow local Ollama.
-
john@threeohfive.org
commented
Hugely important to have access to Lumo via API for all sorts of self-hosted shenanigans that require more oomph than I can run with my local AI. In fact, Lumo told me that API keys were available; I just can't find where 🤣
-
Aravinth
commented
This might bring an additional revenue stream to Proton as well, if API usage-based pricing is established well, like other mainstream offerings, for technical enthusiasts or app developers who want to use privacy-friendly, E2EE AI in their app, service, or use case :)
-
Ty
commented
Home Assistant integration would be amazing.
-
Damian
commented
I have multiple self-hosted applications I want to connect to AI. Without API connectivity, this is a dealbreaker for me.
-
Wouter
commented
I would like this very much; it would make me use AI a lot more often than I do now! But creating an OpenAI-compatible API is not easy due to the security model used by Proton (e.g. key generation needs to be done on the client side). One option would be to develop some kind of bridge (similar to what they have done for email) that you run locally and that handles the encryption for you. This bridge could then expose an OpenAI-compatible API.
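The bridge idea above can be sketched as a pure translation layer: accept a standard OpenAI-format request, do the client-side encryption locally, forward ciphertext to Proton, and return an OpenAI-format response. In this sketch the `encrypt`, `decrypt`, and `send_to_lumo` functions are placeholders, not Proton's real protocol; a real bridge would implement Proton's key handling and transport behind the same interface.

```python
def encrypt(plaintext: str) -> bytes:
    """Placeholder for Proton's client-side encryption."""
    return plaintext.encode()

def decrypt(ciphertext: bytes) -> str:
    """Placeholder for client-side decryption."""
    return ciphertext.decode()

def send_to_lumo(ciphertext: bytes) -> bytes:
    """Placeholder transport: a real bridge would talk to Proton here
    and get encrypted model output back."""
    return encrypt("(model reply)")

def bridge_handle(openai_request: dict) -> dict:
    """Translate one OpenAI-format chat request through the bridge.

    Only the bridge ever sees plaintext; everything past send_to_lumo
    is ciphertext, mirroring the email bridge's design.
    """
    prompt = openai_request["messages"][-1]["content"]
    answer = decrypt(send_to_lumo(encrypt(prompt)))
    return {
        "object": "chat.completion",
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": answer},
                "finish_reason": "stop",
            }
        ],
    }

out = bridge_handle({"messages": [{"role": "user", "content": "hi"}]})
```

Wrapping `bridge_handle` in a small local HTTP server listening on `localhost:…/v1/chat/completions` would then make any stock OpenAI client work against it unmodified.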
-
Hichiro
commented
With a token and a fair daily usage limit (to discourage token sharing rather than to restrict legitimate users), this could be done fairly easily for text-in/text-out features that don't interact with documents. Other features (image input, analyzing documents in Drive, etc.) are more complex and could come later.
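The per-token daily quota described above can be sketched in a few lines: generous enough for one user, tight enough that sharing a key becomes impractical. The limit value and the in-memory store are invented for illustration; a real service would persist usage and meter actual model tokens.

```python
from datetime import date

DAILY_LIMIT = 500_000  # hypothetical tokens per API key per day

class DailyQuota:
    """Track per-key token usage, resetting each calendar day."""

    def __init__(self, limit: int = DAILY_LIMIT):
        self.limit = limit
        # (api_key, day) -> tokens used; a real service would persist this.
        self.usage = {}

    def allow(self, api_key: str, tokens: int, today: date) -> bool:
        """Record usage; return False once the key would exceed today's limit."""
        used = self.usage.get((api_key, today), 0)
        if used + tokens > self.limit:
            return False
        self.usage[(api_key, today)] = used + tokens
        return True

quota = DailyQuota(limit=100)
ok_first = quota.allow("key1", 80, date(2025, 1, 1))     # within limit
ok_second = quota.allow("key1", 30, date(2025, 1, 1))    # would exceed
ok_next_day = quota.allow("key1", 30, date(2025, 1, 2))  # new day resets
```

Keying the counter on the calendar day gives the reset-at-midnight behavior implied above; a sliding 24-hour window would be slightly fairer but needs per-request timestamps.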