Lumo Developer API
I would love to be able to programmatically call Lumo to perform advanced queries against the LLM, including the ability to run tools and functions. I'm an avid Home Assistant user who uses the OpenAI API to interface with local devices. I would prefer to use a secure and private API to do this, and am willing to pay for this capability with a cost structure similar to the OpenAI API.
-
Ambrozi
commented
Hello Proton team,
I am an active Lumo Plus subscriber and use Lumo regularly in my technical workflows. I would very much welcome an official Lumo API for secure and automated integration into my own systems (DevOps scripts, automation tools, and internal services).
Having an API would allow me to use Lumo in production environments while fully respecting Proton’s privacy and security standards. I would also be interested in a paid or business API plan if available.
Thank you for considering this request.
-
Cedric
commented
The simplest approach is probably to implement an OpenAI-compatible API. See https://bentoml.com/llm/llm-inference-basics/openai-compatible-api
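For illustration, this is roughly what it would look like from the client side (the endpoint URL, model name, and key below are made-up placeholders, since Proton has not published any API):

```python
# Hypothetical sketch: if Lumo exposed an OpenAI-compatible endpoint,
# existing tooling would work by swapping the base URL and API key.
from openai import OpenAI

client = OpenAI(
    base_url="https://lumo.proton.example/v1",  # placeholder endpoint, not a real Proton URL
    api_key="LUMO_API_KEY",                     # placeholder key
)

response = client.chat.completions.create(
    model="lumo",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize my meeting notes."}],
)
print(response.choices[0].message.content)
```

Any tool that already speaks the OpenAI API (Home Assistant, Continue, aichat, n8n, and so on) could then be pointed at Lumo by changing only the base URL and key.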
-
Nicola Jelmorini
commented
Office tools are adding AI integration more and more. For example, OnlyOffice allows integrating many AI models through their API interfaces. It would be nice to be able to use the Lumo model with an API key, instead of using OpenAI, Gemini, etc., or outdated locally downloaded models.
-
Max Max
commented
Honestly, this is the only feature blocking full end-to-end adoption of Lumo.
-
llewdis
commented
I would love to have access to Lumo via API on the command line. I currently use aichat, Gemini, OpenAI Codex, and a cobbled-together version of Perplexity. The CLI offers flexibility and customization that you just can't get with a web GUI. Please consider providing this access, as it would position you well against competitors that already do. For example, being able to use a private, secure, end-to-end encrypted interface with the CLI that allowed me to maintain context and launch multiple research queries would be excellent and much preferred to my Gemini or Codex interactions.
-
MedCol
commented
I typically use OpenAI ChatGPT with the Continue VSCode plugin to implement an MCP-based agent that assists me with my development projects. I use ChatGPT to orchestrate the execution of my MCP tools. OpenAI therefore sees all the MCP tools I have implemented and can deduce the intentions and use cases of my projects. In this specific case, it is essential to use Lumo rather than OpenAI ChatGPT, and I would find it perfectly normal to pay for such a feature.
-
Szymon Filipiak
commented
I'd love to see a developer-focused API for Proton Lumo that lets me call the model programmatically and run advanced queries, including the ability to invoke tools and custom functions. I'm looking for a secure, privacy-first endpoint that fits seamlessly into my existing automations.
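As a rough illustration of the kind of tool calling I mean (the endpoint, model name, key, and the set_light function are all made-up placeholders, not real Proton or Home Assistant integrations):

```python
# Hypothetical sketch of OpenAI-style tool calling against a Lumo endpoint.
from openai import OpenAI

client = OpenAI(base_url="https://lumo.proton.example/v1", api_key="LUMO_API_KEY")

# Example tool definition the caller would execute locally after the model requests it.
tools = [
    {
        "type": "function",
        "function": {
            "name": "set_light",
            "description": "Turn a light on or off.",
            "parameters": {
                "type": "object",
                "properties": {
                    "entity_id": {"type": "string", "description": "Light to control, e.g. light.kitchen"},
                    "state": {"type": "string", "enum": ["on", "off"]},
                },
                "required": ["entity_id", "state"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="lumo",  # placeholder model name
    messages=[{"role": "user", "content": "Turn off the kitchen light."}],
    tools=tools,
)
# The model replies with a tool call; the automation then runs it locally.
print(response.choices[0].message.tool_calls)
```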
-
Anonymous
commented
I like the idea, but I don't think Proton will do it, at least not anytime soon; it's unrealistic.
Proton isn't an AI provider, they just have one. Just like how DuckDuckGo isn't an AI platform, but they do have one.
Note: if you actually have a use for an AI API, self-host any model you like on a VPS with a GPU. It's cheaper if you don't exhaust the resources, and it's private. You'll also be able to host way better models than Proton's.
-
Maar73n
commented
I would love to have a private LLM for Home Assistant. Running it locally on my own hardware is just wasteful since it will be idle 95% of the time!
-
Denix
commented
I am running OpenVoiceOS and I would love to use Lumo as an AI provider to replace my local, super slow Ollama.
-
john@threeohfive.org
commented
Hugely important to have access to Lumo via API for all sorts of self-hosted shenanigans that require more oomph than I can run with my local AI. In fact, Lumo told me that API keys were available, I just can't find where 🤣
-
Aravinth
commented
This might bring an additional revenue stream to Proton as well, if usage-based API pricing is established well, like other mainstream offerings, for technical enthusiasts or app developers who want to use privacy-friendly E2EE AI in their app/service/use case :)
-
Ty
commented
Home Assistant integration would be amazing.
-
Damian
commented
I have multiple self-hosted applications I want to connect AI with. With no API connectivity, this is a dealbreaker for me.
-
Wouter
commented
I would like this very much; it would make me use AI a lot more often than I do now! But creating an API that is OpenAI-compatible is not easy due to the security model used by Proton (e.g., key generation needs to be done on the client side). One option would be to develop some kind of bridge (similar to what they have done for email) that you run locally and that handles the encryption for you. This bridge would be able to provide an OpenAI-compatible API, as sketched below.
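A very rough sketch of what such a bridge could expose (the forward_to_lumo part is a placeholder, since Proton's client-side encryption protocol is not public, and all names here are made up):

```python
# Rough sketch of the "local bridge" idea: a small server on localhost that speaks
# the OpenAI chat-completions shape and handles Proton-side encryption internally.
# Run locally with, e.g.: uvicorn lumo_bridge:app  (assuming this file is lumo_bridge.py)
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class ChatRequest(BaseModel):
    model: str
    messages: list[dict]

def forward_to_lumo(messages: list[dict]) -> str:
    # Placeholder: encrypt the request client-side, send it to Lumo,
    # decrypt the reply, and return plain text.
    raise NotImplementedError

@app.post("/v1/chat/completions")
def chat_completions(req: ChatRequest):
    answer = forward_to_lumo(req.messages)
    # Return the minimal OpenAI-compatible response shape.
    return {
        "object": "chat.completion",
        "model": req.model,
        "choices": [
            {
                "index": 0,
                "message": {"role": "assistant", "content": answer},
                "finish_reason": "stop",
            }
        ],
    }
```

The point of the bridge design is that keys and encryption stay on the user's machine, exactly like the Proton Mail Bridge, while everything else on the machine just sees a plain OpenAI-style endpoint on localhost.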
-
Hichiro
commented
With an API token and a fair daily usage limit (to prevent people from sharing tokens, rather than to restrict legitimate users), this could be done easily for features with only text in/out and no interaction with documents; see the sketch below. Other features would be more complex (image input, analyzing documents in Drive, etc.) and could be added later.
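A very rough sketch of the daily limit idea (the quota number and the in-memory storage are purely illustrative, not anything Proton has announced):

```python
# Minimal sketch of a per-token daily limit: count requests per API token per
# calendar day and reject requests once the quota is exceeded.
from collections import defaultdict
from datetime import date

DAILY_LIMIT = 500  # example quota, not a real Proton figure
_usage: dict[tuple[str, date], int] = defaultdict(int)

def allow_request(api_token: str) -> bool:
    key = (api_token, date.today())
    if _usage[key] >= DAILY_LIMIT:
        return False  # over today's quota; the server would return HTTP 429
    _usage[key] += 1
    return True
```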
-
Dusty
commented
This would also be beneficial for integrating with Synology's AI Console. A privacy-oriented product using privacy-oriented AI seems like a good fit.
-
Branden R Thompson
commented
Also, please make sure that generating API keys for Lumo is easy to find. Specifically, I want to be able to hit Lumo from my local n8n instance on my machine.
-
Marco
commented
Game changer for me
-
Sashteck
commented
I would like an API to integrate Lumo with Synology apps. They require an API key to integrate AI tooling, and I would love to use Lumo.