Lumo Developer API
I would love to be able to programmatically call Lumo to perform advanced queries against the LLM, including the ability to run tools and functions. I'm an avid Home Assistant user who uses the OpenAI API to interface with local devices. I would prefer a secure and private API for this, and I'm willing to pay for the capability under a cost structure similar to the OpenAI API's.
-
MedCol
commented
I typically use OpenAI ChatGPT on the Continue VSCode plugin to implement an MCP-based agent to assist me with my development projects. I use ChatGPT to orchestrate the execution of my MCP tools. OpenAI then sees all the MCP tools I have implemented and can deduce the intentions and use cases of my projects. In this specific case, it is essential to use Lumo rather than OpenAI ChatGPT. And in this case, I would find it perfectly normal to pay for such a feature.
-
Szymon Filipiak
commented
I’d love to see a developer‑focused API for Proton Lumo that lets me call the model programmatically and run advanced queries, including the ability to invoke tools and custom functions. I’m looking for a secure, privacy‑first endpoint that fits seamlessly into my existing automations.
-
I like Proton :p
commented
I like the idea, but I don't think Proton will do it, at least not anytime soon; it's unrealistic.
Proton isn't an AI provider, they just have one. Just like how DuckDuckGo isn't an AI platform, but they do have one.
Note: if you actually have a use for an AI API, self-host any model you like on a VPS with a GPU. It's cheaper if you don't exhaust the resources, and private. You'll also be able to host far better models than Proton's.
-
Maar73n
commented
I would love to have a private LLM for Home Assistant. Running it locally on my own hardware is just wasteful, since it will be idle 95% of the time!
-
Denix
commented
I am running OpenVoiceOS and I would love to use Lumo as an AI provider to replace my super slow local Ollama.
-
john@threeohfive.org
commented
Hugely important to have access to Lumo via API for all sorts of self-hosted shenanigans that require more oomph than I can run with my local AI. In fact, Lumo told me that API keys were available, I just can't find where 🤣
-
Aravinth
commented
This could also bring in an additional revenue stream for Proton, if usage-based API pricing is established well, like other mainstream offerings, for technical enthusiasts or app developers who want to use privacy-friendly E2EE AI in their app/service/use case :)
-
Ty
commented
Home Assistant integration would be amazing.
-
Damian
commented
I have multiple self-hosted applications I want to connect AI with. Without API connectivity, this is a dealbreaker for me.
-
Wouter
commented
I would like this very much; it would make me use AI a lot more often than I do now! But creating an OpenAI-compatible API is not easy due to the security model used by Proton (e.g. key generation needs to be done on the client side). One option would be to develop some kind of bridge (similar to what they have done for email) that you run locally and that handles the encryption for you. This bridge could then provide an OpenAI-compatible API.
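To make the bridge idea concrete, here is a minimal sketch in Python (standard library only) of the API surface such a local bridge might expose. Only the endpoint path and response field names follow the OpenAI chat completions format; everything else is an assumption, and the Proton-side encryption and upstream call are stubbed out with a canned echo reply so the shape can be tried locally.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class BridgeHandler(BaseHTTPRequestHandler):
    """Minimal OpenAI-compatible surface for a hypothetical local Lumo bridge."""

    def do_POST(self):
        if self.path != "/v1/chat/completions":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length))
        # In a real bridge, this is where client-side encryption would happen
        # before forwarding to Proton's servers. Here we just echo the last
        # user message back in an OpenAI-shaped response.
        reply = {
            "object": "chat.completion",
            "model": request.get("model", "lumo"),
            "choices": [{
                "index": 0,
                "message": {
                    "role": "assistant",
                    "content": f"echo: {request['messages'][-1]['content']}",
                },
                "finish_reason": "stop",
            }],
        }
        body = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass


def serve(port=0):
    """Bind the bridge on localhost; port=0 picks a free port. Call
    serve_forever() on the returned server (e.g. in a background thread)."""
    return HTTPServer(("127.0.0.1", port), BridgeHandler)
```

An OpenAI-compatible client (Home Assistant's OpenAI integration, Continue, etc.) would then be pointed at `http://127.0.0.1:<port>/v1` instead of `api.openai.com`, with no client changes needed.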
-
Hichiro
commented
With a token and a fair daily usage limit (to discourage people from sharing the token, not to restrict users), this could easily be done for features with only text in/out and no interaction with documents. Other features (image input, analysing documents in Drive, etc.) are more complex and could be added later.
-
Dusty
commented
This would also be beneficial to integrate with Synology's AI Console. A privacy-oriented product using privacy-oriented AI seems like a good fit.
-
Branden R Thompson
commented
Please also ensure that generating API keys for Lumo is easy to find. Specifically, I want to be able to hit Lumo from the local n8n instance on my machine.
-
Marco
commented
Game changer for me
-
Sashteck
commented
I would like an API to integrate Lumo with Synology apps. They request an API key to integrate AI tooling and I would love to use Lumo.
-
Sean
commented
It is becoming increasingly popular and useful to access LLMs programmatically. This should be supported in general.
In particular, the (free) R package ellmer allows easy programmatic interaction with Claude, Gemini, DeepSeek, and others. It is mainly for personal and scientific use. In my case, I just want to ask it to mark a set of Markdown or Word assignments submitted by a class of students without opening many chats.
I imagine some people may want to abuse this, so I would understand some limitations, expanded depending on payment I suppose.
-
Tony
commented
Agreed that API access to Lumo would be fantastic, but something tells me this would be a very difficult endeavor with the encryption involved. I hope I'm wrong, though, and I would also pay an additional premium on top of the normal Lumo+ fee to access an API.