Lumo
Lumo on Signal
As a privacy‑focused user who uses Signal for my everyday ongoing conversations, I want to be able to converse with Lumo directly within the Signal app like I do with my other contacts, so that my interactions stay seamless, feel less fragmented, and make it easier for me to rely on Lumo’s assistance without juggling separate apps or websites.
5 votes -
Generate C code
As I write a lot of code in C, I'd like Lumo to be able to understand my projects written in C and to modify them.
2 votes -
Floating widget
Add the option to enable a floating Lumo widget that can help me write better, answer chats, or summarize long texts.
8 votes -
Voice typing in Ubuntu Linux and/or Firefox
It would be great to have at least web-based voice typing for Lumo on Firefox for those who, like me, use a Linux OS. I'm on Ubuntu but am forced to use another AI because of the lack of voice typing in Lumo. 😭
1 vote -
Swap the mascot
One shouldn't forget that with Lumo you are talking to a machine. Wouldn't it be sensible, helpful, or at least interesting to be able to swap the cat mascot for a server icon on demand? AI is powerful, positive, and different too.
2 votes -
Lumo animated stickers pack
Hi,
One of my favorite sticker packs for iMessage (available for other platforms too) is Milk & Mocha bears.
Now, Proton, you have just a very cute cat, and you have already put a lot of effort into its design and animation.
I would gladly use Lumo stickers if you can deliver something as nice :-)
It would also be good advertising for the product itself: Lumo. Thanks for reading / voting.
Cheers
M
3 votes -
Answers on Top instead of Bottom
I would like to suggest that answers appear at the top of a thread instead of at the bottom. Instead of scrolling through the history from top to bottom, new answers would appear on top. Should I want to reread the history, I can still do so top to bottom. Especially for longer threads that you would like to continue, this would be more helpful.
4 votes -
Let the cat learn to speak!
Add an option where the cat starts reading the answer aloud through the speaker when we enter the message via microphone. This would have the nice side effect of letting children grow up with AI in a privacy-compliant manner and giving free rein to their imagination, developing stories that the cat can then read aloud. The cat can already write and develop stories, but it still can't read them aloud.
4 votes -
Catnip mode: an expert mode for advanced users (i.e. ‘go crazy’, basically)
I know I’m missing other features and whatnot, but hopefully this idea is somewhat helpful.
So…catnip mode. When toggled on (default off) in a new Developer Settings menu, this mode will unlock additional parameters:
Catnip/developer setting panel:
- Toggle Proton's system prompt instructions off (default on), allowing a custom instruction set (open-weight models already ship with native guardrails by design, so this reduces overhead)
- Ideally, the developer settings should allow us to see the system prompt that Proton uses (increase transparency)
Lumo chat area:
- Ability to explicitly select models
- Brief description of each model and its recommended use when hovering over it…
1 vote -
Add Voxtral as an LLM in Lumo
I’m an enthusiastic user of Lumo and appreciate the privacy‑first approach you’ve built into the assistant. I noticed that Lumo already incorporates several open‑source large language models, including Mistral, which demonstrates your willingness to adopt cutting‑edge, community‑driven models.
Given the impressive performance and open‑source nature of Voxtral, I’d like to kindly request that you consider adding it to Lumo’s model roster. Integrating Voxtral would further diversify the options available to users, potentially improving multilingual capabilities and offering additional flexibility for specialized workloads—all while staying aligned with Lumo’s privacy and security standards.
Thank you for continuously improving Lumo, and I look…
2 votes -
Private Lumo App Store for sharing & monetizing custom “MyLumos” (GDPR-safe alternative to GPT Store)
Hi Proton Team and Community,
first of all, thank you for creating Lumo — a privacy-first AI assistant that finally allows European users and businesses to use AI without compromising confidentiality or GDPR compliance. Proton has solved the fundamental problem that all other AI platforms struggle with: trust.
Now that Lumo exists, I would like to propose the next major step:
➡️ A Private Lumo App Store for creating, sharing, and monetizing custom “MyLumos”.
Just like OpenAI has custom GPTs (and a GPT Store), Proton could offer a secure and encrypted alternative — fully compliant, private, and built inside…
4 votes -
Transcription of long audio (STT)
It would be useful to be able to use Lumo to transcribe long audio, for example from podcasts, educational audio, etc.
4 votes -
Add Concise/Think Longer option for all answers
Add an option to choose to always be concise or think longer for all answers.
5 votes -
lumo.ai for quick access > lumo.proton.me
Also: support https://lumo.ai/search?q=%s so browsers can register Lumo as a custom search engine.
5 votes -
Allow Lumo’s cat image to exist as an electronic pet that people can raise and interact with.
I love Lumo’s cat and want to have one to keep me company.
3 votes -
Add a clear command to clear the current chat
I'd like to be able to type "clear" or "clear chat" and have Lumo wipe the current conversation.
I tried it, and the AI knows what I mean but can't do it.
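This is the kind of command that is typically intercepted client-side, before the message ever reaches the model. A minimal Python sketch, where the function name and the recognized command strings are hypothetical (not part of Lumo):

```python
def handle_input(text, conversation):
    """Intercept chat-control commands before they reach the LLM.

    Returns the text to forward to the model, or None if the command
    was handled locally. 'conversation' is the client-side history.
    """
    command = text.strip().lower()
    if command in ("clear", "clear chat", "/clear"):
        conversation.clear()  # wipe the local history
        return None           # nothing is sent to the model
    conversation.append(text)
    return text
```

Handling it in the client explains why the model "knows what you mean but can't do it": clearing the view is a UI action, not something the LLM can perform.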
2 votes -
How about ditching these stupid time limited upgrade offers?
Sick of seeing them; they're desperate and bloody annoying!
2 votes -
Lumo Client (or API) with secure local execution of Lumo‑generated Python graphical output
When users request visualizations, Lumo frequently replies with Python code snippets (e.g., matplotlib graphics). Users can copy‑paste the code into their own environment to render the graphics.
It would be interesting to have some kind of Lumo‑Python bridge that delivers the generated Python code to a local client. The client runs the code in a sandboxed, offline Python/IPython/Jupyter/... session and returns the rendered image directly to the chat.
This would look like a local client chat application communicating with proton servers, able to execute Lumo-generated Python code on the user’s machine. No data leaves the device, preserving confidentiality and complying…
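As a rough illustration of the sandboxed-execution half of this idea, here is a minimal Python sketch. Everything in it is an assumption — the function name, the convention that the snippet saves its output as `figure.png`, and the use of an isolated subprocess; a real client would lock things down much further (no network access, restricted filesystem):

```python
import os
import subprocess
import sys
import tempfile


def run_sandboxed(code, timeout=10):
    """Run Lumo-generated plotting code in a separate, isolated Python
    process and return the bytes of the image it saved as 'figure.png'.

    Sketch only: a production client would additionally revoke network
    and filesystem rights (e.g. seccomp, containers, or firejail).
    """
    with tempfile.TemporaryDirectory() as workdir:
        script = os.path.join(workdir, "snippet.py")
        with open(script, "w") as f:
            f.write(code)
        subprocess.run(
            [sys.executable, "-I", script],  # -I: isolated mode
            cwd=workdir,
            timeout=timeout,
            check=True,
            env={"MPLBACKEND": "Agg"},  # headless matplotlib backend
        )
        with open(os.path.join(workdir, "figure.png"), "rb") as f:
            return f.read()
```

Isolated mode (`-I`) keeps the child process from picking up user site-packages and environment hooks, and the temporary working directory is discarded after the rendered image is read back into the chat.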
5 votes -
Lumo Integration Part Two (Lumo Dedicated Encrypted Desktop Assistant, Legacy & New Applications Work Options)
All code generation & storage is encrypted and wrapped by a YubiKey.
Running code requires a second‑factor OTP/TOTP from the same YubiKey.
If the device can’t handle the workload, LDA warns the user and offers a secure cloud‑inference fallback.
The assistant stays offline except for signed updates or an explicit, user‑approved cloud request.
Implement the modules, follow the testing plan, and you’ll have a privacy‑preserving, hardware‑rooted Lumo Desktop Assistant that can safely generate, store, and execute programs on demand.
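The second-factor gate described above can be sketched with a standard RFC 6238 TOTP check, which is what a YubiKey in OATH mode produces. A hedged illustration only: the function names are made up, and a real LDA would verify the OTP against the hardware key rather than a software-held secret:

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32, timestep=30, digits=6, now=None):
    """RFC 6238 TOTP (HMAC-SHA-1), as a YubiKey in OATH mode produces.

    'now' is injectable for testing; defaults to the current time.
    """
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((now if now is not None else time.time()) // timestep)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


def gate_execution(user_code, secret_b32, supplied_otp, now=None):
    """Run user_code only if the supplied OTP matches — a hypothetical
    stand-in for the YubiKey-backed check the idea describes."""
    if not hmac.compare_digest(supplied_otp, totp(secret_b32, now=now)):
        return False
    exec(user_code, {})  # in a real LDA: hand off to the sandboxed runner
    return True
```

Using `hmac.compare_digest` keeps the OTP comparison constant-time, so the gate itself doesn't leak information about how many leading digits were correct.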
4 votes -
Apple Shortcuts
Can we please get Apple Shortcuts for Lumo :D
I’d love to migrate my current automation away from GPT
6 votes