Lumo
Support for LibreOffice file types
Most people I know use LibreOffice for much of their office needs, as I do. Are there any future plans to support LibreOffice files (.odt, .ods, etc.)?
8 votes -
iOS/Android Widget
I'd love to have the Lumo cat with a text box to ask questions on my home screen.
15 votes -
Please list reference links for various parts of the answer
It would be great if Lumo would include web links with all answers, referencing where the answers came from and/or where further direct research could be carried out. Perplexity offers this functionality, and I really like it because it allows you to put a little more trust in the quality of the answers.
13 votes -
Enable https://lumo.proton.me/search?q=%s as a direct Firefox search engine
Hi all!
Short description:
Please add support for reading the ?q= query parameter and automatically executing the search when the URL https://lumo.proton.me/search?q=%s is opened. This would allow users to add Lumo as a regular search engine in Firefox (and other browsers) without being redirected to the empty guest page (/guest/).
Additional notes:
The ?q= pattern is already used by other AI-search services (e.g., Perplexity) and works out of the box with Firefox's “Add custom search engine” feature. Native handling would greatly improve the user experience and position Lumo as a true alternative to existing AI-search providers. Please vote for this feature!
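For reference, handling the suggested parameter is a small amount of parsing; here is a sketch of the extraction step, assuming the /search?q= convention described above were adopted (the function name is illustrative):

```python
from urllib.parse import urlparse, parse_qs

def extract_query(url: str):
    """Return the search prompt from a Lumo-style ?q= URL, or None.

    Illustrative only: assumes Lumo adopted the ?q= convention requested above.
    """
    values = parse_qs(urlparse(url).query).get("q")
    return values[0] if values else None

# parse_qs decodes percent-encoding, so this prints: privacy news
print(extract_query("https://lumo.proton.me/search?q=privacy%20news"))
```

When no `q` parameter is present (e.g. the bare guest page), the app could simply fall back to its current empty-chat behaviour.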
8 votes -
Highlight‑to‑Answer Feature
Add a capability that lets the user highlight (select) any fragment of text in the chat and then have Lumo generate a response that is explicitly focused on that highlighted portion. This makes clarifying questions, getting quick explanations, and avoiding unnecessary back‑and‑forth much easier.
13 votes -
Feature Request: “Auto‑Websearch” / “Always Enable Websearch” in the Personalization Tab
Hey Proton team,
thanks for the awesome Lumo 1.2 update! The new personalization options are super handy. To make Lumo’s answers truly real‑time, could you add a toggle under Personalization called “auto‑websearch” (or “always enable websearch”)?
When turned on, Lumo would automatically pull in up‑to‑date info for any query that needs fresh data—no extra steps required. This would be a game‑changer for users who rely on accurate, real‑time responses.
Appreciate the consideration!
7 votes -
Copy/export AI reply in a way that preserves fonts, formatting, headers, and quotations
It would be handy if the button to copy the AI answer were multi-functional, allowing the answer to be copied as:
- Markdown
- HTML
- BBCode ("Bulletin Board Code")
- readable plain text (currently, pasted formatted text with quotations is not easily readable when pasted into a plain text editor/viewer)
13 votes -
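As a sketch of the "readable text" option, a rough plain-text rendering could strip the most common Markdown markup before copying. The patterns below are illustrative, not a full converter:

```python
import re

def markdown_to_plain(md: str) -> str:
    """Very rough plain-text rendering of a Markdown answer (hypothetical helper)."""
    text = re.sub(r"^#{1,6}\s*", "", md, flags=re.M)   # drop header markers
    text = re.sub(r"\*\*(.+?)\*\*", r"\1", text)       # unwrap bold
    text = re.sub(r"^>\s?", "    ", text, flags=re.M)  # quotations -> indentation
    return text

print(markdown_to_plain("## Title\n**bold** point\n> quoted line"))
```

A real implementation would want a proper Markdown parser, but even this level of cleanup makes pasted answers far more readable in a plain text editor.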
Notify existing Proton users that Lumo is a new Proton product offering
Until I logged in to Proton's uservoice.com, I didn't know that Lumo existed. I do not recall getting an email notification, though I got one, e.g., for Proton Authenticator. Anyway, I'm researching Lumo now, as I can definitely see myself using a Proton chatbot vs. other (data-collecting) offerings from those evil IT megacompanies. Currently, I have DuckDuckGo's DuckAssist set to offer answers as frequently as possible.
I really see traditional web search being replaced by AI-based search entirely. Perhaps AI-based search is the answer to finally being FREED from all the marketing results when you really want information rather than product/company/shopping results. (Good riddance Google, Bing, etc.)
15 votes -
Intel Mac Desktop app
Please develop the app for Intel Macs as you did for all of your other Proton products.
15 votes -
Allow anonymous data sharing
I know that Proton doesn't create/train its own AI model, but a good European alternative would be nice to see. As unpleasant as this might be, creating a good model requires lots of data. A great app is worth nothing without good models. The current market leader, OpenAI, uses user data to train its models and has a huge advantage.
If my data is ever used for training, I want to give it to a company I trust (or think I can trust) to handle it carefully. And that is you. I would like to see the option to anonymously share chats to help train/refine models. In the best case, only chats without inputs that would give away who is asking would be used (this would need to be checked somehow).
I know that this is probably an unpopular opinion, but we have to face reality: if we want good open models, we need to provide our data.
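The "checked somehow" step could start as simple client-side pattern screening before a chat is ever shared. A minimal sketch follows; the patterns are illustrative, not a complete PII detector:

```python
import re

# Rough sketch: drop chats that appear to contain identifying details
# before they are shared for training. Real screening would need far
# more than two patterns (names, addresses, account numbers, ...).
PII_PATTERNS = [
    r"[\w.+-]+@[\w-]+\.[\w.]+",   # email addresses
    r"\+?\d[\d\s().-]{7,}\d",     # phone-like number runs
]

def looks_identifying(chat: str) -> bool:
    """Return True if the chat matches any known identifying pattern."""
    return any(re.search(p, chat) for p in PII_PATTERNS)

print(looks_identifying("My email is jane@example.com"))  # True
print(looks_identifying("Explain quantum tunneling"))     # False
```

Chats flagged this way would simply be excluded from sharing, erring on the side of privacy.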
11 votes -
Option to change avatar
I wanted to suggest adding an option to customise or hide the default cat avatar. I'm not much of a cat person, so having the ability to change or turn it off entirely would be a great improvement.
11 votes -
Landscape mode
Add landscape mode to the Lumo Android app for tablet users, and to make tables easier to read.
7 votes -
VSCode integration
Other LLMs like ChatGPT and Claude have integrations with VS Code. It would be great to see an integration for it or for another open-source IDE.
The most amazing thing would be if you could replace all the apps most people use day to day and have full support on Linux.
11 votes -
Learning Feature on Lumo
I would love to see Lumo receive a learning feature similar to ChatGPT or other AI tools. The idea would be that Lumo could help users study by generating learning materials such as quizzes, flashcards, or practice exercises automatically based on input. This would make Proton not only a privacy-focused productivity suite but also a helpful tool for education and personal development. Adding such a feature would make Lumo far more versatile and improve its value for students and lifelong learners.
9 votes -
Python interpreter
Hi there, I've been a Proton user for around 4–5 years. I consider myself an early adopter of new technologies, so I've been testing out ChatGPT, Gemini, and other LLM chat variants. One of my main uses for them is as a researcher and engineer, so I want LLMs that are robust, in the sense that any LLM can be robust and predictable. In that sense, I want a "translator" or a "meta-transpiler" that translates commands into actions to be performed. For example, the task "translate this from Spanish to Catalan" is in fact a direct task that involves translation. But describing the need to code a Python snippet that draws a figure using matplotlib is also a translation.
Until a month ago, before the holiday season in Europe, I was using ChatGPT 4o as my default model and interface for such tasks. As you may know, the ChatGPT interface allows running an IPython kernel in a sandboxed environment. For me, this is more than enough to really speed up writing papers, automating the boring stuff in my day-to-day, etc.
Could this idea be brought to Lumo? I understand this is not an easy request, but it would be a powerful one for Lumo. If running a sandboxed IPython environment on Lumo's servers could be a security issue, I propose a very straightforward trick that IPython kernels can perform. In the same way that Google Colaboratory can connect to a local IPython kernel over a TCP/IP socket, sending code from the browser via JS to a local port to be executed in the kernel, Lumo could potentially do the same by developing client-side software that extracts the code from Lumo's responses and launches it on the local machine.
Cheers,
Ismael
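The client-side step this request describes, pulling runnable code out of a reply before forwarding it to a local kernel, could start as simple fence extraction. A hypothetical sketch (the function name is illustrative):

```python
import re

FENCE = "`" * 3  # a Markdown code fence, built up to keep this example valid

def extract_python_blocks(reply: str) -> list:
    """Pull fenced python code blocks out of a chat reply so a local
    client could forward them to an IPython kernel (hypothetical step)."""
    pattern = FENCE + r"python\n(.*?)" + FENCE
    return re.findall(pattern, reply, flags=re.S)

reply = "Here you go:\n" + FENCE + "python\nprint('hi')\n" + FENCE
print(extract_python_blocks(reply))  # ["print('hi')\n"]
```

The extracted snippets would then be sent over a local socket to the user's own IPython kernel, keeping execution entirely on the user's machine.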
12 votes -
Automatic calendar for displaying contact birthdays
Please add an automatic calendar for displaying contact birthdays. This is a feature in Google Calendar and is the only reason I still hang on to that app.
5 votes -
Public URL parameter that pre-loads a specific prompt
It would be nice to have a public URL parameter that pre-loads a specific prompt, such as https://lumo.proton.me/?q=my_specific_prompt, because this would make it possible to define "search" shortcuts in browsers or, for example, in KRunner under Linux to open Lumo directly with a starting prompt.
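Assuming a ?q= query parameter like the one requested, building such a shortcut URL is a one-line encoding step. A sketch, with the /search path being an assumption borrowed from the earlier search-engine request:

```python
from urllib.parse import quote

def lumo_shortcut(prompt: str) -> str:
    """Build a hypothetical Lumo deep link; assumes a ?q= parameter is supported."""
    return "https://lumo.proton.me/search?q=" + quote(prompt)

print(lumo_shortcut("explain DNS over HTTPS"))
# https://lumo.proton.me/search?q=explain%20DNS%20over%20HTTPS
```

A browser keyword or a KRunner web shortcut would simply substitute the typed text into the `%s` position of such a URL.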
10 votes -
Add models for creative writing
None of the models Lumo uses are very good at creative writing. Please add a model that can provide helpful writing critiques, ideas, etc., for creative uses such as storytelling, poetry, songwriting, and creative non-fiction.
11 votes -
Latest highly efficient open-source LLM suggestions
I would like to suggest some of the latest, cheap, and very powerful LLMs, so that you can replace the current LLMs with better ones to improve consistency and manage costs:
hf.co/Qwen/Qwen3-Next-80B-A3B-Instruct
hf.co/Qwen/Qwen3-Next-80B-A3B-Thinking
hf.co/openai/gpt-oss-120b
hf.co/zai-org/GLM-4.5V
Please consider using them in Lumo instead of the current models.
9 votes -
Better Mathematical Expression Display
Currently, the mathematical expressions produced by Lumo in responses are not legible. They are generally in raw LaTeX syntax, which is absolutely unsuited for display purposes. They should be rendered either as legible plain text (probably bold-faced for distinctiveness) or as typeset graphical math expressions.
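As a stopgap for the plain-text option, a few common LaTeX commands could be mapped to Unicode before display. A rough illustrative sketch, not a full renderer (a real one would use a proper math typesetting library):

```python
import re

# Illustrative command-to-Unicode table; a real renderer would cover far more.
REPLACEMENTS = {r"\alpha": "α", r"\beta": "β", r"\times": "×", r"\leq": "≤"}

def latex_to_plain(expr: str) -> str:
    """Render a simple raw-LaTeX expression as readable plain text."""
    expr = expr.strip("$")
    for cmd, uni in REPLACEMENTS.items():
        expr = expr.replace(cmd, uni)
    expr = re.sub(r"\^\{?(\w)\}?", r"^\1", expr)  # keep simple exponents readable
    return expr

print(latex_to_plain(r"$\alpha \times x^{2} \leq \beta$"))  # α × x^2 ≤ β
```

Even this small mapping turns an unreadable `$\leq$`-laden answer into something a human can scan, while full graphical typesetting remains the better long-term fix.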
8 votes