Open up Lumo
I am not a hater of AI, but the handling of Lumo's launch is disastrous, to say the least.
Never release a product, call it open, and then make no code and no information about it available. Building something on top of something open doesn't make the result open itself. Claiming that it does is simply misleading and erodes trust.
We neither know what models Lumo uses nor have access to any part of its code right now. It's a giant black box.
Please open up the code and model weights, adhering to the OSI's Open Source AI Definition.
Ideally, make it possible to run the model locally.
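To be concrete about what "running locally" could look like, here is a minimal sketch using Hugging Face Transformers with an arbitrary published open-weights model as a stand-in. The model name is only an example, not a claim about which models Lumo actually uses:

```python
# Minimal sketch: running a published open-weights chat model locally.
# The model ID below is just an example of openly released weights,
# not a statement about what powers Lumo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-Instruct-v0.2"  # example open-weights model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Why do open weights matter for a privacy-focused assistant?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If Lumo's weights were published like this, users could audit and run them on their own hardware, which is exactly the kind of verifiability the launch messaging promises.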
I'm in favor of the vision you outlined in your launch blog post, but your actions tell a different story.
-
Kelly
commented
I'm also not a hater of AI, OP.
-
Vlf
commented
I'd also like Proton to be more open and forthcoming with information. Right now it feels like Lumo is just a proprietary front end built on publicly available but undisclosed third-party software, i.e. the LLMs, two of which I think might be GPT-4o mini and o4-mini (which are in fact different models). They also don't seem to mention anywhere what the rate limits are for free-tier users. Lumo itself told me it was 100 messages per day, but I haven't been able to verify that against the FAQs or the terms of service.
Currently, DuckDuckGo has a significantly better, supposedly private "AI" chat feature that's more transparent: they let you choose which model you want to use in a chat and even tell you who developed the model and whether or not it's open source.