Michael
My feedback
32 results found
-
8 votes
Michael
supported this idea
·
-
4 votes
Michael
supported this idea
·
-
8 votes
Michael
commented
I would question the value of learning from an LLM. Not just Lumo but any LLM like ChatGPT or Gemini. Accepting LLM answers as fact without verifying them is a long-term recipe for misinformation, at a stage where the user would be most receptive to that information (when they're actively learning it). As many people are aware, confabulation or "hallucination" with even best-in-class LLMs is an ever-present pitfall. Having a learning mode just "bakes in" those mistakes, and the learner (grade-school child, etc.) may not be aware enough to fact-check the LLM if something seems off. Just my 2 cents.
-
5 votes
Michael
supported this idea
·
-
65 votes
Michael
supported this idea
·
-
391 votes
Michael
supported this idea
·
-
51 votes
Michael
supported this idea
·
-
49 votes
Michael
supported this idea
·
-
133 votes
Michael
supported this idea
·
-
195 votes
Michael
supported this idea
·
-
234 votes
Michael
supported this idea
·
-
251 votes
Michael
supported this idea
·
Yup, can confirm this has happened 5 or 6 times in less than a month of usage. Still working out the bugs I guess...