Michael
My feedback
-
10 votes
Michael
supported this idea
·
-
5 votes
Michael
supported this idea
·
-
12 votes
Michael
commented
I would question the value of learning from an LLM, and not just Lumo but any LLM like ChatGPT or Gemini. Accepting LLM answers as fact without verifying them is a long-term recipe for misinformation, delivered at the stage where the user is most receptive to it (when they're actively learning the material). As many people are aware, confabulation or "hallucination" is an ever-present pitfall even with best-in-class LLMs. A learning mode just "bakes in" those mistakes, and the learner (a grade-school child, etc.) may not be aware enough to fact-check the LLM if something seems off. Just my 2 cents.
-
5 votes
Michael
supported this idea
·
-
68 votes
Michael
supported this idea
·
-
612 votes
Michael
supported this idea
·
-
93 votes
Michael
supported this idea
·
-
85 votes
Michael
supported this idea
·
-
186 votes
Michael
supported this idea
·
-
336 votes
Michael
supported this idea
·
-
382 votes
Michael
supported this idea
·
-
262 votes
Michael
supported this idea
·
Yup, I can confirm this has happened 5 or 6 times in less than a month of usage. Still working out the bugs, I guess...