Michael
My feedback
32 results found
- 9 votes · Michael supported this idea
- 5 votes · Michael supported this idea
- 12 votes · Michael commented:
  I would question the value of learning from an LLM, and not just Lumo but any LLM, like ChatGPT or Gemini. Accepting LLM answers as fact without verifying them is a long-term recipe for misinformation, delivered at the stage where the user is most receptive to it (when they're actively learning). As many people are aware, confabulation, or "hallucination," is an ever-present pitfall even with best-in-class LLMs. Having a learning mode just "bakes in" those mistakes, and the learner (a grade-school child, etc.) may not be aware enough to fact-check the LLM if something seems off. Just my 2 cents.
- 5 votes · Michael supported this idea
- 67 votes · Michael supported this idea
- 503 votes · Michael supported this idea
- 75 votes · Michael supported this idea
- 72 votes · Michael supported this idea
- 165 votes · Michael supported this idea
- 270 votes · Michael supported this idea
- 319 votes · Michael supported this idea
- 256 votes · Michael supported this idea
  Yup, can confirm this has happened 5 or 6 times in less than a month of usage. Still working out the bugs, I guess...