Ability to run local models
Some small models are capable of running locally on phones. This would be great for privacy, and it would also offer a free alternative to the more powerful paid LLMs.
49 votes
Baud commented:
You can already do this very easily using ollama, which has a nice GUI. Note that only powerful computers (e.g. a 32 GB MacBook or a PC with an expensive NVIDIA GPU) can run genuinely useful LLMs. Cloud computing is much more efficient.
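For anyone who wants to script against it: ollama serves a local REST API on port 11434 by default. Here is a minimal Python sketch, assuming the ollama server is running and a model such as `llama3.2` (an example name, substitute whatever you have pulled) is available locally:

```python
import requests

# Minimal sketch: query a locally running ollama server on its default port.
# Assumes a model has already been pulled, e.g. with `ollama pull llama3.2`.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",   # example model name; use any model you have pulled
        "prompt": "Why run LLMs locally?",
        "stream": False,       # ask for the complete response in one JSON object
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])  # the generated text
```

With `"stream": False` the server returns a single JSON object; leave streaming on if you want tokens as they are generated.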