Baud
My feedback
7 results found
- 8 votes · Baud shared this idea
- 27 votes · Baud supported this idea
- 49 votes
- 109 votes · Baud supported this idea
- 134 votes · Baud supported this idea
- 196 votes · Baud supported this idea
- 393 votes · Baud supported this idea
You can already do this very easily using Ollama, which has a nice GUI. Note that only powerful computers (e.g. a 32 GB MacBook, or a PC with an expensive NVIDIA GPU) can run genuinely useful LLMs locally. Cloud computing is much more efficient.
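As a minimal sketch of the Ollama workflow, assuming Ollama is already installed (the model name `llama3` is just an example; any model from the Ollama library works the same way):

```shell
# Download an example model (several GB; the machine needs enough RAM/VRAM to run it)
ollama pull llama3

# Ask a one-off question from the command line
ollama run llama3 "Why do local LLMs need so much memory?"

# Show which models are installed locally
ollama list
```

Running `ollama run` with no prompt argument instead opens an interactive chat session in the terminal.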