Writing assistant: option to use a local Ollama server, when available
The built-in writing assistant (offline version) is a little … stupid … to say the least. It also seems to completely disregard the input language, opting to rewrite the entire thing in English.
I wanted to propose that you leave the feature set as is for people who aren't set up with alternatives, but add another option so that we can point it at our own Ollama server within our network, if available.
I run a variety of Llama models locally anyway, so I might as well use that (and the HTTP API Ollama provides), as opposed to having yet another assistant bundled within Proton Mail.
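For reference, a rough sketch of what such an option could boil down to on the client side, assuming a stock Ollama install on its default port (the host, model name, and prompt here are just examples I made up, not anything Proton ships):

```typescript
// Hypothetical call against Ollama's /api/generate endpoint.
// The URL and model would come from a user setting in the mail client.
const OLLAMA_URL = "http://localhost:11434/api/generate";

async function rewriteText(input: string): Promise<string> {
  const response = await fetch(OLLAMA_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.1", // example model; whatever is pulled locally
      prompt: `Rewrite the following text, keeping its original language:\n\n${input}`,
      stream: false, // single JSON response instead of a token stream
    }),
  });
  const data = await response.json();
  return data.response; // Ollama returns the completion in the `response` field
}
```

Nothing fancy needed, just a configurable endpoint where the bundled assistant currently sits.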
It would also give me a little more confidence in the privacy of the feature, and more control over it.