Lumo Client (or API) with secure local execution of Lumo‑generated Python code for graphical output
When users request visualizations, Lumo frequently replies with Python code snippets (e.g., matplotlib plotting code). Users can then copy and paste the code into their own environment to render the graphics.
It would be interesting to have some kind of Lumo‑Python bridge that delivers the generated Python code to a local client. The client would run the code in a sandboxed, offline Python/IPython/Jupyter/... session and return the rendered image directly to the chat, as sketched below.
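A minimal sketch of what such a local runner could look like: it takes a Python snippet (as returned by Lumo in the chat), executes it in a separate process with a headless matplotlib backend, and reads back the rendered image. All names here are illustrative placeholders, not an official Lumo API.

```python
import subprocess
import sys
import tempfile
from pathlib import Path

# Prepended to the snippet so the figure renders off-screen instead of opening a window.
_PREAMBLE = """\
import matplotlib
matplotlib.use("Agg")  # headless backend, no display needed
import matplotlib.pyplot as plt
"""

# Appended so the resulting figure is written to a known location.
_POSTAMBLE = 'plt.savefig({out_path!r}, dpi=150, bbox_inches="tight")\n'


def run_snippet(code: str, timeout: int = 30) -> bytes:
    """Execute a Lumo-generated plotting snippet and return the PNG bytes."""
    with tempfile.TemporaryDirectory() as tmp:
        out_path = str(Path(tmp) / "figure.png")
        script = Path(tmp) / "snippet.py"
        script.write_text(
            _PREAMBLE + code + "\n" + _POSTAMBLE.format(out_path=out_path)
        )

        # Run in a child interpreter so the snippet cannot touch the chat
        # client's own process; a real implementation would add OS-level
        # sandboxing (containers, seccomp, firejail, ...) on top of this.
        subprocess.run(
            [sys.executable, "-I", str(script)],  # -I: isolated mode
            check=True,
            timeout=timeout,
            cwd=tmp,
        )
        return Path(out_path).read_bytes()
```

The returned PNG bytes could then be displayed inline in the chat window, so the user never has to leave the conversation.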
In practice, this would be a local chat client application that communicates with Proton servers and executes Lumo‑generated Python code on the user's machine. No data leaves the device, preserving confidentiality in line with privacy‑first principles, and a sandboxed Python session would prevent possible side effects (see the resource-limit sketch below). Users would then be able to get visual answers from Lumo in a way that is both efficient and private.
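As one illustration of what "sandboxed" could mean, the child process could at least be capped in CPU time and memory before it runs the snippet. This is only a partial measure (it does not block filesystem or network access, which would need OS-level isolation) and uses the POSIX-only `resource` module:

```python
import resource
import subprocess
import sys


def _limit_resources() -> None:
    # Runs in the child just before exec(): 10 s of CPU, ~512 MB of address space.
    resource.setrlimit(resource.RLIMIT_CPU, (10, 10))
    resource.setrlimit(resource.RLIMIT_AS, (512 * 1024**2, 512 * 1024**2))


def run_limited(script_path: str, timeout: int = 30) -> None:
    """Run a generated script with basic resource limits applied."""
    subprocess.run(
        [sys.executable, "-I", script_path],
        check=True,
        timeout=timeout,
        preexec_fn=_limit_resources,  # POSIX only
    )
```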
This could be implemented as an independent client app, or (perhaps even better) as an OS-agnostic Jupyter kernel to take advantage of the notebook interface, essentially providing a notebook-integrated Python API to Lumo.
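For the notebook variant, one lightweight option (short of a full custom kernel) would be an IPython cell magic: the cell body is the natural-language request, the magic asks Lumo for code, shows it for review, and runs it so the figure renders inline. This is a sketch meant to be run inside a notebook session; `ask_lumo` is a hypothetical placeholder for whatever client/API Proton would expose.

```python
from IPython import get_ipython
from IPython.core.magic import register_cell_magic


def ask_lumo(prompt: str) -> str:
    """Placeholder: return Python code generated by Lumo for the prompt."""
    raise NotImplementedError("hypothetical Lumo client call")


@register_cell_magic
def lumo(line, cell):
    """Usage: %%lumo  (cell body = the natural-language request)."""
    code = ask_lumo(cell)
    print(code)                   # let the user review the generated code first
    get_ipython().run_cell(code)  # then execute it in the current notebook
```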