In many corporate environments, you have a local LLM backend (like Ollama or a private API) but zero rights to install Python, Node.js, or Docker. You are stuck with a browser and your documents.
Less is FORe is a minimalist, agentic AI assistant contained entirely within one single .html file.
Just download it, open it in Chrome/Edge, and start chatting with your local folders.
It can read .pdf, .docx, .pptx, and plain-text files. Everything ships in the single LessIsFORe.html file.
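As a rough sketch of how a single-file app can browse local folders, Chrome and Edge expose the File System Access API (`window.showDirectoryPicker`). The helper names and the exact extension list below are illustrative, not taken from the project's source:

```javascript
// Extensions the UI cares about (assumed list; .txt/.md stand in for "text files").
const SUPPORTED = [".pdf", ".docx", ".pptx", ".txt", ".md"];

function isSupported(name) {
  // Case-insensitive check of the file extension.
  return SUPPORTED.some((ext) => name.toLowerCase().endsWith(ext));
}

async function listSupportedFiles() {
  // Prompts the user to pick a folder; available in Chrome/Edge only.
  const dir = await window.showDirectoryPicker();
  const files = [];
  for await (const entry of dir.values()) {
    if (entry.kind === "file" && isSupported(entry.name)) {
      files.push(entry.name);
    }
  }
  return files;
}
```

Because the picker is user-initiated, no installation or special permissions are needed beyond the browser itself.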
If you’re using Ollama or another local API and getting CORS (Cross-Origin Resource Sharing) errors in the browser console, you need to configure the API to allow browser requests:
For Ollama:
export OLLAMA_ORIGINS="*"
ollama serve
Then restart Ollama and try again. This is the most common issue when using a pure HTML application with a local backend.
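On Windows, where `export` is unavailable, the equivalent is to set the variable persistently and then restart the Ollama app (a sketch; adjust if your install differs):

```shell
# Windows (cmd or PowerShell): allow browser origins, then restart Ollama
setx OLLAMA_ORIGINS "*"
```

Note that `"*"` allows any origin; in a stricter environment you can list specific origins instead.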
This file acts as the Frontend. It requires an OpenAI-compatible API to function.
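"OpenAI-compatible" means the backend accepts the standard `/v1/chat/completions` request shape. A minimal sketch of the call the frontend makes, assuming Ollama's default local port and a placeholder model name (both are assumptions, not project settings):

```javascript
// Assumed local endpoint; substitute your own backend URL and model.
const BASE_URL = "http://localhost:11434/v1";

function buildChatRequest(userMessage, model = "llama3") {
  // Standard OpenAI-compatible chat payload: model name plus a message list.
  return {
    model,
    messages: [{ role: "user", content: userMessage }],
    stream: false,
  };
}

async function chat(userMessage) {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(userMessage)),
  });
  // A CORS misconfiguration surfaces as a network error before this point.
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```

Any backend that answers this request format (Ollama, vLLM, a private gateway) should work unchanged.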
For a simple, configurable proxy that handles your LLM backends, CORS management, and OpenAI compatibility, we recommend looking at the MunchFORsen project. It pairs perfectly with this UI to provide a full-stack, corporate-compatible AI experience.
This project received funding from the French “IA Cluster” program within the Artificial and Natural Intelligence Toulouse Institute (ANITI) and from the “France 2030” program within IRT Saint Exupery. The authors gratefully acknowledge the support of the FOR projects.
This project is open source, under the MIT license.