LessIsFORe

Less is FORe

The “Zero-Install” Agentic RAG in a single HTML file.

🚀 Try the Live Demo


📖 The Problem: Corporate “Software Walls”

In many corporate environments, you have a local LLM backend (like Ollama or a private API) but zero rights to install Python, Node.js, or Docker. You are stuck with a browser and your documents.

💡 The Solution: A Portable AI Gateway

Less is FORe is a minimalist, agentic AI assistant contained entirely in a single .html file.

Just download it, open it in Chrome/Edge, and start chatting with your local folders.
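How can a plain HTML page read your local folders with nothing installed? Chromium browsers (Chrome, Edge, Brave) expose the File System Access API, including `window.showDirectoryPicker()`. A minimal sketch of that mechanism (the helper names and the extension list are illustrative, not the project's actual code):

```javascript
// Pure helper: keep only files we can treat as text (extension list is an assumption).
function isTextLikeFile(name) {
  return /\.(txt|md|html|csv|json|js|py)$/i.test(name);
}

// Browser-only: ask the user for a folder, then collect readable file contents.
// window.showDirectoryPicker() and FileSystemDirectoryHandle.entries() are
// standard File System Access API calls available in Chromium browsers.
async function readFolderTexts() {
  const dir = await window.showDirectoryPicker(); // prompts the user to pick a folder
  const docs = [];
  for await (const [name, handle] of dir.entries()) {
    if (handle.kind === "file" && isTextLikeFile(name)) {
      const file = await handle.getFile();
      docs.push({ name, text: await file.text() });
    }
  }
  return docs;
}
```

Because the folder is picked explicitly by the user, the page never gets silent access to the rest of the disk.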


✨ Key Features


🛠️ Getting Started

  1. Download the LessIsFORe.html file.
  2. Open it in a modern browser (Chrome, Edge, or Brave).
  3. Connect your OpenAI-compatible API (Ollama, vLLM, OpenAI).
  4. Select a folder and start querying.
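Step 3 above relies on the OpenAI-compatible chat-completions convention (`POST {base}/v1/chat/completions` with a JSON body of `model` and `messages`), which Ollama, vLLM, and OpenAI all serve. A sketch of such a request, assuming Ollama's default base URL and a hypothetical model name:

```javascript
// Build an OpenAI-compatible chat-completions request.
// The endpoint path and payload shape follow the OpenAI API convention.
function buildChatRequest(baseUrl, model, messages) {
  return {
    url: `${baseUrl.replace(/\/$/, "")}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Send it and return the assistant's reply text.
async function chat(baseUrl, model, messages) {
  const { url, options } = buildChatRequest(baseUrl, model, messages);
  const res = await fetch(url, options);
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

// Example (base URL is Ollama's default; "llama3" is a placeholder model):
// chat("http://localhost:11434", "llama3", [{ role: "user", content: "Hello" }])
//   .then(console.log);
```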

Quick Demo

Less is FORe Demo


🛡️ Security & Privacy


🔧 Troubleshooting

CORS Issues with Local APIs

If you’re using Ollama or another local API and getting CORS (Cross-Origin Resource Sharing) errors in the browser console, you need to configure the API to allow browser requests:

For Ollama:

```shell
export OLLAMA_ORIGINS="*"
ollama serve
```

If Ollama was already running, restart it so the new environment variable takes effect, then try again. This is the most common issue when using a pure HTML application with a local backend.
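Why this setting works: the browser attaches your page's origin to each request and only hands the response to the page if the server's `Access-Control-Allow-Origin` header matches it. Note that a page opened directly from `file://` has an opaque origin serialized as `"null"`, which is why the wildcard is the practical choice here. A simplified sketch of the origin check (the browser performs this automatically; the function is purely illustrative):

```javascript
// Simplified sketch of the origin check a browser applies to a response.
// Real CORS also covers methods, headers, and credentials; this shows only
// the origin-matching part that OLLAMA_ORIGINS controls.
function originAllowed(allowOriginHeader, pageOrigin) {
  // OLLAMA_ORIGINS="*" makes the server reply with
  // "Access-Control-Allow-Origin: *", which matches any page origin.
  return allowOriginHeader === "*" || allowOriginHeader === pageOrigin;
}
```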


🔗 LLM Backend & Proxying

This file acts as the Frontend. It requires an OpenAI-compatible API to function.

For a simple, configurable proxy that handles your LLM backends and CORS management, we recommend looking at the MunchFORsen project. It pairs perfectly with this UI to provide a full-stack, corporate-compatible AI experience.


Acknowledgement

This project received funding from the French “IA Cluster” program within the Artificial and Natural Intelligence Toulouse Institute (ANITI) and from the “France 2030” program within IRT Saint Exupery. The authors gratefully acknowledge the support of the FOR projects.

📜 License

This project is open source under the MIT License.