
OpenGateLLM

Warning

The API is still in beta; major breaking changes may occur.

Production-ready API gateway for self-hosted LLMs, developed by the French Government. Fully open source, forever.

Feature Description
Gateway • OpenAI-compatible API
• Self-hosted model backends: vLLM, HuggingFace TEI, Ollama
• Commercial backend: OpenAI
• Full-stack GenAI API: chat, embeddings, transcription, RAG and OCR
Account services • SSO support
• Organization, project and key management
• Budget, usage and carbon footprint monitoring
Monitoring • Usage and carbon footprint monitoring
Privacy • No chat history storage

📊 Comparison


The table compares OpenGateLLM, LiteLLM and OpenRouter on: OpenAI compatibility, open source, free (all features), support for commercial models, support for self-hosted models, built-in RAG, and built-in OCR.

🚀 Quickstart


Deploy OpenGateLLM quickly with Docker, connected to our own free model, and start using it:

make quickstart

Note

It copies config.example.yml to config.yml and .env.example to .env if those files don't already exist.

Tip

Use make help to see all available commands.

Test the API:

curl -X POST "http://localhost:8000/v1/chat/completions" \
-H "Content-Type: application/json" \
-H "Authorization: Bearer changeme" \
-d '{"model": "albert-testbed", "messages": [{"role": "user", "content": "Hello, how are you?"}]}'

The default master API key is changeme.
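
Since the API is OpenAI-compatible, the same request can be issued from any HTTP client. A minimal Python sketch using only the standard library, with the endpoint, model name, and default master key taken from the quickstart above:

```python
import json
import urllib.request

API_URL = "http://localhost:8000/v1/chat/completions"  # default quickstart address
API_KEY = "changeme"  # default master API key

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build the same chat-completion request as the curl example above."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

def chat(model: str, prompt: str) -> str:
    """Send the request (requires the quickstart stack to be running)
    and return the assistant's reply."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With the stack running, `chat("albert-testbed", "Hello, how are you?")` returns the model's reply as a string.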

User interface

A user interface is available at: http://localhost:8501/playground

User: master
Password: changeme

Create a first user

make create-user

Configure your models and add features

Using a configuration file, you can connect your own models and add additional services to OpenGateLLM. Start by creating a configuration file and a dedicated .env file:

cp config.example.yml config.yml
cp .env.example .env

Check the configuration documentation to learn how to fill in your configuration file.

You can then set your environment variables in .env according to your needs.
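
As an illustration only, a config.yml wiring a self-hosted backend might look roughly like the sketch below. Every key name here is hypothetical; the configuration documentation is the authoritative reference for the actual schema.

```yaml
# Hypothetical sketch -- real key names are defined in the configuration documentation.
models:
  - name: my-chat-model        # model id exposed through the gateway
    type: text-generation
    providers:
      - type: vllm             # self-hosted vLLM backend
        url: http://vllm:8000  # internal address of the vLLM server
        key: my-backend-key    # credential for the backend, if any
```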

Start the services you need by running:

docker compose --env-file .env up <services_you_need> --detach 

For instance:

docker compose --env-file .env up api playground postgres redis elasticsearch --detach 

Alternative: use kubernetes

You can check our helm chart and instructions here: https://github.com/etalab-ia/albert-api-helm

📘 Tutorials


Explore practical use cases:

• Chat Completions
• Multi-Model Access
• Retrieval-Augmented Generation (RAG)
• Audio Transcriptions
• Optical Character Recognition (OCR)

🤝 Contribute


This project exists thanks to all the people who contribute. OpenGateLLM thrives on open-source contributions. Join our community!

Check out our Contribution Guide to get started.

🎖️ Sponsors


    DINUM · CentraleSupélec
