LocalAI is the free, Open Source OpenAI alternative. LocalAI acts as a drop-in replacement REST API that’s compatible with the OpenAI API specifications for local inferencing. It allows you to run LLMs, generate images and audio (and more) locally or on-prem with consumer-grade hardware, supporting multiple model families and architectures. It does not require a GPU. It is created and maintained by Ettore Di Giacinto.
Start the image with Docker to get a functional clone of OpenAI! 🚀:
docker run -p 8080:8080 --name local-ai -ti localai/localai:latest-aio-cpu
# Do you have an Nvidia GPU? Use one of these instead
# CUDA 11
# docker run -p 8080:8080 --gpus all --name local-ai -ti localai/localai:latest-aio-gpu-nvidia-cuda-11
# CUDA 12
# docker run -p 8080:8080 --gpus all --name local-ai -ti localai/localai:latest-aio-gpu-nvidia-cuda-12
See the 💻 Quickstart for all the options and ways you can run LocalAI!
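Once the container is up, the server answers on port 8080 with OpenAI-compatible endpoints. Here is a minimal sketch of a first request; the model name gpt-4 is an assumption (the AIO images ship preconfigured model aliases; substitute a model actually available in your instance):
# A minimal sketch: chat completion against the local server.
# "gpt-4" is an assumed model alias -- replace it with a model your instance serves.
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4", "messages": [{"role": "user", "content": "How are you?"}]}'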
Local, OpenAI drop-in alternative REST API. You own your data.
NO GPU required. NO Internet access required either.
Optional GPU acceleration is available. See also the build section.
Supports multiple models (you can list the ones your instance serves, as sketched after this list)
🏃 Once loaded the first time, it keeps models loaded in memory for faster inference
⚡ Doesn’t shell out, but uses bindings for faster inference and better performance.
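Because the API follows the OpenAI specification, you can check which models your instance serves with the standard model-listing endpoint. A minimal sketch, assuming the server is running locally on the default port:
# List the models the local instance currently serves
curl http://localhost:8080/v1/models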
LocalAI is focused on making AI accessible to anyone. Any contributions, feedback, and PRs are welcome!
Note that this started as just a fun weekend project by mudler, an attempt to create the necessary pieces for a full AI assistant like ChatGPT. The community is growing fast and we are working hard to make it better and more stable. If you want to help, please consider contributing (see below)!
If you have technical skills and want to contribute to development, have a look at the open issues. If you are new, check out the good-first-issue and help-wanted labels.
If you don’t have technical skills, you can still help by improving the documentation, adding examples, or sharing your user stories with our community; any help and contribution is welcome!