LocalAI

Model Inference · Open Source · Verified

Open-source, self-hosted, OpenAI API-compatible inference server for running LLMs, image generation, and audio models on consumer hardware. It supports multiple backends with both CPU and GPU inference and has no cloud dependency. Best suited for developers who need private, local AI inference with drop-in OpenAI API compatibility.
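As a sketch of what "drop-in OpenAI API compatibility" means in practice, a chat-completion request to a LocalAI server has the same shape as one sent to the OpenAI API; only the base URL changes. The port below is LocalAI's default, and the model name is an assumption (it depends on which model you have installed):

```python
import json
import urllib.request

# OpenAI-style chat-completion payload; the model name is an
# assumption -- use whichever model is loaded in your LocalAI server.
payload = {
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,
}

# LocalAI exposes the same /v1/chat/completions route as OpenAI;
# only the host changes (default port 8080 shown, adjust as needed).
request = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once a LocalAI server is running locally:
# with urllib.request.urlopen(request) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the route and payload match the OpenAI API, existing OpenAI client libraries can typically be pointed at a LocalAI instance by overriding their base URL.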

Price

Free ($0)

License: MIT