MLC LLM

Model Inference · Open Source · Verified

An open-source framework for compiling and running LLMs natively across a wide range of hardware, including mobile, edge, and in-browser targets. It uses machine learning compilation to achieve near-native performance with a minimal runtime and no heavy external dependencies.
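A minimal sketch of getting started, based on MLC LLM's documented pip wheels and `mlc_llm chat` CLI; the model identifier is illustrative, and any MLC-compiled model from Hugging Face should work in its place:

```shell
# Install the prebuilt MLC LLM Python package (nightly wheels hosted at mlc.ai)
python -m pip install --pre -U -f https://mlc.ai/wheels mlc-llm-nightly mlc-ai-nightly

# Start an interactive chat with a precompiled, quantized model
# (model name is illustrative; requires a supported GPU or CPU backend)
mlc_llm chat HF://mlc-ai/Llama-3-8B-Instruct-q4f16_1-MLC
```

The same engine is also exposed through a Python API and an OpenAI-compatible REST server for embedding in applications.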

Price

From $0