Aidena

Guardrails AI

Guardrails & Safety · Open Source · Verified

Open-source framework for adding structural and semantic validation guardrails to LLM outputs. Provides a hub of reusable validators for output validation, PII detection, hallucination prevention, and policy enforcement. Best suited for developers building production GenAI apps that need runtime safety checks across any LLM provider.
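To make the validator idea concrete, here is a minimal, library-free sketch of the pattern such frameworks generalize: a validator inspects an LLM output at runtime and either passes it through or reports a failure. This is illustrative only; the function name and the simple email regex are assumptions for the example, and Guardrails AI's actual API (validators installed from its hub and attached to a guard object) is different.

```python
import re

# Naive email pattern used purely for illustration; real PII detection
# is considerably more involved.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def validate_no_pii(llm_output: str) -> tuple[bool, str]:
    """Return (ok, message). Fails if the output leaks an email address."""
    if EMAIL_RE.search(llm_output):
        return False, "PII detected: output contains an email address"
    return True, "ok"

# A guardrail like this runs between the model and the user:
ok, msg = validate_no_pii("Contact me at alice@example.com")
# ok is False here: the email address trips the PII check
```

In a production setup, many such checks (PII, hallucination, policy) are composed and applied to every model response, with failures triggering a retry, a redaction, or an exception.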

Price

From $0 per month

License: Apache-2.0