Open-source LLM router & AI cost optimizer. Routes simple prompts to cheap/local models, complex ones to premium — automatically. Drop-in OpenAI-compatible proxy for Claude Code, Codex, Cursor, OpenClaw. Saves 40-70% on AI API costs. Self-hosted, no middleman.
Self-hosted LLM gateway that routes requests across AI providers (OpenAI, Anthropic, Gemini, Mistral, Ollama) using intelligent multi-policy scoring — including an LLM-native routing policy. Drop-in compatible: just swap the base URL. No database required, built-in cost tracking, budget enforcement and multi-tenant isolation.
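The "drop-in, just swap the base URL" claim above can be sketched as follows. This is a minimal illustration, not any project's actual API: the gateway address, the bearer token, and the "auto" model alias are all hypothetical placeholders; the only assumption is that the gateway accepts the standard OpenAI chat-completions request shape.

```python
import json
from urllib.request import Request

# Hypothetical self-hosted gateway address; the real host/port depend on deployment.
BASE_URL = "http://localhost:8080/v1"

def chat_request(prompt: str, model: str = "auto") -> Request:
    """Build an OpenAI-shaped chat-completions request aimed at the gateway.

    "Drop-in" means the body and headers are identical to a direct OpenAI
    call; only the base URL changes. "auto" is an assumed alias that would
    let the router pick a provider.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": "Bearer gateway-key",  # key the gateway expects
            "Content-Type": "application/json",
        },
    )

req = chat_request("Summarize this in one line.")
```

Clients built on the OpenAI SDKs get the same effect by setting the SDK's `base_url` option (or the `OPENAI_BASE_URL` environment variable) to the gateway address.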
AI-powered semantic search router for SPAs: finds the best route by meaning, not keywords. Runs entirely in the browser via a Web Worker; zero backend required.
Local-first AI model router for serious agents. One endpoint for OpenClaw, Codex, Claude Code, Cursor, Ollama, NVIDIA NIM, and authorized provider fallback.
🧠 Smart AI chatbot that automatically routes queries to the optimal LLM based on complexity — saving up to 75% on API costs. Simple questions go to free local models (Ollama), while complex ones route to GPT-4o. Built with Next.js, TypeScript, and Turborepo.
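Complexity-based routing of the kind described above can be sketched with a simple heuristic. This is an illustrative assumption, not the project's actual logic: the word-count threshold, the marker keywords, and the model names are all placeholders.

```python
def route(prompt: str) -> str:
    """Pick a model tier from a cheap length/keyword complexity heuristic.

    Short, marker-free prompts go to an assumed free local model; anything
    long or containing "hard" markers goes to an assumed premium model.
    """
    hard_markers = ("analyze", "prove", "refactor", "multi-step", "architecture")
    score = len(prompt.split()) / 50 + sum(m in prompt.lower() for m in hard_markers)
    return "gpt-4o" if score >= 1 else "ollama/llama3"

# A trivial question stays on the local tier:
print(route("What's 2+2?"))  # → ollama/llama3
```

Real routers typically replace this heuristic with a classifier or an LLM-native scoring pass, but the cost-saving structure is the same: a cheap decision in front of an expensive call.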
AI-powered request-routing system built with FastAPI that directs user queries to the appropriate service using rule-based logic with fallback mechanisms, and includes a responsive web UI.
Aeonic is an open-source AI/AGI/LLM router and orchestration framework designed to power highly scalable, distributed intelligent systems and applications.