Multi-Agent AI Framework
An advanced orchestration system where a MainAgent intelligently routes tasks to specialized sub-agents (Codebase, GitHub, Google Workspace, Frontend, Search, Hugging Face) using a backend-aware execution model.

Overview
A modular and extensible multi-agent AI framework built with Next.js and LangChain. A central MainAgent interprets user intent and delegates tasks to specialized agents: CodebaseAgent for repo interaction, GitHubAgent for issue/PR management, GoogleWorkspaceAgent for productivity tools, FrontendAgent for UI generation, SearchAgent for real-time information, and HuggingFaceAgent for Hugging Face integration. The system supports flexible backend execution (OpenAI, Anthropic, Google Gemini, Ollama, LM Studio, llama.cpp) and features a modern chat interface with multimodal input support.
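The delegation step can be sketched in TypeScript. This is an illustrative sketch only: the agent names come from this README, but the `routeTask` function, the keyword heuristics (standing in for the actual LLM-based intent classifier), and the interfaces are assumptions, not the project's real implementation.

```typescript
// Hypothetical sketch of MainAgent's intent-based routing.
// Agent names mirror the README; routing rules are illustrative assumptions.
type AgentName =
  | "CodebaseAgent"
  | "GitHubAgent"
  | "GoogleWorkspaceAgent"
  | "FrontendAgent"
  | "SearchAgent"
  | "HuggingFaceAgent";

// Keyword heuristics standing in for the LLM-based intent classifier.
const routes: Array<{ pattern: RegExp; agent: AgentName }> = [
  { pattern: /\b(issue|pull request|pr)\b/i, agent: "GitHubAgent" },
  { pattern: /\b(repo|refactor|codebase)\b/i, agent: "CodebaseAgent" },
  { pattern: /\b(doc|sheet|calendar|email)\b/i, agent: "GoogleWorkspaceAgent" },
  { pattern: /\b(ui|component|page)\b/i, agent: "FrontendAgent" },
  { pattern: /\b(model|dataset|hugging face)\b/i, agent: "HuggingFaceAgent" },
];

// Fall back to SearchAgent for open-ended queries.
export function routeTask(userMessage: string): AgentName {
  for (const { pattern, agent } of routes) {
    if (pattern.test(userMessage)) return agent;
  }
  return "SearchAgent";
}
```

In the real system the routing decision would come from the MainAgent's model call rather than regexes; the point here is only the dispatch shape: classify intent, then hand the task to exactly one specialized agent.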
Key Features
Intelligent Routing: MainAgent dynamically routes tasks to specialized sub-agents based on user intent
Specialized Agents: Includes Codebase, GitHub, Google Workspace, Frontend, Hugging Face, and Search agents
Backend Flexibility: Switch between cloud APIs (OpenAI, Anthropic, Google Gemini) and local inference backends (Ollama, LM Studio, llama.cpp)
Multimodal Interface: Support for text and image inputs with a modern, responsive chat UI
Deployment Ready: Docker support for local inference; optimized for Vercel deployment
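The backend flexibility above can be sketched as a small resolver. This is a minimal sketch under stated assumptions: the `INFERENCE_BACKEND`-style env value, the `Backend` shape, and the choice of default are hypothetical, though the base URLs are the well-known defaults for each service (Ollama on port 11434, llama.cpp's server on 8080).

```typescript
// Illustrative backend-aware execution config; names and env handling
// are assumptions, not the project's actual configuration.
type Backend = {
  name: string;
  baseUrl: string;
  kind: "cloud" | "local";
};

const backends: Record<string, Backend> = {
  openai: { name: "OpenAI", baseUrl: "https://api.openai.com/v1", kind: "cloud" },
  anthropic: { name: "Anthropic", baseUrl: "https://api.anthropic.com", kind: "cloud" },
  ollama: { name: "Ollama", baseUrl: "http://localhost:11434", kind: "local" },
  llamacpp: { name: "llama.cpp", baseUrl: "http://localhost:8080/v1", kind: "local" },
};

// Resolve the active backend from a config value (e.g. an env var),
// falling back to a local default when none is set.
export function resolveBackend(envValue?: string): Backend {
  const key = (envValue ?? "ollama").toLowerCase();
  const backend = backends[key];
  if (!backend) {
    throw new Error(`Unknown inference backend: ${key}`);
  }
  return backend;
}
```

Keeping every backend behind one resolved `baseUrl` is what lets the agents switch between cloud APIs and local servers without changing agent code.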