Guardian Forge

AI-Powered Tool Generation

Autonomous, self-extending MCP server that generates Python tools on-demand with enterprise-grade security auditing. Request tools in natural language, get production-ready, sandboxed code.

Multi-Agent Architecture Hybrid Security Audit MCP Protocol Native Self-Hosted

How Guardian Forge Works

From natural language request to production-ready tool in four automated steps.

1. Request Tool

Describe the tool you need in natural language via MCP protocol from Claude, GPT, or any AI assistant.

2. AI Generates

Engineer Agent creates Python code using your configured LLM with sandbox-safe patterns.

3. Security Audit

Guardian Agent performs a two-phase audit: Bandit static analysis (SAST) plus an LLM contextual review for vulnerabilities.

4. Approve & Deploy

Human approval via dashboard, then tool is live and available via MCP.

Multi-Agent System

Four specialized agents work together to generate secure, tested tools.

Manager

Orchestrator

Coordinates workflow, handles retries (max 3 attempts), and queues approved tools for deployment.
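
The Manager's orchestration can be pictured as a bounded retry loop. This is a hypothetical sketch, not Guardian Forge's actual implementation; the function and field names are illustrative, and only the 3-attempt limit comes from the description above.

```python
# Illustrative sketch of the Manager's retry loop (names are assumptions,
# not Guardian Forge's real API). Only MAX_ATTEMPTS = 3 comes from the docs.
MAX_ATTEMPTS = 3

def run_pipeline(generate, audit, test):
    """Try generate -> audit -> test up to MAX_ATTEMPTS times, then give up."""
    last_error = None
    for attempt in range(1, MAX_ATTEMPTS + 1):
        code = generate()
        ok, reason = audit(code)
        if not ok:
            last_error = f"audit failed: {reason}"
            continue  # regenerate and try again
        ok, reason = test(code)
        if ok:
            # Success: hand off to the human-approval queue.
            return {"status": "queued_for_approval", "attempts": attempt, "code": code}
        last_error = f"test failed: {reason}"
    return {"status": "rejected", "attempts": MAX_ATTEMPTS, "error": last_error}
```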

Engineer

Code Generator

Generates Python code via LLM using optimized prompts for sandbox-safe, production-ready output.

Guardian

Security Auditor

Two-phase audit: Bandit static analysis (SAST) followed by an LLM contextual review that catches vulnerabilities pattern-based scanners miss.
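
The two-phase audit above could be wired up along these lines. This is a hedged sketch under assumptions: the `bandit -q -f json` invocation is Bandit's real CLI, but `two_phase_audit`, `llm_review`, and the result shape are invented for illustration.

```python
# Hypothetical two-phase audit: Bandit SAST pass, then an LLM review.
# The bandit CLI flags are real; everything else is an illustrative sketch.
import json
import os
import subprocess
import tempfile

def bandit_scan(code: str) -> list:
    """Phase 1: run Bandit over the candidate code, return its findings."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    try:
        # `bandit -f json` emits machine-readable results on stdout.
        result = subprocess.run(
            ["bandit", "-q", "-f", "json", path],
            capture_output=True, text=True,
        )
        return json.loads(result.stdout).get("results", [])
    finally:
        os.unlink(path)

def two_phase_audit(code: str, sast=bandit_scan, llm_review=None) -> dict:
    """Reject on any SAST finding, then ask the LLM for a contextual verdict."""
    findings = sast(code)
    if findings:
        return {"approved": False, "phase": "sast", "findings": findings}
    verdict = llm_review(code) if llm_review else {"approved": True}
    return {"approved": bool(verdict.get("approved")), "phase": "llm", "findings": []}
```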

Tester

Validator

Validates that generated code executes correctly in a Docker sandbox with strict resource limits.

See It In Action

Clean, intuitive interface for tool generation and management.

Guardian Forge Chat Interface

Chat Interface

Request tools in natural language. MCP-connected for seamless AI integration.

Guardian Forge Dashboard

Live Dashboard

Real-time agent activity. Watch Engineer, Guardian, and Tester agents work.

Core Capabilities

Enterprise-grade features for secure AI tool generation.

Hybrid Security Auditing

Combines Bandit static analysis (SAST) with LLM contextual review, catching both pattern-based vulnerabilities and semantic security issues that traditional scanners miss.

Sandboxed Execution

All generated tools run in isolated Docker containers with strict resource limits, no network access, and 5-second timeout. Complete isolation from your infrastructure.
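
An isolated run along those lines might use docker flags like the following. The no-network and 5-second-timeout values come from the description above; the memory and CPU caps are assumptions for illustration, not Guardian Forge's actual limits.

```python
# Sketch of a docker command matching the isolation described above.
# --network none and the 5 s timeout are from the docs; the memory/CPU
# caps and image name are illustrative assumptions.
def sandbox_command(image: str, script_path: str) -> list:
    """Build a `docker run` command that executes one tool in isolation."""
    return [
        "docker", "run", "--rm",
        "--network", "none",            # no network access
        "--memory", "256m",             # assumed memory cap
        "--cpus", "0.5",                # assumed CPU cap
        "--read-only",                  # immutable container filesystem
        "-v", f"{script_path}:/tool.py:ro",
        image,
        "timeout", "5", "python", "/tool.py",  # hard 5-second timeout
    ]
```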

MCP Protocol Native

Native Model Context Protocol support (2024-11-05+ Streamable HTTP spec) for seamless integration with Claude, GPT, and other AI assistants.
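
For reference, the MCP session starts with a JSON-RPC `initialize` handshake; the envelope below follows the 2024-11-05 spec revision, with placeholder `clientInfo` values.

```python
# JSON-RPC `initialize` request per the 2024-11-05 MCP spec revision.
# The clientInfo name/version are placeholders, not real client values.
import json

initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# Serialize for the HTTP transport.
wire = json.dumps(initialize)
```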

Multi-User Isolation

Each authenticated user has their own isolated tool namespace and API keys. Per-user Redis namespacing ensures complete data separation.
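
Per-user Redis namespacing typically comes down to prefixing every key with the user's identity; this minimal sketch assumes a `gf:user:<id>:` key scheme, which is illustrative rather than Guardian Forge's actual layout.

```python
# Illustrative per-user Redis key namespacing. The "gf:user:<id>:" prefix
# scheme is an assumption, not Guardian Forge's documented key layout.
def user_key(user_id: str, *parts: str) -> str:
    """Prefix every Redis key with the owning user's namespace."""
    return ":".join(["gf", f"user:{user_id}", *parts])
```

Because every key carries the owner's id, one user can never read or clobber another user's entries, and `SCAN gf:user:alice:*` cleanly enumerates a single namespace.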

Self-Hosted

Run Guardian Forge entirely within your infrastructure: Docker Compose deployment, your own LLM keys, and your data stays yours. Air-gapped environments are supported.

Multi-LLM Support

Works with OpenAI, Anthropic, Google Gemini, Azure OpenAI, Ollama (local), and OpenRouter. Hot-swap providers without restart.
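
Hot-swapping providers without a restart usually means routing completions through a mutable registry; this is a generic sketch of that pattern, not Guardian Forge's actual provider code.

```python
# Generic provider-registry pattern for restart-free hot-swapping.
# Names and structure are illustrative, not Guardian Forge's internals.
class ProviderRegistry:
    """Hold one completion client per provider; switch() changes the active one."""

    def __init__(self):
        self._providers = {}
        self._active = None

    def register(self, name, complete_fn):
        self._providers[name] = complete_fn

    def switch(self, name):
        # Hot-swap: subsequent complete() calls hit the new provider.
        if name not in self._providers:
            raise KeyError(f"unknown provider: {name}")
        self._active = name

    def complete(self, prompt):
        return self._providers[self._active](prompt)
```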

Request a Tool

Simple natural language requests create production-ready tools.

{
  "description": "Get current weather for a city using OpenWeather API",
  "function_name": "get_weather",
  "parameters": {
    "type": "object",
    "properties": {
      "city": {"type": "string"}
    },
    "required": ["city"]
  }
}

Guardian Forge generates, audits, tests, and deploys a secure weather tool automatically.
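
A tool generated from that request could look roughly like this. The OpenWeather endpoint and query parameters are real, but this is a hedged sketch; the code Guardian Forge actually emits will differ.

```python
# Hedged sketch of a generated get_weather tool. The OpenWeather endpoint
# and parameters are real; the structure is illustrative, not the emitted code.
import json
import urllib.request
from urllib.parse import urlencode

API_BASE = "https://api.openweathermap.org/data/2.5/weather"

def build_url(city: str, api_key: str) -> str:
    """Assemble the OpenWeather request URL for a city."""
    query = urlencode({"q": city, "appid": api_key, "units": "metric"})
    return f"{API_BASE}?{query}"

def get_weather(city: str, api_key: str) -> dict:
    """Fetch current weather; the 5 s timeout mirrors the sandbox limit."""
    with urllib.request.urlopen(build_url(city, api_key), timeout=5) as resp:
        return json.load(resp)
```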

Get Guardian Forge

Self-hosted deployment with your own LLM keys. Complete control over your AI tool generation infrastructure.

github.com/virtuallifehub-del/guardian-forge-dist