
GraphBit - High Performance Agentic Framework

GraphBit - Developer-first, enterprise-grade LLM framework. | Product Hunt

Website | Docs | Discord


Type-Safe AI Agent Workflows with Rust Performance


Read this in other languages: 🇨🇳 简体中文 | 🇨🇳 繁體中文 | 🇪🇸 Español | 🇫🇷 Français | 🇩🇪 Deutsch | 🇯🇵 日本語 | 🇰🇷 한국어 | 🇮🇳 हिन्दी | 🇸🇦 العربية | 🇮🇹 Italiano | 🇧🇷 Português | 🇷🇺 Русский | 🇧🇩 বাংলা


GraphBit is an open-source agentic AI framework for developers who need deterministic, concurrent, and low-overhead execution.

Why GraphBit?

Efficiency decides who scales. GraphBit is built for developers who need deterministic, concurrent, and ultra-efficient AI execution without the overhead.

Built with a Rust core and a minimal Python layer, GraphBit delivers up to 68× lower CPU usage and 140× lower memory footprint than other frameworks, while maintaining equal or greater throughput.

It powers multi-agent workflows that run in parallel, persist memory across steps, self-recover from failures, and ensure 100% task reliability. GraphBit is built for production workloads, from enterprise AI systems to low-resource edge deployments.

Key Features

  • Tool Selection - LLMs intelligently choose tools based on descriptions
  • Type Safety - Strong typing through every execution layer
  • Reliability - Circuit breakers, retry policies, error handling, and fault recovery
  • Multi-LLM Support - OpenAI, Azure OpenAI, Anthropic, OpenRouter, DeepSeek, Replicate, Ollama, TogetherAI and more
  • Resource Management - Concurrency controls and memory optimization
  • Observability - Built-in tracing, structured logs, and performance metrics

Benchmark

GraphBit was built for efficiency at scale: not theoretical claims, but measured results.

Our internal benchmark suite compared GraphBit to leading Python-based agent frameworks across identical workloads.

Metric              GraphBit            Other Frameworks    Gain
CPU Usage           1.0× baseline       68.3× higher        ~68× CPU
Memory Footprint    1.0× baseline       140× higher         ~140× Memory
Execution Speed     ≈ equal / faster    –                   Consistent throughput
Determinism         100% success        Variable            Guaranteed reliability

GraphBit consistently delivers production-grade efficiency across LLM calls, tool invocations, and multi-agent chains.

Benchmark Demo

Watch the GraphBit Benchmark Demo

When to Use GraphBit

Choose GraphBit if you need:

  • Production-grade multi-agent systems that won't collapse under load
  • Type-safe execution and reproducible outputs
  • Real-time orchestration for hybrid or streaming AI applications
  • Rust-level efficiency with Python-level ergonomics

If you're scaling beyond prototypes or care about runtime determinism, GraphBit is for you.

Quick Start

Installation

We recommend installing GraphBit inside a virtual environment.

pip install graphbit

Quick Start Video Tutorial

Watch the Install GraphBit via PyPI | Full Example & Run Guide tutorial

Environment Setup

Set the API keys for the providers you want to use in your project:

# OpenAI (optional – required if using OpenAI models)
export OPENAI_API_KEY=your_openai_api_key_here

# Anthropic (optional – required if using Anthropic models)
export ANTHROPIC_API_KEY=your_anthropic_api_key_here

Security Note: Never commit API keys to version control. Always use environment variables or secure secret management.
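
For example, you can read the key from the environment at runtime instead of hard-coding it. This is a minimal sketch that mirrors the Basic Usage example below; the model name is only illustrative.

import os

from graphbit import LlmConfig

# Read the key from the environment; fail fast if it is missing
api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    raise RuntimeError("OPENAI_API_KEY is not set")

# The key never appears in source code or version control
config = LlmConfig.openai(api_key, "gpt-4o-mini")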

Basic Usage

import os

from graphbit import LlmConfig, Executor, Workflow, Node, tool

# Initialize and configure
config = LlmConfig.openai(os.getenv("OPENAI_API_KEY"), "gpt-4o-mini")

# Create executor
executor = Executor(config)

# Create tools with clear descriptions for LLM selection
@tool(_description="Get current weather information for any city")
def get_weather(location: str) -> dict:
    return {"location": location, "temperature": 22, "condition": "sunny"}

@tool(_description="Perform mathematical calculations and return results")
def calculate(expression: str) -> str:
    return f"Result: {eval(expression)}"

# Build workflow
workflow = Workflow("Analysis Pipeline")

# Create agent nodes
smart_agent = Node.agent(
    name="Smart Agent",
    prompt="What's the weather in Paris and calculate 15 + 27?",
    system_prompt="You are an assistant skilled in weather lookup and math calculations. Use tools to answer queries accurately.",
    tools=[get_weather, calculate]
)

processor = Node.agent(
    name="Data Processor",
    prompt="Process the results obtained from Smart Agent.",
    system_prompt="""You process and organize results from other agents.

    - Summarize and clarify key points
    - Structure your output for easy reading
    - Focus on actionable insights
    """
)

# Connect and execute
id1 = workflow.add_node(smart_agent)
id2 = workflow.add_node(processor)
workflow.connect(id1, id2)

result = executor.execute(workflow)
print(f"Workflow completed: {result.is_success()}")
print("\nSmart Agent Output: \n", result.get_node_output("Smart Agent"))
print("\nData Processor Output: \n", result.get_node_output("Data Processor"))

Building Your First Agent Workflow by GraphBit

Watch the Making Agent Workflow by GraphBit tutorial

Observability & Tracing

GraphBit Tracer captures and monitors LLM calls and AI workflows with minimal configuration. It wraps GraphBit LLM clients and workflow executors to trace prompts, responses, token usage, latency, and errors without changing your code.
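
Conceptually, the tracer wraps a call boundary and records what happened there. The plain-Python sketch below illustrates only that wrapping idea; it is not the GraphBit Tracer API, which is covered in the tutorial below.

import functools
import time

def trace_calls(fn):
    # Conceptual illustration of tracer-style wrapping: record status and
    # latency around a call without changing the call site.
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        status = "ok"
        try:
            return fn(*args, **kwargs)
        except Exception:
            status = "error"
            raise
        finally:
            elapsed_ms = (time.perf_counter() - start) * 1000
            print(f"{fn.__name__}: {status} in {elapsed_ms:.1f} ms")
    return wrapper

@trace_calls
def call_llm(prompt: str) -> str:
    # Stand-in for an LLM call; replace with a real client call.
    return f"echo: {prompt}"

call_llm("hello")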

Watch the GraphBit Observability & Tracing tutorial

High-Level Architecture

Three-tier design for reliability and performance:

  • Rust Core - Workflow engine, agents, and LLM providers
  • Orchestration Layer - Project management and execution
  • Python API - PyO3 bindings with async support

Python API Integrations

GraphBit provides a rich Python API for building and integrating agentic workflows:

  • LLM Clients - Multi-provider LLM integrations (OpenAI, Anthropic, Azure, and more)
  • Workflows - Define and manage multi-agent workflow graphs with state management
  • Nodes - Agent nodes, tool nodes, and custom workflow components
  • Executors - Workflow execution engine with configuration management
  • Tool System - Function decorators, registry, and execution framework for agent tools
  • Workflow Results - Execution results with metadata, timing, and output access
  • Embeddings - Vector embeddings for semantic search and retrieval
  • Workflow Context - Shared state and variables across workflow execution
  • Document Loaders - Load and parse documents from multiple formats (PDF, DOCX, TXT, JSON, CSV, XML, HTML)
  • Text Splitters - Split documents into chunks (character, token, sentence, recursive)

For the complete list of classes, methods, and usage examples, see the Python API Reference.
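
As a rough illustration of what the Text Splitters component does, the plain-Python sketch below chunks text by character count with overlap. It is not GraphBit's splitter API; see the Python API Reference for the actual classes and signatures.

def split_by_characters(text: str, chunk_size: int = 200, overlap: int = 20) -> list[str]:
    # Illustrative character splitter with overlap (not the GraphBit API).
    if chunk_size <= overlap:
        raise ValueError("chunk_size must be larger than overlap")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks

# Split a loaded document into overlapping chunks before embedding
chunks = split_by_characters("GraphBit is an open-source agentic AI framework. " * 20)
print(f"{len(chunks)} chunks")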

Ecosystem & Extensions

GraphBit's modular architecture supports external integrations:

Category            Examples
LLM Providers       OpenAI, Anthropic, Azure OpenAI, DeepSeek, Together, Ollama, OpenRouter, Fireworks, Mistral AI, Replicate, Perplexity, HuggingFace, AI21, Bytedance, xAI, and more
Vector Stores       Pinecone, Qdrant, Chroma, Milvus, Weaviate, FAISS, Elasticsearch, AstraDB, Redis, and more
Databases           PostgreSQL (PGVector), MongoDB, MariaDB, IBM DB2, Redis, and more
Cloud Platforms     AWS (Boto3), Azure, Google Cloud Platform, and more
Search APIs         Serper, Google Search, GitHub Search, GitLab Search, and more
Embedding Models    OpenAI Embeddings, Voyage AI, and more

Extensions are developed and maintained by the community.


Contributing to GraphBit

We welcome contributions. To get started, please see the Contributing file for development setup and guidelines.

GraphBit is built by a wonderful community of researchers and engineers.

Security

GraphBit is committed to maintaining security standards for our agentic framework. We recommend using environment variables for API keys, keeping GraphBit updated, and using proper secret management for production environments. If you discover a security vulnerability, please report it responsibly through GitHub Security or via email rather than creating a public issue.

For detailed reporting procedures and response timelines, see our Security Policy.

License

GraphBit is licensed under a three-tier model: Model A (Free Use) for individuals, academic institutions, and small teams (up to 10 employees/users), Model B (Free Trial) for 30-day evaluation, and Model C (Enterprise) for commercial/production use. Redistribution is prohibited under all models without an explicit Enterprise License.

For complete terms and conditions, see the Full License.

Copyright © 2023–2025 InfinitiBit GmbH. All rights reserved.
