Graphiti - Building real-time knowledge graphs for AI agents

Written by Silas Grey
Updated on: June 24, 2025

The Graphiti framework creates dynamic knowledge graphs for AI agents, opening a new chapter in interactive AI applications.

Core content:
1. The core functions and application scenarios of the Graphiti framework
2. How Graphiti differs from traditional RAG approaches, and its advantages
3. How Graphiti powers the memory layer of Zep AI agents, and the significance of its open-sourcing


⭐Tips

Check out Graphiti's new MCP server [1]! It provides powerful knowledge graph-based memory capabilities for Claude, Cursor, and other MCP clients.

Introduction to Graphiti

Graphiti is a framework for building and querying time-aware knowledge graphs, tailored for AI agents operating in dynamic environments. Unlike traditional retrieval-augmented generation (RAG) approaches, Graphiti continuously integrates user interactions, structured and unstructured enterprise data, and external information into a coherent, queryable graph. The framework supports incremental data updates, efficient retrieval, and precise historical queries without recomputing the entire graph, making it ideal for developing interactive, context-aware AI applications.

With Graphiti, you can:

- Integrate and maintain dynamic user interactions and business data.
- Facilitate state-based reasoning and task automation.
- Query complex, evolving data using semantic, keyword, and graph-based search methods.

A knowledge graph is a network of interconnected facts, such as "Kendra likes Adidas shoes". Each fact is a "triplet" consisting of two entities or nodes ("Kendra", "Adidas shoes") and their relationship or edge ("likes"). Knowledge graphs have been widely explored in information retrieval. Graphiti is unique in that it can automatically build a knowledge graph while handling changing relationships and maintaining historical context.
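Conceptually, each fact can be modeled as a subject-predicate-object triplet. A minimal Python sketch of the idea (the `Triplet` class is illustrative only, not part of Graphiti's API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    """One fact: two entity nodes joined by a relationship edge."""
    subject: str    # entity node, e.g. "Kendra"
    predicate: str  # relationship edge, e.g. "likes"
    obj: str        # entity node, e.g. "Adidas shoes"

fact = Triplet("Kendra", "likes", "Adidas shoes")
print(f"{fact.subject} --[{fact.predicate}]--> {fact.obj}")
```

A knowledge graph is then simply a collection of such triplets whose entities are shared across facts.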

Graphiti and Zep Memory

Graphiti provides core support for Zep’s AI agent memory layer.

Using Graphiti, we demonstrate that Zep is the state of the art in AI agent memory [2].

Read our paper: Zep: A Temporal Knowledge Graph Architecture for Agentic Memory [3].

We are excited to open source Graphiti and believe its potential goes far beyond AI memory applications.

[Figure: Agent Memory]

Why Graphiti?

Traditional RAG approaches often rely on batch processing and static data summaries, which makes them less efficient when dealing with frequently changing data. Graphiti addresses these challenges by providing the following capabilities:

- Real-time incremental updates: new data fragments are integrated instantly, with no batch recomputation required.
- Bi-temporal model: explicitly tracks when events occurred and when they were ingested, allowing precise point-in-time queries.
- Efficient hybrid retrieval: combines semantic embeddings, keyword search (BM25), and graph traversal for low-latency queries without relying on LLM summarization.
- Custom entity definitions: supports flexible ontology creation and developer-defined entities through simple Pydantic models.
- Scalability: efficiently manages large datasets through parallel processing, suitable for enterprise environments.

Differences between Graphiti and GraphRAG

| Aspect | GraphRAG | Graphiti |
| --- | --- | --- |
| Primary use | Static document summarization | Dynamic data management |
| Data handling | Batch processing | Continuous, incremental updates |
| Knowledge structure | Entity clusters and community summaries | Episodic data, semantic entities, communities |
| Retrieval method | Sequential LLM summarization | Hybrid semantic, keyword, and graph-based search |
| Adaptability | Low | High |
| Temporal handling | Basic timestamp tracking | Explicit bi-temporal tracking |
| Conflict resolution | LLM-driven summarization judgments | Temporal edge invalidation |
| Query latency | Seconds to tens of seconds | Typically sub-second |
| Custom entity types | No | Yes, customizable |
| Scalability | Moderate | High; optimized for large datasets |

Graphiti is specifically designed for dynamic, frequently updated datasets, making it particularly suitable for applications that require real-time interactivity and precise historical queries.

Install

System Requirements:

- Python 3.10 or higher
- Neo4j 5.26 or higher (serves as the embeddings storage backend)
- OpenAI API key (for LLM inference and embeddings)

Important Tips

Graphiti works best with LLM services that support structured outputs (such as OpenAI and Gemini). Using other services may result in incorrect output schemas and data ingestion failures. This is especially true when using smaller models.

Optional:

Google Gemini, Anthropic, or Groq API key (for other LLM providers)

Tips

The easiest way to install Neo4j is through the Neo4j Desktop [4]. It provides a user-friendly interface for managing Neo4j instances and databases.

Install Graphiti

```bash
pip install graphiti-core
# or
poetry add graphiti-core
```

You can also install the optional LLM provider as an additional feature:

```bash
# Install with Anthropic support
pip install graphiti-core[anthropic]

# Install with Groq support
pip install graphiti-core[groq]

# Install with Google Gemini support
pip install graphiti-core[google-genai]

# Install support for multiple providers
pip install graphiti-core[anthropic,groq,google-genai]
```

Quick Start

Important Tips

Graphiti uses OpenAI for LLM inference and embeddings. Make sure you have set OPENAI_API_KEY in your environment. Anthropic and Groq LLM inference are also supported, and other LLM providers may work via an OpenAI-compatible API.

For a complete working example, see the Quickstart example [5] in the examples directory. This example shows how to:

1. Connect to the Neo4j database
2. Initialize Graphiti indexes and constraints
3. Add episodes to the graph (both text and structured JSON)
4. Use hybrid search to find relationships (edges)
5. Re-rank search results using graph distance
6. Find nodes using predefined search recipes

The sample is fully documented with clear explanations of each feature and includes a detailed README providing setup instructions and next steps.
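The steps above can be sketched roughly as follows. This is a hedged outline, not a drop-in script: method names and parameters may vary between graphiti-core versions, and it assumes a local Neo4j instance plus an OPENAI_API_KEY in the environment (imports are placed inside the function so the sketch can be read even without graphiti-core installed):

```python
import asyncio
from datetime import datetime, timezone

async def main() -> None:
    from graphiti_core import Graphiti
    from graphiti_core.nodes import EpisodeType

    # 1. Connect to the Neo4j database
    graphiti = Graphiti("bolt://localhost:7687", "neo4j", "password")
    try:
        # 2. Initialize Graphiti indexes and constraints
        await graphiti.build_indices_and_constraints()

        # 3. Add a text episode to the graph
        await graphiti.add_episode(
            name="preferences-1",
            episode_body="Kendra likes Adidas shoes.",
            source=EpisodeType.text,
            source_description="chat message",
            reference_time=datetime.now(timezone.utc),
        )

        # 4. Hybrid search for relationships (edges)
        results = await graphiti.search("What shoes does Kendra like?")
        for edge in results:
            print(edge.fact)
    finally:
        await graphiti.close()

# To run against a live Neo4j instance: asyncio.run(main())
```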

MCP Server

The mcp_server directory contains Graphiti's Model Context Protocol (MCP) server implementation. This server allows AI assistants to interact with Graphiti's knowledge graph capabilities through the MCP protocol.

Key features of the MCP server include:

- Episode management (add, retrieve, delete)
- Entity management and relationship handling
- Semantic and hybrid search capabilities
- Group management for organizing related data
- Graph maintenance operations

The MCP server can be deployed using Docker and Neo4j, making it easier to integrate Graphiti into your AI assistant workflow.

For detailed setup instructions and usage examples, see the MCP server README file [6].

REST Services

The server directory contains an API service for interacting with Graphiti. It is built with FastAPI.

For more information, see the server's README file.

Optional environment variables

In addition to the Neo4j and OpenAI-compatible credentials, Graphiti has a few optional environment variables. If you are using one of our supported models, such as an Anthropic or Voyage model, the corresponding environment variables must be set.

USE_PARALLEL_RUNTIME is an optional boolean variable that can be set to true if you wish to enable Neo4j's parallel runtime feature for several of Graphiti's search queries. Note that this feature is not supported on Neo4j Community Edition or smaller AuraDB instances, so it is off by default.
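For example (shown in Python for consistency with the other snippets; the variable can equally be exported in your shell or a .env file before Graphiti is initialized):

```python
import os

# Enable Neo4j's parallel runtime for Graphiti's search queries.
# Off by default; not supported on Neo4j Community Edition or small AuraDB instances.
os.environ["USE_PARALLEL_RUNTIME"] = "true"
print(os.environ["USE_PARALLEL_RUNTIME"])
```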

Using Graphiti with Azure OpenAI

Graphiti supports Azure OpenAI for LLM inference and embedding. To use Azure OpenAI, you need to configure the LLM client and embedder with your Azure OpenAI credentials.

```python
from openai import AsyncAzureOpenAI
from graphiti_core import Graphiti
from graphiti_core.llm_client import OpenAIClient
from graphiti_core.embedder.openai import OpenAIEmbedder, OpenAIEmbedderConfig
from graphiti_core.cross_encoder.openai_reranker_client import OpenAIRerankerClient

# Azure OpenAI configuration
api_key = "<your-api-key>"
api_version = "<your-api-version>"
azure_endpoint = "<your-azure-endpoint>"

# Create an Azure OpenAI client for LLM inference
azure_openai_client = AsyncAzureOpenAI(
    api_key=api_key,
    api_version=api_version,
    azure_endpoint=azure_endpoint,
)

# Initialize Graphiti with the Azure OpenAI client
graphiti = Graphiti(
    "bolt://localhost:7687",
    "neo4j",
    "password",
    llm_client=OpenAIClient(client=azure_openai_client),
    embedder=OpenAIEmbedder(
        config=OpenAIEmbedderConfig(
            embedding_model="text-embedding-3-small"  # Use your Azure deployment's embedding model name
        ),
        client=azure_openai_client,
    ),
    # Optional: configure the OpenAI cross-encoder with Azure OpenAI
    cross_encoder=OpenAIRerankerClient(client=azure_openai_client),
)

# You can now use Graphiti with Azure OpenAI
```

Make sure to replace the placeholder values with your actual Azure OpenAI credentials, and specify the correct embedding model name as deployed in your Azure OpenAI service.

Using Graphiti with Google Gemini

Graphiti supports Google's Gemini model for LLM inference and embedding. To use Gemini, you need to configure the LLM client and embedder with your Google API key.

Install Graphiti

```bash
poetry add "graphiti-core[google-genai]"
# or
uv add "graphiti-core[google-genai]"
```
```python
from graphiti_core import Graphiti
from graphiti_core.llm_client.gemini_client import GeminiClient, LLMConfig
from graphiti_core.embedder.gemini import GeminiEmbedder, GeminiEmbedderConfig

# Google API key configuration
api_key = "<your-google-api-key>"

# Initialize Graphiti with the Gemini client
graphiti = Graphiti(
    "bolt://localhost:7687",
    "neo4j",
    "password",
    llm_client=GeminiClient(
        config=LLMConfig(api_key=api_key, model="gemini-2.0-flash")
    ),
    embedder=GeminiEmbedder(
        config=GeminiEmbedderConfig(api_key=api_key, embedding_model="embedding-001")
    ),
)

# You can now use Graphiti with Google Gemini
```