Graphite framework revealed: How to use it to build scalable AI workflows

Explore how the Graphite framework helps build highly customizable AI solutions.
Core content:
1. The background and necessity of the Graphite framework
2. The simplicity, power and composability of the Graphite core architecture
3. The four core features of Graphite and how they improve the reliability of AI applications
In today's digital age, artificial intelligence (AI) has woven itself into nearly every aspect of our lives. From everyday voice assistants to complex enterprise applications, AI is changing how we work and live. Yet as AI applications keep expanding, enterprises and developers face a common challenge: how do you build AI solutions that meet specific business needs? Today we introduce the Graphite framework, which was created to solve exactly this problem.
1. Why do we need Graphite?
The AI world already has many powerful tools and platforms, such as ChatGPT and Claude. However, these general-purpose AI solutions often fall short on mission-critical tasks. In fields such as finance and medicine, where accuracy requirements are extreme, even small errors can lead to enormous losses. Such settings call for a more flexible, controllable AI framework that can be customized to specific business needs, and that is the context in which Graphite was born.
Graphite is an open-source framework designed specifically for building domain-specific AI assistants. It provides a highly extensible platform that can be customized to unique enterprise needs, letting developers build workflows tailored to their specific business domain. To use a vivid metaphor, Graphite is like "Lego blocks" for AI: developers can freely combine components, brick by brick, to create the AI applications they want.
2. Graphite's core architecture: simple, powerful and composable
Graphite's architecture is cleverly designed around three conceptual layers: assistants, nodes, and tools.
Assistants: the "commanders" of the workflow, responsible for coordinating execution and managing the state of the entire conversation. Think of an assistant as a project manager who sets the project's pace and direction.
Nodes: the "workers" of the workflow. Each node has its own responsibility, such as calling a language model or executing a function; they perform their respective duties and together complete complex tasks.
Tools: the "toolbox" a node draws on to perform a concrete task, such as calling an API or running a Python function. Tools give nodes the ability to carry out actual operations.
Graphite also adopts an event-sourcing model, meaning that every state change is recorded as an event. This is like fitting the entire system with a "black box": whenever a problem occurs, its cause can be traced back through the records.
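The idea behind event sourcing can be sketched in a few lines of plain Python. This is a hypothetical illustration of the pattern, not grafi's actual API: the `EventLog` class and its methods are invented for this example.

```python
from dataclasses import dataclass, field

# Hypothetical illustration of event sourcing: state is never mutated
# directly; every change is appended to an event log, and the current
# state is derived by replaying that log from the beginning.
@dataclass
class EventLog:
    events: list = field(default_factory=list)

    def record(self, event_type: str, payload: dict) -> None:
        # Append-only: the events are the single source of truth.
        self.events.append({"type": event_type, "payload": payload})

    def replay(self) -> dict:
        # Rebuild the conversation state by folding over the events in order.
        state = {}
        for event in self.events:
            state.update(event["payload"])
        return state

log = EventLog()
log.record("user_input", {"name": "John Doe"})
log.record("info_extracted", {"email": "john.doe@example.com"})
print(log.replay())  # {'name': 'John Doe', 'email': 'john.doe@example.com'}
```

Because state is only ever derived from the log, the log doubles as a complete history: deleting nothing means debugging and auditing come for free.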
The benefits of this architecture are obvious. First, it makes the entire system highly modular: developers can add and remove nodes like building blocks, and can even modify one node without affecting the rest. Second, it improves the system's flexibility and scalability; however business needs change, the workflow can be adjusted with ease.
3. Four core features of Graphite: making AI applications more reliable
1. Observability
In a complex AI system, finding the root cause of a problem can feel like searching for a needle in a haystack. Graphite's event-driven architecture, logging, and tracing let developers monitor the system's operating state in real time and quickly locate bottlenecks or errors, making every link in the chain transparent and measurable.
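The core of this kind of tracing can be illustrated with a simple decorator. This is a minimal sketch of the concept, not grafi's or OpenTelemetry's real API; the `traced` decorator and `SPANS` list are invented for illustration.

```python
import time
from functools import wraps

# Hypothetical sketch of tracing: each step is wrapped in a "span" that
# records its name, duration, and outcome, so a failure can be pinpointed
# to the exact step that caused it.
SPANS = []

def traced(name):
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            try:
                result = fn(*args, **kwargs)
                SPANS.append({"name": name, "ok": True,
                              "duration": time.perf_counter() - start})
                return result
            except Exception as exc:
                # Failed spans carry the error, which is what makes
                # root-cause analysis fast.
                SPANS.append({"name": name, "ok": False, "error": str(exc),
                              "duration": time.perf_counter() - start})
                raise
        return wrapper
    return decorator

@traced("extract_user_info")
def extract_user_info(text):
    return {"raw": text}

extract_user_info("My name is John Doe")
print(SPANS[0]["name"], SPANS[0]["ok"])  # extract_user_info True
```

In production, the same idea is handled by a standard such as OpenTelemetry, which Graphite integrates with (see section 5).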
2. Idempotency
In an asynchronous workflow, operations may be retried due to network fluctuations or partial failures. Graphite's design emphasizes idempotent operations, ensuring that data is neither duplicated nor corrupted even when an operation is invoked repeatedly. This acts as a "protective shield" against the confusion that retries can otherwise cause.
3. Auditability
Graphite treats events as the single source of truth, automatically recording every state change and decision path. For industries that must comply with strict regulations, this is a lifesaver: whether for compliance checks or for debugging and tracing problems, these detailed records provide strong support.
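When every decision is an event in the log, a compliance report becomes a simple query over that log. A hypothetical illustration (the event shape and `decisions` helper are invented for this example, not grafi's schema):

```python
import json

# Hypothetical audit trail: because every decision was recorded as an
# event, answering "what did the system decide, and when?" is just a
# filter over the log.
audit_log = [
    {"ts": "2025-01-01T10:00:00Z", "type": "decision",
     "detail": "action LLM routed to register_client"},
    {"ts": "2025-01-01T10:00:01Z", "type": "state_change",
     "detail": "user marked as registered"},
]

def decisions(log):
    # Return only decision events, preserving chronological order.
    return [e for e in log if e["type"] == "decision"]

print(json.dumps(decisions(audit_log), indent=2))
```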
4. Restorability
In a long-running AI task, a mid-flight failure can waste substantial time and resources if everything must restart from scratch. Graphite's checkpoints and event-based replay let a workflow resume precisely from the point of failure, minimizing downtime and wasted work.
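Checkpoint-based recovery can be sketched as follows. This is a simplified illustration of the idea, assuming a workflow of named steps and an in-memory checkpoint dict; it is not grafi's actual recovery mechanism.

```python
# Hypothetical sketch of checkpoint-based recovery: the workflow records
# which steps have completed, so after a crash it resumes from the last
# checkpoint instead of re-running everything.
STEPS = ["extract_info", "validate", "register", "respond"]

def run_workflow(checkpoint: dict) -> dict:
    done = checkpoint.get("completed", [])
    for step in STEPS:
        if step in done:
            continue  # finished before the failure; skip on replay
        # ... perform the step's real work here ...
        done.append(step)
        # Persist the checkpoint after each step (e.g. to durable storage).
        checkpoint["completed"] = done
    return checkpoint

# Simulate a crash after the first two steps, then resume.
saved = {"completed": ["extract_info", "validate"]}
resumed = run_workflow(saved)
print(resumed["completed"])  # ['extract_info', 'validate', 'register', 'respond']
```

Combined with the event log, replay can also reconstruct the exact state the workflow held at the moment of failure, not just which steps ran.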
4. Hands-on practice: Building an AI assistant to “know your customer” with Graphite
Having said all that, you may ask: "It sounds good, but is it hard to use in practice?" Don't worry; let's walk through a simple example to experience the power of Graphite.
Let's say we want to build a Know Your Customer (KYC) AI assistant for a gym. The main task of this assistant is to collect the customer's full name and email address to complete the gym's registration process. If the customer provides incomplete information, the assistant will pause the process and ask the customer to provide more information.
1. Building a workflow
First, we need to install the Graphite framework. To complete the installation, run the following command in the terminal:
pip install grafi
Next, we need to define the components in the workflow. Based on the previous description, we need to create the following components:
7 topics: including the user input topic, user information extraction topic, manual intervention (human-in-the-loop) topic, and others.
5 nodes: the user information extraction node, action node, manual intervention node, register-user node, and user-reply node.
The following is part of the code implementation:
from grafi.common.topics.topic import Topic
from grafi.common.topics.human_request_topic import human_request_topic
from grafi.common.topics.output_topic import agent_output_topic
from grafi.nodes.llm_node import LLMNode
from grafi.nodes.llm_function_call_node import LLMFunctionCallNode
from grafi.commands.llm_response_command import LLMResponseCommand
from grafi.commands.function_calling_command import FunctionCallingCommand
from grafi.tools.openai_tool import OpenAITool
from grafi.common.models.message import Message
from grafi.common.decorators.llm_function import llm_function
from grafi.common.models.execution_context import ExecutionContext
import json
import uuid

# Note: agent_input_topic (grafi's built-in user-input topic) and the
# ClientInfo / RegisterClient function tools are defined elsewhere in the
# full example; their definitions are omitted here.

# Define the topics
user_info_extract_topic = Topic(name="user_info_extract_topic")
hitl_call_topic = Topic(
    name="hitl_call_topic",
    # Route here when the LLM calls any tool other than register_client
    condition=lambda msgs: msgs[-1].tool_calls[0].function.name != "register_client",
)
register_user_topic = Topic(
    name="register_user_topic",
    # Route here when the LLM calls the register_client tool
    condition=lambda msgs: msgs[-1].tool_calls[0].function.name == "register_client",
)
register_user_respond_topic = Topic(name="register_user_respond")

# Define the user information extraction node
user_info_extract_node = (
    LLMNode.Builder()
    .name("UserInfoExtractNode")
    .subscribe(agent_input_topic)
    .command(
        LLMResponseCommand.Builder()
        .llm(
            OpenAITool.Builder()
            .name("UserInfoExtractLLM")
            .api_key("YOUR_OPENAI_API_KEY")
            .model("gpt-3.5-turbo")
            .system_message("Extract user's full name and email from the input.")
            .build()
        )
        .build()
    )
    .publish_to(user_info_extract_topic)
    .build()
)

# Define the action node
action_node = (
    LLMNode.Builder()
    .name("ActionNode")
    .subscribe(user_info_extract_topic)
    .command(
        LLMResponseCommand.Builder()
        .llm(
            OpenAITool.Builder()
            .name("ActionLLM")
            .api_key("YOUR_OPENAI_API_KEY")
            .model("gpt-3.5-turbo")
            .system_message("Decide the next action based on the extracted information.")
            .build()
        )
        .build()
    )
    .publish_to(hitl_call_topic)
    .publish_to(register_user_topic)
    .build()
)

# Define the manual intervention (human-in-the-loop) node
human_request_function_call_node = (
    LLMFunctionCallNode.Builder()
    .name("HumanRequestNode")
    .subscribe(hitl_call_topic)
    .command(
        FunctionCallingCommand.Builder()
        .function_tool(ClientInfo())
        .build()
    )
    .publish_to(human_request_topic)
    .build()
)

# Define the register-user node
register_user_node = (
    LLMFunctionCallNode.Builder()
    .name("RegisterUserNode")
    .subscribe(register_user_topic)
    .command(
        FunctionCallingCommand.Builder()
        .function_tool(RegisterClient())
        .build()
    )
    .publish_to(register_user_respond_topic)
    .build()
)

# Define the user-reply node
user_reply_node = (
    LLMNode.Builder()
    .name("UserReplyNode")
    .subscribe(register_user_respond_topic)
    .command(
        LLMResponseCommand.Builder()
        .llm(
            OpenAITool.Builder()
            .name("UserReplyLLM")
            .api_key("YOUR_OPENAI_API_KEY")
            .model("gpt-3.5-turbo")
            .system_message("Generate a response to the user based on the registration result.")
            .build()
        )
        .build()
    )
    .publish_to(agent_output_topic)
    .build()
)
2. Testing the assistant
Now that we have built the workflow, we can test our "Know Your Customer" AI assistant. Here is the test code:
def test_kyc_assistant():
    execution_context = ExecutionContext(
        conversation_id="conversation_id",
        execution_id=uuid.uuid4().hex,
        assistant_request_id=uuid.uuid4().hex,
    )

    # Initialize the assistant (KycAssistant and
    # user_info_extract_system_message are defined elsewhere in the full example)
    assistant = (
        KycAssistant.Builder()
        .name("KycAssistant")
        .api_key("YOUR_OPENAI_API_KEY")
        .user_info_extract_system_message(user_info_extract_system_message)
        .action_llm_system_message(
            "Select the most appropriate tool based on the request."
        )
        .summary_llm_system_message(
            "Response to user with result of registering. You must include 'registered' in the response if succeed."
        )
        .hitl_request(ClientInfo())
        .register_request(RegisterClient())
        .build()
    )

    while True:
        # Get user input
        user_input = input("User: ")
        input_data = [Message(role="user", content=user_input)]

        # Execute the assistant
        output = assistant.execute(execution_context, input_data)

        # Process the output
        responses = []
        for message in output:
            try:
                content_json = json.loads(message.content)
                responses.append(content_json["question_description"])
            except json.JSONDecodeError:
                responses.append(message.content)

        respond_to_user = " and ".join(responses)
        print("Assistant:", respond_to_user)

        # End the loop once registration succeeds
        if "registered" in output[0].content:
            break

if __name__ == "__main__":
    test_kyc_assistant()
After running this code, you can interact with our AI assistant through the terminal. For example:
User: Hi, I'd like to sign up for your gym. Could you help me with the process?
Assistant: Please provide your full name and email address to sign up for the gym.
User: My name is John Doe, and my email is john.doe@example.com
Assistant: Congratulations, John! You are now registered at our gym. If you have any questions or need assistance, feel free to ask!
5. Testing, Observing, Debugging and Improving: Refining the Assistant
In real use, we will inevitably run into problems. Graphite provides powerful tracing and observation capabilities by integrating OpenTelemetry and Arize's OpenInference, so we can easily capture the assistant's behavior data and quickly locate issues.
For example, suppose testing reveals that the assistant throws an error when the user enters something unrelated to registration. Using the tracing tools, we can quickly locate the root cause: the action LLM is not selecting a tool correctly. We can then update the action LLM's system prompt so that it calls the request_client_information tool, politely asking the user whether they need help signing up.
This process of rapid iteration and improvement is the charm of Graphite. It not only helps us build AI applications quickly, but also allows us to continuously optimize and make the assistant more and more intelligent.
Conclusion: Graphite is built for real-world AI applications
Graphite is more than just a framework; it is a new way of thinking about building AI applications. Through its simple yet powerful three-layer execution model, event-driven orchestration, and support for observability, idempotency, auditability, and restorability, Graphite gives developers a flexible, scalable, and reliable platform.
Whether you are building a conversational assistant or automating workflows, Graphite can meet your needs, letting you assemble AI applications for specific business requirements as easily as snapping together building blocks. If you are passionate about AI development, give Graphite a try; it may be exactly the tool you have been looking for.