【Practical Skills】A step-by-step guide to understanding Function Call in large models

Master Function Call in large models and unlock new potential for AI interaction.
Core content:
1. What Function Call does and how it improves the capabilities of large models
2. How Function Call works, step by step
3. A working code example: using OpenAI to implement a weather query function
Function Call in large models (such as OpenAI's GPT series, Claude, and Gemini) is a powerful feature introduced in recent years. It greatly expands the capabilities of large language models, enabling them not only to generate text but also to interact with external tools, APIs, databases, and other systems to perform more complex tasks. It is also a key building block for AI Agents.
This technology greatly enhances the practicality of large models, enabling them to perform tasks such as checking the weather, booking flights, and searching for information. Function Call makes up for the limitations of pure text models, allowing AI to interact with external systems and APIs to provide more accurate, real-time services. Its key advantages include structured output, expanded capabilities, reduced hallucinations, and better handling of complex tasks.
What is Function Call
Function Call is a mechanism that enables a large model to call predefined functions (or APIs) during a conversation and to continue the conversation based on the results those functions return. It works as follows:
1. Function Schema
Developers provide a set of function definitions to the model, including:
Function name (name)
Function description (description)
Parameter structure (parameters), usually defined in JSON Schema format to specify the type and meaning of each parameter.
2. The model decides to call the function
When a user enters a question (e.g., "What's the weather like in Shanghai today?"), the model determines whether a function needs to be called. If so, it automatically generates the parameters required for the call.
3. The external system executes the function
The call request is sent to the actual backend function or API, which is implemented by the developer, for example, calling a weather API to obtain weather information and return the result.
4. The model continues the conversation
After the large model receives the function's return value, it uses it as part of the context and continues generating a natural-language response. A minimal sketch of this message flow is shown below.
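To make these steps concrete, here is a minimal sketch of the messages exchanged, written in the OpenAI Chat Completions function-calling format that the code example below also uses. The city name, arguments, and weather values here are purely illustrative.

# Step 1: the function schema the developer sends along with the request
function_schema = {
    "name": "get_current_weather",
    "description": "Get the current weather of the specified city",
    "parameters": {
        "type": "object",
        "properties": {"location": {"type": "string", "description": "City name"}},
        "required": ["location"]
    }
}

# Step 2: the assistant message the model returns when it decides to call the function.
# Note that "arguments" is a JSON-encoded string, not a dict.
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "Shanghai"}'
    }
}

# Step 3: the developer runs the real function and wraps its result in a "function" message
function_message = {
    "role": "function",
    "name": "get_current_weather",
    "content": '{"temperature": 21.5, "description": "clear sky"}'
}

# Step 4: both messages are appended to the conversation and sent back to the model,
# which then generates the final natural-language answer for the user.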
Code Example
Let's implement Function Call with OpenAI, using a simple example: querying the weather for cities around the world. Suppose we have an external function get_current_weather, which calls a weather API and returns the result.
First, we define get_current_weather, which calls a weather API (such as the OpenWeatherMap API) to fetch real-time weather data. Then we use the OpenAI client to implement Function Call. Assume we already have an OpenAI API key.
import json
import openai
import requests

# Set OpenAI's API key
openai.api_key = "xxx"

# Define a function to get the current weather information for a specified location
def get_current_weather(location: str, unit: str = "metric"):
    """
    Get the current weather information for the specified location.

    Parameters:
        location (str): The name of a place, such as "Beijing" or "Shanghai".
        unit (str): Temperature unit; "metric" means Celsius, "imperial" means Fahrenheit. Defaults to "metric".

    Returns:
        dict: A dictionary containing the temperature and a weather description, or None on failure.
    """
    api_key = "xxx"  # Replace with your OpenWeatherMap API key
    # Construct the request URL
    url = f"http://api.openweathermap.org/data/2.5/weather?q={location}&units={unit}&appid={api_key}"
    try:
        # Send an HTTP GET request
        response = requests.get(url)
        # Check whether the response status code is 200 (success)
        if response.status_code == 200:
            data = response.json()
            # Extract the temperature and weather description from the JSON data
            return {
                "temperature": data["main"]["temp"],
                "description": data["weather"][0]["description"]
            }
        else:
            # If the response status code is not 200, print an error message
            print(f"Request to weather API failed, status code: {response.status_code}")
            return None
    except Exception as e:
        # Catch exceptions and print the error message
        print(f"Error calling weather API: {e}")
        return None
# Define metadata for function calls
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather of the specified city",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {
                    "type": "string",
                    "description": "City name"
                },
                "unit": {
                    "type": "string",
                    "description": "Temperature unit (metric or imperial)",
                    "default": "metric"
                }
            },
            "required": ["location"]
        }
    }
]
# User input
user_input = "What's the weather like in Shenyang today?"

# Call OpenAI's ChatCompletion interface
response = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {"role": "user", "content": user_input}
    ],
    functions=functions,
    function_call="auto"  # Let the model decide whether to call a function
)

# Check whether a function call was triggered
if response.choices[0].finish_reason == "function_call":
    function_call = response.choices[0].message.function_call
    function_name = function_call.name
    # The model returns the arguments as a JSON string, so parse them first
    function_args = json.loads(function_call.arguments)
    # If the triggered function is get_current_weather
    if function_name == "get_current_weather":
        # Call the get_current_weather function to get weather information
        weather_result = get_current_weather(
            location=function_args["location"],
            unit=function_args.get("unit", "metric")
        )
        # If weather information was retrieved successfully
        if weather_result:
            # Call the ChatCompletion interface again, passing the function result as context
            second_response = openai.ChatCompletion.create(
                model="gpt-4",
                messages=[
                    {"role": "user", "content": user_input},
                    # The assistant message that requested the function call (its content is empty)
                    {
                        "role": "assistant",
                        "content": None,
                        "function_call": {"name": function_name, "arguments": function_call.arguments}
                    },
                    # The function result, serialized as a JSON string
                    {"role": "function", "name": function_name, "content": json.dumps(weather_result)}
                ]
            )
            # Print the final answer
            print(second_response.choices[0].message.content)
        else:
            # If weather information could not be obtained, prompt the user
            print("Unable to obtain weather information, please check the network connection or try again later.")
else:
    # If no function call was triggered, print the model's answer directly
    print(response.choices[0].message.content)
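The example above uses the functions/function_call parameters of the pre-1.0 openai Python SDK. For reference, here is a minimal sketch of the same flow with the newer openai >= 1.0 SDK, in which functions/function_call are replaced by tools/tool_choice and the function result is sent back as a "tool" message. The sketch reuses the user_input, functions, and get_current_weather defined above.

import json
from openai import OpenAI

client = OpenAI(api_key="xxx")  # Replace with your OpenAI API key

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": user_input}],
    tools=[{"type": "function", "function": functions[0]}],  # Wrap the earlier schema in the tools format
    tool_choice="auto"
)

message = response.choices[0].message
if message.tool_calls:
    tool_call = message.tool_calls[0]
    # Arguments are still returned as a JSON string and must be parsed
    args = json.loads(tool_call.function.arguments)
    weather_result = get_current_weather(**args)
    second_response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "user", "content": user_input},
            message,  # The assistant message containing the tool call
            {"role": "tool", "tool_call_id": tool_call.id, "content": json.dumps(weather_result)}
        ]
    )
    print(second_response.choices[0].message.content)
else:
    print(message.content)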
Summary of Function Call
Function Call is a complete process in which the large model understands the user's intent, automatically decides which function to call, constructs the parameters, triggers the call, receives the result, and continues the conversation. Its main benefits are as follows:
1. Enhanced capabilities: Through Function Call, the model can call external tools or services to perform tasks it cannot complete on its own, such as real-time data retrieval, file processing, and database queries.
2. Real-time data access: Large models are usually trained on static data sets and lack real-time information. Function Call allows the model to access the latest data and provide more accurate, timely answers.
3. Improved accuracy: When precise calculations or domain-specific knowledge are required, large models can improve the accuracy of their answers by calling specialized functions.
4. Personalized services: Function Call enables the large model to call different services according to users' specific needs and provide a personalized experience.
5. Handling complex tasks: Some tasks are too complex to solve with the model's built-in knowledge alone. Function Call allows the model to break them into manageable subtasks and call the corresponding functions to solve them.
6. Interactive applications: When building interactive applications such as chatbots or virtual assistants, Function Call enables the model to perform more complex interactions.
7. Security and compliance: Through Function Call, sensitive data can be processed outside the model, helping ensure data security and compliance.
--THE END--