# Function Tools in OCI Generative AI
Function Tools (also known as function-calling tools) in OCI Generative AI allow you to define custom functions in your client application as tools that the model can invoke. This provides a flexible way for models to interact with application-defined logic, such as APIs or computations.
## Key Features
- Client-side execution: Unlike platform-hosted tools (for example, MCP or Agent Tools), the API returns control to your application after the model selects a function. You execute it and send results back in a second API call.
- Two-step process:
  1. The model decides whether (and which) function to call.
  2. You execute the function, send back its output, and the model generates the final response.
- State management:
  - API-managed (recommended; the default when `store=True`): pass `previous_response_id` in follow-up calls to link to the prior response, and the API tracks conversation state automatically.
  - User-managed (alternative): manually accumulate the full conversation history (for example, `input_list += response.output`) and pass it in its entirety on each call; no `previous_response_id` is needed.
- OpenAI-compatible: follows the OpenAI function-calling specification and is part of the OCI Generative AI tool set.
Best practice: Define clear function signatures (name, description, JSON Schema parameters) to guide the model. Ensure functions handle errors securely and comply with Oracle guidelines.
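The schema advice above can also be enforced mechanically on the client side. A minimal sketch, assuming nothing beyond the standard library: the helper name `check_required_args` is hypothetical, not part of the OCI API, and simply rejects a tool call whose parsed arguments are missing a field listed in the schema's `required` array.

```python
import json

# Hypothetical helper (not part of the OCI API): check a parsed tool
# call against the "required" list of the tool's JSON Schema before
# dispatching, so malformed calls fail fast with a clear error.
def check_required_args(schema: dict, raw_arguments: str) -> dict:
    args = json.loads(raw_arguments)
    missing = [k for k in schema.get("required", []) if k not in args]
    if missing:
        raise ValueError(f"missing required argument(s): {missing}")
    return args

weather_schema = {
    "type": "object",
    "properties": {"city": {"type": "string", "description": "City name"}},
    "required": ["city"],
}

print(check_required_args(weather_schema, '{"city": "Paris"}'))
# → {'city': 'Paris'}
```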
## Declaring Function Tools
Define tools as an array of objects with `type: "function"`, each including a JSON Schema for its parameters per the OpenAI spec:
```python
tools = [
    {
        "type": "function",
        "name": "get_weather",
        "description": "Get current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
]
```
Pass `tools` in `client.responses.create()`.
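When the model selects a function, the corresponding output item carries a call id, the function name, and the arguments as a JSON string (not a dict). A minimal sketch of decoding one; the `item` dict below is hand-written to imitate that shape, and its `call_id` value is invented for illustration.

```python
import json

def get_weather(city):
    return f"The weather in {city} is sunny with 24°C."

# Hand-written dict imitating the shape of a function_call output item;
# the call_id value is invented for illustration.
item = {
    "type": "function_call",
    "call_id": "call_abc123",
    "name": "get_weather",
    "arguments": '{"city": "Paris"}',  # arrives as a JSON string, not a dict
}

args = json.loads(item["arguments"])  # {'city': 'Paris'}
print(get_weather(**args))
# → The weather in Paris is sunny with 24°C.
```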
## API-Managed State
Let the API track state using `previous_response_id` (the default behavior when `store=True`).
```python
import json

# Define the actual function
def get_weather(city):
    # Normally call a weather API here
    return f"The weather in {city} is sunny with 24°C."

# First call: the model selects a tool
response = client.responses.create(
    model="openai.gpt-oss-120b",
    tools=tools,
    input="What's the weather in Paris today?",
)

# Handle the function call client-side
tool_outputs = []
for item in response.output:
    if item.type == "function_call" and item.name == "get_weather":
        args = json.loads(item.arguments)
        weather = get_weather(**args)
        tool_outputs.append({
            "type": "function_call_output",
            "call_id": item.call_id,
            "output": json.dumps({"weather": weather}),
        })

# Second call: provide outputs and reference the prior response
final = client.responses.create(
    model="openai.gpt-oss-120b",
    instructions="Answer concisely using the weather information.",
    tools=tools,
    input=tool_outputs,  # just the tool outputs
    previous_response_id=response.id,
)
print(final.output_text)
```
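With more than one tool, an `if item.name == ...` check per function scales poorly. A hedged sketch of a dispatch-table variant: `FUNCTIONS`, `handle_function_call`, and the second tool `get_time` are hypothetical names invented here, not part of the API.

```python
import json

def get_weather(city):
    return f"The weather in {city} is sunny with 24°C."

def get_time(city):
    # Hypothetical second tool, invented for illustration.
    return f"It is 12:00 in {city}."

# Hypothetical dispatch table: one lookup replaces a growing
# chain of if/elif branches as tools are added.
FUNCTIONS = {"get_weather": get_weather, "get_time": get_time}

def handle_function_call(name, call_id, arguments):
    fn = FUNCTIONS.get(name)
    if fn is None:
        output = json.dumps({"error": f"unknown function: {name}"})
    else:
        output = json.dumps({"result": fn(**json.loads(arguments))})
    return {"type": "function_call_output", "call_id": call_id, "output": output}

out = handle_function_call("get_weather", "call_1", '{"city": "Paris"}')
print(json.loads(out["output"]))
# → {'result': 'The weather in Paris is sunny with 24°C.'}
```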
## User-Managed State
Accumulate the full conversation history client-side (no `previous_response_id` needed).
```python
import json

# Reuses get_weather, tools, and client from the previous example.

# Start the conversation
input_list = [{"role": "user", "content": "What's the weather in Paris today?"}]

# First call
response = client.responses.create(
    model="openai.gpt-oss-120b",
    tools=tools,
    input=input_list,
)

# Accumulate state and handle the tool call
input_list += response.output
for item in response.output:
    if item.type == "function_call" and item.name == "get_weather":
        args = json.loads(item.arguments)
        weather = get_weather(**args)
        input_list.append({
            "type": "function_call_output",
            "call_id": item.call_id,
            "output": json.dumps({"weather": weather}),
        })

# Second call with the full state
final = client.responses.create(
    model="openai.gpt-oss-120b",
    instructions="Answer concisely using the weather information.",
    tools=tools,
    input=input_list,
)
print(final.output_text)
```
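Both examples assume exactly one round of tool use. A model may request tools across several turns, so the call-and-accumulate steps generalize to a loop that runs until the response contains no function calls. The sketch below makes that control flow visible with a scripted stand-in for `client.responses.create()`; the stub `fake_create` and its canned response dicts are invented for illustration and only mimic the item shapes used above.

```python
import json

def get_weather(city):
    return f"The weather in {city} is sunny with 24°C."

# Scripted stand-in for client.responses.create(): first a function
# call, then a plain message. Both dicts are invented for illustration
# and only mimic the item shapes used in the examples above.
scripted = [
    {"output": [{"type": "function_call", "call_id": "c1",
                 "name": "get_weather", "arguments": '{"city": "Paris"}'}]},
    {"output": [{"type": "message",
                 "content": "It is sunny with 24°C in Paris."}]},
]

def fake_create(input):
    return scripted.pop(0)

input_list = [{"role": "user", "content": "What's the weather in Paris today?"}]
while True:
    response = fake_create(input=input_list)
    input_list += response["output"]               # accumulate state
    calls = [i for i in response["output"] if i["type"] == "function_call"]
    if not calls:                                  # no tools requested: done
        break
    for item in calls:
        result = get_weather(**json.loads(item["arguments"]))
        input_list.append({"type": "function_call_output",
                           "call_id": item["call_id"],
                           "output": json.dumps({"weather": result})})

print(input_list[-1]["content"])
# → It is sunny with 24°C in Paris.
```

In real code, `fake_create` would be the actual `client.responses.create()` call with `tools` attached; the loop structure is unchanged.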