Responses API Tools

OCI Generative AI supports tools with the Responses API, letting supported models use integrated tools during response generation. Add tool definitions in the tools property of a Responses API request so that the model can retrieve relevant content from your data, run Python code, call application-defined functions, or use tools exposed by a remote MCP server.

Tool support is available only through the API. Based on the request, the model decides whether to use one of the configured tools; you can also guide tool selection with the tool_choice property.
For example, the following request adds a tool in the tools property:
from openai import OpenAI

# Create an OpenAI-compatible client that points to the OCI Generative AI
# endpoint; replace the placeholders with your endpoint and credentials.
client = OpenAI(
    base_url="<oci_generative_ai_endpoint>",
    api_key="<api_key>",
)

response = client.responses.create(
    model="openai.gpt-oss-120b",
    tools=[
        {
            "type": "file_search",
            "vector_store_ids": ["<vector_store_id>"]
        }
    ],
    input="Summarize the main ideas in these documents."
)
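As a minimal sketch of the tool_choice property mentioned above: the request payload below forces the model to use the file search tool instead of letting it decide. The payload shape follows the OpenAI-compatible Responses API; the vector store ID is a placeholder, and these keyword arguments would be passed to client.responses.create(...).

```python
# Sketch: guiding tool selection with tool_choice (assumed OpenAI-compatible shape).
request_kwargs = {
    "model": "openai.gpt-oss-120b",
    "tools": [
        {
            "type": "file_search",
            "vector_store_ids": ["<vector_store_id>"],  # placeholder ID
        }
    ],
    # "auto" (the default) lets the model decide; naming a tool type forces its use.
    "tool_choice": {"type": "file_search"},
    "input": "Summarize the main ideas in these documents.",
}
```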
  • File Search

    Use File Search to let the model retrieve relevant content from a vector store during response generation. This is useful when you want the model to answer by using your documents instead of relying only on built-in knowledge.

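When file search runs, the response output typically includes the retrieved-content call plus a message whose text carries file citations. The sketch below parses citations from a mocked output list standing in for response.output; the item and annotation shapes ("file_search_call", "output_text", "file_citation") follow the OpenAI-compatible Responses API and should be verified against the response your model returns.

```python
# Mocked stand-in for response.output after a file_search call (assumed shape).
mock_output = [
    {"type": "file_search_call", "status": "completed"},
    {
        "type": "message",
        "content": [
            {
                "type": "output_text",
                "text": "The documents describe ...",
                "annotations": [
                    {"type": "file_citation", "file_id": "file-123", "filename": "report.pdf"}
                ],
            }
        ],
    },
]

# Collect the filenames the model cited in its answer.
cited_files = [
    ann["filename"]
    for item in mock_output if item["type"] == "message"
    for part in item["content"] if part["type"] == "output_text"
    for ann in part.get("annotations", [])
    if ann["type"] == "file_citation"
]
```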
  • Code Interpreter

    Use Code Interpreter to let the model write and run Python code in an OCI-managed sandbox. This is useful for calculations, data analysis, file processing, and other computation-heavy tasks.

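A minimal sketch of a Code Interpreter request payload, assuming the OpenAI-compatible tool shape in which the code_interpreter tool takes a "container" field and "auto" asks the service to provision the sandbox. Verify the exact container options against the documentation for your model.

```python
# Sketch: a code_interpreter tool definition (assumed OpenAI-compatible shape).
tools = [
    {
        "type": "code_interpreter",
        "container": {"type": "auto"},  # service-managed sandbox (assumption)
    }
]

# Pass these keyword arguments to client.responses.create(...).
request_kwargs = {
    "model": "openai.gpt-oss-120b",
    "tools": tools,
    "input": "Compute the standard deviation of 3, 7, 7, 19.",
}
```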
  • Function Calling

    Use Function Calling when your application defines and runs the tool logic. The model returns the function name and arguments, your application runs the function, and then your application sends the function output back so the workflow can continue.

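The round trip above can be sketched as follows, assuming the OpenAI-compatible function-tool schema and "function_call" output item. get_weather is a hypothetical application function, and the function_call item mocks what the model would return in response.output.

```python
import json

# Hypothetical application function that the model cannot run itself.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stand-in for a real lookup

# Tool schema advertised to the model (assumed OpenAI-compatible shape).
weather_tool = {
    "type": "function",
    "name": "get_weather",
    "description": "Get the current weather for a city.",
    "parameters": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# Mocked item the model would place in response.output when it wants the call.
function_call = {
    "type": "function_call",
    "call_id": "call_abc",
    "name": "get_weather",
    "arguments": '{"city": "Austin"}',
}

# The application runs the function, then sends the output back in the next
# request so the workflow can continue.
args = json.loads(function_call["arguments"])
result = get_weather(**args)
follow_up_input = [
    {
        "type": "function_call_output",
        "call_id": function_call["call_id"],
        "output": result,
    }
]
```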
  • MCP Calling

    Use MCP Calling to let OCI Generative AI call tools hosted on a remote MCP server directly during the request. This reduces client-side orchestration because the platform communicates with the MCP server without requiring an extra application round trip.
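A minimal sketch of an MCP tool definition, assuming the OpenAI-compatible remote-MCP tool shape; the server label and URL are placeholders, and the available approval settings should be checked against the documentation for your model.

```python
# Sketch: a remote MCP tool definition (assumed OpenAI-compatible shape).
mcp_tool = {
    "type": "mcp",
    "server_label": "docs_tools",             # hypothetical label for this server
    "server_url": "https://example.com/mcp",  # placeholder MCP server URL
    "require_approval": "never",              # skip per-call approval round trips
}

# Pass these keyword arguments to client.responses.create(...); the platform
# then calls the MCP server directly during the request.
request_kwargs = {
    "model": "openai.gpt-oss-120b",
    "tools": [mcp_tool],
    "input": "Use the available tools to answer.",
}
```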