# File Search
Use File Search to let the model retrieve relevant content from files stored in a vector store during response generation. This is useful when you want responses to reflect the documents you provide rather than relying only on the model’s built-in knowledge.
By creating vector stores and adding files to them, you enable semantic and keyword-based search across your data, which helps the model produce more precise, context-aware answers grounded in your custom content.
Because File Search is handled by the service, your application doesn't need to implement its own retrieval pipeline.
## Prepare a Vector Store
Before using File Search, create a vector store and add the files that you want the model to reference. OCI Generative AI supports the following APIs for file and vector store management:
| API Set | Description |
|---|---|
| Files | Upload and manage files. |
| Vector Store Files | Manage files attached to a vector store. |
| Vector Store File Batches | Add and manage multiple files in a vector store batch. |
| Container Files | Manage files in a container. |
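The Vector Store File Batches API accepts multiple file IDs per request. As a purely local, illustrative sketch (the `chunk_file_ids` helper and the batch size of 100 are our assumptions, not part of the API), you might group previously uploaded file IDs before sending each group as one batch request:

```python
def chunk_file_ids(file_ids, batch_size=100):
    """Split a list of uploaded file IDs into batch-sized groups.

    The batch_size of 100 is an assumed illustrative limit,
    not a documented service maximum.
    """
    return [file_ids[i:i + batch_size] for i in range(0, len(file_ids), batch_size)]


# Each inner list would be submitted as one Vector Store File Batch request.
batches = chunk_file_ids([f"file-{n}" for n in range(5)], batch_size=2)
```

Grouping IDs client-side keeps each batch request small and makes partial failures easier to retry.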
## Example
To use File Search in a request, add a tool definition to the `tools` property with `type: "file_search"`, and provide the vector store ID in `vector_store_ids`.
```python
# "client" is an initialized OpenAI-compatible SDK client configured
# with your service endpoint and credentials.
response = client.responses.create(
    model="openai.gpt-oss-120b",
    input="Summarize the main ideas covered in the documents in this vector store.",
    tools=[
        {
            "type": "file_search",
            "vector_store_ids": ["<vector_store_id>"]
        }
    ]
)
print(response)
```
In this example:
- The model can use the vector store content during response generation.
- File retrieval is managed by the platform.
- Hybrid search parameters aren't supported with the File Search tool.
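If you attach the File Search tool from several places in your application, it can help to build the tool definition once. A minimal sketch (the `file_search_tool` helper name is ours; the dict shape matches the example above):

```python
def file_search_tool(vector_store_ids):
    """Build the file_search entry for the tools array.

    Returns the same dict shape used in the request example above;
    the helper itself is illustrative, not part of any SDK.
    """
    if not vector_store_ids:
        raise ValueError("file_search requires at least one vector store ID")
    return {"type": "file_search", "vector_store_ids": list(vector_store_ids)}


tool = file_search_tool(["<vector_store_id>"])
```

The resulting dict can be passed directly in the `tools` list of `client.responses.create(...)`, and the guard makes a missing vector store ID fail fast instead of surfacing as an API error.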