Description:
Nate Herk demonstrates how to use OpenAI’s Responses API natively within n8n (version 1.118 or later) to give AI agents built-in web search and file search capabilities — eliminating the need for external integrations like Perplexity for web queries or Supabase for vector storage. The tutorial is anchored by a concrete before-and-after comparison: an agent with external tools versus an agent using the Responses API’s native features, both returning identical answers about golf rules and NFL standings.
Setup requires enabling the Responses API toggle inside the OpenAI chat model node (version 1.3) in n8n, then configuring options including context window size (low, medium, or high), geographic search filtering, and domain restrictions. For file search, the video walks through creating a vector store on platform.openai.com under the Storage tab, uploading documents (demonstrated with a golf rules PDF), and passing the resulting vector store ID as an array in the node configuration. A notable practical finding: domain filtering requires GPT-5 Mini or later, as GPT-4.1 errors on that parameter.
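The node options described above map onto tool parameters in the underlying Responses API. As a rough sketch of the request payload n8n assembles behind the toggle — the field names follow my reading of the Responses API docs and may differ slightly, and the vector store ID and domain list are placeholders, not values from the video:

```python
# Hedged sketch: the tools array a Responses API call would carry when
# web search and file search are enabled. Placeholder IDs/domains only.
def build_tools(vector_store_id: str, domains: list[str]) -> list[dict]:
    return [
        {
            "type": "web_search",
            "search_context_size": "medium",          # low | medium | high
            "user_location": {                        # geographic search filtering
                "type": "approximate",
                "country": "US",
            },
            # Domain restriction; per the video this needs GPT-5 Mini or
            # later (GPT-4.1 errors on it).
            "filters": {"allowed_domains": domains},
        },
        {
            "type": "file_search",
            # Note the ID is passed as an array, matching the node config.
            "vector_store_ids": [vector_store_id],
        },
    ]

tools = build_tools("vs_abc123", ["usga.org"])
# A direct call would then look roughly like:
# client.responses.create(model="gpt-5-mini", input="...", tools=tools)
```

This only builds the payload; an actual call requires an OpenAI client and API key, which n8n manages through the node's credentials.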
Herk also notes that code interpreter and MCP server support exist within the Responses API but are not covered in this video. This is a technically detailed reference for n8n builders who want to reduce their agent’s external service dependencies while maintaining powerful search and retrieval capabilities using OpenAI’s managed infrastructure.
📺 Source: Nate Herk · Published December 03, 2025
🏷️ Format: Tutorial Demo
