{"workflow":{"id":12521,"name":"Build a cost-efficient Lookio RAG chatbot with GPT-4.1 models for knowledge Q&A","views":54,"recentViews":0,"totalViews":54,"createdAt":"2026-01-06T13:28:50.248Z","description":"This template provides a high-performance, cost-optimized alternative to standard AI Agents for building RAG (Retrieval-Augmented Generation) chatbots. \n\nInstead of relying on a single expensive model to decide every action, this workflow uses a modular \"Routing & Specialized Steps\" architecture. \n\nIt delivers results up to 50% faster and 3x more cost-efficiently by only involving heavy-duty models when deep internal knowledge is actually required.\n\nBy leveraging **Lookio** as the core RAG platform, you can connect your own documentation (PDFs, Docs, Webpages) to a chat interface without the complexity of managing vector databases or custom chunking strategies manually.\n\n*Learn more about breaking down agents for efficiency in this [YouTube deep dive](https://www.youtube.com/watch?v=BHdJFnx2wrc).* \n\n## 👥 Who is this for?\n\n*   **Customer Support Teams:** Build an automated response system that answers queries based on official product guides or internal FAQs.\n*   **Efficiency-Focused Developers:** Scale AI operations without ballooning API costs by offloading simple queries to smaller models.\n*   **Marketing & Content Teams:** Provide instant access to brand guidelines or past content repositories for internal research.\n\n## 💡 What problem does this solve?\n\n*   **Eliminates Token Waste:** Traditional agents send long system prompts to expensive models even for basic greetings like \"Hello.\" This workflow routes those to a \"nano\" model, saving significant costs.\n*   **Increases Reliability:** By breaking down the \"Agent\" logic into discrete steps (Categorize -&gt; Query Prep -&gt; Retrieval -&gt; Response), you gain more control over the output guidelines at every stage.\n*   **Scalable Knowledge Retrieval:** Uses **Lookio** to handle 
the heavy lifting of RAG, ensuring sourced and factual answers based on your private data rather than general AI training data.\n\n## ⚙️ How it works\n\n1.  **Memory & Intent Routing:** The workflow fetches past messages and uses a specialized **Text Classifier** (powered by a small model) to determine if the user is asking a knowledge-based question or just chatting.\n2.  **Path A (Simple Response):** If it's a greeting, a small model handles the reply instantly.\n3.  **Path B (Knowledge Retrieval):** If information is needed, a specialized LLM step crafts a clean search query specifically for Lookio.\n4.  **RAG Execution:** The **Lookio API** retrieves the exact insights needed from your connected knowledge documents.\n5.  **Final Generation:** A large model synthesizes the specific Lookio results and the conversation history into a final, fact-based response.\n\n## What is Lookio, the RAG platform?\n\n[Lookio](https://www.lookio.app/) is a business-focused AI platform designed for automated knowledge retrieval. \n\nUnlike casual AI tools, Lookio is \"API-first,\" meaning it’s built specifically to integrate with tools like n8n. \n\nIt handles the entire RAG pipeline—from document ingestion to vector storage and retrieval—allowing you to focus on building the logic of your automation rather than the infrastructure of your AI. \n\nLookio offers various query modes (Eco, Flash, Deep) so you can prioritize speed or depth depending on your budget.\n\n\n## 🛠️ Setup\n\n1.  **Set up Lookio:** Create an account at [Lookio.app](https://www.lookio.app/), upload your documents, and create an assistant.\n2.  **API Key:** In the **RAG via Lookio** node, replace `<YOUR-API-KEY>` in the header and `<YOUR-ASSISTANT-ID>` in the body with your API key and assistant ID.\n3.  **AI Credentials:** Add your OpenAI (or preferred provider) credentials to the **Very small model**, **Mini model**, and **Large model** nodes.\n4.  **Activate:** Turn the workflow on. 
You can now chat with your knowledge base via the n8n chat interface.\n\n## 🚀 Taking it further\n\n*   **Add More Branches:** Expand the **Intent router** to include paths for specific actions, like extracting emails for lead generation or checking order statuses via a database lookup.\n*   **Formatting Tweaks:** Adjust the system prompts in the **Write the final response** node to match your brand's specific tone (e.g., \"Explain it like I'm five\" or \"Legal professional tone\").\n*   **Deployment:** Connect this backend to your website or a Slack channel for real-time team usage.\n\n\n","workflow":{"nodes":[{"id":"60c2f847-f213-4ca2-8b93-85435a34613c","name":"When chat message received","type":"@n8n/n8n-nodes-langchain.chatTrigger","position":[-176,496],"webhookId":"eef2977c-81d7-4102-8edf-d771d9da2118","parameters":{"options":{"responseMode":"responseNodes"}},"typeVersion":1.3},{"id":"b99d2ceb-4a45-48b9-85ff-50d8b9e4c6fa","name":"Very small model","type":"@n8n/n8n-nodes-langchain.lmChatOpenAi","position":[480,944],"parameters":{"model":{"__rl":true,"mode":"list","value":"gpt-4.1-nano","cachedResultName":"gpt-4.1-nano"},"options":{}},"credentials":{"openAiApi":{"id":"dMiSy27YCK6c6rra","name":"Duv's OpenAI"}},"typeVersion":1.2},{"id":"798c5cdd-38b7-4883-979b-caf044011598","name":"Simple response","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[960,304],"parameters":{"text":"={{ $('When chat message received').item.json.chatInput }}","batching":{},"messages":{"messageValues":[{"message":"=You are a helpful assistant that assists the user.\n\nNB: Here are the previous messages from the conversation:\n\n{{ $('Find past messages').item.json.messages.toJsonString() }}"}]},"promptType":"define"},"typeVersion":1.7},{"id":"bd15aa9e-f288-4033-ba56-ae740f41de1f","name":"Prepare retrieval query","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[784,704],"parameters":{"text":"={{ $('When chat message received').item.json.chatInput 
}}","batching":{},"messages":{"messageValues":[{"message":"=Based on the user message, you will formulate the short and concise query to send through a knowledge retrieval tool that will enable you to retrieve all the information needed to actually answer that user message later on.\nDirectly output the query as a question.\n\nNB: Here are the previous messages from the conversation:\n\n{{ $('Find past messages').item.json.messages.toJsonString() }}"}]},"promptType":"define"},"typeVersion":1.7},{"id":"c07a6c01-9c13-47aa-896d-60e3f0f9992d","name":"Write the final response","type":"@n8n/n8n-nodes-langchain.chainLlm","position":[1520,704],"parameters":{"text":"=Initial query:\n\n\"{{ $('When chat message received').item.json.chatInput }}\"\n\nKnowledge retrieval output:\n\"{{ $json.Output }}\"","batching":{},"messages":{"messageValues":[{"message":"=The user message contains a query and insights from a knowledge retrieval step to prepare you to actually write the final answer to that user.\n\nNB: Here are the previous messages from the conversation:\n\n{{ $('Find past messages').item.json.messages.toJsonString() }}"}]},"promptType":"define"},"typeVersion":1.7},{"id":"aa532ba1-b606-45c0-8bb2-baeb63daf714","name":"Mini model","type":"@n8n/n8n-nodes-langchain.lmChatOpenAi","position":[848,896],"parameters":{"model":{"__rl":true,"mode":"list","value":"gpt-4.1-mini"},"options":{}},"credentials":{"openAiApi":{"id":"dMiSy27YCK6c6rra","name":"Duv's OpenAI"}},"typeVersion":1.2},{"id":"834eb57c-ded3-46f9-8912-2a323f1f091f","name":"Large model","type":"@n8n/n8n-nodes-langchain.lmChatOpenAi","position":[1584,896],"parameters":{"model":{"__rl":true,"mode":"list","value":"gpt-4.1","cachedResultName":"gpt-4.1"},"options":{}},"credentials":{"openAiApi":{"id":"dMiSy27YCK6c6rra","name":"Duv's OpenAI"}},"typeVersion":1.2},{"id":"4e006500-9e61-45e3-bb20-28cf342667a8","name":"RAG via 
Lookio","type":"n8n-nodes-base.httpRequest","position":[1216,704],"parameters":{"url":"=https://api.lookio.app/webhook/query","method":"POST","options":{},"sendBody":true,"sendHeaders":true,"bodyParameters":{"parameters":[{"name":"query","value":"={{ $json.text }}"},{"name":"assistant_id","value":"<YOUR-ASSISTANT-ID>"},{"name":"query_mode","value":"flash"}]},"headerParameters":{"parameters":[{"name":"api_key","value":"<YOUR-API-KEY>"}]}},"typeVersion":4.2},{"id":"d90f9ab0-5dd5-46ef-a87a-83f3e077fab9","name":"Intent router","type":"@n8n/n8n-nodes-langchain.textClassifier","position":[400,496],"parameters":{"options":{"systemPromptTemplate":"=Please classify the text provided by the user into one of the following categories: {categories}, and use the provided formatting instructions below. Don't explain, and only output the json.\n\n\nNB: Here are the previous messages from the conversation that came before this new text to categorize - focus on the new text to categorize though, this is just some context:\n\n{{ \n  $json.messages\n    .map(m => `human: ${m.human}\\nai: ${m.ai}`)\n    .join('\\n')\n}}"},"inputText":"={{ $('When chat message received').item.json.chatInput }}","categories":{"categories":[{"category":"=No knowledge retrieval needed to answer this query","description":"=Use this category when the message can be answered directly without consulting any external knowledge or documentation. This includes greetings, confirmations, small talk (e.g., \"hi\", \"hello\", \"thanks\", \"ok\")."},{"category":"=Knowledge retrieval is needed to answer this query","description":"=Use this category when the message requires knowledge to be answered. 
Good use cases are when the user asks questions."}]}},"typeVersion":1.1},{"id":"f83304af-f74a-4aed-8a64-d7b657a9abaa","name":"Respond to Chat","type":"@n8n/n8n-nodes-langchain.chat","position":[1904,496],"parameters":{"message":"={{ $json.text }}","options":{"memoryConnection":false},"waitUserReply":false},"typeVersion":1},{"id":"e89d675e-eac6-4e95-9238-d7b3501e1b2d","name":"Find past messages","type":"@n8n/n8n-nodes-langchain.memoryManager","position":[48,496],"parameters":{"options":{}},"typeVersion":1.1},{"id":"735ad901-9c58-4851-bf1b-d93aaf46d557","name":"Simple Memory","type":"@n8n/n8n-nodes-langchain.memoryBufferWindow","position":[208,720],"parameters":{},"typeVersion":1.3},{"id":"e1f44ef4-86c7-428b-8e6c-cc35d3181123","name":"Store messages","type":"@n8n/n8n-nodes-langchain.memoryManager","position":[2128,496],"parameters":{"mode":"insert","messages":{"messageValues":[{"type":"user","message":"={{ $('When chat message received').item.json.chatInput }}"},{"type":"ai","message":"={{ $json.text }}"}]}},"typeVersion":1.1},{"id":"61bf6661-fb9b-4179-b2b7-52f66c13cd39","name":"Sticky Note1","type":"n8n-nodes-base.stickyNote","position":[1136,672],"parameters":{"color":3,"width":256,"height":320,"content":"\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n## Action required\n\nMake sure to set your Lookio API key and assistant ID in here."},"typeVersion":1},{"id":"5a21675c-622b-48d5-b076-3c30eb861811","name":"Sticky Note","type":"n8n-nodes-base.stickyNote","position":[-736,224],"parameters":{"width":432,"height":576,"content":"# **Smart & Efficient RAG Chatbot**\n\nThis workflow optimizes AI costs and speed by using an **Intent Router** pattern. \n\nInstead of one big model doing everything, it routes simple chats to small models and saves the heavy-duty LLMs for complex knowledge retrieval via **Lookio**.\n\n## **How to use**\n1.  **Connect AI:** Add credentials to the three **OpenAI Language Model** nodes.\n2.  
**Configure Lookio:** In the **RAG via Lookio** node, replace the placeholders with your **API Key** and **Assistant ID** from [Lookio.app](https://www.lookio.app/).\n3.  **Test:** Use the chat window to ask questions about your uploaded documents.\n\n**Note:** For more on how to break down agents into modular steps, watch [this video guide](https://www.youtube.com/watch?v=BHdJFnx2wrc).\n\n*A template created by Guillaume Duvernay*"},"typeVersion":1},{"id":"d7dde290-e168-4dd2-b6fd-5b1ee6a592bd","name":"Sticky Note2","type":"n8n-nodes-base.stickyNote","position":[-240,368],"parameters":{"color":7,"width":582,"height":320,"content":"## 1. Receive user message & load the conversation"},"typeVersion":1},{"id":"744c2765-8163-4e99-bc8c-36d83cf659ab","name":"Sticky Note3","type":"n8n-nodes-base.stickyNote","position":[352,368],"parameters":{"color":7,"width":374,"height":320,"content":"## 2. AI decides whether RAG is needed"},"typeVersion":1},{"id":"9275c57e-f39a-4a29-bf23-0bf5082df61c","name":"Sticky Note4","type":"n8n-nodes-base.stickyNote","position":[896,208],"parameters":{"color":7,"width":374,"height":256,"content":"## 3.1. Writing a simple response"},"typeVersion":1},{"id":"37947331-e585-4f8e-ad43-90a2449a4458","name":"Sticky Note5","type":"n8n-nodes-base.stickyNote","position":[752,608],"parameters":{"color":7,"width":1046,"height":432,"content":"## 3.2. Writing a knowledge-retrieval-based response"},"typeVersion":1},{"id":"16863033-491b-47df-9da3-ef4f190569d1","name":"Sticky Note6","type":"n8n-nodes-base.stickyNote","position":[1840,400],"parameters":{"color":7,"width":598,"height":272,"content":"## 4. 
Response handling"},"typeVersion":1}],"connections":{"Mini model":{"ai_languageModel":[[{"node":"Prepare retrieval query","type":"ai_languageModel","index":0}]]},"Large model":{"ai_languageModel":[[{"node":"Write the final response","type":"ai_languageModel","index":0}]]},"Intent router":{"main":[[{"node":"Simple response","type":"main","index":0}],[{"node":"Prepare retrieval query","type":"main","index":0}]]},"Simple Memory":{"ai_memory":[[{"node":"Find past messages","type":"ai_memory","index":0},{"node":"Store messages","type":"ai_memory","index":0}]]},"RAG via Lookio":{"main":[[{"node":"Write the final response","type":"main","index":0}]]},"Store messages":{"main":[[]]},"Respond to Chat":{"main":[[{"node":"Store messages","type":"main","index":0}]]},"Simple response":{"main":[[{"node":"Respond to Chat","type":"main","index":0}]]},"Very small model":{"ai_languageModel":[[{"node":"Simple response","type":"ai_languageModel","index":0},{"node":"Intent router","type":"ai_languageModel","index":0}]]},"Find past messages":{"main":[[{"node":"Intent router","type":"main","index":0}]]},"Prepare retrieval query":{"main":[[{"node":"RAG via Lookio","type":"main","index":0}]]},"Write the final response":{"main":[[{"node":"Respond to Chat","type":"main","index":0}]]},"When chat message received":{"main":[[{"node":"Find past messages","type":"main","index":0}]]}}},"lastUpdatedBy":1,"workflowInfo":{"nodeCount":20,"nodeTypes":{"n8n-nodes-base.stickyNote":{"count":7},"n8n-nodes-base.httpRequest":{"count":1},"@n8n/n8n-nodes-langchain.chat":{"count":1},"@n8n/n8n-nodes-langchain.chainLlm":{"count":3},"@n8n/n8n-nodes-langchain.chatTrigger":{"count":1},"@n8n/n8n-nodes-langchain.lmChatOpenAi":{"count":3},"@n8n/n8n-nodes-langchain.memoryManager":{"count":2},"@n8n/n8n-nodes-langchain.textClassifier":{"count":1},"@n8n/n8n-nodes-langchain.memoryBufferWindow":{"count":1}}},"status":"published","readyToDemo":null,"user":{"name":"Guillaume Duvernay","username":"duv","bio":"AI and automation 
expert","verified":true,"links":["https://www.linkedin.com/in/guillaume-duvernay/"],"avatar":"https://gravatar.com/avatar/1e93ed2388069da40b3202c5566318982166f1a0b4c4c35c4802c8ca4de79991?r=pg&d=retro&size=200"},"nodes":[{"id":19,"icon":"file:httprequest.svg","name":"n8n-nodes-base.httpRequest","codex":{"data":{"alias":["API","Request","URL","Build","cURL"],"resources":{"generic":[{"url":"https://n8n.io/blog/2021-the-year-to-automate-the-new-you-with-n8n/","icon":"☀️","label":"2021: The Year to Automate the New You with n8n"},{"url":"https://n8n.io/blog/why-business-process-automation-with-n8n-can-change-your-daily-life/","icon":"🧬","label":"Why business process automation with n8n can change your daily life"},{"url":"https://n8n.io/blog/automatically-pulling-and-visualizing-data-with-n8n/","icon":"📈","label":"Automatically pulling and visualizing data with n8n"},{"url":"https://n8n.io/blog/learn-how-to-automatically-cross-post-your-content-with-n8n/","icon":"✍️","label":"Learn how to automatically cross-post your content with n8n"},{"url":"https://n8n.io/blog/automatically-adding-expense-receipts-to-google-sheets-with-telegram-mindee-twilio-and-n8n/","icon":"🧾","label":"Automatically Adding Expense Receipts to Google Sheets with Telegram, Mindee, Twilio, and n8n"},{"url":"https://n8n.io/blog/running-n8n-on-ships-an-interview-with-maranics/","icon":"🛳","label":"Running n8n on ships: An interview with Maranics"},{"url":"https://n8n.io/blog/what-are-apis-how-to-use-them-with-no-code/","icon":" 🪢","label":"What are APIs and how to use them with no code"},{"url":"https://n8n.io/blog/5-tasks-you-can-automate-with-notion-api/","icon":"⚡️","label":"5 tasks you can automate with the new Notion API "},{"url":"https://n8n.io/blog/world-poetry-day-workflow/","icon":"📜","label":"Celebrating World Poetry Day with a daily poem in Telegram"},{"url":"https://n8n.io/blog/automate-google-apps-for-productivity/","icon":"💡","label":"15 Google apps you can combine and automate to 
increase productivity"},{"url":"https://n8n.io/blog/automate-designs-with-bannerbear-and-n8n/","icon":"🎨","label":"Automate Designs with Bannerbear and n8n"},{"url":"https://n8n.io/blog/how-uproc-scraped-a-multi-page-website-with-a-low-code-workflow/","icon":" 🕸️","label":"How uProc scraped a multi-page website with a low-code workflow"},{"url":"https://n8n.io/blog/building-an-expense-tracking-app-in-10-minutes/","icon":"📱","label":"Building an expense tracking app in 10 minutes"},{"url":"https://n8n.io/blog/5-workflow-automations-for-mattermost-that-we-love-at-n8n/","icon":"🤖","label":"5 workflow automations for Mattermost that we love at n8n"},{"url":"https://n8n.io/blog/how-to-use-the-http-request-node-the-swiss-army-knife-for-workflow-automation/","icon":"🧰","label":"How to use the HTTP Request Node - The Swiss Army Knife for Workflow Automation"},{"url":"https://n8n.io/blog/learn-how-to-use-webhooks-with-mattermost-slash-commands/","icon":"🦄","label":"Learn how to use webhooks with Mattermost slash commands"},{"url":"https://n8n.io/blog/how-a-membership-development-manager-automates-his-work-and-investments/","icon":"📈","label":"How a Membership Development Manager automates his work and investments"},{"url":"https://n8n.io/blog/a-low-code-bitcoin-ticker-built-with-questdb-and-n8n-io/","icon":"📈","label":"A low-code bitcoin ticker built with QuestDB and n8n.io"},{"url":"https://n8n.io/blog/how-to-set-up-a-ci-cd-pipeline-with-no-code/","icon":"🎡","label":"How to set up a no-code CI/CD pipeline with GitHub and TravisCI"},{"url":"https://n8n.io/blog/automations-for-activists/","icon":"✨","label":"How Common Knowledge use workflow automation for activism"},{"url":"https://n8n.io/blog/creating-scheduled-text-affirmations-with-n8n/","icon":"🤟","label":"Creating scheduled text affirmations with n8n"},{"url":"https://n8n.io/blog/how-goomer-automated-their-operations-with-over-200-n8n-workflows/","icon":"🛵","label":"How Goomer automated their operations with over 200 
n8n workflows"},{"url":"https://n8n.io/blog/aws-workflow-automation/","label":"7 no-code workflow automations for Amazon Web Services"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.httprequest/"}]},"categories":["Development","Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Helpers"]}}},"group":"[\"output\"]","defaults":{"name":"HTTP Request","color":"#0004F5"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNDAiIGhlaWdodD0iNDAiIHZpZXdCb3g9IjAgMCA0MCA0MCIgZmlsbD0ibm9uZSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KPHBhdGggZmlsbC1ydWxlPSJldmVub2RkIiBjbGlwLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik00MCAyMEM0MCA4Ljk1MzE0IDMxLjA0NjkgMCAyMCAwQzguOTUzMTQgMCAwIDguOTUzMTQgMCAyMEMwIDMxLjA0NjkgOC45NTMxNCA0MCAyMCA0MEMzMS4wNDY5IDQwIDQwIDMxLjA0NjkgNDAgMjBaTTIwIDM2Ljk0NThDMTguODg1MiAzNi45NDU4IDE3LjEzNzggMzUuOTY3IDE1LjQ5OTggMzIuNjk4NUMxNC43OTY0IDMxLjI5MTggMTQuMTk2MSAyOS41NDMxIDEzLjc1MjYgMjcuNjg0N0gyNi4xODk4QzI1LjgwNDUgMjkuNTQwMyAyNS4yMDQ0IDMxLjI5MDEgMjQuNTAwMiAzMi42OTg1QzIyLjg2MjIgMzUuOTY3IDIxLjExNDggMzYuOTQ1OCAyMCAzNi45NDU4Wk0xMi45MDY0IDIwQzEyLjkwNjQgMjEuNjA5NyAxMy4wMDg3IDIzLjE2NCAxMy4yMDAzIDI0LjYzMDVIMjYuNzk5N0MyNi45OTEzIDIzLjE2NCAyNy4wOTM2IDIxLjYwOTcgMjcuMDkzNiAyMEMyNy4wOTM2IDE4LjM5MDMgMjYuOTkxMyAxNi44MzYgMjYuNzk5NyAxNS4zNjk1SDEzLjIwMDNDMTMuMDA4NyAxNi44MzYgMTIuOTA2NCAxOC4zOTAzIDEyLjkwNjQgMjBaTTIwIDMuMDU0MTlDMjEuMTE0OSAzLjA1NDE5IDIyLjg2MjIgNC4wMzA3OCAyNC41MDAxIDcuMzAwMzlDMjUuMjA2NiA4LjcxNDA4IDI1LjgwNzIgMTAuNDA2NyAyNi4xOTIgMTIuMzE1M0gxMy43NTAxQzE0LjE5MzMgMTAuNDA0NyAxNC43OTQyIDguNzEyNTQgMTUuNDk5OCA3LjMwMDY0QzE3LjEzNzcgNC4wMzA4MyAxOC44ODUxIDMuMDU0MTkgMjAgMy4wNTQxOVpNMzAuMTQ3OCAyMEMzMC4xNDc4IDE4LjQwOTkgMzAuMDU0MyAxNi44NjE3IDI5LjgyMjcgMTUuMzY5NUgzNi4zMDQyQzM2LjcyNTIgMTYuODQyIDM2Ljk0NTggMTguMzk2NCAzNi45NDU4IDIwQzM2Ljk0NTggMjEuNjAzNiAzNi43MjUyIDIzLjE1OCAzNi4zMDQyIDI0LjYzMDVIMjkuODIyN0MzMC4wNTQzIDIzLjEzODMgMzAuMTQ3OCAyMS41OTAxIDMwLjE0NzggMjBaTTI2LjI3NjcgNC4yNTUxMkMyNy42MzY1I
DYuMzYwMTkgMjguNzExIDkuMTMyIDI5LjM3NzQgMTIuMzE1M0gzNS4xMDQ2QzMzLjI1MTEgOC42NjggMzAuMTA3IDUuNzgzNDYgMjYuMjc2NyA0LjI1NTEyWk0xMC42MjI2IDEyLjMxNTNINC44OTI5M0M2Ljc1MTQ3IDguNjY3ODQgOS44OTM1MSA1Ljc4MzQxIDEzLjcyMzIgNC4yNTUxM0MxMi4zNjM1IDYuMzYwMjEgMTEuMjg5IDkuMTMyMDEgMTAuNjIyNiAxMi4zMTUzWk0zLjA1NDE5IDIwQzMuMDU0MTkgMjEuNjAzIDMuMjc3NDMgMjMuMTU3NSAzLjY5NDg0IDI0LjYzMDVIMTAuMTIxN0M5Ljk0NjE5IDIzLjE0MiA5Ljg1MjIyIDIxLjU5NDMgOS44NTIyMiAyMEM5Ljg1MjIyIDE4LjQwNTcgOS45NDYxOSAxNi44NTggMTAuMTIxNyAxNS4zNjk1SDMuNjk0ODRDMy4yNzc0MyAxNi44NDI1IDMuMDU0MTkgMTguMzk3IDMuMDU0MTkgMjBaTTI2LjI3NjYgMzUuNzQyN0MyNy42MzY1IDMzLjYzOTMgMjguNzExIDMwLjg2OCAyOS4zNzc0IDI3LjY4NDdIMzUuMTA0NkMzMy4yNTEgMzEuMzMyMiAzMC4xMDY4IDM0LjIxNzkgMjYuMjc2NiAzNS43NDI3Wk0xMy43MjM0IDM1Ljc0MjdDOS44OTM2OSAzNC4yMTc5IDYuNzUxNTUgMzEuMzMyNCA0Ljg5MjkzIDI3LjY4NDdIMTAuNjIyNkMxMS4yODkgMzAuODY4IDEyLjM2MzUgMzMuNjM5MyAxMy43MjM0IDM1Ljc0MjdaIiBmaWxsPSIjM0E0MkU5Ii8+Cjwvc3ZnPgo="},"displayName":"HTTP Request","typeVersion":4,"nodeCategories":[{"id":5,"name":"Development"},{"id":9,"name":"Core Nodes"}]},{"id":565,"icon":"fa:sticky-note","name":"n8n-nodes-base.stickyNote","codex":{"data":{"alias":["Comments","Notes","Sticky"],"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Helpers"]}}},"group":"[\"input\"]","defaults":{"name":"Sticky Note","color":"#FFD233"},"iconData":{"icon":"sticky-note","type":"icon"},"displayName":"Sticky Note","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":1123,"icon":"fa:link","name":"@n8n/n8n-nodes-langchain.chainLlm","codex":{"data":{"alias":["LangChain"],"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.chainllm/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Chains","Root Nodes"]}}},"group":"[\"transform\"]","defaults":{"name":"Basic LLM Chain","color":"#909298"},"iconData":{"icon":"link","type":"icon"},"displayName":"Basic LLM 
Chain","typeVersion":2,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1153,"icon":"file:openAiLight.svg","name":"@n8n/n8n-nodes-langchain.lmChatOpenAi","codex":{"data":{"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/sub-nodes/n8n-nodes-langchain.lmchatopenai/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Language Models","Root Nodes"],"Language Models":["Chat Models (Recommended)"]}}},"group":"[\"transform\"]","defaults":{"name":"OpenAI Chat Model"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNDAiIGhlaWdodD0iNDAiIHZpZXdCb3g9IjAgMCA0MCA0MCIgZmlsbD0ibm9uZSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KPHBhdGggZD0iTTM2Ljg2NzEgMTYuMzcxOEMzNy43NzQ2IDEzLjY0OCAzNy40NjIxIDEwLjY2NDIgMzYuMDEwOCA4LjE4NjYxQzMzLjgyODIgNC4zODY1MyAyOS40NDA3IDIuNDMxNDkgMjUuMTU1NiAzLjM1MTUxQzIzLjI0OTMgMS4yMDM5NiAyMC41MTA1IC0wLjAxNzMxNDggMTcuNjM5MiAwLjAwMDE4NTUzM0MxMy4yNTkxIC0wLjAwOTgxNDY4IDkuMzcyNzMgMi44MTAyNSA4LjAyNTIgNi45Nzc4M0M1LjIxMTM5IDcuNTU0MSAyLjc4MjU4IDkuMzE1MzggMS4zNjEzIDExLjgxMTdDLTAuODM3NDkzIDE1LjYwMTggLTAuMzM2MjMyIDIwLjM3OTQgMi42MDEzMyAyMy42Mjk0QzEuNjkzODEgMjYuMzUzMiAyLjAwNjMyIDI5LjMzNzEgMy40NTc2IDMxLjgxNDZDNS42NDAxNSAzNS42MTQ3IDEwLjAyNzcgMzcuNTY5NyAxNC4zMTI4IDM2LjY0OTdDMTYuMjE3OSAzOC43OTczIDE4Ljk1NzkgNDAuMDE4NSAyMS44MjkyIDM5Ljk5OThDMjYuMjExOCA0MC4wMTEgMzAuMDk5NCAzNy4xODg1IDMxLjQ0NjkgMzMuMDE3MUMzNC4yNjA4IDMyLjQ0MDkgMzYuNjg5NiAzMC42Nzk2IDM4LjExMDggMjguMTgzM0M0MC4zMDcxIDI0LjM5MzIgMzkuODA0NiAxOS42MTk0IDM2Ljg2ODMgMTYuMzY5M0wzNi44NjcxIDE2LjM3MThaTTIxLjgzMTcgMzcuMzg2QzIwLjA3OCAzNy4zODg1IDE4LjM3OTIgMzYuNzc0NyAxNy4wMzI5IDM1LjY1MDlDMTcuMDk0MSAzNS42MTg0IDE3LjIwMDQgMzUuNTU5NyAxNy4yNjkxIDM1LjUxNzJMMjUuMjM0MyAzMC45MTcxQzI1LjY0MTggMzAuNjg1OCAyNS44OTE4IDMwLjI1MjEgMjUuODg5MyAyOS43ODMzVjE4LjU1NDNMMjkuMjU1NyAyMC40OTgxQzI5LjI5MTkgMjAuNTE1NiAyOS4zMTU3IDIwLjU1MDYgMjkuMzIwNyAyMC41OTA2VjI5Ljg4OTZDMjkuMzE1NyAzNC4wMjQ3IDI1Ljk2NjggMzcuMzc3MiAyMS44MzE3IDM3LjM4NlpNNS43MjY0IDMwLjU
wNzFDNC44NDc2MyAyOC45ODk2IDQuNTMxMzcgMjcuMjEwOCA0LjgzMjYzIDI1LjQ4NDVDNC44OTEzOCAyNS41MTk1IDQuOTk1MTMgMjUuNTgzMiA1LjA2ODg4IDI1LjYyNTdMMTMuMDM0MSAzMC4yMjU4QzEzLjQzNzggMzAuNDYyMSAxMy45Mzc4IDMwLjQ2MjEgMTQuMzQyOCAzMC4yMjU4TDI0LjA2NjggMjQuNjEwN1YyOC40OTgzQzI0LjA2OTMgMjguNTM4MyAyNC4wNTA1IDI4LjU3NyAyNC4wMTkzIDI4LjYwMkwxNS45Njc5IDMzLjI1MDlDMTIuMzgxNSAzNS4zMTU5IDcuODAxNDQgMzQuMDg4NCA1LjcyNzY1IDMwLjUwNzFINS43MjY0Wk0zLjYzMDEgMTMuMTIwNUM0LjUwNTEyIDExLjYwMDQgNS44ODY0IDEwLjQzNzkgNy41MzE0NCA5LjgzNDE1QzcuNTMxNDQgOS45MDI5IDcuNTI3NjkgMTAuMDI0MiA3LjUyNzY5IDEwLjEwOTJWMTkuMzEwNkM3LjUyNTE5IDE5Ljc3ODEgNy43NzUxOSAyMC4yMTE5IDguMTgxNDUgMjAuNDQzMUwxNy45MDU0IDI2LjA1N0wxNC41MzkxIDI4LjAwMDhDMTQuNTA1MyAyOC4wMjMzIDE0LjQ2MjggMjguMDI3IDE0LjQyNTMgMjguMDEwOEw2LjM3MjY2IDIzLjM1ODJDMi43OTM4MyAyMS4yODU2IDEuNTY2MzEgMTYuNzA2OCAzLjYyODg1IDEzLjEyMTdMMy42MzAxIDEzLjEyMDVaTTMxLjI4ODIgMTkuNTU2OUwyMS41NjQyIDEzLjk0MTdMMjQuOTMwNiAxMS45OTkyQzI0Ljk2NDMgMTEuOTc2NyAyNS4wMDY4IDExLjk3MjkgMjUuMDQ0MyAxMS45ODkyTDMzLjA5NyAxNi42MzhDMzYuNjgyMSAxOC43MDkzIDM3LjkxMDggMjMuMjk1NyAzNS44Mzk1IDI2Ljg4MDhDMzQuOTYzMyAyOC4zOTgzIDMzLjU4MzIgMjkuNTYwOCAzMS45Mzk1IDMwLjE2NThWMjAuNjg5NEMzMS45NDMyIDIwLjIyMTkgMzEuNjk0NSAxOS43ODk0IDMxLjI4OTQgMTkuNTU2OUgzMS4yODgyWk0zNC42MzgzIDE0LjUxNDJDMzQuNTc5NSAxNC40NzggMzQuNDc1OCAxNC40MTU1IDM0LjQwMiAxNC4zNzNMMjYuNDM2OCA5Ljc3Mjg5QzI2LjAzMzEgOS41MzY2NCAyNS41MzMxIDkuNTM2NjQgMjUuMTI4MSA5Ljc3Mjg5TDE1LjQwNDEgMTUuMzg4VjExLjUwMDRDMTUuNDAxNiAxMS40NjA0IDE1LjQyMDQgMTEuNDIxNyAxNS40NTE2IDExLjM5NjdMMjMuNTAzIDYuNzUxNThDMjcuMDg5NCA0LjY4Mjc5IDMxLjY3NDUgNS45MTQwNiAzMy43NDIgOS41MDE2NEMzNC42MTU4IDExLjAxNjcgMzQuOTMyIDEyLjc5MDUgMzQuNjM1OCAxNC41MTQySDM0LjYzODNaTTEzLjU3NDEgMjEuNDQzMUwxMC4yMDY1IDE5LjQ5OTRDMTAuMTcwMiAxOS40ODE5IDEwLjE0NjUgMTkuNDQ2OCAxMC4xNDE1IDE5LjQwNjhWMTAuMTA3OUMxMC4xNDQgNS45Njc4MSAxMy41MDI4IDIuNjEyNzQgMTcuNjQyOSAyLjYxNTI0QzE5LjM5NDIgMi42MTUyNCAyMS4wODkyIDMuMjMwMjUgMjIuNDM1NSA0LjM1MDI4QzIyLjM3NDMgNC4zODI3OCAyMi4yNjkzIDQuNDQxNTMgMjIuMTk5MiA0LjQ4NDAzTDE0LjIzNDEgOS4wODQxM0MxMy44MjY2IDkuMzE1MzggMTMuNTc2NiA5Ljc0Nzg5IDEzLjU3OTE
gMTAuMjE2N0wxMy41NzQxIDIxLjQ0MDZWMjEuNDQzMVpNMTUuNDAyOSAxNy41MDA2TDE5LjczNDIgMTQuOTk5M0wyNC4wNjU1IDE3LjQ5OTNWMjIuNTAwN0wxOS43MzQyIDI1LjAwMDdMMTUuNDAyOSAyMi41MDA3VjE3LjUwMDZaIiBmaWxsPSIjN0Q3RDg3Ii8+Cjwvc3ZnPgo="},"displayName":"OpenAI Chat Model","typeVersion":1,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1163,"icon":"fa:database","name":"@n8n/n8n-nodes-langchain.memoryBufferWindow","codex":{"data":{"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/sub-nodes/n8n-nodes-langchain.memorybufferwindow/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Memory"],"Memory":["For beginners"]}}},"group":"[\"transform\"]","defaults":{"name":"Simple Memory"},"iconData":{"icon":"database","type":"icon"},"displayName":"Simple Memory","typeVersion":1,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1246,"icon":"fa:database","name":"@n8n/n8n-nodes-langchain.memoryManager","codex":{"data":{"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/sub-nodes/n8n-nodes-langchain.memorymanager/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Miscellaneous","Root Nodes"]}}},"group":"[\"transform\"]","defaults":{"name":"Chat Memory Manager"},"iconData":{"icon":"database","type":"icon"},"displayName":"Chat Memory Manager","typeVersion":1,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1247,"icon":"fa:comments","name":"@n8n/n8n-nodes-langchain.chatTrigger","codex":{"data":{"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.chattrigger/"}]},"categories":["Core Nodes","Langchain"]}},"group":"[\"trigger\"]","defaults":{"name":"When chat message received"},"iconData":{"icon":"comments","type":"icon"},"displayName":"Chat Trigger","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core 
Nodes"},{"id":26,"name":"Langchain"}]},{"id":1265,"icon":"fa:tags","name":"@n8n/n8n-nodes-langchain.textClassifier","codex":{"data":{"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.text-classifier/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Chains","Root Nodes"]}}},"group":"[\"transform\"]","defaults":{"name":"Text Classifier"},"iconData":{"icon":"tags","type":"icon"},"displayName":"Text Classifier","typeVersion":1,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1313,"icon":"fa:comments","name":"@n8n/n8n-nodes-langchain.chat","codex":{"data":{"alias":["human","wait","hitl","respond","approve","confirm","send","message"],"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.respondtochat/"}]},"categories":["Core Nodes","HITL","Langchain"],"subcategories":{"HITL":["Human in the Loop"]}}},"group":"[\"input\"]","defaults":{"name":"Chat"},"iconData":{"icon":"comments","type":"icon"},"displayName":"Chat","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"},{"id":26,"name":"Langchain"},{"id":28,"name":"HITL"}]}],"categories":[{"id":40,"name":"Support Chatbot"},{"id":48,"name":"AI RAG"}],"image":[]}}