{"workflow":{"id":14662,"name":"Translate and dub spokesperson videos using Anthropic and deAPI","views":11,"recentViews":1,"totalViews":11,"createdAt":"2026-04-03T02:34:34.802Z","description":"## Who is this for?\n\n- Marketing teams localizing video content for international markets\n- E-commerce brands creating product videos for multiple regions\n- Agencies producing multilingual ad campaigns for global clients\n- Educators and trainers adapting video courses for different language audiences\n- Anyone who wants to localize a spokesperson video without re-filming\n\n## What problem does this solve?\n\nLocalizing a video for a new market usually means hiring a local presenter, re-filming the entire video, or settling for subtitles that nobody reads. This workflow takes an existing spokesperson video, transcribes it, translates the speech into a target language, generates dubbed audio, and produces a lip-synced talking-head video with a locally-relevant face — all without a camera or a casting call.\n\n## What this workflow does\n\n1. **Reads** the original spokesperson video and a reference image of the local presenter in parallel\n2. **Transcribes** the video's audio to text using deAPI (Whisper Large V3)\n3. **Extracts** the raw transcript text from the transcription result\n4. **Translates** the transcript into the target language with the AI Agent, preserving tone and pacing\n5. **Generates** dubbed speech in the target language using deAPI text-to-speech (Qwen3 TTS Custom Voice)\n6. 
**Generates** a lip-synced talking-head video from the dubbed audio using deAPI audio-to-video generation (LTX-2.3 22B), with the local presenter image as the first frame\n\n## Setup\n\n### Requirements\n\n- [deAPI](https://deapi.ai) account for transcription, TTS, and video generation\n- Anthropic account for the AI Agent (translation)\n- A spokesperson video\n- A reference image of the local presenter (JPG, JPEG, PNG, GIF, BMP, WebP — max 10 MB)\n\n### Installing the deAPI Node\n\n- On both n8n Cloud and self-hosted instances, go to **Settings** → **Community Nodes** and install `n8n-nodes-deapi`\n\n### Configuration\n\n1. Add your deAPI credentials (API key + webhook secret)\n2. Add your Anthropic credentials (API key)\n3. Update the **File Path** in the \"Read Source Video\" node to point to your spokesperson video\n4. Update the **File Path** in the \"Read Local Presenter Image\" node to point to the reference image\n5. Edit the **Set Fields** node to set the target language (e.g., \"Spanish\", \"Japanese\", \"French\")\n6. 
Ensure your n8n instance is on HTTPS\n\n## How to customize this workflow\n\n- **Change the AI model**: Swap Anthropic for OpenAI, Google Gemini, or any other LLM provider for translation\n- **Change the TTS model**: Switch Qwen3 TTS Custom Voice for Kokoro or Chatterbox for different voice characteristics\n- **Use voice cloning**: Replace the Generate Speech node with Clone a Voice to preserve the original speaker's voice in the target language\n- **Batch processing**: Replace the Manual Trigger with a Google Sheets or Airtable trigger containing rows for each target language and local presenter image\n- **Add delivery**: Append a Gmail, Slack, or Google Drive node to automatically deliver the localized video\n","workflow":{"meta":{"templateCredsSetupCompleted":true},"name":"Multilingual Video Localization","tags":[],"nodes":[{"id":"5a7d171d-f9f3-4a92-ad0e-4f4bea085d2c","name":"Sticky Note - Overview","type":"n8n-nodes-base.stickyNote","position":[1360,880],"parameters":{"width":668,"height":960,"content":"## Try It Out!\n### Localize a spokesperson video into another language with a new presenter — no filming required.\n\nThis workflow transcribes a video, translates the speech, generates dubbed audio, and creates a lip-synced video.\n\n### How it works\n1. **Manual Trigger** starts the workflow\n2. **Set Fields** defines the target language\n3. **Read Source Video** and **Read Local Presenter Image** load the input files in parallel\n4. **deAPI Transcribe Video** extracts the original speech as text with timestamps\n5. **AI Agent** translates the transcript into the target language\n6. **deAPI Generate Speech** creates dubbed audio in the target language\n7. 
**deAPI Generate From Audio** produces a lip-synced talking-head video from the dubbed audio, using the local presenter image as the first frame\n\n### Requirements\n- [deAPI](https://deapi.ai) account for transcription, TTS, and video generation\n- Anthropic account for the AI Agent (translation)\n- A spokesperson video\n- A reference image of the local presenter\n- n8n instance must be on **HTTPS**\n\n### Need Help?\nJoin the [n8n Discord](https://discord.gg/n8n) or ask in the [Forum](https://community.n8n.io/)!\n\nHappy Automating!"},"typeVersion":1},{"id":"4845a8b4-833f-40b4-a819-306e59b27d13","name":"Sticky Note - Example","type":"n8n-nodes-base.stickyNote","position":[-64,1888],"parameters":{"color":6,"width":380,"height":400,"content":"### Example Input\n\n**Source Video:**\nAn 8-second clip of a presenter speaking in English\n\n**Local Presenter Image:**\nA photo of the person who should appear in the localized video\n\n**Target Language:**\nSpanish"},"typeVersion":1},{"id":"7a3b16f3-4445-49b7-968e-c80368a31265","name":"Sticky Note - Trigger","type":"n8n-nodes-base.stickyNote","position":[352,1888],"parameters":{"color":7,"width":400,"height":460,"content":"## 1. Start & Configure\nClick **Test Workflow** to run.\n\nThe **Set Fields** node defines:\n- **target_language** — the language for the localized video (e.g. Spanish, Japanese, French)"},"typeVersion":1},{"id":"6940f5b8-44b6-4249-bd6c-d33240ef2758","name":"Sticky Note - Read Files","type":"n8n-nodes-base.stickyNote","position":[784,1712],"parameters":{"color":7,"width":452,"height":764,"content":"## 2. 
Load Files\nReads both input files in parallel:\n\n**Source Video** (top branch)\n- Output field: `video`\n- The original spokesperson video\n- Formats: MP4, MPEG, MOV, AVI, WMV, OGG\n\n**Local Presenter Image** (bottom branch)\n- Output field: `image`\n- Reference photo of the local presenter\n- Formats: JPG, JPEG, PNG, GIF, BMP, WebP\n- Max size: 10 MB\n\nUpdate the **File Path** in each node."},"typeVersion":1},{"id":"f34468c8-4e5b-4f2c-8c71-04422f1add7f","name":"Sticky Note - Transcribe","type":"n8n-nodes-base.stickyNote","position":[1280,1888],"parameters":{"color":7,"width":424,"height":476,"content":"## 3. Transcribe\n[deAPI Documentation](https://docs.deapi.ai)\n\n**Transcribe Video** uses **Whisper Large V3** to extract the spoken text from the video.\n\nTimestamps are included so the AI can preserve pacing during translation."},"typeVersion":1},{"id":"dc61ce3a-0ed5-4f2f-8642-529c58eb5ef8","name":"Sticky Note - Translate","type":"n8n-nodes-base.stickyNote","position":[1744,1888],"parameters":{"color":7,"width":400,"height":540,"content":"## 4. Translate\nThe **AI Agent** translates the transcript into the target language.\n\nIt preserves the natural tone and pacing of the original speech, adapting idioms and cultural references for the target audience."},"typeVersion":1},{"id":"6477e3df-490a-4a37-8b9c-9a6e8c5d11ad","name":"Sticky Note - TTS","type":"n8n-nodes-base.stickyNote","position":[2176,1888],"parameters":{"color":7,"width":400,"height":492,"content":"## 5. Generate Dubbed Speech\n[deAPI Documentation](https://docs.deapi.ai)\n\n**Generate Speech** uses **Qwen3** to create natural-sounding speech from the translated text.\n\nSwap for **Clone a Voice** to preserve the original speaker's voice characteristics."},"typeVersion":1},{"id":"af9b5122-2305-4803-880d-efaa7bfc28de","name":"Sticky Note - Audio to Video","type":"n8n-nodes-base.stickyNote","position":[2608,1888],"parameters":{"color":7,"width":528,"height":588,"content":"## 6. 
Generate Lip-Synced Video\n[deAPI Documentation](https://docs.deapi.ai)\n\n**Generate From Audio** uses **LTX-2.3 22B** to create a talking-head video synced to the dubbed speech.\n\nThe local presenter image is used as the first frame to guide the visual appearance."},"typeVersion":1},{"id":"c1a91124-f87e-43fe-bd88-74ff43a00a7b","name":"Manual Trigger","type":"n8n-nodes-base.manualTrigger","position":[384,2192],"parameters":{},"typeVersion":1},{"id":"f12fc28b-b55d-4b9e-b3f9-059751d56012","name":"Set Fields","type":"n8n-nodes-base.set","position":[608,2192],"parameters":{"options":{},"assignments":{"assignments":[{"id":"field-target-language","name":"target_language","type":"string","value":"Spanish"}]}},"typeVersion":3.4},{"id":"d13bdc81-57a2-4a9d-be84-76826691d664","name":"Read Source Video","type":"n8n-nodes-base.readWriteFile","position":[960,2096],"parameters":{"options":{"dataPropertyName":"video"},"fileSelector":"/path/to/your/spokesperson-video.mp4"},"typeVersion":1},{"id":"6ef5ce65-2131-4608-b307-17459c0f7ce7","name":"Read Local Presenter Image","type":"n8n-nodes-base.readWriteFile","position":[960,2288],"parameters":{"options":{"dataPropertyName":"image"},"fileSelector":"/path/to/your/local-presenter.jpg"},"typeVersion":1},{"id":"baccfe84-3b58-439e-816a-7602083bd603","name":"deAPI Transcribe Video","type":"n8n-nodes-deapi.deapi","position":[1344,2096],"parameters":{"source":"binary","options":{"waitTimeout":120},"resource":"video","operation":"transcribe","binaryPropertyName":"video"},"credentials":{"deApi":{"id":"YOUR_DEAPI_CREDENTIAL_ID","name":"deAPI account"}},"typeVersion":1},{"id":"a685c3d7-ba6e-4bf9-88c1-a2ffca1a628e","name":"Extract from File","type":"n8n-nodes-base.extractFromFile","position":[1552,2096],"parameters":{"options":{},"operation":"text","destinationKey":"text"},"typeVersion":1.1},{"id":"91159328-15ff-4f45-abf8-9189631d8bf4","name":"AI 
Agent","type":"@n8n/n8n-nodes-langchain.agent","position":[1840,2096],"parameters":{"text":"=Translate the following transcript into {{ $('Set Fields').item.json.target_language }}.\n\nReturn ONLY the translated text, without timestamps, line numbers, or formatting. Preserve the natural pacing and tone of the original speech.\n\nTranscript:\n{{ $json.text }}","options":{"systemMessage":"You are a professional translator specializing in video localization. Your goal is to produce natural-sounding translations that work well when spoken aloud.\n\nKey principles:\n- Preserve the tone, energy, and intent of the original speech\n- Adapt idioms and cultural references for the target audience\n- Keep sentences at a similar length to the original for lip-sync compatibility\n- Use natural spoken language, not formal written style\n- Return ONLY the translated text — no explanations, notes, or formatting"},"promptType":"define"},"typeVersion":1.7},{"id":"9bdee101-3077-46f9-8796-7baa8488bf8d","name":"Anthropic Chat Model","type":"@n8n/n8n-nodes-langchain.lmChatAnthropic","position":[1840,2288],"parameters":{"model":{"__rl":true,"mode":"list","value":"claude-opus-4-6","cachedResultName":"Claude Opus 4.6"},"options":{}},"credentials":{"anthropicApi":{"id":"YOUR_ANTHROPIC_CREDENTIAL_ID","name":"Anthropic account"}},"typeVersion":1.3},{"id":"10f8b667-47bb-4333-a5d6-18b0617562f7","name":"deAPI Generate Speech","type":"n8n-nodes-deapi.deapi","position":[2336,2096],"parameters":{"text":"={{ $json.output }}","model":"Qwen3_TTS_12Hz_1_7B_CustomVoice","options":{"waitTimeout":120},"resource":"audio","operation":"generateSpeech","qwen3Lang":"={{ $('Set Fields').item.json.target_language }}"},"credentials":{"deApi":{"id":"YOUR_DEAPI_CREDENTIAL_ID","name":"deAPI account"}},"typeVersion":1},{"id":"bdb0e104-d7a2-4d39-b30f-d7b000bda81c","name":"Merge Audio + 
Image","type":"n8n-nodes-base.merge","position":[2672,2272],"parameters":{"mode":"combine","options":{},"combineBy":"combineByPosition"},"typeVersion":3.2},{"id":"8d95a8c1-38ba-469b-94bb-41b4faedb3f4","name":"deAPI Generate From Audio","type":"n8n-nodes-deapi.deapi","position":[2896,2272],"parameters":{"prompt":"A person speaking naturally to the camera, subtle head movements and facial expressions, professional lighting, medium close-up shot, steady camera","options":{"frames":241,"firstFrame":"image","waitTimeout":300},"resource":"video","operation":"generateFromAudio"},"credentials":{"deApi":{"id":"YOUR_DEAPI_CREDENTIAL_ID","name":"deAPI account"}},"typeVersion":1}],"active":false,"pinData":{},"settings":{"executionOrder":"v1"},"connections":{"AI Agent":{"main":[[{"node":"deAPI Generate Speech","type":"main","index":0}]]},"Set Fields":{"main":[[{"node":"Read Source Video","type":"main","index":0},{"node":"Read Local Presenter Image","type":"main","index":0}]]},"Manual Trigger":{"main":[[{"node":"Set Fields","type":"main","index":0}]]},"Extract from File":{"main":[[{"node":"AI Agent","type":"main","index":0}]]},"Read Source Video":{"main":[[{"node":"deAPI Transcribe Video","type":"main","index":0}]]},"Merge Audio + Image":{"main":[[{"node":"deAPI Generate From Audio","type":"main","index":0}]]},"Anthropic Chat Model":{"ai_languageModel":[[{"node":"AI Agent","type":"ai_languageModel","index":0}]]},"deAPI Generate Speech":{"main":[[{"node":"Merge Audio + Image","type":"main","index":0}]]},"deAPI Transcribe Video":{"main":[[{"node":"Extract from File","type":"main","index":0}]]},"deAPI Generate From Audio":{"main":[[]]},"Read Local Presenter Image":{"main":[[{"node":"Merge Audio + 
Image","type":"main","index":1}]]}}},"lastUpdatedBy":1,"workflowInfo":{"nodeCount":19,"nodeTypes":{"n8n-nodes-base.set":{"count":1},"n8n-nodes-base.merge":{"count":1},"n8n-nodes-deapi.deapi":{"count":3},"n8n-nodes-base.stickyNote":{"count":8},"n8n-nodes-base.manualTrigger":{"count":1},"n8n-nodes-base.readWriteFile":{"count":2},"@n8n/n8n-nodes-langchain.agent":{"count":1},"n8n-nodes-base.extractFromFile":{"count":1},"@n8n/n8n-nodes-langchain.lmChatAnthropic":{"count":1}}},"status":"published","readyToDemo":null,"user":{"name":"deAPI Team","username":"deapi","bio":"deAPI Team maintains the official n8n community node for deAPI — a unified API for open-source AI models including image generation, video creation, and transcription, powered by decentralized GPUs at up to 20× lower costs. We build ready-to-use workflow templates that help developers and teams integrate AI media generation into n8n — no infrastructure, no code.","verified":true,"links":["https://deapi.ai/"],"avatar":"https://gravatar.com/avatar/7866869a4e009d12f65f75f3c2e25eed777c19329a51ce803491dcec60ae0c98?r=pg&d=retro&size=200"},"nodes":[{"id":24,"icon":"file:merge.svg","name":"n8n-nodes-base.merge","codex":{"data":{"alias":["Join","Concatenate","Wait"],"resources":{"generic":[{"url":"https://n8n.io/blog/how-to-sync-data-between-two-systems/","icon":"🏬","label":"How to synchronize data between two systems (one-way vs. 
two-way sync)"},{"url":"https://n8n.io/blog/supercharging-your-conference-registration-process-with-n8n/","icon":"🎫","label":"Supercharging your conference registration process with n8n"},{"url":"https://n8n.io/blog/migrating-community-metrics-to-orbit-using-n8n/","icon":"📈","label":"Migrating Community Metrics to Orbit using n8n"},{"url":"https://n8n.io/blog/build-your-own-virtual-assistant-with-n8n-a-step-by-step-guide/","icon":"👦","label":"Build your own virtual assistant with n8n: A step by step guide"},{"url":"https://n8n.io/blog/sending-automated-congratulations-with-google-sheets-twilio-and-n8n/","icon":"🙌","label":"Sending Automated Congratulations with Google Sheets, Twilio, and n8n"},{"url":"https://n8n.io/blog/aws-workflow-automation/","label":"7 no-code workflow automations for Amazon Web Services"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.merge/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Flow","Data 
Transformation"]}}},"group":"[\"transform\"]","defaults":{"name":"Merge"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNTEyIiBoZWlnaHQ9IjUxMiIgdmlld0JveD0iMCAwIDUxMiA1MTIiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CjxnIGNsaXAtcGF0aD0idXJsKCNjbGlwMF8xMTc3XzUxOCkiPgo8cGF0aCBmaWxsLXJ1bGU9ImV2ZW5vZGQiIGNsaXAtcnVsZT0iZXZlbm9kZCIgZD0iTTAgNDhDMCAyMS40OTAzIDIxLjQ5MDMgMCA0OCAwSDExMkMxMzguNTEgMCAxNjAgMjEuNDkwMyAxNjAgNDhWNTZIMTk2LjI1MkMyNDAuNDM1IDU2IDI3Ni4yNTIgOTEuODE3MiAyNzYuMjUyIDEzNlYxOTJDMjc2LjI1MiAyMTQuMDkxIDI5NC4xNjEgMjMyIDMxNi4yNTIgMjMySDM1MlYyMjRDMzUyIDE5Ny40OSAzNzMuNDkgMTc2IDQwMCAxNzZINDY0QzQ5MC41MSAxNzYgNTEyIDE5Ny40OSA1MTIgMjI0VjI4OEM1MTIgMzE0LjUxIDQ5MC41MSAzMzYgNDY0IDMzNkg0MDBDMzczLjQ5IDMzNiAzNTIgMzE0LjUxIDM1MiAyODhWMjgwSDMxNi4yNTJDMjk0LjE2MSAyODAgMjc2LjI1MiAyOTcuOTA5IDI3Ni4yNTIgMzIwVjM3NkMyNzYuMjUyIDQyMC4xODMgMjQwLjQzNSA0NTYgMTk2LjI1MiA0NTZIMTYwVjQ2NEMxNjAgNDkwLjUxIDEzOC41MSA1MTIgMTEyIDUxMkg0OEMyMS40OTAzIDUxMiAwIDQ5MC41MSAwIDQ2NFY0MDBDMCAzNzMuNDkgMjEuNDkwMyAzNTIgNDggMzUySDExMkMxMzguNTEgMzUyIDE2MCAzNzMuNDkgMTYwIDQwMFY0MDhIMTk2LjI1MkMyMTMuOTI1IDQwOCAyMjguMjUyIDM5My42NzMgMjI4LjI1MiAzNzZWMzIwQzIyOC4yNTIgMjk0Ljc4NCAyMzguODU5IDI3Mi4wNDQgMjU1Ljg1MyAyNTZDMjM4Ljg1OSAyMzkuOTU2IDIyOC4yNTIgMjE3LjIxNiAyMjguMjUyIDE5MlYxMzZDMjI4LjI1MiAxMTguMzI3IDIxMy45MjUgMTA0IDE5Ni4yNTIgMTA0SDE2MFYxMTJDMTYwIDEzOC41MSAxMzguNTEgMTYwIDExMiAxNjBINDhDMjEuNDkwMyAxNjAgMCAxMzguNTEgMCAxMTJWNDhaTTEwNCA0OEMxMDguNDE4IDQ4IDExMiA1MS41ODE3IDExMiA1NlYxMDRDMTEyIDEwOC40MTggMTA4LjQxOCAxMTIgMTA0IDExMkg1NkM1MS41ODE3IDExMiA0OCAxMDguNDE4IDQ4IDEwNFY1NkM0OCA1MS41ODE3IDUxLjU4MTcgNDggNTYgNDhIMTA0Wk00NTYgMjI0QzQ2MC40MTggMjI0IDQ2NCAyMjcuNTgyIDQ2NCAyMzJWMjgwQzQ2NCAyODQuNDE4IDQ2MC40MTggMjg4IDQ1NiAyODhINDA4QzQwMy41ODIgMjg4IDQwMCAyODQuNDE4IDQwMCAyODBWMjMyQzQwMCAyMjcuNTgyIDQwMy41ODIgMjI0IDQwOCAyMjRINDU2Wk0xMTIgNDA4QzExMiA0MDMuNTgyIDEwOC40MTggNDAwIDEwNCA0MDBINTZDNTEuNTgxNyA0MDAgNDggNDAzLjU4MiA0OCA0MDhWNDU2QzQ4IDQ2MC40MTggNTEuNTgxNyA0NjQgNTYgNDY0SDEwNEMxMDguNDE4IDQ2NCAxMTIg
NDYwLjQxOCAxMTIgNDU2VjQwOFoiIGZpbGw9IiM1NEI4QzkiLz4KPC9nPgo8ZGVmcz4KPGNsaXBQYXRoIGlkPSJjbGlwMF8xMTc3XzUxOCI+CjxyZWN0IHdpZHRoPSI1MTIiIGhlaWdodD0iNTEyIiBmaWxsPSJ3aGl0ZSIvPgo8L2NsaXBQYXRoPgo8L2RlZnM+Cjwvc3ZnPgo="},"displayName":"Merge","typeVersion":3,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":38,"icon":"fa:pen","name":"n8n-nodes-base.set","codex":{"data":{"alias":["Set","JS","JSON","Filter","Transform","Map"],"resources":{"generic":[{"url":"https://n8n.io/blog/learn-to-automate-your-factorys-incident-reporting-a-step-by-step-guide/","icon":"🏭","label":"Learn to Automate Your Factory's Incident Reporting: A Step by Step Guide"},{"url":"https://n8n.io/blog/2021-the-year-to-automate-the-new-you-with-n8n/","icon":"☀️","label":"2021: The Year to Automate the New You with n8n"},{"url":"https://n8n.io/blog/automatically-pulling-and-visualizing-data-with-n8n/","icon":"📈","label":"Automatically pulling and visualizing data with n8n"},{"url":"https://n8n.io/blog/database-monitoring-and-alerting-with-n8n/","icon":"📡","label":"Database Monitoring and Alerting with n8n"},{"url":"https://n8n.io/blog/automatically-adding-expense-receipts-to-google-sheets-with-telegram-mindee-twilio-and-n8n/","icon":"🧾","label":"Automatically Adding Expense Receipts to Google Sheets with Telegram, Mindee, Twilio, and n8n"},{"url":"https://n8n.io/blog/no-code-ecommerce-workflow-automations/","icon":"store","label":"6 e-commerce workflows to power up your Shopify s"},{"url":"https://n8n.io/blog/how-to-build-a-low-code-self-hosted-url-shortener/","icon":"🔗","label":"How to build a low-code, self-hosted URL shortener in 3 steps"},{"url":"https://n8n.io/blog/automate-your-data-processing-pipeline-in-9-steps-with-n8n/","icon":"⚙️","label":"Automate your data processing pipeline in 9 steps"},{"url":"https://n8n.io/blog/how-to-get-started-with-crm-automation-and-no-code-workflow-ideas/","icon":"👥","label":"How to get started with CRM automation (with 3 no-code workflow 
ideas)"},{"url":"https://n8n.io/blog/5-tasks-you-can-automate-with-notion-api/","icon":"⚡️","label":"5 tasks you can automate with the new Notion API"},{"url":"https://n8n.io/blog/automate-google-apps-for-productivity/","icon":"💡","label":"15 Google apps you can combine and automate to increase productivity"},{"url":"https://n8n.io/blog/how-uproc-scraped-a-multi-page-website-with-a-low-code-workflow/","icon":"🕸️","label":"How uProc scraped a multi-page website with a low-code workflow"},{"url":"https://n8n.io/blog/building-an-expense-tracking-app-in-10-minutes/","icon":"📱","label":"Building an expense tracking app in 10 minutes"},{"url":"https://n8n.io/blog/the-ultimate-guide-to-automate-your-video-collaboration-with-whereby-mattermost-and-n8n/","icon":"📹","label":"The ultimate guide to automate your video collaboration with Whereby, Mattermost, and n8n"},{"url":"https://n8n.io/blog/5-workflow-automations-for-mattermost-that-we-love-at-n8n/","icon":"🤖","label":"5 workflow automations for Mattermost that we love at n8n"},{"url":"https://n8n.io/blog/learn-to-build-powerful-api-endpoints-using-webhooks/","icon":"🧰","label":"Learn to Build Powerful API Endpoints Using Webhooks"},{"url":"https://n8n.io/blog/how-a-membership-development-manager-automates-his-work-and-investments/","icon":"📈","label":"How a Membership Development Manager automates his work and investments"},{"url":"https://n8n.io/blog/a-low-code-bitcoin-ticker-built-with-questdb-and-n8n-io/","icon":"📈","label":"A low-code bitcoin ticker built with QuestDB and n8n.io"},{"url":"https://n8n.io/blog/how-to-set-up-a-ci-cd-pipeline-with-no-code/","icon":"🎡","label":"How to set up a no-code CI/CD pipeline with GitHub and TravisCI"},{"url":"https://n8n.io/blog/benefits-of-automation-and-n8n-an-interview-with-hubspots-hugh-durkin/","icon":"🎖","label":"Benefits of automation and n8n: An interview with HubSpot's Hugh 
Durkin"},{"url":"https://n8n.io/blog/how-goomer-automated-their-operations-with-over-200-n8n-workflows/","icon":"🛵","label":"How Goomer automated their operations with over 200 n8n workflows"},{"url":"https://n8n.io/blog/aws-workflow-automation/","label":"7 no-code workflow automations for Amazon Web Services"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.set/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Data Transformation"]}}},"group":"[\"input\"]","defaults":{"name":"Edit Fields"},"iconData":{"icon":"pen","type":"icon"},"displayName":"Edit Fields (Set)","typeVersion":3,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":565,"icon":"fa:sticky-note","name":"n8n-nodes-base.stickyNote","codex":{"data":{"alias":["Comments","Notes","Sticky"],"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Helpers"]}}},"group":"[\"input\"]","defaults":{"name":"Sticky Note","color":"#FFD233"},"iconData":{"icon":"sticky-note","type":"icon"},"displayName":"Sticky Note","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":838,"icon":"fa:mouse-pointer","name":"n8n-nodes-base.manualTrigger","codex":{"data":{"resources":{"generic":[],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.manualworkflowtrigger/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0"}},"group":"[\"trigger\"]","defaults":{"name":"When clicking ‘Execute workflow’","color":"#909298"},"iconData":{"icon":"mouse-pointer","type":"icon"},"displayName":"Manual Trigger","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":1119,"icon":"fa:robot","name":"@n8n/n8n-nodes-langchain.agent","codex":{"data":{"alias":["LangChain","Chat","Conversational","Plan and 
Execute","ReAct","Tools"],"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/root-nodes/n8n-nodes-langchain.agent/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Agents","Root Nodes"]}}},"group":"[\"transform\"]","defaults":{"name":"AI Agent","color":"#404040"},"iconData":{"icon":"robot","type":"icon"},"displayName":"AI Agent","typeVersion":3,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1145,"icon":"file:anthropic.svg","name":"@n8n/n8n-nodes-langchain.lmChatAnthropic","codex":{"data":{"alias":["claude","sonnet","opus"],"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/cluster-nodes/sub-nodes/n8n-nodes-langchain.lmchatanthropic/"}]},"categories":["AI","Langchain"],"subcategories":{"AI":["Language Models","Root Nodes"],"Language Models":["Chat Models (Recommended)"]}}},"group":"[\"transform\"]","defaults":{"name":"Anthropic Chat Model"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSI0NiIgaGVpZ2h0PSIzMiIgZmlsbD0ibm9uZSI+PHBhdGggZmlsbD0iIzdEN0Q4NyIgZD0iTTMyLjczIDBoLTYuOTQ1TDM4LjQ1IDMyaDYuOTQ1ek0xMi42NjUgMCAwIDMyaDcuMDgybDIuNTktNi43MmgxMy4yNWwyLjU5IDYuNzJoNy4wODJMMTkuOTI5IDB6bS0uNzAyIDE5LjMzNyA0LjMzNC0xMS4yNDYgNC4zMzQgMTEuMjQ2eiIvPjwvc3ZnPg=="},"displayName":"Anthropic Chat Model","typeVersion":1,"nodeCategories":[{"id":25,"name":"AI"},{"id":26,"name":"Langchain"}]},{"id":1233,"icon":"file:readWriteFile.svg","name":"n8n-nodes-base.readWriteFile","codex":{"data":{"alias":["Binary","Binary File","File","Text","Open","Import","Save","Export","Disk","Transfer","Read Binary File","Write Binary File"],"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.readwritefile/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core 
Nodes":["Files"]}}},"group":"[\"input\"]","defaults":{"name":"Read/Write Files from Disk"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNTEyIiBoZWlnaHQ9IjUxMiIgdmlld0JveD0iMCAwIDUxMiA1MTIiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CjxnIGNsaXAtcGF0aD0idXJsKCNjbGlwMF8xMTQxXzE1NDcpIj4KPHBhdGggZD0iTTAgMTJDMCA1LjM3MjU4IDUuMzcyNTggMCAxMiAwSDE1OVYxNTRDMTU5IDE2MC42MjcgMTY0LjM3MyAxNjYgMTcxIDE2NkgzMjVWMjQySDIyOC41NjJDMjEwLjg5NSAyNDIgMTk0LjY1NiAyNTEuNzA1IDE4Ni4yODggMjY3LjI2NEwxMjkuMjAzIDM3My40MDdDMTI1LjEzMSAzODAuOTc4IDEyMyAzODkuNDQgMTIzIDM5OC4wMzdWNDM0SDEyQzUuMzcyNTcgNDM0IDAgNDI4LjYyNyAwIDQyMlYxMloiIGZpbGw9IiM0NEFBNDQiLz4KPHBhdGggZD0iTTMyNSAxMzRWMTI3LjQwMUMzMjUgMTI0LjIyMyAzMjMuNzQgMTIxLjE3NSAzMjEuNDk1IDExOC45MjVMMjA2LjM2OSAzLjUyNDgxQzIwNC4xMTggMS4yNjgyIDIwMS4wNjEgMCAxOTcuODczIDBIMTkxVjEzNEgzMjVaIiBmaWxsPSIjNDRBQTQ0Ii8+CjxwYXRoIGZpbGwtcnVsZT0iZXZlbm9kZCIgY2xpcC1ydWxlPSJldmVub2RkIiBkPSJNMjI4LjU2MyAyNzRDMjIyLjY3NCAyNzQgMjE3LjI2MSAyNzcuMjM1IDIxNC40NzIgMjgyLjQyMUwxNzIuMjExIDM2MUg0OTIuNjRMNDQ0LjY3IDI4MS43MTdDNDQxLjc3MiAyNzYuOTI3IDQzNi41OCAyNzQgNDMwLjk4MSAyNzRIMjI4LjU2M1oiIGZpbGw9IiM0NEFBNDQiLz4KPHBhdGggZmlsbC1ydWxlPSJldmVub2RkIiBjbGlwLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik0xNTUgNDA5QzE1NSA0MDAuMTYzIDE2Mi4xNjMgMzkzIDE3MSAzOTNINDk2QzUwNC44MzcgMzkzIDUxMiA0MDAuMTYzIDUxMiA0MDlWNDk2QzUxMiA1MDQuODM3IDUwNC44MzcgNTEyIDQ5NiA1MTJIMTcxQzE2Mi4xNjMgNTEyIDE1NSA1MDQuODM3IDE1NSA0OTZWNDA5Wk0zOTcgNDUzQzM5NyA0NjYuMjU1IDM4Ni4yNTUgNDc3IDM3MyA0NzdDMzU5Ljc0NSA0NzcgMzQ5IDQ2Ni4yNTUgMzQ5IDQ1M0MzNDkgNDM5Ljc0NSAzNTkuNzQ1IDQyOSAzNzMgNDI5QzM4Ni4yNTUgNDI5IDM5NyA0MzkuNzQ1IDM5NyA0NTNaTTQ0NSA0NzdDNDU4LjI1NSA0NzcgNDY5IDQ2Ni4yNTUgNDY5IDQ1M0M0NjkgNDM5Ljc0NSA0NTguMjU1IDQyOSA0NDUgNDI5QzQzMS43NDUgNDI5IDQyMSA0MzkuNzQ1IDQyMSA0NTNDNDIxIDQ2Ni4yNTUgNDMxLjc0NSA0NzcgNDQ1IDQ3N1oiIGZpbGw9IiM0NEFBNDQiLz4KPC9nPgo8ZGVmcz4KPGNsaXBQYXRoIGlkPSJjbGlwMF8xMTQxXzE1NDciPgo8cmVjdCB3aWR0aD0iNTEyIiBoZWlnaHQ9IjUxMiIgZmlsbD0id2hpdGUiLz4KPC9jbGlwUGF0aD4KPC9kZWZzPgo8L3N2Zz4K"},"displayName":"Read/Write 
Files from Disk","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":1235,"icon":"file:extractFromFile.svg","name":"n8n-nodes-base.extractFromFile","codex":{"data":{"alias":["CSV","Spreadsheet","Excel","xls","xlsx","ods","tabular","decode","decoding","Move Binary Data","Binary","File","PDF","JSON","HTML","ICS","iCal","txt","Text","RTF","XML","64","Base64","Convert"],"resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.extractfromfile/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Files","Data Transformation"]}}},"group":"[\"input\"]","defaults":{"name":"Extract from File"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNDAiIGhlaWdodD0iNDAiIHZpZXdCb3g9IjAgMCA0MCA0MCIgZmlsbD0ibm9uZSIgeG1sbnM9Imh0dHA6Ly93d3cudzMub3JnLzIwMDAvc3ZnIj4KPHBhdGggZD0iTTAuOTM3NSAyQzAuNDE5NzMzIDIgMCAyLjQxOTczIDAgMi45Mzc1VjM3LjMyMjFDMCAzNy44Mzk5IDAuNDE5NzMzIDM4LjI1OTYgMC45Mzc1IDM4LjI1OTZIMjYuMjE1NEMyNi43MzMyIDM4LjI1OTYgMjcuMTUyOSAzNy44Mzk5IDI3LjE1MjkgMzcuMzIyMUwyNy4xNTI5IDMwLjY3MTlMMTYuNzk2OSAzMC42NzE5QzE0Ljg5ODQgMzAuNjcxOSAxMy4zNTk0IDI5LjEzMjkgMTMuMzU5NCAyNy4yMzQ0VjI1LjM1OTRDMTMuMzU5NCAyMy40NjA5IDE0Ljg5ODQgMjEuOTIxOSAxNi43OTY5IDIxLjkyMTlIMjcuMTUyOUwyNy4xNTI5IDE1Ljc4MjFIMTQuMzA4M0MxMy43OTA2IDE1Ljc4MjEgMTMuMzcwOCAxNS4zNjI0IDEzLjM3MDggMTQuODQ0NlYySDAuOTM3NVoiIGZpbGw9IiMzNTNGNkUiLz4KPHBhdGggZD0iTTE2LjAyNzEgMkMxNS45NDA4IDIgMTUuODcwOCAyLjA2OTk2IDE1Ljg3MDggMi4xNTYyNVYxMi44MTM0QzE1Ljg3MDggMTMuMDcyMyAxNi4wODA3IDEzLjI4MjEgMTYuMzM5NiAxMy4yODIxSDI2Ljk5NjdDMjcuMDgzIDEzLjI4MjEgMjcuMTUyOSAxMy4yMTIyIDI3LjE1MjkgMTMuMTI1OUwyNy4xNTI5IDEyLjYxNzFDMjcuMTUyOSAxMi4zNjg4IDI3LjA1NDUgMTIuMTMwNyAyNi44NzkxIDExLjk1NUwxNy4yMjI1IDIuMjc1MzhDMTcuMDQ2NiAyLjA5OTA4IDE2LjgwNzkgMiAxNi41NTg4IDJIMTYuMDI3MVoiIGZpbGw9IiMzNTNGNkUiLz4KPHBhdGggZD0iTTI5Ljc2NDIgMzQuNjUwM0MyOS4wMzQgMzMuOTE2IDI5LjAzNzQgMzIuNzI4OCAyOS43NzE2IDMxLjk5ODZMMzMuNjE5NyAyOC4xNzE5TDE2Ljc5NjkgMjguMTcxO
UMxNi4yNzkxIDI4LjE3MTkgMTUuODU5NCAyNy43NTIxIDE1Ljg1OTQgMjcuMjM0NFYyNS4zNTk0QzE1Ljg1OTQgMjQuODQxNiAxNi4yNzkxIDI0LjQyMTkgMTYuNzk2OSAyNC40MjE5TDMzLjU0MTIgMjQuNDIxOUwyOS43NzE2IDIwLjY3MzNDMjkuMDM3NCAxOS45NDMxIDI5LjAzNCAxOC43NTU5IDI5Ljc2NDIgMTguMDIxNkMzMC40OTQ0IDE3LjI4NzQgMzEuNjgxNiAxNy4yODQgMzIuNDE1OSAxOC4wMTQyTDM5LjQ0NzEgMjUuMDA2NEMzOS44MDEgMjUuMzU4MyA0MCAyNS44MzY4IDQwIDI2LjMzNTlDNDAgMjYuODM1IDM5LjgwMSAyNy4zMTM1IDM5LjQ0NzEgMjcuNjY1NUwzMi40MTU5IDM0LjY1NzZDMzEuNjgxNiAzNS4zODc4IDMwLjQ5NDQgMzUuMzg0NSAyOS43NjQyIDM0LjY1MDNaIiBmaWxsPSIjMzUzRjZFIi8+Cjwvc3ZnPgo="},"displayName":"Extract from File","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]}],"categories":[{"id":31,"name":"Content Creation"},{"id":51,"name":"Multimodal AI"}],"image":[]}}