Workflows by Robert Breen

Free · Advanced

Add Project Tasks to Google Sheets with GPT-4.1-mini Chat Assistant

Let your team create, track, and manage project tasks through natural conversation. This workflow uses an **AI Project Manager Agent** that chats with users, gathers the task details it needs, and automatically adds them to a **Google Sheet**.

---

## ✅ What this template does

- Lets you **chat naturally** with an AI to add new project tasks
- Automatically detects if the user wants to *create* or *update* an item (updates coming soon)
- Collects `Task`, `Description`, and `Status` fields — allows “don’t know” responses
- Appends new entries directly into your connected **Google Sheets**
- Provides real-time confirmation when the task is added

> **Trigger:** n8n Chat Trigger
> **Integrations:** OpenAI GPT-4.1-mini + Google Sheets (OAuth2)

---

## 🧠 How it works

1. The **Chat Trigger** starts a chat with the user.
2. The **AI Project Manager Agent** asks guiding questions to gather the task name, description, and status.
3. When all fields are complete (`all Info = Yes`), the data is passed to the **Google Sheets node**.
4. The task is automatically added to your project tracker sheet.
5. The AI confirms completion in chat.

---

## ⚙️ Setup instructions

### 1. Connect OpenAI

1. Go to [OpenAI Platform](https://platform.openai.com/api-keys) → copy your API key.
2. In n8n, create **New Credentials → OpenAI API** and paste your key.
3. Ensure your account has active billing under [OpenAI Billing](https://platform.openai.com/settings/organization/billing/overview).

### 2. Connect Google Sheets (OAuth2)

1. In **n8n → Credentials → New → Google Sheets (OAuth2)**.
2. Sign in with your Google account and **grant access**.
3. Select your spreadsheet and tab (e.g., “Tasks”) when prompted.
   - Example sheet: `https://docs.google.com/spreadsheets/d/1pbK-B-Q9p8fVjxJIsjEVrAfRgqEPCeYw8rZojZPAb84/edit`

### 3. Test your chat

Click **Execute Workflow**, then start chatting:

> “Add a task for reviewing the project report tomorrow.”

The agent will ask questions if needed, then add the record to your sheet.

---

## 🧩 Customization ideas

- Add a **Date Added** or **Assigned To** column to the Google Sheet
- Integrate with **Slack or Outlook** to message assigned users
- Extend the agent to support task *updates* and *deletes*
- Replace Google Sheets with **Airtable** or **Notion** if preferred

---

## 🪄 Requirements

- n8n version ≥ 1.100
- OpenAI API key
- Google Sheets account

---

## 📬 Contact

Need help customizing this (e.g., adding deadlines, linking to Notion, or Slack notifications)?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
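The agent's completeness gate (`all Info = Yes`) can be sketched in plain Python. The field names (`Task`, `Description`, `Status`) come from this template; the helper functions themselves are illustrative, not part of the workflow JSON:

```python
# Illustrative sketch of the "all Info = Yes" gate described above.
# Field names match the template; the functions are hypothetical.

REQUIRED_FIELDS = ("Task", "Description", "Status")

def all_info_collected(task: dict) -> bool:
    """A field counts as collected if it is non-empty; the template
    explicitly allows "don't know" as a valid answer."""
    return all(str(task.get(f, "")).strip() for f in REQUIRED_FIELDS)

def row_for_sheet(task: dict) -> list:
    # Order matches the sheet columns: Task | Description | Status
    return [task["Task"], task["Description"], task["Status"]]

task = {"Task": "Review report", "Description": "don't know", "Status": "Open"}
if all_info_collected(task):
    print(row_for_sheet(task))  # this row would be appended to the sheet
```

Only once all three answers are present does the item move on to the Google Sheets append, which is why the agent keeps asking guiding questions until then.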

Robert Breen · Project Management · 28 Oct 2025
Free · Advanced

Course recommendation system for surveys with data tables and GPT-4.1-Mini

Use the **n8n Data Tables** feature to store, retrieve, and analyze survey results — then let OpenAI automatically recommend the most relevant course for each respondent.

---

## 🧠 What this workflow does

This workflow demonstrates how to use **n8n’s built-in Data Tables** to create an internal recommendation system powered by AI. It:

- Collects **survey responses** through a Form Trigger
- Saves responses to a **Data Table** called `Survey Responses`
- Fetches a list of available courses from another **Data Table** called `Courses`
- Passes both Data Tables into an **OpenAI Chat Agent**, which selects the most relevant course
- Returns a structured recommendation with:
  - `course`: the course title
  - `reasoning`: why it was selected

> **Trigger:** Form submission (manual or public link)

---

## 👥 Who it’s for

Perfect for educators, training managers, or anyone wanting to use **n8n Data Tables** as a lightweight internal database — ideal for AI-driven recommendations, onboarding workflows, or content personalization.

---

## ⚙️ How to set it up

### 1️⃣ Create your **n8n Data Tables**

This workflow uses **two Data Tables** — both created directly inside n8n.

#### 🧾 Table 1: `Survey Responses`

Columns:

- `Name`
- `Q1` — Where did you learn about n8n?
- `Q2` — What is your experience with n8n?
- `Q3` — What kind of automations do you need help with?

To create:

1. Add a **Data Table node** to your workflow.
2. From the list, click **“Create New Data Table.”**
3. Name it **Survey Responses** and add the columns above.

---

#### 📚 Table 2: `Courses`

Columns:

- `Course`
- `Description`

To create:

1. Add another **Data Table node**.
2. Click **“Create New Data Table.”**
3. Name it **Courses** and create the columns above.
4. Copy course data from this Google Sheet:
   👉 https://docs.google.com/spreadsheets/d/1Y0Q0CnqN0w47c5nCpbA1O3sn0mQaKXPhql2Bc1UeiFY/edit?usp=sharing

This **Courses Data Table** is where you’ll store all available learning paths or programs for the AI to compare against survey inputs.

---

### 2️⃣ Connect OpenAI

1. Go to [OpenAI Platform](https://platform.openai.com/api-keys)
2. Create an API key
3. In n8n, open **Credentials → OpenAI API** and paste your key
4. The workflow uses the **gpt-4.1-mini** model via the LangChain integration

---

## 🧩 Key Nodes Used

| Node | Purpose | n8n Feature |
|------|----------|-------------|
| **Form Trigger** | Collects survey responses | Forms |
| **Data Table (Upsert)** | Stores results in `Survey Responses` | **Data Tables** |
| **Data Table (Get)** | Retrieves `Courses` | **Data Tables** |
| **Aggregate + Set** | Combines and formats table data | Core nodes |
| **OpenAI Chat Model (LangChain Agent)** | Analyzes responses and courses | AI |
| **Structured Output Parser** | Returns structured JSON output | LangChain |

---

## 💡 Tips for customization

- Add more **Data Table columns** (e.g., email, department, experience years)
- Use another **Data Table** to store AI recommendations or performance results
- Modify the **Agent system message** to customize how the AI chooses courses
- Send recommendations via **Email**, **Slack**, or **Google Sheets**

---

## 🧾 Why Data Tables?

This workflow shows how **n8n’s Data Tables** can act as your internal database:

- Create and manage tables directly inside n8n
- No external integrations needed
- Store structured data for AI prompts
- Share tables across multiple workflows

All user data and course content are stored securely and natively in **n8n Cloud or Self-Hosted** environments.

---

## 📬 Contact

Need help customizing this (e.g., expanding Data Tables, connecting multiple surveys, or automating follow-ups)?
- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
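The Aggregate + Set step in this template combines the two Data Tables into a single prompt for the agent. A plain-Python sketch of that formatting (table and column names come from the template; the function and wording are illustrative):

```python
# Illustrative: mimics the Aggregate + Set step that turns the two Data
# Tables into one prompt for the LangChain agent. Not n8n code.

def build_prompt(responses: list[dict], courses: list[dict]) -> str:
    survey = "\n".join(
        f"- {r['Name']}: Q1={r['Q1']}, Q2={r['Q2']}, Q3={r['Q3']}"
        for r in responses
    )
    catalog = "\n".join(f"- {c['Course']}: {c['Description']}" for c in courses)
    return (
        "Survey responses:\n" + survey +
        "\n\nAvailable courses:\n" + catalog +
        "\n\nReturn JSON with keys 'course' and 'reasoning'."
    )

prompt = build_prompt(
    [{"Name": "Ana", "Q1": "YouTube", "Q2": "Beginner", "Q3": "Email automation"}],
    [{"Course": "n8n Basics", "Description": "Intro to nodes and triggers"}],
)
```

The closing instruction mirrors the Structured Output Parser contract (`course` + `reasoning`), which keeps the agent's answer machine-readable.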

Robert Breen · Lead Generation · 10 Oct 2025
Free · Intermediate

Enrich Google Sheets with Dun & Bradstreet data blocks

Automate company enrichment directly in **Google Sheets** using **Dun & Bradstreet (D&B) Data Blocks**. This workflow reads DUNS numbers from a sheet, fetches a **Bearer token** (via Basic Auth → `/v3/token`), calls the **Data Blocks API** for each row (`/v1/data/duns/...`), extracts **Paydex**, and **appends or updates** the sheet. A **Filter** node skips rows already marked `Complete` for efficient, idempotent runs.

---

## ✅ What this template does

- Pulls **DUNS** values from a Google Sheet
- (Option A) Uses an **HTTP Header Auth** credential for D&B, or (Option B) dynamically fetches a **Bearer token** from `/v3/token` (Basic Auth)
- Calls **D&B Data Blocks** per row to retrieve payment insights
- Extracts **Paydex** and **upserts** results back to the sheet
- Skips rows already **Complete**

---

## 👤 Who's it for

- RevOps/Data teams enriching company lists at scale
- SDR/Marketing teams validating firmographic/credit signals
- BI/Automation builders who want a **no-code/low-code** enrichment loop

---

## 🧩 How it works (node-by-node)

1. **Get Companies (Google Sheets)** – Reads rows with at least `duns`, `paydex`, `Complete`.
2. **Only New Rows (Filter)** – Passes only rows where `Complete` is empty.
3. **D&B Info (HTTP Request)** – Calls Data Blocks for each DUNS using a header credential (`Authorization: Bearer <token>`).
4. **Keep Score (Set)** – Maps nested JSON to a single **Paydex** field:
   `{{$json.organization.businessTrading[0].summary[0].paydexScoreHistory[0].paydexScore}}`
5. **Append to g-sheets (Google Sheets)** – **Append or Update** by `duns`, writing `paydex` and setting `Complete = Yes`.

> The workflow also includes Sticky Notes with in-canvas setup help.

---

## 🛠️ Setup instructions (from the JSON)

### 1) Connect Google Sheets (OAuth2)

1. In **n8n → Credentials → New → Google Sheets (OAuth2)**, sign in.
2. Use/prepare a sheet with columns like **duns**, **paydex**, **Complete**.
3. In your **Google Sheets** nodes, select your credential and target spreadsheet/tab.
4. For upsert behavior, set **Operation** to **Append or Update** and **Matching column** to `duns`.

> Replace any example Sheet IDs/URLs with **your own** (avoid publishing private IDs).

---

### 2) Get a D&B Bearer Token (Basic Auth → `/v3/token`) — Optional Dynamic Token Node

1. Add/enable the **HTTP Request** node named `Get Bearer Token1`.
2. Configure:
   - **Authentication:** Basic Auth (your D&B **username**/**password**)
   - **Method:** `POST`
   - **URL:** `https://plus.dnb.com/v3/token`
   - **Body Parameters:** `grant_type = client_credentials`
   - **Headers:** `Accept = application/json`
3. Execute to receive `access_token`.
4. Reference the token in other nodes via:
   `Authorization: Bearer {{$node["Get Bearer Token1"].json["access_token"]}}`

> ⚠️ **Security:** Don't hardcode tokens. Prefer credentials or fetch dynamically.

---

### 3) Call D&B Data Blocks (use Header Auth or dynamic token)

**Node:** `D&B Info` (HTTP Request)

- **Authentication:** Header Auth (recommended)
- **URL:**
  ```
  https://plus.dnb.com/v1/data/duns/{{ $json.duns }}?blockIDs=paymentinsight_L4_v1&tradeUp=hq&customerReference=customer%20reference%20text&orderReason=6332
  ```
- **Headers:**
  - `Accept = application/json`
  - If not using a stored Header Auth credential, set:
    `Authorization = Bearer {{$node["Get Bearer Token1"].json["access_token"]}}`

> `{{ $json.duns }}` is resolved from the current row provided by **Get Companies**.
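The nested Paydex path used in the **Keep Score** step fails when the payment-insight block is absent from a response. A guarded version of the same lookup, sketched in plain Python outside n8n (the path mirrors the template's expression; the helper itself is illustrative):

```python
# Guarded version of the Keep Score mapping. The path mirrors the n8n
# expression organization.businessTrading[0].summary[0]
# .paydexScoreHistory[0].paydexScore; the function is illustrative.

def get_paydex(response: dict):
    try:
        return (response["organization"]["businessTrading"][0]
                ["summary"][0]["paydexScoreHistory"][0]["paydexScore"])
    except (KeyError, IndexError, TypeError):
        return None  # block missing for this subscription/DUNS

sample = {"organization": {"businessTrading": [
    {"summary": [{"paydexScoreHistory": [{"paydexScore": 80}]}]}]}}
print(get_paydex(sample))   # 80
print(get_paydex({}))       # None
```

Inside n8n the equivalent safeguard is an IF node (or optional chaining in the expression) before the Set node, as suggested in the troubleshooting section.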
---

### 4) Map Paydex and Upsert to Google Sheets

- **Keep Score (Set)**
  - Field **`Paydex`** (Number):
    ```
    {{$json.organization.businessTrading[0].summary[0].paydexScoreHistory[0].paydexScore}}
    ```
- **Append to g-sheets (Google Sheets)**
  - **Operation:** `Append or Update`
  - **Matching column:** `duns`
  - **Columns mapping:**
    - `duns = {{ $('Get Companies').item.json.duns }}`
    - `paydex = {{ $json.Paydex }}`
    - `Complete = Yes`

---

## 🧪 Test checklist

- Add a few test DUNS rows (leave `Complete` blank).
- Run the workflow and confirm **Only New Rows** passes expected items.
- Check **D&B Info** returns payment insight data.
- Confirm **Paydex** is set and the row is updated with `Complete = Yes`.

---

## 🔐 Security & best practices

- Store secrets in **Credentials** (HTTP Header Auth/Basic Auth).
- Avoid publishing real Sheet IDs or tokens in screenshots/notes.
- Consider rate limits and backoff for large sheets.
- Log/handle API errors (e.g., invalid DUNS or expired tokens).

---

## 🩹 Troubleshooting

- **401/403 from D&B:** Verify credentials/token; ensure the correct environment and entitlements.
- **Missing Paydex path:** D&B responses vary by subscription/data availability — add guards (IF node) before mapping.
- **Rows not updating:** Confirm **Append or Update** is used and **Matching column** exactly matches your sheet header `duns`.
- **Filtered-out rows:** Ensure `Complete` is truly empty (no spaces) for new items.

---

## 🧯 Customize further

- Enrich additional fields (e.g., **viability score**, **portfolio comparison**, **credit limits**).
- Add retry logic, batching, or scheduled triggers.
- Push results to a **CRM/DB** or notify teams via **Slack/Email**.

---

## 📬 Contact

Need help customizing this (e.g., enriching more fields, normalizing responses, or bulk-processing large sheets)?

- 📧 **[email protected]**
- 🔗 **https://www.linkedin.com/in/robert-breen-29429625/**
- 🌐 **https://ynteractive.com**

Robert Breen · Lead Generation · 23 Sep 2025
Free · Intermediate

Extract structured data from D&B company reports with GPT-4o

Pull a **Dun & Bradstreet Business Information Report** (PDF) by DUNS, convert the response into a binary **PDF file**, extract readable text, and use **OpenAI** to return a **clean, flat JSON** with only the key fields you care about (e.g., report date, Paydex, viability score, credit limit). Includes Sticky Notes for quick setup help and guidance.

---

## ✅ What this template does

- **Requests a D&B report** (PDF) for a specific **DUNS** via HTTP
- **Converts** the API response into a **binary PDF file**
- **Extracts** the text from the PDF for analysis
- Uses **OpenAI** with a **Structured Output Parser** to return a **flat JSON**
- Designed to be extended to Sheets, databases, or CRMs

---

## 🧩 How it works (node-by-node)

1. **Manual Trigger** — Runs the workflow on demand ("When clicking 'Execute workflow'").
2. **D&B Report (HTTP Request)** — Calls the D&B Reports API for a Business Information Report (PDF).
3. **Convert to PDF File (Convert to File)** — Turns the D&B response payload into a binary **PDF**.
4. **Extract Binary (Extract from File)** — Extracts text content from the PDF.
5. **OpenAI Chat Model** — Provides the language-model context for the analyzer.
6. **Analyze PDF (AI Agent)** — Reads the extracted text and applies strict rules for a flat JSON output.
7. **Structured Output (AI Structured Output Parser)** — Enforces a schema and validates/auto-fixes the JSON shape.
8. *(Optional)* **Get Bearer Token (HTTP Request)** — Template guidance for OAuth token retrieval (shown as disabled; included for reference if you prefer Bearer flows).
---

## 🛠️ Setup instructions (from the JSON)

### 1) D&B Report (HTTP Request)

- **Auth:** Header Auth (use an n8n **HTTP Header Auth** credential)
- **URL:** https://plus.dnb.com/v1/reports/duns/804735132?productId=birstd&inLanguage=en-US&reportFormat=PDF&orderReason=6332&tradeUp=hq&customerReference=customer%20reference%20text
- **Headers:**
  - `Accept: application/json`
- **Credential Example:** `D&B` (HTTP Header Auth)

> Put your `Authorization: Bearer <token>` header inside this credential, not directly in the node.

### 2) Convert to PDF File (Convert to File)

- **Operation:** `toBinary`
- **Source Property:** `contents[0].contentObject`

> This takes the PDF content from the D&B API response and converts it to a binary file for downstream nodes.

### 3) Extract Binary (Extract from File)

- **Operation:** `pdf`

> Produces a `text` field with the extracted PDF content, ready for AI analysis.

### 4) OpenAI Model(s)

- **OpenAI Chat Model**
  - **Model:** `gpt-4o` (as configured in the JSON)
  - **Credential:** Your stored **OpenAI API** credential (do **not** hardcode keys)
- **Wiring:**
  - Connect **OpenAI Chat Model** as `ai_languageModel` to **Analyze PDF**
  - Connect another **OpenAI Chat Model** (also `gpt-4o`) as `ai_languageModel` to **Structured Output**

### 5) Analyze PDF (AI Agent)

- **Prompt Type:** `define`
- **Text:** `={{ $json.text }}`
- **System Message (rules):**
  - You are a precision extractor. Read the provided business report PDF and return only a single flat JSON object with the fields below.
  - No arrays/lists.
  - No prose.
  - If a value is missing, output null.
  - Dates: `YYYY-MM-DD`.
  - Numbers: plain numerics (no commas or $).
  - Prefer the most recent or highest-level overall values if multiple are shown.
  - Never include arrays, nested structures, or text outside of the JSON object.
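Before writing downstream, it can be worth checking that the agent's output really is flat and schema-shaped. A sketch of such a check in plain Python (the field list comes from the template's schema; the validator itself is illustrative, an extra safety net on top of the parser's auto-fix):

```python
# Illustrative validator for the flat-JSON contract the agent is asked to
# follow: a top-level object only, no nested structures, nulls allowed.

EXPECTED_KEYS = {
    "report_date", "company_name", "duns", "dnb_rating_overall",
    "composite_credit_appraisal", "viability_score",
    "portfolio_comparison_score", "paydex_3mo", "paydex_24mo",
    "credit_limit_conservative",
}

def is_flat_report(obj) -> bool:
    if not isinstance(obj, dict) or set(obj) != EXPECTED_KEYS:
        return False
    # "flat" means no arrays or nested objects anywhere in the values
    return all(not isinstance(v, (dict, list)) for v in obj.values())

good = {k: None for k in EXPECTED_KEYS} | {"duns": "123456789"}
bad = {k: None for k in EXPECTED_KEYS} | {"paydex_3mo": [80]}
```

Inside n8n, the same idea maps onto Set/IF nodes between the Structured Output parser and whatever sink you add.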
### 6) Structured Output (AI Structured Output Parser)

- **JSON Schema Example:**

```json
{
  "report_date": "",
  "company_name": "",
  "duns": "",
  "dnb_rating_overall": "",
  "composite_credit_appraisal": "",
  "viability_score": "",
  "portfolio_comparison_score": "",
  "paydex_3mo": "",
  "paydex_24mo": "",
  "credit_limit_conservative": ""
}
```

- **Auto Fix:** enabled
- **Wiring:** Connect as `ai_outputParser` to **Analyze PDF**

### 7) (Optional) Get Bearer Token (HTTP Request) — Disabled example

If you prefer fetching tokens dynamically:

- **Auth:** Basic Auth (D&B username/password)
- **Method:** POST
- **URL:** https://plus.dnb.com/v3/token
- **Body Parameters:**
  - `grant_type = client_credentials`
- **Headers:**
  - `Accept: application/json`
- **Downstream usage:** Set header `Authorization: Bearer {{$json["access_token"]}}` in subsequent calls.

> In this template, the D&B Report node uses a Header Auth credential instead. Use one strategy consistently (credentials are recommended for security).

---

## 🧠 Output schema (flat JSON)

The analyzer + parser return a single flat object like:

```json
{
  "report_date": "2024-12-31",
  "company_name": "Example Corp",
  "duns": "123456789",
  "dnb_rating_overall": "5A2",
  "composite_credit_appraisal": "Fair",
  "viability_score": "3",
  "portfolio_comparison_score": "2",
  "paydex_3mo": "80",
  "paydex_24mo": "78",
  "credit_limit_conservative": "25000"
}
```

---

## 🧪 Test flow

1. Click **Execute workflow** (Manual Trigger).
2. Confirm **D&B Report** returns the PDF response.
3. Check **Convert to PDF File** for a binary file.
4. Verify **Extract from File** produces a `text` field.
5. Inspect **Analyze PDF** → **Structured Output** for valid JSON.

---

## 🔐 Security notes

- Do not hardcode tokens in nodes; use **Credentials** (HTTP Header Auth or Basic Auth).
- Restrict who can execute the workflow if it's accessible from outside your network.
- Avoid storing sensitive payloads in logs; mask tokens/headers.
---

## 🧩 Customize

- Map the structured JSON to **Google Sheets**, **Postgres/BigQuery**, or a **CRM**.
- Extend the schema with additional fields (e.g., number of employees, HQ address) — keep it **flat**.
- Add validation (**Set/IF** nodes) to ensure required fields exist before writing downstream.

---

## 🩹 Troubleshooting

- **Missing PDF text?** Ensure the **Convert to File** source property is `contents[0].contentObject`.
- **Unauthorized from D&B?** Refresh/verify the token; confirm the Header Auth credential contains `Authorization: Bearer <token>`.
- **Parser errors?** Keep the agent output short and flat; the Structured Output node will auto-fix minor issues.
- **Different DUNS/product?** Update the D&B Report URL query params (`duns`, `productId`, etc.).

---

## 🗒️ Sticky Notes (included)

- **Overview:** "Fetch D&B Company Report (PDF) → Convert → Extract → Summarize to Structured JSON (n8n)"
- Setup snippets for Data Blocks (optional) and the Auth flow

---

## 📬 Contact

Need help customizing this (e.g., routing the PDF to Drive, mapping JSON to your CRM, or expanding the schema)?

- 📧 [email protected]
- 🔗 https://www.linkedin.com/in/robert-breen-29429625/
- 🌐 https://ynteractive.com

Robert Breen · Document Extraction · 23 Sep 2025
Free · Intermediate

Analyze images with OpenAI Vision while preserving binary data for reuse

Use this template to **upload an image**, run a first-pass **OpenAI Vision analysis**, then **re-attach the original file (binary/base64)** to the next step using a **Merge** node. The pattern ensures your downstream AI Agent (or any node) can access **both** the original file (`data`) **and** the first analysis result (`content`) at the same time.

---

### ✅ What this template does

- **Collects an image file** via **Form Trigger** (binary field labeled `data`)
- **Analyzes the image** with **OpenAI Vision** (GPT-4o) using **base64** input
- **Merges** the original upload and the analysis result (**combine by position**) so the next node has **both**
- **Re-analyzes/uses** the image alongside the first analysis in an **AI Agent** step

---

### 🧩 How it works (node-by-node)

1. **Form Trigger**
   - Presents a simple upload form and emits a binary/base64 field named **`data`**.
2. **Analyze image (OpenAI Vision)**
   - Reads the same **`data`** field as **base64** and runs image analysis with **GPT-4o**.
   - The node outputs a text field **`content`** (first-pass analysis).
3. **Merge (combine by position)**
   - Combines the two branches so the next node receives **both** the **original upload** (`data`) and the **analysis** (`content`) on the **same item**.
4. **AI Agent**
   - Receives `data` + `content` together.
   - The prompt includes the original image (**`=data`**) and the first analysis (**`{{$json.content}}`**) to compare or refine results.
5. **OpenAI Chat Model**
   - Provides the language model for the Agent (wired as **ai_languageModel**).

---

### 🛠️ Setup Instructions (from the JSON)

> Keep it simple: mirror these settings and you’re good to go.

**1) Form Trigger (n8n-nodes-base.formTrigger)**

- **Path:** `d6f874ec-6cb3-46c7-8507-bd647c2484f0` *(you can change this)*
- **Form Title:** `Image Document Upload`
- **Form Description:** `Upload a image document for AI analysis`
- **Form Fields:**
  - **Label:** `data`
  - **Type:** `file`
- **Output:** emits a binary/base64 field named **`data`**.

**2) Analyze image (@n8n/n8n-nodes-langchain.openAi)**

- **Resource:** `image`
- **Operation:** `analyze`
- **Model:** `gpt-4o`
- **Text:** `=data` *(use the uploaded file field)*
- **Input Type:** `base64`
- **Credentials:** OpenAI (use your stored **OpenAI API** credential)

**3) Merge (n8n-nodes-base.merge)**

- **Mode:** `combine`
- **Combine By:** `combineByPosition`
- Connect **Form Trigger → Merge (input 2)**
- Connect **Analyze image → Merge (input 1)**
- This ensures the original file (`data`) and the analysis (`content`) line up on the same item.

**4) AI Agent (@n8n/n8n-nodes-langchain.agent)**

- **Prompt Type:** `define`
- **Text:**
- **System Message:** `analyze the image again and see if you get the same result.`
- **Receives:** merged item containing `data` + `content`.

**5) OpenAI Chat Model (@n8n/n8n-nodes-langchain.lmChatOpenAi)**

- **Model:** `gpt-4.1-mini`
- **Wiring:** connect as **ai_languageModel** to the **AI Agent**
- **Credentials:** same OpenAI credential as above

> **Security Note:** Store API keys in **Credentials** (do **not** hardcode keys in nodes).

---

### 🧠 Why “Combine by Position” fixes the binary issue

- Some downstream nodes **lose access** to the original binary once a branch processes it.
- By **merging the original branch** (with `data`) and the **analysis branch** (with `content`) **by position**, you restore a **single item** with **both fields** — so the next step can **use the image again** while referencing earlier analysis.

---

### 🧪 Test Tips

- Upload a JPG/PNG and **execute** the workflow from the Form Trigger preview.
- Confirm the **Merge** output contains **both** `data` (binary/base64) **and** `content` (text).
- In the **AI Agent**, log or return both fields to verify availability.

---

### 🔧 Customize

- Swap GPT-4o for another **Vision-capable** model if needed.
- Extend the AI Agent to **extract structured fields** (e.g., objects detected, text, brand cues).
- Add a **Router** after Merge to branch into storage (S3, GDrive) or notifications (Slack, Email).

---

### 📝 Requirements

- n8n (cloud or self-hosted) with web UI access
- **OpenAI** credential configured (Vision support)

---

### 🩹 Troubleshooting

- **Binary missing downstream?** Ensure **Merge** receives **both** branches and is set to `combineByPosition`.
- **Wrong field name?** The **Form Trigger** upload field must be labeled **`data`** to match node expressions.
- **Model errors?** Verify your **OpenAI** credential and that the chosen model supports **image analysis**.

---

### 💬 Sticky Note (included in the workflow)

> “Use Binary Field after next step” — This workflow demonstrates how to preserve and reuse an uploaded file (binary/base64) after a downstream step by using a **Merge** node (`combineByPosition`). A user uploads an image via **Form Trigger** → the image is analyzed with **OpenAI Vision** → results are merged back with the original upload so the next **AI Agent** step can access **both** the original file (`data`) and the first analysis (`content`) at the same time.

---

### 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?

- 📧 **[email protected]**
- 🔗 **https://www.linkedin.com/in/robert-breen-29429625/**
- 🌐 **https://ynteractive.com**
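The combine-by-position behavior this template relies on is essentially a positional zip of the two branches' items, with the fields of each pair merged onto one item. A plain-Python sketch of that idea (a mental model, not n8n's internal implementation):

```python
# Illustrative model of Merge → mode: combine → combineByPosition.
# Items from the two inputs are paired by index and their fields merged.

def combine_by_position(branch_a: list[dict], branch_b: list[dict]) -> list[dict]:
    return [{**a, **b} for a, b in zip(branch_a, branch_b)]

analysis_branch = [{"content": "A photo of an invoice"}]   # from OpenAI Vision
upload_branch = [{"data": "<base64 image bytes>"}]         # from Form Trigger

merged = combine_by_position(analysis_branch, upload_branch)
# merged[0] now carries both "content" and "data" for the AI Agent step
```

This is why the two wires into the Merge node matter: each input contributes one field set, and pairing by position reunites them on a single item.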

Robert Breen · Engineering · 23 Sep 2025
Free · Intermediate

Voice-driven AI assistant using VAPI and GPT-4.1-mini with memory

### Send VAPI voice requests into n8n with memory and OpenAI for conversational automation

This template shows how to capture **voice interactions from VAPI (Voice AI Platform)**, send them into n8n via a webhook, process them with OpenAI, and maintain context with memory. The result is a conversational AI agent that responds back to VAPI with short, business-focused answers.

---

## ✅ What this template does

- Listens for **POST requests** from VAPI containing the session ID and user query
- Extracts the session ID and query for consistent conversation context
- Uses **OpenAI (GPT-4.1-mini)** to generate conversational replies
- Adds a **Memory Buffer Window** so each VAPI session maintains history
- Returns results to VAPI in the correct JSON response format

---

## 👤 Who’s it for

- Developers and consultants building **voice-driven assistants**
- Businesses wanting to connect **VAPI calls into automation workflows**
- Anyone who needs a **scalable voice → AI → automation pipeline**

---

## ⚙️ How it works

1. The **Webhook node** catches incoming VAPI requests
2. A **Set node** extracts `session_id` and `user_query` from the request body
3. The **OpenAI Agent** generates short, conversational replies with your business context
4. The **Memory node** keeps conversation history across turns
5. **Respond to Webhook** sends results back to VAPI in the required JSON schema

---

## 🔧 Setup instructions

### Step 1: Create a Function Tool in VAPI

1. In your VAPI dashboard, create a new **Function Tool**
   - **Name**: `send_to_n8n`
   - **Description**: Send user query and session data to n8n workflow
   - **Parameters**:
     - `session_id` (string, required) – Unique session identifier
     - `user_query` (string, required) – The user’s question
   - **Server URL**: `https://your-n8n-instance.com/webhook/vapi-endpoint`

### Step 2: Configure the Webhook in n8n

1. Add a **Webhook node**
2. Set the HTTP method to `POST`
3. Path: `/webhook/vapi-endpoint`
4. Save, activate, and copy the webhook URL
5. Use this URL in your VAPI Function Tool configuration

### Step 3: Create a VAPI Assistant

1. In VAPI, create a new **Assistant**
2. Add the `send_to_n8n` Function Tool
3. Configure the assistant to call this tool on user requests
4. Test by making a voice query — you should see n8n respond

---

## 📦 Requirements

- An **OpenAI API key** stored in n8n credentials
- A **VAPI account** with access to Function Tools
- A self-hosted or cloud n8n instance with webhook access

---

## 🎛 Customization

- Update the **system prompt** in the OpenAI Agent node to reflect your brand’s voice
- Swap GPT-4.1-mini for another OpenAI model if you need longer or cheaper responses
- Extend the workflow by connecting to **CRMs, Slack, or databases**

---

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, connecting to CRMs, or formatting reports)?

- 📧 **[email protected]**
- 🔗 **https://www.linkedin.com/in/robert-breen-29429625/**
- 🌐 **https://ynteractive.com**
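The Set node's extraction step in this template can be sketched in plain Python. The payload keys (`session_id`, `user_query`) come from the Function Tool parameters defined above; the function itself, and the error handling, are illustrative, and the exact VAPI response envelope is left out because it depends on your VAPI configuration:

```python
# Illustrative parsing of the VAPI webhook body. Keys match the Function
# Tool parameters defined in this template; the helper is hypothetical.

def extract_vapi_fields(body: dict) -> tuple[str, str]:
    session_id = body.get("session_id")
    user_query = body.get("user_query")
    if not session_id or not user_query:
        raise ValueError("VAPI request must include session_id and user_query")
    return session_id, user_query

sid, query = extract_vapi_fields(
    {"session_id": "abc-123", "user_query": "What are your business hours?"}
)
```

Using `session_id` as the memory key is what lets the Memory Buffer Window keep each caller's conversation history separate across turns.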

Robert Breen · AI Chatbot · 23 Sep 2025
Free · Intermediate

Create Monday.com board items from Jotform submissions with field mapping

Capture new Jotform submissions and instantly create items on a Monday.com board with mapped columns (email, date, dropdowns, instructions, etc.).

---

## 🛠️ Setup — Jotform (simple)

1. **Add your Jotform API key** (Jotform Account → Settings → API → Create Key).
2. **Create your form template** in Jotform (use fields like Name, Email, Start Date, Engagement Type, Campaign Type, Instructions).
3. In **n8n**, open the **Jotform Trigger** node and **choose your Jotform template/form** from the dropdown.

That’s it.

---

## 🛠️ Setup — Monday.com

1. In Monday.com, **generate an API token** (Admin/Developers → API).
2. In **n8n → Credentials → New → Monday.com**, paste your **API token**.
3. Identify and set:
   - **Board ID** (from your board URL or via node “List” operations)
   - **Group ID** (e.g., `topics`)
   - **Column IDs** that match your board (examples used by this workflow):
     - `text_mkvdj8v3` → Email (Text)
     - `date_mkvdg4aa` → Start Date (Date)
     - `dropdown_mkvdjwra` → Engagement Type (Dropdown)
     - `dropdown_mkvdd9v3` → Campaign Type (Dropdown)
     - `text_mkvd2md9` → Campaign Type (as Text label)
     - `text_mkvd1bj2` → Instructions (Text)
     - `text_mkvd5w3y` → Domain (Text)
4. Update the **label → ID mappings** inside the Monday.com node if your dropdown IDs differ (e.g., `Engagement A` → `1`, `Engagement B` → `2`).

---

## ✅ Notes (best practices)

- No secrets in nodes: store tokens in **n8n Credentials**.
- Use the included **Sticky Notes** for quick reference inside the workflow.
- Test once in Jotform to see the payload flow into Monday.

---

## 📬 Contact

Need help customizing this (e.g., extra fields, file uploads, or routing by campaign)?

- 📧 **[email protected]**
- 🔗 **Robert Breen** — https://www.linkedin.com/in/robert-breen-29429625/
- 🌐 **ynteractive.com** — https://ynteractive.com
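The column mapping in step 3 amounts to building a `column_values` object keyed by column ID. A plain-Python sketch of that shape (the column IDs and label→ID pairs are the examples from this template, the helper is illustrative, and the value shapes follow Monday.com's `column_values` conventions — verify them against your own board):

```python
# Illustrative: build a Monday.com column_values payload from a Jotform
# submission. Column IDs and dropdown label→ID pairs are the examples
# listed above; adjust everything to match your own board.

DROPDOWN_IDS = {"Engagement A": 1, "Engagement B": 2}

def column_values(sub: dict) -> dict:
    return {
        "text_mkvdj8v3": sub["email"],
        "date_mkvdg4aa": {"date": sub["start_date"]},  # YYYY-MM-DD
        "dropdown_mkvdjwra": {"ids": [DROPDOWN_IDS[sub["engagement_type"]]]},
        "text_mkvd1bj2": sub["instructions"],
    }

vals = column_values({
    "email": "user@example.com",
    "start_date": "2025-09-01",
    "engagement_type": "Engagement A",
    "instructions": "Kick off next week",
})
```

If a Jotform label arrives that has no entry in the mapping, the lookup fails, which is exactly why step 4 above asks you to keep the label → ID table in sync with your board's dropdown options.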

Robert Breen · Project Management · 4 Sep 2025
Free · Intermediate

Generate and split sample data records using JavaScript and Python

A minimal, plug-and-play workflow that generates **sample data** using n8n’s **Code** node (both JavaScript and Python versions included) and then **fans out** those records into individual items with **Split Out**. Perfect for testing downstream nodes, mapping, pagination, or prototyping list-based logic without connecting to real data sources.

---

## ✅ What this template does

- **Generates 20 sample records** with fields: `index`, `num`, and `test`
- Writes the array to `item.json.barr`
- Uses **Split Out** to convert the array into **one item per record**
- Includes **both JavaScript and Python** implementations side-by-side

---

## 👤 Who’s it for

- Builders who need **mock data** to test mappings and loops
- Educators/demo makers who want a simple **fan-out pattern**
- Anyone who wants a **clean scaffold** for list processing in n8n

---

## ⚙️ How it works

- **Generate Data Javascript**: Produces `barr = [{ index, num, test }, ...]` for 20 rows
- **Generate Data Python**: Same output, written in Python
- **Split Out Javascript / Python**: Takes `barr` and emits **one item per element**

> Use either branch (JS or Python) depending on your preference.

---

## 🛠️ How to set up

No external setup required. Import the workflow and **Execute**. Select either the **JavaScript** or **Python** branch to see fan-out items.

---

## 📋 Requirements

- n8n (Cloud or self-hosted)
- No credentials or third-party services

---

## 🎛️ Customize the workflow

- Change the number of rows: adjust the loop range (`0..N`)
- Rename or add fields to each record (e.g., `name`, `price`, `tags`)
- Replace the static array with generated or randomized data
- Pipe the Split Out output into **Set**, **Function**, **HTTP**, or **Sheets** nodes for further testing

---

## 🗒️ Notes

- Sticky notes are included for in-editor guidance.
- Both implementations return the same structure so you can swap freely.
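The generate-then-fan-out pattern can be sketched in plain Python. The field names (`index`, `num`, `test`) match the template, while the example values and function names are illustrative (the exact Code-node syntax differs slightly inside n8n):

```python
# Illustrative version of the Generate Data + Split Out pattern.
# Field names match the template; the values here are just examples.

def generate_data(n: int = 20) -> list[dict]:
    return [{"index": i, "num": i * 10, "test": f"row-{i}"} for i in range(n)]

def split_out(barr: list[dict]) -> list[dict]:
    # Split Out: one output item per array element
    return [dict(item) for item in barr]

items = split_out(generate_data())
print(len(items))        # 20
print(items[0]["test"])  # row-0
```

Changing the row count is just the `n` argument here, which corresponds to adjusting the loop range (`0..N`) mentioned under customization.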
---

## 📬 Contact

Need help customizing this (e.g., shaping fields, adding randomizers, or exporting to CSV/Sheets)?

- 📧 **[email protected]**
- 🔗 **Robert Breen** — https://www.linkedin.com/in/robert-breen-29429625/
- 🌐 **ynteractive.com** — https://ynteractive.com

Robert Breen
Engineering
4 Sep 2025
1011
0
Workflow preview: Research business leads with Perplexity AI & save to Google Sheets using OpenAI
Free advanced

Research business leads with Perplexity AI & save to Google Sheets using OpenAI

Automatically research new leads in your target area, structure the results with AI, and append them into Google Sheets — all orchestrated in n8n.

---

## ✅ What this template does

- Uses **Perplexity** to research businesses (coffee shops in this example) with company name + email
- Cleans and structures the output into proper JSON using **OpenAI**
- Appends the new leads directly into **Google Sheets**, skipping duplicates

> **Trigger:** Manual — “Start Workflow”

---

## 👤 Who’s it for

- **Sales & marketing teams** who need to prospect local businesses
- **Agencies** running outreach campaigns
- **Freelancers** and consultants looking to automate lead research

---

## ⚙️ How it works

1. **Set Location** → define your target area (e.g., *Hershey PA*)
2. **Get Current Leads** → pull existing data from your Google Sheet to avoid duplicates
3. **Research Leads** → query Perplexity for 20 businesses, excluding already-scraped ones
4. **Write JSON** → OpenAI converts Perplexity output into structured Company/Email arrays
5. **Split & Merge** → align Companies with Emails row-by-row
6. **Send Leads to Google Sheets** → append or update leads in your sheet

---

## 🛠️ Setup instructions

Follow these sticky-note setup steps (already included in the workflow):

### 1) Connect Google Sheets (OAuth2)

1. In **n8n → Credentials → New → Google Sheets (OAuth2)**
2. Sign in with your Google account and grant access
3. In the Google Sheets node, select your **Spreadsheet** and **Worksheet**

Example sheet: https://docs.google.com/spreadsheets/d/1MnaU8hSi8PleDNVcNnyJ5CgmDYJSUTsr7X5HIwa-MLk/edit#gid=0

---

### 2) Connect Perplexity (API Key)

1. Sign in at https://www.perplexity.ai/account
2. Generate an API key: https://docs.perplexity.ai/guides/getting-started
3. In **n8n → Credentials → New → Perplexity API**, paste your key

---

### 3) Connect OpenAI (API Key)

1. In **n8n → Credentials → New → OpenAI API**
2. Paste your **OpenAI API key**
3.
In the **OpenAI Chat Model** node, select your credential and a chat model (e.g., `gpt-4o-mini`, `gpt-4o`)

---

## 🔧 Requirements

- A free **Google account**
- An **OpenAI API key** (https://platform.openai.com)
- A **Perplexity API key** (https://docs.perplexity.ai)
- n8n self-hosted or cloud instance

---

## 🎨 How to customize

- Change the **Search Area** in the `Set Location` node
- Modify the **Perplexity system prompt** to target different business types (e.g., gyms, salons, restaurants)
- Expand the Google Sheet schema to include more fields (phone, website, etc.)

---

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your Google Sheet)?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
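Back to the workflow logic: the Split & Merge step (step 5) pairs the two arrays from the AI's JSON positionally. A minimal sketch of that alignment, with invented sample data:

```javascript
// Align the Company and Email arrays row-by-row, one lead object per position.
function zipLeads(companies, emails) {
  return companies.map((company, i) => ({
    Company: company,
    Email: emails[i] || '', // uneven lists fall back to an empty email
  }));
}

const leads = zipLeads(['Cafe One', 'Bean & Co'], ['hello@cafeone.test']);
```

If Perplexity returns uneven lists, the fallback keeps row counts aligned so the Sheets append does not shift columns.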

Robert Breen
Lead Generation
4 Sep 2025
740
0
Workflow preview: Email new leads from Google Sheets via Outlook on a schedule
Free intermediate

Email new leads from Google Sheets via Outlook on a schedule

Send a templated outreach email to **new leads** in a Google Sheet on a **daily schedule**, then **mark each lead as contacted** so they won’t be emailed twice. Built with: **Schedule Trigger → Google Sheets → Filter → Outlook Send Email → Google Sheets (append/update)**.

---

## Who’s it for

- Teams that collect leads in Sheets and want automatic, once-per-lead outreach
- Solo founders/consultants running simple top-of-funnel follow-ups
- Anyone prototyping CRM-lite flows in n8n

---

## How it works / What it does

1. **Schedule Trigger** runs daily (e.g., 09:00).
2. **Get rows (Google Sheets)** pulls leads from your sheet.
3. **Filter** keeps only rows where `Contacted` is **empty**.
4. **Outlook – Send a message** emails each remaining lead (`to = {{$json.Email}}`).
5. **Google Sheets (appendOrUpdate)** writes back `Contacted = Yes` (match on `Email`) to prevent re-sending.

---

## How to set up

### 1) Google Sheets (OAuth2)

- In **n8n → Credentials → New → Google Sheets (OAuth2)**, sign in and grant access.
- In **Get rows**: select your **Lead Source** sheet (e.g., “New Leads”).
- In **Append/Update**: select the sheet you want to mark as contacted (can be the same sheet or a CRM sheet).
- Make sure your sheet has at least: `Email`, `Contacted` (blank for new).

### 2) Outlook (Microsoft Graph) OAuth2

- **n8n Cloud (quick connect):**
  - In **Credentials → New → Microsoft Outlook OAuth2**, choose **Connect**, sign in with your Microsoft account, and accept permissions.
- **Self-hosted (Azure App Registration):**
  1) Azure Portal → **App registrations** → **New registration**.
  2) Add redirect URL: `https://YOUR_N8N_URL/rest/oauth2-credential/callback`.
  3) **API permissions (Delegated):** `offline_access`, `Mail.Send`, `User.Read`. Grant admin consent if required.
  4) Create a **Client secret**; copy **Application (client) ID** and **Directory (tenant) ID**.
  5) In n8n, create **Microsoft Outlook OAuth2** credential with those values and scopes: `offline_access Mail.Send openid email profile`.

- In the **Send a message** node, select that credential and keep `To` mapped to `{{$json.Email}}`.
- Customize **Subject** and **Body** to your brand (default provided).

---

## Requirements

- A Google Sheet of leads with an **Email** column (and an optional **Contacted** column).
- A Microsoft account with permission to send mail via **Microsoft Graph** (Outlook).

---

## How to customize the workflow

- **Email content:** Edit subject/body (add variables like `{{$json.FirstName}}`).
- **Send window:** Change Schedule Trigger time/interval.
- **Idempotency:** Use the same sheet for read+write (match on `Email`) or separate source/CRM sheets.
- **Logging:** Add a Sheet or Notion append step after send for an audit trail.

---

## Security & best practices

- **No hardcoded secrets:** Always use **OAuth2 credentials** (Sheets & Outlook).
- **Sanitize PII:** If publishing, **remove real spreadsheet IDs/URLs** and personal emails/phone numbers.
- **Sticky notes are mandatory:** Add a **yellow sticky note** in the workflow with this description and a short setup checklist. Use neutral sticky notes for per-node tips.

---

## Troubleshooting

- **No emails sent:** Ensure `Contacted` is blank for new rows and the Filter checks “is empty.”
- **Auth errors (Outlook):** Reconnect credential; confirm Graph scopes and tenant consent.
- **Sheets errors:** Re-select spreadsheet/tab; verify headers (`Email`, `Contacted`).
- **Duplicates:** Confirm **Append/Update** uses **matching column = Email**.

---

## Tags & category

**Category:** Sales & CRM
**Tags:** outreach, follow-up, google-sheets, outlook, schedule, dedupe, cold-email

---

## Changelog

**v1.0.0** — Initial release: scheduled outreach, Outlook email, de-dupe via `Contacted = Yes`.

---

## Contact

Need help customizing (personalization, rate-limits, CRM sync, bounce checks)?
- 📧 [email protected]
- 🔗 Robert Breen — https://www.linkedin.com/in/robert-breen-29429625/
- 🌐 ynteractive.com — https://ynteractive.com
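For reference, the Filter (step 3) and the Contacted write-back (step 5) amount to the following logic; in the workflow these are configured in the node UI rather than coded, and the sample rows here are invented:

```javascript
// Step 3: keep only leads whose Contacted column is blank.
function newLeads(rows) {
  return rows.filter((row) => !row.Contacted || row.Contacted.trim() === '');
}

// Step 5: after sending, mark the lead so it is never emailed twice
// (the Sheets node matches on Email and writes Contacted = "Yes").
function markContacted(row) {
  return { ...row, Contacted: 'Yes' };
}

const rows = [
  { Email: 'a@example.com', Contacted: '' },
  { Email: 'b@example.com', Contacted: 'Yes' },
];
const toSend = newLeads(rows).map(markContacted);
```

This is what makes the flow idempotent: a lead is only ever in `toSend` once, because the write-back removes it from the next run's filter result.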

Robert Breen
Lead Nurturing
1 Sep 2025
2572
0
Workflow preview: AI-powered degree audit system with Google Sheets and GPT-5
Free intermediate

AI-powered degree audit system with Google Sheets and GPT-5

Run an **AI-powered degree audit** for each senior student. This template reads student rows from Google Sheets, evaluates completed courses against **hard-coded program requirements**, and writes back an **AI Degree Summary** of what's still missing (major core, Gen Eds, major electives, and upper-division credits). It's designed for quick advisor/registrar review and SIS prototypes.

**Trigger:** Manual — *When clicking "Execute workflow"*
**Core nodes:** Google Sheets, OpenAI Chat Model, (optional) Structured Output Parser
**Programs included:** Computer Science BS, Business Administration BBA, Psychology BA, Mechanical Engineering BS, Biology BS (Pre-Med), English Literature BA, Data Science BS, Nursing BSN, Economics BA, Graphic Design BFA

## Who's it for

- **Registrars & advisors** who need fast, consistent degree checks
- **Student success teams** building prototype dashboards
- **SIS/EdTech builders** exploring AI-assisted auditing

## How it works

1. **Read seniors** from Google Sheets (`Senior_data`) with: `StudentID`, `Name`, `Program`, `Year`, `CompletedCourses`.
2. **AI Agent** compares `CompletedCourses` to built-in requirements (per program) and computes **Missing** items + a short **Summary**.
3. **Write back** to the same sheet using "Append or update" by `StudentID` (updates **AI Degree Summary**; you can also map the raw **Missing** array to a column if desired).

**Example JSON (for one student):**

    {
      "StudentID": "S001",
      "Program": "Computer Science BS",
      "Missing": [
        "GEN-REMAIN | General Education credits remaining | 6",
        "CS-EL-REM | CS Major Electives (200+ level) | 6",
        "UPPER-DIV | Additional Upper-Division (200+ level) credits needed | 18",
        "FREE-EL | Free Electives to reach 120 total credits | 54"
      ],
      "Summary": "All core CS courses are complete. Still need 6 Gen Ed credits, 6 CS electives, and 66 total credits overall, including 18 upper-division credits — prioritize 200/300-level CS electives."
    }

## Setup (2 steps)

### 1) Connect Google Sheets (OAuth2)

In n8n → Credentials → New → Google Sheets (OAuth2) and sign in. In the Google Sheets nodes, select your spreadsheet and the Senior_data tab. Ensure your input sheet has at least: StudentID, Name, Program, Year, CompletedCourses.

### 2) Connect OpenAI (API Key)

In n8n → Credentials → New → OpenAI API, paste your key. In the OpenAI Chat Model node, select that credential and a model (e.g., gpt-4o or gpt-5).

## Requirements

- **Sheet columns:** StudentID, Name, Program, Year, CompletedCourses
- **CompletedCourses format:** pipe-separated IDs (e.g., GEN-101|GEN-103|CS-101).
- **Program labels:** should match the built-in list (e.g., Computer Science BS).
- **Credits/levels:** Template assumes upper-division ≥ 200-level (adjust the prompt if your policy differs).

## Customization

- **Change requirements:** Edit the Agent's system message to update totals, core lists, elective credit rules, or level thresholds.
- **Store more output:** Map Missing to a new column (e.g., AI Missing List) or write rows to a separate sheet for dashboards.
- **Distribute results:** Email summaries to advisors/students (Gmail/Outlook), or generate PDFs for advising folders.
- **Add guardrails:** Extend the prompt to enforce residency, capstone, minor/cognate constraints, or per-college Gen Ed variations.

## Best practices (per n8n guidelines)

- **Sticky notes are mandatory:** Include a yellow sticky note that contains this description and quick setup steps; add neutral sticky notes for per-step tips.
- **Rename nodes clearly:** e.g., "Get Seniors," "Degree Audit Agent," "Update Summary."
- **No hardcoded secrets:** Use credentials—not inline keys in HTTP or Code nodes.
- **Sanitize identifiers:** Don't ship personal spreadsheet IDs or private links in the published version.
- **Use a Set node for config:** Centralize user-tunable values (e.g., column names, tab names).
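For intuition, the gap check can be approximated in plain JavaScript; the real comparison lives in the AI agent's prompt, and the requirement list below is an invented example:

```javascript
// Compare a pipe-separated CompletedCourses string against a required-course list
// and return the courses that are still missing.
function missingCourses(completed, required) {
  const done = new Set(completed.split('|'));
  return required.filter((course) => !done.has(course));
}

const missing = missingCourses('GEN-101|GEN-103|CS-101', ['CS-101', 'CS-102', 'GEN-101']);
// → ['CS-102']
```

A deterministic check like this can also serve as a guardrail: run it alongside the agent and flag rows where the AI's Missing list disagrees.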
## Troubleshooting

- **OpenAI 401/429:** Verify API key/billing; slow concurrency if rate-limited.
- **Empty summaries:** Check column names and that CompletedCourses uses `|`.
- **Program mismatch:** Align Program labels to those in the prompt (exact naming recommended).
- **Sheets auth errors:** Reconnect Google Sheets OAuth2 and re-select spreadsheet/tab.

## Limitations

- **Not an official audit:** It infers gaps from the listed completions; registrar rules can be more nuanced.
- **Catalog drift:** Requirements are hard-coded in the prompt—update them each term/year.
- **Upper-division heuristic:** Adjust the level threshold if your institution defines it differently.

## Tags & category

**Category:** Education / Student Information Systems
**Tags:** degree-audit, registrar, google-sheets, openai, electives, upper-division, graduation-readiness

## Changelog

**v1.0.0** — Initial release: Senior_data in/out, 10 programs, AI Degree Summary output, append/update by StudentID.

## Contact

Need help tailoring this to your catalog (e.g., per-college Gen Eds, capstones, minors, PDFs/email)?

📧 [email protected]
🔗 Robert Breen — https://www.linkedin.com/in/robert-breen-29429625/
🌐 ynteractive.com — https://ynteractive.com

Robert Breen
Internal Wiki
1 Sep 2025
483
0
Workflow preview: Generate student course schedules based on prerequisites with GPT and Google Sheets
Free intermediate

Generate student course schedules based on prerequisites with GPT and Google Sheets

Create a **Fall 2025 course schedule** for each student based on what they’ve already completed, catalog prerequisites, and term availability (Fall/Both). Reads students from Google Sheets → asks an AI agent to select **exactly 5 courses** (target **15–17 credits**, no duplicates, prereqs enforced) → appends each student’s schedule to a **schedule** tab.

---

### 🧠 Summary

- **Trigger:** Manual — *“When clicking ‘Execute workflow’”*
- **I/O:** Google Sheets in → OpenAI decisioning → Google Sheets out
- **Ideal for:** Registrars, advisors, degree-planning prototypes

---

### ✅ What this template does

- **Reads**: `StudentID, Name, Program, Year, CompletedCourses` (pipe-separated CourseIDs) from **Sheet1**
- **Decides**: AI **Scheduling Agent** chooses **5 courses** per student following catalog rules and prerequisites
- **Writes**: Appends `StudentID` + `Schedule` strings to **schedule** worksheet
- **Credits target**: 15–17 total per term
- **Catalog rules** (enforced in the agent prompt):
  - Use **Fall** or **Both** courses for Fall 2025
  - Enforce **AND** prereqs (e.g., `CS-102|CS-103` means both)
  - Priority: **Major Core → Major Elective → Gen Ed** (include Gen Ed if needed)
  - No duplicates or already-completed courses
  - Prefer **200-level** progression when prereqs allow

---

### ⚙️ Setup (only 2 steps)

#### 1) Connect Google Sheets (OAuth2)

- In **n8n → Credentials → New → Google Sheets (OAuth2)**, sign in and grant access
- In the Google Sheets nodes, select your spreadsheet and these tabs:
  - **Sheet1** (input students)
  - **schedule** (output)

> Example spreadsheet (replace with your own):
> - Input: `.../edit#gid=0`
> - Output: `.../edit#gid=572766543`

#### 2) Connect OpenAI (API Key)

- In **n8n → Credentials → New → OpenAI API**, paste your key
- In the **OpenAI Chat Model** node, select that credential and a chat model (e.g., `gpt-4o`)

---

### 📥 Required input (Sheet1)

- **Columns**: `StudentID`, `Name`, `Program`, `Year`, `CompletedCourses`
- **CompletedCourses**: pipe-separated IDs (e.g., `GEN-101|GEN-103|CS-101`)
- **Program** names should match those referenced in the embedded catalog (e.g., *Computer Science BS*, *Business Administration BBA*, etc.)

---

### 📤 Output (schedule tab)

- **Columns**:
  - `StudentID`
  - `Schedule` → a selected course string (written one row per course after splitting)

---

### 🧩 Nodes in this template

- **Manual Trigger** → **Get Student Data (Google Sheets)** → **Scheduling Agent (OpenAI)** → **Split Schedule** → **Set Fields** → **Clear sheet** → **Append Schedule (Google Sheets)**

---

### 🛠 Configuration tips

- If you rename tabs, update both **Get Student Data** and **Append Schedule** nodes
- Keep `CompletedCourses` consistent (use `|` as the delimiter)
- To store **rationale** as well, add a column to the output and map it from the agent’s JSON

---

### 🧪 Test quickly

1) Add 2–3 sample student rows with realistic `CompletedCourses`
2) Run the workflow and verify:
   - 5 course rows per student in **schedule**
   - Course IDs respect prereqs & Fall/Both availability
   - Credits sum ~15–17

---

### 🧯 Troubleshooting

- **Sheets OAuth error:** Reconnect “Google Sheets (OAuth2)” and re-select the spreadsheet & tabs
- **Empty schedules:** Ensure `CompletedCourses` uses `|` and that programs/courses align with the provided catalog names
- **Prereq violations:** Check that students actually have all AND-prereqs in `CompletedCourses`
- **OpenAI errors (401/429):** Verify API key, billing, and rate limits → retry with lower concurrency

---

### 🔒 Privacy & data handling

- Student rows are sent to OpenAI for decisioning. Remove or mask any fields you don’t want shared.
- Google Sheets retains input/output. Use spreadsheet sharing controls to limit access.
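The Split Schedule → Set Fields steps do roughly the following; the `|` delimiter is an assumption here, so match it to however your agent formats its output:

```javascript
// Fan a student's schedule string out into one { StudentID, Schedule } row per course,
// matching the two columns written to the "schedule" tab.
function splitSchedule(studentId, schedule, delimiter = '|') {
  return schedule.split(delimiter).map((course) => ({
    StudentID: studentId,
    Schedule: course.trim(),
  }));
}

const rows = splitSchedule('S001', 'CS-201|CS-210|GEN-201|BUS-110|ART-105');
```

Five courses in, five rows out, which is exactly what the "5 course rows per student" check under Test quickly verifies.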
---

### 💸 Cost & performance

- **OpenAI**: Billed per token; cost scales with student count and prompt size
- **Google Sheets**: Free within normal usage limits
- **Runtime**: Typically seconds to a minute depending on rows and rate limits

---

### 🧱 Limitations & assumptions

- Works for **Fall 2025** only (as written). For **Spring**, update availability rules in the agent prompt
- Assumes the **catalog** in the agent system message is your source of truth
- Assumes **Program** names match catalog variants (case/spacing matter for clarity)

---

### 🧩 Customization ideas

- Add a **Max Credits** column to cap term credits per student
- Include **Rationale** in the sheet for advisor review
- Add a **“Gen Ed needed?”** flag per student to force at least one Gen Ed selection
- Export to PDF or email the schedules to advisors/students

---

### 🧾 Version & maintenance

- **n8n version:** Tested on recent n8n Cloud builds (2025)
- **Community nodes:** Not required
- **Maintenance:** Update the embedded catalog and offerings each term; keep prerequisites accurate

---

### 🗂 Tags & category

- **Category:** Education / Student Information Systems
- **Tags:** scheduling, registrar, google-sheets, openai, prerequisites, degree-planning, catalog, fall-term

---

### 🗒 Changelog

- **v1.0.0** — Initial release: Sheets in/out, Fall 2025 catalog rules, prereq enforcement, 5-course selection, credits target

---

## 📬 Contact

Need help customizing this (e.g., cohort logic, program-specific rules, adding rationale to the sheet, or emailing PDFs)?

- 📧 **[email protected]**
- 🔗 **Robert Breen** — https://www.linkedin.com/in/robert-breen-29429625/
- 🌐 **ynteractive.com** — https://ynteractive.com

Robert Breen
Project Management
1 Sep 2025
468
0
Workflow preview: Aggregate marketing spend data with custom pivots & VLOOKUPs in Google Sheets
Free intermediate

Aggregate marketing spend data with custom pivots & VLOOKUPs in Google Sheets

This workflow transforms raw marketing data from Google Sheets into a **pivot-like summary table**. It merges lookup data, groups spend by name, and appends the results into a clean reporting tab — all automatically, without needing to manually build pivot tables in Sheets.

---

## 🧑‍💻 Who’s it for

- Marketing analysts who track **channel spend** across campaigns
- Small businesses that rely on **Google Sheets** for reporting
- Teams that need **automated daily rollups** without rebuilding pivot tables manually

---

## ⚙️ How it works

1. **Get Marketing Data (Google Sheets)** – Pulls raw spend data.
2. **Vlookup Data (Google Sheets)** – Brings in reference/lookup fields (e.g., channel labels).
3. **Merge Tables** – Joins marketing data and lookup data on the `Channel` column.
4. **Summarize** – Groups data by `Name` and sums up **Spend ($)**.
5. **Clear Sheet** – Wipes the reporting tab to avoid duplicates.
6. **Append to Pivot Sheet** – Writes the aggregated results into the "render pivot" sheet.

The result: a pivot-style summary table inside Google Sheets, automatically refreshed by n8n.

---

## 🔑 Setup Instructions

### 1) Connect Google Sheets (OAuth2)

1. In **n8n → Credentials → New → Google Sheets (OAuth2)**
2. Sign in with your Google account and grant access
3. In each Google Sheets node, select your **Spreadsheet** and the appropriate **Worksheet**:
   - **data** (raw spend)
   - **Lookup** (channel reference table)
   - **render pivot** (output tab)

### 2) Configure Summarize Node

- Group by: `Name`
- Summarize: `Spend ($)` → `sum`

### 3) Test the Workflow

1. Execute the workflow manually
2. Check your "render pivot" tab — it should display aggregated spend by Name

---

## 🛠️ How to customize

- Change grouping fields (e.g., by **Channel**, **Campaign**, or **Region**)
- Add more aggregations (e.g., **average CPC**, **max impressions**)
- Use the **Merge node** to join extra data sources before summarizing
- Schedule execution to run **daily** for fresh rollups

---

## 📋 Requirements

- n8n (Cloud or self-hosted)
- Google Sheets account with structured data in `data` and `Lookup` tabs

---

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your pivot)?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
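For reference, the Summarize node's rollup (Group by `Name`, sum `Spend ($)`) is equivalent to this reduce, useful if you ever need the same aggregation inside a Code node; the sample rows are invented:

```javascript
// Group rows by Name and sum "Spend ($)", mirroring the Summarize node config.
function pivotSpend(rows) {
  const totals = rows.reduce((acc, row) => {
    acc[row.Name] = (acc[row.Name] || 0) + row['Spend ($)'];
    return acc;
  }, {});
  return Object.entries(totals).map(([Name, spend]) => ({ Name, 'Spend ($)': spend }));
}

const pivot = pivotSpend([
  { Name: 'Search', 'Spend ($)': 100 },
  { Name: 'Social', 'Spend ($)': 40 },
  { Name: 'Search', 'Spend ($)': 60 },
]);
```

Adding another aggregation (average CPC, max impressions) means carrying another accumulator field through the same reduce.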

Robert Breen
Document Extraction
31 Aug 2025
173
0
Workflow preview: Instagram visual analysis with Apify scraping, OpenAI GPT-5 & Google Sheets
Free intermediate

Instagram visual analysis with Apify scraping, OpenAI GPT-5 & Google Sheets

Pull recent Instagram post media for any username, fetch the image binaries, and run automated **visual analysis** with OpenAI — all orchestrated inside n8n.

This workflow uses a Google Sheet to supply target usernames, calls Apify’s Instagram Profile Scraper to fetch recent posts, downloads the images, and passes them to an OpenAI vision-capable model for structured analysis. Results can then be logged, stored, or routed onward depending on your use case.

---

## 🧑‍💻 Who’s it for

- Social media managers analyzing competitor or brand posts
- Marketing teams tracking visual trends and campaign content
- Researchers collecting structured insights from Instagram images

---

## ⚙️ How it works

1. **Google Sheets** – Supplies Instagram usernames (one per row).
2. **Apify Scraper** – Fetches latest posts (images and metadata).
3. **HTTP Request** – Downloads each image binary.
4. **OpenAI Vision Model** – Analyzes visuals and outputs structured summaries.
5. **Filter & Split Nodes** – Ensure only the right rows and posts are processed.

---

## 🔑 Setup Instructions

### 1) Connect Google Sheets (OAuth2)

1. Go to **n8n → Credentials → New → Google Sheets (OAuth2)**
2. Sign in with your Google account and grant access
3. In the **Get Google Sheet** node, select your spreadsheet + worksheet (must contain a `User` column with Instagram usernames)

### 2) Connect Apify (HTTP Query Auth)

1. Get your **Apify API token** at [Apify Console → Integrations/API](https://console.apify.com/account/integrations)
2. In **n8n → Credentials → New → HTTP Query Auth**, add a query param `token=<YOUR_APIFY_TOKEN>`
3. In the **Scrape Details** node, select that credential and use the provided URL

### 3) Connect OpenAI (API Key)

1. Create an API key at [OpenAI Platform](https://platform.openai.com/api-keys)
2. In **n8n → Credentials → New → OpenAI API**, paste your key
3.
In the **OpenAI Chat Model** node, select your credential and choose a **vision-capable model** (`gpt-4o-mini`, `gpt-4o`, or `gpt-5` if available)

---

## 🛠️ How to customize

- Change the **Google Sheet schema** (e.g., add campaign tags or notes).
- Adjust the **OpenAI system prompt** to refine what details are extracted (e.g., brand logos, colors, objects).
- Route results to **Slack, Notion, or Airtable** instead of storing only in Sheets.
- Apply filters (hashtags, captions, or timeframe) directly in the Apify scraper config.

---

## 📋 Requirements

- n8n (Cloud or self-hosted)
- Google Sheets account
- Apify account + API token
- OpenAI API key with a funded account

---

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**

Robert Breen
Market Research
31 Aug 2025
904
0
Workflow preview: Create multi-sheet Excel workbooks by merging datasets with Google Drive & Sheets
Free intermediate

Create multi-sheet Excel workbooks by merging datasets with Google Drive & Sheets

### Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets

Build an automated Excel file with **multiple tabs** directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel **worksheet**, then combined into a single `.xlsx` and (optionally) appended to a Google Sheet for sharing—eliminating manual copy-paste and speeding up reporting.

## Who’s it for

- Teams that publish recurring reports as Excel with **multiple tabs**
- Ops/Marketing/Data folks who want a **no-code/low-code** way to package JSON into Excel
- n8n beginners learning the **Code → Convert to File → Merge** pattern

## How it works

1. **Manual Trigger** starts the run.
2. **Code** nodes emit JSON rows for each table (e.g., People, Locations).
3. **Convert to File** nodes turn each JSON list into an Excel binary, assigning **Sheet1/Sheet2** (or your names).
4. **Merge** combines both binaries into a single Excel workbook with **multiple tabs**.
5. **Google Sheets (optional)** appends the JSON rows to a live spreadsheet for collaboration.

## Setup (only 2 connections)

### 1️⃣ Connect Google Sheets (OAuth2)

1. In **n8n → Credentials → New → Google Sheets (OAuth2)**
2. Sign in with your Google account and grant access
3. **Copy the example sheet referenced in the Google Sheets node** (open the node and duplicate the linked sheet), or select your own
4. In the workflow’s **Google Sheets** node, select your **Spreadsheet** and **Worksheet**

https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834#gid=1978181834

### 2️⃣ Connect Google Drive (OAuth2)

1. In **n8n → Credentials → New → Google Drive (OAuth2)**
2. Sign in with the Google account that will store your Excel outputs and **allow access**
3.
In your Drive-related nodes (if used), point to the **folder** where you want the `.xlsx` saved or retrieved

## Customize the workflow

- Replace the sample arrays in the **Code** nodes with your data (APIs, DBs, CSVs, etc.)
- Rename **sheetName** in each **Convert to File** node to match your desired tab names
- Keep the **Merge** node in **Combine All** mode to produce a single workbook
- In **Google Sheets**, switch to **Manual mapping** for strict column order (optional)

## Best practices (per template guidelines)

- **Rename nodes** to clear, action-oriented names (e.g., “Build People Sheet”, “Build Locations Sheet”)
- Add a **yellow Sticky Note** at the top with this description so users see setup in-workflow
- **Do not hardcode credentials** inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

## Sticky Note (copy-paste)

> **Multi-Tab Excel Builder (Google Drive + Google Sheets)**
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a **single workbook with multiple tabs**, and optionally appends rows to Google Sheets.
>
> **Setup (2 connections):**
> 1) **Google Sheets (OAuth2):** Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) **Google Drive (OAuth2):** Create credentials → choose the folder for storing/retrieving the `.xlsx`.
>
> **Customize:** Edit the Code nodes’ arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

## Troubleshooting

- **Missing columns / wrong order:** Use **Manual mapping** in the Google Sheets node
- **Binary not found:** Ensure each **Convert to File** node’s `binaryPropertyName` matches what **Merge** expects
- **Permissions errors:** Re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
- 📧 **[email protected]**
- 🔗 **https://www.linkedin.com/in/robert-breen-29429625/**
- 🌐 **https://ynteractive.com**
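As a reference for the Code → Convert to File pattern above, a dataset-producing Code node might look like this; the sample rows are invented, and inside n8n the node would end with `return items;`:

```javascript
// "Build People Sheet" Code node: one item per spreadsheet row.
// Convert to File then turns these items into a single Excel worksheet.
const people = [
  { Name: 'Ada', Role: 'Engineer' },
  { Name: 'Grace', Role: 'Analyst' },
];
const items = people.map((person) => ({ json: person }));
```

A second node shaped the same way (e.g., Locations) feeds the other Convert to File branch before Merge combines the two binaries.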

Robert Breen
Document Extraction
27 Aug 2025
513
0
Workflow preview: Automated PDF form filling for W-9 and more with PDF.co
Free intermediate

Automated PDF form filling for W-9 and more with PDF.co

## 🧑‍💻 Description

This workflow demonstrates how to take structured data (e.g., Name, Business, Address) and automatically fill a PDF form (IRS W-9) using **PDF.co** inside **n8n**. The workflow extracts values from a data source (e.g., Set node, database, or webhook), maps those values to fields in a PDF template, and generates a completed PDF. This can be adapted for contracts, invoices, HR forms, or any standardized documents that need automation.

By chaining additional nodes, you can also **store completed PDFs in Google Drive/Dropbox, email them to recipients, or route them into CRM/ERP systems**.

**Disclaimer: this template can only be used on self-hosted n8n instances**

![image.png](fileId:2207)

---

## ⚙️ Setup Instructions

### 1️⃣ Prepare Your Data

- Add a **Set** node (or connect to your data source such as Google Sheets, Airtable, or a webhook).
- Define fields like:
  - `Name`
  - `Business`
  - `Address`
  - `CityState`
- These fields will map directly to form fields in the PDF.

### 2️⃣ Connect PDF.co

1. Create a free account at [PDF.co](https://pdf.co/)
2. Copy your **API Key** from the dashboard
3. In **n8n → Credentials → New → PDF.co API**
   - Paste your API Key → **Save**
4. In the **PDF.co node**, select your credential and choose the **Fill a PDF Form** operation
5. Map each field in your dataset to the corresponding PDF form field (e.g., `Name` → Line 1 on the W-9).

---

## 📄 Example Form

This template uses the official IRS **W-9 PDF**: https://www.irs.gov/pub/irs-pdf/fw9.pdf

Mapped fields include:

- Name → Line 1
- Business → Line 2
- Address + City/State → Address block

You can expand this by mapping additional fields such as **Tax Classification, Exemptions, TIN, and Signature**.

---

## 🎛️ Customization Guidance

- **Different PDFs**: Swap the W-9 for any contract, invoice, or onboarding form your business uses.
- **Dynamic Data Sources**: Replace the Set node with Google Sheets, Airtable, or API/Webhook inputs to pull live data.
- **Automated Delivery**: After filling the form, use Email or Slack nodes to send PDFs directly to recipients.
- **Document Storage**: Save completed PDFs automatically to Google Drive, Dropbox, or AWS S3.
- **Bulk Processing**: Loop through rows of a dataset (e.g., customer list) to generate multiple personalized PDFs at once.

---

## 📬 Contact

Need help customizing this workflow for contracts, invoices, or HR forms?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**

Robert Breen
Document Extraction
26 Aug 2025
455
0
Workflow preview: Extract invoice data from Google Drive to Sheets using PDF.co AI parser
Free intermediate

Extract invoice data from Google Drive to Sheets using PDF.co AI parser

This workflow looks inside a **Google Drive folder**, parses each PDF invoice with **PDF.co’s AI Invoice Parser**, and appends vendor, amount, dates, and a link to the file into a **Google Sheet**.

---

## ⚙️ Setup Instructions

### 1) Connect Google Drive (OAuth2)

1. In **n8n → Credentials → New → Google Drive (OAuth2)**
2. Sign in to the Google account that owns your invoice PDFs and **allow access**.
3. In the **Get Parent Folder ID** node, set your search to the folder name (e.g., `n8n Invoices`).
4. In **Get Invoice ID’s**, ensure **Filter → folderId** uses the ID from the previous node (already wired in this template).

> **Tip:** The workflow builds a Drive link like
> `https://drive.google.com/file/d/<FILE_ID>`
> Make sure those files are at least viewable via link (e.g., **Anyone with the link – Viewer**) so PDF.co can fetch them.

---

### 2) Connect PDF.co (AI Invoice Parser)

1. Create a **PDF.co** account and copy your **API Key**.
2. In **n8n → Credentials → New → PDF.co API**, paste your key and **Save**.
3. In the **AI Invoice Parser** node, keep **URL** mapped to the Drive link coming from **Convert to URL**. The node is already set to use your PDF.co credentials.

> **What it does:** Sends each Drive file URL to PDF.co’s **AI Invoice Parser** and returns structured JSON (vendor, totals, dates, line items, etc.). The next **Set Fields** node maps those fields to simple columns for Sheets.

---

### 3) Connect Google Sheets (OAuth2)

1. In **n8n → Credentials → New → Google Sheets (OAuth2)**
2. Log in with your Google account and **grant access**.
3. In the **Store Data in Google Sheets** node, select your **Spreadsheet** and **Sheet**. This template is set to:
   - Spreadsheet: `Invoice Template` (`1a6QBIQkr7RsZtUZBi87NwhwgTbnr5hQl4J_ZOkr3F1U`)
   - Tab: `Due` (gid: `1002294955`)
4. The node is configured to **Append or Update** by `Url` so repeats won’t duplicate.
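The **Convert to URL** step referenced above builds each link from a Drive file ID, roughly like this (a sketch of the expression used in the node):

```javascript
// Turn a Drive file ID into the link format the workflow sends to PDF.co.
function driveLink(fileId) {
  return `https://drive.google.com/file/d/${fileId}`;
}

const url = driveLink('abc123'); // 'https://drive.google.com/file/d/abc123'
```

The same link is what lands in the `Url` column, which is why Append or Update can use it as the de-dupe key.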
**Expected columns on the `Due` tab:**

- `Url` (the Drive link the workflow generates)
- `Vendor`
- `Invoice Date`
- `Total`
- `Due Date`

---

## ▶️ How to Run

1. Put your PDF invoices in the **"n8n Invoices"** folder (or the folder name you used).
2. Click **Execute Workflow** in n8n.
3. Watch items flow:
   - **Get Parent Folder ID** → finds the folder
   - **Get Invoice ID's** → lists the files in that folder
   - **Convert to URL** → turns each file ID into a shareable link
   - **AI Invoice Parser** → extracts JSON from each PDF
   - **Set Fields** → maps fields for Sheets
   - **Store Data in Google Sheets** → appends/updates the `Due` sheet

---

## 🧩 Troubleshooting

- **PDF.co can't fetch the file:** Ensure your Drive file link is accessible (e.g., **Anyone with the link – Viewer**), or use a URL that PDF.co can reach.
- **No rows added:** Confirm the **Google Sheets** credential is selected and the **Sheet tab** (gid) matches your target.
- **Wrong folder:** Update the search string in **Get Parent Folder ID** to match your Drive folder name.

---

## 📬 Contact

Need help customizing this (e.g., auto-sharing Drive links, enriching the Sheet, or pushing to Slack/Email)?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
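The **Convert to URL** step is simple enough to sketch. The snippet below shows one way an n8n Code node could build the Drive link from each file ID; the `id` and `Url` field names are illustrative assumptions, not the template's exact configuration.

```javascript
// Hypothetical sketch of the "Convert to URL" Code node: adds a shareable
// Drive link (the format mentioned in the tip above) to each incoming file.
function toDriveLink(fileId) {
  return `https://drive.google.com/file/d/${fileId}`;
}

// In an n8n Code node, `items` would be the file list from "Get Invoice ID's";
// here we simulate two files.
const items = [{ json: { id: 'FILE_ID_1' } }, { json: { id: 'FILE_ID_2' } }];
const out = items.map((item) => ({
  json: { ...item.json, Url: toDriveLink(item.json.id) },
}));
console.log(out[0].json.Url); // https://drive.google.com/file/d/FILE_ID_1
```

Because the `Url` column doubles as the append-or-update key in the Sheets node, keeping this link format stable is what prevents duplicate rows on re-runs.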

Robert Breen
AI Summarization
26 Aug 2025
339
0
Free advanced

Generate marketing reports from Google Sheets with GPT-4 insights and PDF.co

This workflow pulls **marketing data from Google Sheets**, aggregates spend by channel, generates an **AI-written summary**, and outputs a formatted **PDF report** using a custom HTML template on **PDF.co**.

---

## ⚙️ Setup Instructions

### 1️⃣ Prepare Your Google Sheet

- Copy this template into your Google Drive: [Sample Marketing Data](https://docs.google.com/spreadsheets/d/1UDWt0-Z9fHqwnSNfU3vvhSoYCFG6EG3E-ZewJC_CLq4/edit?gid=365710158#gid=365710158)
- Add or update your marketing spend data in rows 2–100.

#### Connect Google Sheets in n8n

1. Go to **n8n → Credentials → New → Google Sheets (OAuth2)**.
2. Log in with your Google account and grant access.
3. Select the **Spreadsheet ID** and **Worksheet** in the workflow.

---

### 2️⃣ Set Up PDF.co for PDF Reports

1. Create a free account at [PDF.co](https://pdf.co/).
2. In **PDF.co Dashboard → HTML to PDF Templates**, create a new **Mustache template**:
   - Paste the HTML provided at the bottom of this description.
   - Save, and note your **Template ID**.
3. In **n8n → Credentials → New → PDF.co API**, paste your **API Key** and save.
4. In the workflow, select your **PDF.co credential** in the `Create PDF` node.
5. Replace the `templateId` with your Template ID.

---

## 🧠 How It Works

- **Google Sheets Node**: Pulls marketing spend data.
- **Summarize Nodes**: Aggregate total spend and spend per channel.
- **OpenAI Node**: Writes a daily summary of marketing performance.
- **Code Node**: Converts aggregated data into the correct shape for the PDF template.
- **PDF.co Node**: Generates a final, formatted **PDF report**.

---

## 📬 Contact

Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**

---

## 📄 HTML Template (for PDF.co)

> Paste this into a new **HTML Template** on [PDF.co](https://pdf.co/) and reference its Template ID in your workflow.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Invoice {{invoiceNumber}}</title>
  <style>
    body { font-family: Arial, Helvetica, sans-serif; margin: 36px; color: #222; }
    .header { display: flex; justify-content: space-between; align-items: center; }
    .brand { max-height: 56px; }
    h1 { margin: 12px 0 4px; font-size: 22px; }
    .meta { font-size: 12px; color: #555; }
    .two-col { display: flex; gap: 24px; margin-top: 16px; }
    .box { flex: 1; border: 1px solid #ddd; padding: 12px; border-radius: 6px; }
    .label { font-size: 11px; color: #666; text-transform: uppercase; letter-spacing: .02em; }
    table { width: 100%; border-collapse: collapse; margin-top: 16px; }
    th, td { border-bottom: 1px solid #eee; padding: 10px 8px; font-size: 13px; }
    th { background: #fafafa; text-align: left; }
    tfoot td { border-top: 2px solid #ddd; font-size: 13px; }
    .right { text-align: right; }
    .totals td { padding: 6px 8px; }
    .grand { font-weight: 700; font-size: 14px; }
    .notes { margin-top: 18px; font-size: 12px; color: #444; }
  </style>
</head>
<body>
  <div>
    <div>
      <h1>Invoice {{invoiceNumber}}</h1>
      <div>Date: {{invoiceDate}} &nbsp; | &nbsp; Due: {{dueDate}}</div>
    </div>
    {{#company.logoUrl}}
    <img src="{{company.logoUrl}}" alt="Logo" />
    {{/company.logoUrl}}
  </div>
  <div>
    <div>
      <div>From</div>
      <div><strong>{{company.name}}</strong></div>
      <div>{{company.address}}</div>
      <div>{{company.phone}}</div>
      <div>{{company.email}}</div>
    </div>
    <div>
      <div>Bill To</div>
      <div><strong>{{billTo.name}}</strong></div>
      <div>{{billTo.address}}</div>
      <div>{{billTo.email}}</div>
    </div>
  </div>
  <table>
    <thead>
      <tr>
        <th>#</th>
        <th>Description</th>
        <th>Qty</th>
        <th>Unit Price</th>
        <th>Line Total</th>
      </tr>
    </thead>
    <tbody>
      {{#items}}
      <tr>
        <td>{{line}}</td>
        <td>{{description}}</td>
        <td>{{qty}}</td>
        <td>{{unitPriceFmt}}</td>
        <td>{{lineTotalFmt}}</td>
      </tr>
      {{/items}}
    </tbody>
    <tfoot>
      <tr><td colspan="4">Subtotal</td><td>{{subTotalFmt}}</td></tr>
      <tr><td colspan="4">Tax ({{taxRatePct}})</td><td>{{taxAmountFmt}}</td></tr>
      <tr><td colspan="4">Discount</td><td>-{{discountFmt}}</td></tr>
      <tr><td colspan="4">Total</td><td>{{totalFmt}}</td></tr>
    </tfoot>
  </table>
  <div>
    <strong>Notes:</strong> {{notes}}<br />
    <strong>Terms:</strong> {{terms}}
  </div>
</body>
</html>
```
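As a rough illustration of what the Code node's reshaping step might look like, the sketch below maps per-channel spend totals into the `items` / `subTotalFmt` placeholders used by the template. The input field names (`channel`, `spend`) are assumptions; match them to your Summarize node output.

```javascript
// Illustrative sketch (not the template's exact Code node): reshape
// aggregated spend-per-channel rows into the structure the Mustache
// template's {{#items}} section and {{subTotalFmt}} placeholder expect.
const channelTotals = [
  { channel: 'Search', spend: 1200.5 },
  { channel: 'Social', spend: 830.25 },
];

const items = channelTotals.map((row, i) => ({
  line: i + 1,
  description: row.channel,
  lineTotalFmt: `$${row.spend.toFixed(2)}`,
}));
const subTotal = channelTotals.reduce((sum, row) => sum + row.spend, 0);

// This object is what would be passed to the PDF.co node's template data.
const templateData = { items, subTotalFmt: `$${subTotal.toFixed(2)}` };
```

The key design point is that formatting (`$`, two decimals) happens in the Code node, so the Mustache template stays purely presentational.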

Robert Breen
Document Extraction
26 Aug 2025
317
0
Free intermediate

Generate PDF invoices from Google Sheets with PDF.co

This workflow automatically **pulls invoice rows from Google Sheets** and generates a **PDF invoice** using a PDF.co template. Perfect for small businesses that manage invoices in Sheets but need professional, ready-to-send PDFs.

---

## ⚙️ Setup Instructions

### 1️⃣ Connect Google Sheets

1. Copy this [Invoice Template Sheet](https://docs.google.com/spreadsheets/d/1a6QBIQkr7RsZtUZBi87NwhwgTbnr5hQl4J_ZOkr3F1U/edit?usp=drivesdk) into your own Drive.
2. In **n8n → Credentials → New → Google Sheets (OAuth2)**, log in with your Google account and save.
3. In the workflow, select your **Spreadsheet ID** and **Worksheet (Sheet1)** in the **Google Sheets node**.

---

### 2️⃣ Connect PDF.co

1. Create a free account at [PDF.co](https://pdf.co/).
2. Copy your **API Key** from the dashboard.
3. In **n8n → Credentials → New → PDF.co API**, paste your API Key → **Save**.
4. In the **PDF.co node**, select your credential and choose the **HTML Template to PDF** operation.

---

### 3️⃣ Create Your PDF.co HTML Template

1. In your [PDF.co dashboard](https://app.pdf.co/), go to **Templates → New Template**.
2. Paste in the HTML from the template section below.
3. Save the template and copy its **Template ID**.
4. Replace the `templateId` in the workflow with your new ID.

---

## 🧠 How It Works

- **Get Invoice Rows** → Reads your Google Sheet rows (customer, line items, totals)
- **Convert to HTML Import (Code node)** → Formats rows into structured data for the invoice template
- **Create PDF (PDF.co)** → Sends the data to PDF.co and generates a professional PDF invoice

---

## 📬 Contact

Need help customizing this (e.g., adding taxes, changing the invoice styling, or auto-emailing invoices)?
📧 **[email protected]**
🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
🌐 **[ynteractive.com](https://ynteractive.com)**

---

## 📄 Full HTML Template for PDF.co

Use this HTML when creating your PDF.co template:

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8" />
  <title>Invoice {{invoiceNumber}}</title>
  <style>
    body { font-family: Arial, Helvetica, sans-serif; margin: 36px; color: #222; }
    .header { display: flex; justify-content: space-between; align-items: center; }
    .brand { max-height: 56px; }
    h1 { margin: 12px 0 4px; font-size: 22px; }
    .meta { font-size: 12px; color: #555; }
    .two-col { display: flex; gap: 24px; margin-top: 16px; }
    .box { flex: 1; border: 1px solid #ddd; padding: 12px; border-radius: 6px; }
    .label { font-size: 11px; color: #666; text-transform: uppercase; letter-spacing: .02em; }
    table { width: 100%; border-collapse: collapse; margin-top: 16px; }
    th, td { border-bottom: 1px solid #eee; padding: 10px 8px; font-size: 13px; }
    th { background: #fafafa; text-align: left; }
    tfoot td { border-top: 2px solid #ddd; font-size: 13px; }
    .right { text-align: right; }
    .totals td { padding: 6px 8px; }
    .grand { font-weight: 700; font-size: 14px; }
    .notes { margin-top: 18px; font-size: 12px; color: #444; }
  </style>
</head>
<body>
  <div>
    <div>
      <h1>Invoice {{invoiceNumber}}</h1>
      <div>Date: {{invoiceDate}} &nbsp; | &nbsp; Due: {{dueDate}}</div>
    </div>
    {{#company.logoUrl}}
    <img src="{{company.logoUrl}}" alt="Logo" />
    {{/company.logoUrl}}
  </div>
  <div>
    <div>
      <div>From</div>
      <div><strong>{{company.name}}</strong></div>
      <div>{{company.address}}</div>
      <div>{{company.phone}}</div>
      <div>{{company.email}}</div>
    </div>
    <div>
      <div>Bill To</div>
      <div><strong>{{billTo.name}}</strong></div>
      <div>{{billTo.address}}</div>
      <div>{{billTo.email}}</div>
    </div>
  </div>
  <table>
    <thead>
      <tr>
        <th>#</th>
        <th>Description</th>
        <th>Qty</th>
        <th>Unit Price</th>
        <th>Line Total</th>
      </tr>
    </thead>
    <tbody>
      {{#items}}
      <tr>
        <td>{{line}}</td>
        <td>{{description}}</td>
        <td>{{qty}}</td>
        <td>{{unitPriceFmt}}</td>
        <td>{{lineTotalFmt}}</td>
      </tr>
      {{/items}}
    </tbody>
    <tfoot>
      <tr><td colspan="4">Subtotal</td><td>{{subTotalFmt}}</td></tr>
      <tr><td colspan="4">Tax ({{taxRatePct}})</td><td>{{taxAmountFmt}}</td></tr>
      <tr><td colspan="4">Discount</td><td>-{{discountFmt}}</td></tr>
      <tr><td colspan="4">Total</td><td>{{totalFmt}}</td></tr>
    </tfoot>
  </table>
  <div>
    <strong>Notes:</strong> {{notes}}<br />
    <strong>Terms:</strong> {{terms}}
  </div>
</body>
</html>
```
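To make the "Convert to HTML Import" step concrete, here is a hedged sketch of how flat sheet rows could be reshaped into the template's placeholders. The column names (`Description`, `Qty`, `UnitPrice`) and the 10% tax rate are illustrative assumptions, not the template's actual configuration.

```javascript
// Illustrative sketch of the "Convert to HTML Import" Code node: turn flat
// sheet rows into the nested structure the invoice template expects.
const rows = [
  { Description: 'Consulting', Qty: 3, UnitPrice: 150 },
  { Description: 'Setup fee', Qty: 1, UnitPrice: 200 },
];

const items = rows.map((r, i) => {
  const lineTotal = r.Qty * r.UnitPrice;
  return {
    line: i + 1,
    description: r.Description,
    qty: r.Qty,
    unitPriceFmt: `$${r.UnitPrice.toFixed(2)}`,
    lineTotalFmt: `$${lineTotal.toFixed(2)}`,
    lineTotal, // kept for the subtotal computation below
  };
});

const subTotal = items.reduce((s, it) => s + it.lineTotal, 0);
const taxRate = 0.1; // assumed 10% purely for illustration

// The object handed to the PDF.co node as template data.
const invoiceData = {
  items,
  subTotalFmt: `$${subTotal.toFixed(2)}`,
  taxRatePct: `${(taxRate * 100).toFixed(0)}%`,
  taxAmountFmt: `$${(subTotal * taxRate).toFixed(2)}`,
  totalFmt: `$${(subTotal * (1 + taxRate)).toFixed(2)}`,
};
```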

Robert Breen
Invoice Processing
26 Aug 2025
291
0
Free intermediate

Log new Gmail messages automatically in Google Sheets

## 🧑‍💻 Description

This workflow automatically **fetches new Gmail messages since the last run** and appends them to a **Google Sheet** with their ID, snippet, and timestamp. It's designed for **professionals, small business owners, and operations teams** who need a simple way to keep an ongoing log of emails, build lightweight CRMs, or power downstream automations like **email tracking, reporting, or follow-up reminders**.

---

## ⚙️ Setup Instructions for Gmail → Google Sheets Email Log

### 1️⃣ Connect Gmail

1. In **n8n → Credentials → New → Gmail OAuth2**.
2. Log in with your Gmail account and approve access.
3. Attach this credential to the **Get new messages** node in the workflow.

### 2️⃣ Connect Google Sheets

1. Copy this [Google Sheet template](https://docs.google.com/spreadsheets/d/1t5VXtbo9g7SvGDPmeZok4HG1K-WI1PS0DNBylzmhVwg/edit?usp=drivesdk) into your own Drive.
2. In **n8n → Credentials → New → Google Sheets (OAuth2)**, log in with your Google account and save.
3. In the workflow, select your Spreadsheet ID and Worksheet (Sheet1 by default) in the **Google Sheets nodes**.

---

## 🧠 How Email Logging Automation Works

- **Get Current Emails** → Reads the last logged messages from Google Sheets
- **Get Max Date** → Finds the most recent email timestamp
- **Combine** → Passes the last timestamp and today's date together
- **Get New Messages** → Fetches all new Gmail messages after the last logged date
- **Add Emails to Sheets** → Appends the message ID, snippet, and date to the sheet

---

## 🎛️ Customization Guidance

- **Filter by Sender or Subject** → Modify the Gmail query to only log messages from specific addresses or with certain keywords.
- **Track Labels** → Add Gmail labels to your sheet to separate categories (e.g., "Leads," "Support," "Invoices").
- **Auto-Reply** → Extend with an additional Gmail node to send automated responses to new messages.
- **Send Alerts** → Add a Slack or Email node to ping your team whenever a new email is logged.
- **Analytics** → Use Google Sheets formulas or connect to BI tools to analyze trends like response times or sender frequency.

---

## 📬 Contact

Need advanced customization (e.g., multi-inbox tracking, CRM integration, or automated follow-ups)?

📧 **[email protected]**
🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
🌐 **[ynteractive.com](https://ynteractive.com)**
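The "Get Max Date" → "Combine" → "Get New Messages" chain boils down to finding the newest logged timestamp and turning it into a Gmail search filter. A minimal sketch, assuming the sheet stores ISO timestamps in a `date` column (Gmail's search syntax accepts `after:YYYY/MM/DD`):

```javascript
// Hypothetical sketch of the max-date logic feeding the Gmail filter.
// Field names (id, date) are assumptions; match them to your sheet columns.
const loggedEmails = [
  { id: 'a1', date: '2025-08-20T09:15:00Z' },
  { id: 'a2', date: '2025-08-24T17:40:00Z' },
];

// Find the most recent logged timestamp.
const maxDate = loggedEmails
  .map((e) => new Date(e.date))
  .reduce((max, d) => (d > max ? d : max), new Date(0));

// Build a Gmail search query that only matches newer messages.
const query = `after:${maxDate.toISOString().slice(0, 10).replace(/-/g, '/')}`;
console.log(query); // after:2025/08/24
```

The same `query` string is where you would append sender or subject filters (e.g., `from:client@example.com`) for the customization ideas above.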

Robert Breen
CRM
26 Aug 2025
310
0
Free intermediate

Reddit content Q&A chatbot with GPT-4o and Reddit API

This workflow lets you **chat with Reddit** using OpenAI and the Reddit API. The chatbot pulls posts from a subreddit and uses GPT to answer your questions.

---

## ⚙️ Setup Instructions

### 1️⃣ Set Up OpenAI Connection

1. Go to the [OpenAI Platform](https://platform.openai.com/api-keys).
2. Navigate to [OpenAI Billing](https://platform.openai.com/settings/organization/billing/overview).
3. Add funds to your billing account.
4. Copy your API key into the **OpenAI credentials** in n8n.

---

### 2️⃣ Set Up Reddit API

1. Go to [Reddit Apps](https://www.reddit.com/prefs/apps).
2. Click **Create App** → choose the **script** type.
3. Fill in:
   - **Name**: (your choice)
   - **Redirect URI**: `http://localhost:8080` (or your n8n URL)
   - Save → copy the **Client ID** and **Secret**.
4. In **n8n → Credentials → New → Reddit OAuth2 API**:
   - Enter the **Client ID** + **Client Secret**.
   - Log in with your Reddit account and approve access.
5. Attach this credential to the **Reddit Tool** node.

---

## 🧠 How It Works

- **Reddit Tool Node** → Fetches posts from the chosen subreddit (e.g., `r/n8n`)
- **OpenAI Agent** → Processes the posts and your chat question
- **Chatbot** → Returns summarized answers with Reddit context

---

## 📬 Contact

Need help customizing this (e.g., targeting multiple subreddits or filtering posts)?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**

Robert Breen
AI Chatbot
25 Aug 2025
159
0
Free intermediate

Compare your n8n version with latest release using n8n API

## 🧑‍💻 Description

This workflow automatically compares the **version of your n8n instance** with the **latest release available**. Keeping your n8n instance up to date is essential for **security patches, bug fixes, performance improvements, and access to new automation features**. By running this workflow, you'll know right away if your instance is behind and whether it's time to upgrade.

After the comparison, the workflow clearly shows whether your instance is **up-to-date** or **outdated**, along with the version numbers for both. This makes it easy to plan updates and keep your automation environment secure and reliable.

---

## ⚙️ Setup Instructions

### 1️⃣ Set Up n8n API Credentials

1. In your **n8n instance**, go to **Admin Panel → API**.
2. Copy your **API Key**.
3. In **n8n → Credentials → New → n8n API**, paste the API Key and save it.
4. Attach this credential to the **n8n node** (`Set up your n8n credentials`).

---

## ✅ How It Works

- **Get Most Recent n8n Version** → Fetches the latest release info from [docs.n8n.io](https://docs.n8n.io).
- **Extract Version + Clean Value** → Parses the version string for accuracy.
- **Get your n8n version** → Connects to your own n8n instance via the API and retrieves the current version.
- **Compare** → Evaluates the difference and tells you whether your instance is **current** or needs an **update**.

---

## 🎛️ Customization Guidance

- **Notifications**: Add an Email or Slack node to automatically notify your team when a new n8n update is available.
- **Scheduling**: Use a **Schedule Trigger** to run this workflow daily or weekly for ongoing monitoring.
- **Conditional Actions**: Extend the workflow to log version mismatches into Google Sheets, or even trigger upgrade playbooks.
- **Multi-Instance Tracking**: Duplicate the version-check step for multiple n8n environments (e.g., dev, staging, production).

---

## 💬 Example Output

- *"Your instance (v1.25.0) is up-to-date with the latest release (v1.25.0)."*
- *"Your instance (v1.21.0) is behind the latest release (v1.25.0). Please update to get the latest bug fixes and features."*

---

## 📬 Contact

Need help setting up API credentials or automating version checks across environments?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
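The **Compare** step is essentially a semantic-version comparison. A minimal sketch of that logic, assuming plain `major.minor.patch` version strings (this is not the workflow's exact expression):

```javascript
// Minimal semver-style comparison, as the Compare step might do it.
// Handles an optional leading "v" and compares fields numerically,
// so "1.9.0" correctly sorts below "1.10.0".
function isUpToDate(instanceVersion, latestVersion) {
  const parse = (v) => v.replace(/^v/, '').split('.').map(Number);
  const [a, b] = [parse(instanceVersion), parse(latestVersion)];
  for (let i = 0; i < Math.max(a.length, b.length); i++) {
    const x = a[i] || 0;
    const y = b[i] || 0;
    if (x !== y) return x > y;
  }
  return true; // identical versions count as up to date
}

console.log(isUpToDate('1.25.0', '1.25.0')); // true
console.log(isUpToDate('1.21.0', '1.25.0')); // false
```

Comparing fields numerically rather than as strings is the important detail; a plain string comparison would misorder versions once a field reaches two digits.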

Robert Breen
DevOps
25 Aug 2025
443
0
Free intermediate

Auto-categorize Outlook emails into color categories with GPT-4o

This workflow fetches recent emails from Outlook and uses OpenAI to assign a category (e.g., **Red**, **Yellow**, **Green**). It then updates each message's category in Outlook.

---

## ⚙️ Setup Instructions

### 1️⃣ Set Up Outlook Connection

1. In **n8n → Credentials → New → Microsoft Outlook OAuth2 API**.
2. Log in with your Outlook account and approve access.
3. Attach this credential to the **Get Messages from Outlook** and **Update Category** nodes in the workflow.
4. (Optional) Adjust the **limit** field in **Get Messages from Outlook** if you want more or fewer than 5 results.

---

### 2️⃣ Set Up OpenAI Connection

1. Go to the **[OpenAI Platform](https://platform.openai.com/api-keys)**.
2. Navigate to **[OpenAI Billing](https://platform.openai.com/settings/organization/billing/overview)**.
3. Add funds to your billing account.
4. Copy your API key into the **OpenAI** credentials in n8n and select it on the **OpenAI Chat Model** and **Categorizing Agent** nodes.

---

## ✅ Notes

- The agent reads `subject` + `bodyPreview` and returns a JSON category.
- The **Update Category** node writes that category back to Outlook.
- You can tweak the category names or logic in the **Categorizing Agent** system message.

---

## 📬 Contact

Need help customizing the categorization rules or routing categories to folders?

- 📧 **[email protected]**
- 🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
- 🌐 **[ynteractive.com](https://ynteractive.com)**
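Since the agent returns a JSON category, it can help to validate that output before writing it back to Outlook. A hypothetical sketch, assuming the agent returns `{"category": "..."}` with one of the three color names (adjust to your actual agent output shape):

```javascript
// Hypothetical guard between the Categorizing Agent and Update Category:
// parse the agent's JSON and reject anything outside the known categories.
const ALLOWED = ['Red', 'Yellow', 'Green'];

function parseCategory(agentOutput) {
  try {
    const parsed = JSON.parse(agentOutput);
    return ALLOWED.includes(parsed.category) ? parsed.category : null;
  } catch {
    return null; // the model returned non-JSON text; skip this message
  }
}

console.log(parseCategory('{"category": "Red"}')); // Red
console.log(parseCategory('not json')); // null
```

Returning `null` for anything unexpected lets a downstream IF node skip the update instead of writing a malformed category to Outlook.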

Robert Breen
AI Summarization
25 Aug 2025
720
0
Free intermediate

Financial data Q&A chatbot with Google Finance, SerpAPI, and OpenAI

Replace `YOUR_API_KEY` with your actual SerpApi key.

### 2️⃣ Set Up OpenAI Connection

1. Go to the [OpenAI Platform](https://platform.openai.com/api-keys).
2. Navigate to **Billing** and ensure your account has credits/funding.
3. Copy your **API Key** into the **OpenAI credentials** in n8n.

---

## 🧠 Workflow Breakdown

1. **Chat Trigger** → The user enters a financial question (e.g., "What's the current price of Tesla?").
2. **HTTP Request (SerpApi Finance Search)** → Fetches the latest market data for the requested ticker or index.
3. **OpenAI Node** → Takes both the raw financial data and the user's query, then formulates a natural-language response.
4. **Output** → Returns a clear, conversational answer that can be displayed in chat, Slack, or another integration.

---

## 🎛️ Customization Guidance

- **Multiple Tickers**: Update the workflow to query multiple tickers (e.g., TSLA, AAPL, AMZN) and return a combined report.
- **Scheduling**: Add a **Schedule Trigger** to run this workflow every morning and send a market recap.
- **Delivery Channels**: Use Slack, Email, or Google Sheets nodes to distribute reports automatically.
- **Extended Data**: Adjust the SerpApi query to include more than prices (e.g., company info, market news, or related tickers).
- **Custom Prompts**: Change the OpenAI system prompt to make the chatbot more formal (for reporting) or casual (for quick insights).

---

## 💬 Example Questions & Responses

**Question:** *"What's the current price of the S&P 500?"*
**Expected Response:** "The S&P 500 (^GSPC) is currently trading at **4,725.13**, down **0.8%** today."

**Question:** *"Summarize the performance of Tesla and Apple today."*
**Expected Response:**
- Tesla (TSLA): **$238.45**, up **1.5%**
- Apple (AAPL): **$192.11**, down **0.3%**

**Question:** *"Give me a quick market recap."*
**Expected Response:** "Markets are mixed today: the S&P 500 is slightly down, while tech stocks like Tesla are showing gains. Apple dipped slightly after earnings news."

---

## 📬 Contact

Need help customizing this workflow (e.g., multiple tickers, daily summaries, or integrating into dashboards)?

📧 **[email protected]**
🔗 **[Robert Breen](https://www.linkedin.com/in/robert-breen-29429625/)**
🌐 **[ynteractive.com](https://ynteractive.com)**
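The HTTP Request step boils down to one SerpApi call. Here is a sketch of how that URL could be assembled, assuming SerpApi's Google Finance engine and the `TICKER:EXCHANGE` query format (verify both against SerpApi's documentation for your use case):

```javascript
// Sketch of the SerpApi Google Finance request the HTTP Request node makes.
// URLSearchParams handles encoding (e.g., ":" becomes "%3A").
function buildFinanceUrl(ticker, apiKey) {
  const params = new URLSearchParams({
    engine: 'google_finance',
    q: ticker,
    api_key: apiKey,
  });
  return `https://serpapi.com/search.json?${params.toString()}`;
}

console.log(buildFinanceUrl('TSLA:NASDAQ', 'YOUR_API_KEY'));
```

For the multi-ticker customization above, you would call this once per symbol and merge the responses before handing them to the OpenAI node.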

Robert Breen
AI Chatbot
22 Aug 2025
809
0