Miscellaneous Workflows
Send package and visitor alerts with Slack, SMS, email, Google Sheets and Claude
## Who this is for

Property management teams handling multiple properties with high package/visitor traffic who want automated tenant and management notifications.

## What this workflow does

Automatically classifies package and visitor events, sends notifications to tenants, alerts property managers, and logs activity for accountability.

## How it works

1. Package/visitor system triggers the workflow.
2. AI classifies urgency and type.
3. Notifications are sent via Email, SMS, and Slack.
4. Google Sheets logs all events.
5. Optional AI follow-up suggestions for unclaimed packages or missed visitors.

## How to set up

Connect Webhook, Slack, Email, SMS, and AI. Test routing and logging. Adjust AI prompts for local building protocols.

## Requirements

- AI Node
- Webhook from package/visitor system
- Slack, Email, SMS credentials
- Google Sheets

Built by QuarterSmart. Created by Hyrum Hurst.
Send daily Malaysian weather alerts with Perplexity AI, Firecrawl and Telegram
## Automated Malaysian Weather Alerts with Perplexity AI, Firecrawl and Telegram

This n8n template automates daily weather monitoring by fetching official government warnings and searching for related news coverage, then delivering comprehensive reports directly to Telegram.

Use cases include monitoring severe weather conditions, tracking flood warnings across Malaysian states, staying updated on weather-related news, and receiving automated daily weather briefings for emergency preparedness.

### Good to know

- Firecrawl's free tier allows limited scraping requests per hour. Keep the 3-second interval between requests to avoid rate limits.
- OpenAI costs apply for content summarization; GPT-4.1 mini balances quality and affordability.
- After testing multiple AI models (GPT, Gemini), **Perplexity Sonar Pro Search proved most effective** for finding recent, relevant weather news from Malaysian sources.
- The workflow focuses on major Malaysian news outlets like Utusan, Harian Metro, Berita Harian, and Kosmo.

### How it works

1. A Schedule Trigger runs daily at 9 AM to fetch weather warnings from Malaysia's official data.gov.my API.
2. JavaScript code processes the weather data to extract warning types, severity levels, and affected locations.
3. Search queries are aggregated and combined with location information.
4. A Perplexity Sonar Pro AI Agent searches for recent news articles (within 3 days) from Malaysian news channels.
5. URLs are cleaned and processed one by one through a loop to manage API limits.
6. Firecrawl scrapes each news article and extracts summaries from the main content.
7. All summaries and source URLs are combined and sent to OpenAI for final report generation.
8. The polished weather report is delivered to your Telegram channel in English.

### How to use

- The schedule trigger is set for 9 AM but can be adjusted to any preferred time.
- Replace the Telegram chat ID with your channel or group ID.
- The workflow automatically filters out "No Advisory" warnings to avoid unnecessary notifications.
- Modify the search query timeout and batch processing based on your API limits.

### Requirements

- OpenAI API key (get one at [https://platform.openai.com](https://platform.openai.com))
- Perplexity API via OpenRouter (get access at [https://openrouter.ai](https://openrouter.ai))
- Firecrawl API key (get the free tier at [https://firecrawl.dev](https://firecrawl.dev))
- Telegram Bot token and channel/group ID

### Customizing this workflow

- **Expand news sources**: Modify the AI Agent prompt to include additional Malaysian news outlets or social media sources.
- **Language options**: Change the final report language from English to Bahasa Malaysia by updating the "Make a summary" system prompt.
- **Alert filtering**: Adjust the JavaScript code to focus on specific warning types (e.g., only severe warnings or specific states).
- **Storage integration**: Connect to Supabase or Google Sheets to maintain a historical database of weather warnings and news.
- **Multi-channel delivery**: Add more notification nodes to send alerts via email, WhatsApp, or SMS alongside Telegram.
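The warning-processing step (filter out "No Advisory", derive a search query per warning) can be sketched in a Code node like this. The field names (`warning_issue`, `text_en`, `severity`) are assumptions about the data.gov.my payload shape, not confirmed by the template:

```javascript
// Illustrative sketch of the "filter out No Advisory" + query-building step.
// Field names are assumed; adapt them to the actual API response.
const warnings = [
  { warning_issue: { title_en: "Thunderstorm Warning" }, text_en: "Heavy rain over Selangor", severity: "Severe" },
  { warning_issue: { title_en: "No Advisory" }, text_en: "", severity: "None" },
];

// Keep only actionable warnings and derive a news search query per item.
const actionable = warnings
  .filter(w => w.warning_issue.title_en !== "No Advisory")
  .map(w => ({
    type: w.warning_issue.title_en,
    severity: w.severity,
    query: `${w.warning_issue.title_en} Malaysia ${w.text_en}`.trim(),
  }));
```

Each resulting `query` would then be handed to the Perplexity agent in step 4.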
Predict disaster damage and coordinate property response with GPT-4, Google Sheets, Calendar and Gmail
## How It Works

This automated disaster response workflow streamlines emergency management by monitoring multiple alert sources and coordinating property protection teams. Designed for property managers, insurance companies, and emergency response organizations, it solves the critical challenge of rapidly identifying at-risk properties and deploying resources during disasters.

The system continuously monitors weather, seismic, and flood alerts from authoritative sources. When threats are detected, it cross-references property databases to identify affected locations, calculates insurance exposure, and generates damage assessments using OpenAI's GPT-4. Teams receive automated maintenance schedules while property owners and insurers get instant email notifications with comprehensive reports. This eliminates manual monitoring, reduces response time from hours to minutes, and ensures no vulnerable properties are overlooked during emergencies.

## Setup Steps

1. Configure alert fetch nodes with weather/seismic/flood API endpoints
2. Connect property database credentials (specify database type)
3. Add an OpenAI API key for GPT-4 damage assessments
4. Set up Gmail/SMTP credentials for owner and insurer notifications
5. Customize insurance calculation formulas and team scheduling logic

## Prerequisites

Weather/seismic/flood alert API access, property database (SQL/Sheets/Airtable)

## Use Cases

Insurance companies automating claims preparation, property management firms protecting rental portfolios

## Customization

Modify alert source APIs, adjust damage assessment prompts

## Benefits

Reduces emergency response time by 90%, eliminates manual alert monitoring
Find the best Roblox server for your game using a webhook and Roblox API
# Who is this for?

Gamers who don't like poor, sluggish performance while playing and want to find a well-performing server near them.

# Setup?

None!

# Modes Available

- Auto - Optimized (Recommended & Default)
- Ping - Finds the lowest ping
- Latency - Lowest ping & highest FPS
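The three modes above amount to different scoring functions over the server list. A minimal sketch, assuming server entries expose `ping` and `fps` fields (the Roblox server-list API's actual fields and these scoring weights are illustrative assumptions):

```javascript
// Hypothetical server data; in the real workflow this would come from the
// Roblox API via the webhook-triggered HTTP request.
const servers = [
  { id: "a", ping: 120, fps: 45 },
  { id: "b", ping: 40, fps: 60 },
  { id: "c", ping: 45, fps: 80 },
];

// One scoring function per mode; higher score wins.
function bestServer(servers, mode) {
  const score = {
    ping: s => -s.ping,               // lowest ping wins
    latency: s => -s.ping + s.fps,    // low ping AND high FPS
    auto: s => -s.ping + 0.5 * s.fps, // balanced default (assumed weights)
  }[mode];
  return servers.reduce((best, s) => (score(s) > score(best) ? s : best));
}
```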
Smart irrigation scheduler with weather forecast and soil analysis
# Smart Irrigation Scheduler with Weather Forecast and Soil Analysis

## Summary

Automated garden and farm irrigation system that uses weather forecasts and evapotranspiration calculations to determine optimal watering schedules, preventing water waste while maintaining healthy plants.

## Detailed Description

A comprehensive irrigation management workflow that analyzes weather conditions, forecasts, soil types, and plant requirements to make intelligent watering decisions. The system considers multiple factors including expected rainfall, temperature, humidity, wind speed, and days since last watering to determine if irrigation is needed and how much.

### Key Features

- **Multi-Zone Management**: Support for multiple irrigation zones with different plant and soil types
- **Weather-Based Decisions**: Uses OpenWeatherMap current conditions and 5-day forecast
- **Evapotranspiration Calculation**: Simplified Penman method for accurate water loss estimation
- **Rain Forecast Skip**: Automatically skips watering when significant rain is expected
- **Plant-Type Specific**: Different requirements for flowers, vegetables, grass, and shrubs
- **Soil Type Consideration**: Adjusts for clay, loam, and sandy soil characteristics
- **Urgency Classification**: High/medium/low priority based on moisture levels
- **Optimal Timing**: Adjusts watering time based on temperature and wind conditions
- **IoT Integration**: Sends commands to smart irrigation controllers
- **Historical Logging**: Tracks all decisions in Google Sheets

### Use Cases

- Home garden automation
- Commercial greenhouse management
- Agricultural operations
- Landscaping company scheduling
- Property management with large grounds
- Water conservation projects

### Required Credentials

- OpenWeatherMap API key
- Slack Bot Token
- Google Sheets OAuth
- IoT Hub API (optional)

### Node Count: 24 (19 functional + 5 sticky notes)

### Unique Aspects

- Uses **OpenWeatherMap** node (rarely used in templates)
- Uses **Split Out** node for loop-style processing of zones
- Uses **Filter** node for conditional routing
- Uses **Aggregate** node to collect results
- Implements **evapotranspiration calculation** using Code node
- Comprehensive **multi-factor decision logic**

### Workflow Architecture

```
[Daily Morning Check]   [Manual Override Trigger]
          \                    /
           +--------+---------+
                    |
                    v
        [Define Irrigation Zones]
                    |
                    v
           [Split Zones] (Loop)
              /          \
             v            v
     [Get Current]  [Get 5-Day Forecast]
             \            /
              +----+-----+
                   |
                   v
         [Merge Weather Data]
                   |
                   v
       [Analyze Irrigation Need]
              /          \
             v            v
    [Filter Needing]  [Aggregate All]
             \            /
              +----+-----+
                   |
                   v
     [Generate Irrigation Schedule]
                   |
                   v
      [Has Irrigation Tasks?] (If)
           /               \
      Has Tasks          No Tasks
      /   |   \              |
[Sheets][IoT][Slack]  [Log No Action]
      \   |   /              |
       +--+---+--------------+
                   |
                   v
         [Respond to Webhook]
```

### Configuration Guide

1. **Irrigation Zones**: Edit "Define Irrigation Zones" with your zone data (coordinates, plant/soil types)
2. **Water Thresholds**: Adjust `waterThreshold` per zone based on plant needs
3. **OpenWeatherMap**: Add API credentials in the weather nodes
4. **Slack Channel**: Set to your garden/irrigation channel
5. **IoT Integration**: Configure the endpoint URL for your smart valve controller
6. **Google Sheets**: Connect to your logging spreadsheet

### Decision Logic

The system evaluates:

1. Expected rainfall in the next 24 hours (skip if >5mm expected)
2. Soil moisture estimate based on days since watering + evapotranspiration
3. Plant-specific minimum and ideal moisture levels
4. Temperature adjustments for hot days
5. Scheduled watering frequency by plant type
6. Wind speed for optimal watering time
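The decision logic above (rain skip, water-loss estimate vs. threshold) can be sketched as a small Code-node function. The evapotranspiration formula below is an illustrative simplification in the spirit of the "simplified Penman method", not the workflow's exact formula, and the field names (`waterThresholdMm`, `daysSinceWatering`) are assumptions:

```javascript
// Rough ET estimate: hotter, drier, windier days lose more water (mm/day).
// Coefficients are illustrative, not calibrated.
function evapotranspirationMm(tempC, humidityPct, windMs) {
  const vaporDeficit = (100 - humidityPct) / 100;
  return Math.max(0, 0.05 * tempC * vaporDeficit * (1 + 0.3 * windMs));
}

function needsIrrigation(zone, weather) {
  if (weather.rainNext24hMm > 5) return false; // rain forecast skip (>5mm)
  const lossMm =
    zone.daysSinceWatering *
    evapotranspirationMm(weather.tempC, weather.humidityPct, weather.windMs);
  return lossMm > zone.waterThresholdMm;
}

const decision = needsIrrigation(
  { daysSinceWatering: 3, waterThresholdMm: 4 },
  { tempC: 32, humidityPct: 40, windMs: 2, rainNext24hMm: 0 }
);
```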
Automate event registration and QR check-ins with Google Sheets, Gmail, and Slack
Who is this for?

This template is ideal for event organizers, conference managers, and community teams who need an automated participant management system. Perfect for workshops, conferences, meetups, or any event requiring registration and check-in tracking.

What this workflow does

This workflow provides end-to-end event management with two main flows:

Registration Flow:
⦁ Receives participant registration via webhook
⦁ Generates a unique ticket ID and stores it in Google Sheets
⦁ Creates a QR code using the QR Code node
⦁ Sends a confirmation email with the QR code attached

Check-in Flow:
⦁ Scans and decodes the QR code at the venue entrance
⦁ Validates the ticket against the participant database
⦁ Blocks duplicate check-ins with clear error messages
⦁ Sends a Slack notification for VIP arrivals
⦁ Returns real-time attendance statistics

Setup

1. Create a Google Sheet with columns: Ticket ID, Event ID, Name, Email, Ticket Type, Registered At, Checked In, Check-in Time
2. Connect your Google Sheets and Gmail credentials
3. Configure Slack for VIP notifications
4. Set up the webhook URLs in your registration form

Requirements

⦁ Google Sheets (participant database)
⦁ Gmail account (confirmation emails)
⦁ Slack workspace (VIP notifications)

How to customize

⦁ Add capacity limits by checking row count before registration
⦁ Modify QR code size and format in the QR Code node
⦁ Add additional ticket types beyond VIP/standard
⦁ Integrate with payment systems for paid events
Play RPG with Groq Dungeon Master via Telegram voice messages
## Dungeons and Goblins — AI Telegram Voice Adventure with Persistent Memory

This n8n template demonstrates how to use an AI agent with persistent memory to run a structured, rules-driven fantasy role-playing game entirely through Telegram voice messages. The workflow acts as a Dungeon Master, narrating scenes, resolving mechanics, performing dice rolls when authorized, and explicitly saving game state between turns.

## How it works

- Player actions are sent to the Telegram bot as voice messages.
- The AI agent loads the current game state from n8n memory.
- A strict system prompt enforces rules, turn flow, and narration.
- When an action requires a dice roll, the agent waits for player authorization.
- Once authorized, the AI rolls, resolves the outcome, and applies changes.
- All state updates are emitted as structured data and saved to memory.
- Voice input and output are handled by Groq's speech-to-text and text-to-speech models.

## Use cases

- Solo text-based fantasy campaigns
- Persistent AI-driven adventures
- Testing stateful AI agents in n8n
- Educational examples of memory-aware workflows

## Requirements

- Groq API token (free tier supported)
- Telegram bot API token
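The authorize-then-roll turn flow can be sketched as a tiny state transition. The d20-vs-DC mechanic and the state shape here are illustrative assumptions; the template leaves the actual rules to the system prompt:

```javascript
// A die roll from a [0,1) random source, so it can be made deterministic.
function rollD20(rng = Math.random) {
  return 1 + Math.floor(rng() * 20);
}

// Without authorization, nothing is rolled; the turn stays pending.
// With it, the roll is resolved and the updated state is returned for saving.
function resolveAction(state, { authorized, dc, modifier = 0 }, rng) {
  if (!authorized) return { ...state, pending: "awaiting player authorization" };
  const roll = rollD20(rng);
  const success = roll + modifier >= dc;
  return { ...state, pending: null, lastRoll: roll, success };
}

const result = resolveAction({ hp: 10 }, { authorized: true, dc: 12, modifier: 3 }, () => 0.5);
```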
Automate inventory management with Google Sheets & Gmail
## ✅ What problem does this workflow solve?

Managing inventory manually requires constant monitoring, manual purchase order creation, and back-and-forth communication with suppliers. This workflow automates the **entire inventory replenishment cycle** — from detecting low-stock items to generating purchase orders and emailing suppliers automatically. It ensures accurate stock levels, reduces manual work, and prevents stockouts.

---

## 💡 Main Use Cases

- 🔍 Identify low-stock items automatically based on thresholds
- 📊 Perform scheduled daily inventory checks
- 🧾 Auto-generate purchase orders for items that need replenishment
- ✉️ Email purchase orders directly to suppliers
- 📄 Update Google Sheets with order and inventory tracking information

---

## 🧠 How It Works – Step-by-Step

### 1. ⏰ Scheduled Trigger

The workflow runs automatically every day (or any chosen interval) to begin inventory checks without manual involvement.

### 2. 📉 Get Low-Stock Items

Reads your Google Sheets inventory file to identify items where **current stock < minimum stock threshold**.

### 3. 🧮 Process Each Low-Stock Item

For every item below the threshold:

- Calculates the required order quantity
- Generates purchase order details, including:
  - SKU / Item Name
  - Quantity Needed
  - Supplier Email
  - Stock Levels

### 4. 🔀 Conditional Flow

For each low-stock item:

#### **Purchase Order Actions**

- Creates a purchase order email using the generated details
- Sends the PO automatically to the supplier via Gmail
- Logs the PO entry in Google Sheets with:
  - Item Details
  - Order Quantity
  - Supplier
  - Timestamp
  - Status ("PO Sent")

### 5. 📢 Notifications

Sends purchase order emails directly to suppliers. (Optional) Internal notifications (Slack/email) can be added for procurement visibility.

---

## 📊 Logging & Reporting

All actions — PO creation, stock levels, supplier emails — are written back to Google Sheets for complete auditability and reporting.

---

## 👤 Who can use this?

Perfect for:

- Retail & eCommerce businesses
- Warehouse teams
- Procurement & purchasing departments
- Manufacturing operations
- Any business managing physical inventory

---

## 🚀 Benefits

- ⏱ Automated stock monitoring
- 📦 Prevents stockouts
- ✉️ Eliminates manual PO creation
- 📚 Creates a complete audit trail
- 🧠 Smart, rule-based reorder logic
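Steps 2 and 3 above (detect low stock, compute order quantity) can be sketched in a few lines. Column names and the refill-to-maximum reorder rule are assumptions about the sheet layout, not confirmed by the template:

```javascript
// Hypothetical inventory rows as read from Google Sheets.
const inventory = [
  { sku: "A-100", name: "Widget", current: 3, minimum: 10, maximum: 50, supplierEmail: "supplier@example.com" },
  { sku: "B-200", name: "Gadget", current: 25, minimum: 10, maximum: 50, supplierEmail: "supplier@example.com" },
];

// Detect items below threshold and build one PO entry per item.
const purchaseOrders = inventory
  .filter(item => item.current < item.minimum)
  .map(item => ({
    sku: item.sku,
    name: item.name,
    orderQty: item.maximum - item.current, // refill-to-maximum rule (assumed)
    supplierEmail: item.supplierEmail,
    status: "PO Sent",
    timestamp: new Date().toISOString(),
  }));
```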
Real-time public transport delay tracking with ScrapeGraphAI, Teams & Dropbox
# Public Transport Schedule & Delay Tracker with Microsoft Teams and Dropbox

**⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.**

This workflow automatically scrapes public transport websites or apps for real-time schedules and service alerts, then pushes concise delay notifications to Microsoft Teams while archiving full-detail JSON snapshots in Dropbox. Ideal for commuters and travel coordinators, it keeps riders informed and maintains a historical log of disruptions.

## Pre-conditions/Requirements

### Prerequisites

- n8n instance (self-hosted or n8n.cloud)
- ScrapeGraphAI community node installed
- Microsoft Teams incoming webhook configured
- Dropbox account with an app token created
- Public transit data source (website or API) that is legally scrapable or offers open data

### Required Credentials

- **ScrapeGraphAI API Key** – enables web scraping
- **Microsoft Teams Webhook URL** – posts messages into a channel
- **Dropbox Access Token** – saves JSON files to Dropbox

### Specific Setup Requirements

| Item | Example | Notes |
|------|---------|-------|
| Transit URL(s) | https://mycitytransit.com/line/42 | Must return the schedule or service alert data you need |
| Polling Interval | 5 min | Adjust via Cron node or external trigger |
| Teams Channel | #commuter-updates | Create an incoming webhook in channel settings |

## How it works

The workflow scrapes the configured transit sources for real-time schedules and service alerts, detects delays against the expected timetable, notifies Microsoft Teams, and archives full JSON snapshots in Dropbox.

## Key Steps

- **Webhook Trigger**: Starts the workflow (can be replaced with Cron for polling).
- **Set Node**: Stores target route IDs, URLs, or API endpoints.
- **SplitInBatches**: Processes multiple routes one after another to avoid rate limits.
- **ScrapeGraphAI**: Scrapes each route page/API and returns structured schedule & alert data.
- **Code Node (Normalize)**: Cleans & normalizes scraped fields (e.g., converts times to ISO).
- **If Node (Delay Detected?)**: Compares live data vs. expected timetable to detect delays.
- **Merge Node**: Combines route metadata with delay information.
- **Microsoft Teams Node**: Sends an alert message and rich card to the selected Teams channel.
- **Dropbox Node**: Saves the full JSON snapshot to a dated folder for historical reference.
- **StickyNote**: Documents the mapping between scraped fields and final JSON structure.

## Set up steps

**Setup Time: 15-25 minutes**

1. **Clone or Import** the JSON workflow into your n8n instance.
2. **Install ScrapeGraphAI** community node if you haven't already (`Settings → Community Nodes`).
3. **Open the Set node** and enter your target routes or API endpoints (array of URLs/IDs).
4. **Configure ScrapeGraphAI**:
   - Add your API key in the node's credentials section.
   - Define CSS selectors or API fields inside the node parameters.
5. **Add Microsoft Teams credentials**:
   - Paste your channel's incoming webhook URL into the Microsoft Teams node.
   - Customize the message template (e.g., include route name, delay minutes, reason).
6. **Add Dropbox credentials**:
   - Provide the access token and designate a folder path (e.g., `/TransitLogs/`).
7. **Customize the If node** logic to match your delay threshold (e.g., ≥5 min).
8. **Activate** the workflow and trigger via the webhook URL, or add a Cron node (every 5 min).

## Node Descriptions

### Core Workflow Nodes

- **Webhook** – External trigger for on-demand checks or recurring scheduler.
- **Set** – Defines static or dynamic variables such as route list and thresholds.
- **SplitInBatches** – Iterates through each route to control request volume.
- **ScrapeGraphAI** – Extracts live schedule and alert data from transit websites/APIs.
- **Code (Normalize)** – Formats scraped data, merges dates, and calculates delay minutes.
- **If (Delay Detected?)** – Branches the flow based on the presence of delays.
- **Merge** – Re-assembles metadata with computed delay results.
- **Microsoft Teams** – Sends formatted notifications to Teams channels.
- **Dropbox** – Archives complete JSON payloads for auditing and analytics.
- **StickyNote** – Provides inline documentation for maintainers.

### Data Flow

```
Webhook → Set → SplitInBatches → ScrapeGraphAI → Code (Normalize) → If (Delay Detected?)
  ├─ true  → Merge → Microsoft Teams → Dropbox
  └─ false → Dropbox
```

## Customization Examples

### Change to Slack instead of Teams

```javascript
// Replace the Microsoft Teams node with a Slack node using this payload
{
  "text": `🚊 *${$json.route}* is delayed by *${$json.delay}* minutes.`,
  "channel": "#commuter-updates"
}
```

### Filter only major delays (>10 min)

```javascript
// In the If node, use:
return $json.delay >= 10;
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "route": "Line 42",
  "expected_departure": "2024-04-22T14:05:00Z",
  "actual_departure": "2024-04-22T14:17:00Z",
  "delay": 12,
  "status": "delayed",
  "reason": "Signal failure at Main Station",
  "scraped_at": "2024-04-22T13:58:22Z",
  "source_url": "https://mycitytransit.com/line/42"
}
```

## Troubleshooting

### Common Issues

1. **ScrapeGraphAI returns empty data** – Verify CSS selectors/API fields match the current website markup; update selectors after site redesigns.
2. **Teams messages not arriving** – Ensure the Teams webhook URL is correct and the incoming webhook is still enabled.
3. **Dropbox writes fail** – Check the folder path, token scopes (`files.content.write`), and available storage quota.

### Performance Tips

- Limit `SplitInBatches` to 5-10 routes per run to avoid IP blocking.
- Cache unchanged schedules locally and fetch only alert pages for faster runs.

**Pro Tips:**

- Use environment variables for API keys & webhook URLs to keep credentials secure.
- Attach a Cron node set to off-peak hours (e.g., 4 AM) for daily full-schedule backups.
- Add a Grafana dashboard that reads the Dropbox archive for long-term delay analytics.
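The Code (Normalize) step's delay calculation can be sketched directly from the output format above: parse both ISO timestamps and take the difference in minutes.

```javascript
// Compute delay minutes from the normalized departure timestamps,
// producing the same fields shown in the Data Output Format section.
function computeDelay(route) {
  const expected = new Date(route.expected_departure);
  const actual = new Date(route.actual_departure);
  const delay = Math.round((actual - expected) / 60000); // ms → minutes
  return { ...route, delay, status: delay > 0 ? "delayed" : "on-time" };
}

const r = computeDelay({
  route: "Line 42",
  expected_departure: "2024-04-22T14:05:00Z",
  actual_departure: "2024-04-22T14:17:00Z",
});
```

The resulting `delay` field is exactly what the If node's `return $json.delay >= 10;` check consumes.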
Monitor commodity markets with Apify, DeepL translation & sector impact analysis
Stay ahead of commodity market movements with automated news collection, translation, and sector impact analysis. This workflow monitors Oil, Gold, and Grain markets from global English-language sources, translates the news to Japanese using DeepL, and delivers categorized alerts showing which business sectors are affected.

## Who is this for

- Trading company staff and procurement managers affected by raw material prices
- CFD and commodity futures traders
- Economists and market researchers tracking inflation indicators
- Anyone who needs early warning on geopolitical risks affecting commodities

## What this workflow does

1. Fetches the latest news every 4 hours for three commodity categories using Apify
2. Categorizes news and identifies impacted business sectors automatically
3. Translates headlines and summaries from English to Japanese using DeepL
4. Adds unit conversion notes (barrel, troy ounce, bushel) for easier understanding
5. Formats a comprehensive report with sector impact tags
6. Delivers alerts to Discord, Telegram, and Gmail simultaneously

## How to set up

1. Get your Apify API token from apify.com (Settings → Integrations → API)
2. Replace the token placeholder in all HTTP Request nodes
3. Add DeepL API credentials (free tier: 500,000 chars/month)
4. Configure at least one notification channel
5. Set your Telegram Chat ID and email address
6. Activate the workflow

## Requirements

- Apify account (free tier available)
- DeepL API key (free tier available)
- At least one notification channel (Discord, Telegram, or Gmail)
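The unit-conversion notes from step 4 can be sketched as a small lookup. The conversion factors are standard reference values; the note wording and category keys are assumptions:

```javascript
// Standard conversion factors per commodity unit.
const UNIT_NOTES = {
  oil: { unit: "barrel", litres: 158.987 },
  gold: { unit: "troy ounce", grams: 31.1035 },
  grain: { unit: "bushel (wheat)", kg: 27.2155 },
};

// Build a human-readable note to append to each translated summary.
function unitNote(category) {
  const u = UNIT_NOTES[category];
  if (!u) return "";
  const metric = u.litres ? `${u.litres} L` : u.grams ? `${u.grams} g` : `${u.kg} kg`;
  return `1 ${u.unit} ≈ ${metric}`;
}
```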
Create daily trivia icebreakers in Slack with OpenTDB & Google Sheets log
**Who this workflow is for**

This template is for teams who want a lightweight “daily icebreaker” in Slack and creators who’d like to build a reusable trivia database over time. It works well for remote teams, communities, and any workspace that enjoys a quick brain teaser each day.

**What this workflow does**

The workflow fetches a random multiple-choice question from the Open Trivia Database (OpenTDB), posts a nicely formatted trivia message to a Slack channel, and logs the full question and answers into a Google Sheets spreadsheet. Over time, this creates a searchable “trivia archive” you can reuse for quizzes, content, or community events.

**How it works**

A Schedule Trigger runs once per day at a time you define. A Set node randomly chooses a difficulty level (easy, medium, or hard). A Switch node routes to the matching OpenTDB HTTP request. Each branch normalizes the API response into common fields (timestamp, date, difficulty, category, question, correct, incorrect, messageTitle, messageBody). A Merge node combines the three branches into a single stream. Slack posts the trivia message. Google Sheets appends the same data as a new row.

**How to set up**

Connect your Slack OAuth2 credentials and choose a target channel. Connect your Google Sheets credentials and select the spreadsheet and sheet. Adjust the schedule (time and frequency) to match your use case.

**How to customize**

Change the Slack message format (for example, add emojis or hints). Filter categories or difficulty levels instead of picking them fully at random. Add additional logging (e.g., user reactions, answer stats) in Sheets or another datastore.
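The random-difficulty pick and per-branch normalization can be sketched as below. OpenTDB returns HTML-encoded strings, so a small decoder is included; its entity coverage is deliberately partial, and the exact field mapping here is an assumption based on the field list above:

```javascript
// Random difficulty pick, taking an injectable RNG for testability.
function pickDifficulty(rng = Math.random) {
  const levels = ["easy", "medium", "hard"];
  return levels[Math.floor(rng() * levels.length)];
}

// Decode the handful of HTML entities OpenTDB commonly emits.
function decodeEntities(s) {
  const map = { "&quot;": '"', "&#039;": "'", "&amp;": "&", "&lt;": "<", "&gt;": ">" };
  return s.replace(/&quot;|&#039;|&amp;|&lt;|&gt;/g, m => map[m]);
}

// Normalize one OpenTDB result into the common fields each branch produces.
function normalize(apiResult, difficulty) {
  return {
    timestamp: new Date().toISOString(),
    difficulty,
    category: decodeEntities(apiResult.category),
    question: decodeEntities(apiResult.question),
    correct: decodeEntities(apiResult.correct_answer),
    incorrect: apiResult.incorrect_answers.map(decodeEntities),
  };
}
```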
Automated MIT AI news delivery to Discord with deduplication
Stay ahead of the curve with the latest Artificial Intelligence research from MIT, delivered directly to your Discord server—clean, filtered, and duplicate-free. This workflow is perfect for AI agencies, researchers, and tech teams who want to stay informed without the noise.

## How it works

This workflow runs automatically every day (default 9:00 AM) to:

1. **Fetch** the official MIT News RSS feed for the "Artificial Intelligence" topic.
2. **Filter** articles to keep only those published in the last 24 hours.
3. **Deduplicate** content using an internal **n8n Data Table**. It checks if the article link has already been sent to prevent spamming old news.
4. **Notify** your team on Discord with a clean, formatted message including the Title, Author, Date, and Link.

## Setup steps

### 1. Create the Data Table (Mandatory)

This workflow relies on n8n Data Tables to track sent articles. Before running:

1. Go to your n8n Dashboard > **Data Tables**.
2. Create a new table named: `mit_ai_news_sent`
3. Add these exact columns:
   * `creator` (Type: String)
   * `title` (Type: String)
   * `link` (Type: String)
   * `pubDate` (Type: Date)

### 2. Connect the Table

1. Import this template.
2. Open the orange nodes named **"Avoid Duplicated Articles"** and **"Register New Article"**.
3. Select the `mit_ai_news_sent` table you just created from the list.

### 3. Configure Discord

1. Create a Webhook in your Discord Server (Server Settings > Integrations > Webhooks).
2. Open the **"MIT AI Articles"** node.
3. Create a new Credential and paste your Webhook URL.
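The link-based deduplication step amounts to a set-membership check against the `mit_ai_news_sent` table. A minimal sketch, where `sentRows` stands in for the Data Table lookup result:

```javascript
// Rows already recorded in the mit_ai_news_sent Data Table (stand-in data).
const sentRows = [{ link: "https://news.mit.edu/old-article" }];

// Freshly fetched RSS items.
const fetched = [
  { title: "New model", link: "https://news.mit.edu/old-article" },
  { title: "Fresh result", link: "https://news.mit.edu/fresh-result" },
];

// Keep only articles whose link has not been sent before.
const sentLinks = new Set(sentRows.map(r => r.link));
const toSend = fetched.filter(a => !sentLinks.has(a.link));
```

Items surviving the filter would be posted to Discord and then inserted back into the table by the "Register New Article" node.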
🔄️ AI warehouse inventory cycle count bot using GPT, Telegram and Google Sheets
*Tags: Logistics, Supply Chain, Warehouse Operations, Paperless Processes, Inventory Management*

### Context

Hi! I’m [Samir](https://samirsaci.com), Supply Chain Engineer, Data Scientist based in Paris, and founder of [LogiGreen](https://logi-green.com).

> Let's use AI with n8n to help SMEs digitalise their logistics operations!

Traditional inventory cycle counts often require clipboards, scanners, and manual reconciliation. With this workflow, the operator walks through the warehouse, sends **voice messages**, and the bot automatically updates the inventory records. Using AI-based transcription and structured extraction, we optimise the entire process with a simple mobile device connected to Telegram.

📬 For business inquiries, you can find me on [LinkedIn](https://www.linkedin.com/in/samir-saci)

### Demo of the workflow

In this example, the bot guides the operator through the cycle count for three locations, and the workflow automatically records the results in Google Sheets.

### Who is this template for?

This template is ideal for companies with limited IT resources:

- **Inventory controllers** who need a hands-free, mobile-friendly counting process
- Small **3PLs** and retailers looking to digitalise stock control

### 🎥 Tutorial

A complete tutorial (with explanations of every node) is available on [YouTube](https://youtu.be/_EOJ3M7APsQ).

### What does this workflow do?

This automation uses Telegram and OpenAI’s Whisper transcription:

1. The operator sends **/start** to the bot.
2. The bot identifies the **first location** that still needs to be counted.
3. The operator is guided to the location through a Telegram message.
4. The operator records a **voice message** with the `location ID` and the `number of units` counted.
5. AI nodes transcribe the audio and extract `location_id` and `quantity`.
6. If the message cannot be transcribed, the bot asks the operator to repeat.
7. If the location is valid and still pending, the Google Sheet is updated.
8. The bot sends the **next location**, until the final one is completed.
9. The operator receives a confirmation that the cycle count is finished.

### Next Steps

Before running the workflow, follow the sticky notes and configure:

- Connect your **Telegram Bot API**
- Add your **OpenAI API Key** to the transcription and extraction nodes
- Connect your **Google Sheets credentials**
- Update the Google Sheet ID and the worksheet name in all Spreadsheet nodes
- Adjust the AI prompts depending on your warehouse location naming conventions

*Submitted: 20 November 2025*
*Template designed with n8n version 1.116.2*
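Step 5's extraction of `location_id` and `quantity` from a transcript like "location A12, 35 units" can be approximated with a regex. This is only an illustrative stand-in: the actual workflow uses an AI node for structured extraction, which tolerates far messier speech:

```javascript
// Pull location_id and quantity out of a transcribed voice message;
// falls back to the "please repeat" branch when nothing matches.
function extractCount(transcript) {
  const m = transcript.match(/location\s+([A-Za-z0-9-]+).*?(\d+)\s*units?/i);
  if (!m) return { ok: false, retry: "Could not understand, please repeat." };
  return { ok: true, location_id: m[1].toUpperCase(), quantity: Number(m[2]) };
}
```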
Automate LINE & Google account linking with OAuth2 authentication
# LINE x Google Account Linking Workflow This workflow automates the process of linking a new user on your LINE Official Account to their Google Account. When a user adds your LINE account as a friend, this workflow automatically sends them a message with a unique authentication link. After the user approves the connection, their Google profile information is fetched, and a confirmation message is sent, completing the loop. ## Prerequisites Before you begin, ensure you have the following: * **An n8n instance:** Either on n8n.cloud or a self-hosted environment. * **A LINE Developers Account:** * A Messaging API channel. * Your Channel Access Token (long-lived). * **A Google Cloud Platform (GCP) Account:** * A configured OAuth consent screen. * An OAuth 2.0 Client ID and Client Secret. ## Setup Instructions Follow these steps to configure the workflow. ### Step 1: Configure LINE Developers Console 1. Log in to the [LINE Developers Console](https://developers.line.biz/console/). 2. Navigate to your provider and select your **Messaging API channel**. 3. Go to the **Messaging API** tab. 4. Issue a **Channel access token (long-lived)** and copy the value. 5. In the **Webhook URL** field, paste the Test URL from the `LINE Webhook` node in your n8n workflow. 6. Enable **Use webhook**. ### Step 2: Configure Google Cloud Platform (GCP) 1. Log in to the [Google Cloud Console](https://console.cloud.google.com/) and select your project. 2. Navigate to **APIs & Services** > **OAuth consent screen**. Configure it if you haven't already, ensuring you add your own Google account as a test user. 3. Go to **APIs & Services** > **Credentials**. 4. Click **+ CREATE CREDENTIALS** and select **OAuth 2.0 Client ID**. 5. For **Application type**, choose **Web application**. 6. Under **Authorized redirect URIs**, click **+ ADD URI** and paste the Test URL from the `Google Auth Callback` node in your n8n workflow. 7. Click **Create**. Copy your **Client ID** and **Client Secret**. 
### Step 3: Configure the n8n Workflow Import the workflow JSON into your n8n canvas and follow these steps to set it up. #### 1. Configure n8n Credentials First, set up the credentials that the HTTP Request nodes will use. * **For the LINE Messaging API:** 1. In n8n, go to **Credentials** > **Add credential**. 2. Search for and select **Header Auth**. 3. Set `Name` to `Authorization`. 4. Set `Value` to `Bearer YOUR_LINE_CHANNEL_ACCESS_TOKEN` (replace with the token from Step 1). 5. Save the credential with a memorable name like "LINE Messaging API Auth". * **For the Google API (Dynamic Token):** 1. Create another **Header Auth** credential. 2. Set `Name` to `Authorization`. 3. For `Value`, enter a placeholder like `Bearer dummy_token`. This will be replaced dynamically by the workflow. 4. Save the credential with a name like "Google API Dynamic Token". #### 2. Update Node Parameters Now, update the parameters in the following nodes: * **`Create Google Auth URL` node:** * In the `value` field, replace `YOUR_N8N_WEBHOOK_URL_FOR_GOOGLE` with the webhook URL of the `Google Auth Callback` node. * Replace `YOUR_GOOGLE_CLIENT_ID` with the Client ID from GCP (Step 2). * **`Get Google Access Token` node:** * In the `jsonBody` field, replace `YOUR_GOOGLE_CLIENT_ID`, `YOUR_GOOGLE_CLIENT_SECRET`, and `YOUR_N8N_WEBHOOK_URL_FOR_GOOGLE` with your actual GCP credentials and callback URL. * **`Get Google User Info` node:** * For **Authentication**, select `Header Auth`. * For **Credential for Header Auth**, choose the "Google API Dynamic Token" credential you created. * **Important:** Click **Add Option** > **Header To Append**. Set the `Name` to `Authorization` and the `Value` to the following expression to use the token from the previous step: `Bearer {{ $node["Get Google Access Token"].json["access_token"] }}`. * **`Send Auth Link to LINE` & `Send Completion Message to LINE` nodes:** * For **Credential for Header Auth**, choose the "LINE Messaging API Auth" credential. 
* **`Redirect to LINE OA` node:** * In the `redirectURL` parameter, replace `YOUR_LINE_OFFICIAL_ACCOUNT_ID` with your LINE OA's ID (e.g., `@123abcde`). ### Step 4: Activate and Test 1. Save the workflow by clicking the **Save** button. 2. **Activate** the workflow using the toggle in the top-right corner. 3. On your phone, add your LINE Official Account as a friend. You should receive a message with a link. 4. Follow the link to authorize with your Google account. After successful authorization, you should receive a completion message in LINE and be redirected. > **Note:** When you are ready for production, remember to replace the "Test" webhook URLs in the LINE and GCP consoles with the "Production" URLs from the n8n webhook nodes.
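For reference, the authorization link that the `Create Google Auth URL` node assembles typically has the following shape. This is a minimal sketch, assuming the standard Google OAuth 2.0 web-server flow with a `state` parameter carrying the LINE user ID; the exact scopes and parameters in the imported workflow may differ.

```javascript
// Sketch: build a Google OAuth 2.0 authorization URL.
// clientId, redirectUri, and lineUserId are placeholders you supply.
function buildGoogleAuthUrl(clientId, redirectUri, lineUserId) {
  const params = new URLSearchParams({
    client_id: clientId,
    redirect_uri: redirectUri,       // must match the Authorized redirect URI in GCP
    response_type: "code",
    scope: "openid email profile",
    access_type: "offline",
    state: lineUserId,               // lets the callback know which LINE user to reply to
  });
  return `https://accounts.google.com/o/oauth2/v2/auth?${params.toString()}`;
}

const url = buildGoogleAuthUrl(
  "YOUR_GOOGLE_CLIENT_ID",
  "https://your-n8n.example/webhook/google-auth-callback",
  "U1234567890"
);
console.log(url);
```

Passing the LINE user ID through `state` is what lets the `Google Auth Callback` node pair the returned Google profile with the right LINE conversation.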
Predict F1 Race Winners with OpenAI GPT-4o, Historical Data & Slack Alerts
## How It Works Every day at 8 AM, the workflow automatically retrieves the latest F1 data—including driver standings, qualifying results, race schedules, and circuit information. All sources are merged into a unified dataset, and driver performance metrics are computed using historical trends. An AI agent, enhanced with vectorized race history, evaluates patterns and generates race-winner predictions. When the confidence score exceeds the defined threshold, the system pushes an automated Slack alert and records the full analysis in the database and Google Sheets. ## Setup Steps 1. Update the workflow configuration with: `newsApiUrl`, `weatherApiUrl`, `historicalYears`, and `confidenceThreshold`. 2. Connect PostgreSQL using the schema: **prediction_date, predicted_winner, confidence_score, prediction_source, data_version, full_analysis**. 3. Provide the Slack channel ID for sending high-confidence alerts. 4. Specify the Google Sheets document ID and sheet name for prediction logging. 5. Test connectivity to the Ergast API (no authentication required). ## Prerequisites OpenAI account (GPT-4o access), Slack workspace admin access, PostgreSQL instance, Google Sheets account, n8n instance with LangChain community nodes enabled. ## Customization Extend by adding constructor predictions (modify AI prompt). Integrate Discord or Teams instead of Slack. ## Benefits Saves time by automating data collection, improves accuracy using multiple performance metrics and historical patterns.
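The confidence gate described above can be sketched as a small Code-node check. This is an illustrative sketch only: the field names follow the PostgreSQL schema listed in the setup steps, and the 0.75 threshold stands in for whatever `confidenceThreshold` you configure.

```javascript
// Sketch: decide whether a prediction should trigger a Slack alert.
// Field names mirror the schema above; the threshold value is an assumption.
function shouldAlert(prediction, confidenceThreshold = 0.75) {
  return prediction.confidence_score >= confidenceThreshold;
}

const prediction = {
  prediction_date: "2024-05-26",
  predicted_winner: "Max Verstappen",
  confidence_score: 0.82,
};
console.log(shouldAlert(prediction)); // true, since 0.82 >= 0.75
```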
Monitor Japan Flight Prices & Generate Booking Alerts with GPT-4o & Multi-Source Analysis
## How It Works Scheduled triggers run automated price checks across multiple travel data sources. The collected data is aggregated, validated, and processed through an AI analysis layer that compares trends, detects anomalies, and evaluates multi-criteria factors such as price movement, seasonality, and route demand. The system then routes results into booking preparation, report generation, and notification modules. When target price conditions are met, alerts are sent and records are updated accordingly. ## Setup Steps 1. Connect Google Flights and Skyscanner APIs using authenticated tokens. 2. Configure the OpenAI API for enhanced analysis and multi-factor evaluation. 3. Link Google Sheets for storing historical price data. 4. Add WordPress site credentials to enable automated report publishing. 5. Enable email notifications for price alerts and updates. 6. Adjust the scheduler frequency within the **Schedule Price Check** node to control how often the workflow runs. ## Prerequisites Google Flights API, Skyscanner API, flight booking service credentials, OpenAI API key, Google Sheets access, WordPress admin account, email service configured. ## Use Cases Travel agencies automating client alerts for price drops. Corporate travel managers monitoring bulk bookings. ## Customization Modify price thresholds in Multi-Criteria Decision node. Add airline or destination filters in search parameters. ## Benefits Eliminates manual price monitoring. Reduces booking delays through automation.
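The multi-criteria evaluation described above might reduce to something like this inside a Code node. The factor names and weights below are illustrative assumptions, not the template's actual logic; the real Multi-Criteria Decision node also weighs seasonality and route demand.

```javascript
// Sketch of a multi-criteria price decision. Weights and field names are assumptions.
function evaluateFare(fare, targetPrice) {
  const priceOk = fare.price <= targetPrice;          // target price condition met
  const trendDown = fare.sevenDayAvg > fare.price;    // price below its recent average
  const score = (priceOk ? 0.6 : 0) + (trendDown ? 0.4 : 0);
  return { alert: score >= 0.6, score };
}

const result = evaluateFare({ price: 780, sevenDayAvg: 850 }, 800);
console.log(result); // { alert: true, score: 1 }
```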
Automated press pass verification & badge creation with QR codes & multi-channel distribution
# 🎫 Verified Press Pass Generator for Media Events **Automate press credential verification and badge generation for journalists covering your events** --- ## 📝 Description Streamline your event media management with this comprehensive press pass automation. When journalists apply for credentials, this workflow instantly validates their identity, verifies their media affiliation, generates professional digital badges with QR codes, and delivers everything via email—all within seconds. Perfect for conferences, product launches, trade shows, corporate events, and any occasion requiring verified media access. --- ## ✨ Key Features ### 🔐 **Advanced Email Verification** - Real-time email validation using VerifiEmail API - Checks RFC compliance, MX records, and domain reputation - Detects disposable email addresses and spoofed domains - Confirms journalist works for legitimate media organization ### 🎨 **Professional Badge Design** - Auto-generates branded digital press passes - Includes journalist photo, name, media outlet, and credentials - Embedded QR code for contactless event entry - Customizable colors, fonts, and event branding - 400×600px portrait format optimized for mobile display ### 📧 **Automated Communication** - Beautiful HTML email with embedded badge preview - Download links for PNG and PDF versions - Clear instructions for event check-in - Professional event branding throughout ### 📊 **Multi-Platform Logging** - Google Sheets backup with timestamp logs - Slack notifications for organizer oversight - Complete audit trail for compliance ### ⚡ **Lightning Fast Processing** - Average execution time: 5-10 seconds - Real-time webhook response with confirmation - Scalable to hundreds of applications per hour - Error handling with graceful fallbacks --- ## 🎯 Use Cases ### **Event Types:** - Tech conferences and summits - Product launch events - Trade shows and exhibitions - Political rallies and press conferences - Sports events and tournaments - Film festivals 
and premieres - Corporate announcements - Award ceremonies --- ## 🔧 What You Need ### **Required Services:** 1. **n8n** (Cloud or Self-hosted) 2. **VerifiEmail API** ([Get API Key](https://verifi.email)) - Email verification 3. **HTMLCSSToImage API** ([Get API Key](https://htmlcsstoimg.com)) - Badge generation 4. **Gmail Account** (OAuth) - Email delivery 5. **Slack Workspace** - Team notifications 6. **Google Sheets** - Backup logging --- ## 📋 How It Works ### **Step-by-Step Process:** **1. Application Submission** Journalist fills out form on your event website (name, email, media outlet, photo, phone) **2. Data Validation** Webhook receives application and checks for required fields (name, email, photo) **3. Email Verification** VerifiEmail API validates email domain, checks MX records, and confirms media affiliation **4. Credential Generation** - Generates unique press ID (PRESS-XXX-timestamp) - Creates QR code linking to verification portal - Sets 30-day validity period **5. Badge Creation** HTMLCSSToImage API renders professional badge with: - Circular profile photo - Name and media outlet - Press ID in styled container - Scannable QR code - Event name and validity dates - "VERIFIED" indicator **6. Distribution** - Sends HTML email with badge preview and download link - Posts notification to Slack channel - Backs up to Google Sheets - Returns success response to webhook **7. Event Check-In** Security scans QR code at event entrance, verifies credentials instantly --- ## 🚀 Setup Instructions ### **Quick Start (15 minutes):** **1. Import Workflow** - Download the JSON file - In n8n: Click Workflows → Import from File - Upload the JSON and open the workflow **2. Configure Webhook** - Activate the workflow - Copy the webhook URL from the Webhook Trigger node - Add this URL to your website form's action attribute **3. 
Add API Credentials** - **VerifiEmail:** Create credential with API key from verifi.email dashboard - **HTMLCSSToImage:** Add User ID and API Key from htmlcsstoimg.com - **Gmail:** Connect via OAuth (click "Sign in with Google") - **Slack:** Connect via OAuth and select notification channel - **Google Sheets:** Connect via OAuth **4. Setup Google Sheets** Create a new sheet named "Press Pass Logs" with these column headers: ``` Timestamp | Press ID | Name | Email | Phone | Media Outlet | Email Domain | Verification Status | Event Name | Issued Date | Valid Until | Badge Image URL | QR Code URL | Verification URL | Photo URL | Execution Mode ``` **5. Customize Badge Design** - Open the "HTML/CSS to Image" node - Edit the HTML in `html_content` field - Change gradient colors: Replace `#667eea` and `#764ba2` with your brand colors - Update event name default value - Modify font sizes, spacing, or layout as needed **6. Update Email Content** - Open "Send Press Pass Email" node - Customize email text, support contact info - Update company/event branding - Modify footer with your details **7. Configure Slack Channel** - Open "Notify Organizers (Slack)" node - Select your preferred notification channel - Customize notification message format **8. Test the Workflow** Send a test POST request using Postman or cURL: ```bash curl -X POST https://your-n8n-url/webhook/press-application \ -H "Content-Type: application/json" \ -d '{ "name": "Jane Smith", "email": "[email protected]", "media_outlet": "BBC News", "photo_url": "https://randomuser.me/api/portraits/women/50.jpg", "phone": "+44-1234567890", "event_name": "Tech Summit 2025" }' ``` **9. 
Go Live** - Verify test execution completed successfully - Check email received with badge - Activate workflow for production use --- ## 🎨 Customization Options ### **Badge Design:** - **Colors:** Change gradient from purple (`#667eea`, `#764ba2`) to your brand colors - **Fonts:** Swap Google Font from Poppins to any available font - **Logo:** Add event logo in header section - **Size:** Adjust viewport_width and viewport_height for different dimensions - **Layout:** Modify HTML structure for custom badge designs ### **Email Templates:** - **Branding:** Update colors, fonts, and styling in HTML email - **Content:** Customize greeting, instructions, and footer - **Attachments:** Add PDF version or additional documents - **Language:** Translate all text to your language --- ## 🔒 Security & Privacy ### **Data Protection:** - ✅ Email verification prevents fake submissions - ✅ QR codes use unique, non-guessable IDs - ✅ HTTPS webhook for encrypted transmission - ✅ No sensitive data stored in workflow variables - ✅ Audit trail for compliance requirements ### **Best Practices:** - Use environment variables for API keys - Enable webhook authentication (Basic Auth or API key) - Implement rate limiting on webhook endpoint - Regularly rotate API credentials - Set up backup systems for critical data --- ## 🛠️ Troubleshooting ### **Common Issues:** **Issue:** "Webhook not receiving data" **Solution:** Ensure workflow is activated and webhook URL is correct in form action **Issue:** "Email verification fails for valid domains" **Solution:** Check VerifiEmail API credit balance and credential configuration **Issue:** "Badge image not generating" **Solution:** Verify HTMLCSSToImage API key is correct and has sufficient credits **Issue:** "Gmail not sending" **Solution:** Reconnect Gmail OAuth credential and check sending limits **Issue:** "QR code not loading in badge" **Solution:** Ensure QR code URL is properly encoded and publicly accessible --- ## 📈 Performance Metrics - 
**Average execution time:** 5-10 seconds - **Success rate:** 98%+ (with valid inputs) - **Concurrent capacity:** 50+ requests/minute - **API reliability:** 99.9% uptime (dependent on services) - **Badge generation:** <2 seconds - **Email delivery:** <3 seconds --- ## 🏷️ Tags `event-management` `press-pass` `credential-verification` `badge-generation` `email-automation` `qr-code` `media-relations` `event-technology` `htmlcsstoimage` `verifi-email` `gmail` `slack` `google-sheets` `webhook` `automation` `workflow` `conference` `journalism` `press-credentials` --- ## 📄 License This workflow template is provided as-is for use with n8n. Customize freely for your organization's needs. ---
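For orientation, the credential-generation step (unique press ID plus QR payload) could be sketched as below. Assumptions: the `PRESS-XXX-timestamp` format from the description, a hypothetical verification portal URL, and a generic public QR image endpoint; the workflow's own node may build these differently.

```javascript
// Sketch: generate a press ID, verification URL, and QR code URL for the badge.
// The portal domain and QR service below are illustrative assumptions.
function generateCredentials(name, now = new Date()) {
  const initials = name.split(/\s+/).map(w => w[0].toUpperCase()).join("").slice(0, 3);
  const pressId = `PRESS-${initials}-${now.getTime()}`;              // PRESS-XXX-timestamp
  const verifyUrl = `https://example.com/verify/${pressId}`;        // hypothetical portal
  const qrUrl = `https://api.qrserver.com/v1/create-qr-code/?size=200x200&data=${encodeURIComponent(verifyUrl)}`;
  const validUntil = new Date(now.getTime() + 30 * 24 * 60 * 60 * 1000); // 30-day validity
  return { pressId, verifyUrl, qrUrl, validUntil };
}

const creds = generateCredentials("Jane Smith", new Date("2025-01-01T00:00:00Z"));
console.log(creds.pressId); // PRESS-JS-1735689600000
```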
Sync Amazon Luna Prime games to Google Sheets with automatic updates
## Amazon Luna Prime Games Catalog Tracker (Auto-Sync to Google Sheets) Automatically fetch, organize, and maintain an updated catalog of **Amazon Luna – Included with Prime** games. This workflow regularly queries Amazon’s official Luna endpoint, extracts complete metadata, and syncs everything into Google Sheets without duplicates. Ideal for: * tracking monthly **Prime Luna rotations** * keeping a personal archive of games * monitoring **new games appearing on Amazon Games / Prime Gaming**, so you can instantly play titles you’re interested in * building dashboards or gaming databases * powering notification systems (Discord, Telegram, email, etc.) * * * ### **Overview** Amazon Luna’s “Included with Prime” lineup changes frequently, with new games added and old ones removed. Instead of checking manually, this n8n template fully automates the process: * Fetches the latest list from Amazon’s backend * Extracts detailed metadata from the response * Syncs the data into Google Sheets * Avoids duplicates by updating existing rows * Supports all major Amazon regions Once configured, it runs automatically—keeping your game catalog correct, clean, and always up to date. * * * #### 🛠️ **How the workflow works** **1. Scheduled Trigger** Starts the workflow on a set schedule (default: every 5 days at 3:00 PM). You can change both frequency and time freely. **2. HTTP Request to Amazon Luna** Calls Amazon Luna’s regional endpoint and retrieves the full **“Included with Prime”** catalog. **3. JavaScript Code Node – Data Extraction** Parses the JSON response and extracts structured fields: * Title * Genres * Release Year * ASIN * Image URLs * Additional metadata The result is a clean, ready-to-use dataset. **4. Google Sheets – Insert or Update Rows** Each game is written into the selected Google Sheet: * Existing games get updated * New games are appended The **Title** acts as the unique identifier to prevent duplicates. 
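As an illustration of step 3, the extraction Code node might look like this. The `items` shape below is an assumed example, not Amazon's actual response schema; inspect the real payload from the HTTP Request node and adjust the paths accordingly.

```javascript
// Sketch of a Code node that flattens a catalog response into sheet rows.
// The response shape here is an illustrative assumption.
const response = {
  items: [
    { title: "Fortnite", genres: ["Action"], releaseYear: 2017, asin: "B0EXAMPLE1", imageUrl: "https://example.com/a.jpg" },
    { title: "Trackmania", genres: ["Racing"], releaseYear: 2020, asin: "B0EXAMPLE2", imageUrl: "https://example.com/b.jpg" },
  ],
};

// One n8n item per game, ready for the Google Sheets insert-or-update node.
const rows = response.items.map(game => ({
  json: {
    Title: game.title,                    // used as the duplicate-prevention key
    Genres: (game.genres || []).join(", "),
    ReleaseYear: game.releaseYear,
    ASIN: game.asin,
    ImageURL: game.imageUrl,
  },
}));

console.log(rows.length); // 2
// In n8n, the Code node would end with: return rows;
```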
* * * ## ⚙️ **Configuration Parameters** | Parameter | Description | Recommended values | | --- | --- | --- | | **x-amz-locale** | Language + region | `it_IT` 🇮🇹 · `en_US` 🇺🇸 · `de_DE` 🇩🇪 · `fr_FR` 🇫🇷 · `es_ES` 🇪🇸 · `en_GB` 🇬🇧 · `ja_JP` 🇯🇵 · `en_CA` 🇨🇦 | | **x-amz-marketplace-id** | Marketplace backend ID | `APJ6JRA9NG5V4` 🇮🇹 · `ATVPDKIKX0DER` 🇺🇸 · `A1PA6795UKMFR9` 🇩🇪 · `A13V1IB3VIYZZH` 🇫🇷 · `A1RKKUPIHCS9HS` 🇪🇸 · `A1F83G8C2ARO7P` 🇬🇧 · `A1VC38T7YXB528` 🇯🇵 · `A2EUQ1WTGCTBG2` 🇨🇦 | | **Accept-Language** | Response language | Example: `it-IT,it;q=0.9,en;q=0.8` | | **User-Agent** | Browser-like request | Default or updated UA | | **Trigger interval** | Refresh frequency | Every 5 days at 3:00 PM (modifiable) | | **Google Sheet** | Storage output | Select your file + sheet | You can adapt these headers to fetch data from any supported country. * * * 💡 **Tips & Customization** #### 🌍 Regional catalogs Duplicate the HTTP Request + Code + Sheet block to track multiple countries (US, DE, JP, UK…). #### 🧹 No duplicates The workflow updates rows intelligently, ensuring a clean catalog even after many runs. #### 🗂️ Move data anywhere Send the output to: * Airtable * Databases (MySQL, Postgres, MongoDB…) * Notion * CSV * REST APIs * BI dashboards #### 🔔 Add notifications (Discord, Telegram, Email, etc.) You can pair this template with a notification workflow. When used with **Discord**, the notification message can include: * game title * description or metadata * **the game’s image**, automatically downloaded and attached This makes notifications visually informative and perfect for tracking new Prime titles. * * * ### 🔒 **Important Notes** * All retrieved data belongs to **Amazon**. * The workflow is intended for **personal, testing, or educational use only**. * Do not republish or redistribute collected data without permission.
Track Companies House filing deadlines with Google Sheets, Gmail & interactive alerts
## 🎯 Accounting Alerts Automation **Purpose:** Automatically track Companies House filing deadlines for UK accounting firms and prevent costly penalties (£150-£1,500 per missed deadline). **How it works:** - Daily automated checks pull live deadline data from Companies House API - Color-coded email alerts (Red/Orange/Yellow/Green) prioritize urgent deadlines - Interactive "Yes/No" buttons let recipients confirm completion status - All data syncs back to Google Sheets for complete audit trail **Value:** Saves 2-3 hours/week per firm while eliminating manual tracking errors. ## ⚙️ Daily Deadline Check & Alert System **Runs:** Every weekday at 5 PM (Mon-Fri) **What happens:** 1. **Read Company Database** - Fetches all tracked companies from Google Sheets 2. **Get Company Data** - Pulls live filing deadlines from Companies House API for each company 3. **Update Due Dates** - Syncs latest deadline data back to the tracking sheet 4. **Build Interactive Email** - Creates HTML email with: - Color-coded urgency indicators (days remaining) - Sortable table by due date - Clickable Yes/No confirmation buttons for each company 5. **Send via Gmail** - Delivers consolidated report to accounting team **Why automated:** Manual deadline checking across 10-50+ companies is time-consuming and error-prone. This ensures nothing falls through the cracks. ## ✅ Email Response Handler (Webhook Flow) **Triggered when:** Recipient clicks "Yes" or "No" button in the alert email **What happens:** 1. **Webhook** - Receives confirmation status (company_number, company_name, yes/no) 2. **Process Data** - Extracts response details from the webhook payload 3. **Update Sheet** - Records confirmation status in Google Sheets with timestamp 4. **Confirmation Page** - Displays success message to user **Why this matters:** Provides instant feedback to the user and creates an audit trail of who confirmed what and when. 
No separate tracking system needed—everything updates automatically in the same spreadsheet. **Result:** Accountability without administrative burden. ## 📋 Setup Requirements **Google Sheets Database Structure:** Create a sheet with these columns: - company_number (manually entered) - company_name (manually entered) - accounts_due (auto-updated) - confirmation_due (auto-updated) - confirmation_submitted (updated via email clicks) - last_updated (auto-timestamp) **Required Credentials:** - Google Sheets OAuth (for reading/writing data) - Companies House API key (free from api.company-information.service.gov.uk) - Gmail OAuth (for sending alerts) **Webhook Configuration:** Update webhook URL in "Build Interactive Email" node to match your n8n instance. **Time to Setup:** ~15 minutes once credentials are configured.
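The Red/Orange/Yellow/Green urgency logic can be sketched as follows. The day thresholds here are illustrative assumptions; adjust them to your firm's policy before relying on the colors.

```javascript
// Sketch: map days remaining until a filing deadline to an urgency color.
// Threshold values are illustrative assumptions.
function urgencyColor(dueDate, today = new Date()) {
  const msPerDay = 24 * 60 * 60 * 1000;
  const daysLeft = Math.ceil((new Date(dueDate) - today) / msPerDay);
  if (daysLeft <= 7) return "red";      // overdue or due within a week
  if (daysLeft <= 14) return "orange";
  if (daysLeft <= 30) return "yellow";
  return "green";
}

console.log(urgencyColor("2025-01-05", new Date("2025-01-01"))); // red
console.log(urgencyColor("2025-03-01", new Date("2025-01-01"))); // green
```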
Moderate your Discord server using GPT-5 & Google Sheets (Learning system)
# Discord AI Content Moderator with Learning System This n8n template demonstrates how to automatically moderate Discord messages using AI-powered content analysis that learns from your community standards. It continuously monitors your server, intelligently flags problematic content while allowing context-appropriate language, and provides a complete audit trail for all moderation actions. ## Use cases are many: Try moderating a forex trading community where enthusiasm runs high, protecting a gaming server from toxic behavior while keeping banter alive, or maintaining professional standards in a business Discord without being overly strict! ## Good to know * This workflow uses OpenAI's GPT-5 Mini model which incurs API costs per message analyzed (approximately $0.001-0.003 per moderation check depending on message volume) * The workflow runs every minute by default - adjust the Schedule Trigger interval based on your server activity and budget * Discord API rate limits apply - the batch processor includes 1.5-second delays between deletions to prevent rate limiting * You'll need a Google Sheet to store training examples - a template link is provided in the workflow notes * The AI analyzes context and intent, not just keywords - "I **cking love this community" won't be deleted, but "you guys are sh*t" will be * Deleted messages cannot be recovered from Discord - the admin notification channel preserves the content for review ## How it works * The Schedule Trigger activates every minute to check for new messages requiring moderation * We'll fetch training data from Google Sheets containing labeled examples of messages to delete (with reasons) and messages to keep * The workflow retrieves the last 10 messages from your specified Discord channel using the Discord API * A preparation node formats both the training examples and recent messages into a structured prompt with unique indices for each message * The AI Agent (powered by GPT-5 Mini) analyzes each message 
against your community standards, considering intent and context rather than just keywords * The AI returns a JSON array of message indices that violate guidelines (e.g., [0, 2, 5]) * A parsing node extracts these indices, validates them, removes duplicates, and maps them to actual Discord message objects * The batch processor loops through each flagged message one at a time to prevent API rate limiting and ensure proper error handling * Each message is deleted from Discord using the exact message ID * A 1.5-second wait prevents hitting Discord's rate limits between operations * Finally, an admin notification is posted to your designated admin channel with the deleted message's author, ID, and original content for audit purposes ## How to use * Replace the Discord Server ID, Moderated Channel ID, and Admin Channel ID in the "Edit Fields" node with your server's specific IDs * Create a copy of the provided Google Sheets template with columns: message_content, should_delete (YES/NO), and reason * Connect your Discord OAuth2 credentials (requires bot permissions for reading messages, deleting messages, and posting to channels) * Add your OpenAI API key to access GPT-5 Mini * Customize the AI Agent's system message to reflect your specific community standards and tone * Adjust the message fetch limit (default: 10) based on your server activity - higher limits cost more per run but catch more violations * Consider changing the Schedule Trigger from every minute to every 3-5 minutes if you have a smaller community ## Requirements * Discord OAuth2 credentials for bot authentication with message read, delete, and send permissions * Google Sheets API connection for accessing the training data knowledge base * OpenAI API key for GPT-5 Mini model access * A Google Sheet formatted with message examples, deletion labels, and reasoning * Discord Server ID, Channel IDs (moderated + admin) which you can get by enabling Developer Mode in Discord ## Customising this workflow * Try 
building an emoji-based feedback system where admins can react to notifications with ✅ (correct deletion) or ❌ (wrong deletion) to automatically update your training data * Add a severity scoring system that issues warnings for minor violations before deleting messages * Implement a user strike system that tracks repeat offenders and automatically applies temporary mutes or bans * Expand the AI prompt to categorize violations (spam, harassment, profanity, etc.) and route different types to different admin channels * Create a weekly digest that summarizes moderation statistics and trending violation types * Add support for monitoring multiple channels by duplicating the Discord message fetch nodes with different channel IDs * Integrate with a database instead of Google Sheets for faster lookups and more sophisticated training data management ## If you have questions Feel free to contact me here: [email protected] [email protected]
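The parsing step described in "How it works" (extract indices, validate, dedupe, map back to Discord message objects) might look like this as a simplified sketch; variable names are illustrative.

```javascript
// Sketch: turn the AI's reply (e.g. "[0, 2, 5]") into the Discord messages to delete.
function mapFlaggedMessages(aiReply, messages) {
  let indices;
  try {
    indices = JSON.parse(aiReply);
  } catch {
    return []; // unparseable reply: delete nothing rather than guess
  }
  if (!Array.isArray(indices)) return [];
  // Dedupe, then keep only integer indices that point at a fetched message.
  const unique = [...new Set(indices)].filter(
    i => Number.isInteger(i) && i >= 0 && i < messages.length
  );
  return unique.map(i => messages[i]);
}

const messages = [{ id: "a" }, { id: "b" }, { id: "c" }];
console.log(mapFlaggedMessages("[0, 2, 2, 9]", messages)); // [ { id: 'a' }, { id: 'c' } ]
```

Failing closed (returning an empty array on bad AI output) matters here because deleted Discord messages cannot be recovered.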
Domain availability monitor with Porkbun, Google Sheets & multi-channel alerts
This workflow automatically checks a list of desired domain names for availability every 30 minutes. Using the Porkbun API and Google Sheets, it instantly sends detailed notifications via Gmail and Discord the moment a domain becomes available, so you can secure it before anyone else. ## Why Use This Workflow? **Time Savings:** Eliminates hours of manual domain checking each week. Set your list once and let the automation monitor your targets 24/7. **Competitive Edge:** Gain a critical speed advantage in acquiring high-value or expiring domains the second they become available. **Scalability:** Effortlessly monitor hundreds of domains simultaneously without any extra effort or performance degradation. ## Ideal For - **Domain Investors:** Automatically track and snipe valuable expiring domains for their portfolio without constant manual checks. - **Marketing Agencies & SEO Specialists:** Secure brandable domains for new clients or build out private blog networks by catching domains as soon as they drop. - **Startups & Entrepreneurs:** Monitor desired brand names without the daily hassle, ensuring they get the perfect domain the moment it's available. ## How It Works 1. **Trigger:** A schedule trigger initiates the workflow every 30 minutes. 2. **Data Collection:** It retrieves a list of domains to monitor from a designated Google Sheet, specifically targeting rows marked as "no" for availability. 3. **Processing:** The workflow iterates through each domain one by one. 4. **Intelligence Layer:** It makes an API call to Porkbun to check the current availability of the domain. An IF node then determines if the domain is available (`avail` == `yes`). 5. **Output & Delivery:** If a domain is available, the workflow sends a rich HTML email via Gmail and a formatted message to a Discord channel, complete with pricing details and a direct registration link. 6. 
**Storage & Logging:** The Google Sheet is automatically updated to mark the domain as "available," preventing redundant notifications on future runs. ## Setup Guide ### Prerequisites | Requirement | Type | Purpose | |-------------|------|---------| | [n8n instance](https://n8n.partnerlinks.io/khmuhtadin) | Essential | Workflow execution platform | | [Porkbun Account](https://porkbun.com/) | Essential | API Access for domain checks | | Google Cloud Platform | Essential | Storing domain list (Sheets) & sending alerts (Gmail) | | Discord Server | Optional | Real-time channel notifications | ### Installation Steps 1. Import the JSON file to your [n8n instance](https://n8n.partnerlinks.io/khmuhtadin). 2. **Create a Google Sheet** with two columns: `Domain` (e.g., example.com) and `isAvailable` (e.g., no). 3. **Configure credentials:** - **Porkbun**: Log in to Porkbun, go to API Access, create a new key, and copy the API Key and Secret Key into the HTTP Request nodes. A "Validate API KEY" node is included for testing your credentials. - **Google Sheets/Gmail**: Authenticate your Google account for the Google Sheets and Gmail nodes. - **Discord**: Create a Discord Bot and add the credentials to the Discord node. 4. **Update environment-specific values:** - **Get Domains from Sheet**: Enter your Google Sheet ID and select the correct sheet name. - **Send Email Alert**: Set your recipient's email address in the "To" field. - **Send Discord Notification**: Select your desired Server and Channel ID. 5. **Test execution:** - Add a domain you know is available to your Google Sheet (with `isAvailable` set to "no"). Run the workflow manually to verify that all connections work and you receive notifications. ## Technical Details ### Core Nodes | Node | Purpose | Key Configuration | |------|---------|-------------------| | Schedule Trigger | Initiates the workflow on a recurring basis. | Set the desired interval (default: 30 minutes). 
| | Google Sheets | Reads the domain list and updates its status. | Sheet ID, Sheet Name, and column mapping. | | SplitInBatches | Processes each domain from the list individually. | Batch size is set to 1 to check domains sequentially. | | HTTP Request | Queries the Porkbun API for domain availability. | Porkbun API endpoint and credentials. | | IF | Routes the workflow based on the API response. | Checks if `response.avail` equals "yes". | | Gmail | Sends a detailed email alert for available domains. | Recipient email, subject, and HTML body. | | Discord | Sends a concise notification to a Discord channel. | Server ID, Channel ID, and message content. | | Wait | Prevents API rate-limiting. | Pauses for 10 seconds between checking domains. | ### Workflow Logic The workflow is triggered by a schedule, fetching a list of domains from a Google Sheet. It uses the `SplitInBatches` node to loop through each domain sequentially. For every domain, it calls the Porkbun API. An `IF` node checks the response; if available, it triggers notifications and updates the Google Sheet. A `Wait` node is crucial for respecting API rate limits, ensuring the workflow runs smoothly even with large domain lists. ## Customization Options **Basic Adjustments:** - **Check Frequency**: Modify the "Schedule Trigger" node to run more or less frequently. - **Notification Channels**: Remove the Gmail or Discord nodes, or add new ones like Slack or Telegram. - **Email Content**: Customize the HTML in the Gmail node to match your branding. **Advanced Enhancements:** - **Auto-Registration**: Extend the workflow to use Porkbun's domain registration API endpoint to automatically purchase the domain when it becomes available (use with caution). - **Advanced Filtering**: Add logic to only send notifications for domains with specific TLDs (.com, .io, etc.) or that are not marked as "premium." 
- **Tiered Notifications**: Set up different notification channels based on the perceived value of the domain, sending high-priority alerts via SMS for critical domains. ## Troubleshooting **Common Issues:** | Problem | Cause | Solution | |---------|-------|----------| | Workflow fails at HTTP Request node | Invalid Porkbun API credentials. | Use the separate "Validate API KEY" node to test your keys directly. Regenerate them if needed. | | No domains are processed | Google Sheets node configuration error or the sheet is empty. | Verify the Sheet ID is correct and that the `isAvailable` column contains "no" for the domains you want to check. | | Authentication errors | Google or Discord credentials have expired or lack permissions. | Re-authenticate the respective nodes in the n8n credentials panel. | --- **Created by:** [Khaisa Studio](https://khaisa.studio) **Category:** Monitoring **Tags:** Porkbun, Domain, Automation, Google Sheets, Notifications **Need custom workflows?** [Contact us](https://khaisa.studio/pages/contact) **Connect with the creator:** [Portfolio](https://khmuhtadin.com) • [Workflows](https://khaisa.studio/products/) • [LinkedIn](https://www.linkedin.com/in/khmuhtadin/) • [Medium](https://medium.com/@khaisastudio) • [Threads](https://www.threads.com/@khmuhtadin)
Extract TikTok usernames from any video or creator link format
## What's the problem? Imagine you want to **automate a task** where, given a TikTok video link, **you must retrieve the username of the creator of that video**. Many people assume it's enough to read the "@" part of the link, but that isn't always the case. TikTok's iOS and Android apps use **specific link formats** that are easier to share with others but make retrieving the creator much harder. ## Our solution: **To solve this problem, this simple workflow** makes an HTTP request that resolves the short link on the mobile app's subdomain _vm.tiktok.com_ into the original video link hosted on _www.tiktok.com_. We can then strip the link's query attributes and extract the handle correctly. ## Good things to know: **Note that we extract the username** (and not the profile's nickname) **without the "@"**. Once we have the username, we can access the creator's profile from then on using _"https://www.tiktok.com/@{{ $json.username }}"_.
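The redirect-follow and handle extraction can be sketched outside n8n like this, using Node's built-in `fetch` (which follows redirects by default). The short link is a placeholder, and real requests to TikTok may need browser-like headers.

```javascript
// Sketch: resolve a vm.tiktok.com short link and extract the creator's username.
async function getTikTokUsername(shortUrl) {
  // fetch follows redirects by default; res.url holds the final www.tiktok.com URL
  const res = await fetch(shortUrl, { redirect: "follow" });
  return extractUsername(res.url);
}

// Pull the handle (without the "@") from a canonical video URL, ignoring query attributes.
function extractUsername(fullUrl) {
  const match = new URL(fullUrl).pathname.match(/^\/@([^/]+)/);
  return match ? match[1] : null;
}

console.log(extractUsername("https://www.tiktok.com/@somecreator/video/123?is_from_webapp=1"));
// "somecreator"
```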
Create Dynamic Seating & Venue Layout Plans with Google Sheets
Enhance event logistics with this automated n8n workflow. Triggered by seating requests, it fetches attendee data and venue templates from Google Sheets, calculates totals, and optimizes seating layouts. The workflow generates detailed recommendations, splits individual assignments, and sends alerts, ensuring efficient venue planning and real-time updates. 🎪📋

### Key Features

- Optimizes seating arrangements based on attendee data and event type.
- Generates venue layouts with visual and statistical insights.
- Provides real-time alerts with comprehensive seating plans.
- Logs detailed assignments and layouts in Google Sheets.

### Workflow Process

- The **Webhook Trigger** node initiates the workflow upon receiving venue requirements and attendee data via webhook.
- **Validate Request Data** ensures the incoming data is complete and accurate.
- **Fetch Attendee Data** retrieves attendee information, including groups, accessibility needs, and VIP preferences, from Google Sheets.
- **Fetch Venue Templates** reads venue layout templates from Google Sheets.
- **Calculate Totals** aggregates attendee data and venue constraints for optimal planning.
- **Combine All Data** merges attendee and venue data for analysis.
- **AI Optimization** calculates optimal seating based on venue dimensions, attendee groups, accessibility needs, VIP placement, and aisle placement.
- **Optimize Seating Layout** refines the seating plan for efficiency.
- **Format Recommendations** structures the seating plan with a visual layout map, seat assignments, statistics & metrics, and optimization tips.
- **Split Seat Assignments** divides the plan into individual seat assignments.
- **Send Response** returns the complete seating plan with a visual layout map, seat assignment list, statistics & recommendations, and an export-ready format.
- **Send Alert** notifies organizers with the finalized plan details.
- **Update Sheets** saves the master plan summary, individual seat assignments, and layout specifications to Google Sheets.
- **Save Individual Assignments** appends or updates individual seat assignments in Google Sheets.

### Setup Instructions

- Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
- Set up the Webhook Trigger with your event management system's API credentials.
- Configure the AI Optimization node with a suitable algorithm or model.
- Test the workflow by sending sample seating requests and verifying layouts.
- Adjust optimization parameters as needed for specific venue or event requirements.

### Prerequisites

- Google Sheets OAuth2 credentials
- Webhook integration with the event management system
- Structured attendee and venue data in a Google Sheet

**Google Sheet Structure:**

1. Attendee Data Sheet with columns:
   - Name
   - Group
   - Accessibility Needs
   - VIP Status
   - Preferences
   - Updated At
2. Venue Templates Sheet with columns:
   - Venue Name
   - Capacity
   - Dimensions
   - Layout Template
   - Updated At

### Modification Options

- Customize the **Validate Request Data** node to include additional validation rules.
- Adjust the **AI Optimization** node to prioritize specific criteria (e.g., proximity, accessibility).
- Modify the **Format Recommendations** node to include custom visual formats.
- Integrate with venue management tools for live layout updates.
- Set custom alert triggers in the **Send Alert** node.

**Discover more workflows – [Get in touch with us](https://www.oneclickitsolution.com/contact-us/)**
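The greedy placement the AI Optimization step describes (VIPs in front rows, aisle seats for accessibility needs, groups kept together) can be sketched in plain JavaScript. This is a hypothetical illustration rather than the workflow's actual node code; the field names (`vip`, `group`, `accessible`) and the ordering heuristics are assumptions:

```javascript
// Assign attendees to (row, seat) positions: VIPs first so they land in
// the front rows, then grouped contiguously; attendees with accessibility
// needs are bumped to the next row's aisle seat (seat 1).
function assignSeats(attendees, rows, seatsPerRow) {
  const ordered = [...attendees].sort(
    (a, b) => (b.vip - a.vip) || a.group.localeCompare(b.group)
  );
  const plan = [];
  let row = 0, seat = 0;
  for (const person of ordered) {
    if (person.accessible && seat !== 0) {
      row += 1; // skip ahead to the next row's aisle seat
      seat = 0;
    }
    plan.push({ name: person.name, row: row + 1, seat: seat + 1 });
    seat += 1;
    if (seat === seatsPerRow) { seat = 0; row += 1; }
  }
  // Return null when the venue can't hold the assignments.
  return plan.some((p) => p.row > rows) ? null : plan;
}
```

A production layout engine would also weigh venue dimensions and sight lines, which is where the workflow delegates to an AI model instead of a fixed heuristic.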
Generate event speaker recommendations with Claude AI and Google Sheets
Simplify event planning with this automated n8n workflow. Triggered by incoming requests, it fetches speaker and audience data from Google Sheets, analyzes profiles and preferences, and generates optimized session recommendations. The workflow delivers formatted voice responses and updates tracking data, ensuring organizers receive real-time, tailored suggestions. 🎙️📊

### Key Features

- Real-time analysis of speaker and audience data for personalized recommendations.
- Generates optimized session lineups based on profiles and preferences.
- Delivers responses via voice agent for a seamless experience.
- Logs a detailed recommendation history in Google Sheets.

### Workflow Process

- The **Webhook Trigger** node initiates the workflow upon receiving voice agent or external system requests.
- **Parse Voice Request** processes incoming voice data into actionable parameters.
- **Fetch Database** retrieves speaker ratings, past sessions, and audience ratings from Google Sheets.
- **Calculate & Analyze** combines voice request data with speaker profiles and audience insights for comprehensive matching.
- **AI Optimization Engine** analyzes speaker-audience fit and recommends optimal session lineups.
- **Format Recommendations** structures the recommendations for the voice agent response.
- **Voice Agent Response** returns formatted recommendations to the user with a natural-language summary and structured data.
- **Update Tracking Sheet** saves recommendation history and analytics to Google Sheets.
- If errors occur, the **Check for Errors** node branches to:
  - **Format Error Response** prepares an error message.
  - **Send Error Response** delivers the error notification.

### Setup Instructions

- Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
- Set up the Webhook Trigger with your voice agent or external system's API credentials.
- Configure the AI Optimization Engine node with a suitable language model (e.g., Anthropic Chat Model).
- Test the workflow by sending sample voice requests and verifying recommendations.
- Adjust analysis parameters as needed for specific event requirements.

### Prerequisites

- Google Sheets OAuth2 credentials
- Voice agent API or integration service
- AI/LLM service for optimization (e.g., Anthropic)
- Structured speaker and audience data in a Google Sheet

**Google Sheet Structure:**

1. Create a sheet with columns:
   - Speaker Name
   - Rating
   - Past Sessions
   - Audience Rating
   - Preferences
   - Updated At

### Modification Options

- Customize the **Calculate & Analyze** node to include additional matching criteria (e.g., topic expertise).
- Adjust the **AI Optimization Engine** to prioritize specific session formats or durations.
- Modify voice response templates in the **Voice Agent Response** node with branded phrasing.
- Integrate with event management tools (e.g., Eventbrite) for live data feeds.
- Set custom error handling rules in the **Check for Errors** node.

**Discover more workflows – [Get in touch with us](https://www.oneclickitsolution.com/contact-us/)**
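The speaker-audience matching the Calculate & Analyze step performs can be approximated with a simple weighted score. This is a hypothetical sketch, not the workflow's actual logic: the field names (`rating`, `audienceRating`, `topics`, `interests`) and the 0.4/0.3/0.3 weights are assumptions to illustrate the idea the AI Optimization Engine refines:

```javascript
// Score one speaker against the audience profile: blend the speaker's
// own rating, how past audiences rated them, and topic overlap with
// the audience's stated interests.
function scoreSpeaker(speaker, audience) {
  const topicOverlap = speaker.topics.filter(
    (t) => audience.interests.includes(t)
  ).length;
  return 0.4 * speaker.rating + 0.3 * speaker.audienceRating + 0.3 * topicOverlap;
}

// Return the top-N speaker names, best match first.
function recommend(speakers, audience, topN = 3) {
  return [...speakers]
    .sort((a, b) => scoreSpeaker(b, audience) - scoreSpeaker(a, audience))
    .slice(0, topN)
    .map((s) => s.name);
}
```

Swapping in an LLM, as the workflow does, lets the ranking account for softer criteria (session format, speaking style, schedule fit) that a fixed formula can't capture.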