Workflows by Trung Tran
Monitor Cloudflare incidents and alert via Slack, Telegram, and Jira
# Cloudflare Incident Monitoring & Escalation Workflow

## 🚀 Try Decodo — Web Scraping & Data API (Coupon: **TRUNG**)

**Decodo** is a powerful public data access platform offering managed web scraping APIs and proxy infrastructure to collect structured web data at scale. It handles proxies, anti-bot protection, JavaScript rendering, retries, and global IP rotation—so you can focus on data, not scraping complexity.

**Why Decodo**
- Managed **Web Scraping API** with anti-bot bypass & high success rates
- Works with JS-heavy sites; outputs JSON/HTML/CSV
- Easy integration (Python, Node.js, cURL) for eCommerce, SERP, social & general web data

**🎟️ Special Discount**
Use coupon **`TRUNG`** to get the **Advanced Scraping API** plan — ~**23,000 requests for ~$5**.

## Who this workflow is for

For **DevOps, SRE, IT Ops, and Platform teams** running production traffic behind Cloudflare who need reliable incident awareness without alert fatigue.

Use it if you want:
- Continuous Cloudflare incident monitoring
- Clear severity-based routing
- Automatic escalation into JIRA
- Clean Slack & Telegram notifications
- Deduplicated, noise-controlled alerts

## What this workflow does

This workflow polls the **Cloudflare Status API**, detects unresolved incidents, scores their impact, and routes them to the right channels. High-impact incidents are escalated to JIRA. Lower-impact updates are notified (or skipped) to reduce noise.

## How it works (high level)

1. Runs on a fixed schedule (e.g., every 5 minutes)
2. Fetches current Cloudflare incidents
3. Stops early if no active issues exist
4. Normalizes and scores incidents (severity, impact, affected service)
5. Deduplicates previously alerted incidents
6. Builds human-readable notification payloads
7. Routes by impact:
   - **High** → create JIRA incident + notify
   - **Low** → notify or suppress
8. Sends alerts to Slack and Telegram

## Requirements

- Decodo Scraper API credential
- n8n (self-hosted or Cloud)
- Cloudflare Status API (public)
- Slack bot (`chat:write`)
- Telegram bot + chat ID
- JIRA project with issue-create permission
- Optional LLM credentials (summarization/classification)

## Notes

- All secrets are stored in **n8n Credentials**
- The workflow is **idempotent** and safe to rerun
- No assumptions about root cause or remediation

Built for production-grade incident visibility with **n8n**.
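The scoring, deduplication, and routing steps (4–7) can be sketched in Python. The `id`, `impact`, and `status` field names follow the public Statuspage incident format that the Cloudflare Status API exposes; the impact weights and the high-impact threshold below are illustrative assumptions, not part of the workflow spec:

```python
# Illustrative severity scoring, dedup, and routing for Statuspage-style incidents.
# The weights and the "high" threshold are assumptions; tune them to your alerting needs.
IMPACT_WEIGHT = {"none": 0, "minor": 1, "major": 2, "critical": 3}

def score_incident(incident: dict) -> int:
    """Score an incident by reported impact; actively changing statuses add weight."""
    score = IMPACT_WEIGHT.get(incident.get("impact", "none"), 0)
    if incident.get("status") in ("investigating", "identified"):
        score += 1  # still unresolved and actively changing
    return score

def route(incident: dict, already_alerted: set) -> str:
    """Return 'escalate', 'notify', or 'skip', deduplicating by incident id."""
    if incident["id"] in already_alerted:
        return "skip"  # previously alerted: suppress to control noise
    already_alerted.add(incident["id"])
    return "escalate" if score_incident(incident) >= 3 else "notify"
```

In n8n this logic would live in a Code node, with `already_alerted` persisted in workflow static data so reruns stay idempotent.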
AWS Lambda manager with GPT-4.1 & Google Sheets audit logging via chat
# Chat-Based AWS Lambda Manager with Automated Audit Logging (GPT-4.1 mini + Google Sheets)

> This workflow provides a chat-based AI agent to manage AWS Lambda functions. It allows users to list, invoke, get details of, and delete Lambda functions, while automatically recording every action in Google Sheets for audit and compliance tracking.

## **Who’s it for**

- Cloud engineers and DevOps teams managing AWS Lambda functions.
- Developers who want a simple chat interface for Lambda operations.
- Compliance and operations teams needing automatic audit logs of AWS actions.

## **How it works / What it does**

1. A chat message triggers the **AWS Lambda Manager Agent**.
2. The agent interprets user intent and calls one of the available tools:
   - **Invoke Lambda Function**: Runs a Lambda function with a given payload.
   - **List Lambda Functions**: Shows all functions in the account.
   - **Get Lambda Function**: Retrieves details/configuration of a function.
   - **Delete a Function**: Removes a Lambda function permanently.
3. After each action, the agent calls **Audit Logs (Google Sheets)** to record the operation type, function name, timestamp, and outcome.
4. The agent confirms destructive actions (like delete) before execution.

## **How to set up**

1. Add a **Chat Trigger** node to start the workflow when a user sends a message.
2. Connect it to the **AWS Lambda Manager Agent** node.
3. Configure the agent with the provided system prompt to enforce rules and logging.
4. Link the agent to the following tool nodes:
   - Invoke Lambda Function
   - List Lambda Functions
   - Get Lambda Function
   - Delete a Function
   - Audit Logs (Google Sheets with `appendOrUpdate` enabled)
5. Deploy the workflow and test it with sample chat commands like “list functions” or “invoke testFunction”.

## **Requirements**

- AWS account with IAM credentials that have `lambda:ListFunctions`, `lambda:InvokeFunction`, `lambda:GetFunction`, and `lambda:DeleteFunction` permissions.
- Google Sheets API connected for storing audit logs.
- Correct region and function names configured when invoking or deleting.
- n8n instance or automation platform that supports agent + tool integration.

## **How to customize the workflow**

- **Add new tools**: Extend with more AWS Lambda operations such as Update Function Code or Publish Version.
- **Enhance logging**: Include user ID, request payload, or execution results in Audit Logs.
- **Access control**: Restrict delete actions to admins by adding role-based logic.
- **Multi-cloud support**: Extend the agent to handle AWS/Azure/GCP serverless functions in one workflow.
- **Custom responses**: Modify the agent prompt to tailor tone (developer-friendly vs. business-friendly).
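The audit-log record (operation type, function name, timestamp, outcome) and the delete-confirmation rule can be sketched as plain Python; the field names here are assumptions matching the description above, not a fixed schema:

```python
from datetime import datetime, timezone

def build_audit_row(operation: str, function_name: str, outcome: str) -> dict:
    """Build one audit-log row with the fields the agent records:
    operation type, function name, timestamp, and outcome."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "operation": operation,
        "function_name": function_name,
        "outcome": outcome,
    }

# Destructive operations the agent must confirm before executing (step 4).
DESTRUCTIVE = {"delete"}

def needs_confirmation(operation: str) -> bool:
    return operation in DESTRUCTIVE
```

In the workflow itself, the row maps onto the Google Sheets `appendOrUpdate` columns, and the confirmation rule is enforced by the agent's system prompt.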
Manage AWS S3 with GPT-4 agent and Google Sheets audit logging via Slack
# AI-Powered AWS S3 Manager with Audit Logging in n8n (Slack/ChatOps Workflow)

> This n8n workflow empowers users to manage AWS S3 buckets and files using natural language via Slack or other chat platforms. Equipped with an OpenAI-powered agent and integrated audit logging to Google Sheets, it supports operations like listing buckets, copying/deleting files, and managing folders, and it automatically records every action for compliance and traceability.

## 👥 Who’s it for

This workflow is built for:
- DevOps engineers who want to manage AWS S3 using natural chat commands.
- Technical support teams interacting with AWS via Slack, Telegram, etc.
- Automation engineers building ChatOps tools.
- Organizations that require **audit logs** for every cloud operation.

Users don’t need AWS Console or CLI access — just send a message like “Copy file from dev to prod”.

## ⚙️ How it works / What it does

This workflow turns natural chat input into **automated AWS S3 actions** using an OpenAI-powered AI Agent in n8n.

### 🔁 Workflow Overview:

1. **Trigger**: A user sends a message in Slack, Telegram, etc.
2. **AI Agent**:
   - Interprets the message
   - Calls one of 6 S3 tools:
     - `ListBuckets`
     - `ListObjects`
     - `CopyObject`
     - `DeleteObject`
     - `ListFolders`
     - `CreateFolder`
3. **S3 Action**: Performs the requested AWS S3 operation.
4. **Audit Log**: Logs the tool call to Google Sheets using `AddAuditLog`:
   - Includes timestamp, tool used, parameters, prompt, reasoning, and user info.

## 🛠️ How to set up

### Step-by-step Setup:

1. **Webhook Trigger**
   - Slack, Telegram, or custom chat platform → connects to n8n.
2. **OpenAI Agent**
   - Model: `gpt-4` or `gpt-3.5-turbo`
   - Memory: Simple Memory node
   - Prompt: Instructs the agent to always follow tool calls with an `AddAuditLog` call.
3. **AWS S3 Nodes**
   - Configure each tool with AWS credentials.
   - Tools:
     - `getAll: bucket`
     - `getAll: file`
     - `copy: file`
     - `delete: file`
     - `getAll: folder`
     - `create: folder`
4. **Google Sheets Node**
   - Sheet: `AWS S3 Audit Logs`
   - Operation: `Append or Update Row`
   - Columns (must match input keys):
     - `timestamp`, `tool`, `status`, `chat_prompt`, `parameters`, `user_name`, `tool_call_reasoning`
5. **Agent Tool Definitions**
   - Include `AddAuditLog` as a 7th tool.
   - The agent calls it **immediately after every S3 action** (except when logging itself).

## ✅ Requirements

- [ ] n8n instance with AI Agent feature
- [ ] OpenAI API key
- [ ] AWS IAM user with S3 access
- [ ] Google Sheet with required columns
- [ ] Chat integration (Slack, Telegram, etc.)

## 🧩 How to customize the workflow

| Feature | Customization Tip |
|----------------------|--------------------------------------------------------------|
| 🌎 Multi-region S3 | Let users include region in the message or agent memory |
| 🛡️ Restricted actions | Use memory/user ID to limit delete/copy actions |
| 📁 Folder filtering | Extend `ListObjects` with prefix/suffix filters |
| 📤 Upload file | Add `PutObject` with pre-signed URL support |
| 🧾 Extra logging | Add IP, latency, error trace to audit logs |
| 📊 Reporting | Link Google Sheet to Looker Studio for audit dashboards |
| 🚨 Security alerts | Notify via Slack/Email when `DeleteObject` is triggered |
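Because the Google Sheets node requires input keys to match the sheet columns exactly, a small helper that builds the `AddAuditLog` row can catch mismatches early. A minimal sketch, using the column names listed above (the function name and parameter serialization are assumptions):

```python
import json
from datetime import datetime, timezone

# Must match the Google Sheet columns exactly, in order.
AUDIT_COLUMNS = ["timestamp", "tool", "status", "chat_prompt",
                 "parameters", "user_name", "tool_call_reasoning"]

def audit_log_row(tool: str, status: str, chat_prompt: str,
                  parameters: dict, user_name: str, reasoning: str) -> dict:
    """Build a row whose keys match the sheet columns required by
    the Append or Update Row operation."""
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "tool": tool,
        "status": status,
        "chat_prompt": chat_prompt,
        # Serialize tool parameters so they fit in one cell.
        "parameters": json.dumps(parameters, sort_keys=True),
        "user_name": user_name,
        "tool_call_reasoning": reasoning,
    }
```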
AWS EC2 lifecycle manager with AI chat agent (describe, start, stop, reboot)
# EC2 Lifecycle Manager with AI Chat Agent (Describe, Start, Stop, Reboot, Terminate)

### Watch the demo video below:

[Demo video](https://youtu.be/C1s0AM1_ho0)

## Who’s it for

This workflow is designed for **DevOps engineers** and **cloud administrators** who want to manage AWS EC2 instances directly from chat platforms (Slack, Teams, Telegram, etc.) using natural language. It helps engineers quickly check EC2 instance status, start/stop servers, reboot instances, or terminate unused machines — without logging into the AWS console.

## How it works / What it does

1. A chat message (command) from the engineer triggers the workflow.
2. The **EC2 Manager AI Agent** interprets the request using the AI chat model and memory.
3. The agent decides which AWS EC2 action to perform:
   - `DescribeInstances` → List or check status of EC2 instances.
   - `StartInstances` → Boot up stopped instances.
   - `StopInstances` → Gracefully shut down running instances.
   - `RebootInstances` → Restart instances without stopping them.
   - `TerminateInstances` → Permanently delete instances.
4. The selected tool (API call) is executed via an HTTP Request to the AWS EC2 endpoint.
5. The agent replies back in chat with the result (confirmation, instance status, errors, etc.).

## How to set up

1. **Add Chat Trigger**
   - Connect your chatbot platform (Slack/Telegram/Teams) to n8n.
   - Configure the “When chat message received” node.
2. **Configure OpenAI Chat Model**
   - Select a supported LLM (GPT-4, GPT-4.1, GPT-5, etc.).
   - Add system and user prompts to define behavior (EC2 assistant role).
3. **Add Memory**
   - Use `Simple Memory` to keep track of context (e.g., instance IDs, region, last action).
4. **Connect EC2 API Tools**
   - Create HTTP Request nodes for:
     - Describe Instances
     - Start Instance
     - Stop Instance
     - Reboot Instance
     - Terminate Instance
   - Use AWS credentials with Signature V4 authentication.
   - API endpoint: `https://ec2.{region}.amazonaws.com/`
5. **Link Tools to Agent**
   - Attach all EC2 tools to the EC2 Manager AI Agent node.
   - Ensure the agent can choose which tool to call based on user input.

## Requirements

- **n8n instance** (self-hosted or cloud).
- **Chat platform integration** (Slack, Teams, or Telegram).
- **OpenAI (or other LLM) credentials**.
- **AWS IAM user with EC2 permissions**:
  - `ec2:DescribeInstances`
  - `ec2:StartInstances`
  - `ec2:StopInstances`
  - `ec2:RebootInstances`
  - `ec2:TerminateInstances`
- **AWS region configured** for API calls.

## How to customize the workflow

- **Add safety checks**: Require explicit confirmation before running `Stop` or `Terminate`.
- **Region flexibility**: Add support for multi-region management by letting the user specify the region in chat.
- **Tag-based filters**: Extend `DescribeInstances` to return only instances matching specific tags (e.g., `env=dev`).
- **Cost-saving automation**: Add scheduled rules to automatically stop instances outside working hours.
- **Enhanced chatbot UX**: Format responses into tables or rich messages in Slack/Teams.
- **Audit logging**: Store each action (who/what/when) into a database or Google Sheets for compliance.
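The intent-to-action mapping and the "confirm before Stop/Terminate" safety check can be sketched as follows. The `Action` values and endpoint pattern are the real EC2 Query API ones from the setup above; the intent names, the API `Version` string, and the confirmation flow are assumptions for illustration:

```python
# Maps a chat intent to the EC2 Query API Action parameter.
EC2_ACTIONS = {
    "describe": "DescribeInstances",
    "start": "StartInstances",
    "stop": "StopInstances",
    "reboot": "RebootInstances",
    "terminate": "TerminateInstances",
}
# Destructive intents that require explicit user confirmation first.
NEEDS_CONFIRMATION = {"stop", "terminate"}

def plan_ec2_call(intent: str, region: str, confirmed: bool = False) -> dict:
    """Return the endpoint and Action for an intent, refusing destructive
    actions until the user has explicitly confirmed."""
    if intent not in EC2_ACTIONS:
        raise ValueError(f"unknown intent: {intent}")
    if intent in NEEDS_CONFIRMATION and not confirmed:
        return {"ask_confirmation": True, "intent": intent}
    return {
        "endpoint": f"https://ec2.{region}.amazonaws.com/",
        "params": {"Action": EC2_ACTIONS[intent], "Version": "2016-11-15"},
    }
```

The actual request still needs AWS Signature V4 signing, which the HTTP Request node's AWS credential handles.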
Generate AWS IAM policies via chat interface with GPT-4 assistant
# Chat-Based AWS IAM Policy Generator with OpenAI Agent

> Chat-driven workflow that lets IT and DevOps teams generate custom AWS IAM policies via AI, automatically apply them to AWS, and send an email notification with policy details.

## 👤 Who’s it for

This workflow is designed for:
- **Cloud Engineers / DevOps** who need to quickly generate and apply **custom IAM policies** in AWS.
- **IT Support / Security teams** who want to create IAM policies through a **chat-based interface** without manually writing JSON.
- Teams that want **automatic notifications** (via email) once new policies are created.

## ⚙️ How it works / What it does

1. **Trigger** → Workflow starts when a **chat message is received**.
2. **IAM Policy Creator Agent** → Uses OpenAI to:
   - Interpret user requirements (e.g., service, actions, region).
   - Generate a valid **IAM policy JSON** following AWS best practices.
3. **IAM Policy HTTP Request** → Sends the generated policy to the **AWS IAM CreatePolicy API**.
4. **Email Notification** → Once AWS responds with a `CreatePolicyResponse`, an email is sent with policy details (name, ARN, ID, timestamps, etc.) using n8n mapping.

Result: The user can **chat with the AI agent**, create a policy, and receive an **email confirmation** with full details.

## 🛠 How to set up

1. **Chat Trigger Node**
   - Configure the `When chat message received` node to connect your preferred chat channel (Slack, MS Teams, Telegram, etc.).
2. **IAM Policy Creator Agent**
   - Add **OpenAI Chat Model** as the LLM.
   - Use a **system prompt** that enforces AWS IAM JSON best practices (least privilege, correct JSON structure).
   - Connect **Memory** (Simple Memory) and **Structured Output Parser** to ensure consistent JSON output.
3. **IAM Policy HTTP Request**
   - Set method: `POST`
   - URL: `https://iam.amazonaws.com/`
   - Add authentication using **AWS Signature v4** (Access Key + Secret Key).
   - Body:
     - `Action=CreatePolicy`
     - `PolicyName={{ $json.CreatePolicyResponse.CreatePolicyResult.Policy.PolicyName }}`
     - `PolicyDocument={{ $json.policyDocument }}`
     - `Version=2010-05-08`
4. **Email for tracking**

## 📋 Requirements

- n8n instance (self-hosted or cloud).
- AWS IAM user/role with permission to `iam:CreatePolicy`.
- AWS Access Key + Secret Key (for SigV4 signing in the HTTP request).
- OpenAI API key (for the Chat Model).
- Email server credentials (SMTP or provider integration).

## 🎨 How to customize the workflow

- **Restrict services/actions** → Adjust the IAM Policy Creator Agent system prompt to limit what services/policies can be generated.
- **Notification channels** → Replace the email node with Slack, MS Teams, or PagerDuty to alert other teams.
- **Tagging policies** → Modify the HTTP request to include `Tags` when creating policies in AWS.
- **Human-readable timestamps** → Add a Function or Set node to convert `CreateDate` and `UpdateDate` from Unix epoch to ISO datetime before sending emails.
- **Approval step** → Insert a manual approval node before sending the policy to AWS for compliance workflows.
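Before sending an agent-generated policy to `CreatePolicy`, it is worth validating its structure. A minimal validator sketch — the checks (policy-document `Version` of `2012-10-17`, required statement keys, a wildcard-`Action` least-privilege flag) reflect standard IAM policy structure, but the exact rules you enforce are up to your system prompt:

```python
import json

def validate_iam_policy(policy_json: str) -> list:
    """Return a list of problems with a generated IAM policy document.
    Checks structure only (Version, Statement, Effect/Action/Resource)
    plus a simple least-privilege flag for wildcard Action."""
    problems = []
    try:
        policy = json.loads(policy_json)
    except json.JSONDecodeError:
        return ["policy is not valid JSON"]
    if policy.get("Version") != "2012-10-17":
        problems.append("Version should be '2012-10-17'")
    statements = policy.get("Statement", [])
    if isinstance(statements, dict):  # a single statement object is allowed
        statements = [statements]
    if not statements:
        problems.append("policy has no Statement")
    for stmt in statements:
        for key in ("Effect", "Action", "Resource"):
            if key not in stmt:
                problems.append(f"statement missing {key}")
        if stmt.get("Action") == "*":
            problems.append("wildcard Action violates least privilege")
    return problems
```

Note that `2012-10-17` is the policy *document* version inside `PolicyDocument`, distinct from the `Version=2010-05-08` IAM API parameter in the request body.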
Extract Amazon book data & generate purchase reports with Decodo Scraper
# Decodo Scraper API Workflow Template (n8n Automation Amazon Book Purchase Report)

### Watch the demo video below:

[Demo video](https://www.youtube.com/watch?v=9Kn583UJlqY)

> This workflow demos how to use the **Decodo Scraper API** to crawl any public web page (headless JS, device emulation: mobile/desktop/tablet), extract structured product data from the returned HTML, generate a **purchase-ready report**, and automatically deliver it as a **Google Doc + PDF** to Slack/Drive.

## Who’s it for

- **Creators / Analysts** who need quick product lists (books, gadgets, etc.) with prices/ratings.
- **Ops & Marketing teams** building weekly “top picks” reports.
- **Engineers** validating the Decodo Scraper API + LLM extraction pattern before scaling.

## How it works / What it does

1. **Trigger** – Manually run the workflow.
2. **Edit Fields (manual)** – Provide inputs:
   - `targetUrl` (e.g., an Amazon category/search/listing page)
   - `deviceType` (`desktop` | `mobile` | `tablet`)
   - Optional: `maxItems`, `notes`, `reportTitle`, `reportOwner`
3. **Scraper API Request (HTTP Request → POST)** – Calls the **Decodo Scraper API** with:
   - URL to crawl, **headless JS** enabled
   - **Device emulation** (UA + viewport)
   - Optional **waitFor / executeJS** to ensure late-loading content is captured
4. **HTML Response Parser (Code/Function or HTML node)** – Pulls the HTML string from the Decodo response and normalizes it (strip scripts/styles, collapse whitespace).
5. **Product Analyzer Agent (LLM + Structured Output Parser)** – Prompts an LLM to extract **structured “book” objects** from the HTML. The **Structured Output Parser** enforces a strict JSON schema and drops malformed items.
6. **Build 📚 Book Purchase Report (Code/LLM)** – Converts the JSON array into a **Markdown** (or HTML) report with:
   - Executive summary (top picks, average price/rating)
   - Table of items (rank, title, author, price, rating, link)
   - “Recommended to buy” shortlist (rules configurable)
   - Notes / owner / timestamp
7. **Configure Google Drive Folder (manual)** – Choose/create a Drive folder for output artifacts.
8. **Create Document File (Google Docs API)** – Creates a Doc from the generated Markdown/HTML.
9. **Convert Document to PDF (Google Drive export)** – Exports the Doc to PDF.
10. **Upload report to Slack** – Sends the PDF (and/or Doc link) to a chosen Slack channel with a short summary.

## How to set up

### 1. Prerequisites

- **n8n** (self-hosted or Cloud)
- **Decodo Scraper API** key
- **OpenAI (or compatible) API key** for the Analyzer Agent
- **Google Drive/Docs** credentials (OAuth2)
- **Slack** Bot/User token (`files:write`, `chat:write`)

### 2. Environment variables (recommended)

- `DECODO_API_KEY`
- `OPENAI_API_KEY`
- `DRIVE_FOLDER_ID` (optional default)
- `SLACK_CHANNEL_ID`

### 3. Nodes configuration (high level)

- **Edit Fields (Set node)**
- **Scraper API Request (HTTP Request → POST)**
- **HTML Response Parser (Code node)**
- **Product Analyzer Agent**
- **Build Book Purchase Report (Code/LLM)**
- **Create Document File**
- **Convert to PDF**
- **Upload to Slack**

## Requirements

- **Decodo**: Active API key and endpoint access. Be mindful of concurrency/rate limits.
- **Model**: GPT-4o/4.1-mini or similar for reliable structured extraction.
- **Google**: OAuth client (Docs/Drive scopes). Ensure n8n can write to the target folder.
- **Slack**: Bot token with `files:write` + `chat:write`.

## How to customize the workflow

- **Target site**: Change `targetUrl` to any **public** page (category, search, or listing). For other domains (not Amazon), tweak the **LLM guidance** (e.g., price/label patterns).
- **Device emulation**: Switch `deviceType` to `mobile` to fetch mobile-optimized markup (often simpler DOMs).
- **Late-loading pages**: Adjust `waitFor.selector` or use `waitUntil: "networkidle"` (if supported) to ensure full content loads.
- **Client-side JS**: Extend `executeJS` if you need to interact (scroll, click “next”, expand sections). You can also loop over pagination by iterating URLs.
- **Extraction schema**: Add fields (e.g., `discount_percent`, `bestseller_badge`, `prime_eligible`) and update the Structured Output schema accordingly.
- **Filtering rules**: Modify recommendation logic (e.g., min ratings count, price bands, languages).
- **Report branding**: Add a logo, cover page, and footer with company info; switch to HTML + inline CSS for richer Docs formatting.
- **Destinations**: Besides Slack & Drive, add Email, Notion, Confluence, or a database sink.
- **Scheduling**: Add a **Cron** trigger for weekly/monthly auto-reports.
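The HTML Response Parser step (strip scripts/styles, collapse whitespace) can be sketched with a few regexes. This is a minimal Code-node-style sketch; a real HTML parser such as BeautifulSoup is more robust against malformed markup:

```python
import re

def normalize_html(html: str) -> str:
    """Strip <script>/<style> blocks, drop remaining tags, and collapse
    whitespace before handing the text to the LLM extraction step."""
    # Remove script and style elements including their contents.
    html = re.sub(r"<(script|style)\b.*?</\1>", " ", html, flags=re.S | re.I)
    # Remove all remaining tags, keeping their text content.
    html = re.sub(r"<[^>]+>", " ", html)
    # Collapse runs of whitespace into single spaces.
    return re.sub(r"\s+", " ", html).strip()
```

Shrinking the HTML this way also cuts token usage before the Product Analyzer Agent runs.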
Automate YouTube video SEO tagging with GPT and Slack notifications
# AI-Powered YouTube Auto-Tagging Workflow (SEO Automation)

### Watch the demo video below:

[Demo video](https://www.youtube.com/watch?v=fnzhEJb9R4w)

> Supercharge your YouTube SEO with this AI-powered workflow that automatically generates and applies smart, SEO-friendly tags to your new videos every week. No more manual tagging — just better discoverability, improved reach, and consistent optimization. Plus, get instant Slack notifications so your team stays updated on every video’s SEO boost.

## Who’s it for

- YouTube creators, channel admins, and marketing teams who publish regularly and want consistent, SEO-friendly tags without manual effort.
- Agencies managing multiple channels who need an auditable, automated tagging process with Slack notifications.

## How it works / What it does

1. **Weekly Schedule Trigger** – Runs the workflow once per week.
2. **Get all videos uploaded last week** – Queries YouTube for videos uploaded by the channel in the past 7 days.
3. **Get video detail** – Retrieves each video’s title, description, and ID.
4. **YouTube Video Auto Tagging Agent** (LLM)
   - Inputs: `video.title`, `video.description`, `channelName`.
   - Uses an SEO-specialist system prompt to generate 15–20 relevant, comma-separated tags.
5. **Update video with AI-generated tags** – Writes the tags back to the video via the YouTube Data API.
6. **Inform via Slack message** – Posts a confirmation message (video title + ID + tags) to a chosen Slack channel for visibility.

## How to set up

1. **YouTube connection**
   - Create a Google Cloud project and enable **YouTube Data API v3**.
   - Configure an OAuth client (Web app / Desktop as required).
   - Authorize with the Google account that manages the channel.
   - In your automation platform, add the YouTube credential and grant scopes (see *Requirements*).
2. **Slack connection**
   - Create or use an existing Slack app/bot.
   - Install it to your workspace and capture the Bot Token.
   - Add the Slack credential in your automation platform.
3. **LLM / Chat Model**
   - Select your model (e.g., OpenAI GPT).
   - Paste the **System Prompt** (SEO expert) and the **User Prompt** template:
     - Inputs: `{{video_title}}`, `{{video_description}}`, `{{channel_name}}`.
     - Output: **comma-separated list** of 15–20 tags (no #, no duplicates).
4. **Node configuration**
   - **Weekly Schedule Trigger:** choose day/time (e.g., Mondays 09:00 local).
   - **Get all videos uploaded last week:** date filter = now() - 7 days.
   - **Get video detail:** map each video ID from the previous node.
   - **Agent node:** map fields to the prompt variables.
   - **Update video:** map the agent’s tag string to the YouTube `tags` field.
   - **Slack message:**
     ```
     The video "*{{video_title}} - {{video_id}}*" has been auto-tagged successfully. Tags: {{tags}}
     ```
5. **Test run**
   - Manually run the workflow with one recent video.
   - Verify the tags appear in YouTube Studio and the Slack message posts.

## Requirements

**APIs & Scopes**
- **YouTube Data API v3**
  - `youtube.readonly` (to list videos / details)
  - `youtube` or `youtube.force-ssl` (to update video metadata incl. tags)
- **Slack Bot Token Scopes**
  - `chat:write` (post messages)
  - `channels:read` or `groups:read` if selecting channels dynamically (optional)

**Platform**
- Access to a chat/LLM provider (e.g., OpenAI).
- Outbound HTTPS allowed.

**Rate limits & quotas**
- YouTube updates consume quota; tag updates are write operations — avoid re-writing unchanged tags.
- Add basic throttling (e.g., 1–2 updates/sec) if you process many videos.

## How to customize the workflow

- **Schedule:** switch to daily, or run on publish events instead of weekly.
- **Filtering:** process only videos matching rules (e.g., title contains “tutorial”, or missing tags).
- **Prompt tuning:**
  - Add brand keywords to always include (e.g., “WiseStack AI”).
  - Constrain to a language (e.g., “Vietnamese tags only”).
  - Enforce max 500 chars total for tags if you want a stricter cap.
- **Safety guardrails:**
  - Validate model output: split by comma, trim whitespace, dedupe, drop empty/over-long tags.
  - If the agent fails, fall back to a heuristic generator (title/keywords extraction).
- **Change log:** write a row per update to a sheet/DB (videoId, oldTags, newTags, timestamp, runId).
- **Human-in-the-loop:** send tags to Slack as buttons (“Apply / Edit / Skip”) before updating YouTube.
- **Multi-channel support:** loop through a list of channel credentials and repeat the pipeline.
- **Notifications:** add error Slack messages for failed API calls; summarize weekly results.

**Tip:** Keep a small allow/deny list (e.g., banned terms, mandatory brand terms) and run a quick sanitizer right after the agent node to maintain consistency across your channel.
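The sanitizer described under "Safety guardrails" (split by comma, trim, dedupe, drop empties and hashes, enforce a total length cap) can be sketched like this; the per-tag and 500-character total limits are the caps suggested above, not official YouTube limits, so verify against current YouTube Data API rules:

```python
def sanitize_tags(raw: str, max_tag_len: int = 100, max_total: int = 500) -> list:
    """Clean the model's comma-separated tag output: trim whitespace,
    strip leading '#', drop empty/duplicate/over-long tags, and stop
    once the total character budget is exhausted."""
    seen, tags, total = set(), [], 0
    for tag in raw.split(","):
        tag = tag.strip().lstrip("#")
        key = tag.lower()  # case-insensitive dedupe
        if not tag or key in seen or len(tag) > max_tag_len:
            continue
        if total + len(tag) > max_total:
            break
        seen.add(key)
        tags.append(tag)
        total += len(tag)
    return tags
```

Run this in a Code node right after the agent, before mapping to the YouTube `tags` field.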
Automated Slack channel inactivity audit & report generator for workspace cleanup
# Automated Slack Channel Audit Workflow with Chatbot and GPT-4.1

> Automatically scans all public Slack channels weekly to detect those with no activity in the past 30 days, then generates and sends a detailed inactivity report to admins for review and action. Helps keep your Slack workspace clean, relevant, and clutter-free.

## 🧑‍💼 Who’s it for

This workflow is built for:
- **Slack Workspace Admins**
- **IT or Ops Managers**
- **HR/Compliance Teams**

…who want to maintain a clean and active Slack workspace by regularly reviewing **inactive channels**.

## ⚙️ How it works / What it does

This automated n8n workflow:
1. **Runs weekly** via a scheduled trigger.
2. **Fetches all public Slack channels** in the workspace.
3. **Checks the message history** of each channel for activity.
4. **Filters channels** that have had **no discussion in the past 30 days**.
5. **Generates a Slack-friendly report** with key metadata (name, member count, purpose, etc.).
6. **Sends the report to a Slack channel** for admin review and possible action (e.g., archive, engage, repurpose).

## 🛠️ How to set up

1. **Configure your Slack App**
   - Go to https://api.slack.com/apps → Create App
   - Add the following **OAuth scopes** to your **Bot Token**:
     - `channels:read` → to get the list of public channels
     - `channels:history` → to fetch message history
     - `users:read` → to personalize the report (optional)
     - `chat:write` → to post the report to a Slack channel
2. **Install the app in your workspace**
   - Copy the **Bot User OAuth Token**
   - Add it to your n8n Slack credentials under "Slack API"
3. **Customize the schedule** in the "Weekly Schedule Trigger" node to control report frequency.
4. **Connect your Slack workspace** to the workflow using your credentials.

## ✅ Requirements

- n8n (self-hosted or cloud)
- Slack App with:
  - `channels:read`
  - `channels:history`
  - `chat:write`
- Active channels and member data
- A designated Slack channel to receive the report

## 🔧 How to customize the workflow

| Component | Customization Options |
|----------|------------------------|
| ⏱️ Schedule Trigger | Change to daily, monthly, etc. |
| 📅 Inactivity Threshold | Modify `Filter channel with last discussion 30 days ago` to 60/90 days |
| 📊 Report Formatting | Tweak the `Consume slack report` node to change formatting or summary |
| 💬 Output Channel | Change the target channel in `Send Channel Inactivity Report` |
| 🚫 Auto-archiving | Add logic to archive channels with 0 members or activity (using Slack API) |

## 📝 Slack Permissions Reference

| Action | Slack Scope Required |
|--------|-----------------------|
| Get all public channels | `channels:read` |
| Get message history of a channel | `channels:history` |
| Post message to Slack | `chat:write` |
| Get user info (optional) | `users:read` |
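The inactivity filter (step 4) boils down to comparing each channel's latest message timestamp against a cutoff. A minimal sketch; the `last_message_ts` field is an assumed pre-extracted value (Slack's `conversations.history` returns message `ts` as epoch-seconds strings, with `None` here standing in for an empty history):

```python
from datetime import datetime, timedelta, timezone

def inactive_channels(channels: list, now: datetime, days: int = 30) -> list:
    """Return names of channels whose latest message is older than `days`
    (or that have no messages at all)."""
    cutoff = now - timedelta(days=days)
    out = []
    for ch in channels:
        ts = ch.get("last_message_ts")
        if ts is None or datetime.fromtimestamp(float(ts), tz=timezone.utc) < cutoff:
            out.append(ch["name"])
    return out
```

Changing `days` to 60 or 90 is the inactivity-threshold customization from the table above.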
Answer code of conduct questions in Slack with GPT-4 & RAG technology
# 📘 Code of Conduct Q&A Slack Chatbot, RAG-Powered

[Demo video](https://www.youtube.com/watch?v=2EWgC5UKiBQ)

> Empower employees to instantly access and understand the company’s Code of Conduct via a Slack chatbot, powered by Retrieval-Augmented Generation (RAG) and LLMs.

## 🧑‍💼 Who’s it for

This workflow is designed for:
- **HR and compliance teams** that want to automate policy-related inquiries
- **Employees** who want quick answers to Code of Conduct questions directly inside Slack
- **Startups or enterprises** that need internal compliance self-service tools powered by AI

## ⚙️ How it works / What it does

This RAG-powered Slack chatbot answers user questions about your uploaded **Code of Conduct PDF** using GPT-4 and embedded document chunks.

Here's the flow:
1. **Receive Message from Slack:** A webhook triggers when a message is posted in Slack.
2. **Check if it’s a valid query:** Filters out non-user messages (e.g., bot mentions).
3. **Run Agent with RAG:**
   - Uses GPT-4 with the `Query Data Tool` to retrieve relevant document chunks.
   - Returns a well-formatted, context-aware answer.
4. **Send Response to Slack:** Fetches user info and posts the answer back in the same channel.
5. **Document Upload Flow:**
   - HR can upload the PDF Code of Conduct file.
   - It’s parsed, chunked, embedded using OpenAI, and stored for future query retrieval.
   - A backup copy is saved to Google Drive.

## 🛠️ How to set up

1. **Prepare your environment:**
   - Slack Bot token & webhook configured (sample Slack app manifest: https://wisestackai.s3.ap-southeast-1.amazonaws.com/slack_bot_manifest.json)
   - OpenAI API key (for GPT-4 & embeddings)
   - Google Drive credentials (optional, for backup)
2. **Upload the Code of Conduct PDF:**
   - Use the designated node to upload your document (sample file: https://wisestackai.s3.ap-southeast-1.amazonaws.com/20220419-ingrs-code-of-conduct-policy-en.pdf)
   - This triggers chunking → embedding → data store.
3. **Deploy the chatbot:**
   - Host the webhook and connect it to your Slack app.
   - Share the command format with employees (e.g., `@CodeBot Can I accept gifts from partners?`)
4. **Monitor and iterate:**
   - Improve the chunk size or embedding model if queries aren’t accurate.
   - Review unanswered queries to enhance coverage.

## 📋 Requirements

- n8n (self-hosted or Cloud)
- Slack App (with `chat:write`, `users:read`, `commands`)
- OpenAI account (embedding + GPT-4 access)
- Google Drive integration (for backups)
- Uploaded Code of Conduct in PDF format

## 🧩 How to customize the workflow

| What to Customize | How to Do It |
|-----------------------------|------------------------------------------------------------------------------|
| 🔤 **Prompt style** | Edit the System & User prompts inside the `Code Of Conduct Agent` node |
| 📄 **Document types** | Upload additional policy PDFs and tag them differently in metadata |
| 🤖 **Agent behavior** | Tune GPT temperature or replace with a different LLM |
| 💬 **Slack interaction** | Customize message formats or trigger phrases |
| 📁 **Data Store engine** | Swap to Pinecone, Weaviate, Supabase, etc. depending on use case |
| 🌐 **Multilingual support** | Preprocess text and support locale detection via Slack metadata |
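The "chunking → embedding → data store" step hinges on how the PDF text is split. A minimal fixed-size chunker with overlap, as a sketch — the sizes are illustrative assumptions, and tuning them is exactly the "improve chunk size" iteration suggested above:

```python
def chunk_text(text: str, chunk_size: int = 800, overlap: int = 100) -> list:
    """Split a parsed PDF's text into overlapping fixed-size chunks for
    embedding. Overlap keeps sentences that straddle a boundary retrievable
    from both neighboring chunks."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks, start = [], 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap
    return chunks
```

Each chunk is then embedded (e.g., with an OpenAI embeddings model) and stored with metadata such as the source document name.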
AI resume analysis & candidate evaluation with Slack and Google Sheets
# Create AI-Powered Chatbot for Candidate Evaluation on Slack [](https://www.youtube.com/watch?v=NAn5BSr15Ks) > This workflow connects a Slack chatbot with AI agents and Google Sheets to automate candidate resume evaluation. It extracts resume details, identifies the applied job from the message, fetches the correct job description, and provides a summarized evaluation via Slack and tracking sheet. Perfect for HR teams using Slack. ## **Who’s it for** This workflow is designed for: - **HR Teams**, **Recruiters**, and **Hiring Managers** - Working in software or tech companies using **Slack**, **Google Sheets**, and **n8n** - Who want to **automate candidate evaluation** based on uploaded profiles and applied job positions ## **How it works / What it does** This workflow is triggered when a **Slack user mentions the HR bot** and attaches a **candidate profile PDF**. The workflow performs the following steps: 1. **Trigger from Slack Mention** - A user mentions the bot in Slack with a message like: `@HRBot Please evaluate this candidate for the AI Engineer role.` (with PDF attached) 2. **Input Validation** - If no file is attached, the bot replies: _"Please upload the candidate profile file before sending the message."_ 3. **Extract Candidate Profile** - Downloads the attached PDF from Slack - Uses `Extract from File` to parse the resume into text 4. **Profile Analysis (AI Agent)** - Sends the resume text and message to the `Profile Analyzer Agent` - Identifies: - Candidate name, email, and summary - Applied position (from message) - Looks up the **Job Description PDF URL** using Google Sheets 5. **Job Description Retrieval** - Downloads and parses the matching JD PDF 6. **HR Evaluation (AI Agent)** - Sends both the **candidate profile** and **job description** to `HR Expert Agent` - Receives a summarized **fit evaluation** and insights 7. 
**Output and Logging** - Sends evaluation result back to Slack in the original thread - Updates a **Google Sheet** with evaluation data for tracking ## **How to set up** 1. **Slack Setup** - Create a Slack bot and install it into your workspace - Enable the `app_mention` event and generate a **bot token** - Connect Slack to n8n using **Slack Bot credentials** 2. **Google Sheets Setup** - Create a sheet mapping `Position Title → Job Description URL` - Create another sheet for logging evaluation results 3. **n8n Setup** - Add a **Webhook Trigger** for Slack mentions - Connect Slack, Google Sheets, and GPT-4 credentials - Set up agents (`Profile Analyzer Agent`, `HR Expert Agent`) with appropriate prompts 4. **Deploy & Test** - Mention your bot in Slack with a message and file - Confirm the reply and entry in the evaluation tracking sheet ## **Requirements** - n8n (self-hosted or cloud) - Slack App with Bot Token - OpenAI or Azure OpenAI account (for GPT-4) - Google Sheets (2 sheets: job mapping + evaluation log) - Candidate profiles in PDF format - Defined job titles and descriptions ## **How to customize the workflow** You can easily adapt this workflow to your team’s needs: | Customization Area | How to Customize | |--------------------------|----------------------------------------------------------------------------------| | Job Mapping Source | Replace Google Sheet with Airtable or Notion DB | | JD Format | Use Markdown or inline descriptions instead of PDF | | Evaluation Output Format | Change from Slack message to Email or Notion update | | HR Agent Prompt | Customize to match your company tone or include scoring rubrics | | Language Support | Add support for bilingual input/output (e.g., Vietnamese & English) | | Workflow Trigger | Trigger from slash command or form instead of `@mention` |
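The Profile Analyzer Agent identifies the applied position from the Slack message. As a sketch, the same extraction can be done deterministically before (or instead of) the LLM call — note the `for the <title> role` pattern is an assumption about your message format, and `extractPosition` is a hypothetical helper:

```javascript
// Hypothetical helper: pull the applied position out of a Slack mention such
// as "@HRBot Please evaluate this candidate for the AI Engineer role."
// The "for the <title> role/position" pattern is an assumed convention.
function extractPosition(message) {
  const match = message.match(/for the (.+?) (?:role|position)/i);
  return match ? match[1].trim() : null;
}
```

A `null` result can route to a Slack reply asking the user to restate the position, mirroring the file-attachment validation step.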
Generate SSL/TLS certificate expiry reports with AWS ACM and AI for Slack & email
# Automated SSL/TLS Certificate Expiry Report for AWS  > Automatically generates a weekly report of all AWS ACM certificates, including status, expiry dates, and renewal eligibility. The workflow formats the data into both Markdown (for PDF export to Slack) and HTML (for email summary), helping teams stay on top of certificate compliance and expiration risks. ## **Who’s it for** This workflow is designed for **DevOps engineers**, **cloud administrators**, and **compliance teams** who manage AWS infrastructure and need **automated weekly visibility** into the status of their SSL/TLS certificates in **AWS Certificate Manager (ACM)**. It's ideal for teams that want to reduce the risk of expired certs, track renewal eligibility, and maintain reporting for audit or operational purposes. ## **How it works / What it does** This n8n workflow performs the following actions on a weekly schedule: 1. **Trigger**: Automatically runs once a week using the `Weekly schedule trigger`. 2. **Fetch Certificates**: Uses `Get many certificates` action from AWS Certificate Manager to retrieve all certificate records. 3. **Parse Data**: Processes and reformats certificate data (dates, booleans, SANs, etc.) into a clean JSON object. 4. **Generate Reports**: - 📄 **Markdown Report**: Uses the `Certificate Summary Markdown Agent` (OpenAI) to generate a Markdown report for PDF export. - 🌐 **HTML Report**: Uses the `Certificate Summary HTML Agent` to generate a styled HTML report for email. 5. **Deliver Reports**: - Converts Markdown to PDF and sends it to Slack as a file. - Sends HTML content as a formatted email. ## **How to set up** 1. **Configure AWS Credentials** in n8n to allow access to AWS ACM. 2. 
Create a new workflow and use the following nodes in sequence: - `Schedule Trigger`: Weekly (e.g., every Monday at 08:00 UTC) - `AWS ACM → Get many certificates` - `Function Node → Parse ACM Data`: Converts and summarizes certificate metadata - `OpenAI Chat Node (Markdown Agent)` with a system/user prompt to generate Markdown - `Configure Metadata` → Define file name and MIME type (`.md`) - `Create document file` → Converts Markdown to document stream - `Convert to PDF` - `Slack Node` → Upload the PDF to a channel - *(Optional)* Add a second `OpenAI Chat Node` for generating HTML and sending it via email 3. **Connect Output**: - Markdown report → Slack file upload - HTML report → Email node with embedded HTML ## **Requirements** - 🟩 **n8n instance** (self-hosted or cloud) - 🟦 **AWS account** with access to ACM - 🟨 **OpenAI API key** (for ChatGPT Agent) - 🟥 **Slack webhook or OAuth credentials** (for file upload) - 📧 **Email integration** (e.g., SMTP or SendGrid) - 📝 Permissions to write documents (Google Drive / file node) ## **How to customize the workflow** - **Change report frequency**: Adjust the `Weekly schedule trigger` to daily or monthly as needed. - **Filter certificates**: - Modify the function node to only include `EXPIRED`, `IN_USE`, or `INELIGIBLE` certs. - Add tags or domains to include/exclude. - **Add visuals**: Enhance the HTML version with colored rows, icons, or company branding. - **Change delivery channels**: - Replace Slack with Microsoft Teams, Discord, or Telegram. - Send Markdown as email attachment instead of PDF. - **Integrate ticketing**: - Create a JIRA/GitHub issue for each certificate that is `EXPIRED` or `INELIGIBLE`.
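The `Parse ACM Data` step might look like the sketch below — field names follow the ACM `DescribeCertificate` response, but the flattened output shape is an illustrative assumption, not the template's exact schema:

```javascript
// Normalize one ACM certificate record into a flat shape for the report
// agents: readable dates, day counts, joined SANs, boolean eligibility.
function parseCertificate(cert, nowMs = Date.now()) {
  const notAfterMs = cert.NotAfter * 1000; // ACM raw API returns epoch seconds
  return {
    domain: cert.DomainName,
    arn: cert.CertificateArn,
    status: cert.Status,
    sans: (cert.SubjectAlternativeNames || []).join(', '),
    expires: new Date(notAfterMs).toISOString().slice(0, 10),
    daysLeft: Math.floor((notAfterMs - nowMs) / 86400000),
    renewalEligible: cert.RenewalEligibility === 'ELIGIBLE',
  };
}
```

Precomputing `daysLeft` keeps date arithmetic out of the LLM prompt, so the Markdown and HTML agents only format, never calculate.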
Generate and store AI images with DALL-E and Azure Blob Storage
# Beginner’s Tutorial: Manage Azure Storage Account Container & Blob with n8n [](https://www.youtube.com/watch?v=vh06fpMkalw) > This beginner-friendly n8n workflow shows you how to generate AI images using OpenAI, store them in Azure Blob Storage, and manage blob containers, all with zero code. ## 👤 Who’s it for This workflow is perfect for: - **Beginners learning Azure + OpenAI integration** - **No-code developers** experimenting with image generation - **Cloud learners** who want hands-on Blob Storage use cases - Anyone who wants to automate storing AI-generated content in the cloud ## ⚙️ How it works / What it does 1. 🖱️ Trigger the workflow manually using the `Execute Workflow` node. 2. ✏️ Use the `Edit Fields` node to input: - `containerName` (e.g., `demo-images`) - `imageIdea` (e.g., "a robot holding a coffee cup") 3. 📦 Create a new Azure Blob container (`Create container`). 4. 🤖 Use an OpenAI-powered **Prompt Generation Agent** to craft the perfect image prompt. 5. 🎨 Generate an image using OpenAI’s DALL·E model. 6. ☁️ Upload the generated image to Azure Blob Storage (`Create Blob`). 7. 📂 List blobs in the container (`Get many blobs`). 8. 🧹 Delete any blob as needed (`Delete Blob`). 9. (Optional) 🗑️ Remove the entire container (`Delete container`). ## 🔧 How to set up 1. **🧠 Set up OpenAI** - Create an OpenAI account and get your API key. - In n8n, go to **Credentials → OpenAI** and paste your key. 2. **🪣 Set up Azure Blob Storage** - Log in to your Azure Portal. - Create a **Storage Account** (e.g., `mystorageaccount`). - Go to **Access Keys** tab and copy: - **Storage Account Name** - **Key1** - In n8n, create a new **Azure Blob Storage Credential** using: - **Account Name** = your storage account name - **Access Key** = key1 value > 📝 This demo uses **Access Key** authentication. You can also configure Shared Access Signatures (SAS) or OAuth in production setups. 3. **Run the Workflow** - Enter your image idea and container name. 
- Click “Execute Workflow” to test it. ## 📋 Requirements | Requirement | Description | |------------------------|--------------------------------------------------| | Azure Storage Account | With container-level read/write access | | OpenAI API Key | For image and prompt generation | | n8n Version | v1.0+ recommended | | Image Credits | OpenAI charges tokens for DALL·E image creation | ## 🛠️ How to customize the workflow ### 🧠 Adjust Prompt Generation Update the **Prompt Agent** to include: - Specific style (3D, anime, cyberpunk) - Brand elements - Multiple language options ### 📁 Organize by Date/User Modify the `containerName` to auto-include: - Date (e.g., `images-2025-08-20`) - Username or session ID ### 📤 Send Image Output - Add Slack, Telegram, or Email nodes to deliver the image - Create public links using Azure’s blob permissions ### 🔁 Cleanup Logic - Auto-delete blobs after X days - Add versioning or backup logic
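The "Organize by Date/User" customization can be sketched as a small expression or Code-node helper. Azure container names must be lowercase, 3–63 characters, letters/digits/hyphens only, so the prefix is sanitized defensively (the exact naming scheme is an illustrative choice):

```javascript
// Build a date-suffixed container name like "images-2025-08-20",
// sanitized to satisfy Azure Blob container naming rules.
function dailyContainerName(prefix, date = new Date()) {
  const safePrefix = prefix.toLowerCase().replace(/[^a-z0-9-]/g, '-');
  return `${safePrefix}-${date.toISOString().slice(0, 10)}`;
}
```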
Decodo SaaS pricing intelligence workflow (B2B pricing radar)
# SaaS Pricing Brief Generator (Decodo → LLM → Google Docs → PDF → Slack) ## 🚀 Try Decodo — Web Scraping & Data API (Coupon: **TRUNG**)  **Decodo** is a powerful public data access platform offering managed web scraping APIs and proxy infrastructure to collect structured web data at scale. It handles proxies, anti-bot protection, JavaScript rendering, retries, and global IP rotation—so you can focus on data, not scraping complexity. **Why Decodo** - Managed **Web Scraping API** with anti-bot bypass & high success rates - Works with JS-heavy sites; outputs JSON/HTML/CSV - Easy integration (Python, Node.js, cURL) for eCommerce, SERP, social & general web data **🎟️ Special Discount** Use coupon **`TRUNG`** to get the **Advanced Scraping API** plan — ~**23,000 requests for ~$5**. ## Who this workflow is for This workflow is designed for **Presales, Product Managers, Business Analysts, and Sales teams** who need to: - Monitor competitor or SaaS pricing pages - Convert unstructured pricing content into a structured, comparable format - Quickly generate and share an internal **pricing brief** without manual copy/paste It is especially useful during **launches, pricing updates, or quarterly reviews**. ## What this workflow does At a high level, the workflow automates the full pricing intelligence pipeline: 1. Scrapes a live SaaS pricing page using a real browser environment 2. Cleans and normalizes the HTML for reliable parsing 3. Uses an LLM agent to extract pricing plans into structured JSON 4. Builds an executive-ready SaaS pricing brief 5. Publishes the brief as a Google Doc and PDF 6. Shares the final PDF directly to Slack The output is a consistent, repeatable pricing brief ready for internal distribution. ## How it works 1. **Manual Trigger** The workflow is manually triggered to ensure intentional execution (ideal for known pricing updates). 2. **Configure Target Pricing Page** Input the competitor pricing URL, device type, and report metadata. 3. 
**Decodo Pricing Scraper** Renders the full pricing page using JS and device emulation so tier cards, modals, and dynamic content are captured correctly. 4. **HTML Normalization** Custom code removes scripts, styles, and noise, leaving clean, readable text for the LLM. 5. **Plan Extraction Agent (LLM)** The agent translates messy pricing text into a structured JSON schema including: - Plan name - Price and billing period - Key features - CTA / positioning notes 6. **Build SaaS Pricing Brief** Transforms structured JSON into an executive-ready markdown brief with: - Plan comparison table - Key differentiators - Observations and suggested next actions 7. **Publish Assets** Creates a Google Doc in a predefined Drive folder and converts it to PDF for sharing. 8. **Share in Slack** Uploads the PDF directly to a Slack channel (e.g. `#pricing-intel`) for instant visibility. ## Requirements - Decodo node configured with a valid plan - OpenAI / LLM credentials for the Plan Extraction Agent - Google Drive API access - Slack workspace and upload permissions ## Notes & best practices - Manual trigger avoids unnecessary scraping and rate-limit issues - Device emulation ensures accurate pricing capture - Structured JSON output enables future extensions (trend tracking, diffing, dashboards) - The workflow is modular and easy to adapt for other competitor intelligence use cases
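The HTML normalization step (4) can be sketched as below. A production Code node might use a real HTML parser instead; this regex version is only illustrative of the idea — strip scripts, styles, and tags so the Plan Extraction Agent sees clean text:

```javascript
// Minimal HTML-to-text normalization: drop <script>/<style> blocks and
// remaining tags, then collapse whitespace for LLM-friendly input.
function normalizeHtml(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')
    .replace(/<[^>]+>/g, ' ')
    .replace(/\s+/g, ' ')
    .trim();
}
```

Shrinking the payload this way also cuts token cost and reduces the chance the LLM anchors on markup instead of pricing content.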
IAM compliance automation: enforce MFA and clean up access keys in AWS
# Automated AWS IAM Compliance Workflow for MFA Enforcement and Access Key Deactivation > This workflow leverages AWS IAM APIs and n8n automation to ensure strict security compliance by continuously monitoring IAM users for MFA (Multi-Factor Authentication) enforcement. [](https://www.youtube.com/watch?v=ZggCRl8z_gQ) ## **Who’s it for** This workflow is designed for **DevOps, Security, or Cloud Engineers** responsible for maintaining IAM security compliance in AWS accounts. It's ideal for teams who want to **enforce MFA usage** and **automatically disable access** for non-compliant IAM users. ## **How it works / What it does** This automated workflow performs a **daily check** to detect IAM users without an MFA device and deactivate their access keys. ### Step-by-step: 1. **Daily scheduler**: Triggers the workflow once a day. 2. **Get many users**: Retrieves a list of all IAM users in the account. 3. **Get IAM User MFA Devices**: Calls AWS API to get MFA device info for each user. 4. **Filter out IAM users with MFA**: Keeps only users **without any MFA device**. 5. **Send warning message(s)**: Sends Slack alerts for users who do not have MFA enabled. 6. **Get User Access Key(s)**: Fetches access keys for each non-MFA user. 7. **Parse the list of user access key(s)**: Extracts and flattens key information like `AccessKeyId`, `Status`, and `UserName`. 8. **Filter out inactive keys**: Keeps only **active** access keys for further action. 9. **Deactivate Access Key(s)**: Calls AWS API to deactivate each active key for non-MFA users. ## **How to set up** 1. **Configure AWS credentials** in your environment (IAM role or AWS access key with required permissions). 2. **Connect Slack** via the Slack node for alerting (set channel and credentials). 3. Set the **scheduler** to your preferred frequency (e.g., daily at 9 AM). 4. Adjust any Slack message template or filtering conditions as needed.
## **Requirements** - IAM user or role credentials with the following AWS IAM permissions: - `iam:ListUsers` - `iam:ListMFADevices` - `iam:ListAccessKeys` - `iam:UpdateAccessKey` - Slack credentials (Bot token with `chat:write` permission). - n8n environment with: - Slack integration - AWS credentials (set via environment or credentials manager) ## **How to customize the workflow** - **Alert threshold**: Instead of immediate deactivation, you can delay action (e.g., alert first, wait 24h, then disable). - **Change notification channel**: Modify the Slack node to send alerts to a different channel or add email integration. - **Whitelist exceptions**: Add a Set or IF node to exclude specific usernames (e.g., service accounts). - **Add audit logging**: Use Google Sheets, Airtable, or a database to log which users were flagged or had access disabled. - **Extend access checks**: Include console password check (`GetLoginProfile`) if needed.
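Steps 4–8 above (filter non-MFA users, flatten their keys, keep only active ones) can be sketched as a single transform. The input shape mirrors the IAM `ListMFADevices` / `ListAccessKeys` responses, but the merged per-user item layout is an assumption for illustration:

```javascript
// Keep users with no MFA device, then flatten their access keys and
// keep only Active ones -- these are the keys to deactivate.
function activeKeysOfNonMfaUsers(users) {
  return users
    .filter((u) => (u.MFADevices || []).length === 0)
    .flatMap((u) =>
      (u.AccessKeys || [])
        .filter((k) => k.Status === 'Active')
        .map((k) => ({ UserName: u.UserName, AccessKeyId: k.AccessKeyId }))
    );
}
```

Each output item maps directly onto one `iam:UpdateAccessKey` call with `Status=Inactive`.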
Collaborative sales planning with multi-agent AI, Google Docs, and Slack
# Multi-Agent Architecture Free Bootstrap Template for Beginners [](https://www.youtube.com/watch?v=BfMY2jFJR9k) _Free template to learn and reuse a multi-agent architecture in n8n. The company metaphor: a **CEO (orchestrator)** delegates to **Marketing, Operations, Finance** to produce a short sales-season plan, export it to PDF, and share it._ ## **Who’s it for** - Builders who want a **clear, minimal pattern** for multi-agent orchestration in n8n. - Teams demoing/teaching agent collaboration with **one coordinator + three specialists**. - Anyone needing a **repeatable template** to generate plans from multiple “departments”. ## **How it works / What it does** 1. **Trigger (Manual)** — Click *Execute workflow* to start. 2. **Edit Fields** — Provide brief inputs (company, products, dates, constraints, channels, goals). 3. **CEO Agent (Orchestrator)** — Reads the brief, calls 3 tool agents once, merges results, resolves conflicts. 4. **Marketing Agent** — Proposes top campaigns + channels + content calendar. 5. **Operations Agent** — Outlines inventory/staffing readiness, fulfillment steps, risks. 6. **Finance Agent** — Suggests pricing/discounts, budget split, targets. 7. **Compose Document** — CEO produces Markdown; node converts to Google Doc → **PDF**. 8. **Share** — Upload the PDF to Slack (or Drive) for review. **Outputs** - **Markdown plan** with sections (Summary, Timeline, Marketing, Ops, Pricing, Risks, Next Actions). - **Compact JSON** for automation (campaigns, budget, dates, actions). - **PDF** file for stakeholders. ## **How to set up** 1. **Add credentials** - OpenAI (or your LLM provider) for all agents. - Google (Drive/Docs) to create the document and export PDF. - Slack (optional) to upload/share the PDF. 2. **Map nodes (suggested)** - **When clicking ‘Execute workflow’** → **Edit Fields** (form with: `company`, `products`, `audience`, `start_date`, `end_date`, `channels`, `constraints`, `metrics`). 
- **CEO Agent** (AI Tool Node) → calls **Marketing Agent**, **Operations Agent**, **Finance Agent** (AI Tool Nodes). - **Configure metadata** (doc title from company + window). - **Create document file** (Google Docs API) with CEO Markdown. - **Convert to PDF** (export). - **Upload a file** (Slack) to share. 3. **Prompts (drop-in)** - **CEO (system)**: orchestrate 3 tools; request concise JSON+Markdown; merge & resolve; output sections + JSON. - **Marketing / Operations / Finance (system)**: each returns a small JSON per its scope (campaigns/calendar; staffing/steps/risks; discounts/budget/targets). 4. **Test** — Run once; verify the PDF and Slack message. ## **Requirements** - n8n (current version with **AI Tool Node**). - LLM credentials (e.g., OpenAI). - Google credentials for Docs/Drive (to create & export). - Optional Slack bot token for file uploads. ## **How to customize the workflow** - **Swap roles**: Replace departments (e.g., Product, Legal, Support) or add more tool agents. - **Change outputs**: Export to **DOCX/HTML/Notion**; add a cover page; attach brand styles. - **Approval step**: Insert **Slack “Send & Wait”** before PDF generation for review/edits. - **Data grounding**: Add RAG (Sheets/DB/Docs) so agents cite inventory, pricing, or past campaign KPIs. - **Automation JSON**: Extend the schema to match your CRM/PM tool and push `next_actions` into Jira/Asana. - **Scheduling**: Replace manual trigger with a **cron** (weekly/monthly planning). - **Localization**: Add a Translation agent or set language via input field. - **Guardrails**: Add length limits, cost caps (max tokens), and validation on agent JSON.
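The "validation on agent JSON" guardrail can be sketched as a simple key check before the CEO merges results. The required keys per department below are assumptions, not the template's actual schema — adapt them to whatever JSON you ask each agent to return:

```javascript
// Guardrail for the CEO merge step: verify each department agent returned
// the keys the orchestrator expects before merging. Keys are illustrative.
const REQUIRED_KEYS = {
  marketing: ['campaigns', 'channels'],
  operations: ['staffing', 'risks'],
  finance: ['budget', 'targets'],
};

function validateAgentOutput(role, output) {
  const missing = (REQUIRED_KEYS[role] || []).filter((k) => !(k in output));
  return { ok: missing.length === 0, missing };
}
```

On a failed check, the orchestrator can re-prompt the offending agent once rather than producing a plan with a hole in it.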
Manage Google Cloud Storage with AI image generation using GPT-4 Mini
# Beginner’s Tutorial: Manage Google Cloud Storage Buckets and Objects with n8n ### Watch the demo video below: [](https://www.youtube.com/watch?v=2iJfmQgVL6E) ## **Who’s it for** - Beginners who want to learn how to automate Google Cloud Storage (GCS) operations with n8n. - Developers who want to combine **AI image generation** with **cloud storage management**. - Anyone looking for a simple introduction to working with **Buckets** and **Objects** in GCS. ## **How it works / What it does** This workflow demonstrates end-to-end usage of **Google Cloud Storage** with AI integration: 1. **Trigger:** Start manually by clicking *Execute Workflow*. 2. **Edit Fields:** Provide input values (e.g., bucket name or image description). 3. **List Buckets:** Retrieve all existing buckets in the project (branch: view only). 4. **Create Bucket:** If needed, create a new bucket to store objects. 5. **Prompt Generation Agent:** Use an AI model to generate a creative text prompt. 6. **Generate Image:** Convert the AI-generated prompt into an image. 7. **Upload Object:** Store the generated image as an object in the selected bucket. 8. **Delete Object:** Clean up by removing the uploaded object if required. This shows the full lifecycle: *Bucket → Object (Create/Upload/Delete)* combined with AI image generation. ## **How to set up** 1. **Trigger the workflow:** Use the *When clicking Execute workflow* node to start manually. 2. **Provide inputs:** In *Edit Fields*, specify details such as bucket name or description text for the image. 3. **List buckets:** Use the *List Buckets* node to see what exists. 4. **Create a bucket:** Use *Create Bucket* if you want a new storage bucket. 5. **Generate prompt & image:** - The *Prompt Generation Agent* uses an OpenAI Chat Model to create an image prompt. - The *Generate an Image* node turns this prompt into an actual image. 6. **Upload to bucket:** Use *Create Object* to upload the generated image into your GCS bucket. 7. 
**Delete object (optional):** Use *Delete Object* to remove the file from the bucket as a cleanup step. ## **Requirements** - An active **Google Cloud account** with **Cloud Storage API enabled**. - A **Service Account Key (JSON)** credential added in n8n for GCS. - An **OpenAI API Key** configured in n8n for the prompt and image generation nodes. - Basic familiarity with running workflows in n8n. ## **How to customize the workflow** - **Different object types:** Instead of images, upload PDFs, logs, or text files. - **Automatic cleanup:** Skip the delete step if you want objects to persist. - **Schedule trigger:** Replace manual execution with a weekly or daily schedule. - **Dynamic prompts:** Accept user input from a form or webhook to generate images. - **Multi-bucket management:** Extend the logic to manage multiple buckets across projects. - **Notifications:** Add a Slack/Email step after upload to notify your team with the object URL. ✅ By the end of this tutorial, you’ll understand how to: - Work with **Buckets** (list, create). - Work with **Objects** (upload, delete). - Integrate **AI image generation** with Google Cloud Storage.
Send Slack alerts for AWS IAM access keys older than 365 days
# AWS IAM Access Key Rotation Reminder Automation Workflow ### Watch the demo video below: [](https://youtu.be/tW2y_dRvcs0) ## **Who’s it for** - DevOps/SRE teams responsible for AWS account security. - Security/compliance officers ensuring key rotation policies are followed. - Any AWS account owner who wants automatic detection of stale access keys. ## **How it works / What it does** 1. **Weekly Scheduler** — triggers the workflow on a recurring basis. 2. **Get Many Users** — fetches all IAM users in the AWS account. 3. **Get User Access Key(s)** — retrieves the access keys associated with each user. 4. **Filter Out Inactive Keys** — removes keys that are not active (e.g., status `Inactive`). 5. **Access Key Older Than 365 Days** — checks the key creation date and flags keys older than one year. 6. **Send Slack Message** — notifies a Slack channel with details of the outdated key(s) for review and action. 7. **No Operation** — safely ends the workflow if no keys match the condition. ## **How to set up** - Configure the **Weekly Scheduler** to run at your desired cadence (e.g., every Monday). - Use **Get Many Users** to list all IAM users. - For each user, call **ListAccessKeys** (`Get User Access Key(s)`) to fetch their key metadata. - Apply a filter to keep only keys with status `Active`. - Add a condition to compare `CreateDate` against `today - 365 days`. - Send results to Slack using the **Slack Post Message** node. ## **Requirements** - n8n (latest version). - AWS credential in n8n configured for **us-east-1** (IAM requires signing with this region). - IAM permissions: - `iam:ListUsers` - `iam:ListAccessKeys` - Slack bot credentials with permission to post messages in the desired channel. ## **How to customize the workflow** - **Change threshold** — adjust the `365 days` condition to 90, 180, or any other rotation policy. - **Escalation** — mention `@security` or create a Jira/Ticket when old keys are found. 
- **Logging** — push flagged results into a Google Sheet, database, or log management system for audit. - **Automation** — instead of only notifying, add a step to automatically deactivate keys older than the threshold (after approval). - **Multi-account support** — duplicate or loop across multiple AWS credentials if you manage several AWS accounts.
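The "older than 365 days" condition can be sketched as a Code-node helper, which also makes the threshold trivial to change for 90- or 180-day policies:

```javascript
// Flag keys whose CreateDate is older than the rotation threshold
// (365 days by default, per the workflow; adjust to match your policy).
function isKeyStale(createDate, thresholdDays = 365, now = new Date()) {
  const ageMs = now - new Date(createDate);
  return ageMs > thresholdDays * 86400000; // ms per day
}
```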
Monitor & alert on inactive AWS IAM users with Slack notifications
# AWS IAM Inactive User Automation Alert Workflow > Weekly job that finds IAM users with **no activity for > 90 days** and notifies a Slack channel. > ⚠️ **Important:** AWS SigV4 for IAM must be scoped to **`us-east-1`**. Create the AWS credential in n8n with region **us-east-1** (even if your other services run elsewhere). ## **Who’s it for** - SRE/DevOps teams that want automated IAM hygiene checks. - Security/compliance owners who need regular inactivity reports. - MSPs managing multiple AWS accounts who need lightweight alerting. ## **How it works / What it does** 1. **Weekly scheduler** – kicks off the workflow (e.g., every Monday 09:00). 2. **Get many users** – lists IAM users. 3. **Get user** – enriches each user with details (password status, MFA, etc.). 4. **Filter bad data** – drops service-linked users or items without usable dates. 5. **IAM user inactive for more than 90 days?** – keeps users whose **last activity** is older than 90 days. - Last activity is derived from any of: - `PasswordLastUsed` (console sign-in) - `AccessKeyLastUsed.LastUsedDate` (from `GetAccessKeyLastUsed` if you add it) - Fallback to `CreateDate` if no usage data exists (optional) 6. **Send a message (Slack)** – posts an alert for each inactive user. 7. **No operation** – path for users that don’t match (do nothing). ## **How to set up** 1. **Credentials** - **AWS (Predefined → AWS)** - Service: `iam` - Region: `us-east-1` ← **required for IAM** - Access/Secret (or Assume Role) with read-only IAM perms (see below). - **Slack** OAuth (bot in your target channel). ## **Requirements** - n8n (current version). - **AWS IAM permissions** (minimum): - `iam:ListUsers`, `iam:GetUser` - *(Optional for higher fidelity)* `iam:ListAccessKeys`, `iam:GetAccessKeyLastUsed` - Slack bot with permission to post in the target channel. - Network egress to `iam.amazonaws.com`. ## **How to customize the workflow** - **Change window:** set 60/120/180 days by adjusting `minus(N, 'days')`. 
- **Audit log:** append results to Google Sheets/DB with `UserName`, `Arn`, `LastActivity`, `CheckedAt`. - **Escalation:** if a user remains inactive for another cycle, mention `@security` or open a ticket. - **Auto-remediation (advanced):** on a separate approval path, disable access keys or detach policies. - **Multi-account / multi-region:** iterate a list of AWS credentials (one per account; IAM stays `us-east-1`). - **Exclude list:** add a static list or tag-based filter to skip known service users. ## **Notes & gotchas** - Many users never sign in; if you don’t pull `GetAccessKeyLastUsed`, they may look “inactive”. Add that call for accuracy. - `PasswordLastUsed` is null if console login never happened. - IAM returns timestamps in ISO or epoch—use `toDate`/`toDateTime` before comparisons.
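The last-activity derivation described above can be sketched as follows. The precedence (console sign-in, then key usage, then creation date) matches the list in the workflow description; the flat input shape is illustrative, assuming you've already merged `GetAccessKeyLastUsed` data onto each user item:

```javascript
// Derive "last activity" from PasswordLastUsed, then AccessKeyLastUsed,
// then fall back to CreateDate, and compare against the inactivity window.
function isInactive(user, days = 90, now = new Date()) {
  const last =
    user.PasswordLastUsed ||
    user.AccessKeyLastUsed ||
    user.CreateDate; // fallback when no usage data exists
  if (!last) return false; // unusable record -- filtered out upstream
  return now - new Date(last) > days * 86400000;
}
```

This is also where the gotcha above bites: without the `AccessKeyLastUsed` enrichment, API-only users fall through to `CreateDate` and may be flagged incorrectly.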
Auto-renew AWS certificates with Slack approval workflow
# AWS Certificate Manager (ACM) Auto-Renew with Slack notify & approval ## **Who’s it for** - SRE/DevOps teams managing many ACM certs. - Cloud ops who want **hands-off renewals** with an **approval step in Slack**. - MSPs that need auditable reminders and renewals on schedule. ## **How it works / What it does** 1. **Schedule Trigger** – runs daily (or your cadence). 2. **Get many certificates** – fetches ACM certs (paginate if needed). 3. **Filter: expiring in next 7 days** – keeps items where: - `NotAfter` **before** `today + 7d` - `NotBefore` **before** `today` (already valid) 4. **Send message and wait for response (Slack)** – posts a certificate summary and **pauses** until Approve/Reject. 5. **Renew a certificate** – on **Approve**, calls the renew action for the item. ## **How to set up** 1. **Credentials** - **AWS** in n8n with permissions to list/read/renew certs. - **Slack** OAuth (bot in the target channel). 2. **Schedule Trigger** - Set to run once per day (e.g., `09:00` local). 3. **Get many certificates** - Region: your ACM region(s). - If you have several regions, loop regions or run multiple branches. 4. **Filter (IF / Filter node)** - Add these two conditions (AND): - `{{ $json.NotAfter.toDateTime('s') }}` **is before** `{{ $today.plus(7,'days') }}` - `{{ $json.NotBefore.toDateTime('s') }}` **is before** `{{ $today }}` 5. **Slack → Send & Wait** - Message (text input): ``` :warning: *ACM Certificate Expiry Alert* :warning: *Domain:* {{ $json.DomainName }} *SANs:* {{ $json.SubjectAlternativeNameSummaries }} *ARN:* {{ $json.CertificateArn }} *Algo:* {{ $json.KeyAlgorithm }} *Status:* {{ $json.Status }} *Issued:* {{ $json.IssuedAt.toDateTime('s').format('yyyy-MM-dd HH:mm') }} *Expires:* {{ $json.NotAfter.toDateTime('s').format('yyyy-MM-dd HH:mm') }} Approve to start renewal. ``` - Add two buttons: **Approve** / **Reject** (the node will output which was clicked). 6. **Renew a certificate** - Map the **CertificateArn** from the Slack Approved branch.
## **Requirements** - n8n (current version with Slack *Send & Wait*). - AWS IAM permissions (read + renew ACM), e.g.: - `acm:ListCertificates`, `acm:DescribeCertificate`, `acm:RenewCertificate` (plus region access). - Slack bot with permission to post & use interactivity in the target channel. ## **How to customize the workflow** - **Window size:** change `7` to `14` or `30` days in the filter. - **Catch expired**: add an OR path `{{ $json.NotAfter.toDateTime('s') }} is before {{ $today }}` → send a **red** Slack alert. - **Auto-renew w/o approval:** bypass Slack and renew directly for low-risk domains. - **Multiple regions/accounts:** iterate over a list of regions or assume roles per account. - **Logging:** add a Google Sheet/DB append after Slack click with `user`, `time`, `result`. - **Escalation:** if no Slack response after N hours, ping `@oncall` or open a ticket. ## **Notes** - The Slack node **pauses** execution until a button is clicked—perfect for change control. - Time conversions above assume `NotAfter`/`IssuedAt` are Unix seconds (`'s'`). Adjust if your data differs.
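The two filter conditions can be sketched as a single predicate. As in the filter setup, `NotAfter`/`NotBefore` are assumed to be Unix seconds (the `'s'` conversion); note that already-expired certificates also satisfy these conditions, which is why the customization above suggests a separate "catch expired" path:

```javascript
// Keep certificates that are already valid (NotBefore in the past) and
// expire within the window (NotAfter before now + windowDays).
function expiresSoon(cert, windowDays = 7, nowSec = Date.now() / 1000) {
  return (
    cert.NotBefore < nowSec &&
    cert.NotAfter < nowSec + windowDays * 86400
  );
}
```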
Generate business requirement documents with Multi-agent GPT & Google Workspace
# Multi-agent RAG system for smarter BRD (Business Requirement Document) writing ## **Who’s it for** This workflow is designed for Business Analysts, Project Managers, and Operations Teams who need to automate the creation, tracking, and delivery of Business Requirements Documents (BRDs) from submitted forms and supporting materials. It’s ideal for organizations handling multiple BRD requests and looking to streamline document generation, archiving, and communication. ## **How it works / What it does** 1. **Trigger**: The process begins when a BRD request form is submitted along with any supporting files. Sample supporting document PDF: [Download URL](https://wisestackai.s3.ap-southeast-1.amazonaws.com/Customer+Feedback+Analysis+%26+Automation+Platform.pdf) 2. **Data Recording**: - Creates a BRD request record and appends it to a tracking Google Sheet. - Handles multiple uploaded files, saving them to Google Drive. - Creates supporting document records and updates the supporting documents tracking sheet. 3. **Content Extraction & Storage**: - Extracts text from uploaded PDF files. - Inserts extracted content into a vector store for contextual retrieval by AI agents. 4. **Document Generation**: - Uses two specialized AI agents: - **General BRD Writer Agent** for the overall document structure. - **Business Requirement Writer Agent** for detailed business requirement sections. - Both agents query the stored data and produce content, which is then merged. 5. **Metadata & File Creation**: - Configures metadata for the document. - Creates a final document file (Google Docs). 6. **Finalization**: - Converts the document to PDF [Sample output](https://wisestackai.s3.ap-southeast-1.amazonaws.com/BRD-2025-001+Customer+Feedback+Analysis+%26+Automation+Platform.pdf) - Archives the PDF in Google Drive. - Sends a BRD response email to the requester with the completed document. - Updates the request status in the Google Sheet as completed. ## **How to set up** 1. 
**Prepare Google Sheets**: - Create a BRD tracking sheet. - Create a supporting document tracking sheet. 2. **Configure Google Drive**: - Set up folders for supporting documents and archived PDFs. - Ensure the workflow has API access to upload and retrieve files. 3. **Form Integration**: - Connect your BRD request form to trigger the workflow. 4. **Vector Store**: - Configure a vector database or embedding store for extracted document text. 5. **AI Agents**: - Configure the **General BRD Writer Agent** and **Business Requirement Writer Agent** with your preferred OpenAI model. - Link both agents to the Query Data Tool for retrieving embedded content. 6. **Email Setup**: - Configure email sending credentials to deliver final BRDs to requesters. ## **Requirements** - Google Sheets API credentials. - Google Drive API credentials. - An OpenAI API key with access to the desired models. - A form submission trigger (e.g., Google Forms, Typeform). - Vector store or embedding database for contextual AI queries. - Permissions for file uploads, downloads, and updates in Google Drive. ## **How to customize the workflow** - **Custom Templates**: Modify the AI agents’ system prompts to match your organization’s BRD format and tone. - **Metadata Fields**: Add custom fields (e.g., department, priority level) during metadata configuration. - **File Storage Paths**: Adjust Google Drive folder structure for project-specific storage. - **Approval Steps**: Insert an approval workflow between draft document creation and final archiving. - **Notification Channels**: Add Slack, Microsoft Teams, or other notification integrations in addition to email. - **AI Model Selection**: Swap the OpenAI model for another LLM or fine-tuned variant to improve BRD quality for your domain.
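When recording a new request in the tracking sheet, you'll need a stable document ID. A hypothetical helper is sketched below — the `BRD-2025-001` scheme is inferred from the sample output filename and may differ from your sheet's actual convention:

```javascript
// Derive the next sequential request ID (e.g., "BRD-2025-003") from the IDs
// already recorded in the tracking sheet. Naming scheme is an assumption.
function nextBrdId(existingIds, year) {
  const nums = existingIds
    .filter((id) => id.startsWith(`BRD-${year}-`))
    .map((id) => parseInt(id.split('-')[2], 10));
  const next = (nums.length ? Math.max(...nums) : 0) + 1;
  return `BRD-${year}-${String(next).padStart(3, '0')}`;
}
```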
Store AI-generated images in AWS S3: OpenAI image creation & cloud storage
# Automating AWS S3 Operations with n8n: Buckets, Folders, and Files

### Watch the demo video below:

[Demo video](https://www.youtube.com/watch?v=el0dDJ4Ah3k)

This tutorial walks you through setting up an automated workflow that **generates AI-powered images** from prompts and securely stores them in **AWS S3**. It leverages the new **AI Tool Node** and OpenAI models for prompt-to-image generation.

## **Who’s it for**

This workflow is ideal for:

- **Designers & marketers** who need quick, on-demand AI-generated visuals.
- **Developers & automation builders** exploring **AI-driven workflows** integrated with cloud storage.
- **Educators or trainers** creating tutorials or exercises on AI image generation.
- **Businesses** looking to automate **image content pipelines** with AWS S3 storage.

## **How it works / What it does**

1. **Trigger**: The workflow starts manually when you click **“Execute Workflow”**.
2. **Edit Fields**: You can provide input fields such as image description, resolution, or naming convention.
3. **Create AWS S3 Bucket**: Automatically creates a new **S3 bucket** if it doesn’t exist.
4. **Create a Folder**: Inside the bucket, a folder is created to organize generated images.
5. **Prompt Generation Agent**: An AI agent generates or refines the image prompt using the **OpenAI Chat Model**.
6. **Generate an Image**: The refined prompt is used to generate an image with AI.
7. **Upload File to S3**: The generated image is uploaded to the AWS S3 bucket for secure storage.

This workflow showcases how to combine **AI + cloud storage** seamlessly in an automated pipeline.

## **How to set up**

1. **Import the workflow** into **n8n**.
2. Configure the following credentials:
   - **AWS S3** (Access Key, Secret Key, Region).
   - **OpenAI API Key** (for Chat + Image models).
3. Update the **Edit Fields node** with your preferred input fields (e.g., image size, description).
4. Execute the workflow and test by entering a sample image prompt (e.g., *“Futuristic city skyline in watercolor style”*).
5. Check your AWS S3 bucket to verify the uploaded image.

## **Requirements**

- **n8n** (latest version with AI Tool Node support).
- **AWS account** with S3 permissions to create buckets and upload files.
- **OpenAI API key** (for prompt refinement and image generation).
- Basic familiarity with **AWS S3 structure** (buckets, folders, objects).

## **How to customize the workflow**

- **Custom Buckets**: Replace the auto-create step with an existing S3 bucket.
- **Image Variations**: Generate multiple image variations per prompt by looping the image generation step.
- **File Naming**: Adjust file naming conventions (e.g., timestamp, user input).
- **Metadata**: Add metadata such as tags, categories, or owner info when uploading to S3.
- **Alternative Storage**: Swap AWS S3 for **Google Cloud Storage, Azure Blob, or Dropbox**.
- **Trigger Options**: Replace the manual trigger with a **Webhook, Form Submission, or Scheduler** for full automation.

✅ This workflow is a **hands-on example** of how to combine **AI prompt engineering, image generation, and cloud storage automation** into a single streamlined process.
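As a concrete illustration of the **File Naming** customization above, a small helper could derive the S3 object key from a timestamp plus a slug of the prompt. This is a sketch under stated assumptions: the folder name, the `.png` extension, and the 40-character slug limit are all choices we made here, not part of the template:

```python
import re
from datetime import datetime, timezone
from typing import Optional

def build_object_key(prompt: str, folder: str = "generated-images",
                     when: Optional[datetime] = None) -> str:
    """Derive a readable, time-sortable S3 object key from the image prompt."""
    when = when or datetime.now(timezone.utc)
    # Slugify: lowercase, collapse runs of non-alphanumerics into single hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", prompt.lower()).strip("-")[:40]
    return f"{folder}/{when:%Y%m%d-%H%M%S}-{slug}.png"
```

Timestamp-prefixed keys keep uploads unique and make them sort chronologically when listing the bucket.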
Create AI-generated books with GPT-4.1-mini, DALL-E, Google Drive and AWS S3
# Multi-Agent Book Creation Workflow with AI Tool Node, GPT-4.1-mini, and DALL-E

[Demo video](https://www.youtube.com/watch?v=o1x8Tw_7FwQ)

## **Who’s it for**

This workflow is designed for:

- **Content creators** who want to generate books or structured documents automatically.
- **Educators and trainers** who need quick course materials, eBooks, or study guides.
- **Automation enthusiasts** exploring **multi-agent systems** using the newly released **AI Tool Node** in n8n.
- **Developers** looking for a reference template to understand **orchestration of multiple AI agents** with structured output.

## **How it works / What it does**

This template demonstrates a **multi-agent orchestration system** powered by AI Tool Nodes:

1. **Trigger**: The workflow starts when a chat message is received.
2. **Book Brief Agent**: Generates the initial book concept (title, subtitle, and outline).
3. **Book Writer Agent**: Expands the outline into full content by collaborating with two sub-agents:
   - **Designer Agent** → provides layout/design suggestions.
   - **Content Writer Agent** → drafts and refines chapters.
4. **Generate Cover Image**: AI generates a custom book cover image.
5. **Upload to AWS S3**: Stores the cover image securely.
6. **Configure Metadata**: Adds metadata for title, author, and description.
7. **Build Book HTML**: Converts the markdown-based content into HTML.
8. **Upload to Google Drive**: Saves the HTML content for processing.
9. **Convert to PDF**: Transforms the book into a professional PDF.
10. **Archive to Google Drive**: The final version is archived for safe storage.

This workflow showcases **multi-agent coordination, structured parsing, and seamless integration** with cloud storage services.

## **How to set up**

1. **Import the workflow** into n8n.
2. Configure the following connections:
   - **OpenAI** (for the Book Brief, Book Writer, Designer, and Content Writer Agents).
   - **AWS S3** (for image storage).
   - **Google Drive** (for document storage & archiving).
3. Add your API keys and credentials in the **n8n credentials manager**.
4. Test the workflow by sending a sample chat message (e.g., *“Write a book about AI in education”*).
5. Verify the outputs in Google Drive (HTML + PDF) and AWS S3 (cover image).

## **Requirements**

- **n8n** (latest version with **AI Tool Node** support).
- **OpenAI API key** (to power the multi-agent models).
- **AWS account** (with an S3 bucket for storing images).
- **Google Drive integration** (for document storage and archiving).
- Basic familiarity with **workflow setup in n8n**.

## **How to customize the workflow**

- **Switch Models**: Replace `gpt-4.1-mini` with other models (faster, cheaper, or more powerful).
- **Add More Agents**: Introduce agents for **editing, fact-checking, or translation**.
- **Change Output Format**: Export to **EPUB, DOCX, or Markdown** instead of PDF.
- **Branding Options**: Modify the **cover generation prompt** to include company logos or a specific style.
- **Extend Storage**: Add **Dropbox, OneDrive, or Notion** integration for additional archiving.
- **Trigger Alternatives**: Replace the chat trigger with **form submission, webhook, or schedule-based runs**.

✅ This workflow acts as a **free, plug-and-play template** to showcase how **multi-agents + AI Tool Node** can work together to automate complex content creation pipelines.
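To make the "Build Book HTML" and "Configure Metadata" steps concrete, the merged agent output can be wrapped in a minimal HTML shell before the Google Drive upload. This is only a sketch: the function name, metadata fields, and layout are our assumptions, not the template's actual node configuration:

```python
import html

def build_book_html(title: str, author: str, body_html: str, cover_url: str) -> str:
    """Wrap generated chapter HTML in a minimal book shell with metadata."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <meta name="author" content="{html.escape(author)}">
  <title>{html.escape(title)}</title>
</head>
<body>
  <img src="{html.escape(cover_url)}" alt="Book cover">
  <h1>{html.escape(title)}</h1>
  {body_html}
</body>
</html>"""
```

Escaping the metadata fields (but not the already-rendered chapter HTML) keeps agent-generated titles from breaking the markup before the PDF conversion step.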
Generate and share professional PDFs with OpenAI, Google Docs, and Slack
# Free PDF Generator in n8n – No External Libraries or Paid Services

> A 100% free n8n workflow for generating professionally formatted PDFs without relying on external libraries or paid converters. It uses OpenAI to create Markdown content, Google Docs to format and convert to PDF, and integrates with Google Drive and Slack for archiving and sharing, making it ideal for reports, BRDs, proposals, or any document you need directly inside n8n.

### Watch the demo video below:

[Demo video](https://www.youtube.com/watch?v=BB32RPQYI94)

## **Who’s it for**

- Teams that need **auto-generated documents** (reports, guides, checklists) in PDF format.
- Operations or enablement teams who want files **archived in Google Drive** and **shared in Slack** automatically.
- Anyone experimenting with **LLM-powered document generation** integrated into business workflows.

## **How it works / What it does**

1. **Manual trigger** starts the workflow.
2. **LLM generates** a sample Markdown document (via the OpenAI Chat Model).
3. **Google Drive folder** is configured for storage.
4. **Google Doc is created** from the generated Markdown content.
5. **Document is exported to PDF** using Google Drive. ([Sample PDF generated from comprehensive markdown](https://wisestackai.s3.ap-southeast-1.amazonaws.com/12082025052957.pdf))
6. **PDF is archived** in a designated Drive folder.
7. **Archived PDF is downloaded** for sharing.
8. **Slack message is sent** with the PDF attached.

## **How to set up**

1. **Add nodes in sequence**:
   - Manual Trigger
   - OpenAI Chat Model (prompt to generate sample Markdown)
   - Set/Manual input for Google Drive folder ID(s)
   - HTTP Request or Google Drive Upload (convert to Google Docs)
   - Google Drive Download (PDF export)
   - Google Drive Upload (archive PDF)
   - Google Drive Download (fetch archived file)
   - Slack Upload (send message with attachment)
2. **Configure credentials** for OpenAI, Google Drive, and Slack.
3. **Map output fields**:
   - `data.markdown` → Google Docs creation
   - `docId` → PDF export
   - `fileId` → Slack upload
4. **Test run** to ensure the PDF is generated, archived, and posted to Slack.

## **Requirements**

- **Credentials**:
  - OpenAI API key (or compatible LLM provider)
  - Google Drive (OAuth2) with read/write permissions
  - Slack bot token with `files:write` permission
- **Access**:
  - Write access to the target Google Drive folders
  - Slack bot invited to the target channel

## **How to customize the workflow**

- **Change the prompt** in the OpenAI Chat Model to generate different types of content (reports, meeting notes, checklists).
- **Automate triggering**:
  - Replace the Manual Trigger with Cron for scheduled document generation.
  - Use a Webhook Trigger to run on demand from external apps.
- **Modify storage logic**:
  - Save both `.md` and `.pdf` versions in Google Drive.
  - Use separate folders for drafts vs. final versions.
- **Enhance distribution**:
  - Send PDFs to multiple Slack channels or via email.
  - Integrate with project management tools for automated task creation.
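The "exported to PDF using Google Drive" step is what makes this workflow free of external converters: the Drive v3 API's `files.export` endpoint converts a Google Doc to another MIME type server-side. A sketch of building that request URL from the mapped `docId` field (the helper name is ours; the endpoint and `mimeType` parameter are the standard Drive v3 API):

```python
from urllib.parse import quote, urlencode

def drive_pdf_export_url(doc_id: str) -> str:
    """Build the Drive v3 files.export URL that converts a Google Doc to PDF."""
    params = urlencode({"mimeType": "application/pdf"})
    return f"https://www.googleapis.com/drive/v3/files/{quote(doc_id)}/export?{params}"
```

In n8n this is typically handled by the Google Drive node's download operation with a Google-to-PDF conversion, but the same URL can be called from an HTTP Request node with OAuth2 credentials attached.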
Automate IT support with Telegram voice to JIRA tickets using Whisper & GPT-4.1 Mini
# 🎧 IT Voice Support Automation Bot – Telegram Voice Message to JIRA Ticket with OpenAI Whisper

> Automatically process IT support requests submitted via Telegram voice messages by transcribing them, extracting structured data, creating a JIRA ticket, and notifying relevant parties.

## 🧑💼 Who’s it for

- Internal teams that handle IT support and want to streamline voice-based requests.
- Employees who prefer using mobile/voice to report incidents or ask for support.
- Organizations aiming to integrate conversational AI into existing support workflows.

## ⚙️ How it works / What it does

1. A user sends a voice message to a Telegram bot.
2. The system checks whether it’s an audio message.
3. If valid, the audio is:
   - Downloaded
   - Transcribed via OpenAI Whisper
   - Backed up to Google Drive
4. The transcription and file metadata are merged.
5. The merged content is processed by an **AI Agent** (GPT) to extract structured request info.
6. A JIRA ticket is created using the extracted data.
7. The IT team is notified via Slack (or other channels).
8. The requester receives a Telegram confirmation message with the JIRA ticket link.
9. If the input is not audio, a polite rejection message is sent.

## 📌 Key Features

- Supports voice-based ticket creation
- Accurate transcription using Whisper
- Context-aware request parsing using GPT-4.1 mini
- Fully automated ticket creation in JIRA
- Notifies both IT and the original requester
- Cloud backup of original voice messages (Google Drive)

## 🛠️ Setup Instructions

### Prerequisites

| Component | Required |
|-----------|----------|
| Telegram Bot & API Key | ✅ |
| OpenAI Whisper / Transcription Model | ✅ |
| Google Drive Credentials (OAuth2) | ✅ |
| Google Sheets or other storage (optional) | ⬜ |
| JIRA Cloud API Access | ✅ |
| Slack Bot or Webhook | ✅ |

### Workflow Steps

1. **Telegram Voice Message Trigger**: Starts the flow when a user sends a voice message.
2. **Is Audio Message?**: If false → reply that only voice messages are supported.
3. **Download Audio**: Download the `.oga` file from Telegram.
4. **Transcribe Audio**: Use OpenAI Whisper to get a text transcript.
5. **Backup to Google Drive**: Upload the original voice file with metadata.
6. **Merge Results**: Combine the transcript and metadata.
7. **Pre-process Output**: Clean formatting before AI extraction.
8. **Transcript Processing Agent**: A GPT-based agent extracts:
   - Requester name and department
   - Request title & description
   - Priority & request type
9. **Submit JIRA Request Ticket**: Create a ticket from the AI-extracted data.
10. **Setup Slack / Email / Manual Steps**: Optional internal routing or approvals.
11. **Inform Reporter via Telegram**: Sends a confirmation message with the JIRA ticket link.

## 🔧 How to Customize

- Replace JIRA with Zendesk, GitHub Issues, or another ticketing tool.
- Change Slack to Microsoft Teams or email.
- Add Notion/Airtable logging.
- Enhance the agent to extract the department from the user ID or metadata.

## 📦 Requirements

| Integration | Notes |
|-------------|-------|
| Telegram Bot | Used for input/output |
| Google Drive | Audio backup |
| OpenAI GPT + Whisper | Transcription & extraction |
| JIRA | Ticketing platform |
| Slack | Team notification |

Built with ❤️ using [n8n](https://n8n.io)
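To make the "Submit JIRA Request Ticket" step concrete, the AI-extracted fields can be mapped onto JIRA Cloud's create-issue request body (REST API v3, which expects the description in Atlassian Document Format). A sketch under stated assumptions: the extracted-field key names (`title`, `description`, `priority`, `request_type`) are our guesses at the agent's output schema, not fields defined by the template:

```python
def build_jira_payload(project_key: str, extracted: dict) -> dict:
    """Map AI-extracted request fields onto a JIRA Cloud v3 create-issue body."""
    return {
        "fields": {
            "project": {"key": project_key},
            "summary": extracted["title"],
            "issuetype": {"name": extracted.get("request_type", "Task")},
            "priority": {"name": extracted.get("priority", "Medium")},
            # JIRA Cloud v3 requires the description as an ADF document.
            "description": {
                "type": "doc",
                "version": 1,
                "content": [{
                    "type": "paragraph",
                    "content": [{"type": "text",
                                 "text": extracted["description"]}],
                }],
            },
        }
    }
```

The resulting dict would be POSTed to `/rest/api/3/issue`; in n8n the JIRA node handles this, but the same shape applies if you swap in an HTTP Request node.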