Aadarsh Jain
Analyze documents & web content with GPT-4o Q&A assistant
# Document Analyzer and Q&A Workflow

AI-powered document and web page analysis using n8n and a GPT model. Ask questions about any local file or web URL and get intelligent, formatted answers.

## Who's it for

Perfect for researchers, developers, content analysts, students, and anyone who needs quick insights from documents or web pages without uploading files to external services.

## What it does

- **Analyzes local files**: PDF, Markdown, Text, JSON, YAML, Word docs
- **Fetches web content**: Documentation sites, blogs, articles
- **Answers questions**: Using a GPT model with structured, well-formatted responses

**Input format:** `path_or_url | your_question`

**Examples:**

```
/Users/docs/readme.md | What are the installation steps?
https://n8n.io | What is n8n?
```

## Setup

1. Import the workflow into n8n
2. Add your OpenAI API key to credentials
3. Link the credential to the "OpenAI Document Analyzer" node
4. Activate the workflow
5. Start chatting!

## Customize

- **Change AI model** → Edit the "OpenAI Document Analyzer" node (switch to `gpt-4o-mini` for cost savings)
- **Adjust content length** → Modify `maxLength` in the "Process Document Content" node (default: 15000 chars)
- **Add file types** → Update the `supportedTypes` array in the "Parse Document & Question" node
- **Increase timeout** → Change the timeout value in the "Fetch Web Content" node (default: 30s)

---
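To illustrate the `path_or_url | your_question` contract, here is a minimal sketch of how the input-splitting step might look in an n8n Code node. The function name, error messages, and the exact `supportedTypes` entries are assumptions for illustration, not the workflow's actual node code.

```javascript
// Hypothetical sketch of the "Parse Document & Question" step:
// split the chat input on the first "|", then decide whether the
// source is a URL or a local file with a supported extension.
const supportedTypes = ['pdf', 'md', 'txt', 'json', 'yaml', 'yml', 'docx'];

function parseInput(raw) {
  const sep = raw.indexOf('|');
  if (sep === -1) {
    throw new Error('Expected format: path_or_url | your_question');
  }
  const source = raw.slice(0, sep).trim();
  const question = raw.slice(sep + 1).trim();
  const isUrl = /^https?:\/\//i.test(source);
  if (!isUrl) {
    const ext = source.split('.').pop().toLowerCase();
    if (!supportedTypes.includes(ext)) {
      throw new Error(`Unsupported file type: ${ext}`);
    }
  }
  return { source, question, isUrl };
}
```

With this shape, both README examples parse cleanly: the URL example routes to the web-fetch branch, and the `.md` path routes to the local-file branch.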
# Kubernetes Management with Natural Language Using GPT-4o and MCP Tools
## Who is this for?

This workflow is designed for DevOps engineers, platform engineers, and Kubernetes administrators who want to interact with their Kubernetes clusters through natural language queries in n8n. It's perfect for teams who need quick cluster insights without memorizing complex kubectl commands or manually switching between multiple cluster contexts.

## How it works

The workflow operates in three stages:

1. **Cluster Discovery & Context Switching** - Automatically lists the clusters available in your kubeconfig and switches to the appropriate one based on your natural language query
2. **Command Generation** - Uses GPT-4o to analyze your request and generate the correct kubectl command with the proper flags, selectors, and output formatting
3. **Command Execution** - Runs the generated kubectl command against the selected cluster and returns the results

The workflow supports multi-cluster environments and can handle queries like:

- "Show me all pods in the production cluster"
- "List failing deployments in production"
- "Get pod details in the kube-system namespace"

## Setup

1. **Clone the MCP server**
   ```bash
   git clone https://github.com/aadarshjain/kubectl-mcp-server
   cd kubectl-mcp-server
   ```
2. **Configure your kubeconfig** - Ensure your `~/.kube/config` contains all the clusters you want to access
3. **Set up MCP STDIO credentials** in n8n
   - Command: `/full/path/to/python-package`
   - Arguments: `/full/path/to/kubectl-mcp-server/server.py`
4. **Import the workflow** into your n8n instance
5. **Configure OpenAI credentials** for the GPT-4o models
6. **Test the workflow** using the chat interface with queries like "show pods in [cluster-name]"
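The three stages above roughly correspond to plain kubectl invocations. The sketch below shows that mapping; the context name `production` and the final query are examples, not commands the workflow is guaranteed to emit.

```shell
# Hypothetical mapping of the workflow's three stages to kubectl calls:
kubectl config get-contexts -o name     # 1. discover clusters in ~/.kube/config
kubectl config use-context production   # 2. switch to the cluster named in the query
kubectl get pods --all-namespaces       # 3. run the GPT-4o-generated command
```

In the workflow these calls are issued through the MCP server rather than a local shell, so your kubeconfig must be readable by the process that launches `server.py`.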