{"workflow":{"id":14035,"name":"Monitor PostgreSQL data quality and generate remediation alerts with Slack","views":15,"recentViews":0,"totalViews":15,"createdAt":"2026-03-14T08:56:22.414Z","description":"# Autonomous PostgreSQL Data Quality Monitoring & Remediation\n\n## Overview\n\nThis workflow automatically monitors **PostgreSQL database data quality** and detects structural or statistical anomalies before they impact analytics, pipelines, or applications.\n\nRunning every **6 hours**, it scans database metadata, table statistics, and historical baselines to identify:\n\n- Schema drift\n- Null value explosions\n- Abnormal data distributions\n\nDetected issues are evaluated using a **confidence scoring system** that considers severity, frequency, and affected data volume. When issues exceed the defined threshold, the workflow generates **SQL remediation suggestions**, logs the issue to an audit table, and sends alerts to Slack.\n\nThis automation enables teams to **proactively maintain database reliability**, detect unexpected schema changes, and quickly respond to data quality problems.\n\n---\n\n## How It Works\n\n### 1. Scheduled Monitoring\n\nA **Schedule Trigger** starts the workflow every 6 hours to run automated database quality checks.\n\n### 2. Metadata & Statistics Collection\n\nThe workflow retrieves important metadata from PostgreSQL:\n\n- **Schema metadata** from `information_schema.columns`\n- **Table statistics** from `pg_stat_user_tables`\n- **Historical baselines** from a baseline tracking table\n\nThese datasets allow the workflow to compare current database conditions against historical norms.\n\n### 3. 
Data Quality Detection Engine\n\nThree parallel detection checks analyze the database:\n\n**Schema Drift Detection**\n- Identifies new tables or columns\n- Detects removed columns or tables\n- Detects datatype or nullability changes\n\n**Null Explosion Detection**\n- Calculates null percentage per column\n- Flags columns exceeding configured null thresholds\n\n**Outlier Distribution Detection**\n- Compares current column statistics against historical baselines\n- Uses statistical deviation (z-score) to detect abnormal distributions\n\n### 4. Issue Aggregation & Confidence Scoring\n\nAll detected issues are aggregated and evaluated using a **confidence scoring system** based on:\n\n- Severity of the issue\n- Data volume affected\n- Historical frequency\n- Consistency of detection\n\nOnly issues above the configured confidence threshold proceed to remediation.\n\n### 5. SQL Remediation Suggestions\n\nFor high-confidence issues, the workflow automatically generates **SQL investigation or remediation queries**, such as:\n\n- ALTER TABLE fixes\n- NULL cleanup queries\n- Outlier review queries\n\n### 6. Logging & Alerting\n\nConfirmed issues are:\n\n- Stored in a **PostgreSQL audit table**\n- Sent as alerts to **Slack**\n\n### 7. Baseline Updates\n\nFinally, the workflow updates the **data quality baseline table**, improving anomaly detection accuracy in future runs.\n\n---\n\n## Setup Instructions\n\n1. Configure a **PostgreSQL credential** in n8n.\n2. Replace `&lt;target schema name&gt;` in the SQL queries with your database schema.\n3. Create the following tables in PostgreSQL:\n\n**Audit Table**\n\n`data_quality_audit`\n\nStores detected data quality issues and remediation suggestions.\n\n**Baseline Table**\n\n`data_quality_baselines`\n\nStores historical statistics used for anomaly detection.\n\n4. Configure your **Slack credential**.\n5. 
Replace the placeholder Slack channel ID in the **Send Alert to Team** node.\n\nOptional configuration parameters can be modified in the **Workflow Configuration** node:\n\n- `confidenceThreshold`\n- `maxNullPercentage`\n- `outlierStdDevThreshold`\n- `auditTableName`\n- `baselineTableName`\n\n---\n\n## Use Cases\n\n### Database Reliability Monitoring\nDetect unexpected schema changes or structural modifications in production databases.\n\n### Data Pipeline Validation\nIdentify anomalies in datasets used by ETL pipelines before they propagate errors downstream.\n\n### Analytics Data Quality Monitoring\nPrevent reporting inaccuracies caused by missing data or abnormal values.\n\n### Production Database Observability\nProvide automated alerts when critical database quality issues occur.\n\n### Data Governance & Compliance\nMaintain a historical audit log of database quality issues and remediation actions.\n\n---\n\n## Requirements\n\nThis workflow requires the following services:\n\n- **PostgreSQL Database**\n- **Slack Workspace**\n- **n8n**\n\nNodes used:\n\n- Schedule Trigger\n- Set\n- Postgres\n- Code (Python)\n- Aggregate\n- IF\n- Slack\n\n---\n\n## Key Features\n\n- Automated **database health monitoring**\n- **Schema drift detection**\n- **Null explosion detection**\n- **Statistical anomaly detection**\n- Confidence-based issue filtering\n- Automated **SQL remediation suggestions**\n- Slack alerting\n- Historical **baseline learning system**\n\n---\n\n## Summary\n\nThis workflow provides an **automated data quality monitoring system for PostgreSQL**. 
It continuously analyzes schema structure, column statistics, and historical baselines to detect anomalies, generate remediation suggestions, and notify teams in real time.\n\nBy automating database quality checks, teams can **identify issues early, reduce debugging time, and maintain reliable data pipelines**.","workflow":{"meta":{"instanceId":"48aac30adfc5487a33ef16e0e958096f27cef40df3ed0febcbe1ca199e789786"},"nodes":[{"id":"b2fcf63e-b2a0-4d0a-8df1-6e06e2b3dc1e","name":"Schedule DB Quality Scan","type":"n8n-nodes-base.scheduleTrigger","position":[-672,224],"parameters":{"rule":{"interval":[{"field":"hours","hoursInterval":6}]}},"typeVersion":1.3},{"id":"bad30ec6-11fa-4d9c-b66f-f685546fb82d","name":"Workflow Configuration","type":"n8n-nodes-base.set","position":[-464,224],"parameters":{"options":{},"assignments":{"assignments":[{"id":"id-1","name":"confidenceThreshold","type":"number","value":0.85},{"id":"id-2","name":"maxNullPercentage","type":"number","value":0.15},{"id":"id-3","name":"outlierStdDevThreshold","type":"number","value":3},{"id":"id-4","name":"auditTableName","type":"string","value":"data_quality_audit"},{"id":"id-5","name":"baselineTableName","type":"string","value":"data_quality_baselines"}]},"includeOtherFields":true},"typeVersion":3.4},{"id":"b3eebfd6-88bb-435f-80d7-411fb423f546","name":"Get Schema Metadata","type":"n8n-nodes-base.postgres","position":[-128,-48],"parameters":{"query":"SELECT table_name, column_name, data_type, is_nullable FROM information_schema.columns WHERE table_schema = <__PLACEHOLDER_VALUE__target schema name__> ORDER BY table_name, ordinal_position","options":{},"operation":"executeQuery"},"typeVersion":2.6},{"id":"40d7ac18-bef8-47f5-964a-0bbe59107125","name":"Get Table Statistics","type":"n8n-nodes-base.postgres","position":[-128,224],"parameters":{"query":"SELECT schemaname, tablename, n_live_tup as row_count, n_dead_tup as dead_rows FROM pg_stat_user_tables WHERE schemaname = <__PLACEHOLDER_VALUE__target schema 
name__>","options":{},"operation":"executeQuery"},"typeVersion":2.6},{"id":"5a61e58d-fd77-4bc0-aef0-eabc3fc410a9","name":"Get Historical Baselines","type":"n8n-nodes-base.postgres","position":[-128,416],"parameters":{"query":"{{ 'SELECT * FROM ' + $('Workflow Configuration').first().json.baselineTableName + ' ORDER BY recorded_at DESC LIMIT 1' }}","options":{},"operation":"executeQuery"},"typeVersion":2.6},{"id":"42867364-5006-44b2-ab56-3f7c0897fcb8","name":"Detect Schema Drift","type":"n8n-nodes-base.code","position":[208,-48],"parameters":{"language":"pythonNative","pythonCode":"from datetime import datetime\nimport json\n\n# Get current schema metadata from Get Schema Metadata node\ncurrent_schema = _input.all()[0].get('json', {})\n\n# Get historical baselines from Get Historical Baselines node\nhistorical_data = _input.all()[1].get('json', {})\n\nissues = []\n\n# Parse current schema and historical baselines\nif isinstance(current_schema, list):\n    current_tables = {item.get('table_name'): item for item in current_schema}\nelse:\n    current_tables = {current_schema.get('table_name'): current_schema}\n\nif isinstance(historical_data, list):\n    historical_tables = {item.get('table_name'): item for item in historical_data}\nelse:\n    historical_tables = {historical_data.get('table_name'): historical_data}\n\ntimestamp = datetime.utcnow().isoformat()\n\n# Check for schema drift\nfor table_name, current_table in current_tables.items():\n    if table_name not in historical_tables:\n        # New table detected\n        issues.append({\n            'type': 'schema_drift',\n            'severity': 'medium',\n            'table_name': table_name,\n            'column_name': None,\n            'change_description': f'New table {table_name} detected',\n            'detected_at': timestamp\n        })\n        continue\n    \n    historical_table = historical_tables[table_name]\n    current_columns = current_table.get('columns', {})\n    historical_columns = 
historical_table.get('columns', {})\n    \n    # Check for added columns\n    for col_name, col_info in current_columns.items():\n        if col_name not in historical_columns:\n            issues.append({\n                'type': 'schema_drift',\n                'severity': 'low',\n                'table_name': table_name,\n                'column_name': col_name,\n                'change_description': f'New column {col_name} added to {table_name}',\n                'detected_at': timestamp\n            })\n    \n    # Check for removed columns\n    for col_name in historical_columns:\n        if col_name not in current_columns:\n            issues.append({\n                'type': 'schema_drift',\n                'severity': 'high',\n                'table_name': table_name,\n                'column_name': col_name,\n                'change_description': f'Column {col_name} removed from {table_name}',\n                'detected_at': timestamp\n            })\n    \n    # Check for data type changes\n    for col_name, col_info in current_columns.items():\n        if col_name in historical_columns:\n            current_type = col_info.get('data_type')\n            historical_type = historical_columns[col_name].get('data_type')\n            \n            if current_type != historical_type:\n                issues.append({\n                    'type': 'schema_drift',\n                    'severity': 'high',\n                    'table_name': table_name,\n                    'column_name': col_name,\n                    'change_description': f'Data type changed from {historical_type} to {current_type} for {col_name} in {table_name}',\n                    'detected_at': timestamp\n                })\n            \n            # Check for nullability changes\n            current_nullable = col_info.get('is_nullable', True)\n            historical_nullable = historical_columns[col_name].get('is_nullable', True)\n            \n            if current_nullable != 
historical_nullable:\n                nullable_change = 'nullable' if current_nullable else 'not nullable'\n                issues.append({\n                    'type': 'schema_drift',\n                    'severity': 'medium',\n                    'table_name': table_name,\n                    'column_name': col_name,\n                    'change_description': f'Nullability changed to {nullable_change} for {col_name} in {table_name}',\n                    'detected_at': timestamp\n                })\n\n# Check for removed tables\nfor table_name in historical_tables:\n    if table_name not in current_tables:\n        issues.append({\n            'type': 'schema_drift',\n            'severity': 'critical',\n            'table_name': table_name,\n            'column_name': None,\n            'change_description': f'Table {table_name} has been removed',\n            'detected_at': timestamp\n        })\n\nreturn issues"},"typeVersion":2},{"id":"7ac59653-ed75-41ac-b19d-02e2e2beeee3","name":"Detect Null Explosions","type":"n8n-nodes-base.code","position":[208,160],"parameters":{"language":"pythonNative","pythonCode":"from datetime import datetime\nimport json\n\n# Get configuration from Workflow Configuration node\nconfig = _input.first().get('json', {})\ntables_to_monitor = config.get('tablesToMonitor', [])\nmax_null_percentage = config.get('maxNullPercentage', 5.0)\n\n# The configuration may express the threshold as a fraction (e.g. 0.15);\n# normalize to percent so it matches null_percentage computed below\nif max_null_percentage <= 1:\n    max_null_percentage = max_null_percentage * 100\n\n# Get table statistics from Get Table Statistics node\ntable_stats = _input.all()\n\nissues = []\n\nfor stat in table_stats:\n    table_name = stat.get('json', {}).get('table_name')\n    columns_data = stat.get('json', {}).get('columns', [])\n    \n    # Check if this table should be monitored (an empty list monitors all tables)\n    if tables_to_monitor and table_name not in tables_to_monitor:\n        continue\n    \n    # Analyze each column for null explosions\n    for column in columns_data:\n        column_name = column.get('column_name')\n        null_count = column.get('null_count', 0)\n        total_count = column.get('total_count', 0)\n        \n        if total_count == 0:\n         
   continue\n        \n        null_percentage = (null_count / total_count) * 100\n        \n        # Check if null percentage exceeds threshold\n        if null_percentage > max_null_percentage:\n            severity = 'critical' if null_percentage > max_null_percentage * 2 else 'high'\n            \n            issues.append({\n                'type': 'null_explosion',\n                'severity': severity,\n                'table_name': table_name,\n                'column_name': column_name,\n                'null_percentage': round(null_percentage, 2),\n                'threshold': max_null_percentage,\n                'detected_at': datetime.utcnow().isoformat()\n            })\n\nreturn issues"},"typeVersion":2},{"id":"d2790027-d863-4126-9ce8-c2998809ae3a","name":"Detect Outlier Distributions","type":"n8n-nodes-base.code","position":[208,384],"parameters":{"language":"pythonNative","pythonCode":"from datetime import datetime\nimport math\n\n# Get configuration from Workflow Configuration node\nconfig = _input.first().get('json', {})\noutlier_threshold = config.get('outlierStdDevThreshold', 3)\n\n# Get table statistics from Get Table Statistics node\ntable_stats = []\nfor item in _input.all():\n    if 'table_name' in item.get('json', {}):\n        table_stats.append(item['json'])\n\n# Get historical baselines from Get Historical Baselines node\nbaselines = {}\nfor item in _input.all():\n    baseline = item.get('json', {})\n    if 'table_name' in baseline and 'column_name' in baseline:\n        key = f\"{baseline['table_name']}.{baseline['column_name']}\"\n        baselines[key] = baseline\n\nissues = []\n\n# Analyze each table's statistics\nfor stat in table_stats:\n    table_name = stat.get('table_name')\n    column_name = stat.get('column_name')\n    column_type = stat.get('column_type', '')\n    \n    # Only analyze numeric columns\n    if not any(t in column_type.lower() for t in ['int', 'float', 'numeric', 'decimal', 'double']):\n        continue\n    
\n    current_mean = stat.get('mean')\n    current_stddev = stat.get('stddev')\n    current_count = stat.get('row_count', 0)\n    \n    if current_mean is None or current_stddev is None:\n        continue\n    \n    # Get historical baseline\n    baseline_key = f\"{table_name}.{column_name}\"\n    baseline = baselines.get(baseline_key, {})\n    baseline_mean = baseline.get('baseline_mean')\n    baseline_stddev = baseline.get('baseline_stddev')\n    \n    if baseline_mean is None or baseline_stddev is None or baseline_stddev == 0:\n        continue\n    \n    # Calculate z-score for the current mean compared to baseline\n    z_score = abs((current_mean - baseline_mean) / baseline_stddev)\n    \n    # Detect outliers based on threshold\n    if z_score > outlier_threshold:\n        # Estimate outlier count (simplified approach)\n        outlier_count = int(current_count * (z_score / 10))  # Rough estimate\n        \n        # Determine severity based on z-score\n        if z_score > outlier_threshold * 2:\n            severity = 'critical'\n        elif z_score > outlier_threshold * 1.5:\n            severity = 'high'\n        else:\n            severity = 'medium'\n        \n        issues.append({\n            'type': 'outlier_distribution',\n            'severity': severity,\n            'table_name': table_name,\n            'column_name': column_name,\n            'outlier_count': outlier_count,\n            'z_score': round(z_score, 2),\n            'current_mean': round(current_mean, 2),\n            'baseline_mean': round(baseline_mean, 2),\n            'current_stddev': round(current_stddev, 2),\n            'detected_at': datetime.utcnow().isoformat()\n        })\n\nreturn issues"},"typeVersion":2},{"id":"dbe44b7a-936f-44cd-a376-888c71f5e242","name":"Combine All 
Issues","type":"n8n-nodes-base.aggregate","position":[464,128],"parameters":{"options":{},"aggregate":"aggregateAllItemData","destinationFieldName":"issues"},"typeVersion":1},{"id":"7835d9d6-b483-496e-a8f1-e4861d29fed2","name":"Calculate Confidence Scores","type":"n8n-nodes-base.code","position":[640,128],"parameters":{"language":"pythonNative","pythonCode":"# Calculate confidence scores for each data quality issue\n# Score based on: severity, historical frequency, data volume affected, and consistency\n\nimport json\n\n# Get all items from previous node\nitems = _input.all()\n\n# The upstream Aggregate node wraps every issue in a single item under 'issues';\n# score each issue in place so returning the items preserves the aggregated shape\nissues_list = []\nfor item in items:\n    issues_list.extend(item['json'].get('issues', []))\n\n# Process each issue and calculate confidence score\nfor issue in issues_list:\n    \n    # Initialize score components (0-1 scale)\n    severity_score = 0\n    frequency_score = 0\n    volume_score = 0\n    consistency_score = 0\n    \n    # 1. Severity Score (based on issue type and impact)\n    severity = issue.get('severity', 'medium').lower()\n    if severity == 'critical':\n        severity_score = 1.0\n    elif severity == 'high':\n        severity_score = 0.8\n    elif severity == 'medium':\n        severity_score = 0.5\n    else:\n        severity_score = 0.3\n    \n    # 2. Historical Frequency Score (how often this issue occurs)\n    historical_count = issue.get('historical_count', 0)\n    if historical_count >= 10:\n        frequency_score = 1.0\n    elif historical_count >= 5:\n        frequency_score = 0.7\n    elif historical_count >= 2:\n        frequency_score = 0.5\n    else:\n        frequency_score = 0.2\n    \n    # 3. 
Data Volume Score (percentage of data affected)\n    affected_rows = issue.get('affected_rows', 0)\n    total_rows = issue.get('total_rows', 1)\n    affected_percentage = (affected_rows / total_rows) * 100 if total_rows > 0 else 0\n    \n    if affected_percentage >= 50:\n        volume_score = 1.0\n    elif affected_percentage >= 25:\n        volume_score = 0.8\n    elif affected_percentage >= 10:\n        volume_score = 0.6\n    elif affected_percentage >= 1:\n        volume_score = 0.4\n    else:\n        volume_score = 0.2\n    \n    # 4. Consistency Score (detected across multiple checks)\n    detection_count = issue.get('detection_count', 1)\n    if detection_count >= 3:\n        consistency_score = 1.0\n    elif detection_count >= 2:\n        consistency_score = 0.7\n    else:\n        consistency_score = 0.4\n    \n    # Calculate weighted confidence score\n    # Weights: severity (40%), volume (30%), frequency (20%), consistency (10%)\n    confidence_score = (\n        severity_score * 0.4 +\n        volume_score * 0.3 +\n        frequency_score * 0.2 +\n        consistency_score * 0.1\n    )\n    \n    # Round to 3 decimal places\n    confidence_score = round(confidence_score, 3)\n    \n    # Add confidence score to the issue\n    issue['confidence_score'] = confidence_score\n    issue['confidence_breakdown'] = {\n        'severity': severity_score,\n        'volume': volume_score,\n        'frequency': frequency_score,\n        'consistency': consistency_score\n    }\n\n# Return all items with confidence scores\nreturn items"},"typeVersion":2},{"id":"ba0f352a-1a7c-4c62-9c2b-bc3a9c42d730","name":"Check Confidence Threshold","type":"n8n-nodes-base.if","position":[800,128],"parameters":{"options":{},"conditions":{"options":{"version":3,"leftValue":"","caseSensitive":true,"typeValidation":"strict"},"combinator":"and","conditions":[{"id":"id-1","operator":{"type":"array","operation":"notEmpty"},"leftValue":"={{ $json.issues 
}}"},{"id":"id-2","operator":{"type":"boolean","operation":"true"},"leftValue":"={{ $json.issues.some(issue => issue.confidence_score >= $('Workflow Configuration').first().json.confidenceThreshold) }}"}]}},"typeVersion":2.3},{"id":"fecc8343-1930-4730-9fc0-8760aaf6097b","name":"Generate SQL Fixes","type":"n8n-nodes-base.code","position":[1088,112],"parameters":{"language":"pythonNative","pythonCode":"# Generate SQL fix suggestions for each high-confidence issue\nimport json\n\n# Get all items from previous node\nitems = _input.all()\n\n# The IF node passes the aggregated item through; unwrap it so each issue\n# becomes its own output item for the audit log and Slack alert downstream\nissues_list = []\nfor item in items:\n    issues_list.extend(item['json'].get('issues', []))\n\nresults = []\n\nfor issue in issues_list:\n    # Detection nodes emit the issue category under the 'type' key\n    issue_type = issue.get('type', '')\n    table_name = issue.get('table_name', '')\n    column_name = issue.get('column_name', '')\n    \n    sql_fix = ''\n    rollback_sql = ''\n    \n    if issue_type == 'schema_drift':\n        # Generate ALTER TABLE statements for schema changes\n        drift_details = issue.get('details', {})\n        change_type = drift_details.get('change_type', '')\n        \n        if change_type == 'column_added':\n            # Suggest documenting or validating the new column\n            sql_fix = f\"-- New column detected: {column_name}\\nCOMMENT ON COLUMN {table_name}.{column_name} IS 'Added on {drift_details.get(\\\"detected_at\\\", \\\"unknown\\\")}';\\n\"\n            rollback_sql = f\"ALTER TABLE {table_name} DROP COLUMN IF EXISTS {column_name};\"\n        elif change_type == 'column_removed':\n            # Suggest adding back the column if needed\n            sql_fix = f\"-- Column removed: {column_name}\\n-- Review if this was intentional\\nALTER TABLE {table_name} ADD COLUMN {column_name} {drift_details.get('data_type', 'TEXT')};\"\n            rollback_sql = f\"ALTER TABLE {table_name} DROP COLUMN IF EXISTS {column_name};\"\n        elif change_type == 'type_changed':\n            old_type = drift_details.get('old_type', 'TEXT')\n            new_type = drift_details.get('new_type', 'TEXT')\n            sql_fix = f\"-- Type 
change detected: {old_type} -> {new_type}\\nALTER TABLE {table_name} ALTER COLUMN {column_name} TYPE {old_type} USING {column_name}::{old_type};\"\n            rollback_sql = f\"ALTER TABLE {table_name} ALTER COLUMN {column_name} TYPE {new_type} USING {column_name}::{new_type};\"\n    \n    elif issue_type == 'null_explosion':\n        # Generate UPDATE or constraint additions for null explosions\n        null_percentage = issue.get('null_percentage', 0)\n        \n        if null_percentage > 50:\n            # High null rate - suggest adding NOT NULL constraint or default value\n            sql_fix = f\"-- High null rate detected ({null_percentage}%)\\n-- Option 1: Add default value\\nUPDATE {table_name} SET {column_name} = '<default_value>' WHERE {column_name} IS NULL;\\nALTER TABLE {table_name} ALTER COLUMN {column_name} SET DEFAULT '<default_value>';\\n\\n-- Option 2: Add NOT NULL constraint (after fixing nulls)\\n-- ALTER TABLE {table_name} ALTER COLUMN {column_name} SET NOT NULL;\"\n            rollback_sql = f\"ALTER TABLE {table_name} ALTER COLUMN {column_name} DROP NOT NULL;\\nALTER TABLE {table_name} ALTER COLUMN {column_name} DROP DEFAULT;\"\n        else:\n            # Moderate null rate - suggest investigation\n            sql_fix = f\"-- Investigate null values\\nSELECT * FROM {table_name} WHERE {column_name} IS NULL LIMIT 100;\\n\\n-- If appropriate, update nulls:\\n-- UPDATE {table_name} SET {column_name} = '<value>' WHERE {column_name} IS NULL;\"\n            rollback_sql = f\"-- Manual rollback required based on UPDATE operation\"\n    \n    elif issue_type == 'outlier_distribution':\n        # Generate data cleanup queries for outliers\n        outlier_details = issue.get('details', {})\n        outlier_count = outlier_details.get('outlier_count', 0)\n        \n        sql_fix = f\"-- Outlier distribution detected\\n-- Review outlier values:\\nSELECT {column_name}, COUNT(*) as frequency\\nFROM {table_name}\\nGROUP BY {column_name}\\nORDER BY 
frequency DESC;\\n\\n-- Option 1: Cap extreme values\\n-- UPDATE {table_name} SET {column_name} = <max_threshold> WHERE {column_name} > <max_threshold>;\\n-- UPDATE {table_name} SET {column_name} = <min_threshold> WHERE {column_name} < <min_threshold>;\\n\\n-- Option 2: Remove outliers (use with caution)\\n-- DELETE FROM {table_name} WHERE {column_name} > <max_threshold> OR {column_name} < <min_threshold>;\"\n        rollback_sql = f\"-- Backup data before cleanup:\\n-- CREATE TABLE {table_name}_backup AS SELECT * FROM {table_name};\\n-- Restore: INSERT INTO {table_name} SELECT * FROM {table_name}_backup;\"\n    \n    # Add SQL fix fields to the issue\n    issue['sql_fix'] = sql_fix\n    issue['rollback_sql'] = rollback_sql\n    \n    results.append({'json': issue})\n\nreturn results"},"typeVersion":2},{"id":"8f05dfe2-f1b7-4cf4-8645-887cb7c9bb12","name":"Store Issue in Audit Log","type":"n8n-nodes-base.postgres","position":[1344,112],"parameters":{"table":{"__rl":true,"mode":"name","value":"={{ $('Workflow Configuration').first().json.auditTableName }}"},"schema":{"__rl":true,"mode":"list","value":"public"},"columns":{"value":{"sql_fix":"={{ $json.sql_fix }}","severity":"={{ $json.severity }}","issue_type":"={{ $json.type }}","table_name":"={{ $json.table_name }}","column_name":"={{ $json.column_name }}","description":"={{ $json.change_description }}","detected_at":"={{ $json.detected_at }}","confidence_score":"={{ $json.confidence_score 
}}"},"schema":[{"id":"issue_type","required":false,"displayName":"issue_type","defaultMatch":true,"canBeUsedToMatch":true},{"id":"table_name","required":false,"displayName":"table_name","defaultMatch":true,"canBeUsedToMatch":true},{"id":"column_name","required":false,"displayName":"column_name","defaultMatch":true,"canBeUsedToMatch":true},{"id":"severity","required":false,"displayName":"severity","defaultMatch":true,"canBeUsedToMatch":true},{"id":"confidence_score","required":false,"displayName":"confidence_score","defaultMatch":true,"canBeUsedToMatch":true},{"id":"description","required":false,"displayName":"description","defaultMatch":true,"canBeUsedToMatch":true},{"id":"sql_fix","required":false,"displayName":"sql_fix","defaultMatch":true,"canBeUsedToMatch":true},{"id":"detected_at","required":false,"displayName":"detected_at","defaultMatch":true,"canBeUsedToMatch":true}],"mappingMode":"defineBelow","matchingColumns":["issue_type","table_name","column_name","severity","confidence_score","description","sql_fix","detected_at"]},"options":{}},"typeVersion":2.6},{"id":"bd9b5a43-277f-4b70-940f-ea546ab079cd","name":"Send Alert to Team","type":"n8n-nodes-base.slack","position":[1552,112],"webhookId":"92274cac-0b17-41a6-8eef-abe8a43e3de8","parameters":{"text":"=🚨 **Data Quality Alert**\n\n**Severity:** {{ $json.severity }}\n**Confidence Score:** {{ $json.confidence_score }}\n\n**Affected Resources:**\n• Table: {{ $json.table_name }}\n• Column: {{ $json.column_name }}\n\n**Issue Details:**\n{{ $json.change_description }}\n\n**Suggested SQL Fix:**\n```sql\n{{ $json.sql_fix }}\n```\n\n**Detection Time:** {{ $json.detected_at }}","select":"channel","channelId":{"__rl":true,"mode":"id","value":"<__PLACEHOLDER_VALUE__Slack channel ID or name__>"},"otherOptions":{"includeLinkToWorkflow":true}},"typeVersion":2.4},{"id":"03b96358-29c8-497a-a1da-8bd9d4723c76","name":"Update 
Baselines","type":"n8n-nodes-base.postgres","position":[1728,112],"parameters":{"table":{"__rl":true,"mode":"name","value":"={{ $('Workflow Configuration').first().json.baselineTableName }}"},"schema":{"__rl":true,"mode":"list","value":"public"},"columns":{"value":{"avg_value":"={{ $json.avg_value }}","data_type":"={{ $json.data_type }}","max_value":"={{ $json.max_value }}","min_value":"={{ $json.min_value }}","null_count":"={{ $json.null_count }}","table_name":"={{ $json.table_name }}","total_rows":"={{ $json.total_rows }}","column_name":"={{ $json.column_name }}","schema_name":"={{ $json.schema_name }}","distinct_count":"={{ $json.distinct_count }}","baseline_timestamp":"={{ $now }}"},"mappingMode":"defineBelow"},"options":{}},"typeVersion":2.6},{"id":"3b790b9c-eb8b-4cee-a353-147f5220f9ec","name":"Sticky Note4","type":"n8n-nodes-base.stickyNote","position":[992,-32],"parameters":{"color":7,"width":256,"height":368,"content":"## Suggestions\n\nFor high-confidence issues, the workflow generates SQL suggestions for investigation or correction."},"typeVersion":1},{"id":"ab37b97d-d2c9-4bd9-a634-a7005e5930aa","name":"Sticky Note6","type":"n8n-nodes-base.stickyNote","position":[-1328,96],"parameters":{"width":448,"height":448,"content":"## Autonomous Data Quality Monitoring and Remediation for Production Databases\n This workflow continuously monitors PostgreSQL database health by running automated data quality checks every six hours. It retrieves schema metadata, table statistics, and historical baselines to detect structural and statistical anomalies.\n\nThe workflow analyzes three major issue types: schema drift (table or column changes), null value explosions, and abnormal data distributions. 
All detected issues are aggregated and evaluated using a confidence scoring system based on severity, frequency, and data impact.\n\nIf an issue exceeds the defined confidence threshold, the workflow automatically generates SQL remediation suggestions, logs the issue in a database audit table, and sends alerts to Slack for team awareness.\n"},"typeVersion":1},{"id":"c55ed3d0-c3dd-49ab-822e-5c3a59bc32e4","name":"Sticky Note8","type":"n8n-nodes-base.stickyNote","position":[1264,-16],"parameters":{"color":7,"width":704,"height":336,"content":"## Logging, Alerts & Baseline Updates\n\nConfirmed issues are stored in a PostgreSQL audit table and alerts are sent to Slack.\nFinally, the workflow updates baseline statistics to improve future anomaly detection accuracy."},"typeVersion":1},{"id":"e15e09e6-e46a-4fbf-aa43-f0b3f181c326","name":"Sticky Note10","type":"n8n-nodes-base.stickyNote","position":[432,-32],"parameters":{"color":7,"width":544,"height":368,"content":"## Issue Scoring & Filtering\n\nEach issue receives a confidence score based on severity, frequency, data volume affected, and consistency.\nOnly issues exceeding the configured confidence threshold proceed to remediation and alerting."},"typeVersion":1},{"id":"77d8b644-03cc-4cd3-878d-7cc869621d74","name":"Sticky Note11","type":"n8n-nodes-base.stickyNote","position":[96,-272],"parameters":{"color":7,"width":304,"height":1040,"content":"## Data Quality Detection Engine\n\nThree parallel checks detect potential problems:\n1. Schema drift (structure changes)\n2. Null explosions (unexpected missing values)\n3. Outlier distributions (statistical anomalies)\n"},"typeVersion":1},{"id":"16a1ff1b-c8e2-49d6-819b-087f0100a606","name":"Sticky Note12","type":"n8n-nodes-base.stickyNote","position":[-272,-272],"parameters":{"color":7,"width":352,"height":1040,"content":"## Database Metadata Collection\n\nFetches schema metadata, table statistics, and historical baseline records from PostgreSQL.\nThese datasets provide the 
reference points required to analyze schema changes, column behavior, and statistical deviations in the database."},"typeVersion":1},{"id":"afbd454d-4ee3-43b4-b105-b90b42e4906c","name":"Sticky Note13","type":"n8n-nodes-base.stickyNote","position":[-752,80],"parameters":{"color":7,"width":464,"height":320,"content":"## Workflow Trigger & Configuration\n\nRuns every 6 hours to monitor database quality."},"typeVersion":1}],"pinData":{},"connections":{"Combine All Issues":{"main":[[{"node":"Calculate Confidence Scores","type":"main","index":0}]]},"Generate SQL Fixes":{"main":[[{"node":"Store Issue in Audit Log","type":"main","index":0}]]},"Send Alert to Team":{"main":[[{"node":"Update Baselines","type":"main","index":0}]]},"Detect Schema Drift":{"main":[[{"node":"Combine All Issues","type":"main","index":0}]]},"Get Schema Metadata":{"main":[[{"node":"Detect Schema Drift","type":"main","index":0}]]},"Get Table Statistics":{"main":[[{"node":"Detect Null Explosions","type":"main","index":0},{"node":"Detect Outlier Distributions","type":"main","index":0}]]},"Detect Null Explosions":{"main":[[{"node":"Combine All Issues","type":"main","index":0}]]},"Workflow Configuration":{"main":[[{"node":"Get Schema Metadata","type":"main","index":0},{"node":"Get Table Statistics","type":"main","index":0},{"node":"Get Historical Baselines","type":"main","index":0}]]},"Schedule DB Quality Scan":{"main":[[{"node":"Workflow Configuration","type":"main","index":0}]]},"Store Issue in Audit Log":{"main":[[{"node":"Send Alert to Team","type":"main","index":0}]]},"Check Confidence Threshold":{"main":[[{"node":"Generate SQL Fixes","type":"main","index":0}]]},"Calculate Confidence Scores":{"main":[[{"node":"Check Confidence Threshold","type":"main","index":0}]]},"Detect Outlier Distributions":{"main":[[{"node":"Combine All 
Issues","type":"main","index":0}]]}}},"lastUpdatedBy":1,"workflowInfo":{"nodeCount":22,"nodeTypes":{"n8n-nodes-base.if":{"count":1},"n8n-nodes-base.set":{"count":1},"n8n-nodes-base.code":{"count":5},"n8n-nodes-base.slack":{"count":1},"n8n-nodes-base.postgres":{"count":5},"n8n-nodes-base.aggregate":{"count":1},"n8n-nodes-base.stickyNote":{"count":7},"n8n-nodes-base.scheduleTrigger":{"count":1}}},"status":"published","readyToDemo":null,"user":{"name":"ResilNext","username":"rnair1996","bio":"","verified":true,"links":[""],"avatar":"https://gravatar.com/avatar/c20bc6c3bcdf260fac3c28c556a8db661ee93670037a3ceb857e047851f6f438?r=pg&d=retro&size=200"},"nodes":[{"id":20,"icon":"fa:map-signs","name":"n8n-nodes-base.if","codex":{"data":{"alias":["Router","Filter","Condition","Logic","Boolean","Branch"],"details":"The IF node can be used to implement binary conditional logic in your workflow. You can set up one-to-many conditions to evaluate each item of data being inputted into the node. That data will either evaluate to TRUE or FALSE and route out of the node accordingly.\n\nThis node has multiple types of conditions: Bool, String, Number, and Date & Time.","resources":{"generic":[{"url":"https://n8n.io/blog/learn-to-automate-your-factorys-incident-reporting-a-step-by-step-guide/","icon":"🏭","label":"Learn to Automate Your Factory's Incident Reporting: A Step by Step Guide"},{"url":"https://n8n.io/blog/2021-the-year-to-automate-the-new-you-with-n8n/","icon":"☀️","label":"2021: The Year to Automate the New You with n8n"},{"url":"https://n8n.io/blog/why-business-process-automation-with-n8n-can-change-your-daily-life/","icon":"🧬","label":"Why business process automation with n8n can change your daily life"},{"url":"https://n8n.io/blog/create-a-toxic-language-detector-for-telegram/","icon":"🤬","label":"Create a toxic language detector for Telegram in 4 step"},{"url":"https://n8n.io/blog/no-code-ecommerce-workflow-automations/","icon":"store","label":"6 e-commerce workflows to 
power up your Shopify s"},{"url":"https://n8n.io/blog/how-to-build-a-low-code-self-hosted-url-shortener/","icon":"🔗","label":"How to build a low-code, self-hosted URL shortener in 3 steps"},{"url":"https://n8n.io/blog/automate-your-data-processing-pipeline-in-9-steps-with-n8n/","icon":"⚙️","label":"Automate your data processing pipeline in 9 steps"},{"url":"https://n8n.io/blog/how-to-get-started-with-crm-automation-and-no-code-workflow-ideas/","icon":"👥","label":"How to get started with CRM automation (with 3 no-code workflow ideas"},{"url":"https://n8n.io/blog/5-tasks-you-can-automate-with-notion-api/","icon":"⚡️","label":"5 tasks you can automate with the new Notion API "},{"url":"https://n8n.io/blog/automate-google-apps-for-productivity/","icon":"💡","label":"15 Google apps you can combine and automate to increase productivity"},{"url":"https://n8n.io/blog/automation-for-maintainers-of-open-source-projects/","icon":"🏷️","label":"How to automatically manage contributions to open-source projects"},{"url":"https://n8n.io/blog/how-uproc-scraped-a-multi-page-website-with-a-low-code-workflow/","icon":" 🕸️","label":"How uProc scraped a multi-page website with a low-code workflow"},{"url":"https://n8n.io/blog/5-workflow-automations-for-mattermost-that-we-love-at-n8n/","icon":"🤖","label":"5 workflow automations for Mattermost that we love at n8n"},{"url":"https://n8n.io/blog/why-this-product-manager-loves-workflow-automation-with-n8n/","icon":"🧠","label":"Why this Product Manager loves workflow automation with n8n"},{"url":"https://n8n.io/blog/sending-automated-congratulations-with-google-sheets-twilio-and-n8n/","icon":"🙌","label":"Sending Automated Congratulations with Google Sheets, Twilio, and n8n "},{"url":"https://n8n.io/blog/how-to-set-up-a-ci-cd-pipeline-with-no-code/","icon":"🎡","label":"How to set up a no-code CI/CD pipeline with GitHub and 
TravisCI"},{"url":"https://n8n.io/blog/benefits-of-automation-and-n8n-an-interview-with-hubspots-hugh-durkin/","icon":"🎖","label":"Benefits of automation and n8n: An interview with HubSpot's Hugh Durkin"},{"url":"https://n8n.io/blog/aws-workflow-automation/","label":"7 no-code workflow automations for Amazon Web Services"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.if/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Flow"]}}},"group":"[\"transform\"]","defaults":{"name":"If","color":"#408000"},"iconData":{"icon":"map-signs","type":"icon"},"displayName":"If","typeVersion":2,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":30,"icon":"file:postgres.svg","name":"n8n-nodes-base.postgres","codex":{"data":{"resources":{"generic":[{"url":"https://n8n.io/blog/love-at-first-sight-ricardos-n8n-journey/","icon":"❤️","label":"Love at first sight: Ricardo’s n8n journey"},{"url":"https://n8n.io/blog/why-i-chose-n8n-over-zapier-in-2020/","icon":"😍","label":"Why I chose n8n over Zapier in 2020"},{"url":"https://n8n.io/blog/database-monitoring-and-alerting-with-n8n/","icon":"📡","label":"Database Monitoring and Alerting with n8n"},{"url":"https://n8n.io/blog/running-n8n-on-ships-an-interview-with-maranics/","icon":"🛳","label":"Running n8n on ships: An interview with Maranics"},{"url":"https://n8n.io/blog/automate-your-data-processing-pipeline-in-9-steps-with-n8n/","icon":"⚙️","label":"Automate your data processing pipeline in 9 steps"},{"url":"https://n8n.io/blog/how-honest-burgers-use-automation-to-save-100k-per-year/","icon":"🍔","label":"How Honest Burgers Use Automation to Save $100k per year"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/app-nodes/n8n-nodes-base.postgres/"}],"credentialDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/credentials/postgres/"}]},"categories":["Development","Data & 
Storage"],"nodeVersion":"1.0","codexVersion":"1.0"}},"group":"[\"input\"]","defaults":{"name":"Postgres"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" fill="#fff" fill-rule="evenodd" stroke="#000" stroke-linecap="round" stroke-linejoin="round" viewBox="0 0 79 81"><use xlink:href="#a" x=".5" y=".5"/><symbol id="a" overflow="visible"><g fill-rule="nonzero" stroke="none"><path fill="#000" d="M77.391 47.922c-.466-1.412-1.688-2.396-3.268-2.632-.745-.111-1.598-.064-2.608.144-1.76.363-3.065.501-4.018.528 3.596-6.072 6.521-12.997 8.204-19.515 2.722-10.54 1.268-15.341-.432-17.513C70.77 3.185 64.206.097 56.287.002c-4.224-.052-7.933.782-9.867 1.382a37 37 0 0 0-5.77-.528c-3.809-.061-7.174.77-10.05 2.476a46 46 0 0 0-7.098-1.782C16.561.411 10.968 1.299 6.876 4.19 1.922 7.689-.375 13.77.05 22.262c.135 2.696 1.643 10.9 4.018 18.68 1.365 4.472 2.82 8.185 4.326 11.038 2.135 4.046 4.419 6.428 6.984 7.284 1.438.479 4.049.814 6.797-1.473a6 6 0 0 0 1.429 1.23c.783.494 1.74.897 2.696 1.136 3.446.862 6.674.646 9.427-.561l.041 1.362.06 1.899c.163 4.064.44 7.223 1.259 9.434.045.122.105.307.169.503.409 1.251 1.092 3.346 2.83 4.987 1.8 1.699 3.978 2.22 5.972 2.22 1 0 1.955-.131 2.792-.311 2.984-.639 6.373-1.614 8.824-5.104 2.318-3.3 3.444-8.27 3.648-16.101l.074-.634.048-.414.546.048.141.01c3.039.138 6.755-.506 9.037-1.566 1.803-.837 7.582-3.888 6.221-8.007"/><path fill="#336791" d="M72.195 48.723c-9.036 1.864-9.657-1.195-9.657-1.195 9.541-14.157 13.529-32.127 10.087-36.525C63.235-.994 46.981 4.68 46.71 4.827l-.087.016c-1.785-.371-3.783-.591-6.029-.628-4.089-.067-7.19 1.072-9.544 2.857 0 0-28.995-11.945-27.647 15.023.287 5.737 8.223 43.41 17.689 32.031 3.46-4.161 6.803-7.679 6.803-7.679 1.66 1.103 3.648 1.666 5.732 1.463l.162-.137a6.3 6.3 0 0 0 .065 1.62c-2.439 2.725-1.722 3.203-6.597 4.206-4.933 1.017-2.035 2.826-.143 3.299 2.294.574 7.6 1.386 11.185-3.633l-.143.573c.956.765 1.626 4.978 
1.514 8.797s-.188 6.441.565 8.489 1.503 6.656 7.912 5.282c5.355-1.148 8.13-4.121 8.516-9.081.274-3.526.894-3.005.933-6.158l.497-1.493c.573-4.78.091-6.322 3.39-5.605l.802.07c2.428.11 5.606-.391 7.471-1.257 4.016-1.864 6.398-4.976 2.438-4.158"/><path d="M32.747 24.66c-.814-.113-1.552-.008-1.925.274a.7.7 0 0 0-.292.47c-.047.336.188.707.333.898.409.542 1.006.915 1.598.997a2 2 0 0 0 .256.018c.986 0 1.883-.768 1.962-1.335.099-.71-.932-1.183-1.931-1.322m26.975.022c-.078-.556-1.068-.715-2.007-.584s-1.848.554-1.772 1.112c.061.434.844 1.174 1.771 1.174q.117 0 .237-.016c.619-.086 1.073-.479 1.288-.705.329-.345.518-.73.484-.98m15.477 23.828c-.345-1.042-1.453-1.377-3.296-.997-5.471 1.129-7.43.347-8.073-.127 4.252-6.478 7.75-14.308 9.637-21.614.894-3.461 1.388-6.675 1.428-9.294.045-2.876-.445-4.988-1.455-6.279-4.072-5.203-10.048-7.994-17.283-8.07-4.973-.056-9.175 1.217-9.99 1.575a25 25 0 0 0-5.622-.722c-3.734-.06-6.961.834-9.633 2.655a43 43 0 0 0-7.828-2.052c-6.342-1.021-11.381-.248-14.978 2.3-4.291 3.04-6.272 8.475-5.888 16.152.129 2.583 1.601 10.529 3.923 18.139 3.057 10.016 6.38 15.686 9.877 16.852a4.4 4.4 0 0 0 1.402.232c1.276 0 2.839-.575 4.466-2.531a161 161 0 0 1 6.156-6.966 9.9 9.9 0 0 0 4.429 1.191l.01.121c-.31.368-.564.69-.781.965-1.07 1.358-1.293 1.641-4.738 2.351-.98.202-3.582.738-3.62 2.563-.041 1.993 3.076 2.83 3.431 2.919 1.238.31 2.43.463 3.568.463 2.766 0 5.2-.909 7.145-2.668-.06 7.106.236 14.107 1.089 16.241.699 1.746 2.406 6.014 7.798 6.014.791 0 1.662-.092 2.62-.297 5.627-1.207 8.071-3.694 9.016-9.177.506-2.93 1.374-9.928 1.782-13.682.862.269 1.971.392 3.17.392 2.501 0 5.387-.531 7.197-1.372 2.033-.944 5.702-3.261 5.037-5.274zM61.8 23.147c-.019 1.108-.171 2.114-.333 3.164-.174 1.129-.354 2.297-.399 3.715-.045 1.379.128 2.814.294 4.2.337 2.801.682 5.685-.655 8.531a11 11 0 0 1-.592-1.218c-.166-.403-.527-1.05-1.027-1.946-1.944-3.487-6.497-11.652-4.167-14.984.694-.992 2.456-2.011 6.879-1.463zM56.439 4.374c6.482.143 11.609 2.568 15.24 7.207 2.784 3.558-.282 
19.749-9.158 33.716l-.269-.339-.112-.14c2.294-3.788 1.845-7.536 1.446-10.859-.164-1.364-.319-2.652-.28-3.861.041-1.283.21-2.382.374-3.446.202-1.311.407-2.667.35-4.265a1.8 1.8 0 0 0 .037-.601c-.144-1.533-1.894-6.12-5.462-10.273-1.951-2.271-4.797-4.813-8.682-6.527a29.3 29.3 0 0 1 6.515-.612zM20.167 53.298c-1.793 2.155-3.031 1.742-3.438 1.607-2.653-.885-5.73-6.491-8.444-15.382-2.348-7.693-3.72-15.428-3.829-17.597-.343-6.86 1.32-11.641 4.943-14.21 5.896-4.181 15.589-1.679 19.484-.409l-.17.163c-6.391 6.455-6.24 17.483-6.224 18.157a22 22 0 0 0 .051 1.135c.11 1.855.315 5.307-.232 9.217-.508 3.633.612 7.189 3.072 9.756q.383.398.795.75a164 164 0 0 0-6.008 6.814zm6.83-9.113c-1.983-2.069-2.884-4.947-2.471-7.896.577-4.13.364-7.727.25-9.659l-.039-.694c.934-.828 5.261-3.146 8.346-2.439 1.408.323 2.266 1.281 2.623 2.931 1.846 8.539.244 12.098-1.043 14.957-.265.589-.516 1.146-.73 1.722l-.166.445c-.42 1.126-.811 2.173-1.053 3.167-2.108-.006-4.159-.907-5.718-2.534zm.324 11.516a5 5 0 0 1-1.494-.642c.271-.128.754-.301 1.591-.474 4.052-.834 4.678-1.423 6.045-3.158.313-.398.669-.849 1.16-1.398.733-.821 1.068-.682 1.676-.43.493.204.972.821 1.167 1.501.092.321.195.93-.143 1.404-2.855 3.997-7.015 3.946-10.003 3.198zm21.207 19.735c-4.957 1.062-6.713-1.467-7.869-4.359-.747-1.867-1.113-10.285-.853-19.582a1.1 1.1 0 0 0-.048-.356 5 5 0 0 0-.139-.657c-.387-1.353-1.331-2.484-2.462-2.953-.45-.186-1.275-.528-2.267-.274.212-.871.578-1.855.976-2.921l.167-.448c.188-.505.423-1.029.673-1.583 1.347-2.992 3.192-7.091 1.19-16.35-.75-3.468-3.254-5.161-7.05-4.768-2.276.235-4.358 1.154-5.396 1.68q-.334.169-.618.329c.29-3.494 1.385-10.024 5.481-14.156 2.579-2.601 6.014-3.886 10.199-3.817 8.246.135 13.534 4.367 16.518 7.893 2.571 3.039 3.964 6.1 4.52 7.751-4.179-.425-7.022.4-8.463 2.46-3.135 4.481 1.715 13.178 4.046 17.358.427.766.796 1.428.912 1.709.759 1.839 1.742 3.067 2.459 3.964.22.275.433.541.596.774-1.266.365-3.539 1.208-3.332 5.422-.167 2.115-1.356 12.016-1.959 15.514-.797 4.621-2.497 6.343-7.279 
7.368zm20.693-23.68c-1.294.601-3.46 1.052-5.518 1.148-2.273.107-3.43-.255-3.702-.477-.128-2.626.85-2.901 1.884-3.191.163-.046.321-.09.474-.144a4 4 0 0 0 .313.23c1.827 1.206 5.085 1.336 9.685.386l.05-.01c-.62.58-1.682 1.359-3.187 2.058z"/></g></symbol></svg>"},"displayName":"Postgres","typeVersion":3,"nodeCategories":[{"id":3,"name":"Data & Storage"},{"id":5,"name":"Development"}]},{"id":38,"icon":"fa:pen","name":"n8n-nodes-base.set","codex":{"data":{"alias":["Set","JS","JSON","Filter","Transform","Map"],"resources":{"generic":[{"url":"https://n8n.io/blog/learn-to-automate-your-factorys-incident-reporting-a-step-by-step-guide/","icon":"🏭","label":"Learn to Automate Your Factory's Incident Reporting: A Step by Step Guide"},{"url":"https://n8n.io/blog/2021-the-year-to-automate-the-new-you-with-n8n/","icon":"☀️","label":"2021: The Year to Automate the New You with n8n"},{"url":"https://n8n.io/blog/automatically-pulling-and-visualizing-data-with-n8n/","icon":"📈","label":"Automatically pulling and visualizing data with n8n"},{"url":"https://n8n.io/blog/database-monitoring-and-alerting-with-n8n/","icon":"📡","label":"Database Monitoring and Alerting with n8n"},{"url":"https://n8n.io/blog/automatically-adding-expense-receipts-to-google-sheets-with-telegram-mindee-twilio-and-n8n/","icon":"🧾","label":"Automatically Adding Expense Receipts to Google Sheets with Telegram, Mindee, Twilio, and n8n"},{"url":"https://n8n.io/blog/no-code-ecommerce-workflow-automations/","icon":"store","label":"6 e-commerce workflows to power up your Shopify s"},{"url":"https://n8n.io/blog/how-to-build-a-low-code-self-hosted-url-shortener/","icon":"🔗","label":"How to build a low-code, self-hosted URL shortener in 3 steps"},{"url":"https://n8n.io/blog/automate-your-data-processing-pipeline-in-9-steps-with-n8n/","icon":"⚙️","label":"Automate your data processing pipeline in 9 steps"},{"url":"https://n8n.io/blog/how-to-get-started-with-crm-automation-and-no-code-workflow-ideas/","icon":"👥","label":"How 
to get started with CRM automation (with 3 no-code workflow ideas"},{"url":"https://n8n.io/blog/5-tasks-you-can-automate-with-notion-api/","icon":"⚡️","label":"5 tasks you can automate with the new Notion API "},{"url":"https://n8n.io/blog/automate-google-apps-for-productivity/","icon":"💡","label":"15 Google apps you can combine and automate to increase productivity"},{"url":"https://n8n.io/blog/how-uproc-scraped-a-multi-page-website-with-a-low-code-workflow/","icon":" 🕸️","label":"How uProc scraped a multi-page website with a low-code workflow"},{"url":"https://n8n.io/blog/building-an-expense-tracking-app-in-10-minutes/","icon":"📱","label":"Building an expense tracking app in 10 minutes"},{"url":"https://n8n.io/blog/the-ultimate-guide-to-automate-your-video-collaboration-with-whereby-mattermost-and-n8n/","icon":"📹","label":"The ultimate guide to automate your video collaboration with Whereby, Mattermost, and n8n"},{"url":"https://n8n.io/blog/5-workflow-automations-for-mattermost-that-we-love-at-n8n/","icon":"🤖","label":"5 workflow automations for Mattermost that we love at n8n"},{"url":"https://n8n.io/blog/learn-to-build-powerful-api-endpoints-using-webhooks/","icon":"🧰","label":"Learn to Build Powerful API Endpoints Using Webhooks"},{"url":"https://n8n.io/blog/how-a-membership-development-manager-automates-his-work-and-investments/","icon":"📈","label":"How a Membership Development Manager automates his work and investments"},{"url":"https://n8n.io/blog/a-low-code-bitcoin-ticker-built-with-questdb-and-n8n-io/","icon":"📈","label":"A low-code bitcoin ticker built with QuestDB and n8n.io"},{"url":"https://n8n.io/blog/how-to-set-up-a-ci-cd-pipeline-with-no-code/","icon":"🎡","label":"How to set up a no-code CI/CD pipeline with GitHub and TravisCI"},{"url":"https://n8n.io/blog/benefits-of-automation-and-n8n-an-interview-with-hubspots-hugh-durkin/","icon":"🎖","label":"Benefits of automation and n8n: An interview with HubSpot's Hugh 
Durkin"},{"url":"https://n8n.io/blog/how-goomer-automated-their-operations-with-over-200-n8n-workflows/","icon":"🛵","label":"How Goomer automated their operations with over 200 n8n workflows"},{"url":"https://n8n.io/blog/aws-workflow-automation/","label":"7 no-code workflow automations for Amazon Web Services"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.set/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Data Transformation"]}}},"group":"[\"input\"]","defaults":{"name":"Edit Fields"},"iconData":{"icon":"pen","type":"icon"},"displayName":"Edit Fields (Set)","typeVersion":3,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":40,"icon":"file:slack.svg","name":"n8n-nodes-base.slack","codex":{"data":{"alias":["human","form","wait","hitl","approval"],"resources":{"generic":[{"url":"https://n8n.io/blog/no-code-ecommerce-workflow-automations/","icon":"store","label":"6 e-commerce workflows to power up your Shopify s"},{"url":"https://n8n.io/blog/automate-your-data-processing-pipeline-in-9-steps-with-n8n/","icon":"⚙️","label":"Automate your data processing pipeline in 9 steps"},{"url":"https://n8n.io/blog/how-to-get-started-with-crm-automation-and-no-code-workflow-ideas/","icon":"👥","label":"How to get started with CRM automation (with 3 no-code workflow ideas"},{"url":"https://n8n.io/blog/5-tasks-you-can-automate-with-notion-api/","icon":"⚡️","label":"5 tasks you can automate with the new Notion API "},{"url":"https://n8n.io/blog/build-your-own-virtual-assistant-with-n8n-a-step-by-step-guide/","icon":"👦","label":"Build your own virtual assistant with n8n: A step by step guide"},{"url":"https://n8n.io/blog/how-to-automatically-give-kudos-to-contributors-with-github-slack-and-n8n/","icon":"👏","label":"How to automatically give kudos to contributors with GitHub, Slack, and n8n"},{"url":"https://n8n.io/blog/automations-for-activists/","icon":"✨","label":"How 
Common Knowledge use workflow automation for activism"}],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/app-nodes/n8n-nodes-base.slack/"}],"credentialDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/credentials/slack/"}]},"categories":["Communication","HITL"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"HITL":["Human in the Loop"]}}},"group":"[\"output\"]","defaults":{"name":"Slack"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHhtbG5zOnhsaW5rPSJodHRwOi8vd3d3LnczLm9yZy8xOTk5L3hsaW5rIiBmaWxsPSIjZmZmIiBmaWxsLXJ1bGU9ImV2ZW5vZGQiIHN0cm9rZT0iIzAwMCIgc3Ryb2tlLWxpbmVjYXA9InJvdW5kIiBzdHJva2UtbGluZWpvaW49InJvdW5kIiB2aWV3Qm94PSIwIDAgMTUwLjg1MiAxNTAuODUyIj48dXNlIHhsaW5rOmhyZWY9IiNhIiB4PSIuOTI2IiB5PSIuOTI2Ii8+PHN5bWJvbCBpZD0iYSIgb3ZlcmZsb3c9InZpc2libGUiPjxnIHN0cm9rZS13aWR0aD0iMS44NTIiPjxwYXRoIGZpbGw9IiNlMDFlNWEiIHN0cm9rZT0iI2UwMWU1YSIgZD0iTTQwLjc0MSA5My41NWMwLTguNzM1IDYuNjA3LTE1Ljc3MiAxNC44MTUtMTUuNzcyczE0LjgxNSA3LjAzNyAxNC44MTUgMTUuNzcydjM4LjgyNGMwIDguNzM3LTYuNjA3IDE1Ljc3NC0xNC44MTUgMTUuNzc0cy0xNC44MTUtNy4wMzctMTQuODE1LTE1Ljc3MnoiLz48cGF0aCBmaWxsPSIjZWNiMjJkIiBzdHJva2U9IiNlY2IyMmQiIGQ9Ik05My41NSAxMDcuNDA4Yy04LjczNSAwLTE1Ljc3Mi02LjYwNy0xNS43NzItMTQuODE1czcuMDM3LTE0LjgxNSAxNS43NzItMTQuODE1aDM4LjgyNmM4LjczNSAwIDE1Ljc3MiA2LjYwNyAxNS43NzIgMTQuODE1cy03LjAzNyAxNC44MTUtMTUuNzcyIDE0LjgxNXoiLz48cGF0aCBmaWxsPSIjMmZiNjdjIiBzdHJva2U9IiMyZmI2N2MiIGQ9Ik03Ny43NzggMTUuNzcyQzc3Ljc3OCA3LjAzNyA4NC4zODUgMCA5Mi41OTMgMHMxNC44MTUgNy4wMzcgMTQuODE1IDE1Ljc3MnYzOC44MjZjMCA4LjczNS02LjYwNyAxNS43NzItMTQuODE1IDE1Ljc3MnMtMTQuODE1LTcuMDM3LTE0LjgxNS0xNS43NzJ6Ii8+PHBhdGggZmlsbD0iIzM2YzVmMSIgc3Ryb2tlPSIjMzZjNWYxIiBkPSJNMTUuNzcyIDcwLjM3MUM3LjAzNyA3MC4zNzEgMCA2My43NjMgMCA1NS41NTZzNy4wMzctMTQuODE1IDE1Ljc3Mi0xNC44MTVoMzguODI2YzguNzM1IDAgMTUuNzcyIDYuNjA3IDE1Ljc3MiAxNC44MTVzLTcuMDM3IDE0LjgxNS0xNS43NzIgMTQuODE1eiIvPjxnIHN0cm9rZS1saW5lam9pbj0ibWl0ZXIiPjxwYXRoIGZpbGw9IiNlY2IyMmQiIHN0cm9rZT0iI2V
jYjIyZCIgZD0iTTc3Ljc3OCAxMzMuMzMzYzAgOC4yMDggNi42MDcgMTQuODE1IDE0LjgxNSAxNC44MTVzMTQuODE1LTYuNjA3IDE0LjgxNS0xNC44MTUtNi42MDctMTQuODE1LTE0LjgxNS0xNC44MTVINzcuNzc4eiIvPjxwYXRoIGZpbGw9IiMyZmI2N2MiIHN0cm9rZT0iIzJmYjY3YyIgZD0iTTEzMy4zMzQgNzAuMzcxaC0xNC44MTVWNTUuNTU2YzAtOC4yMDcgNi42MDctMTQuODE1IDE0LjgxNS0xNC44MTVzMTQuODE1IDYuNjA3IDE0LjgxNSAxNC44MTUtNi42MDcgMTQuODE1LTE0LjgxNSAxNC44MTV6Ii8+PHBhdGggZmlsbD0iI2UwMWU1YSIgc3Ryb2tlPSIjZTAxZTVhIiBkPSJNMTQuODE1IDc3Ljc3OEgyOS42M3YxNC44MTVjMCA4LjIwNy02LjYwNyAxNC44MTUtMTQuODE1IDE0LjgxNVMwIDEwMC44IDAgOTIuNTkzczYuNjA3LTE0LjgxNSAxNC44MTUtMTQuODE1eiIvPjxwYXRoIGZpbGw9IiMzNmM1ZjEiIHN0cm9rZT0iIzM2YzVmMSIgZD0iTTcwLjM3MSAxNC44MTVWMjkuNjNINTUuNTU2Yy04LjIwNyAwLTE0LjgxNS02LjYwNy0xNC44MTUtMTQuODE1UzQ3LjM0OCAwIDU1LjU1NiAwczE0LjgxNSA2LjYwNyAxNC44MTUgMTQuODE1eiIvPjwvZz48L2c+PC9zeW1ib2w+PC9zdmc+"},"displayName":"Slack","typeVersion":2,"nodeCategories":[{"id":6,"name":"Communication"},{"id":28,"name":"HITL"}]},{"id":565,"icon":"fa:sticky-note","name":"n8n-nodes-base.stickyNote","codex":{"data":{"alias":["Comments","Notes","Sticky"],"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Helpers"]}}},"group":"[\"input\"]","defaults":{"name":"Sticky Note","color":"#FFD233"},"iconData":{"icon":"sticky-note","type":"icon"},"displayName":"Sticky Note","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":834,"icon":"file:code.svg","name":"n8n-nodes-base.code","codex":{"data":{"alias":["cpde","Javascript","JS","Python","Script","Custom Code","Function"],"details":"The Code node allows you to execute JavaScript in your workflow.","resources":{"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.code/"}]},"categories":["Development","Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Helpers","Data 
Transformation"]}}},"group":"[\"transform\"]","defaults":{"name":"Code"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB3aWR0aD0iNTEyIiBoZWlnaHQ9IjUxMiIgdmlld0JveD0iMCAwIDUxMiA1MTIiIGZpbGw9Im5vbmUiIHhtbG5zPSJodHRwOi8vd3d3LnczLm9yZy8yMDAwL3N2ZyI+CjxnIGNsaXAtcGF0aD0idXJsKCNjbGlwMF8xMTcxXzQ0MSkiPgo8cGF0aCBkPSJNMTcwLjI4MyA0OEgxOTYuNUMyMDMuMTI3IDQ4IDIwOC41IDQyLjYyNzQgMjA4LjUgMzZWMTJDMjA4LjUgNS4zNzI1OCAyMDMuMTI3IDAgMTk2LjUgMEgxNzAuMjgzQzEyNi4xIDAgOTAuMjgzIDM1LjgxNzIgOTAuMjgzIDgwVjE3NkM5MC4yODMgMjA2LjkyOCA2NS4yMTA5IDIzMiAzNC4yODMgMjMySDIzQzE2LjM3MjYgMjMyIDExIDIzNy4zNzIgMTEgMjQ0VjI2OEMxMSAyNzQuNjI3IDE2LjM3MjQgMjgwIDIyLjk5OTYgMjgwTDM0LjI4MyAyODBDNjUuMjEwOSAyODAgOTAuMjgzIDMwNS4wNzIgOTAuMjgzIDMzNlY0NDBDOTAuMjgzIDQ3OS43NjQgMTIyLjUxOCA1MTIgMTYyLjI4MyA1MTJIMTk2LjVDMjAzLjEyNyA1MTIgMjA4LjUgNTA2LjYyNyAyMDguNSA1MDBWNDc2QzIwOC41IDQ2OS4zNzMgMjAzLjEyNyA0NjQgMTk2LjUgNDY0SDE2Mi4yODNDMTQ5LjAyOCA0NjQgMTM4LjI4MyA0NTMuMjU1IDEzOC4yODMgNDQwVjMzNkMxMzguMjgzIDMwOS4wMjIgMTI4LjAxMSAyODQuNDQzIDExMS4xNjQgMjY1Ljk2MUMxMDYuMTA5IDI2MC40MTYgMTA2LjEwOSAyNTEuNTg0IDExMS4xNjQgMjQ2LjAzOUMxMjguMDExIDIyNy41NTcgMTM4LjI4MyAyMDIuOTc4IDEzOC4yODMgMTc2VjgwQzEzOC4yODMgNjIuMzI2OSAxNTIuNjEgNDggMTcwLjI4MyA0OFoiIGZpbGw9IiNGRjk5MjIiLz4KPHBhdGggZD0iTTMwNSAzNkMzMDUgNDIuNjI3NCAzMTAuMzczIDQ4IDMxNyA0OEgzNDIuOTc5QzM2MC42NTIgNDggMzc0Ljk3OCA2Mi4zMjY5IDM3NC45NzggODBWMTc2QzM3NC45NzggMjAyLjk3OCAzODUuMjUxIDIyNy41NTcgNDAyLjA5OCAyNDYuMDM5QzQwNy4xNTMgMjUxLjU4NCA0MDcuMTUzIDI2MC40MTYgNDAyLjA5OCAyNjUuOTYxQzM4NS4yNTEgMjg0LjQ0MyAzNzQuOTc4IDMwOS4wMjIgMzc0Ljk3OCAzMzZWNDMyQzM3NC45NzggNDQ5LjY3MyAzNjAuNjUyIDQ2NCAzNDIuOTc5IDQ2NEgzMTdDMzEwLjM3MyA0NjQgMzA1IDQ2OS4zNzMgMzA1IDQ3NlY1MDBDMzA1IDUwNi42MjcgMzEwLjM3MyA1MTIgMzE3IDUxMkgzNDIuOTc5QzM4Ny4xNjEgNTEyIDQyMi45NzggNDc2LjE4MyA0MjIuOTc4IDQzMlYzMzZDNDIyLjk3OCAzMDUuMDcyIDQ0OC4wNTEgMjgwIDQ3OC45NzkgMjgwSDQ5MEM0OTYuNjI3IDI4MCA1MDIgMjc0LjYyOCA1MDIgMjY4VjI0NEM1MDIgMjM3LjM3MyA0OTYuNjI4IDIzMiA0OTAgMjMyTDQ3OC45NzkgMjMyQzQ0OC4wNTEgMjMyIDQyMi45NzggMjA2LjkyOCA0MjIuOTc4IDE3NlY4MEM0MjIuOTc4IDM1L
jgxNzIgMzg3LjE2MSAwIDM0Mi45NzkgMEgzMTdDMzEwLjM3MyAwIDMwNSA1LjM3MjU4IDMwNSAxMlYzNloiIGZpbGw9IiNGRjk5MjIiLz4KPC9nPgo8ZGVmcz4KPGNsaXBQYXRoIGlkPSJjbGlwMF8xMTcxXzQ0MSI+CjxyZWN0IHdpZHRoPSI1MTIiIGhlaWdodD0iNTEyIiBmaWxsPSJ3aGl0ZSIvPgo8L2NsaXBQYXRoPgo8L2RlZnM+Cjwvc3ZnPgo="},"displayName":"Code","typeVersion":2,"nodeCategories":[{"id":5,"name":"Development"},{"id":9,"name":"Core Nodes"}]},{"id":839,"icon":"fa:clock","name":"n8n-nodes-base.scheduleTrigger","codex":{"data":{"alias":["Time","Scheduler","Polling","Cron","Interval"],"resources":{"generic":[],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.scheduletrigger/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0"}},"group":"[\"trigger\",\"schedule\"]","defaults":{"name":"Schedule Trigger","color":"#31C49F"},"iconData":{"icon":"clock","type":"icon"},"displayName":"Schedule Trigger","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]},{"id":1236,"icon":"file:aggregate.svg","name":"n8n-nodes-base.aggregate","codex":{"data":{"alias":["Aggregate","Combine","Flatten","Transform","Array","List","Item"],"details":"","resources":{"generic":[],"primaryDocumentation":[{"url":"https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-base.aggregate/"}]},"categories":["Core Nodes"],"nodeVersion":"1.0","codexVersion":"1.0","subcategories":{"Core Nodes":["Data 
Transformation"]}}},"group":"[\"transform\"]","defaults":{"name":"Aggregate"},"iconData":{"type":"file","fileBuffer":"data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHdpZHRoPSI1MTIiIGhlaWdodD0iNTEyIiBmaWxsPSJub25lIj48ZyBmaWxsPSIjRkY2RDVBIiBjbGlwLXBhdGg9InVybCgjYSkiPjxwYXRoIGZpbGwtcnVsZT0iZXZlbm9kZCIgZD0iTTMyIDE0OGMwLTYuNjI3IDUuMzczLTEyIDEyLTEyaDE0NmM2LjYyNyAwIDEyIDUuMzczIDEyIDEydjI0YzAgNi42MjctNS4zNzMgMTItMTIgMTJINDRjLTYuNjI3IDAtMTItNS4zNzMtMTItMTJ6bTAgOTZjMC02LjYyNyA1LjM3My0xMiAxMi0xMmgxNDZjNi42MjcgMCAxMiA1LjM3MyAxMiAxMnYyNGMwIDYuNjI3LTUuMzczIDEyLTEyIDEySDQ0Yy02LjYyNyAwLTEyLTUuMzczLTEyLTEyem0wIDk2YzAtNi42MjcgNS4zNzMtMTIgMTItMTJoMTQ2YzYuNjI3IDAgMTIgNS4zNzMgMTIgMTJ2MjRjMCA2LjYyNy01LjM3MyAxMi0xMiAxMkg0NGMtNi42MjcgMC0xMi01LjM3My0xMi0xMnoiIGNsaXAtcnVsZT0iZXZlbm9kZCIvPjxwYXRoIGQ9Ik03NCA3NmMwIDYuNjI3IDUuMzczIDEyIDEyIDEyaDExNi4yMTdjMTcuNjczIDAgMzIgMTQuMzI3IDMyIDMydjU2YzAgMjYuOTc4IDEwLjI3MiA1MS41NTcgMjcuMTE5IDcwLjAzOSA1LjA1NSA1LjU0NSA1LjA1NSAxNC4zNzcgMCAxOS45MjItMTYuODQ3IDE4LjQ4Mi0yNy4xMTkgNDMuMDYxLTI3LjExOSA3MC4wMzl2NTZjMCAxNy42NzMtMTQuMzI3IDMyLTMyIDMySDg2Yy02LjYyNyAwLTEyIDUuMzczLTEyIDEydjI0YzAgNi42MjcgNS4zNzMgMTIgMTIgMTJoMTE2LjIxN2M0NC4xODMgMCA4MC0zNS44MTcgODAtODB2LTU2YzAtMzAuOTI4IDI1LjA3Mi01NiA1Ni01NmE1Ljc4MyA1Ljc4MyAwIDAgMCA1Ljc4My01Ljc4M3YtMzYuNDM0YTUuNzgzIDUuNzgzIDAgMCAwLTUuNzgzLTUuNzgzYy0zMC45MjggMC01Ni0yNS4wNzItNTYtNTZ2LTU2YzAtNDQuMTgzLTM1LjgxNy04MC04MC04MEg4NmMtNi42MjcgMC0xMiA1LjM3My0xMiAxMnoiLz48cGF0aCBmaWxsLXJ1bGU9ImV2ZW5vZGQiIGQ9Ik0zNzYgMjQ0YzAtNi42MjcgNS4zNzMtMTIgMTItMTJoMTEyYzYuNjI3IDAgMTIgNS4zNzMgMTIgMTJ2MjRjMCA2LjYyNy01LjM3MyAxMi0xMiAxMkgzODhjLTYuNjI3IDAtMTItNS4zNzMtMTItMTJ6IiBjbGlwLXJ1bGU9ImV2ZW5vZGQiLz48L2c+PGRlZnM+PGNsaXBQYXRoIGlkPSJhIj48cGF0aCBmaWxsPSIjZmZmIiBkPSJNMCAwaDUxMnY1MTJIMHoiLz48L2NsaXBQYXRoPjwvZGVmcz48L3N2Zz4="},"displayName":"Aggregate","typeVersion":1,"nodeCategories":[{"id":9,"name":"Core Nodes"}]}],"categories":[{"id":5,"name":"Engineering"},{"id":49,"name":"AI Summarization"}],"image":[]}}