n8n-observability

Important

This package only works with self-hosted n8n installations. It is not compatible with n8n Cloud.

OpenTelemetry instrumentation for n8n workflows. Automatically traces workflow executions and node operations using the standard OpenTelemetry SDK.

Observability Setup

Features

  • 🔍 Automatic tracing of workflow executions and individual node operations
  • 📊 Standard OpenTelemetry instrumentation using the official Node.js SDK
  • 🎯 Zero-code setup via n8n's hook system
  • 🔌 OTLP compatible - works with any OpenTelemetry-compatible backend
  • ⚙️ Configurable I/O capture, node filtering, and more
  • 🚀 Works with Docker or bare metal
  • 💻 Node.js ≥ 18

Quick Start (Docker)

The fastest way to get started is with Docker Compose:

# Clone and navigate to the example
git clone https://github.com/comet-ml/n8n-observability.git
cd n8n-observability/examples/docker-compose

# Set your Opik API key (get one free at https://www.comet.com/signup)
export OPIK_API_KEY=your_api_key_here

# Build and run
docker-compose up --build

Open http://localhost:5678, create a workflow, and see traces in your Comet ML dashboard!

📖 See examples/docker-compose/ for full documentation.


Setup Options

Docker (Recommended)

Create a custom Dockerfile that installs the package globally:

FROM n8nio/n8n:latest

USER root
RUN npm install -g n8n-observability

ENV EXTERNAL_HOOK_FILES=/usr/local/lib/node_modules/n8n-observability/dist/hooks.cjs

USER node

Then run with your OTLP configuration:

# docker-compose.yml
services:
  n8n:
    build: .
    environment:
      # Comet ML / Opik
      OTEL_EXPORTER_OTLP_ENDPOINT: "https://www.comet.com/opik/api/v1/private/otel"
      OTEL_EXPORTER_OTLP_HEADERS: "Authorization=${OPIK_API_KEY},Comet-Workspace=default"
      N8N_OTEL_SERVICE_NAME: "my-n8n"
    volumes:
      - n8n_data:/home/node/.n8n
    ports:
      - "5678:5678"

volumes:
  n8n_data:

Bare Metal / npm

# Install globally
npm install -g n8n-observability

# Configure OTLP endpoint (Comet ML example)
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'
export N8N_OTEL_SERVICE_NAME=my-n8n
export EXTERNAL_HOOK_FILES=$(npm root -g)/n8n-observability/dist/hooks.cjs

# Start n8n
n8n start

Programmatic

import { setupN8nObservability } from 'n8n-observability';

await setupN8nObservability({
  serviceName: 'my-n8n',
  debug: true,
});

// Then start n8n as usual
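
The exporter itself is configured through the standard OTEL_* environment variables, so a programmatic setup can set them before initialization. Below is a minimal sketch of such an entrypoint, assuming you control the process that launches n8n; the variable names are the documented ones and the values are placeholders:

// instrumentation.ts - sketch only; adjust values to your deployment
import { setupN8nObservability } from 'n8n-observability';

// Fall back to Comet ML / Opik settings if nothing is configured yet
process.env.OTEL_EXPORTER_OTLP_ENDPOINT ??=
  'https://www.comet.com/opik/api/v1/private/otel';
process.env.OTEL_EXPORTER_OTLP_HEADERS ??=
  `Authorization=${process.env.OPIK_API_KEY},Comet-Workspace=default`;

await setupN8nObservability({
  serviceName: process.env.N8N_OTEL_SERVICE_NAME ?? 'my-n8n',
  debug: process.env.N8N_OTEL_DEBUG === 'true',
});

// Start n8n afterwards so the SDK is active before any workflow runs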

Configuration

| Variable | Purpose | Default |
| --- | --- | --- |
| OTEL_EXPORTER_OTLP_ENDPOINT | OTLP exporter endpoint | (none) |
| OTEL_EXPORTER_OTLP_HEADERS | OTLP headers (e.g., auth tokens) | (none) |
| N8N_OTEL_SERVICE_NAME | Service name for telemetry | n8n |
| N8N_OTEL_NODE_INCLUDE | Only trace listed nodes (comma-separated) | (none) |
| N8N_OTEL_NODE_EXCLUDE | Exclude listed nodes (comma-separated) | (none) |
| N8N_OTEL_CAPTURE_INPUT | Capture node input data | true |
| N8N_OTEL_CAPTURE_OUTPUT | Capture node output data | true |
| N8N_OTEL_AUTO_INSTRUMENT | Enable HTTP/Express instrumentation | false |
| N8N_OTEL_METRICS | Enable metrics collection | false |
| N8N_OTEL_DEBUG | Enable debug logging | false |
| EXTERNAL_HOOK_FILES | Path to hooks.cjs (set automatically) | (none) |
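
If you want to fail fast on misconfiguration, a small preflight script can check these variables before n8n starts. This is a hypothetical helper, not part of this package; it only reflects the semantics documented in the table above (comma-separated lists, booleans with the defaults shown):

// check-otel-env.ts - hypothetical preflight check, not shipped with n8n-observability
const missing = ['OTEL_EXPORTER_OTLP_ENDPOINT'].filter((name) => !process.env[name]);
if (missing.length > 0) {
  console.error(`Missing OTLP configuration: ${missing.join(', ')}`);
  process.exit(1);
}

// Comma-separated node filters and boolean flags, interpreted as documented above
const includedNodes = (process.env.N8N_OTEL_NODE_INCLUDE ?? '')
  .split(',')
  .map((name) => name.trim())
  .filter(Boolean);
const captureInput = process.env.N8N_OTEL_CAPTURE_INPUT !== 'false'; // default: true
console.log({ includedNodes, captureInput });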

Node Filtering

# Only trace specific nodes
export N8N_OTEL_NODE_INCLUDE="OpenAI,HTTP Request"

# Exclude noisy nodes
export N8N_OTEL_NODE_EXCLUDE="Wait,Set"

# Disable I/O capture for privacy
export N8N_OTEL_CAPTURE_INPUT=false
export N8N_OTEL_CAPTURE_OUTPUT=false

OTLP Backends

Works with any OpenTelemetry-compatible backend:

Comet ML / Opik (Recommended)

# Cloud
export OTEL_EXPORTER_OTLP_ENDPOINT=https://www.comet.com/opik/api/v1/private/otel
export OTEL_EXPORTER_OTLP_HEADERS='Authorization=<your-api-key>,Comet-Workspace=default'

# Self-hosted Opik
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:5173/api/v1/private/otel

Other Providers

# Jaeger
export OTEL_EXPORTER_OTLP_ENDPOINT=http://jaeger:4318

# Grafana Tempo
export OTEL_EXPORTER_OTLP_ENDPOINT=http://tempo:4318

# Honeycomb
export OTEL_EXPORTER_OTLP_ENDPOINT=https://api.honeycomb.io
export OTEL_EXPORTER_OTLP_HEADERS='x-honeycomb-team=<api-key>'

# Generic OTLP collector
export OTEL_EXPORTER_OTLP_ENDPOINT=http://otel-collector:4318

Span Attributes

Workflow Spans

  • n8n.workflow.id - Workflow ID
  • n8n.workflow.name - Workflow name
  • n8n.span.type - "workflow"

Node Spans

  • n8n.node.type - Node type (e.g., n8n-nodes-base.httpRequest)
  • n8n.node.name - Node name
  • n8n.span.type - "llm", "prompt", "evaluation", or undefined
  • n8n.node.input - JSON input (if capture enabled)
  • n8n.node.output - JSON output (if capture enabled)
  • gen_ai.system - AI provider (e.g., openai, anthropic)
  • gen_ai.request.model - Model name (e.g., gpt-4)
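
Put together, the attributes on a single node span look roughly like the following. This is an illustrative example only: the keys are the ones listed above, while the values (including the node type identifier) are made up.

// Illustrative node-span attributes for an LLM node; values are examples, not real output
const exampleLlmNodeSpanAttributes: Record<string, string> = {
  'n8n.node.type': '@n8n/n8n-nodes-langchain.openAi', // node type identifier is illustrative
  'n8n.node.name': 'OpenAI Chat Model',
  'n8n.span.type': 'llm',
  'n8n.node.input': '{"prompt": "Summarize this ticket"}',  // only when input capture is enabled
  'n8n.node.output': '{"text": "..."}',                     // only when output capture is enabled
  'gen_ai.system': 'openai',
  'gen_ai.request.model': 'gpt-4',
};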

Verify Installation

Check that the package is installed:

node -e "console.log(require.resolve('n8n-observability/hooks'))"

Expected startup logs:

[otel-setup] OpenTelemetry initialized: my-n8n (OTLP export enabled, langchain (manual), n8n spans only)
[n8n-observability] observability ready and patches applied

Examples

| Example | Description |
| --- | --- |
| examples/docker-compose/ | Production-ready Docker setup with Comet ML |

Development

# Install dependencies
pnpm install

# Build
pnpm build

# Run e2e tests
pnpm e2e

License

MIT - See LICENSE for details.


Acknowledgments

Note

This project is a fork of LangWatch's n8n-observability. We're grateful for their excellent work in creating the original implementation. We've extended their code to work with all OpenTelemetry providers, making it a universal solution for n8n observability.
