# Manual LLM query processing with agent and Wikipedia in n8n

This template enables testing of LLM chains and AI agents with external tools in n8n. The workflow starts manually and demonstrates two approaches: direct model invocation and agent-based lookup using Wikipedia via LangChain.

## Who it's for
- Developers testing custom LLM chains in n8n
- Engineers building agents with external knowledge access
- Automation specialists exploring LangChain integration in n8n

## What the automation does
- Triggered manually by user input
- Processes two query types: joke (direct LLM call) and factual question (via agent)
- Agent autonomously decides to use Wikipedia tool for information retrieval
- Both paths use OpenAI's gpt-4o-mini through LangChain
- Enables quick testing of agent behavior vs. pure LLM responses (see the sketch after this list)
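
For orientation, this is roughly what the two paths look like at the LangChain JS level, outside n8n. It is a minimal sketch, not the template's actual node contents: the model settings, prompt text, and example queries are illustrative assumptions.

```typescript
import { ChatOpenAI } from "@langchain/openai";
import { WikipediaQueryRun } from "@langchain/community/tools/wikipedia_query_run";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { createToolCallingAgent, AgentExecutor } from "langchain/agents";

const llm = new ChatOpenAI({ model: "gpt-4o-mini", temperature: 0 });

// Path 1: direct model invocation, e.g. for the joke query.
const joke = await llm.invoke("Tell me a short joke about automation.");

// Path 2: agent with a Wikipedia tool for the factual question.
// The agent decides on its own whether to call the tool.
const wikipedia = new WikipediaQueryRun({ topKResults: 1, maxDocContentLength: 2000 });
const prompt = ChatPromptTemplate.fromMessages([
  ["system", "You are a helpful assistant. Use tools when you need facts."],
  ["human", "{input}"],
  // Requires a recent @langchain/core; older versions use new MessagesPlaceholder("agent_scratchpad").
  ["placeholder", "{agent_scratchpad}"],
]);
const agent = await createToolCallingAgent({ llm, tools: [wikipedia], prompt });
const executor = new AgentExecutor({ agent, tools: [wikipedia] });
const answer = await executor.invoke({ input: "Who founded Wikipedia?" });

console.log(joke.content);
console.log(answer.output);
```

In the workflow itself these paths are separate n8n nodes; the sketch only mirrors the logic so you can compare agent output against a plain model call.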

## What's included
- Ready-to-use n8n workflow with LangChain and custom code nodes
- Logic for routing queries between the direct chain and the agent path (see the Code node sketch after this list)
- Integrations with OpenAI API and Wikipedia
- Basic textual instructions for setup and adaptation
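
The routing can live in a small Code node. The template's actual node contents are not reproduced here; the following is a hypothetical sketch (written for an n8n Code node in "Run Once for All Items" mode) that tags each incoming item with a route field which a downstream IF or Switch node could branch on.

```typescript
// n8n Code node: classify each query and tag it with a route.
// The heuristic below is an illustrative assumption, not the template's exact logic.
const items = $input.all();

return items.map((item) => {
  const query = String(item.json.query ?? "");
  // Factual-looking questions go to the agent path, everything else to the direct LLM call.
  const route = /\?|^(who|what|when|where|which|how)\b/i.test(query) ? "agent" : "direct";
  return { json: { ...item.json, query, route } };
});
```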

## Requirements for setup
- n8n instance (self-hosted or cloud)
- OpenAI API key
- Access to the public Wikipedia API (no authentication required; see the check after this list)
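
If you want to confirm up front that Wikipedia is reachable from your n8n host without credentials, a quick standalone check against the standard MediaWiki search endpoint looks like this (the search term is arbitrary):

```typescript
// Minimal reachability check for the public Wikipedia (MediaWiki) API.
// No API key or authentication header is needed.
const url =
  "https://en.wikipedia.org/w/api.php" +
  "?action=query&list=search&srsearch=LangChain&format=json&origin=*";

const res = await fetch(url);
if (!res.ok) throw new Error(`Wikipedia API returned HTTP ${res.status}`);

const data = await res.json();
console.log(data.query.search[0]?.title); // top matching article title
```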

## Benefits and outcomes
- Accelerates LLM logic testing without complex infrastructure
- Clearly shows difference between direct calls and agent-based workflows
- Allows adding new tools following the Wikipedia integration pattern (sketched after this list)
- Serves as a reference for building advanced agents in n8n
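
In n8n, extending the agent means attaching another tool node alongside the Wikipedia tool. At the LangChain level, the same pattern is a named tool with a description the agent can reason about plus a function to execute. A hypothetical example (the tool name and logic are invented purely for illustration):

```typescript
import { DynamicTool } from "@langchain/core/tools";

// A custom tool following the same pattern as the Wikipedia tool:
// a name, a description the agent uses to decide when to call it, and a function.
const wordCountTool = new DynamicTool({
  name: "word_count",
  description: "Counts the number of words in the given text. Input: a plain string.",
  func: async (input: string) => String(input.trim().split(/\s+/).length),
});

// Pass it alongside the existing Wikipedia tool when building the agent, e.g.:
// createToolCallingAgent({ llm, tools: [wikipedia, wordCountTool], prompt })
```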

## Important: template only
You are purchasing a ready-made automation workflow template only. Rollout into your infrastructure, connecting specific accounts and services, 1:1 setup help, custom adjustments for non-standard stacks, and any consulting support are provided as a separate paid service at an individual rate. To discuss custom work or 1:1 help, contact us via chat.