# Manual CSV import from Azure Blob to Snowflake with field mapping
This automation template removes the manual work of importing CSV files from cloud storage into Snowflake. It streamlines loading test or demo data by eliminating manual copying, formatting, and pasting, and is aimed at data professionals who want to cut operational overhead on repetitive ETL tasks without writing code.
## Who it's for
- Analysts regularly loading test data into Snowflake
- Data engineers setting up simple no-code ETL processes
- Administrators needing one-time CSV imports into data warehouses
## What the automation does
- Triggered manually via n8n
- Downloads a CSV file from a public URL in Azure Blob Storage
- Parses the file using the Spreadsheet File node
- Maps columns to structured fields (id, first_name, last_name)
- Inserts data into the `users` table in Snowflake
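The parse-and-map step above can be sketched in plain Python. This is an illustration of the logic the Spreadsheet File node performs, not the node itself; the sample CSV content and the `map_rows` helper are hypothetical and assume the file carries `id`, `first_name`, and `last_name` columns:

```python
import csv
import io

# Hypothetical sample matching the CSV layout the workflow expects.
sample_csv = "id,first_name,last_name\n1,Ada,Lovelace\n2,Alan,Turing\n"

def map_rows(csv_text):
    """Parse CSV text and map each row to the three structured fields
    the workflow inserts into Snowflake."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        {
            "id": int(row["id"]),
            "first_name": row["first_name"],
            "last_name": row["last_name"],
        }
        for row in reader
    ]

rows = map_rows(sample_csv)
# Each mapped row would then feed a parameterized statement such as:
#   INSERT INTO users (id, first_name, last_name) VALUES (%s, %s, %s)
```

In the actual template, the downloaded file flows through the Spreadsheet File node and a Set/mapping step instead, but the field mapping it applies is equivalent to the dictionary built here.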
## What's included
- Ready-to-use n8n workflow
- Predefined trigger and processing logic
- Integrations with Azure Blob Storage (via HTTP) and Snowflake
- Basic text instructions for setup and adaptation
## Requirements for setup
- n8n instance (self-hosted or cloud)
- Access to Azure Blob Storage with a public URL to the CSV file
- Snowflake account with write permissions to the `users` table
- Snowflake credentials (username, password, account, warehouse, database, schema)
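The credential fields listed above are the standard Snowflake connection parameters the n8n Snowflake node asks for. A minimal sketch with placeholder values (all names and values below are examples, not real accounts):

```shell
# Placeholder Snowflake connection parameters for the n8n Snowflake credential.
export SNOWFLAKE_ACCOUNT="xy12345.eu-west-1"   # account identifier
export SNOWFLAKE_USER="etl_user"               # user with write access to `users`
export SNOWFLAKE_PASSWORD="********"
export SNOWFLAKE_WAREHOUSE="COMPUTE_WH"
export SNOWFLAKE_DATABASE="DEMO_DB"
export SNOWFLAKE_SCHEMA="PUBLIC"
```

In practice you would enter these values directly into the n8n credentials dialog rather than export them as environment variables.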
## Benefits and outcomes
- Eliminates manual CSV handling
- Speeds up loading of test and demo datasets
- Ensures consistent field mapping before insertion
- Reduces import errors through automated parsing
- Enables fast replication of ETL scenarios without coding
## Important: template only
Important: you are purchasing a ready-made automation workflow template only. Rollout into your infrastructure, connection of specific accounts and services, 1:1 setup help, custom adjustments for non-standard stacks, and any consulting support are provided as a separate paid service at an individual rate. To discuss custom work or 1:1 help, contact us via chat.