OPEN SOURCE · CONSULTATION AVAILABLE

Your entire stack. One queryable database.

Pull brand data from Shopify, Klaviyo, Meta, GA4, Gorgias, Gmail, and more — into a single local SQLite warehouse. 25+ tables. 28 CRO views. Identity resolution. One command.

~/project — npm run build
$ npm run build

 Pass 1: 25 tables created
 Pass 2: Data loaded (8 channels)
 Pass 3: 28 CRO views built
 Pass 4: Identity graph built (1,847 profiles matched)

$ npm run summary
warehouse.db — 4.2 MB
Tables: 25 · Views: 28 · Identity: 1,847

8 channels. One warehouse.

Every data source your brand generates — pulled, structured, and joined into a single queryable file.

Shopify

Orders · Customers · Products · Line Items

Klaviyo

Profiles · Campaigns · Flows · Messages

Meta Ads

Campaigns · Ad Sets · Ads · Creative

GA4

Traffic · E-Commerce · Events

Facebook Organic

Posts · Comments · Messages

Instagram Organic

Posts · Comments · DMs

Gorgias

Tickets · Customers

Gmail

Inbox · Product Reviews

What you get.

Not just raw data. An analytics engine with CRO views, identity resolution, and multi-touch attribution — built for AI agents.

28 CRO Views

Pre-built SQL views across 8 pillars: Revenue, Customer LTV, Product Performance, Retention, Email, Ads, Support, and Attribution.
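As a sketch of what one of these views could look like, here is a hypothetical daily-revenue view for the Revenue pillar. The table and column names (`shopify_orders`, `created_at`, `total_price`, `financial_status`) are illustrative assumptions, not the repo's actual schema.

```sql
-- Hypothetical Revenue-pillar view; names are assumptions, not the shipped schema.
CREATE VIEW IF NOT EXISTS v_revenue_daily AS
SELECT
  date(created_at)           AS order_date,
  COUNT(*)                   AS orders,
  ROUND(SUM(total_price), 2) AS revenue,
  ROUND(AVG(total_price), 2) AS aov
FROM shopify_orders
WHERE financial_status = 'paid'
GROUP BY date(created_at);
```

Because views are computed on read, they stay current with every incremental pull without any extra build step.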

Identity Resolution

Automatic email-based identity matching across Shopify, Klaviyo, and Gorgias. One customer, one profile, cross-channel.
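A minimal sketch of how email-based matching like this can work, in plain Node.js. The record shape and channel names are assumptions for illustration, not the repo's implementation.

```javascript
// Sketch: merge records from multiple channels into one profile per
// normalized email address. Input shape is an assumption.
function buildIdentityGraph(sources) {
  const profiles = new Map(); // normalized email -> merged profile
  for (const [channel, records] of Object.entries(sources)) {
    for (const record of records) {
      if (!record.email) continue;
      const key = record.email.trim().toLowerCase(); // normalize before matching
      const profile = profiles.get(key) ?? { email: key, channels: [] };
      if (!profile.channels.includes(channel)) profile.channels.push(channel);
      profiles.set(key, profile);
    }
  }
  return profiles;
}

const graph = buildIdentityGraph({
  shopify: [{ email: 'Ana@Example.com' }],
  klaviyo: [{ email: 'ana@example.com' }],
  gorgias: [{ email: 'ben@example.com' }],
});
// 'Ana@Example.com' and 'ana@example.com' resolve to one cross-channel profile
```

Normalizing to lowercase before comparing is what lets the same person's Shopify and Klaviyo records collapse into a single profile.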

Incremental Pull

Cursor-based API syncs. Only new and updated records get pulled. Full re-pull with --full when you need it.
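The cursor logic can be sketched like this; the `updated_at` field and timestamp format are assumptions, not the repo's actual cursor storage.

```javascript
// Sketch: filter a page of API records against a saved cursor, then
// advance the cursor to the newest record seen. `--full` ignores the cursor.
function incrementalPull(records, cursor, full = false) {
  const since = full ? null : cursor;
  const fresh = records.filter(r => !since || r.updated_at > since);
  const next = fresh.reduce(
    (max, r) => (r.updated_at > max ? r.updated_at : max),
    cursor ?? ''
  );
  return { fresh, next };
}

const page = [
  { id: 1, updated_at: '2024-05-01T10:00:00Z' },
  { id: 2, updated_at: '2024-05-02T09:00:00Z' },
];
const { fresh, next } = incrementalPull(page, '2024-05-01T12:00:00Z');
// fresh contains only id 2; next advances to its updated_at
```

ISO-8601 timestamps sort lexicographically, so plain string comparison is enough to decide what is new.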

Multi-Touch Attribution

First-touch, last-touch, and linear attribution models. See which channels actually drive conversions.
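The three models reduce to simple credit assignment over an ordered list of touchpoints. A sketch, with the input shape as an assumption:

```javascript
// Sketch: assign conversion credit across a customer's touchpoints.
// First/last give 100% to one touch; linear splits credit evenly.
function attribute(touches, model) {
  const credit = {};
  const add = (ch, amt) => { credit[ch] = (credit[ch] ?? 0) + amt; };
  if (model === 'first') add(touches[0], 1);
  else if (model === 'last') add(touches[touches.length - 1], 1);
  else if (model === 'linear') touches.forEach(ch => add(ch, 1 / touches.length));
  return credit;
}

const journey = ['meta_ads', 'email', 'organic', 'email'];
attribute(journey, 'first');  // all credit to meta_ads
attribute(journey, 'linear'); // 0.25 per touch; email earns 0.5 total
```

Comparing the same journeys under all three models is what surfaces channels that open journeys versus channels that close them.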

Production-Grade Resilience

Exponential backoff, Retry-After headers, run manifests, and graceful recovery. Designed for autonomous AI operation.
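The retry-delay policy can be sketched as follows; the base delay, cap, and jitter strategy here are illustrative values, not the repo's exact configuration.

```javascript
// Sketch: exponential backoff with half-jitter, overridden by a
// Retry-After header (in seconds) when the API provides one.
function retryDelayMs(attempt, retryAfterHeader) {
  if (retryAfterHeader) {
    return Number(retryAfterHeader) * 1000; // trust the server's timing
  }
  const base = 500;    // first retry waits ~0.5s
  const cap = 30_000;  // never wait longer than 30s
  const exp = Math.min(cap, base * 2 ** attempt);
  return exp / 2 + Math.random() * (exp / 2); // half-jitter spreads retries out
}
```

Honoring Retry-After first matters for rate-limited APIs like Shopify and Klaviyo, where the server tells you exactly when capacity returns; jitter prevents many parallel pulls from retrying in lockstep.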

BIOS-Ready

Run npm run bios-check to see which BIOS intelligence specs your warehouse can now generate. The data feeds the brain.

Clone. Pull. Build. Query.

Four commands from zero to a fully queryable local warehouse — with your live brand data.

The setup script validates your Node.js version, installs dependencies, and creates the environment directories. After the pull, build-db.js runs four passes: Tables → Data → Views → Identity Graph.
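The four-pass order above can be sketched as a simple pipeline; the pass names mirror the build output, and the function bodies are placeholders, not the repo's code.

```javascript
// Sketch of the build order: each pass depends on the one before it,
// so they run sequentially against the same database handle.
const passes = [
  ['Tables',         db => {/* CREATE TABLE statements */}],
  ['Data',           db => {/* load pulled channel data */}],
  ['Views',          db => {/* CREATE VIEW statements */}],
  ['Identity Graph', db => {/* email-based profile matching */}],
];

function buildWarehouse(db) {
  passes.forEach(([name, run], i) => {
    run(db);
    console.log(`Pass ${i + 1}: ${name} complete`);
  });
}
```

Views must come after data so they validate against real rows, and the identity graph runs last because it joins across tables all earlier passes create.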

Terminal
# Clone and bootstrap
$ git clone https://github.com/ecomxco/setup-data-warehouse.git
$ cd setup-data-warehouse && ./setup.sh

# Pull live data + build warehouse
$ npm run pull && npm run build

# Verify the result
$ npm run summary

Clone it free. Or hire us to build it.

The repo is open source — clone it and follow the README. If you'd rather have Jim scope your warehouse, connect your APIs, and validate the first build, book a consultation.

OPEN SOURCE

DIY — Free

Clone the repo, follow the README, and run /setup-data-warehouse in your AI assistant. Everything you need is documented — credentials, pull commands, build steps, and validation.

Clone from GitHub
CONSULTATION

Consultation — from $2,500

Jim scopes your warehouse, connects your APIs, runs the initial pull, and validates the data. The $2,500 covers discovery, configuration, and a 60-minute walkthrough. Full warehouse build is quoted on data volume and channel count.

Book a Call →

Build cost varies by data volume and complexity.

Already purchased? Log in to your dashboard →

Built for operators.

Node.js ≥ 18 · SQLite (better-sqlite3) · 8 API channels · Incremental cursors · Run manifests · MIT License