# Why We Built Our Own Analytics Dashboard

When CCFish first launched on the App Store, we relied on Apple's App Store Connect analytics and a third-party SDK for in-game events. It worked, but we kept hitting walls: we couldn't join player behavior data with revenue data, event metadata was truncated at the SDK level, and real-time dashboards required expensive third-party subscriptions.

We needed a lightweight, cost-effective analytics system that could:

- Track custom in-game events (level starts, item purchases, ad views)

- Correlate player sessions with revenue in real time

- Power a public-facing dashboard for the marketing team

- Cost close to $0 to run

## Architecture: Cloudflare Workers + D1 + KV

Here's the stack we settled on:

```
CCFish Client → Cloudflare Worker (API gateway) → D1 (event store)
                                                → KV (session cache)
                                                → Workers Cron (aggregation)
```

### Event Ingestion

Every CCFish client sends a lightweight POST request to a Cloudflare Worker endpoint whenever a meaningful event happens. The payload is minimal:

```json
{
  "event": "level_complete",
  "player_id": "abc123",
  "session_id": "sess_456",
  "metadata": {
    "level": 7,
    "score": 12500,
    "time_played_s": 342
  },
  "timestamp": "2026-05-07T10:30:00Z"
}
```
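The Worker route that receives these payloads is small. Here's a minimal sketch, assuming a D1 binding named `DB`; the `validateEvent` helper and the stub `D1Database` interface are ours for illustration, not part of any SDK:

```typescript
// Shape of the payload shown above.
interface GameEvent {
  event: string;
  player_id: string;
  session_id?: string;
  metadata?: Record<string, unknown>;
  timestamp: string;
}

// Minimal stub of the D1 binding surface this sketch uses.
interface D1Database {
  prepare(query: string): {
    bind(...values: unknown[]): { run(): Promise<unknown> };
  };
}

interface Env {
  DB: D1Database;
}

// Reject malformed payloads before touching the database.
export function validateEvent(body: unknown): body is GameEvent {
  if (typeof body !== "object" || body === null) return false;
  const e = body as Record<string, unknown>;
  return (
    typeof e.event === "string" &&
    typeof e.player_id === "string" &&
    typeof e.timestamp === "string" &&
    !Number.isNaN(Date.parse(e.timestamp))
  );
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    if (request.method !== "POST") return new Response(null, { status: 405 });

    const body = await request.json().catch(() => null);
    if (!validateEvent(body)) return new Response("bad event", { status: 400 });

    // created_at is server-assigned via the column default; the client
    // timestamp is validated but not trusted for ordering.
    await env.DB
      .prepare(
        "INSERT INTO raw_events (id, player_id, event_type, session_id, metadata) VALUES (?, ?, ?, ?, ?)"
      )
      .bind(
        crypto.randomUUID(),
        body.player_id,
        body.event,
        body.session_id ?? null,
        JSON.stringify(body.metadata ?? {})
      )
      .run();

    return new Response(null, { status: 204 });
  },
};
```

Validating before the write keeps garbage out of `raw_events` and lets the client fail fast on a 400 instead of silently polluting the aggregates.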

### Data Pipeline

The pipeline runs in three stages:

| Stage | Component | Frequency | Purpose |
|-------|-----------|-----------|---------|
| Ingestion | Workers route | Real-time | Write raw events to D1 |
| Processing | Cron Worker | Every 30 min | Aggregate raw events into daily stats |
| Reporting | API endpoint | On demand | Serve dashboard data to the frontend |

### D1 Schema

We keep two tables for efficiency:

```sql
-- Raw events (TTL-managed, trimmed weekly)
CREATE TABLE raw_events (
  id TEXT PRIMARY KEY,
  player_id TEXT NOT NULL,
  event_type TEXT NOT NULL,
  session_id TEXT,
  metadata TEXT, -- JSON blob
  created_at TEXT NOT NULL DEFAULT (datetime('now'))
);

CREATE INDEX idx_events_type ON raw_events(event_type, created_at);

-- Aggregated daily stats
CREATE TABLE daily_stats (
  date TEXT NOT NULL,
  metric TEXT NOT NULL,
  value REAL NOT NULL,
  PRIMARY KEY (date, metric)
);
```
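The 30-minute cron then rolls raw events into `daily_stats`. A sketch of the scheduled Worker, again assuming a `DB` binding (`utcDay` is an illustrative helper, not part of the Workers API); the SQLite upsert makes the job idempotent, so re-running it within the same day overwrites today's row instead of double-counting:

```typescript
// Minimal stub of the D1 binding surface this sketch uses.
interface D1Database {
  prepare(query: string): {
    bind(...values: unknown[]): { run(): Promise<unknown> };
  };
}

interface Env {
  DB: D1Database;
}

// UTC day bucket ('YYYY-MM-DD') for a timestamp, matching daily_stats.date.
export function utcDay(isoTimestamp: string): string {
  return new Date(isoTimestamp).toISOString().slice(0, 10);
}

// Recompute one day's DAU from raw events. The ON CONFLICT clause turns
// the insert into an upsert keyed on (date, metric).
const DAU_UPSERT = `
  INSERT INTO daily_stats (date, metric, value)
  SELECT ?1, 'dau', COUNT(DISTINCT player_id)
  FROM raw_events
  WHERE date(created_at) = ?1
  ON CONFLICT (date, metric) DO UPDATE SET value = excluded.value;
`;

export default {
  async scheduled(_controller: unknown, env: Env): Promise<void> {
    await env.DB.prepare(DAU_UPSERT).bind(utcDay(new Date().toISOString())).run();
  },
};
```

The same pattern extends to the other metrics: one upsert per metric name, all keyed on the `(date, metric)` primary key.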

## The Dashboard

The dashboard itself is a simple Astro page that queries the aggregated data endpoint. It shows:

- **Daily Active Users (DAU)** — line chart over 30 days

- **Average Session Length** — rolling 7-day average

- **Average Revenue Per Daily Active User (ARPDAU)** — split by ad revenue vs. IAP

- **Event Funnel** — level 1 → level 5 → level 10 completion rates

All charts render server-side with a lightweight charting library; no client-side JavaScript tracking is required.
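The endpoint behind those charts is just a parameterized read of `daily_stats`. A sketch, assuming the same `DB` binding; the route shape, metric names, and the `isAllowedMetric` whitelist helper are illustrative:

```typescript
// Minimal stub of the D1 binding surface this sketch uses.
interface D1Database {
  prepare(query: string): {
    bind(...values: unknown[]): { all(): Promise<{ results: unknown[] }> };
  };
}

interface Env {
  DB: D1Database;
}

// Whitelist of metrics the dashboard may request, so the public-facing
// frontend can never query arbitrary rows.
const ALLOWED_METRICS = new Set(["dau", "avg_session_s", "arpdau"]);

export function isAllowedMetric(metric: string): boolean {
  return ALLOWED_METRICS.has(metric);
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const url = new URL(request.url);
    const metric = url.searchParams.get("metric") ?? "dau";
    if (!isAllowedMetric(metric)) {
      return new Response("unknown metric", { status: 400 });
    }

    // Last 30 days of one metric, oldest first, ready for a line chart.
    const { results } = await env.DB
      .prepare(
        `SELECT date, value FROM daily_stats
         WHERE metric = ?1 AND date >= date('now', '-30 days')
         ORDER BY date ASC`
      )
      .bind(metric)
      .all();

    return Response.json(results);
  },
};
```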

## Cost Analysis

| Service | Monthly Cost | Volume |
|---------|--------------|--------|
| Cloudflare Workers (ingestion) | $0 | 100k requests/day, within free tier |
| D1 Database | $0 | 5 GB storage, 5M reads/month |
| Workers KV (session cache) | $0 | 1 GB storage, 10M reads/month |
| **Total** | **$0** | Up to 3M events/month |

Compare this to Mixpanel at $25/month (starter) or Amplitude at $33/month (growth plan): we save hundreds of dollars a year with no feature trade-offs for our use case.

## What We Learned

1. **Schema-first design pays off.** We spent an hour designing the D1 schema upfront and it's handled three months of data without migrations.

2. **Cron-based aggregation is essential.** Querying raw events for a 30-day DAU chart was taking 3+ seconds. Pre-aggregated daily stats return in under 10ms.

3. **Keep the pipeline simple.** We avoided message queues, stream processors, and data lakes. Workers + D1 handles everything with sub-100ms p95 latency.

4. **D1's read replication is a cheat code.** Our players are in the US and Asia; D1's read replicas serve queries from the region nearest each player automatically.

If your mobile game or app needs analytics and you're tired of paying per-event fees, give Workers + D1 a shot. The setup takes an afternoon and pays for itself on day one.