The Challenge
Playable ad campaigns generate massive amounts of performance data. Every impression, click, install, and post-install event flows through ad networks. But the data arrives in different formats, with different latency profiles, and across multiple platforms (Meta, Google, TikTok, Unity). Aggregating all of it into a single real-time dashboard is one of the biggest operational challenges for ad operations teams.
For PlayableAd Studio, clients needed to see campaign performance within minutes of launch, not hours. They wanted to know: Which creative variant is winning? Which audience segment has the highest install rate? Is the spend efficient? Without real-time analytics, campaigns waste budget on underperforming creatives for hours before optimization kicks in. A campaign spending $1,000 per day on 10 variants might waste $300+ per day on the bottom 3 variants before anyone notices the pattern and rotates the creative set.
The Solution
We built a **Real-Time Analytics Pipeline** on Cloudflare Workers, D1, and Durable Objects. The pipeline ingests webhook callbacks from ad networks (Meta CAPI, Google Ads conversions, TikTok Events, Unity Ads), normalizes them into a unified event schema, and writes to D1 for dashboard queries. A Durable Object maintains an in-memory sliding window of the last 15 minutes for sub-second dashboard refreshes without polling the database.
The system supports both push-based (WebSocket) and pull-based (REST polling) dashboard consumption patterns. Campaign managers open a real-time dashboard in their browser that updates every second without a page refresh. Mobile operators get push notifications when critical KPIs cross configurable thresholds like CTR dropping below 0.5 percent.
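The threshold check that drives those push notifications can be sketched as a pure function. The names here (`KpiSnapshot`, `AlertThresholds`, `checkThresholds`) are illustrative, not taken from the actual codebase:

```typescript
// Hypothetical threshold check run before sending a push notification.
interface KpiSnapshot {
  ctr: number; // click-through rate as a fraction, e.g. 0.004 = 0.4%
  cvr: number; // click-to-install conversion rate
}

interface AlertThresholds {
  minCtr: number;
  minCvr: number;
}

// Returns a human-readable alert message for each KPI below its threshold.
function checkThresholds(kpi: KpiSnapshot, t: AlertThresholds): string[] {
  const alerts: string[] = [];
  if (kpi.ctr < t.minCtr) {
    alerts.push(`CTR ${(kpi.ctr * 100).toFixed(2)}% below ${(t.minCtr * 100).toFixed(2)}%`);
  }
  if (kpi.cvr < t.minCvr) {
    alerts.push(`CVR ${(kpi.cvr * 100).toFixed(2)}% below ${(t.minCvr * 100).toFixed(2)}%`);
  }
  return alerts;
}
```

In production the thresholds would come from per-campaign configuration rather than constants.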
Architecture
```
Ad Network Webhooks -> Event Normalizer -> Durable Object (15-min window) -> Dashboard API
                              |                          |
                              v                          v
                    D1 (long-term storage)   WebSocket push to frontend
```
Stage 1: Webhook Ingestion
Each ad network sends webhooks to a shared Worker endpoint at `/_webhooks/:platform`. The endpoint validates the HMAC signature, extracts the event payload, and normalizes it into a universal schema. Each network has its own normalizer function that handles field mapping, timestamp conversion, and deduplication. Normalized events include campaign ID, creative ID, platform, event type, timestamp, spend value, currency, and metadata about device, country, and placement.
```typescript
interface NormalizedEvent {
  campaignId: string;
  creativeId: string;
  platform: 'meta' | 'google' | 'tiktok' | 'unity';
  eventType: 'impression' | 'click' | 'install' | 'purchase';
  timestamp: number;
  value: number;
  currency: string;
  metadata: Record<string, string>;
}
```
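A per-network normalizer then maps each raw payload into that schema. The raw Meta shape below is a simplified illustration, not the real Meta CAPI schema, and the event-name mapping is an assumption for the sketch (the `NormalizedEvent` interface is repeated so the snippet compiles standalone):

```typescript
// NormalizedEvent repeated from above so this sketch is self-contained.
interface NormalizedEvent {
  campaignId: string;
  creativeId: string;
  platform: 'meta' | 'google' | 'tiktok' | 'unity';
  eventType: 'impression' | 'click' | 'install' | 'purchase';
  timestamp: number;
  value: number;
  currency: string;
  metadata: Record<string, string>;
}

// Hypothetical raw payload shape for demonstration only.
interface MetaRawEvent {
  campaign_id: string;
  ad_id: string;
  event_name: string; // e.g. "MobileAppInstall"
  event_time: number; // unix seconds
  value?: number;
  currency?: string;
  country?: string;
}

// Illustrative mapping from network-specific event names to the unified types.
const META_EVENT_MAP: Record<string, NormalizedEvent['eventType']> = {
  ViewContent: 'impression',
  Click: 'click',
  MobileAppInstall: 'install',
  Purchase: 'purchase',
};

function normalizeMetaEvent(raw: MetaRawEvent): NormalizedEvent {
  const eventType = META_EVENT_MAP[raw.event_name];
  if (!eventType) throw new Error(`unmapped Meta event: ${raw.event_name}`);
  return {
    campaignId: raw.campaign_id,
    creativeId: raw.ad_id,
    platform: 'meta',
    eventType,
    timestamp: raw.event_time * 1000, // seconds -> milliseconds
    value: raw.value ?? 0,
    currency: raw.currency ?? 'USD',
    metadata: raw.country ? { country: raw.country } : {},
  };
}
```

Unknown event names fail loudly rather than silently dropping data, which matters given the schema-drift problem noted in the takeaways.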
Stage 2: Durable Object Real-Time Window
A Durable Object maintains a circular buffer of the last 15 minutes of events. The buffer stores up to 500,000 events in memory. When a new event arrives, it updates running aggregates (impressions, CTR, CVR, spend, ROAS) and broadcasts the delta to all connected dashboard clients via WebSocket. The DO implements a custom gossip protocol for cross-region synchronization when campaign managers are distributed globally across time zones.
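The core of the window logic can be sketched independently of the Durable Object plumbing. This is a simplified, pure-logic version: the WebSocket broadcast, DO state, and fetch handler are omitted, and the class and field names are illustrative:

```typescript
// Time-pruned event buffer with running aggregates, as a DO might hold it.
type EventType = 'impression' | 'click' | 'install' | 'purchase';

interface WindowEvent {
  eventType: EventType;
  timestamp: number; // ms since epoch
}

class SlidingWindow {
  private events: WindowEvent[] = [];
  constructor(private windowMs = 15 * 60 * 1000) {}

  add(e: WindowEvent, now = Date.now()): void {
    this.events.push(e);
    this.prune(now);
  }

  private prune(now: number): void {
    const cutoff = now - this.windowMs;
    // Events arrive roughly in order, so expired ones sit at the front.
    while (this.events.length && this.events[0].timestamp < cutoff) {
      this.events.shift();
    }
  }

  aggregates(now = Date.now()) {
    this.prune(now);
    let impressions = 0, clicks = 0, installs = 0;
    for (const e of this.events) {
      if (e.eventType === 'impression') impressions++;
      else if (e.eventType === 'click') clicks++;
      else if (e.eventType === 'install') installs++;
    }
    return {
      impressions,
      clicks,
      installs,
      ctr: impressions ? clicks / impressions : 0,
      cvr: clicks ? installs / clicks : 0,
    };
  }
}
```

In the real DO, `add` would also compute the delta against the previous aggregates and broadcast it to connected WebSocket clients.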
Stage 3: D1 Long-Term Storage
Events also flow to D1 for historical queries. The database uses two aggregation layers to balance query speed with storage cost:
```sql
-- Hourly rollups for real-time dashboard time-series charts
CREATE TABLE agg_hourly_performance (
  campaign_id TEXT,
  creative_id TEXT,
  hour TIMESTAMP,
  impressions INTEGER,
  clicks INTEGER,
  installs INTEGER,
  spend_cents INTEGER,
  revenue_cents INTEGER,
  PRIMARY KEY (campaign_id, creative_id, hour)
);

-- Daily rollups for weekly trend reports and client performance summaries
CREATE TABLE agg_daily_creative_performance (
  campaign_id TEXT,
  creative_id TEXT,
  date TEXT,
  impressions INTEGER,
  ctr REAL,
  cvr REAL,
  ecpi_cents INTEGER,
  PRIMARY KEY (campaign_id, creative_id, date)
);
```
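One way to maintain the hourly table is an upsert per event batch; a sketch assuming D1's SQLite upsert syntax, with `?1`..`?8` as bind placeholders and the binding code omitted:

```sql
-- Accumulate a batch's counts into the matching hourly row, creating it if absent.
INSERT INTO agg_hourly_performance
  (campaign_id, creative_id, hour, impressions, clicks, installs, spend_cents, revenue_cents)
VALUES (?1, ?2, ?3, ?4, ?5, ?6, ?7, ?8)
ON CONFLICT (campaign_id, creative_id, hour) DO UPDATE SET
  impressions   = impressions   + excluded.impressions,
  clicks        = clicks        + excluded.clicks,
  installs      = installs      + excluded.installs,
  spend_cents   = spend_cents   + excluded.spend_cents,
  revenue_cents = revenue_cents + excluded.revenue_cents;
```

Batching the upserts (rather than one per event) keeps D1 write volume manageable at tens of thousands of events per minute.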
Results
In the 12 weeks since deploying the analytics pipeline for PlayableAd Studio clients:
- **Dashboard latency**: Under 200ms from event ingestion to visible dashboard update on the frontend
- **Creative optimization speed**: Campaigns optimized 6x faster -- hours instead of days between iterations
- **Spend efficiency**: 18% reduction in wasted ad spend from faster detection of underperforming variants
- **Scale**: 50,000+ events per minute processed with zero backpressure across 10 simultaneous campaigns
- **Cost**: $12 per month for Durable Object plus D1 at peak traffic with 10 campaigns running 24 hours a day
- **Uptime**: 99.97% over 3 months, with only a single 4-minute incident caused by an upstream ad network API rate limit
Key Takeaways
- Durable Objects are ideal for real-time sliding-window analytics with sub-second refreshes and zero database polling
- D1 aggregation tables make historical queries fast without a dedicated OLAP database or expensive data warehouse
- Webhook normalization is the hardest part: each ad network has different schemas, retry logic, deduplication rules, and HMAC signature schemes that must be maintained independently
- The DO plus D1 combination gives the best of both worlds: real-time event visibility and long-term storage at serverless cost
- WebSocket broadcasting from DO to dashboard frontend eliminates the need for a real-time messaging service like Pusher
- Per-normalizer test coverage is essential since an unannounced schema change at any ad network breaks the pipeline silently
- Every ad network guarantees at-least-once delivery, so the pipeline must handle duplicate events with idempotency keys stored in D1 with a 24-hour TTL
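The idempotency-key pattern from the last takeaway can be sketched with an in-memory store. In production the keys live in D1 with a 24-hour TTL; this Map-based version, with illustrative names, only shows the accept/reject logic:

```typescript
// In-memory stand-in for the D1-backed idempotency-key table.
class IdempotencyStore {
  private seen = new Map<string, number>(); // key -> expiry (ms since epoch)
  constructor(private ttlMs = 24 * 60 * 60 * 1000) {}

  /** Returns true if the event is new and should be processed. */
  tryAccept(key: string, now = Date.now()): boolean {
    const expiry = this.seen.get(key);
    if (expiry !== undefined && expiry > now) return false; // duplicate within TTL
    this.seen.set(key, now + this.ttlMs);
    return true;
  }
}
```

A natural key is `platform:eventId`, since each network's retries reuse the same event ID.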