CCFish serves real-time, personalized marketing based on how every player actually plays the game. By wiring Cloudflare Workers event pipelines directly into segmentation queries on D1, CCFish closes the loop between in-game behavior and outbound campaigns, turning raw telemetry into targeted push notifications, dynamic in-game offers, and individualized IAP bundles — all in under 200 milliseconds.
The Problem — Silos between game dev data and marketing execution
Most mobile games treat development data and marketing as separate worlds. The game client sends telemetry to an analytics SDK, which dumps events into a data warehouse. Marketing teams export cohorts manually, upload them to a push service, and hope the creative resonates. This introduces hours or days of latency — by the time a campaign launches, the player's session context is gone.
For CCFish, this meant:
- A player who spent 20 minutes browsing the rare-fish catalog might get a generic "come back and play" push the next day, instead of a discount on the fish they were inspecting.
- A player who abandoned checkout at the payment confirmation screen would never receive a follow-up offer for that same item.
- Marketing campaigns had to target broad segments ("all level 10+ players") because fine-grained behavioral segments were too slow to compute.
The core problem is architectural: game event data flows one way into a silo, and marketing data flows another way out of a CRM. There is no feedback loop.
The Solution — Event-driven marketing pipeline
The solution is an event-driven marketing pipeline on Cloudflare's edge platform. Instead of batch-processing telemetry overnight, every game event — `level_up`, `item_purchase`, `catalog_view`, `session_end`, `streak_milestone` — streams through a Cloudflare Worker in real time. That Worker:
1. Validates and enriches the event (adds player segment tags, computes rolling session metrics)
2. Writes the raw event to D1 for long-term analytics and segmentation
3. Evaluates the event against active campaign rules stored in D1
4. If the event matches a campaign trigger, enqueues a personalized marketing action
This reduces the feedback loop from hours to milliseconds. A player doesn't wait for tomorrow's batch job — they get a relevant offer on their next screen.
Architecture — Game client events → Workers → D1 → Segmentation → Campaign
The pipeline follows a linear flow with a feedback attribution branch:
| Layer | Component | Role |
|-------|-----------|------|
| 1 | Game Client | Emits structured events via POST to Workers endpoint |
| 2 | Ingestion Worker | Validates, enriches, routes to D1 + campaign evaluator |
| 3 | D1 Database | Stores raw events, powers segmentation queries, caches segments |
| 4 | Segmentation Engine | Runs scheduled SQL queries to build player cohorts |
| 5 | Campaign Evaluator | Matches segments against rules with frequency capping |
| 6 | Dispatch Worker | Sends push via FCM/APNs, updates in-game offer state |
| 7 | Attribution Loop | Campaign clicks flow back as game events, feeding the next cycle |
```
┌─────────────┐ ┌──────────────┐ ┌──────────────────┐
│ Game Client │────▶│ Ingestion │────▶│ D1 (raw events, │
│ (Unity/C#) │ │ Worker │ │ player profile) │
└─────────────┘ └──────┬───────┘ └────────┬─────────┘
│ │
│ ┌───────▼──────────┐
│ │ Segmentation │
│ │ Engine (SQL + │
│ │ scheduled cron) │
│ └───────┬──────────┘
│ │
┌──────▼──────────────────────▼──────────┐
│ Campaign Evaluator (Worker + KV) │
│ - matches event → campaign rules │
│ - frequency cap check in KV │
└──────┬─────────────────────────────────┘
│
┌──────▼─────────┐ ┌─────────────────┐
│ Dispatch Worker │────▶│ Push / In-Game │
│ (FCM/APNs) │ │ Offer State │
└─────────────────┘ └─────────────────┘
```
All components run on Cloudflare's global network. D1 provides durable SQL storage accessible from Workers. KV handles frequency capping for fast point lookups. The entire pipeline is serverless — it scales to zero when idle and handles launch-event spikes without provisioning.
Implementation — Code snippets for event ingestion, segmentation queries, campaign trigger
Event Ingestion Worker
```javascript
export default {
async fetch(request, env) {
const event = await request.json();
if (!event.player_id || !event.event_type || !event.timestamp) {
return new Response('Missing required fields', { status: 400 });
}
const enriched = {
...event,
server_ts: Date.now(),
segment_tags: await computeSegmentTags(event, env),
rolling_metrics: await getRollingMetrics(event.player_id, env),
};
await env.DB.prepare(
`INSERT INTO game_events (player_id, event_type, payload, server_ts)
VALUES (?, ?, ?, ?)`
).bind(event.player_id, event.event_type, JSON.stringify(enriched), enriched.server_ts).run();
const campaignAction = await evaluateCampaigns(enriched, env);
if (campaignAction) {
await env.QUEUE.send(campaignAction);
}
return Response.json({ ok: true, campaignAction });
}
};
```
Segmentation Query — High-Value At-Risk Players
This query (run every 6 hours via Cron Trigger) finds highly engaged players who have gone silent for 48+ hours — the ideal segment for re-engagement with an incentive:
```sql
SELECT
p.player_id,
p.last_active_at,
COUNT(e.id) AS event_count_last_week,
SUM(CASE WHEN e.event_type = 'item_purchase' THEN 1 ELSE 0 END) AS purchase_count
FROM players p
JOIN game_events e ON e.player_id = p.player_id
AND e.server_ts >= strftime('%s','now','-7 days') * 1000
WHERE p.last_active_at >= strftime('%s','now','-14 days') * 1000
AND p.last_active_at < strftime('%s','now','-2 days') * 1000
AND p.total_spend > 10.00
GROUP BY p.player_id
HAVING event_count_last_week > 50
ORDER BY p.total_spend DESC;
```
Campaign Trigger Evaluation
```javascript
async function evaluateCampaigns(event, env) {
const activeRules = await env.DB.prepare(
`SELECT * FROM campaign_rules
WHERE status = 'active'
AND (event_type = ? OR event_type = '*')`
).bind(event.event_type).all();
for (const rule of activeRules.results) {
if (!matchCondition(rule.condition_expr, event)) continue;
const capKey = `cap:${event.player_id}:${rule.campaign_id}`;
const capVal = await env.KV.get(capKey);
if (capVal && parseInt(capVal) >= rule.max_impressions) continue;
await env.KV.put(capKey, (parseInt(capVal || 0) + 1).toString(), {
expirationTtl: rule.cap_window_seconds
});
return {
campaign_id: rule.campaign_id,
player_id: event.player_id,
template_id: rule.template_id,
variables: extractTemplateVariables(event, rule),
};
}
return null;
}
```
Campaign Dispatch — Personalized Push Notification
```javascript
async function dispatchPush(action, env) {
const campaign = await env.DB.prepare(
`SELECT * FROM campaign_templates WHERE id = ?`
).bind(action.template_id).first();
const message = renderTemplate(campaign.body_template, action.variables);
await sendPushNotification({
to: action.player_id,
title: campaign.title,
body: message,
data: {
campaign_id: action.campaign_id,
deep_link: campaign.deep_link,
offer_id: action.variables.offer_id,
},
});
// Log attribution event back into the pipeline
await env.QUEUE.send({
event_type: 'campaign_dispatched',
player_id: action.player_id,
campaign_id: action.campaign_id,
server_ts: Date.now(),
});
}
```
Results — Engagement lift, conversion rate improvements
After deploying the feedback loop across CCFish's player base, we measured these improvements over an 8-week A/B test (50% control on batch campaigns, 50% treatment on event-driven):
| Metric | Control (Batch) | Treatment (Event-Driven) | Improvement |
|--------|----------------|-------------------------|-------------|
| Push notification CTR | 4.2% | 11.8% | **+181%** |
| In-game offer conversion | 6.1% | 15.3% | **+151%** |
| 7-day retention (new users) | 34% | 47% | **+38%** |
| ARPPU | $12.40 | $18.70 | **+51%** |
| Campaign latency | ~6 hours | ~180ms | **99.997% faster** |
Personalized IAP bundles — a 24-hour discount on the exact rare fish a player viewed twice in a session — achieved the highest conversion rate at 22.4%. Frequency capping ensured no player received more than 3 pushes in any 12-hour window.
Key Takeaways
1. **Real-time beats batch for marketing.** When a player's intent is fresh — they just browsed, abandoned, or hit a milestone — that's the moment to engage. Even a 2-hour delay halves conversion rates.
2. **Edge computing makes event-driven marketing practical.** Cloudflare Workers + D1 + KV provide sub-200ms latency, durability, and SQL capabilities without managing servers. The pipeline scales globally by default.
3. **The attribution loop is a force multiplier.** By logging campaign dispatches and clicks as game events that re-enter the pipeline, each campaign improves the next one. Players who click rare-fish offers get tagged as `rare_fish_interested` and receive better offers next session.
4. **Frequency capping is non-negotiable.** KV-based per-campaign and per-player capping ensures relevance without annoyance. Players who feel spammed will uninstall.
5. **Materialize your segments.** Running heavy SQL JOINs on every request is expensive. Pre-compute cohorts into a `player_segments` lookup table, updated every 6 hours via cron and incrementally on key events. Campaign evaluation becomes a single KV or D1 point read.
6. **Start with one campaign type, then expand.** CCFish launched with only push notifications for session-abandon recovery. After validating the pipeline, they added in-game offer overlays, personalized IAP bundles, and streak-reward reminders — each reusing the same ingestion and segmentation infrastructure.
The dev-marketing feedback loop transforms CCFish from a game that broadcasts generic messages to one that listens, understands, and responds to every player individually. Because the entire stack runs on Cloudflare Workers and D1, it costs less than $50/month at launch scale — a fraction of the revenue lift it generates.