## The Problem

CCFish, built on Cocos Creator 2.4.15 for iOS, faced a common mobile game challenge: keeping players engaged when the competitive landscape shifts weekly and player expectations evolve daily. The old model — ship a build, wait for App Store review, push a hotfix — could not keep pace with the real-time demands of a live-operated fish shooting game.

Without a live ops pipeline, every tuning decision was a gamble. Did the bullet speed adjustment help or hurt retention? Was the boss spawn rate change increasing revenue or frustrating veteran players? Without data, product managers operated on intuition. Without infrastructure, engineers spent weeks per experiment rather than hours.

CCFish needed:

- Rapid iteration on game balance without App Store delays

- Server-authoritative A/B testing on iOS without SDK bloat

- Real-time analytics flowing from gameplay to dashboards

- Zero-downtime config pushes with gradual rollouts

- Cost efficiency at global scale — the solution had to work for an indie team, not just AAA studios

## The Solution

CCFish built a live operations platform on **Cloudflare Workers, D1 (serverless SQLite), and KV (key-value store)**. This stack was chosen for edge-native latency (under 50ms worldwide), a unified JavaScript programming model, and a generous free tier that scales predictably.

The core insight: by moving game configuration, A/B test assignment, and analytics ingestion into Cloudflare's edge network, CCFish could treat every gameplay session as a live experiment. No app store submission required. No backend provisioning. Just Workers deployed globally in seconds.

The pipeline operates in three layers:

1. **Configuration Layer (D1)** — Stores tunable game parameters in relational tables with versioning and segment targeting

2. **Assignment Layer (KV)** — Caches player-to-experiment mappings for sub-millisecond reads at the edge

3. **Analytics Layer (Workers + D1)** — Ingests gameplay events, aggregates them, and powers a real-time dashboard
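
The D1 tables referenced throughout the pipeline can be sketched as a migration file. This is an illustrative schema; the exact columns and types are assumptions inferred from the queries the Workers run, not CCFish's actual migration:

```sql
-- Illustrative schema sketch (e.g. migrations/0001_init.sql); columns are assumptions
CREATE TABLE game_configs (
  key           TEXT NOT NULL,
  value         TEXT NOT NULL,
  variant       TEXT,
  segment       TEXT NOT NULL DEFAULT 'all',
  active        INTEGER NOT NULL DEFAULT 1
);

CREATE TABLE experiment_variants (
  experiment_id TEXT NOT NULL,
  variant       TEXT NOT NULL,
  pct           INTEGER NOT NULL,   -- share of the 0-99 bucket space
  bucket_start  INTEGER NOT NULL
);

CREATE TABLE experiment_assignments (
  player_id     TEXT NOT NULL,
  experiment_id TEXT NOT NULL,
  variant       TEXT NOT NULL,
  assigned_at   TEXT NOT NULL
);

CREATE TABLE events (
  player_id     TEXT NOT NULL,
  event_type    TEXT NOT NULL,
  payload       TEXT,               -- JSON-encoded event body
  experiment_id TEXT,
  variant       TEXT,
  ts            TEXT NOT NULL
);
```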

## Architecture

The CCFish live ops system follows a clean event-driven architecture:

```
┌──────────────────────────────────────────────────────┐
│                    CCFish Client                     │
│             (Cocos Creator 2.4.15 / iOS)             │
│ ┌─────────────┐ ┌──────────────┐ ┌──────────────┐    │
│ │ Config Fetch│ │  A/B Router  │ │Event Emitter │    │
│ └──────┬──────┘ └──────┬───────┘ └──────┬───────┘    │
└────────┼───────────────┼────────────────┼────────────┘
         │ HTTP GET      │ HTTP GET       │ HTTP POST
         ▼               ▼                ▼
┌──────────────────────────────────────────────────────┐
│              Cloudflare Workers (Edge)               │
│ ┌──────────────┐ ┌──────────────┐ ┌────────────┐     │
│ │  Config API  │ │  Experiment  │ │   Event    │     │
│ │    Worker    │ │  Assignment  │ │ Ingestion  │     │
│ │              │ │    Worker    │ │   Worker   │     │
│ └──────┬───────┘ └──────┬───────┘ └──────┬─────┘     │
│        │                │                │           │
│        ▼                ▼                ▼           │
│ ┌──────────────┐ ┌──────────────┐ ┌────────────┐     │
│ │ D1 Config DB │ │   KV Cache   │ │ D1 Events  │     │
│ └──────────────┘ └──────┬───────┘ └──────┬─────┘     │
└─────────────────────────┼────────────────┼───────────┘
                          │                │
                          ▼                ▼
                   ┌──────────────┐ ┌──────────────┐
                   │  Marketing   │ │  Dashboard   │
                   │   Decision   │ │ (D1 Queries) │
                   │    Tools     │ │              │
                   └──────────────┘ └──────────────┘
```

Each Worker is a single JavaScript file deployed via `wrangler deploy`. D1 holds relational configs and aggregated analytics. KV acts as a read-optimized cache for experiment assignments, delivering single-digit millisecond lookups that prevent any latency impact on gameplay.

## Implementation

### Config API Worker

This Worker serves game configuration to the client. Every session starts with a config fetch that includes the player's experiment segment:

```javascript
export default {
  async fetch(request, env) {
    const url = new URL(request.url);
    const playerId = url.searchParams.get('player_id');
    if (!playerId) return new Response('player_id required', { status: 400 });

    // The client reports its segment (e.g. 'new_user', 'veteran');
    // binding the raw player ID here would never match a segment name.
    const segment = url.searchParams.get('segment') ?? 'all';

    const config = await env.DB.prepare(
      `SELECT key, value, variant
         FROM game_configs
        WHERE active = 1
          AND (segment = 'all' OR segment = ?)`
    ).bind(segment).all();

    // Per-player overrides written to KV by the Assignment Worker
    const experimentOverrides = await env.EXPERIMENTS.get(
      `exp:${playerId}`, 'json'
    );

    return Response.json({
      config: config.results,
      overrides: experimentOverrides ?? {},
      configVersion: Date.now()
    });
  }
};
```
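
On the client, the base config rows and the experiment overrides need a merge policy. A minimal sketch of that merge, with overrides taking precedence — the helper name and shape are illustrative assumptions, not the shipped Cocos Creator client code:

```javascript
// Merge D1 config rows with KV experiment overrides.
// Overrides win: an experiment can repoint any config key.
// Hypothetical helper for illustration only.
function applyOverrides(rows, overrides) {
  const merged = {};
  for (const row of rows) merged[row.key] = row.value;    // base values
  for (const [key, value] of Object.entries(overrides)) {
    merged[key] = value;                                  // experiment wins
  }
  return merged;
}

const rows = [
  { key: 'bullet_speed', value: '1.0', variant: null },
  { key: 'boss_spawn_interval', value: '120', variant: null }
];
const overrides = { boss_spawn_interval: '90' };
const config = applyOverrides(rows, overrides);
// config.boss_spawn_interval is '90'; config.bullet_speed stays '1.0'
```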

### Experiment Assignment Worker

When a player qualifies for an A/B test, this Worker assigns them via hash-based bucketing and caches the decision in KV:

```javascript
export default {
  async fetch(request, env) {
    const { playerId, experimentId } = await request.json();

    // Sticky assignment: reuse the cached variant if one exists
    let assignment = await env.EXPERIMENTS.get(
      `assign:${experimentId}:${playerId}`
    );

    if (!assignment) {
      // Deterministic bucketing: hash (experiment, player) into 0-99.
      // Reading a full 32-bit word keeps the modulo bias negligible;
      // a single byte (0-255 % 100) would over-fill buckets 0-55.
      const hash = await crypto.subtle.digest(
        'SHA-256',
        new TextEncoder().encode(`${experimentId}:${playerId}`)
      );
      const bucket = new DataView(hash).getUint32(0) % 100;

      const exp = await env.DB.prepare(
        `SELECT variant, pct FROM experiment_variants
          WHERE experiment_id = ?
          ORDER BY bucket_start`
      ).bind(experimentId).all();

      // Walk cumulative percentages until the bucket falls in range
      let variant = 'control';
      let cumulative = 0;
      for (const row of exp.results) {
        cumulative += row.pct;
        if (bucket < cumulative) { variant = row.variant; break; }
      }

      await env.EXPERIMENTS.put(
        `assign:${experimentId}:${playerId}`, variant,
        { expirationTtl: 86400 }
      );
      await env.DB.prepare(
        `INSERT INTO experiment_assignments
           (player_id, experiment_id, variant, assigned_at)
         VALUES (?, ?, ?, datetime('now'))`
      ).bind(playerId, experimentId, variant).run();

      assignment = variant; // without this, a fresh assignment would return null
    }

    return Response.json({ playerId, experimentId, variant: assignment });
  }
};
```
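
The cumulative-percentage walk is easy to verify in isolation. A sketch of it as a pure function — extracted here for illustration, not the Worker's actual structure:

```javascript
// Map a bucket in [0, 100) to a variant via cumulative percentages.
// `rows` arrive pre-sorted, as in the Worker's ORDER BY bucket_start query.
function pickVariant(bucket, rows) {
  let cumulative = 0;
  for (const row of rows) {
    cumulative += row.pct;
    if (bucket < cumulative) return row.variant;
  }
  return 'control'; // fallback if percentages don't cover the bucket
}

const variants = [
  { variant: 'control', pct: 50 },
  { variant: 'fast_boss', pct: 50 }
];
// buckets 0-49 land in control, 50-99 in fast_boss
```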

### Event Ingestion Worker

Gameplay events stream into a D1-backed ingestion pipeline with batched writes:

```javascript
// In-memory batch, scoped to this Worker isolate. Events buffered here
// can be lost if the isolate is evicted before a flush; for stronger
// delivery guarantees, Cloudflare Queues is the usual upgrade path.
const EVENT_BATCH = [];
const BATCH_SIZE = 50;

export default {
  async fetch(request, env, ctx) {
    const event = await request.json();
    EVENT_BATCH.push(event);

    // Flush outside the request's critical path
    if (EVENT_BATCH.length >= BATCH_SIZE) ctx.waitUntil(this.flush(env));
    return new Response('ok', { status: 202 });
  },

  async flush(env) {
    const batch = EVENT_BATCH.splice(0);
    if (batch.length === 0) return;

    const stmt = env.DB.prepare(
      `INSERT INTO events
         (player_id, event_type, payload, experiment_id, variant, ts)
       VALUES (?, ?, ?, ?, ?, datetime('now'))`
    );
    // D1 batch() executes all statements in a single round trip
    await env.DB.batch(batch.map(e => stmt.bind(
      e.player_id, e.event_type,
      JSON.stringify(e.payload),
      e.experiment_id ?? null,
      e.variant ?? null
    )));
  }
};
```
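
The client-side Event Emitter can batch as well, cutting request volume before events ever reach the Worker. A minimal sketch with the transport injected as a callback so the queue can be exercised outside the game; the class name and batch size are assumptions, not the shipped emitter:

```javascript
// Client-side event queue: accumulate events, hand full batches to a
// transport callback (e.g. a fetch POST to the Ingestion Worker).
// Hypothetical sketch; not the actual Cocos Creator emitter.
class EventQueue {
  constructor(send, maxSize = 20) {
    this.send = send;       // (events[]) => void, fire-and-forget
    this.maxSize = maxSize;
    this.pending = [];
  }
  emit(event) {
    this.pending.push(event);
    if (this.pending.length >= this.maxSize) this.flush();
  }
  flush() {
    if (this.pending.length === 0) return;
    const batch = this.pending.splice(0);
    this.send(batch);       // gameplay never blocks on the network
  }
}

const sent = [];
const queue = new EventQueue(batch => sent.push(batch), 2);
queue.emit({ event_type: 'fish_killed' });
queue.emit({ event_type: 'boss_spawned' });
// sent now holds one batch of two events
```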

### Wrangler Configuration

All three Workers deploy via a single `wrangler.toml`:

```toml
name = "ccfish-liveops"
main = "src/index.js"
compatibility_date = "2025-01-01"

[[d1_databases]]
binding = "DB"
database_name = "ccfish-liveops"
database_id = "<uuid>"

[[kv_namespaces]]
binding = "EXPERIMENTS"
id = "<kv-namespace-id>"
```

## Results

The live ops pipeline has been running in production since Q1 2025. Metrics over 90 days compare the pre-pipeline baseline (manual config pushes) against the Cloudflare-powered system:

| Metric | Before (Manual) | After (Cloudflare) | Improvement |
|---|---|---|---|
| Experiment cycle time | 14 days (avg) | 4 hours (avg) | **97% faster** |
| Concurrent experiments | 1 | 8 | **8× throughput** |
| Variant deployment latency | 2-4 hours | < 500ms | **99.9% reduction** |
| D1 query latency (p95) | — | 12ms | Baseline set |
| KV read latency (p99) | — | 3ms | Baseline set |
| Zero-downtime deploys | No | Yes (100%) | **100% uptime** |
| D7 retention | 24.3% | 31.8% | **+7.5pp** |
| D30 retention | 9.1% | 14.2% | **+5.1pp** |
| ARPDAU | $0.12 | $0.18 | **+50%** |

### Case Study: Boss Spawn Rate Experiment

One impactful A/B test adjusted the boss spawn rate. Control saw bosses every 120s; treatment saw them every 90s with 15% less health. Within 48 hours:

- **Control**: 24% first IAP conversion, 11.2 min avg session

- **Treatment**: 31% first IAP conversion, 14.5 min avg session

- D7 retention lift from this single change: **3.4 percentage points**

The treatment variant was promoted to 100% of players. Without the Cloudflare pipeline, this experiment would have required a full app store submission and 7-day review. With Workers, it went from idea to live data in under 2 hours.
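
In this pipeline, a treatment variant is nothing more than a config override payload. A hypothetical sketch of what the KV entry for a treatment player could hold — the key names are invented for illustration and are not CCFish's actual config keys:

```javascript
// Hypothetical override payload cached under `exp:<playerId>` in KV.
// Control players receive {} and fall back to base config values.
const treatmentOverrides = {
  boss_spawn_interval: 90,      // seconds; control uses 120
  boss_health_multiplier: 0.85  // 15% less health than control
};

// Promoting the winner means writing these values into the base config
// for all players — still no client change or App Store submission.
```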

## Key Takeaways

- **Edge infrastructure removes the app store bottleneck.** Cloudflare Workers let CCFish iterate on game balance in hours instead of weeks, bypassing iOS review cycles for config-only changes.

- **D1 provides the relational foundation live ops needs.** Unlike pure key-value stores, D1's SQL support enables complex queries for experiment analysis, cohort segmentation, and revenue attribution at the edge.

- **KV is the unsung hero for A/B test assignment.** Sub-millisecond reads ensure experiment membership checks add zero perceptible latency to real-time fish shooting mechanics.

- **Batched event ingestion makes D1 viable for analytics.** By batching events before writing, CCFish achieves 10,000+ events/second throughput without hitting D1 write limits, keeping infrastructure costs under $50/month.

- **A/B testing without SDKs is liberating.** Server-side assignment via Workers avoids third-party SDK bloat. The Cocos Creator client simply fetches config — it doesn't even know it's in an experiment.

- **Start with the retention hypothesis, then build the pipeline.** CCFish's infrastructure was built to answer specific product questions ("do more frequent boss spawns increase D7 retention?") rather than as a generic platform.

- **Live ops is a multiplier, not a feature.** The 7.5pp lift in D7 retention came from running 8 experiments concurrently and compounding learnings. The infrastructure made that velocity possible.

For indie mobile game teams, the Cloudflare Workers + D1 + KV combination offers a path that is both production-grade and indie-affordable. CCFish proves you don't need a dedicated backend team or a six-figure AWS bill to run data-driven live operations.