The Problem

CCFish offers five in-app purchase tiers: a $1.99 starter pack, a $4.99 fish bundle, a $9.99 premium pass, a $19.99 whale pack, and a $49.99 lifetime unlock. But which price point actually maximizes revenue? Conventional wisdom says $4.99 is the sweet spot for casual games, but that's an industry average — CCFish's players might behave differently.

Running price experiments manually is a nightmare. You need to:

- Segment users into control and test groups

- Enforce consistent pricing per user (not per session)

- Track conversion rates across cohorts

- Wait for statistical significance

- Roll out the winner without downtime

Doing this with a traditional backend means feature flags, experiment frameworks, and a data pipeline — weeks of engineering for what should be a config change.

The Solution: Experiment Engine on Cloudflare Workers

We built a lightweight experiment engine that runs entirely on Cloudflare Workers + D1. It handles segmentation, price assignment, conversion tracking, and winner declaration — all in under 50 lines of Worker code and three small D1 tables.

Architecture

```
Player API Request → Worker (experiment middleware)
    → D1 lookup: player_id → experiment group
    → Return variant price → Game Client

Purchase event → Worker → D1 (conversion log)

Cron Trigger (daily) → Statistical Analysis
    → Auto-declare winner → Update experiment config
```

The experiment middleware intercepts IAP price requests. If the player is enrolled in an active experiment, it returns the variant price instead of the control. Post-purchase events are logged to D1 for analysis.

Step 1: Define the Experiment Schema

```sql
CREATE TABLE experiments (
  id             TEXT PRIMARY KEY,
  product_id     TEXT NOT NULL,          -- e.g., "starter_pack"
  control_price  INTEGER NOT NULL,       -- price in cents
  variants       TEXT NOT NULL,          -- JSON array of variant prices (cents)
  traffic_pct    INTEGER DEFAULT 50,     -- % of users enrolled in the experiment
  status         TEXT DEFAULT 'running', -- running, paused, winner_declared
  winner_variant TEXT,                   -- declared winner (variant index)
  created_at     TEXT,
  started_at     TEXT,
  ended_at       TEXT
);

CREATE TABLE experiment_assignments (
  player_id     TEXT NOT NULL,
  experiment_id TEXT NOT NULL,
  variant       TEXT NOT NULL,           -- 'control' or 'variant_0', 'variant_1', etc.
  assigned_at   TEXT,
  PRIMARY KEY (player_id, experiment_id)
);

CREATE TABLE experiment_events (
  id            INTEGER PRIMARY KEY AUTOINCREMENT,
  player_id     TEXT NOT NULL,
  experiment_id TEXT NOT NULL,
  event_type    TEXT NOT NULL,           -- 'assigned', 'impression', 'purchase'
  variant       TEXT NOT NULL,
  revenue_cents INTEGER DEFAULT 0,
  occurred_at   TEXT
);
```

Step 2: Experiment Middleware Worker

Every time the game client fetches a product price, the Worker checks for active experiments:

```typescript
interface Env {
  DB: D1Database; // from @cloudflare/workers-types
}

async function getProductPrice(playerId: string, productId: string, env: Env) {
  // Find the active experiment for this product
  const experiment = await env.DB.prepare(
    "SELECT * FROM experiments WHERE product_id = ? AND status = 'running'"
  ).bind(productId).first<any>();

  if (!experiment) return getDefaultPrice(productId); // no experiment — fallback

  // Check whether the player is already assigned
  let assignment = await env.DB.prepare(
    "SELECT variant FROM experiment_assignments WHERE player_id = ? AND experiment_id = ?"
  ).bind(playerId, experiment.id).first<{ variant: string }>();

  if (!assignment) {
    // Enroll the new player: traffic_pct % of players enter the experiment
    const isInExperiment = Math.random() * 100 < experiment.traffic_pct;
    const variant = isInExperiment ? assignVariant(experiment) : 'control';
    assignment = { variant };

    await env.DB.prepare(
      "INSERT INTO experiment_assignments (player_id, experiment_id, variant, assigned_at) VALUES (?, ?, ?, ?)"
    ).bind(playerId, experiment.id, variant, new Date().toISOString()).run();

    await logEvent(env, playerId, experiment.id, 'assigned', variant, 0);
  }

  // Return the price for the assigned variant
  if (assignment.variant === 'control') {
    return { price: experiment.control_price, cents: true };
  }

  const variantIdx = parseInt(assignment.variant.replace('variant_', ''), 10);
  const variants: number[] = JSON.parse(experiment.variants);
  return { price: variants[variantIdx], cents: true };
}
```

Assignment is sticky per player — once the assignment row is written, a player always sees the same price for that product. This prevents confusion and ensures clean data.
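The `assignVariant` helper referenced above isn't shown in the article; here is a minimal sketch (an assumption, not CCFish's exact implementation) that splits enrolled players uniformly across the prices in the `variants` JSON column:

```typescript
// Hypothetical sketch: pick one variant uniformly at random.
// `experiment.variants` is the JSON array of variant prices stored in D1.
function assignVariant(experiment: { variants: string }): string {
  const variants: number[] = JSON.parse(experiment.variants);
  const idx = Math.floor(Math.random() * variants.length);
  return `variant_${idx}`; // matches the 'variant_0', 'variant_1' naming in the schema
}
```

Since `traffic_pct` already decided whether the player is in the experiment at all, a uniform split among variants is the simplest choice; weighted splits would only need a different index calculation here.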

Step 3: Track Conversions

When a player completes a purchase, the Worker logs the event before fulfilling the IAP:

```typescript
async function onPurchaseComplete(playerId: string, productId: string, revenueCents: number, env: Env) {
  const experiment = await getActiveExperiment(productId, env);
  if (!experiment) return;

  const assignment = await env.DB.prepare(
    "SELECT variant FROM experiment_assignments WHERE player_id = ? AND experiment_id = ?"
  ).bind(playerId, experiment.id).first<{ variant: string }>();

  if (assignment) {
    await logEvent(env, playerId, experiment.id, 'purchase', assignment.variant, revenueCents);
  }
}
```
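The `logEvent` helper used in both steps is just an insert into `experiment_events`. A sketch, with the signature inferred from the call sites (the structural `env` type is an assumption that also makes it easy to mock in tests):

```typescript
// Hypothetical sketch matching the call sites above:
// logEvent(env, playerId, experimentId, eventType, variant, revenueCents)
async function logEvent(
  env: { DB: { prepare(sql: string): { bind(...args: unknown[]): { run(): Promise<unknown> } } } },
  playerId: string,
  experimentId: string,
  eventType: 'assigned' | 'impression' | 'purchase',
  variant: string,
  revenueCents: number
): Promise<void> {
  // Append-only event log; analysis aggregates these rows nightly
  await env.DB.prepare(
    "INSERT INTO experiment_events (player_id, experiment_id, event_type, variant, revenue_cents, occurred_at) VALUES (?, ?, ?, ?, ?, ?)"
  ).bind(playerId, experimentId, eventType, variant, revenueCents, new Date().toISOString()).run();
}
```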

Step 4: Statistical Analysis (Daily Cron)

Every night at 3 AM UTC, a cron Worker runs Bayesian analysis on experiment data:

```typescript
async function analyzeExperiments(env: Env) {
  const experiments = await env.DB.prepare(
    "SELECT * FROM experiments WHERE status = 'running'"
  ).all();

  for (const exp of experiments.results) {
    const stats = await computeStats(env, exp.id);

    // Declare a winner at >= 95% confidence, with a minimum sample size
    if (stats.winnerProb > 0.95 && stats.totalEvents >= 500) {
      await env.DB.prepare(
        "UPDATE experiments SET status = 'winner_declared', winner_variant = ?, ended_at = ? WHERE id = ?"
      ).bind(stats.winningVariant, new Date().toISOString(), exp.id).run();
    }
  }
}
```

We use a simplified Bayesian model: Beta(1 + conversions, 1 + non_conversions) per variant, then calculate the probability that each variant is the best via Monte Carlo simulation (10,000 samples — takes about 50ms on Workers).
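The core of that `computeStats` step could be sketched as follows (an assumed implementation, not the exact CCFish code): sample each variant's Beta posterior and count how often each variant comes out on top. The gamma sampler is Marsaglia–Tsang, which is valid here because every Beta shape parameter is at least 1:

```typescript
// Marsaglia–Tsang gamma sampler (requires shape >= 1, which holds because
// the Beta parameters are 1 + counts).
function sampleGamma(shape: number): number {
  const d = shape - 1 / 3;
  const c = 1 / Math.sqrt(9 * d);
  for (;;) {
    let x: number, v: number;
    do {
      // Box–Muller standard normal (1 - random() avoids log(0))
      const u1 = 1 - Math.random(), u2 = Math.random();
      x = Math.sqrt(-2 * Math.log(u1)) * Math.cos(2 * Math.PI * u2);
      v = 1 + c * x;
    } while (v <= 0);
    v = v * v * v;
    const u = Math.random();
    if (u < 1 - 0.0331 * x ** 4) return d * v;
    if (Math.log(u) < 0.5 * x * x + d * (1 - v + Math.log(v))) return d * v;
  }
}

// Beta(a, b) sample as a ratio of gamma samples
function sampleBeta(a: number, b: number): number {
  const g1 = sampleGamma(a);
  const g2 = sampleGamma(b);
  return g1 / (g1 + g2);
}

// counts[i] = { conversions, visitors } per variant (index 0 = control).
// Returns the probability that each variant has the best conversion rate.
function winnerProbabilities(
  counts: { conversions: number; visitors: number }[],
  samples = 10_000
): number[] {
  const wins = new Array(counts.length).fill(0);
  for (let s = 0; s < samples; s++) {
    let best = 0, bestRate = -1;
    counts.forEach((c, i) => {
      const rate = sampleBeta(1 + c.conversions, 1 + (c.visitors - c.conversions));
      if (rate > bestRate) { bestRate = rate; best = i; }
    });
    wins[best]++;
  }
  return wins.map(w => w / samples);
}
```

A real `computeStats` would additionally rank by revenue per visitor rather than raw conversion rate, since that is the metric the experiment optimizes.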

Real Results from CCFish

We ran a three-variant experiment on CCFish's starter pack ($1.99 control). Here's what we found:

| Variant | Price | Conversion Rate | Revenue per Visitor | vs Control |
|-----------|-------|-----------------|---------------------|------------|
| Control | $1.99 | 3.2% | $0.064 | baseline |
| Variant A | $2.99 | 2.8% | $0.084 | **+31%** |
| Variant B | $0.99 | 4.1% | $0.041 | -36% |
| Variant C | $4.99 | 1.1% | $0.055 | -14% |

Counter-intuitive finding: **$2.99 beat $1.99** despite a lower conversion rate. The higher price per purchase more than compensated for the drop in conversions. The winner was declared after 4 days and 612 events (97.3% confidence).
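The revenue-per-visitor figures in the table are simple arithmetic, price times conversion rate, which you can sanity-check directly:

```typescript
// Revenue per visitor = price × conversion rate
function revenuePerVisitor(priceUsd: number, conversionRate: number): number {
  return priceUsd * conversionRate;
}

// Percentage lift of a variant's RPV over the control's, rounded
function liftPct(variantRpv: number, controlRpv: number): number {
  return Math.round((variantRpv / controlRpv - 1) * 100);
}

const control = revenuePerVisitor(1.99, 0.032);  // ≈ $0.064
const variantA = revenuePerVisitor(2.99, 0.028); // ≈ $0.084
// liftPct(variantA, control) → +31
```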

Key Takeaways

- **Price anchoring works.** CCFish players accustomed to $1.99 didn't balk at $2.99 — conversion dropped only 12.5% while revenue per visitor jumped 31%.

- **Never trust industry averages.** The "proven" $4.99 sweet spot for casual games was the worst performer for CCFish.

- **Automated winner declaration prevents analysis paralysis.** Without the cron trigger auto-declaring at 95% confidence, we'd have kept the experiment running for weeks chasing diminishing returns.

- **This system costs $0.20/month.** The experiment engine runs as part of the existing CCFish Worker — no additional infrastructure. At this price, you can A/B test everything: prices, notification copy, reward amounts, level difficulty.