CCFish ships marketing experiments server-side using Cloudflare Workers and D1, enabling A/B tests on push notifications, in-app purchase offers, and onboarding messages without ever touching the App Store review queue. By moving feature flags and experiment routing to the edge, the team collapsed iteration cycles from weeks to minutes.
The Problem — App Store Review Delays Kill Marketing Agility
Every marketing team shipping native mobile apps knows this pain. You want to test a new push notification, a discounted IAP tier, or a redesigned onboarding flow. In a traditional client-side setup, that means:
1. Hard-coding variants into the app binary
2. Submitting to Apple and Google for review
3. Waiting 24–72 hours for approval
4. Gradually rolling out the new binary
5. If the variant underperforms, starting the cycle over
That delay kills marketing agility. A flash sale or seasonal promotion can't wait three days for App Review. And if an experiment tanks conversion, you're stuck until the next binary ships.
CCFish faced this directly. With a growing user base across 15 countries, the team needed to iterate on pricing, notifications, and copy at a cadence the app stores couldn't support.
| Pain Point | Client-Side | Server-Side |
|---|---|---|
| Time to launch | 1–3 days | 5 minutes |
| Rollback speed | 1–3 days | Instant |
| Variant count | Binary-size limited | Unlimited |
| Segmentation | Coarse (app version) | Granular (any attribute) |
| Cost per experiment | Full CI/CD cycle | Flag toggle |
The choice was clear: CCFish needed a server-side experiment engine at the edge.
The Solution — Server-Side Experiment Engine Architecture
The core insight: treat every marketing message, offer, and UI string as a *resolved value* determined at request time, not compile time. Instead of shipping variant logic in the binary, CCFish ships a lightweight evaluator on Cloudflare Workers that resolves every flag at the HTTP edge.
The architecture follows three layers:
**Layer 1: Feature Flag Registry** — A D1 database table storing every active experiment, its variants, traffic allocation, and targeting rules. This is the source of truth that marketing operators update via an internal dashboard.
**Layer 2: Experiment Evaluator** — A Cloudflare Worker intercepting API requests from the mobile app, evaluating active flags against user attributes, and returning resolved variant assignments. It runs in ~5ms and caches results in Workers KV.
**Layer 3: Analytics Sink** — Every flag evaluation emits a structured event to the analytics pipeline, enabling near-real-time measurement of conversion, revenue, and engagement per variant.
Architecture — Cloudflare Workers + D1 + Feature Flags
CCFish runs on Cloudflare Workers for global edge compute with D1 as the flag configuration store. D1 was chosen over KV because experiments need relational queries ("find all active experiments targeting US users on iOS 16+") and SQL makes that straightforward.
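As a sketch of that kind of relational query against the schema below (the exact production query may differ, and comparing version strings as text is a simplification):

```sql
-- Sketch: all active experiments with a rule targeting US users
-- or iOS app versions >= 16.0. Version comparison as text is simplified.
SELECT DISTINCT e.id, e.name
FROM experiments e
JOIN targeting_rules t ON t.experiment_id = e.id
WHERE e.status = 'active'
  AND ((t.attribute = 'country'     AND t.operator = 'eq'  AND t.value = 'US')
    OR (t.attribute = 'app_version' AND t.operator = 'gte' AND t.value = '16.0'));
```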
The data model:
```sql
CREATE TABLE experiments (
id TEXT PRIMARY KEY,
name TEXT NOT NULL,
status TEXT DEFAULT 'draft', -- draft | active | paused | completed
created_at TEXT DEFAULT (datetime('now')),
started_at TEXT,
ended_at TEXT
);
CREATE TABLE variants (
id TEXT PRIMARY KEY,
experiment_id TEXT NOT NULL,
name TEXT NOT NULL, -- 'control', 'variant_a', etc.
traffic_pct REAL NOT NULL, -- e.g. 33.3
config JSON NOT NULL, -- payload: copy, price, delay
FOREIGN KEY (experiment_id) REFERENCES experiments(id)
);
CREATE TABLE targeting_rules (
id TEXT PRIMARY KEY,
experiment_id TEXT NOT NULL,
attribute TEXT NOT NULL, -- 'country', 'app_version'
operator TEXT NOT NULL, -- 'eq', 'neq', 'in', 'gte'
value TEXT NOT NULL, -- 'US', '["US","CA"]'
FOREIGN KEY (experiment_id) REFERENCES experiments(id)
);
```
Marketing operators define experiments with a few SQL INSERTs, and a Worker endpoint exposes a REST API for the mobile apps to fetch resolved assignments on launch.
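To illustrate, defining a two-variant experiment could look like the following; the identifiers, payloads, and traffic split are hypothetical:

```sql
-- Hypothetical experiment: two onboarding headline variants, US-only.
INSERT INTO experiments (id, name, status, started_at)
VALUES ('exp_onboarding_copy', 'Onboarding headline test', 'active', datetime('now'));

INSERT INTO variants (id, experiment_id, name, traffic_pct, config) VALUES
  ('var_control', 'exp_onboarding_copy', 'control',   50.0,
   '{"headline": "Welcome to CCFish"}'),
  ('var_b',       'exp_onboarding_copy', 'variant_b', 50.0,
   '{"headline": "Catch More Fish, Faster"}');

INSERT INTO targeting_rules (id, experiment_id, attribute, operator, value)
VALUES ('rule_us', 'exp_onboarding_copy', 'country', 'eq', 'US');
```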
Implementation — Flag Evaluation and Experiment Routing
The heart of the system is the evaluator. This core function resolves a user into their variant:
```javascript
export async function resolveVariants(user, env) {
  // Fetch all active experiments joined with their variants and targeting
  // rules. `env.DB` is the D1 binding passed in by the Worker handler.
  const { results } = await env.DB.prepare(`
    SELECT e.*, v.id AS variant_id, v.name AS variant_name,
           v.traffic_pct, v.config, t.attribute, t.operator, t.value
    FROM experiments e
    LEFT JOIN variants v ON v.experiment_id = e.id
    LEFT JOIN targeting_rules t ON t.experiment_id = e.id
    WHERE e.status = 'active'
      AND (e.started_at IS NULL OR e.started_at <= datetime('now'))
      AND (e.ended_at IS NULL OR e.ended_at > datetime('now'))
    ORDER BY e.id, v.traffic_pct DESC
  `).all();

  const assignments = {};
  // groupByExperiment, matchTargeting, and selectVariantByBucket are small
  // helpers defined alongside the evaluator.
  for (const experiment of groupByExperiment(results)) {
    if (!matchTargeting(experiment.rules, user)) continue;
    const bucket = await hashUserToBucket(user.id, experiment.id);
    const variant = selectVariantByBucket(experiment.variants, bucket);
    assignments[experiment.id] = {
      variant: variant.name,
      config: JSON.parse(variant.config),
      expires_at: experiment.ended_at
    };
  }
  return assignments;
}

async function hashUserToBucket(userId, experimentId) {
  // crypto.subtle.digest is async (there is no digestSync in the Web
  // Crypto API), so this helper must be awaited by the caller.
  const hash = await crypto.subtle.digest(
    'SHA-256',
    new TextEncoder().encode(`${userId}:${experimentId}`)
  );
  // First 4 bytes of the digest -> uint32 -> bucket in [0.00, 99.99].
  const hashInt = new DataView(hash.slice(0, 4)).getUint32(0);
  return (hashInt % 10000) / 100;
}
```
Deterministic hashing is critical: the same user always gets the same variant, ensuring consistent UX across sessions. The `hashUserToBucket` function produces a value 0.00–99.99, mapped to a variant by its `traffic_pct` range.
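The two pure helpers referenced by the evaluator are small. Here is a sketch of bucket-to-variant mapping and rule matching; the production logic may differ, and AND semantics across rules is an assumption.

```javascript
// Sketch: variants are walked in order, each claiming a contiguous slice
// of the 0-100 bucket range proportional to its traffic_pct.
function selectVariantByBucket(variants, bucket) {
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.traffic_pct;
    if (bucket < cumulative) return v;
  }
  return variants[variants.length - 1]; // guard against float rounding
}

// Sketch: every rule must match for the user to qualify (AND semantics
// assumed). 'in' rules store their value as a JSON array string.
function matchTargeting(rules, user) {
  return rules.every(({ attribute, operator, value }) => {
    const actual = user[attribute];
    switch (operator) {
      case 'eq':  return actual === value;
      case 'neq': return actual !== value;
      case 'in':  return JSON.parse(value).includes(actual);
      case 'gte': return actual >= value;
      default:    return false; // unknown operator: fail closed
    }
  });
}
```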
The mobile client integration is minimal. A single endpoint returns every resolved assignment as JSON:
```json
{
"user_bucket": 47.32,
"assignments": {
"exp_onboarding_copy": {
"variant": "variant_b",
"config": {
"headline": "Catch More Fish, Faster",
"subtitle": "Your personal fishing spot guide",
"cta_text": "Explore Waters"
}
},
"exp_iap_offer": {
"variant": "control",
"config": {
"price_tier": "discounted_monthly",
"discount_pct": 40,
"offer_label": "Limited Time"
}
}
}
}
```
The app calls this endpoint on cold start and caches the result for the session. Every experimentable UI element reads from this object rather than hard-coded constants.
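A defensive read pattern matters here: if the fetch fails or an experiment is missing from the payload, the app should fall back to compiled-in defaults rather than show broken UI. A sketch, where the helper name and defaults are illustrative:

```javascript
// Illustrative helper for reading experiment config with a safe fallback.
// The fallback doubles as the control experience, so a failed fetch
// silently degrades to defaults.
function experimentValue(assignments, experimentId, key, fallback) {
  const assignment = assignments?.[experimentId];
  if (!assignment || !(key in assignment.config)) return fallback;
  return assignment.config[key];
}
```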
Results — Iteration Speed and Conversion Improvements
CCFish has run this system in production for six months:
| Metric | Before | After | Improvement |
|---|---|---|---|
| Experiment launch time | 48 hours | 8 minutes | 360x faster |
| Concurrent experiments | 2 | 12 | 6x more |
| Rollback time | 72 hours | 30 seconds | 8,640x faster |
| Marketing-led experiments | 0 | 8/month | Fully empowered |
| IAP conversion uplift | Baseline | +23% | Via variant iteration |
| Push notification CTR | Baseline | +17% | Via copy testing |
**Specific wins:**
- **IAP Price Testing**: By experimenting with discount percentages (20%, 30%, 40% off monthly) as server-side configs, the team found the sweet spot: 30% off drove the highest revenue-per-user, because 40% increased conversion but eroded ARPU. The experiment reached statistical significance in under 4 hours.
- **Push Notification Timing**: The team A/B tested delivery times (morning vs. evening vs. lunch hour) without any client code changes. The winner, lunch-hour notifications with social proof, improved CTR by 17% and re-engagement by 12%.
- **Onboarding Copy**: Three welcome-flow variants across 10,000 new users. Variant B (benefit-driven headlines) improved Day-7 retention by 8%. The losing variant was killed in 90 minutes, not 3 days.
Key Takeaways
1. **Move experiment logic to the edge, not the client.** Cloudflare Workers evaluate flags in under 10ms at global scale. The app never needs to know which variant it’s in until the API responds.
2. **D1 suits experiment config well.** The relational structure (experiments → variants → targeting rules) maps naturally to SQL. KV is great for caching resolved assignments, but D1 is the clean source of truth.
3. **Deterministic hashing is non-negotiable.** Without it, users bounce between variants across requests, breaking both UX and statistical validity. A SHA-256 bucket function solves this with zero infrastructure.
4. **Server-side experiments empower non-engineering teams.** With flag infrastructure in place, marketing operators launch experiments through a dashboard without filing tickets or waiting for release trains. CCFish went from zero marketing-led experiments to eight per month.
5. **Start simple, extend later.** CCFish began with just `traffic_pct` and a JSON config blob. Targeting rules and analytics sinks were added later. The architecture is extensible by design — Bayesian bandit optimization, holdout groups, and multi-factorial designs can be added without changing the evaluation path.
For any team building a mobile app on Cloudflare Workers, this pattern is a force multiplier. The same edge infrastructure serving your API can run your marketing experiments — and that convergence is where agility lives.