## The Creative Delivery Problem
Every playable ad campaign requires delivering hundreds of creative variants to different ad networks. Each network has its own requirements: Meta needs ZIP files under 2MB, TikTok expects a specific directory structure, Google requires HTML files with specific viewport configurations. On top of that, each platform needs A/B test variants delivered to different user segments.
Before PlayableAd Studio built its delivery pipeline, each campaign required a manual upload process. A studio manager would:
1. Generate all creative variants locally
2. Rename files to match network requirements
3. ZIP files per network specification
4. Upload to each platform's ad manager
5. Configure A/B test splits manually
6. Monitor each variant's delivery status separately
For a campaign with 10 variants across 3 networks, that's 30 manual steps — per campaign, per week.
## Serverless Creative Delivery: The Architecture
We replaced the manual process with a Cloudflare Workers-based delivery pipeline. Here's how it works:
```
Creative build completes in Cocos Creator
→ CI/CD pipeline pushes to Workers KV
→ Variants stored by campaign_id + variant_key
→ Ad network requests a creative for a specific user
→ Worker looks up variant assignment (from A/B test config)
→ Worker serves the correct HTML/JS creative
→ Worker logs delivery event to D1
```
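The flow above can be sketched as a single request handler. The KV binding name (`CREATES` → here `CREATIVES`), the `/creative/{campaign_id}/{user_id}` URL scheme, and the config key layout are illustrative assumptions; variant assignment is stubbed here and covered in detail in the A/B routing section.

```javascript
// Stub assignment: always returns the first variant.
// The real pipeline hashes userId against the campaign seed (see below).
async function assignVariant(userId, config) {
  return config.variants[0].id;
}

// Minimal sketch of the delivery Worker's request handler.
// CREATIVES is an assumed KV binding; env is injected so the
// handler can be exercised outside the Workers runtime.
async function handleDelivery(request, env) {
  // Expected path: /creative/{campaign_id}/{user_id}
  const [, , campaignId, userId] = new URL(request.url).pathname.split('/');
  // A/B test config is stored alongside the creatives in KV.
  const config = await env.CREATIVES.get(`config:${campaignId}`, 'json');
  if (!config) return new Response('unknown campaign', { status: 404 });
  const variantId = await assignVariant(userId, config);
  const html = await env.CREATIVES.get(
    `creative:${campaignId}:${variantId}:html`
  );
  if (html === null) return new Response('missing creative', { status: 404 });
  return new Response(html, {
    headers: { 'content-type': 'text/html; charset=utf-8' },
  });
}
```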
## Variant Storage in KV
Each creative variant is stored in KV with a structured key:
`creative:{campaign_id}:{variant_id}:{asset_type}`
Asset types include:
- `html` — the full MRAID HTML wrapper
- `js` — the playable ad logic (pre-built Cocos Creator output)
- `config` — network-specific metadata (viewport, file size limits, etc.)
- `thumbnail` — preview image for the ad manager dashboard
KV's global replication means the creative is delivered from the edge closest to the user, regardless of which ad network they're browsing through.
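A small helper for the key scheme keeps key construction consistent between the CI/CD writer and the Worker reader; the helper itself is a hypothetical sketch, not part of the pipeline as described.

```javascript
// Hypothetical helper matching the key scheme above. Validating the
// asset type up front guards against the writer and reader silently
// disagreeing on key layout.
const ASSET_TYPES = ['html', 'js', 'config', 'thumbnail'];

function creativeKey(campaignId, variantId, assetType) {
  if (!ASSET_TYPES.includes(assetType)) {
    throw new Error(`unknown asset type: ${assetType}`);
  }
  return `creative:${campaignId}:${variantId}:${assetType}`;
}
```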
## A/B Test Routing at the Edge
When an ad network requests a creative for user impression N, the Worker needs to decide which variant to serve. We use deterministic assignment:
```javascript
async function assignVariant(userId, campaignConfig) {
  const variants = campaignConfig.variants;
  // Hash userId + campaign seed so the same user always lands
  // in the same bucket for this campaign.
  const hash = await crypto.subtle.digest(
    'SHA-256',
    new TextEncoder().encode(userId + campaignConfig.seed)
  );
  // First 4 bytes of the digest, mapped to a bucket in [0, 100).
  const index = new DataView(hash.slice(0, 4)).getUint32(0) % 100;
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.weight; // weights are percentages summing to 100
    if (index < cumulative) return v.id;
  }
  return variants[0].id; // fallback if weights sum to less than 100
}
```
This ensures **consistent assignment** — the same user always sees the same variant — without storing any session state or making any database queries at delivery time.
## Network-Specific Adaptations
Each ad network needs different creative handling:
**Meta (Facebook):** The Worker wraps the creative in Meta's `FbPlayableAd` JavaScript interface and ensures the viewport matches 1:1 aspect ratio. It also injects the `fbq('track')` call for conversion tracking.
**TikTok / Pangle:** The Worker serves a ZIP archive on-demand by fetching the creative assets from KV, assembling them into TikTok's required directory structure (with `playable.html` and `config.json` at the root), and streaming the ZIP to the network's creative endpoint.
**Google AdMob:** Google requires a single HTML file with inline CSS and JS. The Worker reads the HTML template from KV, inlines the JS bundle, injects Google-specific tracking, and returns a minified file.
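The AdMob inlining step can be sketched as a single string transform. The placeholder convention (`<script src="game.js">` in the HTML template) is an assumption for illustration, not the production template.

```javascript
// Sketch of the AdMob single-file step: replace the external script tag
// with the bundle contents so the creative ships as one HTML file.
// The "game.js" placeholder is an assumed template convention.
function inlineJsBundle(html, jsBundle) {
  return html.replace(
    /<script\s+src="game\.js"><\/script>/,
    // Function replacer avoids `$`-sequence surprises in the JS bundle.
    () => `<script>${jsBundle}</script>`
  );
}
```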
## Monitoring Delivery Health
Every creative delivery is logged to D1 with metadata: variant ID, network, response time, and HTTP status. If a variant starts failing (e.g., bundle too large for Meta's 2MB limit), the Worker automatically rolls back to a known-good variant and alerts the team via Telegram.
```sql
CREATE TABLE delivery_logs (
id TEXT PRIMARY KEY,
campaign_id TEXT,
variant_id TEXT,
network TEXT,
status INTEGER,
size_bytes INTEGER,
duration_ms INTEGER,
error TEXT,
timestamp TEXT DEFAULT (datetime('now'))
);
-- Alert on failure spike
SELECT network, COUNT(*) as failures
FROM delivery_logs
WHERE status != 200
AND timestamp > datetime('now', '-15 minutes')
GROUP BY network
HAVING failures > 5;
```
## Results
The automated delivery pipeline eliminated manual uploads entirely:
- **Delivery time per campaign:** 45 minutes → instant (sub-second per variant)
- **Error rate:** 12% (manual upload mistakes) → 0.3% (automated validation)
- **Campaign throughput:** 2 per week → 15+ per week per studio
- **Infrastructure cost:** $0 additional (KV + Workers + D1, all within existing free tier limits)