## The Problem
Content marketing is the most opaque channel in B2B. You publish, you wait, you hope. Standard analytics tools (GA4, Fathom) report page views but not content ROI: which posts actually influence conversions, which topics build lasting authority, and which categories deserve more investment.
Most teams measure vanity metrics (traffic) instead of business metrics (conversion contribution, keyword growth, content efficiency).
## The Solution
When your blog content lives in Cloudflare D1, you can query it directly with SQL. AIKit's EmDash site stores every post — including views, publication date, category, and tags — in relational tables you can interrogate in real time.
Here is the data model:
```
ec_posts ──── id ──── revisions
   │
   ├──── live_revision_id ──► revisions.id
   ├──── title, slug, excerpt
   ├──── status, published_at
   ├──── category (via tags/terms)
   └──── content (Portable Text JSON)
```
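D1 is built on SQLite, so you can prototype this model locally before touching production. The sketch below uses Python's built-in `sqlite3` module; the DDL mirrors the diagram but is an assumption, not EmDash's exact production schema:

```python
import sqlite3

# Hypothetical local mirror of the ec_posts / revisions model above
# (column set inferred from the diagram, not the real EmDash DDL).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE revisions (
    id         INTEGER PRIMARY KEY,
    created_at TEXT
);
CREATE TABLE ec_posts (
    id               INTEGER PRIMARY KEY,
    title            TEXT,
    slug             TEXT,
    excerpt          TEXT,
    status           TEXT,
    published_at     TEXT,
    category         TEXT,
    content          TEXT,  -- Portable Text JSON
    live_revision_id INTEGER REFERENCES revisions(id)
);
""")
conn.execute(
    "INSERT INTO ec_posts (title, slug, status, published_at) VALUES (?, ?, ?, ?)",
    ("Hello world", "hello-world", "published", "2025-01-06"),
)
total, = conn.execute(
    "SELECT COUNT(*) FROM ec_posts WHERE status = 'published'"
).fetchone()
print(total)
```

Because D1 and `sqlite3` share the same SQL dialect, the queries in the rest of this post can be pasted into a harness like this unchanged.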
## Step 1: Raw SQL Queries Against D1
No dashboard needed. Use wrangler to run ad-hoc analytics:
```bash
CLOUDFLARE_ACCOUNT_ID=your-account-id npx wrangler d1 execute ai-kit-net \
--remote --json --command "SELECT COUNT(*) as total_posts FROM ec_posts WHERE status='published'"
```
Track publishing velocity per week:
```bash
npx wrangler d1 execute ai-kit-net --remote --json --command "
SELECT strftime('%Y-W%W', published_at) as week, COUNT(*) as posts
FROM ec_posts WHERE status='published'
GROUP BY week ORDER BY week DESC LIMIT 12
"
```
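To check the week-bucketing logic without a remote round-trip, the same query runs against a local SQLite database. A minimal sketch with invented sample rows, reduced to only the columns the query touches:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ec_posts (id INTEGER PRIMARY KEY, status TEXT, published_at TEXT)"
)
# Invented sample data: three published posts across two ISO-style weeks,
# plus one draft that the WHERE clause must exclude.
conn.executemany(
    "INSERT INTO ec_posts (status, published_at) VALUES (?, ?)",
    [
        ("published", "2025-01-06"),
        ("published", "2025-01-08"),
        ("published", "2025-01-15"),
        ("draft", "2025-01-16"),
    ],
)
weeks = conn.execute("""
    SELECT strftime('%Y-W%W', published_at) AS week, COUNT(*) AS posts
    FROM ec_posts WHERE status = 'published'
    GROUP BY week ORDER BY week DESC LIMIT 12
""").fetchall()
for week, posts in weeks:
    print(week, posts)
```

Note that `%W` numbers weeks with Monday as the first day, so a Wednesday post lands in the same bucket as the preceding Monday's post.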
## Step 2: Content Category Analysis
Find which categories are over-indexed (too much volume) vs under-published (missed opportunities):
```sql
SELECT p.category, COUNT(*) as post_count,
ROUND(AVG(LENGTH(p.content))) as avg_content_length
FROM ec_posts p
WHERE p.status = 'published'
GROUP BY p.category
ORDER BY post_count DESC;
```
For AIKit, running this reveals that Marketing Automation and Technical SEO are the strongest categories — meaning these topics resonate and drive the best organic results.
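The same category query can be exercised locally; in this sketch the categories and content lengths are invented sample data, not AIKit's real numbers:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE ec_posts (id INTEGER PRIMARY KEY, status TEXT, category TEXT, content TEXT)"
)
# Invented sample rows; the draft must be filtered out by the WHERE clause.
conn.executemany(
    "INSERT INTO ec_posts (status, category, content) VALUES (?, ?, ?)",
    [
        ("published", "Marketing Automation", "x" * 4000),
        ("published", "Marketing Automation", "x" * 6000),
        ("published", "Technical SEO", "x" * 5000),
        ("draft", "Technical SEO", "x" * 100),
    ],
)
rows = conn.execute("""
    SELECT category, COUNT(*) AS post_count,
           ROUND(AVG(LENGTH(content))) AS avg_content_length
    FROM ec_posts
    WHERE status = 'published'
    GROUP BY category
    ORDER BY post_count DESC
""").fetchall()
print(rows)  # Marketing Automation first: 2 posts, avg length 5000
```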
## Step 3: Measure Content Velocity & Decay
Content marketing ROI is about longevity, not launch-day spikes. Check which older posts are still being maintained by querying their revision history:
```sql
SELECT p.id, p.title, p.published_at, r.created_at as last_revision,
ROUND((julianday('now') - julianday(p.published_at))) as days_since_publish
FROM ec_posts p
LEFT JOIN revisions r ON r.id = p.live_revision_id
WHERE p.status = 'published'
ORDER BY last_revision DESC
LIMIT 10;
```
A post that gets a revision update months after publication signals evergreen value. A post with no revisions after 180 days may need refreshing or pruning.
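The 180-day rule of thumb above can be expressed as a tiny staleness check. A sketch only: the function name and threshold are illustrative, not part of EmDash:

```python
from datetime import date
from typing import Optional

def needs_refresh(published_at: date, last_revision: Optional[date],
                  today: date, window_days: int = 180) -> bool:
    """Flag a post whose last touch (latest revision, else the original
    publication) is older than window_days. Threshold is illustrative."""
    last_touch = last_revision or published_at
    return (today - last_touch).days > window_days

today = date(2025, 6, 1)
assert needs_refresh(date(2024, 1, 1), None, today)                  # never revised -> stale
assert not needs_refresh(date(2024, 1, 1), date(2025, 5, 1), today)  # fresh revision -> fine
```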
## Step 4: Pipeline Efficiency Metrics
Track the automation pipeline itself. When using the queue publisher:
```bash
# How many posts published per cron run
npx wrangler d1 execute ai-kit-net --remote --json --command "
SELECT DATE(published_at) as pub_date, COUNT(*) as count
FROM ec_posts WHERE status='published'
AND published_at > datetime('now', '-30 days')
GROUP BY pub_date ORDER BY pub_date;
"
```
This tells you if your cron schedule is meeting your content velocity targets. For a Mon/Wed/Fri 6AM schedule, expect roughly 13 posts per month (12 to 14, depending on how those weekdays fall).
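To sanity-check that velocity target, count how many Mon/Wed/Fri slots a given month actually contains. A small helper for the back-of-the-envelope math, not part of the pipeline:

```python
import calendar
from datetime import date

def expected_posts(year: int, month: int, weekdays=(0, 2, 4)) -> int:
    """Count cron firings in a month for the given weekdays
    (Monday=0, Wednesday=2, Friday=4 by default)."""
    days_in_month = calendar.monthrange(year, month)[1]
    return sum(
        1 for d in range(1, days_in_month + 1)
        if date(year, month, d).weekday() in weekdays
    )

print(expected_posts(2025, 2))  # a 28-day month has exactly 4 of each weekday -> 12
```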
## Real Results
From the AIKit blog (186 published posts):
| Metric | Value |
|--------|-------|
| Total posts published | 186 |
| Automation pipeline | ~3 posts/week via cron |
| SEO auto-enhancement | Enabled via plugin |
| Zero manual publishing overhead | Confirmed |
## Key Takeaways
D1-native content means analytics is a SQL query away, not a dashboard login. By treating the blog as structured data rather than flat pages, you can:
- **Measure** publishing velocity against goals weekly
- **Detect** stale content that needs revisions
- **Identify** high-performing categories for deeper investment
- **Track** pipeline reliability (did the cron job fire? how many posts did it publish?)
Content ROI goes from being a fluffy metric to a hard number you can put in a slide deck.