The Problem

Content marketing is among the highest-ROI channels for B2B SaaS, but it demands sustained output. Most teams stall after the first 10 posts because manual writing, editing, and publishing can't scale. The average content team publishes 3-4 posts per month — too slow to build topical authority in competitive SEO spaces like MarTech.

The Solution

AIKit's Auto-Blog/SEO plugin solves this by connecting an LLM directly to your EmDash CMS's Cloudflare D1 database. You configure the plugin once, and it generates, formats, and publishes blog posts on a schedule — no human in the loop.

Here is the architecture at a glance:

```
LLM Provider (OpenRouter/GPT-4o)
  ↓ (API call generates markdown)
AIKit Auto-Blog Plugin
  ↓ (converts to Portable Text JSON)
Cloudflare D1 Database (ec_posts table)
  ↓ (runtime query, no rebuild)
ai-kit.net/blog — live within seconds
```

Step 1: Configure the LLM Provider

The plugin stores its settings in Cloudflare KV. You set the provider, API key, and model via wrangler:

```bash
cd ~/Projects/AIKitLLC/EmDash

CLOUDFLARE_ACCOUNT_ID=your-account-id npx wrangler kv key put \
  --namespace-id 252b5037f96f4ccea391fc1d51ff7f1f \
  "settings:llmProvider" "openrouter" --remote

CLOUDFLARE_ACCOUNT_ID=your-account-id npx wrangler kv key put \
  --namespace-id 252b5037f96f4ccea391fc1d51ff7f1f \
  "settings:llmApiKey" "your-key-here" --remote

CLOUDFLARE_ACCOUNT_ID=your-account-id npx wrangler kv key put \
  --namespace-id 252b5037f96f4ccea391fc1d51ff7f1f \
  "settings:llmModel" "openai/gpt-4o" --remote
```

The KV namespace `252b5037f96f4ccea391fc1d51ff7f1f` is the project's CACHE namespace. All plugin settings live there.

Step 2: Queue-Based Publishing Pipeline

For maximum control, bypass the admin UI's schedule and use the queue-publisher pattern. Pre-write post content as JSON files in a queue directory, then publish with a cron job:

```
~/cmo/content/queue/
├── 187-slug.json   ← Pending
├── 188-slug.json   ← Pending
└── published/      ← Archived after publish
```

Each JSON file follows this structure:

```json
{
  "title": "Your Post Title",
  "body_text": "## Section\n\nFull markdown body...",
  "excerpt": "SEO excerpt...",
  "category": "Marketing Automation",
  "tags": ["AIKit", "SEO"]
}
```
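Since nothing downstream has a human in the loop, it pays to fail fast on malformed queue files. A minimal validator for the fields shown above — a suggested addition, not part of `blog-publisher.py` — might look like:

```python
import json
from pathlib import Path

# Required fields and their expected JSON types, per the queue file structure.
REQUIRED = {"title": str, "body_text": str, "excerpt": str, "category": str, "tags": list}

def validate_post(path: Path) -> dict:
    """Load a queue file and raise ValueError if a required field is missing or mistyped."""
    post = json.loads(path.read_text())
    for field, expected in REQUIRED.items():
        if not isinstance(post.get(field), expected):
            raise ValueError(f"{path.name}: field {field!r} missing or not {expected.__name__}")
    return post
```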

Step 3: Publish via D1

The publisher script converts markdown to Portable Text (Sanity's content format) and inserts into Cloudflare D1:

```bash
cd ~/Projects/AIKitLLC/EmDash
export CLOUDFLARE_ACCOUNT_ID=your-account-id
python3 ~/cmo/scripts/blog-publisher.py ~/cmo/content/queue/187-slug.json
```

This script performs four database operations in sequence:

1. **ec_posts** — the blog entry with NULL revision refs

2. **revisions** — a full content snapshot linked to the post

3. **ec_posts UPDATE** — wires the revision IDs back to the post

4. **_emdash_seo** — optional SEO metadata for OG tags
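The conversion step feeding those writes can be sketched. Portable Text represents content as an array of block objects, each with a style and span children. The real converter in `blog-publisher.py` is not shown here; this deliberately simplified version handles only `## ` headings and plain paragraphs:

```python
import re
import uuid

def markdown_to_portable_text(md: str) -> list[dict]:
    """Simplified markdown -> Portable Text: `## ` headings and plain paragraphs only."""
    blocks = []
    for chunk in re.split(r"\n\s*\n", md.strip()):  # split on blank lines
        if chunk.startswith("## "):
            style, text = "h2", chunk[3:].strip()
        else:
            style, text = "normal", chunk.strip()
        blocks.append({
            "_type": "block",
            "_key": uuid.uuid4().hex[:12],  # Portable Text blocks carry unique keys
            "style": style,
            "markDefs": [],
            "children": [{"_type": "span", "_key": uuid.uuid4().hex[:12],
                          "text": text, "marks": []}],
        })
    return blocks
```

The resulting list is what gets serialized into the `revisions` snapshot and referenced back from `ec_posts`.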

Step 4: Verify

D1 inserts are live immediately — no build step needed. EmDash queries D1 at runtime. Check the blog listing:

```bash
curl https://ai-kit.net/blog/your-post-slug
```

The post renders with auto-calculated reading time, table of contents (from h2 headings), related posts (from keyword overlap), and social share buttons.
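EmDash's exact formulas are not shown here, but two of those derived fields are straightforward to approximate: reading time as word count divided by roughly 200 words per minute, and the table of contents as the ordered list of h2 headings:

```python
import math
import re

def reading_time_minutes(body_text: str, wpm: int = 200) -> int:
    """Estimate reading time: word count / words-per-minute, rounded up, minimum 1."""
    words = len(re.findall(r"\S+", body_text))
    return max(1, math.ceil(words / wpm))

def table_of_contents(body_text: str) -> list[str]:
    """Collect `## ` headings, in order, for a table of contents."""
    return [m.group(1).strip()
            for m in re.finditer(r"^## (.+)$", body_text, re.MULTILINE)]
```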

Results

In production, the pipeline publishes 3-6 posts per week entirely on autopilot. The 176+ posts on ai-kit.net were generated through this exact pipeline. Key metrics:

- **Zero editorial overhead** — no copy editor, no CMS login

- **Immediate SEO indexing** — D1 inserts appear in the dynamic sitemap within seconds

- **Consistent output** — cron-fired even when the marketing team sleeps

Key Takeaways

The Auto-Blog plugin plus a queue-based D1 pipeline turns content marketing into a data engineering problem. Once you configure the LLM provider and queue directory, the system runs itself. The hardest part is deciding what to write about — and the content calendar handles that too.