> AIKit’s structured blog content strategy uses Portable Text, dynamic llms.txt, and semantic heading hierarchies to rank for developer-targeted keywords — turning technical documentation into an organic acquisition channel.

## The Problem

Developer-targeted SEO is fundamentally different from consumer SEO. Developers search for solutions to specific technical problems, not generic topics. They type queries like “How to build a CMS with D1 and Cloudflare Workers” or “Astro CMS with Portable Text support” into Google, expecting precise, actionable answers.

Most blog content fails this audience in three ways:

1. **Shallow coverage** — surface-level “what is” content that doesn’t show real code

2. **Poor structure** — walls of text with no clear heading hierarchy for AI parsers

3. **No LLM-readiness** — content that can’t be discovered by AI agents scraping /llms.txt

AIKit’s blog solves all three with a structured-content-first approach.

## The Solution: Structured Content as SEO Infrastructure

Instead of treating blog posts as plain markdown files, AIKit stores every post as Portable Text — a structured JSON format that preserves heading hierarchy, code blocks, and semantic relationships.

### Portable Text: More Than Markdown

Portable Text is Sanity’s block content format. Every heading, paragraph, code block, and list item is a typed JSON block with its own key:

```json
[
  {
    "_type": "block",
    "_key": "h1",
    "style": "h2",
    "children": [{ "_type": "span", "text": "The Solution" }]
  },
  {
    "_type": "block",
    "_key": "p1",
    "style": "normal",
    "children": [{ "_type": "span", "text": "Paragraph text..." }]
  }
]
```

This structured format enables three SEO advantages that plain markdown cannot match:

1. **Semantic heading extraction** — Google’s passage ranking can surface each h2 section as an independently ranked passage

2. **Dynamic table of contents** — auto-generated from heading blocks improves on-page engagement (time-on-page signal)

3. **LLM-friendly excerpts** — AI agents can parse only the relevant section without handling raw markdown
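Because every heading is a typed block, the table of contents mentioned above falls out of a simple filter over the post body. A minimal sketch, assuming the Portable Text shape shown earlier (the helper name `buildToc` is illustrative, not part of any library):

```typescript
// Minimal Portable Text shapes: only the fields this sketch needs.
interface Span {
  _type: "span";
  text: string;
}

interface Block {
  _type: string;
  _key: string;
  style?: string;
  children?: Span[];
}

interface TocEntry {
  text: string;
  slug: string; // doubles as the anchor id on the rendered page
  level: number;
}

// Walk the block array, keep h2/h3 blocks, and derive slug anchors.
function buildToc(blocks: Block[]): TocEntry[] {
  return blocks
    .filter((b) => b._type === "block" && (b.style === "h2" || b.style === "h3"))
    .map((b) => {
      const text = (b.children ?? []).map((s) => s.text).join("");
      return {
        text,
        slug: text
          .toLowerCase()
          .replace(/[^a-z0-9]+/g, "-")
          .replace(/(^-|-$)/g, ""),
        level: b.style === "h2" ? 2 : 3,
      };
    });
}
```

The same pass that renders the TOC can stamp matching `id` attributes onto the heading elements, which is what makes each section independently linkable (and rankable as a passage).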

## The llms.txt Integration

AIKit publishes two dynamic endpoints that make every blog post AI-discoverable:

- **/llms.txt** — lists every post with URL and excerpt (following the emerging llms.txt discovery convention)

- **/llms-full.txt** — includes full content preview for AI training context

Both are server-rendered Astro routes that query Cloudflare D1 in real time. When a new post is published via a D1 insert, it appears in both endpoints within seconds — no rebuild, no redeploy.
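A sketch of what such an endpoint can look like, assuming the Cloudflare adapter’s `locals.runtime.env` binding and a hypothetical `posts` table; the schema, helper names, and site URL are illustrative:

```typescript
interface PostRow {
  title: string;
  slug: string;
  excerpt: string;
}

// Pure formatter: turn D1 rows into the llms.txt list body.
function renderLlmsTxt(site: string, posts: PostRow[]): string {
  const lines = posts.map((p) => `- [${p.title}](${site}/blog/${p.slug}): ${p.excerpt}`);
  return ["# AIKit Blog", "", ...lines, ""].join("\n");
}

// Astro SSR endpoint (e.g. src/pages/llms.txt.ts): queried on every request,
// so a fresh D1 insert shows up here without a rebuild.
export async function GET({ locals }: { locals: { runtime: { env: { DB: any } } } }) {
  const { results } = await locals.runtime.env.DB
    .prepare("SELECT title, slug, excerpt FROM posts ORDER BY created_at DESC")
    .all();
  return new Response(renderLlmsTxt("https://example.com", results as PostRow[]), {
    headers: { "Content-Type": "text/plain; charset=utf-8" },
  });
}
```

Keeping the formatter pure makes it trivial to unit-test and to reuse for the `/llms-full.txt` variant with a longer content field.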

## Architecture Overview

The content-to-SEO pipeline has four layers:

1. **Keyword Research Layer** — Identify developer queries with low competition but clear commercial intent (e.g., “Cloudflare D1 blog CMS”, “Portable Text Astro”)

2. **Content Production Layer** — Generate 800–1500 word posts with proper heading hierarchy, code blocks, and answer-first openings for AI parsers

3. **Structured Storage Layer** — Store as Portable Text in Cloudflare D1 with ULID-based IDs for fast queries

4. **Dynamic Delivery Layer** — Serve via Astro SSR with auto-generated TOC, reading time, related posts, and llms.txt discovery
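The storage and delivery layers hinge on one property: publishing is a single D1 insert keyed by a ULID, which sorts lexicographically by creation time. A sketch under those assumptions — the table, column names, and `publishPost` helper are hypothetical, and real code would use the `ulid` npm package rather than this simplified id generator:

```typescript
// Illustrative stand-in for a ULID: a zero-padded base-36 millisecond
// timestamp plus a random suffix. Like a real ULID, earlier ids sort
// before later ones with plain string comparison.
function makeId(now: number = Date.now()): string {
  const time = now.toString(36).padStart(10, "0");
  const rand = Math.random().toString(36).slice(2, 12).padEnd(10, "0");
  return (time + rand).toUpperCase();
}

// Publishing = one INSERT. The post is live on the next SSR request;
// there is no build step to wait for.
async function publishPost(db: any, title: string, slug: string, content: string) {
  await db
    .prepare(
      "INSERT INTO posts (id, title, slug, content, created_at) VALUES (?, ?, ?, ?, ?)"
    )
    .bind(makeId(), title, slug, content, new Date().toISOString())
    .run();
}
```

Because ids sort chronologically, `ORDER BY id DESC` retrieves newest-first without a secondary index on the timestamp column.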

## Results

The structured-content approach has produced measurable improvements:

| Metric | Before | After |
|--------|--------|-------|
| Pages indexed by Google | 12 | 47+ |
| Avg. position for target keywords | 15.3 | 4.2 |
| Pages with Featured Snippets | 0 | 3 |
| Blog-driven signups/month | 2 | 8+ |

Data from Google Search Console, measured over 60 days after implementing structured content.

## Key Takeaways

- **Structure is SEO infrastructure** — Portable Text headings directly improve passage ranking

- **LLM discoverability is a growth channel** — /llms.txt endpoints let AI agents surface your content in responses

- **D1 inserts mean instant publishing** — no build step means content goes live in seconds, improving indexing cadence

- **Answer-first openings** — the first paragraph should answer the core question (both humans and AI agents read this first)