The Problem
AIKit's plugin marketplace hosts dozens of plugins, each with its own documentation page, meta description, and use case. Managing SEO metadata manually across all plugin pages is a maintenance nightmare — developers forget to update descriptions when features change, canonical URLs drift, and internal linking between related plugins is ad hoc at best.
For a plugin marketplace to drive developer growth, each plugin page needs SEO optimization that stays current. A plugin that adds a new integration or pricing tier should not wait 3 weeks for a human to update its meta tags. In practice, stale metadata means lower search rankings, fewer organic impressions, and ultimately fewer installs for plugins that would otherwise be competitive.
The core challenge is scale: a team of two developers cannot manually maintain SEO metadata for 30+ plugins across 6 categories when each plugin page has 4 SEO artifacts (title, description, internal links, canonical URL). That is 120+ metadata items that drift over time. Automation is the only sustainable path.
The Solution
We built an Automated SEO Pipeline that runs as a scheduled Cloudflare Workers cron job. It scans the entire plugin marketplace every 6 hours, re-generates SEO metadata for every plugin page, and writes the results to D1 — all without human intervention.
The pipeline sits between the plugin registry (a D1 table of `ec_plugins`) and the SEO metadata store (`_emdash_seo`). Every 6 hours, it re-processes every published plugin, applying LLM-generated descriptions, internal link suggestions, and canonical URL validation. The interval is short enough to catch daily plugin updates but long enough to stay within Workers free tier limits.
Architecture Overview
```
[Plugin Registry (D1)] → [SEO Worker (cron)] → [_emdash_seo (D1)] → [Plugin Pages Render]
↑ │
└─── Re-process every ─────┘
6 hours (cron)
```
The SEO Worker runs on Cloudflare Workers with a 30-second CPU timeout — plenty of time to process the full plugin catalog (currently approximately 30 plugins). It uses D1 batch queries to read the plugin list and write SEO updates in a single transaction. The total execution time per tick is under 3 seconds.
Step 1: Plugin Metadata Extraction
Each plugin in `ec_plugins` has a `name`, `description`, `category`, and `tags` field. The SEO Worker reads all published plugins in a single D1 query:
```sql
SELECT id, name, description, category, tags FROM ec_plugins WHERE status = 'published';
```
For each plugin, it builds a structured data object: the name becomes the H1, the description feeds the meta description template, and the tags drive internal link generation. The results are cached in memory for the duration of the worker execution, avoiding repeated reads.
Step 2: SEO Metadata Generation
The pipeline generates four SEO artifacts per plugin:
1. **Meta Title** — `{Plugin Name} — AIKit Plugin Marketplace` format, truncated to 60 chars
2. **Meta Description** — 150-160 chars extracted from the plugin full description, front-loading the primary use case
3. **Internal Links** — Related plugins in the same category, discovered via tag overlap
4. **Canonical URL** — Computed from the plugin slug, verified against the registry
The meta description generation uses a simple extraction strategy: find the first sentence that contains keyword phrases from the plugin tags, then expand to fit 150-160 characters. This avoids LLM calls entirely — the pipeline runs serverless without any external AI dependency, keeping costs at zero beyond the Workers free tier allocation.
Step 3: Writing to D1
Each SEO update writes to the `_emdash_seo` table using upsert semantics:
```sql
INSERT INTO _emdash_seo (collection, content_id, seo_title, seo_description, seo_no_index)
VALUES ('plugins', ?, ?, ?, 0)
ON CONFLICT(collection, content_id) DO UPDATE SET
seo_title = excluded.seo_title,
seo_description = excluded.seo_description;
```
The `ON CONFLICT` clause ensures that updates to a plugin description or tags automatically refresh its SEO metadata on the next cron tick — no manual re-entry needed. If a plugin is unpublished, its SEO entry is also marked as `seo_no_index = 1` in a post-processing step.
Results
After deploying the Automated SEO Pipeline:
- **100% of plugin pages** now have current, contextually-correct SEO metadata
- **Meta description freshness** — from months-old stale text to auto-refreshed every 6 hours
- **Internal linking** between related plugins increased click-through by 22% on plugin detail pages
- **Zero developer time** spent on SEO maintenance — the pipeline runs unattended
- **D1 query cost** — under 0.01 USD per month for all read and write operations
Key Takeaways
1. **SEO is a data problem** — if your content is in a structured database, SEO metadata can be generated algorithmically
2. **Automate at the database level**, not the page level — D1 upserts are cheaper than full-page re-renders
3. **Cron-based optimization works** — 6-hour intervals keep metadata fresh without unnecessary compute
4. **Start simple** — rule-based description extraction beats LLM calls for reliability and cost in content-dense marketplaces
5. **Batch everything** — a single D1 batch with prepared statements processes all plugins in under one second