The Problem
App Store Optimization (ASO) is one of those tasks that everyone knows they should do systematically but almost nobody actually treats like engineering work. Most mobile teams update their Play Store listing once per release, write a few keywords they think will work, and hope for the best. The result: organic installs left on the table and, worse, no data to tell you what you're missing.
For CCFish, a real-time fish price comparison app with a global user base, ASO isn't optional. With competitors popping up in multiple regions, the Play Store listing needs constant iteration. The question is: how do you run ASO experiments without a dedicated marketing ops person manually editing metadata every week?
The Solution: CI/CD-Driven Meta Testing
CCFish treats its Play Store listing as code. The app name, short description, full description, and keyword list all live in a YAML file in the repository. When that file changes in a pull request, the CI pipeline automatically:
- Validates the metadata against Google's character limits per field
- Generates a diff showing exactly what changed vs the current live listing
- Runs a keyword density analysis against top 10 competitor listings
- Posts a preview comment to the PR with expected search rank impact
The pipeline doesn't push changes live automatically -- that would be reckless. Instead, it creates a prepared store listing update task. A single approval click deploys the new metadata to the Play Store's "Draft" state, ready for the next release.
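The character-limit check from the first pipeline step is easy to sketch in plain Python. A minimal version, with illustrative field names (the limits themselves match Google Play's documented maximums: 30 characters for the app name, 80 for the short description, 4,000 for the full description):

```python
# Sketch of the CI character-limit check. Field names are illustrative;
# the limits match Google Play's documented maximums.
PLAY_LIMITS = {"app_name": 30, "short_description": 80, "full_description": 4000}

def validate_metadata(meta: dict) -> list[str]:
    """Return limit violations; an empty list means the metadata passes."""
    errors = []
    for field, limit in PLAY_LIMITS.items():
        value = meta.get(field, "")
        if len(value) > limit:
            errors.append(f"{field}: {len(value)} chars exceeds limit of {limit}")
    return errors

meta = {
    "app_name": "CCFish - Real-Time Fish Price Comparison App For Everyone",  # 57 chars
    "short_description": "Compare live fish prices near you.",
}
for err in validate_metadata(meta):
    print(err)  # flags the over-long app_name
```

In CI, a non-empty error list fails the check, so an over-limit listing never reaches the review stage.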
Architecture Overview
Here's how the pipeline connects:
```
Git Push -> PR Created -> CI Trigger
                              |
                              +-> Validate metadata YAML
                              +-> Check character constraints
                              +-> Scrape competitor keywords
                              +-> Generate keyword overlap matrix
                              +-> Post PR comment with summary
                              |
                              +-> [Manual Approval] -> Deploy to Play Store Draft
```
Keyword Gap Analysis
The most powerful part of this pipeline is the automated competitor keyword gap analysis. Every time an ASO PR is opened, the pipeline:
1. Downloads the current Play Store listing text for the top 10 competitors
2. Extracts keyword tokens from all 10 listings plus CCFish's proposed listing
3. Identifies keywords competitors rank for that CCFish doesn't have
4. Suggests adding high-volume, low-competition keywords to the metadata
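Steps 2 and 3 boil down to set arithmetic over tokenized listing text. A simplified sketch (the tokenizer and stop-word list are toy versions, and real search-volume data would come from an external keyword tool):

```python
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "for", "to", "of", "in", "your"}

def tokens(listing_text: str) -> set[str]:
    """Lowercase word tokens, minus stop words."""
    words = re.findall(r"[a-z]+", listing_text.lower())
    return {w for w in words if w not in STOP_WORDS}

def keyword_gap(competitor_listings: list[str], our_listing: str) -> list[tuple[str, int]]:
    """Keywords used by competitors but absent from our listing,
    ranked by how many competitors use them."""
    counts = Counter()
    for listing in competitor_listings:
        counts.update(tokens(listing))
    ours = tokens(our_listing)
    gap = [(word, n) for word, n in counts.items() if word not in ours]
    return sorted(gap, key=lambda kv: -kv[1])

competitors = [
    "Fish price tracker and seafood market guide",
    "Seafood market prices, fish price tracker",
    "Local fish prices for your seafood market",
]
print(keyword_gap(competitors, "Compare live fish prices near you"))
```

The production version weights each gap keyword by search volume before ranking, but the core of the report is this set difference.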
This turns what would be a manual 2-hour research session into a 30-second CI check. The keyword suggestion output looks like this:
```
Keyword Gap Report for PR #347
High priority: 'fish price tracker' (vol: 2400, competitors: 3)
High priority: 'seafood market near me' (vol: 1800, competitors: 5)
Medium: 'local fish prices' (vol: 900, competitors: 2)
Missing from current listing: 12 keywords with 500+ monthly volume
```
Results After 3 Months
CCFish ran this pipeline for one quarter. The measurable results:
| Metric | Before | After | Change |
|--------|--------|-------|--------|
| Keywords in top 10 | 14 | 31 | +121% |
| Organic installs/week | 420 | 820 | +95% |
| ASO change frequency | 1/month | 2/week | 8x faster |
| Time per experiment | 3 hours | 5 minutes | 97% faster |
The 8x increase in experiment velocity is the key metric. When ASO changes cost nothing to propose and validate, the team iterates more. And when you iterate more, you find combinations that work.
Key Takeaways
- Store your Play Store metadata as YAML in your repo -- it makes ASO a code review, not a guessing game
- Automate competitor keyword scraping in CI -- it costs pennies per run and saves hours per experiment
- Never push meta changes live automatically -- always require a human approval gate
- The pipeline pays for itself in the first week of organic install growth
- This pattern works for iOS too -- Apple's App Store Connect API supports the same workflow
CCFish proves that ASO doesn't need a dedicated marketing ops person. With the right CI/CD pipeline, a single developer can run more experiments per week than a traditional team running manual updates.
Implementation: GitHub Actions Workflow
The actual workflow file is surprisingly compact -- under 100 lines of YAML. CCFish uses a custom GitHub Action called `aso-meta-validator` that wraps the Google Play Developer API for metadata validation and the Google Custom Search API for competitor keyword extraction.
```yaml
name: ASO Meta Check
on:
  pull_request:
    paths:
      - 'store/metadata.yaml'
jobs:
  validate-and-analyze:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Validate metadata
        uses: ccfish/aso-meta-validator@v1
        with:
          metadata-file: store/metadata.yaml
          play-console-service-account: ${{ secrets.PLAY_SERVICE_ACCOUNT }}
      - name: Keyword gap analysis
        uses: ccfish/aso-keyword-scraper@v1
        with:
          competitors: '["fishbrain","fishidy","anglersapp"]'
          target-locale: en-US
      - name: Post PR comment
        uses: ccfish/aso-pr-commenter@v1
```
The `aso-keyword-scraper` action is the secret sauce. It uses the Google Custom Search API to find competitor listings, scrapes the Play Store public pages, and builds a keyword frequency matrix. The output is a structured JSON report that gets posted as a PR comment by the third action.
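The matrix itself can be sketched as a keyword-by-competitor count table; the scraping is omitted here and listings are passed in as plain text:

```python
import re
from collections import Counter

def frequency_matrix(listings: dict[str, str]) -> dict[str, dict[str, int]]:
    """Map each keyword to its per-competitor occurrence counts."""
    matrix: dict[str, dict[str, int]] = {}
    for app, text in listings.items():
        counts = Counter(re.findall(r"[a-z]+", text.lower()))
        for word, n in counts.items():
            matrix.setdefault(word, {})[app] = n
    return matrix

listings = {
    "fishbrain": "fish prices and fishing spots, fish forecasts",
    "fishidy": "fishing maps and local fish prices",
}
matrix = frequency_matrix(listings)
print(matrix["fish"])  # {'fishbrain': 2, 'fishidy': 1}
```

Serializing this structure to JSON gives exactly the kind of report the commenter action posts to the PR.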
Handling Multiple Locales
A subtle challenge: ASO keywords don't translate directly. A keyword that works in US English may have zero search volume in Japan or Germany. CCFish handles this by maintaining separate metadata YAML files per locale:
```
store/
metadata.yaml # Default (en-US)
metadata.ja.yaml # Japan
metadata.th.yaml # Thailand
metadata.vi.yaml # Vietnam
```
The CI pipeline validates ALL locale files simultaneously when any one of them changes. This prevents the common mistake of updating the English listing but forgetting to update the Thai version -- the pipeline flags missing translations as a validation error.
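The missing-translation check reduces to comparing field sets against the default locale. A sketch, assuming the locale YAML files have already been parsed into dicts:

```python
def check_locale_parity(locales: dict[str, dict]) -> list[str]:
    """Flag fields present in the default locale but missing elsewhere."""
    base_fields = set(locales["en-US"])
    problems = []
    for locale, meta in locales.items():
        missing = base_fields - set(meta)
        for field in sorted(missing):
            problems.append(f"{locale}: missing '{field}'")
    return problems

locales = {
    "en-US": {"app_name": "...", "short_description": "...", "full_description": "..."},
    "ja":    {"app_name": "...", "short_description": "..."},  # not yet translated
    "th":    {"app_name": "...", "short_description": "...", "full_description": "..."},
}
for problem in check_locale_parity(locales):
    print(problem)  # ja: missing 'full_description'
```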
Cost Analysis
Running this pipeline costs approximately $0.03 per CI run (GitHub Actions compute) plus $5/month for the Custom Search API access. For an app generating 800+ organic installs per week, that's roughly $0.002 per install -- compared to $0.50-$1.50 per install through paid UA campaigns. The ROI is approximately 250x on the automation investment alone.
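A back-of-envelope version of that arithmetic (the monthly run count is an assumption based on the roughly two ASO PRs per week cited above):

```python
# Cost per organic install; the CI run count is an assumption.
ci_runs_per_month = 30                       # ~2 ASO PRs/week plus re-runs
ci_cost = ci_runs_per_month * 0.03           # GitHub Actions compute
search_api_cost = 5.00                       # Custom Search API, per month
installs_per_month = 820 * 4                 # ~820 organic installs/week

cost_per_install = (ci_cost + search_api_cost) / installs_per_month
print(f"${cost_per_install:.4f} per install")  # prints $0.0018 per install
```

Against the $0.50-$1.50 paid-UA benchmark, that is where the ~250x figure comes from.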