Identify programmatic SEO page patterns worth building for your product — vs/ pages, integrations/, for-{industry}/, alternatives-to/, use-cases/ — and design the template structure, data model, and priority order. Outputs a complete pSEO blueprint with URL patterns, title templates, content frameworks, and data sources per variable.
```bash
npx gooseworks install --claude  # Then in your agent: /gooseworks <prompt> --skill programmatic-seo-planner
```
Programmatic SEO is how startups generate hundreds of high-intent pages from templates — "{Your Product} vs {Competitor}", "Best {Category} for {Industry}", "{Integration} + {Your Product}". This skill figures out which patterns are worth building, designs the templates, and prioritizes the buildout.
Core principle: pSEO isn't about spinning content. It's about finding a data axis (something that varies per page) where each variation has real search demand and your product has a genuine answer. This skill validates both before you invest in building.
This skill works with existing capabilities but produces significantly better results when paired with a keyword data API for bulk volume lookups across hundreds of long-tail variations.
"I can plan your programmatic SEO strategy using our existing tools. However, for the best results — especially accurate search volume data across hundreds of keyword variations — I'd recommend connecting a keyword data API."
Recommended: DataForSEO (pay-per-use, ~$0.01/keyword, no monthly minimum)
- Sign up at dataforseo.com → get API login + password
- Set `DATAFORSEO_LOGIN` and `DATAFORSEO_PASSWORD` env vars

Alternatives that also work:
- Keywords Everywhere API ($1 per 10 credits = 100K keywords, very cheap) → set `KEYWORDS_EVERYWHERE_API_KEY`
- SEMrush API (if you have a subscription) → set `SEMRUSH_API_KEY`
- Ahrefs API (if you have a subscription) → set `AHREFS_API_TOKEN`

"Want to use one of these, or should I proceed with baseline mode? Baseline still works — I'll use our Apify-based SEO tools for top-level data, though volume estimates for individual long-tail patterns will be less precise."
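A minimal sketch of how that mode check might look, reading the env vars listed above (the selection order and helper name are illustrative, not part of the skill):

```python
import os
from typing import Optional

# Each supported keyword API and the env vars it needs (names from the setup notes above).
KEYWORD_APIS = {
    "dataforseo": ("DATAFORSEO_LOGIN", "DATAFORSEO_PASSWORD"),
    "keywords_everywhere": ("KEYWORDS_EVERYWHERE_API_KEY",),
    "semrush": ("SEMRUSH_API_KEY",),
    "ahrefs": ("AHREFS_API_TOKEN",),
}

def detect_keyword_api() -> Optional[str]:
    """Return the first fully configured keyword API, or None to fall back to baseline mode."""
    for name, required in KEYWORD_APIS.items():
        if all(os.environ.get(var) for var in required):
            return name
    return None

print(detect_keyword_api() or "baseline")
```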
Baseline mode: seo-domain-analyzer (Apify) for domain-level metrics and web_search for pattern validation. Volume estimates are directional, not exact. Still produces a solid blueprint — just less granular on per-variation demand.

Run site-content-catalog for each competitor:
```bash
python3 skills/site-content-catalog/scripts/catalog_site.py \
  --url "<competitor_url>" \
  --output json
```

Analyze URL structures for programmatic patterns (see the detection sketch below):
- /vs/, /compare/, /alternatives/ — Comparison pages
- /integrations/, /connect/, /apps/ — Integration pages
- /for-{industry}/, /solutions/, /use-cases/ — Vertical/use-case pages
- /templates/, /examples/, /glossary/ — Resource pages
- /tools/, /calculators/, /generators/ — Tool pages

For each pattern found, note:
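As a rough way to do that detection automatically, here is a sketch that counts a competitor's cataloged URL paths per pattern type (the prefix regexes and example paths are illustrative; adapt them to what the catalog actually returns):

```python
import re
from collections import Counter

# Path prefixes that typically signal a programmatic pattern (mirrors the list above).
PSEO_PREFIXES = {
    "comparison": r"^/(vs|compare|alternatives)/",
    "integration": r"^/(integrations|connect|apps)/",
    "vertical": r"^/(for|solutions|use-cases)/",
    "resource": r"^/(templates|examples|glossary)/",
    "tool": r"^/(tools|calculators|generators)/",
}

def count_pseo_patterns(paths):
    """Count how many of a competitor's URL paths fall under each pattern type."""
    counts = Counter()
    for path in paths:
        for pattern_type, prefix in PSEO_PREFIXES.items():
            if re.match(prefix, path):
                counts[pattern_type] += 1
    return counts

# Illustrative paths from a competitor catalog:
print(count_pseo_patterns(["/vs/acme", "/vs/globex", "/integrations/slack", "/pricing"]))
# Counter({'comparison': 2, 'integration': 1})
```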
Based on your product category, evaluate these standard pSEO pattern types:
| Pattern Type | URL Structure | Data Axis | Best For |
|---|---|---|---|
| Versus/Comparison | /vs/{competitor} | Competitor names | High-intent, bottom-funnel |
| Alternatives | /alternatives/{competitor} | Competitor names | Displacement queries |
| Integrations | /integrations/{tool} | Tool/app names | Mid-funnel, ecosystem |
| Industry verticals | /for/{industry} | Industry names | Vertical targeting |
| Use cases | /use-cases/{use-case} | Job-to-be-done | Mid-funnel, discovery |
| Glossary/Definitions | /glossary/{term} | Industry terms | Top-funnel, authority |
| Templates/Examples | /templates/{type} | Template types | Mid-funnel, utility |
| Tools/Calculators | /tools/{tool-name} | Tool functions | Top-funnel, link bait |
| Location pages | /{service}-in-{city} | City/region names | Local-intent (if relevant) |
Run reddit-scraper to find how your ICP talks about the problem:
```bash
python3 skills/reddit-scraper/scripts/scrape_reddit.py \
  --query "<category> OR <problem keyword>" \
  --subreddits "<relevant_subs>" \
  --sort relevance --time year --limit 50
```

Extract:
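Whichever fields you pull out, one lightweight way to surface recurring ICP phrasing from the scraped posts is a simple bigram count. The `title`/`selftext` field names below are assumptions about the scraper's JSON output; adjust to the actual schema:

```python
import json
import re
from collections import Counter

def top_phrases(posts_json_path, n=20):
    """Count recurring two-word phrases across scraped post titles and bodies."""
    with open(posts_json_path) as f:
        posts = json.load(f)
    counts = Counter()
    for post in posts:
        text = f"{post.get('title', '')} {post.get('selftext', '')}".lower()
        words = re.findall(r"[a-z][a-z'-]+", text)
        counts.update(" ".join(pair) for pair in zip(words, words[1:]))
    return counts.most_common(n)

# Example: print(top_phrases("reddit_results.json"))
```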
Enhanced mode (DataForSEO / Keywords Everywhere / SEMrush / Ahrefs):
For each candidate pattern, generate 20-50 keyword variations and pull exact volumes:
Aggregate per pattern type:
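A minimal sketch of generating variations, pulling volumes, and aggregating, assuming DataForSEO is configured (the endpoint and response fields follow their v3 Google Ads search-volume API, so verify against current docs; the product and competitor names are placeholders):

```python
import os
import requests

def fetch_volumes(keywords):
    """Pull monthly search volumes for a batch of keyword variations via DataForSEO."""
    resp = requests.post(
        "https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live",
        auth=(os.environ["DATAFORSEO_LOGIN"], os.environ["DATAFORSEO_PASSWORD"]),
        json=[{"keywords": keywords, "language_code": "en", "location_code": 2840}],  # 2840 = US
        timeout=60,
    )
    resp.raise_for_status()
    rows = resp.json()["tasks"][0].get("result") or []
    return {row["keyword"]: row.get("search_volume") or 0 for row in rows}

# Generate variations for one pattern, then aggregate total addressable volume.
competitors = ["acme", "globex"]  # placeholder competitor list
variations = [f"yourproduct vs {c}" for c in competitors] + [f"{c} alternatives" for c in competitors]
volumes = fetch_volumes(variations)
print("vs/alternatives pattern total:", sum(volumes.values()), "searches/month")
```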
Baseline mode:
Use seo-domain-analyzer for competitor domain metrics, web_search to spot-check if key variations have SERP results (indicating real demand), and manual estimation based on:
Score each candidate pattern on:
| Factor | Weight | How to Assess |
|---|---|---|
| Search demand | 30% | Total addressable volume across all variations |
| Intent quality | 25% | How close to purchase decision? (vs/ = high, glossary = low) |
| Template feasibility | 20% | Can you create a useful, differentiated page from a template? |
| Data availability | 15% | Can you programmatically source the data that varies? |
| Competitive gap | 10% | Are competitors NOT doing this pattern, or doing it poorly? |
Score each pattern 0-100. Rank by score.
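A small sketch of the weighted score, using the weights from the table above (the per-factor ratings shown are illustrative):

```python
# Factor weights from the scoring table above; each factor is rated 0-100.
WEIGHTS = {
    "search_demand": 0.30,
    "intent_quality": 0.25,
    "template_feasibility": 0.20,
    "data_availability": 0.15,
    "competitive_gap": 0.10,
}

def pattern_score(ratings):
    """Weighted 0-100 score for one candidate pattern."""
    return round(sum(WEIGHTS[f] * ratings[f] for f in WEIGHTS), 1)

# Illustrative ratings for a vs/ pattern:
print(pattern_score({
    "search_demand": 80,
    "intent_quality": 90,
    "template_feasibility": 80,
    "data_availability": 90,
    "competitive_gap": 60,
}))  # 82.0
```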
For each pattern scoring 50+, validate:
- **Data source** — Where does the variable data come from?
- **Content differentiation** — Can each page offer genuine value, or will they be thin?
- **Technical feasibility** — Can your CMS generate these at scale?
For each pattern being built, design:
## Pattern: [vs/{competitor}]
### URL Structure
/vs/{competitor-slug}
### Title Template
{Your Product} vs {Competitor} — [Year] Comparison | {Your Brand}
### Meta Description Template
Compare {Your Product} and {Competitor} side-by-side. See pricing, features,
pros/cons, and which is better for {ICP description}.
### H1
{Your Product} vs {Competitor}: Honest Comparison
### Page Sections (content framework)
1. **TL;DR** — 3-sentence summary of key differences (above fold)
2. **Quick comparison table** — Feature matrix with checkmarks
3. **Detailed comparison** — 4-6 key dimensions, 2-3 paragraphs each
4. **Pricing comparison** — Plan-by-plan breakdown
5. **Who should choose {Your Product}** — ICP fit description
6. **Who should choose {Competitor}** — Fair assessment
7. **What real users say** — Review quotes from both sides
8. **CTA** — Trial/demo prompt
### Data Required Per Page
- competitor_name: string
- competitor_slug: string
- competitor_features: array (from their website/docs)
- competitor_pricing: object (from pricing page)
- competitor_reviews: array (from G2/Capterra)
- your_differentiators: array (per competitor)
### Content Guidelines
- Minimum 1,500 words per page
- Must include at least one unique insight (not just feature lists)
- Use actual screenshots or diagrams where possible
- Update quarterly (pricing/features change)

Repeat for each pattern type.
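A minimal sketch of wiring the templates above to the per-page data model (field names come from the "Data Required Per Page" list; the product, brand, and year values are placeholders):

```python
# Render the programmatic on-page elements for one /vs/{competitor} page.
TITLE_TEMPLATE = "{product} vs {competitor} — {year} Comparison | {brand}"
H1_TEMPLATE = "{product} vs {competitor}: Honest Comparison"

def render_vs_page(record, product="YourProduct", brand="YourBrand", year=2025):
    competitor = record["competitor_name"]
    return {
        "url": f"/vs/{record['competitor_slug']}",
        "title": TITLE_TEMPLATE.format(product=product, competitor=competitor, year=year, brand=brand),
        "h1": H1_TEMPLATE.format(product=product, competitor=competitor),
    }

record = {
    "competitor_name": "Acme",
    "competitor_slug": "acme",
    "competitor_features": [],   # from their website/docs
    "competitor_pricing": {},    # from pricing page
    "competitor_reviews": [],    # from G2/Capterra
    "your_differentiators": [],  # per competitor
}
print(render_vs_page(record))
```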
Build the implementation plan:
| Pattern | Score | Est. Pages | Volume/Page | Total Volume | Build Effort | Priority |
|---|---|---|---|---|---|---|
| vs/ comparisons | 85 | 15 | 300 | 4,500 | Medium | P0 — Build first |
| integrations/ | 72 | 40 | 80 | 3,200 | High | P1 — Build second |
| for-{industry}/ | 68 | 12 | 200 | 2,400 | Medium | P1 — Build second |
| alternatives-to/ | 65 | 8 | 250 | 2,000 | Low | P0 — Quick win |
| glossary/ | 45 | 100 | 40 | 4,000 | Low | P2 — Authority play |
Month 1: Quick wins
Month 2: Scale
Month 3: Expand
For each pattern, specify exactly where the variable data comes from:
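One way to make those mappings explicit is a small config keyed by pattern; the sources and refresh cadences below are illustrative, not prescriptive:

```python
# Illustrative per-pattern mapping of variable data to its source and refresh cadence.
DATA_SOURCES = {
    "vs/{competitor}": {
        "competitor_features": {"source": "competitor website/docs", "refresh": "quarterly"},
        "competitor_pricing": {"source": "competitor pricing page", "refresh": "quarterly"},
        "competitor_reviews": {"source": "G2 / Capterra", "refresh": "quarterly"},
    },
    "integrations/{tool}": {
        "tool_details": {"source": "your integrations directory", "refresh": "on release"},
    },
    "for/{industry}": {
        "industry_pain_points": {"source": "customer interviews / Reddit research", "refresh": "yearly"},
    },
}
```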
# Programmatic SEO Blueprint — [Product Name] — [DATE]
## Executive Summary
- [N] patterns evaluated, [M] recommended for buildout
- Total addressable search volume: [X]/month
- Estimated pages to build: [Y]
- Recommended buildout timeline: [Z] months
---
## Pattern Analysis (ranked by priority)
### P0: [Pattern Name]
- URL structure: [pattern]
- Pages to build: [N]
- Total monthly volume: [X]
- Template blueprint: [see below]
- Data source: [where variable data comes from]
- Build effort: [Low/Medium/High]
- Expected time to rank: [2-4 months / 4-8 months / etc.]
[Full template blueprint per Phase 3]
### P1: [Pattern Name]
...
---
## Technical Requirements
- CMS: [capabilities needed]
- Data pipeline: [how to source variable data]
- Update cadence: [how often to refresh]
---
## Quick-Start Guide
1. Start with [pattern] — lowest effort, highest intent
2. Create [N] pages using the template above
3. Monitor for [X] weeks before expanding
4. ...

Save to `clients/<client-name>/seo/pseo-blueprint-[YYYY-MM-DD].md`.
| Component | Cost |
|---|---|
| Site catalog per competitor (Apify) | ~$0.05-0.10 |
| Reddit scraper | ~$0.05-0.10 |
| SEO domain analyzer | ~$0.10-0.20 |
| DataForSEO keyword lookups (enhanced) | ~$0.50-2.00 (depending on variation count) |
| Keywords Everywhere (enhanced alt) | ~$0.01-0.05 |
| Analysis | Free (LLM reasoning) |
| Total (baseline) | ~$0.20-0.50 |
| Total (enhanced) | ~$0.70-2.50 |
Requirements:
- `APIFY_API_TOKEN` env var
- Skills used: site-content-catalog, seo-domain-analyzer, reddit-scraper
- Optional keyword API (enhanced mode): DataForSEO (`DATAFORSEO_LOGIN` + `DATAFORSEO_PASSWORD`), Keywords Everywhere (`KEYWORDS_EVERYWHERE_API_KEY`), SEMrush (`SEMRUSH_API_KEY`), or Ahrefs (`AHREFS_API_TOKEN`)