
# SEO Content Autopilot

Data-driven blog content pipeline for AI agents.
Keyword research → content gaps → SEO writing → GEO optimization → publish.
Works with OpenClaw, Claude Code, or any agent that reads markdown.


## What It Does

Turns your Google Search Console data into published blog content — automatically.

| Skill | What it does |
| --- | --- |
| `keyword-research` | Pulls GSC data, classifies intent, clusters topics, scores opportunities, generates a content calendar |
| `content-gap-analysis` | Maps your content vs. competitors, finds missing topics, identifies format gaps |
| `blog-writer` | Researches, writes, optimizes, and publishes SEO + GEO articles to your CMS |
| `geo-content-optimizer` | Makes content citable by AI systems (ChatGPT, Perplexity, Google AI Overviews) |
| `seo-audit` | Finds quick wins in GSC data — indexing issues, CTR drops, keyword cannibalization |

No paid SEO tools required. Just GSC, optional GA4, and your AI agent.


## Who This Is For

- **Solo founders / small teams** who need consistent blog output but don't have a content team
- **E-commerce sites** wanting data-driven content strategies tied to real search data
- **SaaS companies** running content marketing on autopilot
- **SEO practitioners** who want to skip the manual research grind
- **Anyone with GSC access** who wants their agent to handle the content pipeline end-to-end

**Impact:** Sites using this pipeline typically see 3-5x content output, better keyword targeting (real data vs. guessing), and measurable organic traffic growth within 2-3 months.


## Install

### Any agent (OpenClaw, Claude Code, Cursor, Codex, etc.)

```bash
git clone https://github.com/watchdealer-pavel/seo-content-autopilot.git
cd seo-content-autopilot
```

### Tell your agent to set it up

Paste this into your agent chat:

```
I want to set up SEO Content Autopilot for my blog.
Read SETUP.md and walk me through the configuration.
```

The agent will ask you ~5 questions (site URL, competitors, industry, CMS type) and configure everything.

### Manual setup

Replace the placeholders in the skill files (`{{SITE_URL}}`, `{{COMPETITORS}}`, `{{INDUSTRY}}`) with your values.
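If you prefer to script the manual setup, the placeholder substitution can be sketched with `sed`. The `skills/` path and file layout below are assumptions for demonstration — point the loop at wherever your skill files actually live:

```shell
# Demo only: create a sample skill file with placeholders. In a real checkout
# the skill files already exist; skip this step and just run the loop.
mkdir -p skills
printf 'Site: {{SITE_URL}}\nCompetitors: {{COMPETITORS}}\n' > skills/demo.md

SITE_URL="https://example.com"
COMPETITORS="rival-a.com,rival-b.com"
INDUSTRY="e-commerce"

# Substitute every placeholder in place, keeping .bak backups.
for f in skills/*.md; do
  sed -i.bak \
    -e "s|{{SITE_URL}}|${SITE_URL}|g" \
    -e "s|{{COMPETITORS}}|${COMPETITORS}|g" \
    -e "s|{{INDUSTRY}}|${INDUSTRY}|g" \
    "$f"
done

grep 'Site:' skills/demo.md   # -> Site: https://example.com
```

`sed -i.bak` is used because the in-place flag behaves differently on GNU and BSD `sed`; giving an explicit backup suffix works on both.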


## Dependencies

```bash
# GSC data access (required) — v0.3.0+ recommended for quick wins, regex filters, rich results
git clone https://github.com/watchdealer-pavel/mcp-server-gsc.git
cd mcp-server-gsc && npm install && npm run build

# GA4 reporting (optional)
pip install google-analytics-data
```

Both require a Google Cloud service account with access to your GSC/GA4 properties. See `connectors/` for setup guides.
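Google client libraries follow the standard application-default-credentials convention: they read the service-account key path from the `GOOGLE_APPLICATION_CREDENTIALS` environment variable. The key path below is just an example:

```shell
# Point Google client libraries (GSC and GA4 alike) at your service-account
# key. The path is an example -- use wherever you stored the downloaded JSON.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/gsc-service-account.json"
```

Add the export to your shell profile (or the environment of whatever process runs the agent) so scheduled runs pick it up too.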

### mcp-server-gsc v0.3.0+ Tools

| Tool | What it does |
| --- | --- |
| `search_analytics` | Core GSC query — clicks, impressions, CTR, position |
| `enhanced_search_analytics` | Regex filtering + auto quick-wins flagging |
| `detect_quick_wins` | Finds ranking opportunities with revenue estimation |
| `coverage_report` | Cross-references sitemap vs. analytics for orphaned pages |
| `rich_results_check` | Audits structured data (Product, FAQ, Review) |
| `batch_inspect` | Checks indexing status for multiple URLs |
| `list_sitemaps` / `submit_sitemap` | Sitemap management |
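To expose these tools to an agent, the built server has to be registered in your MCP client's configuration. A typical entry follows the common `mcpServers` schema; the entry-point path and env variable below are assumptions — check the mcp-server-gsc README for the exact build output location:

```json
{
  "mcpServers": {
    "gsc": {
      "command": "node",
      "args": ["/path/to/mcp-server-gsc/dist/index.js"],
      "env": {
        "GOOGLE_APPLICATION_CREDENTIALS": "/path/to/service-account-key.json"
      }
    }
  }
}
```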

## Usage

Just ask your agent naturally:

"Find keyword opportunities for my blog"

"What content gaps do we have vs. competitors?"

"Write a blog article about [topic] and publish it as draft"

"Run an SEO audit — what are our quick wins?"

"Optimize this article for AI citations"

The agent reads the matching skill file and follows the workflow. No special commands needed.


## Scheduling

### OpenClaw

```bash
# Monthly keyword research (1st of month, 9am)
openclaw cron add "0 9 1 * *" "Run keyword research and save report" --model anthropic/claude-opus-4

# Blog articles 2x/week (Mondays and Thursdays, 10am)
openclaw cron add "0 10 * * 1,4" "Write and publish a blog article from keyword priority list" --model anthropic/claude-opus-4
```

### GitHub Actions (Claude Code)

Pre-built workflows in `scheduling/github-actions/`:

- Weekly keyword research → creates a GitHub issue with the report
- Monthly content gap analysis → creates a GitHub issue
- On-demand article publishing → manual trigger

Copy them to your repo: `cp -r scheduling/github-actions/.github .`
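For orientation, a scheduled workflow of this kind might be shaped roughly as follows. This is a sketch, not one of the pre-built files — the action and its input names are assumptions, so treat `scheduling/github-actions/` as the authoritative reference:

```yaml
# Sketch of a weekly research workflow (assumed layout).
name: weekly-keyword-research
on:
  schedule:
    - cron: "0 9 * * 1"   # Mondays, 9am UTC
  workflow_dispatch: {}    # allow manual runs too
jobs:
  research:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: anthropics/claude-code-action@v1   # input names may differ; see its README
        with:
          prompt: "Run keyword research and open a GitHub issue with the report"
          anthropic_api_key: ${{ secrets.ANTHROPIC_API_KEY }}
```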

### System cron

See `scheduling/system-cron.md` for crontab examples.


## Data Flow

```
GSC Data → keyword-research → content calendar
                                    ↓
         content-gap-analysis → gap priorities
                                    ↓
                  blog-writer → draft article
                                    ↓
           geo-content-optimizer → AI-optimized article
                                    ↓
                              CMS → published
                                    ↓
                  seo-audit → performance tracking → back to keyword research
```

## Connectors

| Connector | Purpose | Guide |
| --- | --- | --- |
| GSC | Search Console data (required) | `connectors/gsc/` |
| GA4 | Analytics traffic data (optional) | `connectors/ga4/` |
| CMS | Framer, WordPress, Ghost, or markdown | `connectors/cms/` |

## Examples

Ready-to-use configs for different industries:


## License

MIT


Built for the OpenClaw community.
Inspired by real-world SEO workflows from chronotimepieces.com.
