Competitor Intel — daily competitor signal feed — Claude Skill
A Claude Skill for Claude Code by Gooseworks — run /competitor-intel in Claude
Automated competitor monitoring across Reddit, X, and LinkedIn
- Builds deep competitor profiles via one-time agent research
- Runs daily Reddit and Twitter signal collection
- Runs weekly LinkedIn deep-dive with founder activity
- Generates daily and weekly digest reports
- Delivers reports as markdown files and via email
What it does
- Replace ad-hoc Google searches with a structured daily and weekly competitive intel feed.
- Surface competitor frustrations on Reddit and Twitter that you can convert to outbound.
- Watch what competitor founders are saying publicly to understand strategic direction.
How it works
1. Take a competitor list as input via the setup script
2. Run agent-driven deep research to populate profiles
3. Run daily Reddit and Twitter scrapers for ongoing signals
4. Run a weekly LinkedIn deep dive on Mondays
5. Generate and email daily and weekly digest reports
Want to use Competitor Intelligence?
Install and run this skill locally. Open a terminal on your computer and paste the install command; this downloads the skill with all its files. Add -g at the end to make it available in all your projects. Then start Claude Code and run /competitor-intel.
Competitor Intelligence
Automated competitor monitoring and intelligence gathering system.
Important: Before running any competitor intel commands, ask the user which competitors they want to track. Do not assume or hardcode competitor names.
Quick Start
One-Time: Research a New Competitor
```shell
# Create a new competitor profile (agent-driven research)
python3 competitor-intel/scripts/setup_competitor.py --name "CompanyName" --website https://example.com --slug companyname
```
Then run an agent research session to populate the profile.
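A populated profile might look like the following skeleton. The section names here are illustrative assumptions, not the skill's actual template:

```markdown
# CompanyName

## Overview
One-paragraph positioning summary from agent research.

## Product & Pricing
Key features, plans, and pricing tiers.

## Founders & Key People
Names, roles, and public profiles to watch.

## Recent Signals
Populated by the daily and weekly runs.
```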
Daily: Reddit + Twitter Monitoring
```shell
# Run daily signals collection (automated via cron)
python3 competitor-intel/scripts/run_daily.py

# Or for a specific competitor
python3 competitor-intel/scripts/run_daily.py --competitor <competitor-slug>
```
Weekly: Deep Dive + LinkedIn
```shell
# Run weekly deep dive (Monday mornings)
python3 competitor-intel/scripts/run_weekly.py
```
Generate Report
```shell
# Generate daily or weekly report
python3 competitor-intel/scripts/generate_report.py --type daily --date 2026-02-21
python3 competitor-intel/scripts/generate_report.py --type weekly --date 2026-02-21
```
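Internally, a report generator along these lines would group the raw scrape JSON by competitor and render a markdown digest. This is a sketch under assumed field names (`competitor`, `source`, `title`, `url`), not the skill's actual schema:

```python
import json
from collections import defaultdict

def build_daily_digest(raw_signals, date):
    """Group raw Reddit/Twitter signals by competitor slug and render
    a markdown digest. Field names are assumptions for illustration."""
    by_competitor = defaultdict(list)
    for signal in raw_signals:
        by_competitor[signal["competitor"]].append(signal)

    lines = [f"# Daily Competitor Digest: {date}", ""]
    for slug in sorted(by_competitor):
        lines.append(f"## {slug}")
        for s in by_competitor[slug]:
            lines.append(f"- [{s['source']}] {s['title']} ({s['url']})")
        lines.append("")
    return "\n".join(lines)
```

The real script presumably reads its input from `competitor-intel/raw-data/` and writes the result into `competitor-intel/reports/`.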
Data Structure
- `competitor-intel/competitors/[slug]/profile.md` - Deep research profile per competitor
- `competitor-intel/competitors/[slug]/snapshots/` - Point-in-time weekly snapshots
- `competitor-intel/reports/` - Daily and weekly digest reports
- `competitor-intel/raw-data/` - Raw JSON from Reddit/Twitter scrapes
- `competitor-intel/config.json` - Global configuration
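A minimal `config.json` might look like this. Every key name below is an assumption except that the Apify token lives in this file:

```json
{
  "apify_token": "YOUR_APIFY_TOKEN",
  "email_to": "you@example.com",
  "competitors": ["companyname"],
  "subreddits": ["SaaS", "startups"]
}
```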
Tracked Competitors
Competitors are configured per-project. Ask the user which competitors to track, then add them here:
| Competitor | Slug | Status |
|---|---|---|
| (ask user) | — | — |
Automation
- Daily (8 AM PT): Reddit + Twitter scrape → daily report → email
- Weekly (Monday 9 AM PT): Deep dive + LinkedIn + weekly report → email
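The schedule above could be wired up with crontab entries like these. The project path is a placeholder, and the entries assume the machine's clock runs in Pacific time:

```
# Daily at 8 AM: Reddit + Twitter scrape, daily report, email
0 8 * * * cd /path/to/project && python3 competitor-intel/scripts/run_daily.py

# Mondays at 9 AM: deep dive + LinkedIn + weekly report, email
0 9 * * 1 cd /path/to/project && python3 competitor-intel/scripts/run_weekly.py
```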
Dependencies
- Apify API (Reddit + Twitter scraping) - token in config.json
- Web search (LinkedIn founder tracking, one-time research)
- AgentMail (report delivery)