DataForSEO has the cleanest SERP data I've found. No UI overhead. Just API calls and JSON responses.
This tutorial assumes you can make HTTP requests. If you can't, skip to the "Why this matters" section.
## Why DataForSEO instead of SEMrush or Ahrefs
SEMrush is better for UI-based competitor analysis. Ahrefs is better for backlink research. DataForSEO is better for programmatic keyword research at scale.
If you're building an agent or a custom dashboard, you need clean API access. SEMrush and Ahrefs have APIs, but DataForSEO's is more straightforward and cheaper.
## Setup

1. Sign up at dataforseo.com
2. Go to your account settings and grab your API credentials (username and password)
3. Your endpoint is always https://api.dataforseo.com/v3/

That's it. No OAuth nonsense — plain HTTP Basic auth.
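If you ever need to build the auth header yourself (say, in a language without curl's `-u` shorthand), it's just base64. A quick local sketch — `username:password` here is a placeholder, not real credentials:

```shell
# curl's -u flag is shorthand for sending this header:
# base64 of "username:password", prefixed with "Basic "
CREDS="username:password"
AUTH_HEADER="Authorization: Basic $(printf '%s' "$CREDS" | base64)"
echo "$AUTH_HEADER"
# → Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=
```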
## Your first query: Get SERP data for a keyword

```bash
curl -s -u "username:password" \
  -H "Content-Type: application/json" \
  -X POST \
  https://api.dataforseo.com/v3/serp/google/organic/live/advanced \
  -d '[{
    "keyword": "best travel credit card",
    "location_code": 2840,
    "language_code": "en",
    "depth": 100
  }]' | jq '.'
```

What this does:

- `keyword`: the query to pull results for
- `location_code`: 2840 is the US
- `language_code`: the results language
- `depth`: how many results to fetch (top 100 here)

Note the payload is a JSON *array* — DataForSEO v3 expects an array of task objects, even for a single task. The response includes the full organic results: position, URL, title, and description for each item. Takes about 2 seconds. Costs $0.01 per query.
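The response nests results under `tasks[].result[].items[]`, so pulling out just the ranked URLs is a one-line jq filter. A sketch against a simplified, hand-made response (the real payload carries many more fields):

```shell
# Simplified, illustrative response shape — real responses have many more fields
RESPONSE='{"tasks":[{"result":[{"items":[
  {"type":"organic","rank_absolute":1,"url":"https://example.com/a"},
  {"type":"organic","rank_absolute":2,"url":"https://example.com/b"},
  {"type":"people_also_ask","rank_absolute":3}
]}]}]}'

# Keep only organic items (SERP features have other types) and print position + URL
echo "$RESPONSE" | jq -r \
  '.tasks[0].result[0].items[] | select(.type == "organic") | "\(.rank_absolute) \(.url)"'
# → 1 https://example.com/a
# → 2 https://example.com/b
```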
## Your second query: Get keyword metrics

```bash
curl -s -u "username:password" \
  -H "Content-Type: application/json" \
  -X POST \
  https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live \
  -d '[{
    "keywords": ["best travel credit card", "travel rewards card", "credit card for travel"]
  }]' | jq '.'
```

The response includes search volume, CPC, and competition metrics for each keyword. This is raw search data. Not estimated. Real Google Ads data.
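Those metrics flatten neatly into a spreadsheet-ready TSV with jq's `@tsv`. A sketch against a simplified response — the sample numbers are invented:

```shell
# Simplified, illustrative response — sample numbers are made up
RESPONSE='{"tasks":[{"result":[
  {"keyword":"best travel credit card","search_volume":27100,"cpc":6.5},
  {"keyword":"travel rewards card","search_volume":4400,"cpc":5.2}
]}]}'

# keyword <TAB> volume <TAB> cpc — paste straight into a sheet
echo "$RESPONSE" | jq -r '.tasks[0].result[] | [.keyword, .search_volume, .cpc] | @tsv'
```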
## Find keyword gaps (the useful part)

Here's a bash script that finds keywords your competitors rank for that you don't:

```bash
#!/bin/bash

# Get top results for a competitor keyword
curl -s -u "username:password" \
  -H "Content-Type: application/json" \
  -X POST \
  https://api.dataforseo.com/v3/serp/google/organic/live/advanced \
  -d '[{ "keyword": "travel credit card", "location_code": 2840 }]' \
  | jq -r '.tasks[0].result[0].items[] | select(.type == "organic") | .url' > competitor_urls.txt

# For each URL in the results, get their top keywords
while IFS= read -r url; do
  curl -s -u "username:password" \
    -H "Content-Type: application/json" \
    -X POST \
    https://api.dataforseo.com/v3/domain_analytics/competitors/live \
    -d "[{\"target\": \"$url\"}]" \
    | jq '.tasks[0].result[0].keywords[]' >> competitor_keywords.txt
done < competitor_urls.txt

# Now find keywords they rank for that are high-volume and low-difficulty
cat competitor_keywords.txt \
  | jq 'select(.search_volume > 100 and .competition_level < 0.5)' \
  | head -20
```

This finds low-competition keywords that competitors are already ranking for. You can basically steal their strategy.
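The "gap" itself is just a set difference. Once you have one-keyword-per-line files for a competitor and for your own site (the file names and keywords below are hypothetical), `comm` does it in one line:

```shell
# Hypothetical one-keyword-per-line files; comm requires sorted input
printf '%s\n' "travel rewards card" "best travel card" "airline miles card" | sort > competitor.txt
printf '%s\n' "best travel card" | sort > mine.txt

# -23 suppresses lines unique to mine.txt and lines common to both,
# leaving only keywords the competitor ranks for that you don't
comm -23 competitor.txt mine.txt
# → airline miles card
# → travel rewards card
```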
## The real power: Batch queries

```bash
#!/bin/bash

KEYWORDS=("travel credit card" "best credit card" "rewards card" "no annual fee card" "cashback credit card")

for keyword in "${KEYWORDS[@]}"; do
  curl -s -u "username:password" \
    -H "Content-Type: application/json" \
    -X POST \
    https://api.dataforseo.com/v3/serp/google/organic/live/advanced \
    -d "[{ \"keyword\": \"$keyword\", \"location_code\": 2840 }]" \
    | jq '.tasks[0].result[0].items[]' >> all_keywords_data.json

  sleep 1  # Be respectful with API calls
done

# Aggregate and analyze: count SERP items by type across all keywords
cat all_keywords_data.json | jq -s 'group_by(.type) | map({type: .[0].type, count: length})'
```

This gives you SERP data for multiple keywords in one go. Takes 5 minutes to run. Gives you hundreds of data points for $0.10.
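The `group_by` step at the end is worth seeing in isolation — it counts how often each SERP feature type appears across your keyword set. A sketch with hand-made sample items standing in for the collected file:

```shell
# Hand-made sample items standing in for all_keywords_data.json
printf '%s\n' \
  '{"type":"organic"}' \
  '{"type":"organic"}' \
  '{"type":"featured_snippet"}' > sample_items.json

# -s slurps the stream of JSON objects into one array so group_by sees all items;
# group_by sorts by the grouping key
jq -s 'group_by(.type) | map({type: .[0].type, count: length})' sample_items.json
```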
## Practical example: Find content opportunities

```bash
#!/bin/bash

# Query keyword data for your target niche
curl -s -u "username:password" \
  -H "Content-Type: application/json" \
  -X POST \
  https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live \
  -d '[{
    "keywords": ["seo for beginners", "seo tutorial", "how to do seo", "seo tips"]
  }]' \
  | jq '.tasks[0].result[] | select(.search_volume > 50 and .search_volume < 500) | {keyword, search_volume}'
```

This finds keywords with real search volume, but not so much that they're impossible to rank for — the sweet spot for content that will actually drive traffic.
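The same sweet-spot filter works on any keyword list you already have locally. A sketch with invented numbers — the filter is the point, not the data:

```shell
# Invented sample metrics
DATA='[
  {"keyword":"seo","search_volume":90500},
  {"keyword":"seo tutorial","search_volume":320},
  {"keyword":"how to do seo","search_volume":210}
]'

# Head terms like "seo" get filtered out; the mid-volume keywords survive
echo "$DATA" | jq -r '.[] | select(.search_volume > 50 and .search_volume < 500) | .keyword'
# → seo tutorial
# → how to do seo
```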
## Real numbers from my workflow

I run a batch query like the one above every week and dump the results into a spreadsheet. That spreadsheet feeds my content calendar. It tells me what to write next.
## Building a full keyword research system
Here's what I've built on top of DataForSEO:
Weekly report generation:

```bash
#!/bin/bash
# Pull SERP data for all keywords
# Generate keyword clusters
# Identify rank changes from last week
# Flag opportunities (high volume, low difficulty, not ranking)
# Email report to clients
```
Takes 5 minutes to run, generates a report that would take me 3 hours to create manually.
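The rank-change step is the only one with a non-obvious jq trick: diff two snapshots keyed by keyword. A sketch with two hypothetical weekly snapshots (keyword → ranking position):

```shell
# Hypothetical weekly snapshots: keyword → ranking position
LAST_WEEK='{"seo tips": 12, "seo tutorial": 8}'
THIS_WEEK='{"seo tips": 9, "seo tutorial": 8}'

# Positive change = moved up that many positions; unchanged keywords are dropped
jq -nc --argjson last "$LAST_WEEK" --argjson this "$THIS_WEEK" '
  $last | keys[] as $k
  | {keyword: $k, change: ($last[$k] - $this[$k])}
  | select(.change != 0)'
# → {"keyword":"seo tips","change":3}
```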
Quarterly strategy refresh: this used to be a $2,000-3,000 consulting engagement. Now it's automated.
Monthly dashboard: an Airtable dashboard, updated automatically every 24 hours via the DataForSEO API.
## Why this matters

You don't need a $300/month dashboard to understand the keyword landscape. You need:

1. Clean API access to SERP data
2. The ability to make HTTP requests (bash, Python, whatever)
3. 15 minutes to write a script
DataForSEO gives you that for $20-40/month depending on volume.
Everything else you see sold as a "keyword research tool" is just UI on top of data that's available cheaper elsewhere.
The people charging $200-500/month for "keyword research platforms" are taking data that costs $20/month to access and adding a dashboard interface. They're selling convenience and UI, not insights.
You can skip the convenience layer and go direct to the data. It requires a bit of technical chops, but the savings are real.
Learn the API. You'll save money and get better insights because you're not filtered through someone else's dashboard design.