
SerpApi Search Ranking Monitoring & Automation

v20260423
serpapi-webhooks-events
This skill uses SerpApi for comprehensive search engine results page (SERP) monitoring. It supports polling async searches for results and builds a keyword-rank tracking pipeline. Use it in SEO tools and competitor analysis to automate scheduled monitoring of search performance and track keyword ranking changes.

SerpApi Webhooks & Events

Overview

SerpApi does not offer traditional webhooks, but it supports async searches and the Searches Archive API. You can build SERP monitoring by combining scheduled searches with change detection; a common use case is tracking keyword rankings over time.
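Change detection can be as simple as hashing the ordered list of top result URLs between runs. A minimal sketch (the `link` field matches SerpApi's `organic_results` schema; `serp_fingerprint` is a hypothetical helper, not part of the SerpApi library):

```python
import hashlib

def serp_fingerprint(organic_results: list, top_n: int = 10) -> str:
    """Hash the ordered top-N result links so two SERP snapshots can be compared."""
    links = [r.get("link", "") for r in organic_results[:top_n]]
    return hashlib.sha256("\n".join(links).encode()).hexdigest()
```

If the fingerprint differs from the previously stored one, the top of the SERP changed: a new entrant, a dropout, or a reorder.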

Instructions

Step 1: Async Search with Polling

import os
import time

import serpapi  # pip install serpapi

client = serpapi.Client(api_key=os.environ["SERPAPI_API_KEY"])

# Submit an async search; it returns immediately with a search id.
# ("async" is a reserved word in Python, so pass the parameters as a dict.)
result = client.search({"engine": "google", "q": "your keyword", "async": "true"})
search_id = result["search_metadata"]["id"]
print(f"Submitted: {search_id}")

# Poll the Searches Archive until the search completes (time out after 60s)
deadline = time.time() + 60
while True:
    archived = client.search_archive(search_id)
    status = archived["search_metadata"]["status"]
    if status == "Success":
        break
    if status == "Error":
        raise RuntimeError(f"Search failed: {archived.get('error')}")
    if time.time() > deadline:
        raise TimeoutError(f"Search {search_id} did not finish within 60s")
    time.sleep(2)

print(f"Results: {len(archived.get('organic_results', []))}")
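Async mode pays off when you submit a whole batch of keywords up front and poll for all of them afterwards. A sketch under the same client assumptions as above (`submit_batch` and `collect_results` are hypothetical helper names, not SerpApi API):

```python
import time

def submit_batch(client, keywords):
    """Submit one async search per keyword; returns {keyword: search_id}."""
    ids = {}
    for kw in keywords:
        result = client.search({"engine": "google", "q": kw, "async": "true"})
        ids[kw] = result["search_metadata"]["id"]
    return ids

def collect_results(client, ids, timeout=60, interval=2):
    """Poll the Searches Archive until every search finishes or the timeout hits."""
    done, pending = {}, dict(ids)
    deadline = time.time() + timeout
    while pending and time.time() < deadline:
        for kw, sid in list(pending.items()):
            archived = client.search_archive(sid)
            if archived["search_metadata"]["status"] in ("Success", "Error"):
                done[kw] = archived
                del pending[kw]
        if pending:
            time.sleep(interval)
    return done
```

Submitting first and polling second keeps total wall-clock time close to the slowest single search rather than the sum of all of them.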

Step 2: SERP Monitoring Pipeline

from datetime import datetime

class SerpMonitor:
    def __init__(self, client, db):
        self.client = client
        self.db = db

    def track_keyword(self, keyword: str, domain: str):
        """Track a domain's ranking position for a keyword."""
        result = self.client.search(engine="google", q=keyword, num=100)
        organic = result.get("organic_results", [])

        position = None
        for r in organic:
            if domain in r.get("link", ""):
                position = r["position"]
                break

        self.db.insert({
            "keyword": keyword,
            "domain": domain,
            "position": position,  # None if not in top 100
            "total_results": result.get("search_information", {}).get("total_results"),
            "checked_at": datetime.utcnow().isoformat(),
            "search_id": result["search_metadata"]["id"],
        })

        return position

    def detect_changes(self, keyword: str, domain: str):
        """Compare the current ranking against the previous check."""
        current = self.track_keyword(keyword, domain)
        previous = self.db.get_previous_position(keyword, domain)

        if previous is not None and current is not None:
            change = previous - current  # positive = moved up
            if abs(change) >= 3:
                self.notify(
                    f"Ranking change for '{keyword}': {previous} -> {current} "
                    f"({'+' if change > 0 else ''}{change})"
                )

    def notify(self, message: str):
        """Stub -- wire this to Slack, email, or whatever alerting you use."""
        print(message)
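SerpMonitor only assumes its `db` argument exposes `insert()` and `get_previous_position()`. For local testing, an in-memory stand-in is enough (`InMemoryRankDB` is a hypothetical stub, not part of any SerpApi package):

```python
class InMemoryRankDB:
    """Minimal store matching the interface SerpMonitor expects."""

    def __init__(self):
        self.rows = []

    def insert(self, row: dict):
        self.rows.append(row)

    def get_previous_position(self, keyword: str, domain: str):
        # detect_changes() records the current check before asking for the
        # previous one, so "previous" is the second-to-last matching row.
        matches = [r for r in self.rows
                   if r["keyword"] == keyword and r["domain"] == domain]
        return matches[-2]["position"] if len(matches) >= 2 else None
```

With this stub, `SerpMonitor(client, InMemoryRankDB())` can be exercised end to end before committing to a real database schema.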

Step 3: Scheduled Monitoring (Cron)

// Run daily keyword tracking
import cron from 'node-cron';
import { getJson } from 'serpapi';

const keywords = ['react framework', 'next.js tutorial', 'typescript guide'];
const targetDomain = 'yoursite.com';

cron.schedule('0 8 * * *', async () => { // Daily at 8 AM
  for (const keyword of keywords) {
    const result = await getJson({
      engine: 'google', q: keyword, num: 100,
      api_key: process.env.SERPAPI_API_KEY,
    });

    const idx = result.organic_results?.findIndex(
      (r: any) => r.link?.includes(targetDomain)
    ) ?? -1;  // -1 when the domain is absent or organic_results is missing

    console.log(`${keyword}: Position ${idx >= 0 ? idx + 1 : 'Not found'}`);
    // Save to database, send alerts on changes
  }
});

Error Handling

Issue                                 Cause                     Solution
Async search never completes          Server-side issue         Time out after 60s and retry
Position tracking uses many credits   100 results per search    Run daily, not hourly
Rankings fluctuate                    Normal SERP volatility    Track a 7-day moving average
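Smoothing day-to-day volatility, as the last row suggests, can be a plain moving average over the stored positions. A small sketch (`None` entries, i.e. days the domain fell out of the top 100, are skipped):

```python
def moving_average(positions: list, window: int = 7):
    """Average of the last `window` recorded positions, ignoring None."""
    recent = [p for p in positions[-window:] if p is not None]
    return sum(recent) / len(recent) if recent else None
```

Alerting on the smoothed value (or on its trend) cuts down false alarms from single-day SERP noise.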


Next Steps

For performance optimization, see serpapi-performance-tuning.

Info
Category: Marketing
Name: serpapi-webhooks-events
Version: v20260423
Size: 4.23KB
Updated: 2026-04-28