Google sues SerpAPI: What SearchGuard reveals about bot detection
The lawsuit that exposed Google's anti-bot secrets
On December 19, 2025, Google filed a lawsuit against Texas-based SerpAPI LLC, alleging the company circumvented SearchGuard to scrape "hundreds of millions" of queries daily from Google Search. The legal basis is DMCA Section 1201, the anti-circumvention provision of copyright law.
Security researchers have now fully decrypted version 41 of the BotGuard script, providing an unprecedented look at how Google distinguishes humans from automated scrapers.
What is SearchGuard?
SearchGuard is the internal name for Google's BotGuard system when applied to Google Search. BotGuard (internally called "Web Application Attestation") has protected Google services since 2013, including YouTube, reCAPTCHA v3, and Google Maps.
Unlike traditional CAPTCHAs, SearchGuard operates invisibly. It continuously collects behavioral signals and analyzes them with statistical algorithms, all without the user's knowledge.
The code runs inside a bytecode virtual machine with 512 registers, specifically designed to resist reverse engineering.
How Google detects bots
The system tracks four categories of behavior in real time:
Mouse movements
Humans don't move cursors in straight lines. We follow natural curves with acceleration and deceleration.
Google tracks:
- Trajectory (path shape)
- Velocity (speed)
- Acceleration (speed changes)
- Jitter (micro-tremors)
Detection threshold: Mouse velocity variance below 10 is flagged as bot behavior. Normal human variance falls between 50 and 500.
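To make the statistic concrete, here is a minimal sketch of computing velocity variance over a pointer trail. This illustrates the math behind the threshold, not Google's actual code; the sample interface and units are assumptions for this example.

```typescript
interface PointerSample {
  x: number;
  y: number;
  t: number; // timestamp in ms, e.g. from event.timeStamp
}

// Variance of per-event cursor speeds along a pointer trail.
function velocityVariance(samples: PointerSample[]): number {
  const velocities: number[] = [];
  for (let i = 1; i < samples.length; i++) {
    const dx = samples[i].x - samples[i - 1].x;
    const dy = samples[i].y - samples[i - 1].y;
    const dt = samples[i].t - samples[i - 1].t;
    if (dt > 0) velocities.push(Math.hypot(dx, dy) / dt);
  }
  if (velocities.length === 0) return 0;
  const mean = velocities.reduce((a, b) => a + b, 0) / velocities.length;
  return velocities.reduce((a, v) => a + (v - mean) ** 2, 0) / velocities.length;
}

// Per the thresholds above: below 10 reads as a bot, 50-500 as human.
const looksAutomated = (trail: PointerSample[]) => velocityVariance(trail) < 10;
```

A scripted cursor that teleports or moves at constant speed produces near-identical velocities, collapsing the variance toward zero.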
Keyboard rhythm
Everyone has a unique typing signature. Google measures:
- Inter-key intervals
- Key press duration
- Error patterns
- Pauses after punctuation
Detection threshold: Key press duration variance under 5ms indicates automation. Normal human typing shows 20-50ms variance.
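The press-duration signal can be captured by pairing keydown and keyup timestamps per physical key. A rough sketch using standard DOM events (the variance check itself is the same statistic as above):

```typescript
// Pair keydown/keyup per key code to measure how long each key is held.
const downAt = new Map<string, number>();
const pressDurations: number[] = [];

document.addEventListener("keydown", (e) => {
  if (!downAt.has(e.code)) downAt.set(e.code, performance.now());
});

document.addEventListener("keyup", (e) => {
  const start = downAt.get(e.code);
  if (start !== undefined) {
    pressDurations.push(performance.now() - start);
    downAt.delete(e.code);
  }
});

// Variance over pressDurations under ~5ms would trip the threshold above;
// human typing shows 20-50ms of spread.
```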
Scroll behavior
Natural scrolling has variable velocity, direction changes, and momentum-based deceleration. Programmatic scrolling is often too smooth or perfectly uniform.
Detection threshold: Scroll delta variance under 5px suggests bot activity. Humans typically show 20-100px variance.
Timing jitter
This is the killer signal. Humans are inconsistent.
Google uses Welford's algorithm to calculate variance in real time with constant memory. If your action intervals have near-zero variance, you're flagged.
Detection threshold: Event counts exceeding 200 per second indicate automation. Normal human interaction generates 10-50 events per second.
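Welford's algorithm itself is public and simple, which is why it suits a script that must run cheaply on every page. A sketch of the online update, fed with inter-event gaps (the class name and event wiring are illustrative, not from the decrypted script):

```typescript
// Welford's online algorithm: mean and variance in O(1) memory,
// updated one observation at a time.
class RunningVariance {
  private n = 0;
  private mean = 0;
  private m2 = 0; // running sum of squared deviations from the mean

  push(x: number): void {
    this.n += 1;
    const delta = x - this.mean;
    this.mean += delta / this.n;
    this.m2 += delta * (x - this.mean);
  }

  variance(): number {
    return this.n > 1 ? this.m2 / this.n : 0;
  }
}

// Feed it the gaps between successive events; near-zero variance
// means machine-timed input.
const gaps = new RunningVariance();
let last = performance.now();
document.addEventListener("pointermove", () => {
  const now = performance.now();
  gaps.push(now - last);
  last = now;
});
```

Because each update only touches three numbers, the detector never has to buffer an event history, no matter how long the session runs.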
The 100+ DOM elements Google monitors
Beyond behavior, SearchGuard fingerprints your browser environment by monitoring over 100 HTML elements:
- High-priority elements: BUTTON, INPUT (bots often target interactive elements)
- Structure: ARTICLE, SECTION, NAV, ASIDE, HEADER, FOOTER, MAIN, DIV
- Interactive: DETAILS, SUMMARY, MENU, DIALOG
It also collects extensive browser and device data:
- Navigator properties (userAgent, platform, hardwareConcurrency, deviceMemory)
- Screen properties (dimensions, colorDepth, devicePixelRatio)
- Performance timing precision
- Visibility state (document.hidden, hasFocus())
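A rough illustration of the kind of environment snapshot described above, built from the standard browser APIs named in the list. The object shape is invented for this example; only deviceMemory is non-standard (Chromium-only):

```typescript
// Illustrative fingerprint snapshot; field names mirror the standard
// browser APIs, but the structure is assumed, not Google's format.
function collectFingerprint() {
  return {
    userAgent: navigator.userAgent,
    platform: navigator.platform,
    hardwareConcurrency: navigator.hardwareConcurrency,
    deviceMemory: (navigator as any).deviceMemory, // Chromium-only API
    screen: {
      width: screen.width,
      height: screen.height,
      colorDepth: screen.colorDepth,
      devicePixelRatio: window.devicePixelRatio,
    },
    timerSample: performance.now(), // timing precision varies by browser
    hidden: document.hidden,
    focused: document.hasFocus(),
    // Counts for a few of the monitored tags:
    elements: Object.fromEntries(
      ["BUTTON", "INPUT", "NAV", "DIALOG"].map((tag) => [
        tag,
        document.getElementsByTagName(tag).length,
      ])
    ),
  };
}
```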
WebDriver detection
The script specifically checks for automation signatures:
- navigator.webdriver (true if automated)
- window.chrome.runtime (absent in headless mode)
- ChromeDriver signatures ($cdc_ prefixes)
- Puppeteer markers ($chrome_asyncScriptInfo)
- Selenium indicators (__selenium_unwrapped)
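All of these signatures are readable from ordinary page JavaScript. A hedged sketch of equivalent checks (the property walks and reporting are illustrative; a real detector scores many more signals):

```typescript
// Rough equivalents of the automation checks listed above.
function automationSignals(): string[] {
  const signals: string[] = [];
  if (navigator.webdriver) signals.push("navigator.webdriver is true");
  // In Chrome, a missing chrome.runtime suggests a headless build.
  const chrome = (window as any).chrome;
  if (!chrome || !chrome.runtime) signals.push("chrome.runtime missing");
  for (const key of Object.getOwnPropertyNames(document)) {
    if (key.startsWith("$cdc_")) signals.push("ChromeDriver $cdc_ key");
    if (key.startsWith("__selenium")) signals.push("Selenium marker");
  }
  if ("$chrome_asyncScriptInfo" in document) signals.push("Puppeteer marker");
  return signals;
}
```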
Why bypasses become obsolete in minutes
The script generates encrypted tokens using an ARX cipher (Addition-Rotation-XOR), similar to Speck, a lightweight block cipher released by the NSA in 2013.
The critical discovery: the magic constant rotates. The cryptographic constant embedded in the cipher changes with every script rotation.
Observed values from security analysis:
- Timestamp 16:04:21: Constant = 1426
- Timestamp 16:24:06: Constant = 3328
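To see why a rotating constant invalidates clones, consider a toy ARX round in the style of Speck-32, with the constant XORed into the round key. This is a deliberately simplified illustration, not the decrypted cipher; the mixing step is an assumption:

```typescript
// Toy ARX (Add-Rotate-XOR) round on 16-bit words, Speck-32 style.
// The "magic" values 1426 and 3328 are the observed constants above.
const MASK = 0xffff;
const rotr = (v: number, r: number) => ((v >>> r) | (v << (16 - r))) & MASK;
const rotl = (v: number, r: number) => ((v << r) | (v >>> (16 - r))) & MASK;

function arxRound(
  x: number,
  y: number,
  key: number,
  magic: number
): [number, number] {
  x = (rotr(x, 7) + y) & MASK; // Addition
  x ^= key ^ magic;            // XOR, with the rotating constant mixed in
  y = rotl(y, 2) ^ x;          // Rotation
  return [x, y];
}

// Same plaintext and key, different script version, different token:
console.log(arxRound(0x1234, 0xabcd, 0x5555, 1426));
console.log(arxRound(0x1234, 0xabcd, 0x5555, 3328));
```

Any bypass hardcoding the old constant emits tokens the server can no longer verify the moment the script rotates.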
The script is served from URLs with integrity hashes. When the hash changes, every client downloads a fresh version with new cryptographic parameters.
Even if you fully reverse-engineer the system, your implementation becomes invalid with the next update.
The OpenAI connection
SerpAPI isn't just any scraping company. OpenAI has used Google search results scraped by SerpAPI to power part of ChatGPT's real-time answers, and SerpAPI listed OpenAI as a customer on its website as recently as May 2024.
Google declined OpenAI's direct request to access its search index in 2024. Yet ChatGPT still needed fresh search data.
Google isn't attacking OpenAI directly-it's targeting a key link in the supply chain that feeds its main AI competitor.
The bigger picture for SERP scraping
This lawsuit follows a pattern of tightening access:
- January 2025: Google deployed SearchGuard, breaking nearly every SERP scraper overnight
- September 2025: Google removed the num=100 parameter, forcing scrapers to make 10x more requests
The combined effect: traditional scraping approaches are increasingly difficult and expensive to maintain.
If SearchGuard qualifies as a valid "technological protection measure" under the DMCA, every platform could deploy similar systems with legal teeth. Under Section 1201, statutory damages range from $200 to $2,500 per act of circumvention.
What this means for SEO tools
For anyone using tools that scrape SERPs:
- Higher costs: More requests needed, more sophisticated infrastructure required
- Legal risk: Third-party scrapers may face similar lawsuits
- Reliability issues: Bypasses can become obsolete within minutes
The message is clear: the old scraping playbook is over.
Official APIs remain the stable path
Google's position is effectively: "You want our data? Go through official channels."
For SEO professionals and developers who need reliable, consistent access to search data, using official APIs (or API providers with proper infrastructure) remains the most sustainable approach.
At Autom, we continue to monitor these developments and adapt our services accordingly. The landscape is changing, but the need for search data isn't going away.