Google
by Autom Team

How To Scrape Google Search Results using Python in 2026

Scraping Google search results gives you access to a huge amount of data, and that data can power many kinds of analysis.

Google maintains the largest index of the web, which makes this information useful in one way or another. But to actually access this data, you need a proper method. That is where scraping steps in. And with Python, we can handle the extraction in a structured, repeatable way.

In this blog we will walk you through how to scrape Google search results using Python. We will begin with a simple script that you can run on your laptop. After that we will look at the bigger picture, which includes scaling, blocks, proxies, and how to deal with these challenges. In the end we will introduce an API route for when you are ready to get serious. Yes, the same one we build at Autom.

Let us get started.

Requirements for Scraping Google Search Results

Before you jump into code, check that you have all the tools ready. If you skip this step, you risk getting stuck halfway.

1. Python installed
Make sure you have Python installed on your computer. If not, download and install it. Then choose a folder where you will keep your scripts.

2. Install libraries
You will need a few Python libraries to make your scraper work. At minimum you will need:

  • selenium — for browser automation so the script can act like a real user.
  • beautifulsoup4 — for parsing the HTML you get back.
  • pandas — for storing the scraped results into a CSV or table.
You can install them with:
pip install selenium beautifulsoup4 pandas

3. Browser driver
Since you’re using Selenium you also need the driver for your browser (for example ChromeDriver if you use Chrome). The driver must match your browser version and be accessible from your script. Note that recent Selenium releases (4.6 and later) ship with Selenium Manager, which can download a matching driver for you automatically.

4. Basic folder structure
Create a folder for the project (for example google_scraper). Inside it create a Python file (for example search.py). This keeps things organised.
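
The layout can be as simple as:

google_scraper/
└── search.py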

5. Understand Google’s rendering
Google uses JavaScript and dynamic loading. Simple HTTP GET requests may not work reliably any more. That means the scraper needs to act more like a real browser. Using Selenium helps with that.
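
You can see this yourself by trying a plain request first. A minimal sketch (what you get back varies by region and request rate; you will often receive a consent page, a CAPTCHA, or markup without the classes your selectors expect):

import requests

# a plain GET, the way a naive scraper would do it
resp = requests.get(
    "https://www.google.com/search",
    params={"q": "best python tutorials"},
    headers={"User-Agent": "Mozilla/5.0"},  # without some User-Agent, Google may reject the request outright
    timeout=30,
)
print(resp.status_code)
print('class="g"' in resp.text)  # often False: the results markup is missing or different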

6. Handle anti-bot and usage limits
Even with all tools in place you may still face blocks or CAPTCHAs if you send too many requests too fast. Be ready to include delays, use proxies if needed, or switch to a service/API if you hit scale.
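
For example, a jittered delay and a proxy option for Selenium might look like this (the proxy address below is a placeholder; substitute your own):

import random
import time

from selenium import webdriver

# pause a random 4-9 seconds between queries so requests do not follow a fixed pattern
time.sleep(random.uniform(4, 9))

# route browser traffic through a proxy (placeholder address)
options = webdriver.ChromeOptions()
options.add_argument("--proxy-server=http://203.0.113.10:8000")
driver = webdriver.Chrome(options=options)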

Let’s Start Scraping Google Search Results with Python

Now that everything is ready, let us start scraping. We will begin with a very simple script. The goal is to load the Google search page, enter a query, wait for the results to appear, and then extract the data.

First, import the libraries.


from selenium import webdriver
from selenium.webdriver.common.keys import Keys
from bs4 import BeautifulSoup
import time
import pandas as pd

Next, set up the browser.


driver = webdriver.Chrome()
driver.get("https://www.google.com")
time.sleep(2)

Once the page loads, search for something.


search_box = driver.find_element("name", "q")
search_box.send_keys("best python tutorials")
search_box.send_keys(Keys.RETURN)
time.sleep(3)
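
A fixed time.sleep works, but it is fragile: too short and the results have not loaded, too long and you waste time. If you prefer, Selenium's explicit waits block only until the element appears. A small sketch, assuming the results container still carries id="search":

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

# wait up to 10 seconds for the results container instead of sleeping blindly
WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.ID, "search"))
)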

After the results appear, get the page source and pass it to BeautifulSoup.


page_source = driver.page_source
soup = BeautifulSoup(page_source, "html.parser")

Now extract each result. For most queries you will find the results inside <div class="g">.


results = soup.select("div.g")

data = []

for item in results:
    title = item.select_one("h3")
    link = item.select_one("a")

    if title and link:
        data.append({
            "title": title.get_text(),
            "link": link.get("href")
        })

Save your data.


df = pd.DataFrame(data)
df.to_csv("google_results.csv", index=False)

Close the browser.


driver.quit()

At this point you have a basic scraper that fetches titles and links from a search page. It is not perfect and it will break at scale, but it helps you understand how Google results can be extracted using Python.

✅ What the code does correctly

  • Opens Google
  • Searches for a query
  • Waits for results
  • Parses the HTML
  • Extracts titles and links
  • Saves them into a CSV
  • Closes the browser

❌ What it does not handle

If you want a production-grade scraper like real-world tools, there are still things this script does not handle:

1. Pagination
It only scrapes page 1, while Google spreads results across multiple pages. A complete scraper loops through them (see the sketch after this list).

2. Error handling
If a selector fails or Google changes its layout, the script will stop. The sketch after this list includes a basic guard for this case.

3. SERP features
It does not extract:

  • People Also Ask
  • Featured Snippets
  • Knowledge Panel
  • Videos
  • Sitelinks
  • Local Pack
  • Top Stories
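
Here is a minimal sketch that adds pagination and a basic guard against layout changes. It assumes Google's start URL parameter still offsets results by 10 per page, which may change:

from urllib.parse import quote_plus

query = "best python tutorials"
data = []

for page in range(3):  # first three result pages
    driver.get(f"https://www.google.com/search?q={quote_plus(query)}&start={page * 10}")
    time.sleep(3)
    soup = BeautifulSoup(driver.page_source, "html.parser")
    results = soup.select("div.g")
    if not results:
        # empty page: either no more results, a layout change, or a block page
        print(f"No results parsed on page {page + 1}, stopping.")
        break
    for item in results:
        title = item.select_one("h3")
        link = item.select_one("a")
        if title and link:
            data.append({"title": title.get_text(), "link": link.get("href")})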

Limitations of Scraping Google Search Results using Python

Scraping Google with Python works well when you do it for a few keywords. It is simple to set up and easy to run. But the problems start when you try to scale it or use it for regular data collection.

The biggest limitation is blocking. Google does not like automated scraping. If you send too many requests or repeat the same pattern, Google will show a CAPTCHA or stop loading new results. This can happen within minutes.

Another challenge is speed. Selenium takes time to open pages and load everything. If you want to scrape hundreds of keywords, the process becomes very slow. Running multiple browsers on a normal laptop is also difficult.

You also need to maintain everything yourself. When the browser updates, the driver needs an update. When Google changes the structure of the page, your selectors break. You then need to fix the script again.

Scaling is another issue. A small script can scrape a few queries. But scraping thousands of keywords every day requires proxies, better hardware, and more monitoring. Managing all of this quickly becomes painful.

Because of these limitations, Python scraping is good for learning and small side projects. It becomes hard to trust for production use.

A Better Way: Using the Autom Google Search API

Now we know how to scrape Google search results using Python. But if we want to scale the process without getting blocked, the Autom Google Search API is the better route.


The API offers some free credits, so you can take it for a spin before committing to a paid plan. You can also integrate this API with no-code tools if you are not a developer.

Here’s the Python code for using Autom API:


import requests
import json

def search_google(query: str, api_key: str, results_per_page: int = 10):
    url = "https://api.autm.dev/google/search"
    params = {
        "q": query,
        "api_key": api_key,
        "results_per_page": results_per_page
    }
    response = requests.get(url, params=params, timeout=30)  # fail fast instead of hanging on a stalled connection
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    API_KEY = "YOUR_API_KEY"
    query = "best python tutorials"
    result = search_google(query, API_KEY, results_per_page=10)

    # print the full JSON response for inspection
    print(json.dumps(result, indent=2))

    # Extract titles + links
    data = []
    for item in result.get("organic_results", []):
        title = item.get("title")
        link = item.get("link")
        if title and link:
            data.append({"title": title, "link": link})

    # Simple output
    for row in data:
        print(f"{row['title']} -> {row['link']}")

Explanation

  • We import requests to make the HTTP call.
  • search_google is a function that sends a query to the Autom API, passing the API key, query text, and desired results per page.
  • We then parse the JSON response, dump it for inspection, and iterate over organic_results (adjust the field name if the API’s response uses a different key).
  • Finally we print title → link pairs; you can expand this to include ranking, snippet, or other data returned by the API.
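
If you want the same CSV output as the Selenium version, you can reuse pandas on the rows extracted above:

import pandas as pd

# persist the title/link pairs, just like the Selenium script did
pd.DataFrame(data).to_csv("google_results.csv", index=False)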

You can refer to the documentation here to understand how this API works & what parameters can be used.

6 Use Cases of Scraping Google Search Results

Scraping Google search results can support real work in many areas. Here are some common ways people use this data.

1. Keyword and topic research
You can see which pages rank for a term, how strong the competition is, and what type of content Google prefers. This helps you decide which topics are worth your time and which ones you can skip.

2. Rank tracking
By scraping the same set of keywords on a regular basis you can track how your pages move up or down in search. This is useful when you want to see the impact of a new page, a content update, or a campaign.

3. Competitor monitoring
You can keep an eye on which domains keep showing up for your main keywords. Over time this gives you a clear picture of who you really compete with and what new pages they are launching.

4. SERP feature analysis
Search results are no longer only ten blue links. You can use scraped data to check how often you see featured snippets, people also ask, videos, or news boxes for your keywords. This tells you what kind of content format you should try.

5. Building internal tools and dashboards
Many teams pull Google search data into their own tools. For example a simple dashboard that shows top results, changes in ranking, or new competitors for each keyword group.

6. Local and niche research
You can scrape results from different locations to understand how search changes by country or city. This is helpful for local businesses and apps that serve users in specific regions.

All of these use cases are possible with a basic Python script for small tests. When you need to run them at scale or on a schedule, using the Autom Google Search API makes the whole process much easier to manage.
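
As an illustration of the rank-tracking use case, here is a small sketch built on the search_google function from earlier. The organic_results field and the rank_for helper are assumptions for illustration; check the documentation for the actual response shape:

from datetime import date

def rank_for(keyword: str, domain: str, api_key: str):
    """Hypothetical helper: 1-based position of domain in organic results, or None."""
    results = search_google(keyword, api_key).get("organic_results", [])
    for position, item in enumerate(results, start=1):
        if domain in item.get("link", ""):
            return position
    return None

# run daily (for example via cron) and append each row to a CSV for a rank history
row = {
    "date": date.today().isoformat(),
    "keyword": "best python tutorials",
    "rank": rank_for("best python tutorials", "example.com", "YOUR_API_KEY"),
}
print(row)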

Conclusion

Scraping is getting tougher by the day. If you or your company are serious about collecting this data, you should use a Google Search Scraper API.

Not only does it save you from blocks, it also spares you the time & cost of building your own scraper from scratch. Even if you build one, chances are you will have to keep maintaining it as Google keeps changing its page layout.

In any case, if you need our help integrating Autom APIs into your workflow, do let us know. We are a ‘Hi’ away on chat!
