
How to Scrape Google Maps: A Comprehensive Guide

Danielius Radavicius

2023-06-09 · 5 min read

In the current day and age, where public web data scraping has become a foundation for many businesses, it’s unsurprising to see that Google Maps is yet another area commonly scraped for its valuable data. In this article, we’ll discuss what this data may be and how to build a scraper that gathers it using an Oxylabs solution.

Before we get started, let’s briefly look at the legalities of scraping Google Maps. The legality of web scraping is a much-debated topic in the data-gathering field. It’s important to note that web scraping may be legal in cases where it’s done without breaching any laws regarding the target sources or the data itself. That being said, we advise you to seek legal consultation before engaging in scraping activities of any kind.

We’ve explored the legality of web scraping in this blog post, so feel free to check it out for a more in-depth explanation.

Why scrape Google Maps?

The core purposes of scraping Google Maps are numerous. From a research perspective, a user may want to employ a Google Maps data scraper to analyze demographic information or transportation routes. For businesses, a Google Maps scraper may be the go-to tool for competitor analysis, as it allows you to collect data on competitors' locations, customer reviews, and ratings. Gathering real estate/property listings is a possible use case as well.

Overall, this makes Google Maps data scraping a highly lucrative solution that many businesses are certain to make use of.

Should you use the official Google Maps API?

Many popular websites, such as Twitter and Amazon, provide their own APIs, and Google is no exception. Naturally, the question arises: why not use the official Google Maps API?

Let’s begin with the price. Each user gets a $200 monthly credit for API calls. This $200 covers:

  • Up to 40,000 Geolocation calls

  • Up to 100,000 Static Maps loads

  • Up to 28,000 Dynamic Maps loads

  • Up to 40,000 Directions calls

At first glance, this may seem like plenty, but it likely isn’t. Google's API, like many others, begins charging you once the credit is used up. Now imagine a scenario where you use the Embed API in Directions, Views, and Search modes. Suppose your service loads a map that initiates an address search through autocomplete. That single request now consumes two different API calls. Add another requirement, say geolocation services for directions or distances, and a single request takes up three separate API calls. Furthermore, as your business scales, so does the daily number of calls you make, meaning that after a certain point, the Google Maps API becomes an exceedingly pricey solution.
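To put the scaling concern into numbers, here's a rough cost sketch in Python. The per-1,000-call prices below are illustrative assumptions, not official Google rates, so check Google's current pricing page before drawing any conclusions:

# Back-of-the-envelope cost estimate. All prices are illustrative
# assumptions, NOT official Google rates.
PRICE_PER_1000_CALLS = {
    "autocomplete": 2.83,
    "dynamic_maps": 7.00,
    "directions": 5.00,
}

requests_per_day = 5_000  # hypothetical traffic
days_per_month = 30

# Each user request triggers one call to each of the three APIs.
monthly_cost = sum(
    requests_per_day * days_per_month / 1000 * price
    for price in PRICE_PER_1000_CALLS.values()
)
print(f"Estimated monthly cost: ${monthly_cost:,.2f}")
print(f"After the $200 credit: ${max(monthly_cost - 200, 0):,.2f}")

Under these assumed prices and traffic, the bill lands well above the monthly credit, which is exactly the scaling effect described above.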

Yet, the high price isn’t the only limitation of Google’s own API. There are also strict request limitations. Google’s current enforced rate limit is up to 100 requests per second.

Google is also known to implement unpredictable changes that offer little benefit to their users, such as the limits imposed in 2010.

However, products like Oxylabs' Google Maps API solution are specifically made to avoid limitations such as the ones mentioned above, which is why they’re commonly chosen instead of official APIs.

How do I extract data from Google Maps?

Before you begin

To scrape Google Maps data, you will need Oxylabs' SERP Scraper API. Sign up for Google Search Results API and take note of your username and password.

Replace USERNAME with your username and PASSWORD with your password throughout the code samples in this guide.

Setting Up Your Project Environment

Before writing code to scrape data from Google Maps, we must set up a project environment and install the necessary Python libraries.

Create a new virtual environment to separate your project dependencies from your system packages. Ensure that you have Python 3.8 or newer installed. Run the following command in a terminal:

$ python3 -m venv env

On Windows, use python instead of python3.

Next, activate the virtual environment:

  • Windows: env\Scripts\activate

  • macOS/Linux: source env/bin/activate

Install the required Python libraries for this project. We'll be using beautifulsoup4, requests, and pandas. You can install them by running the following:

$ pip install beautifulsoup4 requests pandas
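As a quick sanity check, you can verify that all three libraries import without errors:

$ python -c "import bs4, requests, pandas; print('All set')"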

With your project environment set up, we're ready to start writing code to scrape Google Maps data.

Fetching Data Using the Google Scraper API

We'll be using Oxylabs' Google Search API to fetch data from Google Maps. This API allows us to send HTTP requests to Google and receive the HTML content of the search results page. For a detailed tutorial, see How to Scrape Google Search Results.

1. First, open google.com in your browser and search for "restaurants near me". You will see the search results with the restaurants' names, ratings, hours, and other data points.

2. Copy this URL. We will use Google Search Scraper API to fetch data from this URL.

3. To use Google Search Results Scraper API, we need to set the following parameters:

  • source: This will be google.

  • url: The URL that you copied after searching for restaurants near me.

  • geo_location: Google Scraper API allows us to use any location for the search; we'll use New York here.

4. Create a dictionary as follows to hold these parameters:

payload = {
	"source": "google",
	"url": f"https://www.google.com/search?tbs=lf:1,lf%5C_ui:9&tbm=lcl&q=restaurants+near+me#rlfi=hd:;si:;mv:[[54.6781006,25.2765623],[54.6672417,25.2563048]];tbs:lrf:!1m4!1u3!2m2!3m1!1e1!1m4!1u2!2m2!2m1!1e1!2m1!1e2!2m1!1e3!3sIAE,lf:1,lf_ui:9",
	"geo_location": "New York,New York,United States",
}

5. The next step is to send these parameters to the API endpoint. For this, we can use the requests library to send a POST request as follows:

import requests

# Send the payload to the API endpoint using basic authentication.
response = requests.request(
    "POST",
    "https://realtime.oxylabs.io/v1/queries",
    auth=("USERNAME", "PASSWORD"),
    json=payload,
    timeout=180,
)

Replace USERNAME and PASSWORD with your actual username and password.

6. If everything goes well, you should get a response with status code 200.
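If you want the script to fail loudly on any other status, a minimal check could look like this:

# Raise an exception if the API returned an error status code.
response.raise_for_status()
print(response.status_code)  # Prints 200 on success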

7. You can get the HTML from the results as follows:

html = response.json().get("results")[0].get("content")
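While building your selectors, it can also help to save the returned HTML to a file and open it in a browser. An optional snippet:

# Optional: save the HTML locally for inspection in a browser.
with open("results.html", "w", encoding="utf-8") as f:
    f.write(html)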

8. The next step is to parse this HTML.

Parsing Google Maps Data

Once we have the HTML content of the search results page, we can use the BeautifulSoup library to parse the data. In this example, we'll extract the following data points from each place listed in the search results: name, place type, address, rating, price level, rating count, latitude, longitude, hours, and other details.

First, open the browser and open the same URL that you used in the code. Right-click on any of the listings and select Inspect.

Try to create a selector that selects exactly one listing at a time.

One possible selector is [role='heading']. Another is [data-id]. We will use [data-id] in this example.

We can loop over all the matches and look for specific data points.
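Before writing the full parsing loop, it's worth confirming that the selector matches one element per listing. A quick check, assuming html holds the content fetched earlier:

from bs4 import BeautifulSoup

soup = BeautifulSoup(html, "html.parser")
listings = soup.select("[data-id]")
print(f"Found {len(listings)} listings")  # Should match the count on the page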

The next step is to create a CSS selector for each data point you want to scrape. For example, you can select the name of the restaurant with the following CSS selector:

[role='heading']

The following are all the selectors:

name_selector = "[role='heading']"
type_selector = ".rllt__details div:nth-of-type(2)"
address_selector = ".rllt__details div:nth-of-type(3)"
hours_selectors = ".rllt__details div:nth-of-type(4)"
rating_count_selector = 'span:contains("(")'
rating_selector = "[aria-hidden='true']"
details_selector = ".rllt__details div:nth-of-type(5)"
price_selector = "span[aria-label*='xpensive']"
lat_selector = "[data-lat]"
lng_selector = "[data-lng]"

We can use BeautifulSoup's select and select_one methods to select elements and then extract the text within those elements.
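For example, extracting the name from a single listing could look like this, where place stands for one listing's parent element, as in the full loop below:

# Guard against missing elements by checking select_one's result.
name_el = place.select_one("[role='heading']")
name = name_el.text.strip() if name_el else ""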

The rating count needs a different approach: it's enclosed in parentheses right after the rating, as in 4.3(513), where 513 is the count. We can extract this value with a regular expression as follows:

import re

# rating_count_el is the element matched by rating_count_selector.
count_match = re.search(r"\((.+)\)", rating_count_el.text)
rating_count = count_match.group(1) if count_match else ""

Putting everything together, the following code generates a list of dictionaries that contain all the data from all the listings on the page:

import re

from bs4 import BeautifulSoup

soup = BeautifulSoup(html, "html.parser")
data = []
for listing in soup.select("[data-id]"):
    place = listing.parent

    name_el = place.select_one(name_selector)
    name = name_el.text.strip() if name_el else ""

    rating_el = place.select_one(rating_selector)
    rating = rating_el.text.strip() if rating_el else ""

    rating_count_el = place.select_one(rating_count_selector)
    rating_count = ""
    if rating_count_el:
        # The count sits inside parentheses, e.g. 4.3(513).
        count_match = re.search(r"\((.+)\)", rating_count_el.text)
        rating_count = count_match.group(1) if count_match else ""

    hours_el = place.select_one(hours_selectors)
    hours = hours_el.text.strip() if hours_el else ""

    details_el = place.select_one(details_selector)
    details = details_el.text.strip() if details_el else ""

    price_level_el = place.select_one(price_selector)
    price_level = price_level_el.text.strip() if price_level_el else ""

    lat_el = place.select_one(lat_selector)
    lat = lat_el.get("data-lat") if lat_el else ""

    lng_el = place.select_one(lng_selector)
    lng = lng_el.get("data-lng") if lng_el else ""

    type_el = place.select_one(type_selector)
    # The place type appears after a "·" separator, e.g. "$$ · Restaurant".
    place_type = type_el.text.strip().split("·")[-1] if type_el else ""

    address_el = place.select_one(address_selector)
    address = address_el.text.strip() if address_el else ""

    data.append({
        "name": name,
        "place_type": place_type,
        "address": address,
        "rating": rating,
        "price_level": price_level,
        "rating_count": rating_count,
        "latitude": lat,
        "longitude": lng,
        "hours": hours,
        "details": details,
    })

The next step is to save this data as CSV.

Exporting Google Maps Data to CSV

With the data parsed, the final step is to export it to a CSV file. We'll use the Pandas library to create a DataFrame and save it as a CSV file:

import pandas as pd

# Convert the list of dictionaries into a DataFrame and write it to CSV.
df = pd.DataFrame(data)
df.to_csv("data.csv", index=False)

When you run this code, it will save the data to a CSV file named data.csv.
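To spot-check the output before opening the file, you can print the first few rows:

print(df.head())  # Preview the first five rows of the scraped data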

Conclusion

Scraping Google Maps isn’t an easy task, but this guide should help you understand both how the scraping process works and how it fits together with our API solution. The aim of this tutorial was to provide a comprehensive, step-by-step guide, but if you have any questions, don't hesitate to contact us or chat with our live support team, available 24/7.

About the author

Danielius Radavicius

Copywriter

Danielius Radavičius is a Copywriter at Oxylabs. Having grown up in films, music, and books and having a keen interest in the defense industry, he decided to move his career toward tech-related subjects and quickly became interested in all things technology. In his free time, you'll probably find Danielius watching films, listening to music, and planning world domination.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.

People also ask

Is it possible to collect data from Google Maps?

Yes, you can use various programming languages or automated solutions such as Google Maps data extractor APIs to scrape Google Maps.

Is it legal to scrape Google Maps?

Yes, Google Maps provides publicly available data. Whether you copy-paste it, write it down, or use a Google Maps extractor for web scraping, the essence of public data is that it’s free to use and share.

It’s essential to comply with the website's terms of service, respect any restrictions or limitations on data usage, and adhere to legal and ethical guidelines governing Google Maps business scraper activities.

For more specific data extraction scenarios involving copyrighted or sensitive material, please seek professional legal guidance and analyze applicable national and international legislation. To learn more about the legalities of web scraping, check here.

Why collect data from Google Maps?

The purpose of web data extraction from Google Maps is subsequent business data analysis. By extracting data with a Google Maps business scraper, users can gain insights, identify patterns, perform market research, or make informed decisions.
