Real Estate Scraper API Quick Start Guide

Vytenis Kaubre

2023-10-17 | 3 min read

Real Estate Scraper API is an advanced web scraping tool designed for hassle-free and large-scale data extraction from real estate websites. It’s bundled with sophisticated features to ensure block-free access and highly accurate results, making the scraping process significantly less complex.

Follow this guide to get started with Real Estate Scraper API and successfully execute your first scraping project.

Setting up Real Estate Scraper API

1. Register or log in to your account on the Oxylabs dashboard.

2. Choose a free trial or a subscription plan. Select Web Scraper API, as Real Estate Scraper API is part of it.

3. After completing the previous step, a pop-up window will appear, asking you to create an API user. Choose a username and password, and click Create API user.

4. Once you create your API user in the dashboard, a pop-up window will appear with a test query that you can copy. Run a test scrape of a real estate website, Zillow, by executing the query in your terminal and replacing USERNAME and PASSWORD with your API user credentials.

(Image: a test query from the dashboard)

Here's the same query for you to copy and use:

curl 'https://realtime.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{"source": "universal", "url": "https://www.zillow.com/homedetails/10066-Cielo-Dr-Beverly-Hills-CA-90210/243990393_zpid/"}'

When run, the output will look similar to this:

{
  "results": [
    {
      "content": "<!doctype html>\n<html lang=\"en\">\n<head>
      ...
      </script></body>\n</html>\n",
      "created_at": "2023-09-19 07:12:36",
      "updated_at": "2023-09-19 07:12:37",
      "page": 1,
      "url": "https://www.zillow.com/homedetails/10066-Cielo-Dr-Beverly-Hills-CA-90210/243990393_zpid/",
      "job_id": "7109796405241185281",
      "status_code": 200
    }
  ]
}
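
If you only need the scraped HTML from this response, you can extract the content field on the command line. Here's a minimal sketch that pipes the same test query through jq (assuming jq is installed on your machine):

# Send the test query and keep only the HTML of the first result.
# Replace USERNAME and PASSWORD with your API user credentials.
curl 'https://realtime.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{"source": "universal", "url": "https://www.zillow.com/homedetails/10066-Cielo-Dr-Beverly-Hills-CA-90210/243990393_zpid/"}' \
| jq -r '.results[0].content' > zillow.html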

For a visual tutorial, see our video on how to set up and use Real Estate Scraper API.

You can also see how Web Scraper API works in our Scraper APIs Playground, accessible via the Oxylabs dashboard.

Integration methods and endpoints 

The previous example uses the Realtime integration method. It allows you to send a query and receive results using the same connection.

Real Estate Scraper API can be integrated using one of three methods:

  • Realtime

  • Push-Pull

  • Proxy Endpoint

The table below summarizes the essential differences between the integration methods. See this blog post for more details that’ll help you choose the most suitable option for your needs.

                    Push-Pull      Realtime      Proxy Endpoint
Type                Asynchronous   Synchronous   Synchronous
Job Query Format    JSON           JSON          URL
Job Status Check    Yes            No            No
Batch Query         Yes            No            No
Upload to Storage   Yes            No            No

You can find more examples of all three integration methods in our documentation and GitHub.
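
To illustrate the asynchronous flow, here's a rough Push-Pull sketch: you submit a job, check its status using the returned job ID, and fetch the results once the job is done. The data.oxylabs.io endpoints below follow our Push-Pull documentation, but double-check them there before using this in production:

# 1. Submit a job asynchronously (Push-Pull). The response includes a job "id".
curl 'https://data.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{"source": "universal", "url": "https://www.zillow.com/homedetails/10066-Cielo-Dr-Beverly-Hills-CA-90210/243990393_zpid/"}'

# 2. Check the job status (replace JOB_ID with the "id" value from step 1).
curl --user 'USERNAME:PASSWORD' 'https://data.oxylabs.io/v1/queries/JOB_ID'

# 3. Once the status is "done", retrieve the results.
curl --user 'USERNAME:PASSWORD' 'https://data.oxylabs.io/v1/queries/JOB_ID/results'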

Parameters

The table below lists the main query parameters. You can find additional parameters, such as context and parsing instructions, in the documentation.

Parameter         Description
source            Sets a specific scraper for requests. Real Estate Scraper API uses the universal source.
url               Direct URL (link) to a real estate web page.
user_agent_type   Device type and browser. The default value is desktop. The full list can be found in our documentation.
geo_location      Geo-location of the proxy used to retrieve the data. The full list can be found in our documentation.
locale            Locale, as expected in the Accept-Language header.
render            Enables JavaScript rendering.
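
As an example, the request below combines several of these parameters in a single payload. The parameter names come from the table above; the specific values (geo-location, locale, render mode) are illustrative, so check the documentation for the values supported by your source:

# A Realtime request with JavaScript rendering, a US proxy, and a desktop user agent.
# Replace USERNAME and PASSWORD with your API user credentials.
curl 'https://realtime.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{
      "source": "universal",
      "url": "https://www.zillow.com/homedetails/10066-Cielo-Dr-Beverly-Hills-CA-90210/243990393_zpid/",
      "user_agent_type": "desktop",
      "geo_location": "United States",
      "locale": "en-us",
      "render": "html"
    }'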

Response codes

Here are the most common response codes Real Estate Scraper API can return. If you receive a response code that’s not listed in our documentation, please get in touch with our support team.

  • 200 OK: All went well.

  • 202 Accepted: Your request was accepted.

  • 204 No content: You are trying to retrieve a job that has not been completed yet.

  • 400 Multiple error messages: Wrong request structure. It could be a misspelled parameter or an invalid value. The response body will have a more specific error message.

  • 401 Authorization header not provided / Invalid authorization header / Client not found: Missing authorization header or incorrect login credentials.

  • 403 Forbidden: Your account does not have access to this resource.

  • 404 Not found: The job ID you are looking for is no longer available.

  • 422 Unprocessable entity: There is something wrong with the payload. Make sure it's a valid JSON object.

  • 429 Too many requests: You have exceeded the rate limit. Please contact your account manager to increase limits.

  • 500 Internal server error: We're facing technical issues; please retry later. We may already be aware, but feel free to report it anyway.

  • 524 Timeout: Service unavailable.

  • 612 Undefined internal error: Job submission failed. Faulted jobs can be retried at no extra cost, or you can reach out to us for assistance.

  • 613 Faulted after too many retries: Job submission failed. Faulted jobs can be retried at no extra cost, or you can reach out to us for assistance.

Using API features

Real Estate Scraper API offers several free features that simplify your scraping workflow:

  • Web Crawler can crawl any website, pick the content that you need, and deliver it in bulk. You can use it for various purposes, such as to discover URLs or crawl content on a large scale. See Web Crawler documentation.

  • Scheduler allows you to automate repetitive web scraping and parsing jobs by scheduling them for specified times. Simply send a request that tells our service how often to repeat the jobs and when to stop. See Scheduler documentation.

  • Custom Parser enables you to create your own parsing instructions for scraped results. You can use CSS and XPath selectors to get parsed results for your specific needs, as shown in the sketch after this list. See Custom Parser documentation.

  • Cloud Integration delivers results directly to your cloud storage bucket. You can choose either Amazon S3 or Google Cloud Storage; this way, you don’t need to make additional requests to fetch results. See Cloud Integration documentation.
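
Here's a minimal Custom Parser sketch that extracts the page title with an XPath selector. The parse and parsing_instructions fields and the xpath_one function follow the Custom Parser documentation, but treat the exact payload as an assumption and verify it there:

# Ask the API to parse the result instead of returning raw HTML.
# "title" is an arbitrary output field name; xpath_one keeps the first match.
curl 'https://realtime.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{
      "source": "universal",
      "url": "https://www.zillow.com/homedetails/10066-Cielo-Dr-Beverly-Hills-CA-90210/243990393_zpid/",
      "parse": true,
      "parsing_instructions": {
        "title": {
          "_fns": [{"_fn": "xpath_one", "_args": ["//title/text()"]}]
        }
      }
    }'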

Usage statistics

You can see detailed statistics of your API usage in the Oxylabs dashboard. Visit the Statistics section to find a graph that showcases scraped pages and a table with your API user’s data. You can see metrics like average response time, daily request counts, and total requests. Furthermore, you may filter the statistics to see your usage for specific dates of your choice.

Additional resources

Don’t miss the chance to try out Real Estate Scraper API for free for a week with up to 5K results. If you have any questions, please contact our 24/7 support team via live chat or email.

Frequently asked questions

What are the Real Estate Scraper API rate limits?

Each API user account has job submission rate limits according to the subscribed standard plan. Considering the projected volume of scraping jobs, the rate limit should be more than enough.

How to download images using Real Estate Scraper API?

To download images, you can save the output to a file with an image extension via the Proxy Endpoint. Alternatively, if you’re using the Push-Pull or Realtime integration methods, you can use the content_encoding parameter. See our documentation for more information and code samples.
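
As a rough sketch of the content_encoding approach mentioned above (the parameter name follows our documentation; the exact payload and accepted values are worth double-checking there), you could request an image URL, decode the base64-encoded content field, and write it to a file:

# Fetch an image via the API with base64-encoded content, then decode it locally.
# IMAGE_URL is a placeholder; jq and a base64 utility with --decode are assumed.
curl 'https://realtime.oxylabs.io/v1/queries' \
--user 'USERNAME:PASSWORD' \
-H 'Content-Type: application/json' \
-d '{"source": "universal", "url": "IMAGE_URL", "content_encoding": "base64"}' \
| jq -r '.results[0].content' | base64 --decode > image.jpg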

Can I get a free trial for Real Estate Scraper API?

Yes, you can. Register an account in the Oxylabs dashboard, select the Web Scraper API product, click Get started, and then Start free trial. As Real Estate Scraper API is part of Web Scraper API, you’ll be able to use it for free for a week with up to 5,000 results.

What are the pricing options of Real Estate Scraper API?

You can choose a Real Estate Scraper API pricing plan based on your business needs. There are plans for small businesses and for enterprises, starting at $49/month.

How does billing work for Real Estate Scraper API?

You’ll be billed for the number of successful results. Unsuccessful scraping attempts caused by our system errors won't be included in your bill.

About the author

Vytenis Kaubre

Copywriter

Vytenis Kaubre is a Copywriter at Oxylabs. With a passion for creative writing and a growing curiosity about anything tech, he joined the army of copywriters. After work, you might find Vytenis watching TV shows, playing a guitar, or learning something new.

All information on Oxylabs Blog is provided on an "as is" basis and for informational purposes only. We make no representation and disclaim all liability with respect to your use of any information contained on Oxylabs Blog or any third-party websites that may be linked therein. Before engaging in scraping activities of any kind you should consult your legal advisors and carefully read the particular website's terms of service or receive a scraping license.
