How to use Search Console bulk data export

Google Search Advocate Daniel Waisberg recently presented an in-depth video on Bulk Data Export, a feature that lets you export, store, and analyze Search Console data.

This solution goes beyond the limits of the existing export methods and makes managing enormous amounts of data far easier.

Here’s how.

An overview of current data export solutions

Before introducing the bulk data export feature, Waisberg summarized the existing methods for exporting Search Console data.

The easiest way to do this is via the user interface. You can directly export up to 1,000 rows of data with a simple click of the Export button.

Looker Studio and the API provide solutions for people who need larger amounts of data. Both channels allow you to retrieve performance data, URL inspection data, sitemaps and site data, with an export limit of up to 50,000 rows.
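For the API route, requests are made against the Search Analytics query endpoint. Below is a minimal sketch of a request body; the dates are placeholders, and authentication plus the HTTP call (e.g. via google-api-python-client) are assumed to be handled separately. Note that a single API request returns at most 25,000 rows, so larger exports are paged with `startRow`.

```python
import json

# Sketch of a Search Analytics API request body.
# Dates and dimensions are illustrative placeholders.
request_body = {
    "startDate": "2024-01-01",
    "endDate": "2024-01-31",
    "dimensions": ["query", "page"],
    "rowLimit": 25000,  # per-request maximum; fetch further pages via startRow
    "startRow": 0,
}

print(json.dumps(request_body, indent=2))
```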

Introduction of bulk data export

The final and most advanced method of exporting data from Search Console is bulk data export.

This unique feature allows you to extract large amounts of data via Google BigQuery with no row limit. This is beneficial for large sites with many pages or high traffic.

Waisberg explains, “A bulk data export is a scheduled daily export of your Search Console performance data. It contains all the data used by Search Console to generate performance reports. The data is exported to Google BigQuery where you can run SQL queries for advanced data analysis or even export it to another system.”
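Once the export is running, the data lands in BigQuery tables you can query with standard SQL. The sketch below builds an example query against the site-level impressions table; the project and dataset names are placeholders, and while `searchdata_site_impression` follows the documented export schema, verify the table name in your own dataset.

```python
# Hypothetical project and dataset names -- replace with your own.
query = """
SELECT
  query,
  SUM(impressions) AS impressions,
  SUM(clicks) AS clicks
FROM `my-project.searchconsole.searchdata_site_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 28 DAY)
GROUP BY query
ORDER BY clicks DESC
LIMIT 100
"""

print(query)
```

Queries like this one can be run in the BigQuery console, scheduled, or issued from a client library to feed another system.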

Set up bulk data export

Due to its complexity and power, bulk data export requires prior knowledge of Google Cloud Platform, BigQuery, and Search Console.

Note that using this tool may incur costs. Therefore, it is important to consider the possible costs before setting up a new export.

Setting up a bulk data export involves Google Cloud and Search Console.

Step one: Google Cloud

First, go to the relevant project on Google Cloud and ensure that the BigQuery API is enabled.

  1. Open your Google Cloud Console and go to the project you are exporting data to.
  2. Navigate to APIs & Services > Enabled APIs & Services and enable the BigQuery API if it is not already enabled.
  3. Navigate to IAM & Admin, click + GRANT ACCESS, and paste search-console-data-export@system.gserviceaccount.com into the New Principals field.
  4. Assign two roles to this account: BigQuery Job User and BigQuery Data Editor, then click Save.

Step two: Search Console

In Search Console, do the following:

  1. Navigate to Settings > Bulk Data Export.
  2. Enter your Google Cloud project ID in the Cloud Project ID field.
  3. Choose a dataset name. The default is “searchconsole”.
  4. Choose a location for your dataset. This cannot easily be changed later.
  5. Click Next to start the export. The first export takes place up to 48 hours after successful configuration.
  6. Set a partition expiration after table creation, if necessary, but avoid schema changes.
  7. For historical data prior to initial setup, use the Search Console API or reports.
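Regarding step 6, a partition expiration can be set on an exported table with standard BigQuery DDL after the table exists. The statement below is a sketch; the project, dataset, and table names are illustrative, and the 365-day value is just an example.

```python
# ALTER TABLE ... SET OPTIONS is standard BigQuery DDL.
# Names here are illustrative -- adapt them to your own export.
ddl = """
ALTER TABLE `my-project.searchconsole.searchdata_url_impression`
SET OPTIONS (partition_expiration_days = 365)
"""

print(ddl)
```

This keeps storage costs bounded by automatically dropping partitions older than the chosen window, without touching the table schema.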

Monitor and manage data exports

The new data export system has a built-in feature that allows you to monitor data exports using BigQuery. For example, you can track exports using an export log table.
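The export log is just another table in the dataset, so checking recent export runs is a simple query. The sketch below assumes the table is named ExportLog, which matches Google's documentation, but confirm the name and columns in your own project before relying on it.

```python
# Hypothetical project and dataset names; ExportLog is the documented
# log table name, but verify it in your own dataset.
log_query = """
SELECT *
FROM `my-project.searchconsole.ExportLog`
LIMIT 10
"""

print(log_query)
```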

Note that data accumulates indefinitely unless you set an expiration time. The export continues to run until you disable it manually or until Search Console encounters a problem that stops it.

In case of errors, Search Console notifies all property owners.

In summary

The ability to export data to BigQuery without row limits can significantly simplify how large volumes of Search Console data are managed.

Stay tuned for upcoming content from Google that dives deeper into processing data after export is set up and best practices for extracting data from BigQuery.


Source: YouTube

Featured image created by the author using Midjourney.