If you’ve always been in awe of folks using the Google Search Console API to do cool things, this article is a good read for you.
You can use BigQuery with the GSC bulk data export to get some of the same benefits without requiring the help of a developer.
With BigQuery, you can efficiently analyze large volumes of data from the GSC bulk data export.
You won’t get real-time data retrieval – that’s only available with the API – but you can rely on daily data imports, which means you are working with reasonably up-to-date information.
By leveraging BigQuery and the GSC bulk data export, you can access comprehensive search analytics data – that’s the part you hear everyone raving about on LinkedIn.
I aim to make you feel more comfortable getting into the groove of analyzing data without the limitations that come with the Google Search Console interface. To do this, you need to consider five steps:
- Identify use cases.
- Identify relevant metrics for each use case.
- Query the data.
- Create a Looker Studio report to help stakeholders and teams understand your analysis.
- Automate reporting.
The issue we often face when getting started with BigQuery is that we all want to query the data right away. But that’s not enough.
The true value comes from taking a structured approach to your data analysis.
1. Identify Use Cases
It is often recommended that you get to know your data before you figure out what you want to analyze. While this is generally true, in this case it would limit you.
We recommend you start by determining the specific purpose and goals for analyzing content performance.
Use Case #1: Identify The Queries And Pages That Bring The Most Clicks
“I believe that every high-quality SEO audit should also analyze the site’s visibility and performance in search. Once you identify these areas, you will know what to focus on in your audit recommendations,” said Olga Zarr in her “How to audit a site with Google Search Console” guide.
To do that, you want the queries and the pages that bring the most clicks.
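As a sketch, a query like the one below surfaces the query/page pairs that drive the most clicks. It assumes the standard bulk export table `searchdata_url_impression`; the project and dataset names (`your-project.searchconsole`) are placeholders you would replace with your own.

```sql
-- Top query/page pairs by clicks over the last 90 days.
-- Assumes the standard GSC bulk export table searchdata_url_impression;
-- replace `your-project.searchconsole` with your own project and dataset.
SELECT
  query,
  url,
  SUM(clicks) AS total_clicks,
  SUM(impressions) AS total_impressions
FROM `your-project.searchconsole.searchdata_url_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
  AND is_anonymized_query = FALSE  -- exclude rows where the query is hidden
GROUP BY query, url
ORDER BY total_clicks DESC
LIMIT 100;
```

Filtering out anonymized queries keeps the results focused on rows where you can actually see the search term; adjust the date window and limit to suit your reporting cadence.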
Use Case #2: Calculating UQC
If you want to spot weak areas or opportunities, calculating the Unique Query Count (UQC) per page offers valuable insights.
You already know this because you use this type of analysis in SEO tools like Semrush, SE Ranking, Dragon Metrics, or Serpstat (the latter has a great guide on How to Use Google Search Console to Create Content Plans).
However, it is incredibly useful to recreate this with your own Google Search Console data. You can automate and replicate the process on a regular basis.
There are benefits to this:
- It helps identify which pages are attracting a diverse range of search queries and which ones may be more focused on specific topics.
- Pages with a high UQC may present opportunities for further optimization or expansion to capitalize on a wider range of search queries.
- Analyzing the UQC per page can also reveal which position bands (e.g., positions 1-3, 4-10, etc.) display more variability in terms of the number of unique queries. This can help prioritize optimization efforts.
- Understanding how UQC fluctuates throughout the year can inform content planning and optimization strategies to align with seasonal trends and capitalize on peak periods of search activity.
- Comparing UQC trends across different time periods enables you to gauge the effectiveness of content optimization efforts and identify areas for further improvement.
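The UQC calculation itself is a straightforward aggregation. A minimal sketch, again assuming the default `searchdata_url_impression` export table and placeholder project/dataset names:

```sql
-- Unique Query Count (UQC) per page over the last 90 days.
-- Replace `your-project.searchconsole` with your own project and dataset.
SELECT
  url,
  COUNT(DISTINCT query) AS unique_query_count,
  SUM(clicks) AS total_clicks,
  SUM(impressions) AS total_impressions
FROM `your-project.searchconsole.searchdata_url_impression`
WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
  AND is_anonymized_query = FALSE
GROUP BY url
ORDER BY unique_query_count DESC;
```

To compare UQC across time periods or position bands, you can add a `CASE` expression over the average position or run the same query against different date ranges.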
Use Case #3: Assessing Content Risk
Jess Joyce, a B2B SaaS SEO expert, has a revenue-generating content optimization framework she shares with clients.
One of the critical steps is finding pages that saw a decline in clicks and impressions quarter over quarter. She relies on Search Console data to do so.
Building this query would be great, but before we jump into it, we need to assess content risk.
If you calculate the percentage of total clicks contributed by the top 1% of pages (ranked by the number of clicks each page receives), you can quickly pinpoint whether you are in the danger zone – that is, whether there are potential risks associated with over-reliance on a small subset of pages.
Here’s why this matters:
- Over-reliance on a small subset of pages can be harmful as it reduces the diversification of traffic across the website, making it vulnerable to fluctuations or declines in traffic to those specific pages.
- Assessing the danger zone: A percentage value over 40% indicates a high reliance on the top 1% of pages for organic traffic, suggesting a potential risk.
- This query provides valuable insight into the distribution of organic traffic across a website.
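One way to sketch this calculation in BigQuery is to rank pages by clicks, bucket them into percentiles with `NTILE`, and compute the click share of the top bucket. Table, project, and dataset names are placeholder assumptions as before:

```sql
-- Share of total clicks contributed by the top 1% of pages (by clicks).
-- Replace `your-project.searchconsole` with your own project and dataset.
WITH page_clicks AS (
  SELECT
    url,
    SUM(clicks) AS clicks
  FROM `your-project.searchconsole.searchdata_url_impression`
  WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 90 DAY)
  GROUP BY url
),
ranked AS (
  SELECT
    url,
    clicks,
    NTILE(100) OVER (ORDER BY clicks DESC) AS percentile_bucket
  FROM page_clicks
)
SELECT
  ROUND(100 * SUM(IF(percentile_bucket = 1, clicks, 0)) / SUM(clicks), 2)
    AS top_1_percent_click_share
FROM ranked;
```

A result above 40 would put you in the danger zone described above, signaling heavy reliance on a small subset of pages for organic traffic.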
2. Identify Relevant Metrics
Analyzing your content lets you discern which content is effective and which isn’t, empowering you to make data-informed decisions.
Whether it’s expanding or discontinuing certain content types, leveraging insights from your data enables you to tailor your content strategy to match your audience’s preferences.
Metrics and analysis in content marketing provide the essential data for crafting content that resonates with your audience.