Mr Jonathan Jones

SEO & Digital Consultant

Google’s Product Review Update, April 2021

In typical Google fashion, Google recently dropped a bombshell announcement on Twitter and Google Search Central: the “product reviews update”, introduced in a post titled “What creators should know about Google’s product reviews update“. At a glance, this update looks to target affiliate websites. The affiliate space is huge and growing, so it’s no surprise… Continue reading

Cloudflare + Page Speed

I am really impressed with Cloudflare as a CDN for improving page speed on your website. I had never deployed a CDN on any of the websites in my portfolio, and for a good while I was stubborn, thinking I’d be fine without one, even though I knew the benefits. The results are… Continue reading

Google to Roll Back Knowledge Panel, Featured Snippet-like Variant

Update 29/01/2020: Google has officially rolled back the change to the ‘Knowledge Panel Featured Snippets’, as pointed out by Saijo George on Twitter. Results featuring this type of Featured Snippet will now show duplicated results, as before: in the regular listings and in the Featured Snippet located in the right-hand section of search… Continue reading

January 2020 Featured Snippet Google Update

Google has certainly come out with quite a number of significant changes to its search results for SEOs and consumers in January so far, and we are only 22 days into 2020. On January 13th, they announced a change to match a layout they rolled out on mobile devices a year ago to add… Continue reading

Key takeaways: Google Webmaster Conference, Zurich – Notes #WMCZRH

I was lucky to be one of the 150-200 people to attend the Google Webmaster Conference hosted in one of Google’s Zurich offices on Wednesday, the 11th of December, so I thought I’d write up a blog post with some notes I’d taken, along with some of the pictures I took at the event.

It was also a really good experience speaking to others who work in the industry, and I have undoubtedly made a lot of new connections and hopefully gained some new like-minded friends who do the sort of stuff I do.

The official Googlers speaking at or attending the event were John Mueller, Daniel Waisberg, Martin Splitt, Gary Illyes and Lizzy Harvey.

And the community speakers were Tobias Willman, Aleksej Dix and Izzi Smith.


Google opened the presentation with some housekeeping and a note that they can’t give out advice for specific websites. This is important to note, as any question you ask will be answered in a more general manner. This has historically been their policy for some time: they typically will not give specific feedback unless they think it benefits more than a single website.

Google Search Console – Daniel Waisberg

Daniel is a Search Advocate who works on the Google Search Console product; the product team appears to be based in Tel Aviv. Daniel is well known for the work he’s done at Google on the Google Analytics product and is the author of Google Analytics Integrations.

In his slides, he detailed the mission of Google Search Console, along with updates and information on new features they have rolled out this year, and how they got there.

  • The speed of Google Search Console data processing increased 10x due to engineering developments; hence they were able to give webmasters access to 16 months of data, and to more recent data
  • They send millions of emails per month in an attempt to help webmasters fix problems
  • It was highlighted that the newly created ‘Speed report’ in Google Search Console was extremely complex to build, as they were taking data from the Chrome User Experience (CrUX) report and worked “across departments” to get it working
    • The purpose of the Speed report is to give speed optimisation suggestions and to validate fixes
  • They launched ‘alerts’ and ‘messages’ last week (w/c 2nd of Dec) to help webmasters keep track of issues and to alert them to problems impacting their website on Search
  • Some of the old Google Search Console features will go away and some will be moved over, though they might not be exact 1:1 moves and will aim to be improvements (followed up by John Mueller later)
  • I managed to quickly catch up with him separately afterwards regarding BigQuery and Google Analytics, as an audience member had asked about natively integrating BigQuery into Google Search Console. I briefly mentioned Supermetrics to him as a connector that could be used to connect to BigQuery (though obviously a direct connector would be far more accessible)

How does Google Search Analytics work?

Page speed report in Google Search Console

The mission of Google Search Console: 

Provide data and tools to help site owners improve their website(s) and their appearance on Google

Google Search News Update – John Mueller 

help websites to be successful through search

  • The Google Webmaster Trends’ goal & mission is to ensure websites are successful on Google
  • Anecdote: Cheese comes in all shapes and sizes (as with a lot of websites), but if you want the best cheese – you need great high-quality ingredients (or something to that extent) – either way, it’s pretty clear that John loves cheese; I mean, who doesn’t?

John Mueller and Cheese

Other topics presented by John:

Mega Menus

Mega menus

  • Create a clear structure (top, category – detail)
  • Big menus are not necessarily bad
  • No need to hide links, can use nofollow if needed
  • Focus on usability
  • Don’t copy other websites

Pagination without rel next/prev

Pagination without rel next/prev

  • Link naturally between pages (in a way that the internal links are indexable)
  • Use clean URLs (don’t use parameters)
  • Use a local crawler (Screaming Frog, Sitebulb etc.)



  • Canonicalisation uses more than rel=”canonical”
    • Redirects
    • Internal and external links
    • Sitemaps and hreflang
    • Cleaner URLs, HTTPS, preferred domain setting

What did Google remove/add in 2019?

Removed:

  • Support for Flash
  • rel=next, rel=prev (Google sees websites already doing this correctly, so no need to do anything special for Google)
  • DMOZ/ODP went away
  • Noindex in robots.txt
  • Google link: and info: operators


Added:

  • Introduced favicons
  • Better preview/snippet controls (schema)
  • rel=”nofollow” as a hint
  • Added rel=”sponsored” and rel=”ugc”
  • Discover / Google Feed
  • Introduced a robots.txt standard

Use structured data

  • Lizzy Harvey wrote up all of the structured data documentation
  • Google looking to do more in this space and will be making further structured data enhancements for 2020
  • Structured data is good for giving even more context to a page (outside of pure direct impact to search snippets)

Use structured data (JSON-LD)
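As a rough illustration of what that structured data looks like (the headline, author and date below are placeholders, not examples from the talk), a page's JSON-LD is just a serialised object. In JavaScript you might build it like this:

```javascript
// Minimal JSON-LD sketch for an Article page; all values are hypothetical.
const jsonLd = {
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2019-12-11"
};

// Serialise the object; on a real page this string would sit inside a
// <script type="application/ld+json"> tag in the document head.
const markup = JSON.stringify(jsonLd, null, 2);
console.log(markup);
```

Structured data like this gives Google extra context about the page even where it has no direct, visible effect on the search snippet.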

Core Updates & BERT

  • Google’s focus is to understand pages and queries better, and to show better, more relevant results
  • Core updates – always happen, but they will be bigger in future
  • No specific fix for websites impacted by core updates – other than trying to make your website better (as a whole)
  • BERT – tries to understand web search queries better and tries to identify which pages are relevant for search queries (more on BERT query understanding)
  • Key focus on natural language for content – content should be written naturally
  • H1 – H3 headings help Google understand even images (semantics of the sections of a page – associated with specific images)
  • Read the Quality Rater Guidelines 

BERT & Core Updates

Crystal ball and looking into the future

  • More “core” algorithm updates
  • Better understanding of queries and pages
  • “data-nosnippet” attribute

Google predictions from John Mueller for 2020


A combination of Martin Splitt, John Mueller and community speaker Aleksej Dix talked about dynamic rendering and server-side rendering throughout their presentations. Google went into more detail on how rendering works, and Aleksej provided details on practical examples of what you can do using NUXT.js (more on that below).

What is tree shaking? JavaScript bundles inevitably grow to a certain size over time; tree shaking removes code that is no longer in use by dropping it from the bundle during the build process. Martin went on to talk through some examples:

Martin went on to talk about how Google tries to read JavaScript; they try to give enough time (around 40 seconds) to read the whole file for rendering purposes:

  • Google tries to cache JavaScript aggressively for indexing purposes
  • If excessive amounts of time/CPU are taken, then Google “will cut […] off” and potentially stop working hard at reading a file
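The tree-shaking idea mentioned above can be sketched in a few lines of JavaScript. This is a toy illustration, not a real bundler: the export names and helpers below are made up for the example, and a real bundler (webpack, Rollup) does this via static analysis of `import` statements at build time:

```javascript
// A module exports several helpers, but the page only ever uses one of them.
const moduleExports = {
  formatPrice: (n) => `£${n.toFixed(2)}`,
  legacyCarousel: () => 'big old widget', // hypothetical dead code
};

// Toy "tree shake": keep only the exports that are actually imported.
function treeShake(exports, usedNames) {
  return Object.fromEntries(
    Object.entries(exports).filter(([name]) => usedNames.includes(name))
  );
}

const bundle = treeShake(moduleExports, ['formatPrice']);
console.log(Object.keys(bundle));     // legacyCarousel is shaken out
console.log(bundle.formatPrice(9.5)); // the used helper still works
```

The real win is that unused code never ships to the browser at all, which keeps the bundle (and Google's rendering cost) down.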

Javascript Takeaways: 

  • All of the general guidelines in SEO apply to javascript (links, URLs, images, structured data, crawling etc)
  • Use real URLs (no “#” fragments in URLs) – 12% of single-page applications use fragment URLs
  • Minimise embedded resources (speed up server for crawling, and rendering speed/CPU )
  • Use testing tools & a local rendering crawler
  • Link through to the Google Search JavaScript documentation
  • Budget for rendering, use capable servers & robust JS
  • “SEO is not dead” – off the back of saying that as JS becomes more popular, there will need to be SEO support to ensure that Google is able to efficiently crawl the javascript for indexation and ranking purposes

NUXT.JS // Server-Side Rendering

I won’t pretend to understand everything covered in the JavaScript slides, but the example output from Aleksej Dix, who presented NUXT.JS (“a progressive framework based on Vue.js to create modern web applications”), was very good at explaining this to non-devs such as myself.

It matters for SEO because it renders content on both the client and the server side. I was very impressed with the example given, as Alek showed live how he had disabled JavaScript in the browser and the functionality and content remained. Usually with other libraries, if you disable JavaScript in your browser your screen goes white/blank, but in this instance the content persisted. Read more on NUXT.js and server-side rendering.

  • NOTE: I have also found out through a small personal project of mine that in order to get NUXT.js running, you need Node.js installed on your server (this is also in Alek’s slides, so I just wasn’t paying enough attention). Not all shared web hosts have Node.js installed or enabled, but some, like A2 Hosting, do.

A view into how it works: 

  • Server-Side Rendering (SSR) means that some parts of your application code can run on both the server and the client. This means that you can render your components directly to HTML on the server-side (e.g. via a node.js server), which allows for better performance and gives a faster initial response, especially on mobile devices. Source.

The key takeaway here is to use NUXT.js on a Node.js-enabled server if you’re using Vue.js. However, there is another approach, which Google calls “dynamic rendering”, if NUXT.JS isn’t an option:

Dynamic rendering is good for indexable, public JavaScript-generated content that changes rapidly, or content that uses JavaScript features that aren’t supported by the crawlers you care about. Not all sites need to use dynamic rendering, and it’s worth noting that dynamic rendering is a workaround for crawlers.
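As a rough sketch of the idea, and not Google's or any framework's actual implementation, dynamic rendering typically means detecting known crawler user-agents and serving them pre-rendered HTML, while regular visitors get the client-side app. The bot patterns below are illustrative only:

```javascript
// Hypothetical user-agent check for dynamic rendering; the pattern
// list is illustrative, not exhaustive.
const BOT_PATTERNS = [/Googlebot/i, /bingbot/i, /LinkedInBot/i];

function shouldServePrerendered(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

// In an Express-style server you would branch on this per request:
// if (shouldServePrerendered(req.headers['user-agent'])) {
//   serve the pre-rendered static HTML snapshot
// } else {
//   serve the normal client-side JavaScript app
// }

console.log(shouldServePrerendered(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // true
console.log(shouldServePrerendered('Mozilla/5.0 (Windows NT 10.0)')); // false
```

As the quote above notes, this is a workaround for crawlers rather than something every site needs.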

Useful documentation: 

I will stop writing at this point, as it has already taken me quite some time to write this up and to remember all of the key takeaways, but there were of course other presentations from community speakers such as Tobias Willman and Izzi Smith.

News SEO & Featured Snippets

Tobias kindly shared his slides on Google slides which were based on ‘News SEO’ – more for those in the publishing space:

Izzi talked about featured snippets and content – with the key advice being:

  • “become a well-structured, engaging, and satisfying resource with relevant authority and high accessibility”

All in all, it was a really good event, and those in the Google Webmaster Trends team planned it excellently, so credit where credit is due. Everyone at the event was super friendly and chatty, which made it easy to talk to others as I went alone (as did many others); it was a good way to get out of my comfort zone and speak to other like-minded people. 🙂

Questions & Answers: 

Q (disclosure, this was my question): John, you had in one of your slides regarding the change in policy at Google around rel=”nofollow” and will be using that as a hint. How does this impact the web from a Google perspective? What will change as a result? 

A: Gary responded to this, as he was the co-author of the Google blog post announcing in September that this change would roll out on March 1st, 2020. The general theme of his response was that there will still be no PageRank flowing through nofollow links, but that there are ‘other signals’ which could make nofollow links useful for Search purposes outside of PageRank.

Q: Will there be a separation of ‘voice-activated searches’ in Google Search Console?   

A: No. Google explained (as they have before) that there is not much in terms of this insight being actionable, and that whilst you can’t separate this in Google Search Console, the voice data is part of Google Search Console reporting.

Q: Meta description and title tag length?

A: There is no single recommended length for title tags and meta descriptions because this varies depending on screen size; hence Google will not be giving out any guidelines on this.

Q: Will there be any updates to the APIs for Google Search Console? 

A: Nothing planned. The response was that they weigh up the engineering resources they have against how high API usage is versus everything else; right now, API usage isn’t high enough, and improvements would require a good amount of resources.

Q: Is there a plan to increase (already increased to 16 months) the length of data in Google Search Console to 18 or 24 months? 

A: No. A lot of hard work has already gone into increasing the maximum date range to 16 months from the previous 3 months, and with that, increasing the speed at which Google is able to fetch this type of data. This references Daniel Waisberg’s earlier point about engineers in the Google Search Console team increasing data processing speed by 10x.

Q: Why doesn’t Google use image recognition to help provide context for images so websites don’t need to provide alt text? 

A: Google explained that webmasters still need to provide context for images within articles, and that for accessibility purposes alt text will still be required for those with visual impairments using screen readers.

Q: A page’s content updates dynamically based upon IP address; what is Google seeing and indexing? 

A: Googlebot will see the US content version of that page as the canonical (since it crawls through a US IP address). The discussion ended around creating country-specific sections/pages, and potentially going down the hreflang route.

Q: What JavaScript library is the gold standard?

A: Martin responded by saying that this is not a good question. The output can differ regardless of the framework; it depends more on how it is deployed.

Pre-filtering reports in Google Data Studio

Google Data Studio has a lot of functionality when it comes to filtering, and I use it constantly, not least the default “filter control” options, which allow you to filter by dimension:

However, I got a question from someone using the report I created and shared yesterday (Google Data Studio Alternative to Google Search Console – exporting by date) about filtering the Query data to NOT contain certain queries. I was going to simply respond to the comment saying that you can use Regular Expressions (see the ‘Query -> REGEXP’ example above) to do exactly that. You can see the question below:

As an aside, I am so humbled and pleased by the fact that so far on Twitter, the response to sharing the report has been massively well received, we’re about 266 likes and 89 re-tweets in and my phone is still buzzing away as I write this at 8:11 AM. Dawn Anderson, who I hugely respect in the SEO industry, also shared the post on LinkedIn!

Back to filtering…

Not to detract from the subject of this blog post, so going back into it: what if you wanted to remove queries that just were not relevant for your report in Google Data Studio?

Introducing ‘Report Settings’ in Google Data Studio

So, if you’ve already got a snazzy report setup, and you want to either 1. Filter out the query data to only contain certain queries or 2. Filter your queries to not contain certain queries, on a more permanent basis, then this is all feasible within the powerful filtering options that exist within Google Data Studio.

All you need to do is go to ‘Report settings’, which you can find in the top left hand corner menu under ‘File’:

Once you click ‘Report settings’ a menu will open up on the right hand side of your monitor:

The option that we’re looking for here is the ‘ADD A FILTER’ option. In Google’s own words, this is what this does to your report:

Configuring a filter in the report settings panel sets it as the default for the entire report. All components that share the same (or similar) data source are affected by the filter. You can override this by turning off filter inheritance for a selected component. – Google Data Studio

So we can conclude that adding this filter will impact your entire report.

Once you click that ‘ADD A FILTER’ option, a ‘Create a filter’ prompt appears at the bottom of your screen (I won’t screenshot this, as I’ve got a lot of images in this post already). Once you click that, you’ll be presented with the filter options below, with the ability to ‘Exclude’ or ‘Include’ and then to filter directly by dimension:

Going back to the original question, Andrew Coco wanted to filter by Query to exclude certain keywords. In this report, I’ve set up a very basic filter on the Site Impression table that excludes rows where the Query dimension contains the string ‘youtube’:

This then removes all traces of the word ‘YouTube’ from the entire report.
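To make the effect of that filter concrete, here is a small JavaScript sketch of what “Exclude / Query / Contains youtube” does to a set of query rows (the queries and click counts are made up):

```javascript
// Hypothetical Search Console query rows.
const rows = [
  { query: 'youtube to mp3',         clicks: 120 },
  { query: 'best seo tools',         clicks: 45  },
  { query: 'how to rank on youtube', clicks: 30  },
  { query: 'data studio filters',    clicks: 12  },
];

// Data Studio's "Exclude / Query / Contains youtube" behaves like:
const filtered = rows.filter((row) => !row.query.includes('youtube'));

console.log(filtered.map((row) => row.query));
// Any row whose query contains 'youtube' has been dropped.
```

A regular-expression condition works the same way, just with a pattern match instead of a plain substring check.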

I can see where this might be useful, as you’ll be able to set up reports that exclude branded search terms, or that only include branded search terms.

Chart Inherited Filtering

Because we set this up via the ‘Report setting’ section in Google Data Studio, all of the charts in the report will inherit this filter. You’ll be able to see if this is the case by selecting any of the charts in your report:

You can simply toggle this filter on or off as you wish, and even set up a chart-specific filter, depending on what you’re trying to achieve.

That should hopefully answer the question in more detail than Andrew Coco was probably expecting, but it was a good question, so I thought I’d give it a good answer. 🙂

What am I looking at next?

I think my next Google Data Studio blog post will show how you can create drop-downs that let you toggle easily between groups of keywords you’ve grouped together using Calculated Fields, so that you can, for example, create a drop-down menu giving you the option to switch from branded search queries to generic search queries. This relies on Google’s CASE function, yet another powerful part of the platform.
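As a quick preview, and assuming ‘acme’ stands in for your own brand term, a calculated field for that kind of grouping would look roughly like this in Data Studio’s formula editor (CASE and REGEXP_MATCH are Data Studio functions; REGEXP_MATCH requires the pattern to match the whole string, hence the ‘.*’ on either side):

```
CASE
  WHEN REGEXP_MATCH(Query, ".*acme.*") THEN "Branded"
  ELSE "Generic"
END
```

You can then use the resulting field as the dimension behind a drop-down filter control.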

Stay tuned!

Use Google Data Studio to Export Google Search Console Data by Date

If you’ve been using Google Search Console recently, you’ll have spotted that Google retired the old Google Search Console user interface for the Search Analytics report on the 13th of December, 2018, replacing it with its more Firebase-looking interface. This has left people without core functionality that existed before, but fear not: Google Data Studio is the solution! Read more below:

There are pros and cons to the new user interface, and many are already finding that functionality that existed before does not exist within the new Google Search Console interface. My favourite part of the platform is the Search Analytics report, which allows you to download keyword data, whether that is Impressions, Clicks, CTR or Average Position. These are all super useful metrics for measuring performance. The problem people are running into is that in the previous Google Search Console report, you could export data, whether for a single keyword, at a broad level or for multiple terms, on a trended, daily basis.

You can’t export by date…

In the new Google Search Console, you can only export snapshots of data, which is not very useful if you’re trying to marry separate data sources or export the data to visualise in Excel for that super important presentation you’ve got to complete.

Update [23rd of September, 2019]: It is now possible to export by date in the new Google Search Console as per Google’s announcement on the 23rd of September, 2019: 

I would still recommend continuing with this tutorial as Google Data Studio is pretty damn awesome and there’s so much more you can do to segment your data further than what you can do through the Google Search Console UI. 

Google Data Studio to the rescue!

In light of this, I’ve created a dashboard in Google Data Studio that anyone can re-use and download:

Click to view the Google Search Console 3.0 – Data Studio Report

Jonathan Jones

I’ve taken the liberty to screenshot the report itself – below:

Google Search Console 3.0

The key aspect here is that you can better visualise data in Google Data Studio because you can customise the report to your specific needs. However, if you’re landing on this page wanting only the ability to export data by day, you can see that I’ve added that table in the bottom left-hand corner of the report.

The three little dots…

You’ll see that you can export the data in a by-day/trended view when you hover your cursor over the area I’ve highlighted below with the circle. This will show you three little dots, as seen below:

Download options

Once you click those three little dots, you’ll be presented with some download options:

I tend to download into Google Sheets first because, for some reason, Google Data Studio rounds the data when you export directly to a CSV for Excel – which, might I add, is rather annoying – but I suppose it works for Google, as you’re using yet another service they provide.

Whilst I export into Google Sheets, I tend to copy and paste the data into Excel, as that’s my preferred method, but if you like Google Sheets then keep the data in there. This is what the data ends up looking like; it’s pretty much the same as what you could do in the old version of Google Search Console:

And thanks to the new Google Search Console API, you can download 16 months’ worth of data in a daily/trended view, versus the 90 rolling days available in the old Google Search Console.

How can you get started with this?

If you access the link to the Google Data Studio report that I shared earlier, and that I’m sharing again, you’ll be able to see the below option to copy the report (see red circle):

Once you click that button, you’ll be prompted to link your Google Search Console account to Google Data Studio (effectively, this swaps out the reference to the existing Google Search Console connector with your own account’s connector):

Connect your Google Search Console account to Google Data Studio

If you haven’t got Google Search Console connected already, then you’ll have to click on ‘Create a new data source’.

It is relatively straightforward to connect your Google Search Console from this point.

IMPORTANT – Use and connect the Site Impression table for Google Search Console and NOT URL Impression

You’ll want to connect your ‘Site Impression’ table to this report as that is what I’ve used to create the report.

I’ve elected to use the Site Impression table over the URL Impression table because the data within the URL Impression table is not as reliable as the Site Impression table, for quite a few reasons.

I’ll need to write a separate blog post on why I believe that to be the case, but this report I’ve created, a la the ‘Google Search Console 3.0‘ allows you to filter by certain keywords, and other fields, including the options to use Regular Expressions.

So for now, use the Site Impression table, though I am looking to test blending the Site Impression and URL Impression tables together, so that you have the ability to filter by landing page rather than filtering from the Site Impression data only. You can see the two options below – select ‘Site Impression‘:

Invalid Metric Issues – how to resolve these…

Once the data is imported, you may come across some issues, which are simple to resolve. I’ve seen that sometimes the metrics don’t carry over properly for one reason or another. All you need to do is re-select the ‘invalid metric’:

In this case, the ‘invalid metric’ is Site CTR. All you need to do is re-select this metric – 

This issue seems to crop up and kill all of the tables and charts that use the “Site CTR” metric. The fix is simple, so I’m confident that people can fix it with the advice I’ve given above. 

To end with…

I am increasingly a fan of Google Data Studio for the versatility it offers and the increasing number of new features coming out. It has automated a lot of my workload, so I see it as a very powerful tool for reporting and for identifying issues.

Hopefully, if you’re reading this, you’ve found this useful. Feel free to comment below if you need any help with setting this up.

Frequently Asked Questions

Am I still able to pull data by day from Google Search Console? 

In short, the answer is yes. Google Data Studio gives you this ability. 

Can I pull data by device type? 

Yes, you can add this as a breakdown dimension which will allow you to pull data over time by all three device types – Mobile, Desktop, and Tablet.


Google Search Console vs Google Analytics

Russ Jones at Moz recently came out with an article on the reliability of Google Search Console data, listing examples where they had tested some of the platform’s features against what they saw as the realities. It’s a really interesting read, and it poses the question of whether SEOs should even be using the… Continue reading

The Growth of Featured Snippets (2016 – 2017)

Featured Snippets were a pretty hot topic back in 2016, especially in the Finance sector, an area I work in. The topic has really only just begun, as we’ve experienced 700% growth in Answer Box/Featured Snippet appearances in this area. Google is making further changes to these snippets, which… Continue reading