Google Webmaster Conference Mountain View 2019 Recap

Google provided rare access to its Search product managers for a small group of webmasters and search engine optimizers in Mountain View, CA. This is a collection of highlights from the event.

On November 4th, 2019, Google hosted a webmaster conference for a small group of SEOs and webmasters, emceed by Danny Sullivan (aka Google’s Search Liaison). Google used the event as an opportunity to share what it has been working on and to get feedback from people who regularly use Google Search Console (GSC) and optimize their sites for Google Search.

There weren’t any huge revelations, but there were several topics that were presented or discussed that offered a glimpse into how Google approaches search. Some of the presentations also confirmed specific theories related to how Google handles different kinds of page analyses, like de-duplication and JavaScript rendering.

The following is a compilation of the most important topics discussed at the conference.

Google Search lives in a mobile world

It was stated clearly that the Google product team is primarily focused on mobile search experiences. The focus on mobile was something Danny Sullivan highlighted early on in his opening remarks, and there were no desktop examples in any of the presentation slides.

Google’s increased focus on mobile isn’t new, but I left the event with the impression that desktop search is now just an afterthought. Very few search experiences have been added to desktop results in the past few years, and when they are added, they’re modified versions of features that have been on mobile results for a long time.

My takeaway is that it’s time for me to break my decades-old habit of focusing on desktop results. Mobile-first is no longer just an approach to web design; it’s the present and future of search.

Structured data isn’t just for rich results

There wasn’t a lot of new information about what Google is doing with structured data. The presenter gave examples of how they’re using it and recommended the audience visit the Search Gallery to learn more.

I have always contended that Schema.org structured data should be used regardless of whether Google has a rich result for it, and the presentation confirmed that stance. Even if Google doesn’t display rich results for certain Schema.org types, it can still use the structured data to verify and disambiguate the page content.
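To make that concrete, here’s a minimal, hypothetical sketch (the organization name and URLs are invented): Organization markup by itself doesn’t trigger a rich result, but it still tells Google who is behind the page and connects the entity to its other profiles.

```html
<!-- Hypothetical Schema.org markup. Organization alone doesn't produce a
     rich result, but it can still help Google verify and disambiguate
     the entity behind the page. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://twitter.com/example",
    "https://www.linkedin.com/company/example"
  ]
}
</script>
```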

CrUX is a bigger deal than you think

The Chrome User Experience Report (CrUX) was mentioned several times throughout the event. Presenters described CrUX as a data source the Google Search team likes because it’s based on real-world data. They implied that they’re using CrUX to make decisions, and the new Speed Report in GSC is evidence of that.

To my surprise, the Speed Report uses data from CrUX instead of Lighthouse or PageSpeed Insights. They may have done that because it’s the least resource-intensive way to populate results, but I think the decision might be shortsighted and flawed. The CrUX results in the Speed Report were wildly different from PageSpeed Insights and Lighthouse results, and many attendees in the room said they would prefer data from those tools. The team behind the report is aware that it isn’t perfect, which is why they’ve labeled it Experimental.

De-Duplication signal priorities and the importance of disambiguation

I enjoyed the de-duplication presentation. It’s the first time I’ve ever heard a Google representative give a detailed overview of how they approach duplicate content. The most interesting part of the presentation to me was the discussion of signals: the page Google picks from a set of duplicates is chosen based on prioritized signals. Their first concern is determining the correct owner and page for the content so they can avoid hijacking. Their second concern is UX; the page that they think has the better user experience – performs better, is secure, etc. – gets preference. The signals that get the least amount of priority are the hints a site provides about itself: redirects, rel=“canonical”, and sitemaps.

Knowing that Google gives the least priority to redirects, rel=“canonical” hints, and sitemaps is essential because it means it’s up to the site to do everything it can to disambiguate its pages. That’s especially true for international pages. It’s not enough to use hreflang and put pages inside a country-specific folder; sites need to disambiguate the content further, such as by including the country name in the page title and on the page itself. Doing that can also help Google understand that the page is relevant for a locale rather than a duplicate sending a false signal.
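As a sketch of what that looks like in practice (example.com and its URLs are hypothetical), the hreflang annotations declare the locale variants, while the country name in the title adds the on-page disambiguation the presentation suggested:

```html
<!-- Hypothetical example: annotations in the <head> of the US English page.
     Each locale variant must list the full set reciprocally on every page. -->
<title>Acme Pricing for United States Customers</title>
<link rel="alternate" hreflang="en-us" href="https://example.com/us/pricing/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/pricing/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/pricing/" />
```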

JavaScript can adversely affect rendering and indexing

The rendering presentation discussed the impact JavaScript can have on crawling and indexing. The presenters emphasized that the amount of JavaScript on a page affects how the page is crawled, as well as how many pages they will crawl and index. They also pointed out that they limit CPU consumption when rendering JavaScript, and will even stop a script from running if it hits a certain threshold.
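Here’s a minimal, hypothetical illustration of the risk (the markup and product name are invented): content that only appears after a script executes depends on Google’s render budget, while the same content in the initial HTML response does not.

```html
<!-- Fragile: the main content only exists after this script runs, so it
     depends on Google rendering the page within its CPU limits. -->
<div id="product"></div>
<script>
  document.getElementById('product').innerHTML = '<h1>Acme Widget</h1>';
</script>

<!-- Safer: the same content ships in the initial HTML response and is
     indexable even if rendering is deferred or the script is stopped. -->
<div id="product">
  <h1>Acme Widget</h1>
</div>
```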

Paul Haahr provides insights into how Google approaches Search

The keynote presentation was given by Paul Haahr, Distinguished Engineer for Search, and I recommend going through it. It provides insight into how Google approaches search and solves problems. Paul gave interesting examples of how they approach understanding synonyms, similes, and non-compositional compounds. In particular, he discussed the problems they encountered and explained why they prefer to identify patterns of failure and fix them algorithmically instead of manually.

Download presentations

Except for Paul Haahr, I’ve removed all of the presenters’ names. Google specifically requested that we don’t mention their names.

“We encourage you to discuss topics presented at the event with colleagues and other members of the Webmaster community, but please refrain from revealing specific individuals in those discussions.”

It’s not 100% clear whether they meant individual discussions with product managers or the presentations, so I’m playing it safe and assuming they meant everything. They said they might end up releasing videos of the presentations, so the Chatham House Rule they invoked may be for naught.

Download presentations (Zip file with PDF files)
