For the past several years, Google has been trying to make the web a better place. It’s partly motivated by altruistic employees who want to make finding and browsing sites less annoying and partly driven by Google’s business needs.
Progressive Web Apps (PWAs) are an excellent example of Google attempting to make the web better. PWAs are designed to make a site or web app always accessible, regardless of network conditions. Google has been pushing web developers to make PWAs for several years. They’ve incorporated PWA audits into Lighthouse, and they’ve touted PWAs as a better alternative to creating native apps. However, when you peel back the layers, it becomes apparent that Google needs PWAs more than sites need PWAs.
Another example of Google pushing for a better mobile user experience is AMP. AMP, which started as an acronym for Accelerated Mobile Pages, is a more heavy-handed approach Google took to make the web better. Google wanted to fix slow sites, so they created a web component framework with strict parameters and thresholds. Google communicated that AMP was its preferred format for mobile, and they pressured news publishers to use it by making it a ranking factor for the Google News Top Stories carousel on mobile search results. AMP has also been controversial because content is cached and served from Google’s servers instead of the site’s own host.
A lot has changed since AMP and PWAs were launched. AMP, in particular, has matured into a viable open source web component framework and is set to be decoupled as a ranking factor on mobile search results for Top Stories by 2021. The ranking factor change was primarily driven by Google’s creation of Core Web Vitals, which sets a new UX standard for all sites.
Core Web Vitals is a significant development that will affect how sites are created, optimized, and maintained for the foreseeable future. The guiding principle will shift from “create high-quality content” to “create a high-quality user experience.”
The data and details behind Core Web Vitals
A couple of weeks after Core Web Vitals was announced, the Chromium Blog published a post on the science behind Web Vitals. The article described how the Chromium team analyzed millions of page impressions and found that sites that met or exceeded the Core Web Vitals “Good” thresholds had significantly lower abandonment rates.
- 24% less likely to abandon page loads
- 22% less abandonment for news sites
- 24% less abandonment for shopping sites
From the outset, Google has taken a quantitative approach to determining the thresholds used to grade site performance. The launch of the Speed report in Search Console was a clear example of that approach. They chose to utilize the Chrome User Experience Report (CrUX) because they wanted to analyze sites with data from real users worldwide.
At the 2019 Google Webmaster Conference in Mountain View, attendees gave Google feedback about the significant difference between CrUX data and the results of manually running PageSpeed Insights and Lighthouse. The product team stated they were aware of the discrepancy but believed using data from CrUX was the best approach. To alleviate concerns, they reiterated that the Speed report should be considered experimental and that they were open to feedback.
It appears that Google did consider the feedback. Before Core Web Vitals was released, the Speed report flagged most pages with nearly perfect Lighthouse scores as “Needs Improvement.” When Core Web Vitals was released, those same pages changed to “Good” while the report continued to use the same data source.
Google was able to determine better metrics that indicated speed and performance while still using CrUX, and those new metrics are now known as the Core Web Vitals. The metrics that define Core Web Vitals are:
- LCP: Largest Contentful Paint
- FID: First Input Delay
- CLS: Cumulative Layout Shift
It’s important to note that Google has stated that these could change in the future, but these are what they’re primarily focused on for now.
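As a rough illustration, the “Good” thresholds Google has published for the three metrics (2.5 seconds or less for LCP, 100 milliseconds or less for FID, and 0.1 or less for CLS) can be expressed as a simple pass/fail check. The sketch below is illustrative only; the function and field names are mine, not an official API.

```typescript
// Published "Good" thresholds for the three Core Web Vitals.
const GOOD_THRESHOLDS = {
  lcpMs: 2500, // Largest Contentful Paint: 2.5 seconds or less
  fidMs: 100,  // First Input Delay: 100 milliseconds or less
  cls: 0.1,    // Cumulative Layout Shift: 0.1 or less
};

interface PageMetrics {
  lcpMs: number;
  fidMs: number;
  cls: number;
}

// A page only counts as "Good" when all three metrics pass.
function meetsCoreWebVitals(m: PageMetrics): boolean {
  return (
    m.lcpMs <= GOOD_THRESHOLDS.lcpMs &&
    m.fidMs <= GOOD_THRESHOLDS.fidMs &&
    m.cls <= GOOD_THRESHOLDS.cls
  );
}
```

Note that a single failing metric is enough to miss the “Good” classification, which is why a fast page with a shifting ad slot can still fall short.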
Largest Contentful Paint
The Largest Contentful Paint (LCP) attempts to measure when a page’s primary content has loaded. Google determined that a page isn’t beneficial to users if they can’t see the main content.
To optimize for LCP, Google recommends focusing on asset delivery and speeding up browser rendering. A useful resource for optimizing for LCP is Coywolf’s Performance and Speed Optimization Guide.
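Under the hood, browsers surface LCP through the PerformanceObserver API with the largest-contentful-paint entry type: a new entry is emitted whenever a larger element finishes rendering, and the last entry reported becomes the page’s LCP. The candidate-selection logic can be sketched outside the browser; the RenderRecord shape here is a simplification of the real performance entries.

```typescript
interface RenderRecord {
  startTime: number; // when the element rendered, in ms since navigation start
  size: number;      // the element's rendered area in CSS pixels
}

// Mimics how the browser tracks its LCP candidate: a newly rendered
// element only replaces the current candidate if its area is larger.
// The final candidate's startTime is the page's LCP.
function largestContentfulPaint(records: RenderRecord[]): number | null {
  let candidate: RenderRecord | null = null;
  for (const r of records) {
    if (candidate === null || r.size > candidate.size) {
      candidate = r;
    }
  }
  return candidate ? candidate.startTime : null;
}
```

With records for a headline at 100 ms, a hero image at 800 ms, and a small widget at 1,200 ms, the function returns 800: the late widget never displaces the larger hero image as the candidate.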
First Input Delay
The First Input Delay (FID) typically occurs around the Time to Interactive (TTI) event. It measures the delay between a user’s first interaction with the page, such as a click, tap, or key press, and the moment the browser can begin processing that input. Google states that long FIDs result in a poor UX.
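In the browser, FID is derived from the first-input performance entry, which records both when the interaction happened (startTime) and when the browser was free to start running event handlers (processingStart). The delay is the difference between the two, as this small sketch shows; the InputTiming type is a simplified stand-in for the real entry.

```typescript
interface InputTiming {
  startTime: number;       // when the user clicked, tapped, or pressed a key
  processingStart: number; // when the main thread could begin handling it
}

// FID is the gap between the interaction and the moment the browser
// was able to start processing it. A busy main thread widens this gap.
function firstInputDelay(e: InputTiming): number {
  return e.processingStart - e.startTime;
}
```

A tap at 3,000 ms that the main thread could not pick up until 3,120 ms yields an FID of 120 ms, which is already past the 100 ms “Good” threshold.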
Cumulative Layout Shift
The Cumulative Layout Shift (CLS) addresses the frustrating experience of having content shift after the page appears visually loaded. Layout shifts typically occur when a slower third-party server is serving a page asset or ad. Aside from the abrupt shifting of the content, it can also result in unintentional clicks on the wrong element.
Optimizing for CLS can sometimes be as simple as specifying the dimensions of an image or video. For example, if you have an ad that always uses the same dimensions, adding the height and width to the block that contains the advertisement should keep the page content from shifting if the ad is slow to load. It can get more complicated if an ad server dynamically displays ads with different dimensions or if the content is dynamically injected after the page load.
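For reference, each individual layout shift is scored as the impact fraction (the share of the viewport touched by the shifting content across both frames) multiplied by the distance fraction (the distance moved divided by the viewport’s largest dimension), and CLS sums those scores over the page’s lifetime. The sketch below simplifies the geometry to a single full-width block shifting vertically, so it is an approximation rather than the browser’s exact computation.

```typescript
// Approximate score for one layout shift, assuming a full-width block
// of the given height being pushed straight down by moveDistance pixels.
function layoutShiftScore(
  viewportWidth: number,
  viewportHeight: number,
  blockHeight: number,  // height of the shifting block, in px
  moveDistance: number  // how far down it was pushed, in px
): number {
  // Impact fraction: the union of the block's old and new positions,
  // capped at the viewport, divided by the viewport height.
  const impactHeight = Math.min(blockHeight + moveDistance, viewportHeight);
  const impactFraction = impactHeight / viewportHeight;

  // Distance fraction: move distance over the viewport's largest dimension.
  const distanceFraction =
    moveDistance / Math.max(viewportWidth, viewportHeight);

  return impactFraction * distanceFraction;
}
```

A 300 px tall, full-width ad that pushes content down 100 px in a 400×800 viewport scores 0.0625, consuming well over half of the 0.1 “Good” budget in a single shift. Reserving the slot’s dimensions up front makes that score zero.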
How Core Web Vitals will impact content strategy and web development
Google has a history of pressuring sites to adhere to the standards they set. Recent examples include the push for SSL, mobile-friendly layouts, and AMP for Google News. Core Web Vitals is their next foray into getting sites to conform to its standards by making it a ranking signal.
Except for AMP’s initial rollout, the changes Google is trying to get sites to make are relatively positive. The complaints about Google concern its tactics and the lack of choice sites have because of its monopoly. However, what they’re trying to achieve is a faster, more secure, and more reliable user experience for all netizens. That in itself is a good thing, and it’s something I can readily get behind.
Core Web Vitals represents an evolution in Google’s attempt to “fix” the internet. All of their past efforts have led to this initiative, and I think it will be seen as a watershed moment for site strategy and development.
Faster and fewer ads
Advertising has run amok for several years. The variety and number of ads that can be shown on a page have grown, bringing with them excessive code bloat, slow asset delivery, and an abysmal UX. To avoid abusive ad experiences, users have turned to software that blocks ads and trackers.
Publishers have attempted to thwart ad blockers by implementing server-side software that either circumvents them or displays a modal over the content asking the visitor to turn off ad blocking. Both approaches are imperfect stopgaps, not long-term solutions. That’s especially true with the onset of Core Web Vitals.
Core Web Vitals will force publishers to improve how they deliver ads. Publishers will have to serve fewer ads, deliver them faster, and do it without making the page layout shift.
Expect to see the following changes starting in 2021:
- The speed of ad servers will become significantly faster.
- Ad delivery services will develop new methods to reduce the chance of layout shifts.
- Publishers will start to move towards localized ad delivery.
- There will be fewer ads on pages, but they will be served at a higher cost. The higher cost will be offset by higher click-through and conversion rates.
- There will be a significant increase in commerce and affiliate related content to make up for lost ad revenue.
Increased use of frameworks that don’t degrade over time
In general, sites perform worse over time. The degradation is caused by technical debt, new features, additional ads on a page, back-end and front-end mistakes, code bloat, and multiple developers working on the same site. Development teams also rarely run performance tests or enforce strict code budgets and speed thresholds.
The best way to prevent site degradation is to use a strict performance-based framework. That type of framework already exists, and it’s one that Google wholeheartedly recommends: AMP.
AMP has a checkered past, and I prefer not to use it for my sites, but it’s one of the best solutions for keeping development teams from degrading sites. After working with enterprise sites and large development teams, I’m convinced that the best way to create and maintain a high-performing site is to use AMP.
In 2021, I expect more sites to convert entirely to AMP, similar to what Axios did in February 2020. I also predict that we’ll see more frameworks and services that force development teams to adhere to strict code budgets, performance thresholds, and utilize the CrUX API for validation metrics.
How to prepare for Core Web Vitals
Faster sites with fewer ads will change what users expect from publishers. Publishers who don’t adapt quickly enough won’t just suffer from less visibility and traffic from Google; they will also see declines in direct traffic as visitors eschew them.
If you are a publisher or work with publishers, now is the time to take action. The following questions should be considered and addressed before 2021:
- What will it take to get your site within the “Good” thresholds of Core Web Vitals?
- If you’re able to update your site performance, can it be easily maintained without it degrading over time?
- Is converting your site to AMP or a similar framework a viable option?
- How will you keep or grow revenue with fewer ads?
- Does it make sense to run a local ad server like Revive and fully control your ad delivery performance?
- Is it feasible to create a paywall for premium content and services similar to Business Insider’s Premium offering or CNBC’s Pro membership?
- Can you replace the loss of ad revenue with affiliate links for products and services?
Google has intentionally given publishers time to prepare for the new search signals. I recommend not wasting that time and getting ahead of the competition.