(img: Moz featured snippet)
Image credit link: Front-end frameworks popularity of React, Vue, and Angular
You will need to take all the necessary steps to ensure your website is SEO friendly, allowing Google to crawl, render, and index your pages so they can achieve higher positions in the search results.
The diagram below may help you understand the process. An extra step, ranking, is added at the end; it can only come after the whole process has completed successfully.
Image credit link: more about this on Google’s documentation
Crawling, discovering pages and resources
Crawling is the content discovery phase, and a good place for SEOs and Devs to start their discussions. This is where Google tries to discover all your pages and resources. Since the advent of mobile-first indexing, Google will most likely be crawling with its Googlebot smartphone crawler.
- The crawler sends GET requests to your web server.
- Your web server responds with headers and file contents.
- This content is saved to the crawl queue for processing.
Crawl requests usually come from Mountain View, CA, USA, but Googlebot can also crawl from locations outside the USA. For this reason, it’s important to pay attention to any geo-based redirects that may prevent Googlebot from accessing essential areas of your site.
Processing, preparing HTML & Resources for rendering
During the project, SEOs and Devs can discuss how Google looks for resources (CSS, JS, images, etc.) and links to pages, which it uses later to build the page and establish relevance and context.
- Finds resources and links to pages and files needed to build the page.
- These are added to the crawl queue and cached.
- Prioritising and scheduling is based on the crawl queue.
Rendering, seeing pages the way users see them
Get an understanding of how the processed pages are placed in the render queue:
- Renders using “evergreen” headless Chrome
As you can imagine, rendering trillions of pages on the web is massively resource-intensive, so Google takes shortcuts: it renders pages using the resources that were cached during the processing stage.
However, this may cause unexpected results if the state of any shared resource has changed since it was cached. Take, for example, a timer-based popover stored in the cache for rendering. If enough time has elapsed that the popover appears on screen, Google may render it, and the result may not be the page you want Google to index.
It’s tempting to cite a 5-second rule for the rendering timeout. In reality there is no fixed 5-second timeout; rather, Google uses indicators such as network inactivity or blocked processing. Google has said that if a page renders in the Mobile-Friendly Test, it should be able to get indexed. Of course, getting indexed is not always a simple process.
Indexing and crawling content in two waves
Developers should note that the render queue is held in a cache, so it’s not real-time. Google renders the page at a later stage from the resources it crawled and stored, regardless of whether the page’s state has since changed because a timeout elapsed or a user interacted with the page.
Common Issues preventing Rendering
Even though your page is queued for rendering, it may not actually render as expected. A range of Dev-related factors can prevent Google from rendering your page properly, such as:
- Robots directives blocking important .js resources, preventing Google from crawling and therefore rendering the content.
- Network resources that time out or take too long to become reachable, so Google gives up and does not render.
- Lazy-loaded images, text, or other content that is integral to the page but not yet available in the cached render queue.
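As an illustration of the first issue, a hypothetical robots.txt like the one below would block the JavaScript Google needs to render the page (the paths are made up for this example):

```
# Hypothetical robots.txt: disallowing the bundle directory
# blocks the JavaScript Google needs to render the page.
User-agent: *
Disallow: /assets/js/

# One possible fix: explicitly allow render-critical resources.
# User-agent: Googlebot
# Allow: /assets/js/
```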
If any of these goes wrong as Google renders the content it has in cache, the page Google sees and indexes may be quite different from the page developers expect to be rendered.
The task of achieving that wonderfully clean, SEO-friendly render therefore becomes a shared project between SEOs and Devs, and there’s a lot each party can bring to the table. Expect an ongoing process of best practice and trial and error.
You may think that, because rendering is so important, what you see is what you get, and that Google renders your page just the way a visitor might see it. But there’s actually a wide variety of rendering methods a developer can choose to implement.
Common Rendering Methods
- Client-side rendering
- Server-side rendering
- Dynamic rendering
- Hybrid rendering
This article from Google covers a variety of rendering methods
For the purpose of this article, we will focus on three main rendering methods – Client-Side Rendering, Server-Side Rendering, and Dynamic Rendering.
Consider the costs of DOM vs. vDOM
It’s important that SEOs and Developers understand how the different frameworks use the DOM. Vue and React use a Virtual Document Object Model (vDOM), while Angular uses the real DOM. With a vDOM, individual elements can be updated without rebuilding the entire tree, which updating the real DOM can require.
PageSpeed has long been a major concern, now more than ever as Core Web Vitals become an integral consideration for web development and SEO alike. From a performance perspective, a vDOM is generally considered faster than working directly with the real DOM, and pages built with Vue and React are often faster than those built with Angular.
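The vDOM idea can be shown with a toy diff over two lightweight node trees: only the elements that changed produce patches, instead of rebuilding the whole tree. This is a deliberately simplified sketch; the node shape and patch format here are hypothetical, and React’s and Vue’s real reconciliation algorithms are far more involved.

```javascript
// Toy virtual-DOM sketch: nodes are plain objects, and diff() walks two
// trees, emitting a patch only where they differ.
function h(tag, text, children = []) {
  return { tag, text, children };
}

function diff(oldNode, newNode, path = "root", patches = []) {
  if (!oldNode) {
    patches.push({ op: "add", path, node: newNode });
  } else if (!newNode) {
    patches.push({ op: "remove", path });
  } else if (oldNode.tag !== newNode.tag) {
    // Different element type: replace this subtree.
    patches.push({ op: "replace", path, node: newNode });
  } else {
    if (oldNode.text !== newNode.text) {
      // Same element, changed text: patch just this node.
      patches.push({ op: "text", path, text: newNode.text });
    }
    const len = Math.max(oldNode.children.length, newNode.children.length);
    for (let i = 0; i < len; i++) {
      diff(oldNode.children[i], newNode.children[i], `${path}/${i}`, patches);
    }
  }
  return patches;
}
```

Diffing a two-item list where only the second item’s text changed yields a single text patch, which is the performance win the frameworks are after.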
It’s helpful for SEOs and Devs to work together, looking at the methods used for adding elements to the DOM and at how those elements end up being rendered.
Client-Side Rendering (CSR)
This is where most SEOs encounter issues, as unforeseen discrepancies appear between what Devs are rendering to the browser and what Googlebot can render successfully, if at all.
The worst-case scenario is Googlebot seeing a largely empty HTML document after rendering. Begin troubleshooting with some of these tools:
Server-Side Rendering (SSR)
If you decide to work with the SSR method, you may consider rendering tools such as:
As your discussions progress, it might become favourable for both Devs and SEOs to opt for a dynamic rendering methodology. Here, Google is offered a static version of the page, while other clients are given the dynamically rendered version.
The bots receive a version of the page designed to be machine-readable: a pared-back version of the content, with text and links, so it can be more easily parsed by crawlers.
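The dynamic-rendering fork is essentially a User-Agent check at the server. This hedged sketch shows the idea; the bot patterns are illustrative rather than exhaustive, and `serveStatic`/`serveApp` are placeholder callbacks standing in for “return the prerendered snapshot” and “return the client-side app”.

```javascript
// Illustrative bot-detection patterns; real setups (e.g. prerender.io's
// middleware) maintain much longer, regularly updated lists.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /yandex/i, /duckduckbot/i];

function isKnownBot(userAgent = "") {
  return BOT_PATTERNS.some((re) => re.test(userAgent));
}

// Hypothetical request handler: known crawlers get the prerendered
// static HTML, everyone else gets the normal client-side app.
function handleRequest(userAgent, serveStatic, serveApp) {
  return isKnownBot(userAgent) ? serveStatic() : serveApp();
}
```

In practice this check sits in front of your app as middleware, with the static snapshots produced by a prerender service or a headless browser.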
Image Credit: Google
Conducting your prerendering tests
Pre-rendering using a hybrid or dynamic rendering approach may be a little tricky to get your head around at first, as a Dev or an SEO, because it isn’t obvious how to test what the Googlebot crawler is seeing.
One way to get the conversation going between Devs and SEOs is to first look at a reliable solution such as prerender.io (https://prerender.io/). You could also use Puppeteer or Rendertron.
To test how Google interprets the rendered page, you can run the page through the Mobile-Friendly Test tool. This tool uses the Googlebot smartphone crawler to crawl the page, so it sees the prerendered version you have set up.
Another good way to test prerendering at scale is to use Screaming Frog SEO Spider (https://www.screamingfrog.co.uk/seo-spider/) to perform a crawl across your site, or sections of your site.
Configure Screaming Frog’s User-Agent to a Googlebot crawler, so that your prerender setup serves the crawler the prerendered version of the page.
Isn’t this cloaking?
SEOs and Devs may at first be a little concerned about switching the content served to Google by detecting the User-Agent. Any SEO who has been around for a while will be familiar with cloaking, which uses similar methods to serve Google different content from what the user sees.
Cloaking is against Google’s policies and may result in penalties if it’s implemented on your website. However, dynamic rendering is permitted and recommended by Google, as long as your prerendered version of the page does not show different content from the version served to other clients.
It’s a low-risk practice; however, to avoid any unforeseen issues, Developers and SEOs can read Google’s documentation explaining how to set up dynamic rendering.
Your Collaboration Journey
By Peter Mead
Peter Mead is a highly experienced award-winning SEO Consultant. Over the years, Peter has developed a heavy focus on Technical SEO and Content Marketing and is well equipped with a large variety of Advanced Technical SEO Skills.
Having had many years of experience in digital since 1997, before Google began, Peter thrives on challenges, and is appreciated for his analytical and strategic experience combined with his strong work ethic and friendly demeanour.