JavaScript SEO: Devs and SEOs getting on the same page
Organic website traffic is a tough battle, and sometimes SEOs and Devs have trouble seeing each other’s perspectives. Time to get on the same page!
StudioHawk
April 19, 2022

Both SEOs and Devs have a lot to deal with when it comes to getting JavaScript content indexed by Google. JavaScript websites are becoming far more commonplace, and it’s certainly no big SEO secret that many of these sites have indexing issues.



Organic website traffic is a hard-fought battle, and sometimes SEOs and Devs have trouble seeing each other’s perspectives. It’s well worth understanding each other’s objectives and getting on the same page so you can solve the many issues that may arise with JavaScript SEO.

Understanding JavaScript SEO

Optimising for JavaScript websites

JavaScript SEO deals with the Technical SEO elements of websites built with JavaScript, particularly how you can make it easier for search engines to crawl, render and index your website.

(Image: Moz featured snippet answering “What is JavaScript SEO?”)

Both SEOs and Devs need to focus on ways to allow search engines to crawl, render, and index your JavaScript site. That’s essentially what JavaScript SEO is. You need to make it easy for Google to index your site effectively.

As JavaScript websites become ever more popular, it’s essential to take the SEO journey and get on the same page with developers. Your website may have been developed with certain functionality that required JavaScript to achieve. However, it’s possible the SEO implications were never considered, resulting in poor organic search results.

JavaScript Frameworks: Vue, Angular, React

Devs working with JavaScript will already know it’s a popular web scripting language. Used in conjunction with HTML and CSS, JavaScript gives web pages dynamic functionality that standard HTML pages would not allow for, including all kinds of animations and interactivity features that can only be accomplished with JavaScript.

To achieve this dynamic functionality, many developers are electing to build websites with JavaScript frameworks alongside or instead of using a well-recognised CMS.

Popular JavaScript Frameworks

Some of the more popular JavaScript frameworks include:

  • React
  • Vue.js
  • Angular

(Image credit: front-end framework popularity of React, Vue, and Angular)

Although it’s quite doable to include various dynamic functionality within the HTML of a page, the rise of single-page and multi-page web applications has seen developers opting to build complete websites with JavaScript frameworks.

Using a JavaScript framework to build an entire website can often mean less success with Google. This approach can cause unforeseen issues with crawling, rendering, and indexing, often resulting in your site losing out on Google love.

Why you need JavaScript SEO

Most content-driven websites don’t have complex functionality; they exist mainly to acquire customers or sales. The move to these kinds of complex JavaScript websites often comes down to the developers’ preferences and experience.

Suffice it to say, unless these kinds of websites are properly tested for common crawling, rendering, and indexing issues, then you may not have much success with the search engines. You’ll need to get on the same page to understand JavaScript SEO and how to optimise for it, whether you’re the developer who built the website or the SEO who’s trying to get it found with Google.

You will need to take all the necessary steps to ensure your website is SEO friendly, allowing Google to crawl, render, and index in order to get higher positions in the search results.

How Googlebot processes JavaScript Websites

The three phases of JavaScript processing

Google now renders JavaScript content much better than it did several years ago. Rendering now happens near-instantaneously, whereas in the past it could take weeks.

However, just because Google has improved its JavaScript rendering, doesn’t mean that your pages will automatically get indexed if there are issues. To understand the kind of issues you could face, Devs and SEOs could schedule a meeting to walk through how Google does this.

Google processes JavaScript in three main phases:

  1. Crawling
  2. Rendering
  3. Indexing
The diagram below may help you get an understanding of the process. There’s an extra step added at the end for ranking, which can only come after the whole process has completed successfully.

(Image: the crawl, render, and index process. Image credit: Google’s documentation.)

Crawling, discovering pages and resources

SEOs and Devs can meet and discuss how crawling is the content discovery phase, where Google tries to discover all your pages and resources. Since the advent of mobile-first indexing, Google will most likely be crawling with the Googlebot smartphone crawler.

  1. The crawler sends GET requests to your web server.
  2. Your web server responds with headers and file contents.
  3. The content is saved to the crawl queue for processing.
Crawl requests usually come from Mountain View, CA, USA, but Googlebot can also crawl from locations outside the USA. For this reason, it’s important to pay attention to any geo-based redirects that may prevent Googlebot from accessing essential areas of your site.
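One quick sanity check is a script that requests a page with a Googlebot user-agent string and inspects any redirect before following it. A minimal sketch, assuming Node 18+ (built-in fetch) and run as an ES module; the URL is a placeholder:

```javascript
// Check whether a URL geo-redirects requests that identify as Googlebot.
// Assumes Node 18+ for built-in fetch; replace the URL with your own page.
const url = "https://www.example.com/";

const res = await fetch(url, {
  redirect: "manual", // don't follow redirects; inspect them instead
  headers: {
    "User-Agent":
      "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
  },
});

if (res.status >= 300 && res.status < 400) {
  console.log(`Redirected (${res.status}) to: ${res.headers.get("location")}`);
} else {
  console.log(`No redirect, status ${res.status}`);
}
```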

Google wants to crawl and store ALL of the resources needed to build the page, including HTML, JavaScript files, CSS, and any APIs. All this content then goes into the crawl queue, where your site’s specific crawl budget comes into play.

Processing, preparing HTML and resources for rendering

During the project, you can discuss how Google looks for resources (CSS, JS, images, etc.) and links to other pages, which it uses to build the page later and to establish relevance and context.

  1. Finds resources and links to pages and files needed to build the page.
  2. These are added to the crawl queue and cached.
  3. Prioritising and scheduling is based on the crawl queue.

Google will look for valid anchor tags with an href attribute. If links to resources and pages exist only in JavaScript, they may not get crawled or processed.
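To make the difference concrete, here is a hedged sketch of both patterns. Google can follow the first, but it does not click elements, so it will likely miss the second:

```javascript
// Crawlable: a real anchor tag with an href attribute.
// Equivalent HTML: <a href="/products">Products</a>
const crawlable = document.createElement("a");
crawlable.href = "/products";
crawlable.textContent = "Products";
document.body.appendChild(crawlable);

// Not reliably crawlable: navigation hidden behind a click handler, no href.
// Googlebot does not click, so this "link" may never be discovered.
const notCrawlable = document.createElement("span");
notCrawlable.textContent = "Products";
notCrawlable.addEventListener("click", () => {
  window.location.href = "/products";
});
document.body.appendChild(notCrawlable);
```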

At this stage, the most restrictive robots directives win, whether Google encounters them in the HTML or the JavaScript version of the page. For example, a noindex directive overrides an index, and a noindex in the HTML skips rendering altogether.
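This matters for JavaScript sites because injecting robots directives client-side is fragile. A hedged sketch of the two places a directive can live:

```javascript
// Reliable: a directive in the served HTML. If the HTML contains
// <meta name="robots" content="noindex">, Google skips rendering entirely,
// so any script that would have removed or changed it never runs.

// Fragile: a directive injected with JavaScript is only seen if the
// page actually gets rendered.
const robots = document.createElement("meta");
robots.name = "robots";
robots.content = "noindex";
document.head.appendChild(robots);
```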

Rendering, seeing pages the way users see them

Get an understanding of how the processed pages are placed in the render queue:

  1. Renders the changes made to the Document Object Model (DOM) by JavaScript.
  2. Renders using “evergreen” headless Chrome.

As you can imagine, rendering trillions of pages on the web is massively resource-intensive, so Google needs shortcuts. It renders from the resources that were cached during the processing stage.

However, this may cause unexpected results if the state of any shared resource has changed since it was cached. Take, for example, a timer-based popover whose resources were cached for rendering. If enough time has elapsed that the popover now covers the screen, Google may render the page that way, which may not be the page you want indexed.
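As a hedged illustration of that popover scenario (the element ID is hypothetical):

```javascript
// A timer-based popover: if rendering happens after the timer has fired,
// the cached render may show the popover covering the main content.
setTimeout(() => {
  const popover = document.getElementById("newsletter-popover"); // hypothetical ID
  if (popover) popover.style.display = "block";
}, 5000);
```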

It’s tempting to rely on the 5-second rule for the rendering timeout. In reality, there’s no fixed 5-second timeout; rather, Google uses indicators such as inactivity or blocked processing. Google has said that if a page renders in the Mobile-Friendly Test, it should get indexed. Of course, getting indexed is not always a simple process.

Indexing and crawling content in two waves

It’s also crucial for both SEOs and Devs to understand that Google goes about this whole process in what it calls two waves. During the first wave, the static HTML is indexed. Then, in the second wave, any additional JavaScript-rendered content gets indexed.

(Image: crawling and indexing in two waves)

If you’re developing JavaScript websites, it will help you to know that in May 2019, Googlebot was upgraded to the latest version of the headless Chromium browser. It’s now “evergreen,” making it fully compatible with up-to-date versions of JavaScript. But don’t be fooled: there can still be indexing issues, and these have a lot to do with crawl budget.

Even though the time from crawl-to-render is faster now than ever, there are still unforeseen issues your JavaScript page may face. These issues mostly depend on whether Google can actually render important content created with JavaScript.

Pages built with JavaScript have to go through the extra rendering step, which delays things slightly to save crawl budget. Rendering costs Google processing time and is therefore expensive, so pages are added to the Web Rendering Service queue for efficiency’s sake.

Developers should note that the render queue is held in a cache, so it’s not real-time. Google will render the page at a later stage based on the resources it crawled and stored in state. This is regardless of whether the state has now changed due to a timeout elapsing or a user interaction with the page.

Common Issues preventing Rendering

Even though your page is queued for rendering, it may not actually render as expected. A range of Dev-related factors can prevent Google from rendering your page properly, such as:

  • Robots directives blocking important .js resources, preventing Google from crawling and therefore rendering the content.
  • Timeouts, where network resources take too long to become reachable, so Google gives up and does not render.
  • Errors during code execution caused by state or network issues; because the content is built by JavaScript, these failures cannot surface as HTTP status code errors.
  • Lazy-loaded images, text, or other content that is integral to the page but not yet available in the cached render (see the sketch after this list).
  • Hyperlinks generated by JavaScript that are not crawlable; Google does not click links, so use static links if you want them rendered.
If any of these things goes wrong as Google renders the content it has in cache, the page Google sees and indexes may be quite different from the page developers expect to be rendered.
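On the lazy-loading point, remember that Googlebot does not scroll, so content gated behind scroll events may never load. A hedged sketch using IntersectionObserver, which Googlebot does support (the data-src convention here is an assumption):

```javascript
// Lazy-load images in a way the Web Rendering Service can still discover.
// Markup assumption: <img data-src="/images/hero.jpg" alt="Hero">
const observer = new IntersectionObserver((entries) => {
  for (const entry of entries) {
    if (entry.isIntersecting) {
      const img = entry.target;
      img.src = img.dataset.src; // swap in the real image source
      observer.unobserve(img);
    }
  }
});

document.querySelectorAll("img[data-src]").forEach((img) => observer.observe(img));
```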

JavaScript Rendering Methods

As already stated, the issues you may encounter getting Google to index your JavaScript content will mostly be determined by how successfully the page renders. The very best outcomes come when SEOs and Devs work together towards the common goal of a clean, efficient render of your JavaScript pages.

The task of achieving that clean, SEO-friendly render therefore becomes a shared project between SEOs and Devs, and there’s a lot each party can bring to the table. Expect an ongoing process of best practice and trial and error.

You may think that because rendering is so important, what you see is what you get, and that Google renders your page just the same way a visitor might. But there’s actually a wide variety of rendering methods a developer can choose to implement.

Common Rendering Methods

Some of the most common JavaScript rendering methods include the following:

  • Pre-rendering
  • Client-side rendering
  • Server-side rendering
  • Dynamic rendering
  • Hybrid rendering
  • Isomorphic JavaScript

This article from Google covers a variety of rendering methods.

For the purpose of this article, we will focus on three main rendering methods: Client-Side Rendering, Server-Side Rendering, and Dynamic Rendering.

Consider the costs of DOM vs. vDOM

Developers will know the DOM provides a tree structure of the web elements within an HTML document. Devs take advantage of JavaScript framework features to manipulate the web elements in the DOM. I’ll avoid diving too deep into the nuances in this article, but if you want to go deeper, here is Google’s documentation about the DOM.

It’s important that SEOs and Developers understand how the different frameworks use the DOM. Vue and React use a Virtual Document Object Model (vDOM), while Angular uses the real DOM. With a vDOM, individual elements can be updated without updating the entire tree, which is what a real DOM requires.

PageSpeed has long been a major concern, now more than ever as Core Web Vitals become an integral consideration for web development and SEO alike. From a performance perspective, a vDOM is considered faster than a real DOM, and pages built with Vue and React are often faster than those built with Angular.
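To make the distinction concrete, here is a hedged sketch contrasting a direct DOM mutation with a React-style vDOM update (the Counter component is illustrative only):

```javascript
import React, { useState } from "react";

// Direct (real) DOM update: you find the node and mutate it yourself.
function updateCountDirectly(newCount) {
  document.querySelector("#count").textContent = String(newCount);
}

// vDOM update: React re-runs the component, diffs the new virtual tree
// against the previous one, and patches only the changed text node.
function Counter() {
  const [count, setCount] = useState(0);
  return React.createElement(
    "button",
    { onClick: () => setCount(count + 1) },
    `Clicked ${count} times`
  );
}
```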

It’s helpful for SEOs and Devs to work together, looking at the methods used for adding elements to the DOM and how those elements are rendered as a result.

Client-Side Rendering (CSR)

Open and constructive conversations around the impact of Client-Side Rendering (CSR) can be eye-opening for SEOs and Devs. With this method, Googlebot renders the JavaScript content in the DOM when the page is fetched from the render queue. The aim is to build the page content using the browser’s capabilities rather than shipping it in the HTML.

This is where most SEOs encounter issues, as you start coming up against unforeseen discrepancies between what Devs render to the browser and what Googlebot can render successfully, if it renders properly at all.

The worst-case scenario is Googlebot seeing a largely empty HTML document after rendering. If that happens, begin troubleshooting with tools such as the Mobile-Friendly Test and the Screaming Frog SEO Spider, both covered later in this article.
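To see why that empty document happens, consider a minimal CSR page. This sketch assumes a bundle at /app.js and a #root container:

```javascript
// The server ships an almost-empty shell:
// <body><div id="root"></div><script src="/app.js"></script></body>
//
// Everything visible is injected client-side by /app.js:
document.getElementById("root").innerHTML = `
  <h1>Products</h1>
  <p>This content only exists after the script runs.</p>
`;
// If this script is blocked, errors out, or times out during rendering,
// Googlebot indexes the empty shell instead of the content.
```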

Server-Side Rendering (SSR)

Discussions between Devs and SEOs about Server-Side Rendering (SSR) will likely cover the method that is the most complex for developers to achieve. With SSR, it’s the server, not the client, that renders the JavaScript and passes the fully rendered page to the client.

Google still crawls the page, but since rendering has already been done on the server, it goes straight to the indexing stage. Because the JavaScript has already been executed to produce the page, Google treats it like any other HTML page, which means JavaScript-related rendering issues are avoided.

If you decide to work with the SSR method, you may consider rendering tools such as Next.js (for React), Nuxt (for Vue), and Angular Universal.
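As a hedged sketch of the idea (not a production setup), here is a trivial React component rendered server-side with Express and react-dom/server; it assumes express, react, and react-dom are installed:

```javascript
// Minimal server-side rendering sketch: the crawler receives finished HTML,
// so no client-side JavaScript execution is required for indexing.
const express = require("express");
const React = require("react");
const { renderToString } = require("react-dom/server");

const App = () => React.createElement("h1", null, "Rendered on the server");

const app = express();

app.get("/", (req, res) => {
  const html = renderToString(React.createElement(App));
  res.send(`<!DOCTYPE html>
<html><head><title>SSR sketch</title></head>
<body><div id="root">${html}</div></body></html>`);
});

app.listen(3000, () => console.log("SSR sketch on http://localhost:3000"));
```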

Dynamic Rendering

As you progress your discussions, it might become favourable for both Devs and SEOs to opt for a dynamic rendering methodology. Here we offer Google a static, prerendered version of the page while giving the normal JavaScript version to other clients.

This is a hybrid solution that serves content based on the user agent requesting it: a best-of-both-worlds approach, delivering static HTML to bots and crawlers while delivering the JavaScript version to real-world users when identified as such.

The bots receive a version of the page designed to be machine-readable. It contains a pared-back version of the content, with text and links, so it can be more easily parsed by crawlers.

The human users receive the JavaScript version of the page. They experience all the fully interactive features which you have designed and developed according to your user engagement strategy.
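A hedged sketch of the serving logic, assuming an Express app, Node 18+ for built-in fetch, and a prerender service with a /render endpoint (the endpoint shape and URLs are placeholders):

```javascript
// Dynamic rendering sketch: bots receive prerendered static HTML,
// humans receive the normal client-side JavaScript application.
const express = require("express");
const app = express();

const BOT_UA = /googlebot|bingbot|baiduspider|yandexbot|duckduckbot/i;
const PRERENDER_URL = "http://localhost:3001"; // placeholder prerender service

app.use(async (req, res, next) => {
  const ua = req.headers["user-agent"] || "";
  if (!BOT_UA.test(ua)) return next(); // humans fall through to the JS app

  // Bots: fetch static HTML from the prerender service and return it.
  const target = `${PRERENDER_URL}/render?url=${encodeURIComponent(
    `https://www.example.com${req.originalUrl}` // placeholder origin
  )}`;
  const prerendered = await fetch(target);
  res.status(prerendered.status).send(await prerendered.text());
});

app.use(express.static("dist")); // the client-side JavaScript build
app.listen(3000);
```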

(Image: the dynamic rendering model. Image credit: Google.)

Coordinating your prerendering tests

Pre-rendering using a hybrid or dynamic rendering approach may be a little tricky to get your head around at first as a Dev or an SEO, because you may not know how to test what the Googlebot crawler is seeing.

One way to get the conversation going between Devs and SEOs is to first look at a reliable solution such as prerender.io (https://prerender.io/). You could also use Puppeteer or Rendertron.

To test how the rendered page is being interpreted by Google, run the page through the Mobile-Friendly Test tool. It uses the Googlebot smartphone crawler, so it sees the prerendered version you have set up.

Another good way to test prerendering at scale is to use Screaming Frog SEO Spider (https://www.screamingfrog.co.uk/seo-spider/) to perform a crawl across your site, or sections of your site.

Configure the Screaming Frog User-Agent to use a Googlebot crawler, so that your prerender setup serves the spider the prerendered version of the page.

(Screenshot: User-Agent selection in Screaming Frog)

Also, make sure your spider is configured so that JavaScript rendering is not enabled. That way, the spider sees the prerendered version you have selected with your dynamic rendering method.

(Screenshot: JavaScript rendering setting in Screaming Frog)

 

Isn’t this cloaking?

SEOs and Devs may at first be a little concerned about switching the content served to Google by detecting the User-Agent. Any SEO who has been around for a while will be familiar with the idea of cloaking, which uses similar methods to serve Google different content from what the user sees.

Cloaking is against Google’s policies and may result in penalties if implemented on your website. However, dynamic rendering is permitted and recommended by Google, as long as your prerendered version of the page does not show different content from the version being served to clients.

It’s an extremely low-risk practice; however, to avoid any unforeseen issues, Developers and SEOs can read the Google documentation explaining how dynamic rendering can be set up.

Your Collaboration Journey

At the end of the day, Devs and SEOs can agree on the potential for JavaScript to cause SEO-related issues with crawling and indexing your website’s content. A collaborative JavaScript SEO approach is really about finding ways for Devs and SEOs to get on the same page.

Taking the time to understand the issues from both sides can help us get the SEO success we are hunting for. By working together for the same intended outcome, we can learn a lot more about the kinds of JavaScript issues faced together, and then make better-informed decisions about the kinds of rendering approach that are best for us.


 

By Peter Mead

Peter Mead - SEO Consultant

Peter Mead is a highly experienced, award-winning SEO Consultant. Over the years, Peter has developed a heavy focus on Technical SEO and Content Marketing and is well equipped with a large variety of advanced technical SEO skills.

With many years of experience in digital since 1997, before Google began, Peter thrives on challenges and is appreciated for his analytical and strategic experience combined with his strong work ethic and friendly demeanour.
