DeepCrawl Ultimate Guide To JavaScript For SEO
Contents
The state of the modern web
What is JavaScript?
Pre-rendering
Client-side rendering
Server-side rendering
Dynamic rendering
Hybrid rendering
Isomorphic JavaScript
Conclusion
The state of the modern web
To succeed in the current digital landscape, it is
crucial to understand how JavaScript works as well
as how to optimise JavaScript-powered websites
for both users and search engines. Why? Because
JavaScript is all around us.
Image source: HTTP Archive
The purpose of this guide is to address and alleviate some of the fears around JavaScript by explaining exactly how it works, how it can impact website performance and what you can do to make sure your JavaScript-powered website performs in search.

JavaScript isn't going anywhere, so let's get more familiar with it.

JAVASCRIPT HAS BECOME A CRUCIAL PART OF THE MODERN WEB, SO WE NEED TO DO OUR PART TO LEARN MORE ABOUT IT AND HOW IT CAN IMPACT OUR WEBSITES.
Image source: Twitter
What is JavaScript?
To learn more about JavaScript, the first place to start is understanding exactly what it is and how it works on the websites we manage.

JavaScript is essentially a programming language that is used to implement the interactive or dynamic elements of a website, including personalisation, dynamically updating content and notifications. The HTML and CSS of a page usually form the main structure and styling of a page, but JavaScript is what brings it to life.

"Every time a web page does more than just sit there and display static information for you to look at — displaying timely content updates, interactive maps, animated 2D/3D graphics, scrolling video jukeboxes, etc. — you can bet that JavaScript is probably involved." 1

Emma Wedekind,
UX Engineer at LogMeIn
“There are many benefits and arguments for using JavaScript, and the fact
it’s taking over the web, mobile and desktop landscapes only solidifies its
position in the industry. JavaScript is unique. Times have changed where
the server environment does the heavy lifting, this is down to the client
(browser) now. We deploy static JavaScript assets and the server simply
acts as a JSON API gateway, transferring information between database
and client. This helps clean up architecture and allows us to bring sensible
engineering patterns to the front-end development landscape. User
interactions are much more powerful, we can validate things such as email
addresses, passwords and complex logic on the fly - without full page
reloads and waiting times. The user’s perception of speed is altered with a
powerful JavaScript environment.
All in all, the benefits of JavaScript are immense and seeing that 95.1% of
websites use JS doesn’t surprise me.”
Ana Cidre,
Developer Advocate &
Engineer at Ultimate Angular
How JavaScript works

You can see what the DOM looks like for your website by using the 'Inspect' element in your browser, which will lay out the different head and body elements of a page.

Image source: Computer Hope
Once a page's structure has been defined, JavaScript can then be triggered to make any necessary changes and modify the DOM and HTML, meaning that the final view of the page can be updated.
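To make this concrete, here is a minimal sketch (the element ID and injected text are hypothetical) of a script that modifies the DOM after the initial HTML has been parsed:

    // Run once the initial HTML has been parsed and the DOM is ready.
    document.addEventListener('DOMContentLoaded', () => {
      // 'latest-offers' is a hypothetical element ID used for illustration.
      const container = document.getElementById('latest-offers');
      if (container) {
        // Modify the DOM: inject content that was not in the initial HTML response.
        const heading = document.createElement('h2');
        heading.textContent = 'Offers loaded by JavaScript';
        container.appendChild(heading);
      }
    });

Anything added this way exists only in the rendered DOM, not in the initial HTML response, which is exactly why rendering matters so much for search engines.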
"A very common use of JavaScript is to dynamically modify HTML and CSS to update a user interface, via the Document Object Model API. If the JavaScript loaded and tried to run before the HTML and CSS were there to affect, then errors would occur."

MDN Web Docs 2
“Until a few years ago, SEOs focused on HTML and CSS. HTML has traditionally been responsible
for content and CSS for the layout, design, and visual effects. These two languages can craft aesthetically appealing, functional, flat pages.
If Google is a library, a website is a book. A site built with JavaScript is a pop-up book. With
JavaScript, a dozen lines of flat HTML sent from your server unfurl and execute a web version of
programming code. The desire for personalized, interactive, and engaging user experiences is
why today more than 95% of sites use JavaScript. For search engines and SEOs to experience
site content, they need to execute JavaScript.”
Jamie Alberico,
SEO Product Owner at
Arrow Electronics
External JavaScript, on the other hand (as opposed to JavaScript written inline within the HTML), is included via a link to a separate file that the browser requests, e.g.:
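A minimal illustration (the file path is hypothetical):

    <!-- The browser makes a separate request for /assets/main.js and then executes it. -->
    <script src="/assets/main.js"></script>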
JavaScript is an embedded scripting language, meaning that it can be embedded into different applications
using an API (Application Programming Interface). This flexibility is one of the reasons why JavaScript can
be so powerful for developers when building websites.
“Application Programming Interfaces (APIs) provide you with extra superpowers to use in your
JavaScript code.”
If you’d like to brush up on some more of the key terms and concepts around JavaScript and rendering,
then make sure you read this guide on JavaScript fundamentals.
A JavaScript framework contains a set of libraries, components and instructions for building websites and user interfaces with code.

Each framework comes with its own positives and negatives, so choosing the right one will depend on a developer's needs and the specifications of the project they're working on.

"At their most basic, JavaScript frameworks are collections of JavaScript code libraries that provide developers with pre-written JS code to use for routine programming features and tasks—literally a framework to build websites or web applications around."

Skillcrush 4
These are the main JavaScript frameworks:
Backbone, Meteor, Ember, React, Reason, Svelte, Elm, Preact, Inferno, Polymer

Image source: Martin Splitt, Frontend Connect
Image source: Bartosz Góralewicz, SMX
These are the different steps involved in the JavaScript rendering process:
Image source: Google Developers
1 “JavaScript: Typically JavaScript is used to handle work that will result in visual changes.
2 Style calculations: This is the process of figuring out which CSS rules apply to which elements. They
are applied and the final styles for each element are calculated.
3 Layout: Once the browser knows which rules apply to an element it can begin to calculate how
much space it takes up and where it is on screen.
4 Paint: Painting is the process of filling in pixels. It involves drawing out text, colors, images, borders,
and shadows, essentially every visual part of the elements.
5 Compositing: Since the parts of the page were drawn into potentially multiple layers they need to be
drawn to the screen in the correct order so that the page renders correctly.” 5
"One of the biggest struggles which search engines (and SEOs) have with JavaScript is that it often breaks our working model of what a "page" is. We're used to a world where content lives in HTML code, on a webpage, which is represented by a URL; and for the most part, pages are generally consistent in form and behaviour. This model sits at the very heart of how Google crawls, processes, and evaluates content.

With JavaScript, that paradigm shatters. We lose the connection between URLs, pages and content, as the browser fluidly changes based on user interaction. We move into a world of "states" and "views", which don't neatly fit our model. This makes it hard for search engines to understand what a "page" is, nevermind the challenges of accessing, evaluating and associating the value of content with a URL.

Until search engines have a way to handle this new world, it's imperative that SEO practitioners understand how JavaScript websites work, and can wrangle them into a format which Google can consume and understand. In most cases, that's as simple (and as complex) as ensuring that the website behaves like a "normal" site when JavaScript isn't enabled.

Make no mistake, JavaScript websites are the future. Consumers expect and demand the kinds of rich experiences which only apps can deliver. But if we don't step in, steer best practice and level-up the development community, the web as we know it will break."

Jono Alderson,
Mad Scientist at Yoast
We've put together the latest updates on how the main search engines are currently equipped for rendering, as well as some key considerations for each one.

Google

Google is one of the few search engines that currently renders JavaScript, and provides a lot of documentation and resources on JavaScript best practice for search. This means we're able to build a pretty clear picture of what we need to do to get our websites indexed in Google's SERPs (Search Engine Results Pages).
Rendering

Image source: Martin Splitt, AngularUp Conference (rendering combines templates and data)
To carry out this process, Googlebot uses a headless browser for its web rendering service (WRS). A headless browser is essentially a browser without the visual elements, which outputs rendered code rather than a visually rendered page. Google's WRS is based on Chrome 41, which was launched in 2015, and is limited by the features of the Chrome version it is using. 6

The most recent version at the time of writing is Chrome 72, so there is a gap between the latest browser functionalities and how Googlebot is able to render content. To put this in perspective, since Chrome 41 was released, 892 new Chrome features have been added.
“The last time I checked my calendar, it was 2019. I wish search engines were perfect at
rendering JavaScript, but unfortunately, that's not the case. The key is to know search engines’
limitations and how to deal with them.
Google’s Web Rendering Service uses a 4-year old browser for rendering. You can’t change that
but you should ensure that Google can access your content by using proper tools, like the URL
Inspection Tool, Chrome 41, Mobile-friendly Test and Rich Results Test. In addition, make sure
Google has indexed your content by using the “site” command.
If something is wrong, you can analyze what errors Google gets while
rendering. Maybe you forgot about polyfills or transpiling to ES6? Sit with
your developer and start talking about the issue. If you can’t find
anything that helps, you may consider moving to hybrid
rendering or dynamic rendering.”
Tomek Rudzki,
R&D Specialist at Elephate
How Google's rendering process impacts search

Google processes pages in two waves of indexing: a page is crawled and its HTML is indexed in an instant first wave, then the page is rendered as rendering resources become available in a second wave of indexing, with any newly discovered links sent back to be crawled.

Martin Splitt,
Webmaster Trends Analyst at Google 7

Image source: Google I/O 2018
When resources do become available, there isn't a specific way of prioritising the pages that will be rendered first. Google's John Mueller explained that any prioritisation is done in the same way as for regular crawling and indexing.

Google doesn't have a separate way of prioritising pages for rendering.

John Mueller,
Google Webmaster Hangout 8

What is the gap between the first and second wave of indexing then? According to Google's Tom Greenaway and Martin Splitt during Chrome Dev Summit 2018, it could take "minutes, an hour, a day or up to a week" for Google to render content after a page has been crawled.

If your website gets stuck between these two waves of indexing, any new content you add or any changes you make to your website won't be seen or indexed for an undetermined amount of time. This will have the biggest impact on sites that rely on fresh search results, such as ecommerce or news sites.

John Mueller,
Google Webmaster Hangout 10
The team at Google are aware that this isn't an ideal situation for website owners and are working on updating their rendering services. At Chrome Dev Summit 2018, Martin Splitt announced that Google will be updating its WRS so that it stays up to date with Chrome's release schedule, meaning that the WRS will always use the latest version of Chrome.

Bearing all of this in mind, you can start to see that Google needs some additional help to render your modern JavaScript websites and applications. We'll go over some of the things you can do later on in this guide.

Bing

"In general, Bing does not have crawling and indexing issues with web sites using JavaScript, but occasionally Bingbot encounters websites heavily relying on JavaScript to render their content, especially as in the past few years. Some of these sites require far more than one HTTP request per web page to render the whole page, meaning that it is difficult for Bingbot, like other search engines, to process at scale on every page of every large website. […] specifically for search engine crawlers, we […] scarier for the SEO community than getting penalized for cloaking. The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being that the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking."

Fabrice Canel,
Principal Program Manager at Bing

Even though Bing can render in some capacity, it isn't able to extract and follow URLs that are contained within JavaScript.

"Don't bury links to content inside JavaScript."

Bing Webmaster Guidelines 11

Yahoo

"[…] JavaScript user agents and crawlers with […]"

Yahoo Webmaster Resources

Yandex

Yandex's documentation explains that the search engine doesn't render JavaScript and can't index any content that is generated by it. If you want your site to appear in Yandex, make sure your key content is returned in the HTML upon the initial request for the page.

"Make sure that the pages return the full content to the robot. If they use JavaScript code, the robot will not be able to index the content generated by the script. The content you want to include in the search should be available in the HTML code immediately after requesting the page, without using JavaScript code. To do this, use HTML copies."

Yandex Support 13
Image source: Moz

"Bing, Yahoo, AOL, DuckDuckGo, and Yandex are completely JavaScript-blind and won't see your content if it isn't in the HTML."

Bartosz Góralewicz,
Co-founder of Elephate 14

Take a look at the full article covering the experiment and results to learn more about Bartosz's conclusions.

Martin Splitt,
Google Webmaster Hangout 15

Rachel Costello
Image source: Slack
When a user tries to access a page that uses JavaScript, their browser will firstly receive HTML, CSS and JavaScript in packets of data. The code from these packets is then parsed and mapped out to create the DOM, which defines the structure of the page. This structure is combined with instructions from the CSS about the styling of the different elements on the page, which creates the render tree. This is what the browser uses to lay out the page and start painting pixels to the screen.

JavaScript causes an issue for browser rendering because it has the potential to modify the page. This means that rendering has to be paused each time a new script is found, and the structuring and styling of the page is put on hold to keep up with any changes that JavaScript might make to the page.

To learn more about how browsers render JavaScript, this guide explains the process in more detail.
Each browser has its own rendering environment and can have its own unique challenges. This is why it’s
crucial to test how your website loads across different browsers. Find out the main browsers that are used
by people visiting your website, and test how those render as a priority. You can find this information in the
Audience section of Google Analytics under ‘Technology’.
It’s possible to make alterations to how your content is delivered to users and search engines by
implementing different rendering methods. You can alter how much of the rendering work falls on the client,
i.e. the user’s browser or the search engine bot requesting the page. In this section of the guide, we’ll take a
look at how some of these different methods work.
Pre-rendering
Pre-rendering a page is when the client creates the structure of a page which is shown before the final
content is rendered to the screen. It shows a static snapshot of the page immediately which can be useful
for showing content to search engines on page load without them having to render anything. However, this
snapshot is not the complete version of the page.
“As SEOs, it's important that we prioritize our sites' users and ensure search engines are able to
grasp our experience. Luckily for us, there are a variety of solutions to ensure that both users and
bots can receive the optimal experience. Two top options include:
1. Do nothing. Perhaps the simplest technical solution, we can test whether or not a more
JavaScript-heavy solution has any effect on performance (including: rankings, crawl logs,
traffic, and engagement metrics). If you are changing experiences, (if possible) attempt on a
smaller section of the site. This will enable any kinks to be ironed out early in the process.
Client-side rendering
Client-side rendering is when the client, such as the user’s browser or search engine crawler, does the work
of processing and rendering any JavaScript. The server will respond to the initial request for the page, but
the rest is down to the client. Client-side rendering is often advised against because the time and expense
of processing JavaScript falls on users’ devices and on search engines. Also, most search engines can’t
even render JavaScript in the first place.
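As a rough sketch of what client-side rendering involves (the API endpoint and element ID are hypothetical), the initial HTML contains little more than an empty container, and the browser has to fetch data and build the content itself:

    // Client-side rendering sketch: the content below only exists once this
    // script has been downloaded, parsed and executed in the client.
    async function renderApp() {
      // '/api/products' is a hypothetical endpoint used for illustration.
      const response = await fetch('/api/products');
      const products = await response.json();

      const app = document.getElementById('app');
      app.innerHTML = products
        .map((product) => `<h2>${product.name}</h2><p>${product.description}</p>`)
        .join('');
    }

    renderApp();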
Image source: Google I/O 2018
PROS
Puts less strain on the server.

CONS
Puts more strain on the CPU (Central Processing Unit) of the user's device.
Addy Osmani,
16
Engineering Manager at Google
Server-side rendering
With server-side rendering, the server does the heavy lifting and will render any JavaScript on the page,
meaning that it can send a fully processed page straight to the client. All the client has to do is display the
finished content. Server-side rendering allows search engines and social media crawlers to index your
content, even the ones that struggle with rendering.
Image source: Google I/O 2018
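As a simplified sketch only (the route, data and markup are hypothetical, and real projects usually lean on a framework such as Next.js or Nuxt), a Node.js server using Express could return fully rendered HTML on the first request:

    // Server-side rendering sketch: the server builds the finished HTML,
    // so clients and crawlers receive the content without executing JavaScript.
    const express = require('express');
    const app = express();

    app.get('/products/:id', (req, res) => {
      // In a real application this data would be looked up for req.params.id.
      const product = { name: 'Example product', description: 'Rendered on the server.' };

      res.send(`<!DOCTYPE html>
    <html>
      <head><title>${product.name}</title></head>
      <body>
        <h1>${product.name}</h1>
        <p>${product.description}</p>
      </body>
    </html>`);
    });

    app.listen(3000);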
Implementing server-side rendering has shown dramatic increases in page speed and performance for
some of the biggest brands, including Netflix.
CONS
Can cause UX issues if a user can see a page but has to wait until the client is alive and able to process any interactivity.
Puts more strain on the server.

"Server-side rendering TTFB (Time To First Byte) is slower than client-side rendering TTFB, because your server will have to spend the time to create the HTML for your page instead of just sending out a relatively empty response."

WalmartLabs 17
Dynamic rendering
Dynamic rendering works by detecting the user agent of the client making the request. If Googlebot or
another search engine or social media crawler user agent is detected, a mini client-side renderer will be
used to render all JavaScript. The fully rendered page will then be sent to the client. Any other user agents
will need to render JavaScript client-side.
Image source: Google I/O 2018
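A rough sketch of the idea, assuming an Express server and a hypothetical prerender() helper (in practice this step is usually handled by a service such as Rendertron or prerender.io):

    // Dynamic rendering sketch: send pre-rendered HTML to known bots and the
    // normal client-side app shell to everyone else.
    const express = require('express');
    const app = express();

    // Illustrative list of crawler user agents; a real list would be longer.
    const BOT_USER_AGENTS = /googlebot|bingbot|yandex|baiduspider|twitterbot|facebookexternalhit/i;

    app.get('*', async (req, res) => {
      const userAgent = req.headers['user-agent'] || '';

      if (BOT_USER_AGENTS.test(userAgent)) {
        // prerender() is a hypothetical helper that loads the page in a
        // headless browser and returns the rendered HTML.
        const html = await prerender(req.originalUrl);
        res.send(html);
      } else {
        // Regular users receive the client-side rendered app shell.
        res.sendFile('index.html', { root: 'public' });
      }
    });

    app.listen(3000);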
Bing Blog 18
Bartosz Góralewicz,
20
Co-founder of Elephate
Hybrid rendering
Hybrid rendering involves a combination of server-side and client-side rendering. The core content of the
page is rendered on the server and is sent to either the browser or search engine requesting the page. This
means that the client will always receive the rendered content and markup straight away.
There is a final step in this process for users. Once the core content has been displayed, additional
JavaScript is then sent to be rendered client-side so the user can interact with the page.
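With a React-based stack, for example, that final client-side step is often called hydration; a minimal sketch (the App component and element ID are hypothetical) might look like this:

    // Client-side entry point in a hybrid (server-rendered then hydrated) setup.
    // The server has already sent the rendered HTML for <App /> inside #root;
    // hydrate() attaches event listeners to that existing markup rather than
    // rebuilding it from scratch.
    import React from 'react';
    import ReactDOM from 'react-dom';
    import App from './App'; // hypothetical application component

    ReactDOM.hydrate(<App />, document.getElementById('root'));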
20 SMX, https://www.slideshare.net/goralewicz/dynamic-rendering-is-this-really-an-seo-silver-bullet, 30th January 2019
Hybrid rendering is a popular choice and has been adopted by some of the biggest brands, including Netflix
and Nike.
PROS
Faster content and link discovery for search engines.

CONS
The full-page experience isn't available without some client-side rendering.
Isomorphic JavaScript

Isomorphic JavaScript, also referred to as Universal JavaScript, renders on both the server-side and client-side. Pre-rendering is used to serve content to the user or search engine on the first request, then interactivity that requires JavaScript is handled on the client-side.

"Isomorphism means that you can feed your application with some data coming from the database on the server, run it, and capture the resulting HTML output that would normally require a browser to assemble. This output can then be served as your initial HTML to everyone who requests it (including crawlers, such as Googlebot)."

Bartosz Góralewicz,
Co-founder of Elephate 21

Twitter and Airbnb are just a couple of the prominent brands that have been implementing and experimenting with Isomorphic JavaScript.

PROS
Content is available quickly to search engines for indexing.
The site is perceived to load faster and is user-friendly.

CONS
Your server needs to support Node JS applications to enable isomorphism.

Robin Rozhon,
SEO Strategist at EA
Whichever rendering method you use, make sure the content you're serving is accessible to and works for whichever search engine bot or browser is requesting it. Client-side rendering might work for users on a modern browser and a high-end CPU, but search engines, social media crawlers or users on a lower-end CPU device will need some help to be able to see your website's content.

Your choice of rendering method should be informed by the needs of your business and website. For example, if you have a section of your site that requires a login to be accessed, none of that will need to be server-side rendered. You would instead implement server-side rendering for your publicly accessible landing pages that contain descriptive content about your services.
The main things to watch out for with JavaScript

JavaScript rendering is often a complicated and resource-intensive process for a number of different reasons, and can significantly impact a variety of different performance and user experience factors. It's crucial to understand where these issues can occur and how they can impact your website.

These are the 8 main things to watch out for within a JavaScript-powered website:

1 Rendering speed
2 Main thread activity
3 Conflicting signals between HTML and JavaScript
4 Blocked scripts
5 Scripts in the head
6 Content duplication
7 User events
8 Service workers

1
Rendering speed
JavaScript is hosted in the browser which it relies on for all the heavy lifting because it doesn’t have its own
storage or network facilities. The JavaScript will give instructions on what needs to happen to construct and
load the page, but it relies on its host environment to actually do all of this.
The process of rendering JavaScript is also very expensive because of the four different stages that it runs
through:
These multiple steps are one of the main reasons why JavaScript is much more expensive to process than
other elements, such as images or HTML.
Image source: Bartosz Góralewicz, DeepCrawl webinar
Having JavaScript-heavy pages that take a long time to process and render means that they are at risk of not being rendered or processed by search engines.

"If your web server is unable to handle the volume of crawl requests for resources, it may have a negative impact on our capability to render your pages. […]"

22 Google Webmaster Central Blog, https://webmasters.googleblog.com/2014/05/understanding-web-pages-better.html, 23rd May 2014

Another issue to consider is that a user's device and CPU will usually have to do the hard work with JavaScript rendering, but not all CPUs are up for the challenge. It's important to be aware that users will experience page load times differently depending on their device. Just because a site appears to load quickly on a high-end phone, it doesn't mean that this will be the case for a user accessing the same page with a lower-end phone.

JAVASCRIPT RENDERING CAN BE AN EXPENSIVE AND STRENUOUS PROCESS. THIS CAUSES SIGNIFICANT ISSUES WHEN THAT WORK FALLS ON A USER'S BROWSER OR SEARCH ENGINE CRAWLER.

Image source: Think With Google
2
Main thread activity
JavaScript is single-threaded, meaning that each command is run one at a time on the browser’s main
thread of activity. The entire main thread is halted while JavaScript is parsed, compiled and executed. With
this kind of setup, queues can form and bottlenecks can happen, meaning that the entire process of loading
a page can be delayed.
Delays within the main thread can significantly increase the time it takes to load a page and for it to become
interactive for users, so avoid blocking main thread activity wherever possible. Keep an eye on how many
resources are being executed and where request timeouts are happening, as these can be some of the
main culprits which create bottlenecks.
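One common mitigation, sketched below with hypothetical function names, is to break long-running work into smaller chunks so the main thread can respond to user input between them:

    // Process a large list in small chunks so a single long task doesn't
    // block the main thread.
    function processInChunks(items, processItem, chunkSize = 50) {
      let index = 0;

      function processNextChunk() {
        const end = Math.min(index + chunkSize, items.length);
        for (; index < end; index++) {
          processItem(items[index]);
        }
        if (index < items.length) {
          // Yield back to the browser before continuing with the next chunk.
          setTimeout(processNextChunk, 0);
        }
      }

      processNextChunk();
    }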
3
Conflicting signals between HTML and JavaScript
Adding important meta tags using JavaScript is advised against because either Google won't see these tags straight away because of its delayed rendering process, or other search engines won't see them at all due to the fact that they can't render.

All search engines will use the signals from the HTML in the initial fetch to determine crawling and indexing. Google and the few search engines that have rendering capabilities will then render pages at a later date, but if the signals served via JavaScript differ from what was initially found in the HTML, then this will contradict what the search engine has already been told about the page.

The signals you provide via JavaScript shouldn't conflict with the ones in the HTML.

John Mueller,
Google Webmaster Hangout 24

For example, if you use JavaScript to remove a robots meta tag like noindex, Google will have already seen the noindex tag in the HTML and won't waste resources rendering a page it has been told not to include in its index. This means that the instructions to remove the noindex won't even be seen, as they're hidden behind JavaScript which won't be rendered.
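As a minimal sketch of that trap, imagine the initial HTML contains a robots noindex tag and a script then tries to remove it; crawlers that only read the HTML, or that never render the page, will never see the change:

    // The initial HTML contains: <meta name="robots" content="noindex">
    // This script removes it after load, but Google has already read the
    // noindex signal from the HTML and may never render or re-evaluate the page.
    document.addEventListener('DOMContentLoaded', () => {
      const robotsMeta = document.querySelector('meta[name="robots"]');
      if (robotsMeta) {
        robotsMeta.remove();
      }
    });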
FIRST IMPRESSIONS COUNT WITH SEARCH ENGINES, SO MAKE SURE YOU'RE GIVING THEM CLEAR, STRAIGHTFORWARD INSTRUCTIONS IN THE HTML AS SOON AS THEY COME ACROSS THE PAGE.

Eoghan Henn,
Co-founder of searchVIU 26
John Mueller 28

Yandex Support 29
5
Scripts in the head
When JavaScript is served in the head, this can delay the rendering and loading of the entire page. This is
because everything in the head is loaded as a priority before the body can start to be loaded.
Don't serve critical JavaScript in the head as this can block rendering.
John Mueller,
30
Google Webmaster Hangout
JavaScript snippets can close the head prematurely and cause any elements below to be
overlooked.
John Mueller,
31
Google Webmaster Hangout
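One common way to keep scripts from blocking the page (a sketch; the file paths are hypothetical) is to load non-critical JavaScript with the defer or async attributes, or to move it towards the end of the body:

    <!-- Downloaded without blocking parsing; executed after the document has been parsed. -->
    <script src="/assets/app.js" defer></script>

    <!-- Downloaded in parallel and executed as soon as it is ready (order not guaranteed). -->
    <script src="/assets/analytics.js" async></script>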
6
Content duplication
JavaScript can cause duplication and canonicalisation issues when it is used to serve content. This is because if scripts take too long to process, then the content they generate won't be seen. This can cause Google to only see boilerplate, duplicate content across the site, meaning it won't be able to find any unique content to rank pages with. This can often be an issue for Single Page Applications (SPAs) where the content dynamically changes without having to reload the page.

"If you're using a SPA-type setup where the static HTML is mostly the same, and JavaScript has to be run in order to see any of the unique content, then if that JavaScript can't be executed properly, then the content ends up looking the same. This is probably a sign that it's too hard to get to your unique content -- it takes too many […] content."

John Mueller,
Webmaster Trends Analyst at Google 32
Google Search 34
8
Service workers
A service worker is a script that works in the background of the browser and on a separate thread. Service
workers can run pages and provide content based on their own memory, meaning they can work offline
without the server being involved.
“A service worker is a script that your browser runs in the background, separate from a web page,
opening the door to features that don't need a web page or user interaction. Today, they already
include features like push notifications and background sync.”
35
Google Web Fundamentals
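A minimal registration sketch (the script path is hypothetical); the key consideration for SEO is that search engine crawlers generally don't run service workers, so content must not depend on them alone:

    // Register a service worker if the browser supports it.
    // Search engine crawlers typically ignore service workers, so anything
    // served from the service worker cache must also be reachable through
    // normal requests.
    if ('serviceWorker' in navigator) {
      navigator.serviceWorker
        .register('/sw.js') // hypothetical service worker script
        .then((registration) => {
          console.log('Service worker registered with scope:', registration.scope);
        })
        .catch((error) => {
          console.error('Service worker registration failed:', error);
        });
    }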
What to test for
When running a JavaScript audit for a website,
it’s important to focus on these main areas:
DeepCrawl 36
For more detail on each of these tools and their highlight features, take a look at this guide on tips and
tools for testing rendering.
The JavaScript console messages even provide stack trace reports which allow developers to see where
errors are happening in the code and debug them.
Remember that the results you see from Google’s tools may not reflect real rendering. These tools have a
shorter timeout as they aim to give results to their users as quickly as possible. Google’s indexing systems
will wait for a page to render for a longer period of time.
Google's indexing systems are more patient with rendering than the live testing tools are.
John Mueller,
38
Google Webmaster Hangout
Mobile-friendly Test
The Mobile-friendly Test shows the rendered HTML of a page so you can check that everything is showing
properly and making sense for Google. It lists different elements that could negatively impact how your site
appears on mobile devices, such as blocked resources and JavaScript errors.
John Mueller,
39
Google Webmaster Hangout
Rich Results Test

The Rich Results Test can also be used to see how Google renders a page and how it handles different
JavaScript frameworks. This tool can be used for testing rendering on a page-by-page basis.
Test JavaScript frameworks with the Rich Results feature in Google Search Console.
John Mueller,
40
Google Webmaster Hangout
Index Coverage report

The Index Coverage report in Google Search Console is useful for getting an overview of what is and isn't
being indexed on a website. Whereas some of Google’s other tools can provide granular information on
errors at a page-level, this top-level view can give valuable insights into more widespread issues with
JavaScript that could be impacting indexing across the site.
PageSpeed Insights
The PageSpeed Insights tool has been revamped to include Lighthouse data and a variety of new
performance and JavaScript-related reports. The ‘Opportunities’ and ‘Diagnostics’ sections show key issues
that need to be addressed, as well as how much time could be saved on page load time if they were fixed.
Diffchecker
Diffchecker allows you to analyse the differences between the unrendered source code of a page side by
side against the rendered DOM. This allows for detailed comparisons between rendered and unrendered
content on a page-by-page basis.
You can copy the HTML that is shown once you right click and select ‘View Page Source’ in the browser and
paste this in the ‘Original Text’ box in Diffchecker. Then copy the ‘outerHTML’ after right clicking in the
browser and selecting ‘Inspect’, and paste this code in the ‘Changed Text’ box. Diffchecker will then show
you any differences between the two sets of code.
Image source: Polemic Digital
Chrome DevTools
Chrome DevTools can be used to debug and edit pages within the Chrome browser. By right clicking on a
page when viewing it and clicking ‘Inspect’, the DevTools dock will appear with a wide variety of analysis
options, such as monitoring performance and network conditions as well as showing JavaScript and
resource errors.
The waterfall under the 'Network' tab shows exactly which scripts were run and in which order, visualising dependencies and roadblocks to be addressed. This is a really useful report as you can see the Load Event as a red line, which is the point when Google determines that rendering has been completed and takes a snapshot of the rendered HTML. Any content produced by resources that load after this point may not be seen. Focus on getting key content to load before this red line so Google won't miss it.

JavaScript-powered content is indexed depending on whether it is visible to Googlebot on page load.

John Mueller,
Google Webmaster Hangout 41
Chrome isn't the only browser that provides JavaScript debugging functionality; there are also developer
tools available for the main browsers such as Firefox, Opera and Safari, so you should be able to find a
solution for JavaScript analysis that works for you.
Image source: Firefox Developer Tools
Opera Developer Tools. Source: Raygun
Source: Safari Developer Tools
Web Developer Extension
An easy way to see what works and what doesn’t work without JavaScript rendering is to use the Web
Developer Extension to disable JavaScript in the browser. This will show you what can function and be
displayed in the browser without the page being rendered. This extension can be used on Chrome, Firefox
and Opera.
WebPageTest
WebPageTest provides insights into the different types of resources on a page and how quickly they are
able to be processed. This tool can give you an idea of how much time is spent on which stages of
JavaScript rendering on a page-by-page basis, so you can spot patterns of performance issues and
bottlenecks.
The ‘Request Map’ is a great feature as it shows all of the different requests made on a page and the
dependencies between them. Each circle is a different request and you can see what type of resource each
one is when you hover over them, such as JavaScript or an image. The larger the circle, the longer the
request took. For example, the large green circle in this request map is a JavaScript file which had the
longest load time for the page.
DeepCrawl
DeepCrawl is a website crawler which can render JavaScript like search engines can by using its own Page
Rendering Service (PRS). This allows you to monitor rendering and resource issues at scale, rather than
having to analyse them on a page-by-page basis. The tool shows you whether or not links and content
modified by JavaScript can be crawled and indexed by search engines.
With JavaScript rendering enabled in the project setup, you’ll be able to see reports on the number of
JavaScript files you have on your site, broken or disallowed JavaScript files, JavaScript redirects and more.
The key things to watch out for within your
crawls are thin pages and low internal linking as
this gives an indication of important elements
that are being hidden from search engines.
Also, watch out for pages that exceed the
maximum file size as this could prevent
browsers and search engines from being able
to process them.
External files
Resource counts
“A reliance on client-side JavaScript means a delayed and inefficient indexing process, plus a
plethora of related repercussions such as poor flow of internal link value and potential conflicts
between pre- and post-rendered content.
The best solution is to prevent the problem entirely in the first place. Make sure a webpage's
content, links, and relevant meta data is present in the raw HTML source and doesn't rely on any
client-side JavaScript. Progressive Enhancement is a great approach, where the webpage's base
content is in the HTML and JavaScript is then used to enhance the page's functionality.
To test how your site's reliance on JavaScript could impact on your SEO, you
can use crawling tools that allow you to compare raw HTML to rendered code
(such as DeepCrawl). Significant variances between the HTML and fully
rendered page could point to possible problems with Google's crawling and
indexing of your site.”
Barry Adams,
Founder of Polemic Digital
1
Prioritise your most important content and resources
Sending JavaScript to the browser or search
engine all at once in the initial request to be
rendered client-side is a bad idea. This is a big
ask even for users with higher-end devices and
Google which has relatively sophisticated
rendering capabilities.
Instead, focusing on the core content of the page, such as the above-the-fold, hero content, and making sure that it is available first as a priority, is a much better strategy.

PRIORITISING THE MOST IMPORTANT CONTENT ON A PAGE TO LOAD FIRST CAN DRAMATICALLY IMPROVE THE USER'S PERCEPTION OF LOAD TIME.

Ilya Grigorik,
Web Performance Engineer at Google 42

Here are some solutions to look into for helping to prioritise key resources:

Optimise the critical rendering path: Make sure that none of the most important resources on a page are being blocked and that they can render as soon as possible.
"[…] empty caches."

MDN Web Docs 43

Jake Archibald 44

Google Webmaster Central Blog 45
Essentially, SEOs should ensure that there is a version of their website available that works
without JavaScript. This could be achieved, for example, by using Puppeteer which is a Node
library that can be used to control a headless Chrome browser. This browser could then be used
to crawl either a JavaScript-heavy website or even a single page JavaScript application and as a
result, generate pre-rendered content.
Bastian Grimm,
Director of Organic Search at
Peak Ace AG
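A minimal Puppeteer sketch along those lines (the URL is a placeholder); it loads a page in headless Chrome and captures the rendered HTML, which could then be served as pre-rendered content:

    // Capture the fully rendered HTML of a JavaScript-heavy page with Puppeteer.
    const puppeteer = require('puppeteer');

    async function getRenderedHtml(url) {
      const browser = await puppeteer.launch();
      const page = await browser.newPage();

      // Wait until network activity has settled so client-side content has loaded.
      await page.goto(url, { waitUntil: 'networkidle0' });
      const html = await page.content();

      await browser.close();
      return html;
    }

    getRenderedHtml('https://www.example.com/').then((html) => console.log(html));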
“The technology used on your website can sometimes prevent Bingbot from being able to find
your content. Rich media (Flash, JavaScript, etc.) can lead to Bing not being able to crawl through
navigation, or not see content embedded in a webpage. To avoid any issue, you should consider
implementing a down-level experience which includes the same content elements and links as
your rich version does. This will allow anyone (Bingbot) without rich media enabled to see and
interact with your website.”
47
Bing Webmaster Guidelines
Here’s what Google’s Martin Splitt recommends for optimising web apps for
search engines and users:
2 Use meaningful markup: Use clean links and clear page title, meta
description and canonical tags.
3 Use the right HTTP status codes: Avoid soft 404s so Google can serve the
right pages.
4 Use structured data: This will help search engines better understand your
content.
Martin Splitt,
Webmaster Trends
48
Analyst at Google
4
Reduce file transfer size
Reducing the file size of a page and its resources where possible will improve the speed with which it can
be rendered. You can do this by making sure that only the necessary JavaScript code is being shipped.
Minification: Removing redundant code that has no function, such as whitespace or code comments.
Compression: Use GZIP or zlib to compress text-based resources. Remember that even if a resource is compressed for transfer, it will still have to be downloaded and then decompressed before the browser can use it.
Code splitting: This involves splitting large JavaScript bundles into smaller chunks that are only loaded when needed, so the main thread doesn't become blocked by one long task and is still able to react to user interactions. This increases the perceived speed of a page for users (see the sketch after this list).
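A rough sketch of code splitting with a dynamic import() (the module path and element IDs are hypothetical); the charting code is only downloaded and parsed when the user actually needs it:

    // Load the charting module on demand instead of shipping it in the main bundle.
    async function showChart(container) {
      // './chart-widget.js' is a hypothetical module; bundlers such as webpack
      // turn dynamic imports like this into separately loaded files.
      const { renderChart } = await import('./chart-widget.js');
      renderChart(container);
    }

    const button = document.getElementById('show-chart');
    if (button) {
      button.addEventListener('click', () => showChart(document.getElementById('chart')));
    }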
"Learn how to audit and trim your JavaScript bundles. There's a high chance
you’re shipping full-libraries when you only need a fraction, polyfills for
browsers that don’t need them, or duplicate code.”
Addy Osmani,
49
Engineering Manager at Google
5
Enhance JavaScript performance
JavaScript doesn’t have to dramatically decrease site speed; there are things you can do to optimise
the performance of the JavaScript on your website. Enhancing JavaScript performance should be a
focus topic for developers and SEOs alike going forwards. Here are some methods to potentially
implement on your site:
Cache resources: This is especially helpful for users making repeat visits and server-side rendered or
dynamically rendered content, as too many pre-render requests at once can slow down your server.
Preload resources: This allows you to tell the browser which scripts and resources are the most important and should be loaded first as a priority (see the markup sketch after this list).
Prefetch links: This allows the browser to proactively fetch links and documents that a user is likely to
visit soon within their journey, and store them in its cache. This means that the resource will load very
quickly once a user goes to access it.
Use the PRPL pattern: This method is an acronym for the different elements that its process is made up
of: Push, Render, Pre-cache and Lazy-load. The PRPL pattern is recommended for improving
performance and load times on mobile devices especially.
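As a sketch of what the preload and prefetch hints mentioned above look like in the head of a page (the file paths are hypothetical):

    <!-- Preload: fetch a critical script early, at high priority, for the current page. -->
    <link rel="preload" href="/assets/critical.js" as="script">

    <!-- Prefetch: fetch a resource likely to be needed on a future page, at low priority. -->
    <link rel="prefetch" href="/assets/next-page.js">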
Source: Google Web Fundamentals
In conclusion

SEOs and developers need to learn to work better together to get their websites discovered, crawled and indexed by search engines, and, therefore, seen by more users. Both departments have a lot of influence over the different aspects of this process and we share common goals of serving optimised websites that perform well for both search engine bots and humans.

"SEOs need to stay ahead of the curve when it comes to JavaScript. This means a lot of research, experimentation, crunching data, etc. It also means being ready for the next big thing that probably dropped yesterday and you now need to adapt to, whether it's Google finally updating its rendering browser or one of the countless JavaScript frameworks suddenly becoming popular among developers."

Bartosz Góralewicz 50