Practical SEO Tips for Optimizing Your ReactJS App


Getting on the same page: One of the most outdated SEO “best practices” still circulating the web is the idea that JavaScript-driven websites (read: web apps) are bad for SEO because Googlebot can’t crawl the page content properly. In fact, Google confirmed on its blog as far back as 2015 that it can.

To add additional color to the state of websites/apps in 2020, it’s worth acknowledging that a growing number of companies (particularly at the enterprise level) have chosen to migrate their once-static HTML webpages to a JavaScript framework.

This is all to say: using JavaScript to generate content on your website is not only an acceptable way of doing things in 2020, but also an increasingly popular one.

ReactJS, the ‘Next Supreme’ Front End Framework/Library?

There are several types of JS frameworks/libraries available to developers, with varying levels of experience and enthusiasm surrounding them. In my opinion, the rising ‘Supreme’ that SEOs should keep their eye on in the next few years is React.

According to the most recent ‘State of JavaScript’ research study, React has had the highest satisfaction rating three times in the last four years.
Since 2016, React is the only front end framework/library that has consistently been both used and liked by developers (denoted by its position in the top right corner of the chart).

What this means for SEOs in particular: Though only 18% of the top 10,000 websites are currently built with React, this number has been climbing steadily over the past three years. This data suggests that SEOs will become increasingly likely to work on a React app at some point in their careers.

This is especially true for SEOs who aspire to work with enterprise-level companies. In fact, many impressive companies (that SEOs may like to work at one day 🤓) have already joined the React bandwagon – including CNN, Instagram, Facebook, FOX, Netflix, and the New York Times.

SEO-Optimizing Your ReactJS App

When performing an initial SEO audit on a React app, there are five items that work a little differently and that I recommend paying especially close attention to. Check them out below:

1) Don’t Block Important JS Files from Googlebot (Accidentally or on Purpose)

While this is true for all websites, it has a significantly bigger impact on websites that generate content using JavaScript.

If search engines are blocked from crawling your site’s important JS files, Googlebot is unable to “see” the same thing as the end user – resulting in a loss of page authority and rankings in search.

Solution: Check your website’s robots.txt file to ensure that no important JS files are labeled as “disallow”.

FOX.com doesn’t have any JS files blocked in robots.txt.
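For reference, a healthy setup looks something like the sketch below. The paths are purely illustrative assumptions – your site’s directory names will differ – but the pattern is the same: make sure nothing that matches your app’s JS bundles sits under a Disallow rule.

```text
# robots.txt - hypothetical example; paths are illustrative, not FOX.com's actual file
User-agent: *
Allow: /static/js/
Disallow: /admin/

# A rule like the one below would hide your entire app bundle from Googlebot - avoid it:
# Disallow: /*.js$
```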

I also recommend referencing Google Search Console’s URL Inspection Tool > page resources section, which lets you know if (and which) page resources are blocked by robots.txt.

In the example below, we can see that Googlebot is currently blocked from accessing one of FOX.com’s JavaScript elements. However, it is an external file, which we have no control over. (You can tell whether a file is internal or external by whether your domain name appears in the file’s URL.)

If your team deems an external resource important enough, a solid workaround I’ve found is to copy the JS file to your own server and link to it there. For this particular file, we’ve kept it as is.

Screen grab of URL Inspection tool results for a URL on FOX.com.

2) Find and Fix JavaScript Errors as They Arise

There are two types of JavaScript errors that you might encounter in Google Search Console:

  • Syntax errors:
    • Spelling or punctuation mistakes in your code that prevent the program from running, or stop it partway through.
  • Logic errors:
    • The syntax is correct, but the code doesn’t do what you intended. The program runs successfully but gives incorrect results.

Solution: The URL Inspection tool > JavaScript console section is great for identifying JS errors.

Using FOX as an example, we can see from the screenshot below that FOX.com doesn’t currently have any JavaScript errors, which is good!

You will notice, however, that there is a warning regarding an ‘Unsupported Browser’. Using deductive reasoning (and confirming with one of our in-house developers 😉), we can see that it is related to a third-party customer survey tool (note the ‘Delighted’ hint), which shows this message to users running an older version of a web browser. This is fine.

Screen grab of JavaScript console message for FOX.com

3) Make Sure Internal Links Are Implemented via Anchor Tags

Internal linking should be implemented with anchor tags within the HTML or the DOM, rather than leveraging JavaScript functions (like onclick events) to let the user travel throughout the site.

While URLs might still be found and crawled without anchor tags (Ex: strings in JavaScript code or XML sitemaps), they won’t be associated with the global navigation of the site – which is important for SEO.

Solution: Rely on good old-fashioned anchor tags (<a href="URL">) to communicate most effectively with Googlebot.
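In React terms, the difference looks something like this sketch (component names are made up for illustration). Both render a clickable “Entertainment” element, but only the first gives Googlebot an href it can follow:

```jsx
// Crawlable: a real anchor tag with an href Googlebot can follow.
const GoodLink = () => <a href="/entertainment">Entertainment</a>;

// Not reliably crawlable: the navigation is hidden inside an onClick
// handler, so there is no href for Googlebot to discover.
const BadLink = () => (
  <span onClick={() => (window.location.href = "/entertainment")}>
    Entertainment
  </span>
);
```

If you’re using a router library, its link component (e.g., React Router’s Link) typically renders a real anchor tag under the hood, so you get the crawlability benefit while keeping client-side navigation.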

FOX.com uses anchor tags to interlink between pages.

4) Maintain an Organized URL Structure, Free of Fragment Identifiers

To ensure that Google knows you want it to crawl two different pages and index them separately, it is important to have two separate URLs rather than fragments (denoted with a # that changes the content of a page).

Hypothetical Fragment Page: FOX.com#entertainment


SEO-optimized Page: FOX.com/entertainment

Solution: Work with your developer to replace fragment URLs with separate, crawlable URLs site-wide.
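If your app uses a client-side router, the fix can be as small as a configuration choice. As a hedged sketch, assuming React Router (the page component is a placeholder): a HashRouter produces fragment URLs like example.com/#/entertainment, which Google treats as one page, while a BrowserRouter produces real paths like example.com/entertainment that can be crawled and indexed separately.

```jsx
import { BrowserRouter, Route } from "react-router-dom";

// Avoid: <HashRouter> yields example.com/#/entertainment (one page to Google).
// Prefer: <BrowserRouter> yields example.com/entertainment (a separate, indexable URL).
function App() {
  return (
    <BrowserRouter>
      <Route path="/entertainment" component={EntertainmentPage} />
    </BrowserRouter>
  );
}
```

Note that BrowserRouter requires your server to return the app for any valid path (not just /), so loop in your developer before switching.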

5) Double Check That Google Can See Your Menu Links and Tabbed Content

For the most accurate data, go to Chrome developer tools > JavaScript Console > Network > check the “Disable cache” box.

Screengrab of a section from Chrome developer tools.

Next, navigate to the Elements section and find the div element that holds one of your menu or tabbed links/content. If you can find it there, Google has access to it!

Screen grab of FOX.com’s mobile main menu.
Screen grab of the elements that make up the menu, from JavaScript Console.

Another way to spot-check that your VIP content is being indexed is to search Google with the following formula: “site:example.com [content name]”. As we can see below, Google has indexed the Masked Singer clip requested, which is good.

Screengrab of indexed video content on FOX.com.

Bonus Tip: Dynamic Rendering of the Page

If you’re in need of a work around to help bots render your page correctly, consider implementing dynamic rendering.

“[With dynamic rendering] requests from crawlers are routed to a renderer, [whereas] requests from users are served normally.” Martin Splitt, Webmaster Trends Analyst

Image Credit: developers.google.com

This can be a good solution when there are no clearly blocked resources, when JS takes a long time to load, or when you’re otherwise stumped. FOX.com’s filter pages (e.g., /comedy) are one example.
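At its core, dynamic rendering is a routing decision based on the user agent. The sketch below shows just that decision in plain JavaScript – the bot list and return values are illustrative assumptions, and a real setup would proxy crawler requests to a pre-rendering service (such as Google’s Rendertron) rather than return a string:

```javascript
// Illustrative crawler patterns - not an official or exhaustive list.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /twitterbot/i];

function isBot(userAgent) {
  // Treat a missing user agent as a regular user.
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ""));
}

function chooseResponse(userAgent) {
  // Crawlers get server-rendered HTML; users get the normal client-side app.
  return isBot(userAgent) ? "prerendered-html" : "client-side-app";
}

console.log(chooseResponse("Mozilla/5.0 (compatible; Googlebot/2.1)")); // "prerendered-html"
console.log(chooseResponse("Mozilla/5.0 (Windows NT 10.0) Chrome/80")); // "client-side-app"
```

In production this check would live in your server or CDN middleware, with the crawler branch forwarding to the renderer.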

JavaScript-powered content is indexed depending on whether it is visible to Googlebot on page load. -John Mueller, Google Webmaster Hangout

Solution: To learn more about how to implement dynamic rendering on your ReactJS site, check out this great resource from Google on the topic.

Your Turn

Now I want to hear from you – what issues have you run into while optimizing your JavaScript-Driven website for SEO? How were you able to overcome them? Share your tips with me on socials/comments below!
