It’s a clash of the ages: the age-old battle between design, development & SEO teams. Even more so now that Google has added the ‘E’ of Experience to its E-A-T guidelines, making it E-E-A-T. For now, we’ll leave the devs to their matrix-like activities and focus on design & SEO, because a fine line has to be walked between good web design and SEO-proof page building. The problem is that those beautiful elements usually aren’t going to supply you with high-quality, crawlable on-page content… Right, enough setup. Let’s get to what you should be aware of, what to avoid, and how to fix some of the issues you’re bound to encounter!

HOW TO DISCOVER CONTENT DEFICIENCIES

To make sure your website still ranks while still looking fly as hell, you need to keep track of how many JavaScript-heavy elements are on your page. Let’s start right there, and call it the ‘discovering if you have a big problem’ step. The easiest way to find out if JS is going to tank your rankings is to install this plugin (Chrome only, sorry not sorry, Safari users!). With this custom toggle, you can switch off all JavaScript on a page, which approximates what the Google crawler sees on its first, unrendered pass over your page. Best case scenario: your devs paid attention during production and haven’t made entire elements reliant on JS. Worst case though… let’s just say it wouldn’t be the first time this toggle results in a fully blank page with some scattered images.
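You can run the same check programmatically. Below is a minimal sketch that extracts the visible text from a page’s raw, unrendered HTML (everything outside script and style tags) so you can verify whether your key copy is there before any JavaScript runs. The sample HTML and the `visible_text` helper are illustrative, not from any particular library; in practice you’d feed it the HTML you fetched from your server.

```python
# Sketch: does the key content survive with JavaScript disabled?
# Assumption: you already have the server's raw HTML (e.g. fetched with
# urllib or curl) -- an inline sample is used here so the check is reproducible.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the bodies of <script> and <style>."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

def visible_text(raw_html: str) -> str:
    parser = TextExtractor()
    parser.feed(raw_html)
    return " ".join(parser.chunks)

# A page whose main copy is injected by JS looks empty before rendering:
raw = '<html><body><div id="app"></div><script>render()</script></body></html>'
print(visible_text(raw))  # prints an empty line: no crawlable text at all
```

If the output is empty (or missing your headlines and product copy), a non-rendering crawl of that page sees nothing worth indexing.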

JAVASCRIPT: FLASHINESS VS BASIC FUNCTIONALITY

It’s no secret that JavaScript has brought us a lot in terms of UX design. The dynamic content options are near-limitless and tend to give your webpages that ‘wow-factor’ you might have been missing. But all the flashiness of dynamic elements bobbing around does come with some heavy caveats in terms of crawlability and indexability.

To give you an easy example, you only have to look as far as site speed. With complex animations and pop-ups crowding your page, you’re going to pay the price in loading speed. This can pose grave issues for your site, as the Google crawler only has a limited crawl budget available for your website. Simply put: if JS makes your website run like an old diesel engine, you’re going to have serious difficulty getting it to rank well.

RENDERING PROBLEMS & CRAWLABILITY ISSUES

After crawling, Google indexes websites in two waves, and the first wave runs weeks to even a month ahead of the second. The difference between the two? The second wave handles pages that need rendering first. And you guessed it: JavaScript greatly increases the amount of rendering your page needs. That means your site gets lobbed into the second batch and processed weeks later than plain-HTML pages. Discrepancies between your HTML and your JS-rendered content can even cause your page to not get indexed at all.
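One classic source of those discrepancies: JavaScript that rewrites head signals like the title or the canonical link after load, so the raw HTML and the rendered DOM disagree. As a hedged sketch, here’s a way to compare the two versions side by side. The two HTML snippets are made-up samples, and in practice you’d get the rendered version from a headless browser; the regexes are deliberately simple and will miss unusual markup.

```python
# Sketch: flag head-signal discrepancies between raw HTML and the JS-rendered DOM.
# Both versions are inline samples here; obtaining rendered HTML (e.g. via
# headless Chrome) is out of scope. Regexes are simplified for illustration.
import re

def head_signals(html: str) -> dict:
    title = re.search(r"<title>(.*?)</title>", html, re.S)
    canonical = re.search(r'<link[^>]+rel="canonical"[^>]+href="([^"]+)"', html)
    return {"title": title.group(1) if title else None,
            "canonical": canonical.group(1) if canonical else None}

raw = ('<head><title>Shop</title>'
       '<link rel="canonical" href="https://example.com/a"/></head>')
rendered = ('<head><title>Shop | 50% off!</title>'
            '<link rel="canonical" href="https://example.com/b"/></head>')

mismatches = {k for k in ("title", "canonical")
              if head_signals(raw)[k] != head_signals(rendered)[k]}
print(mismatches)  # here both signals differ between the two versions
```

If a check like this turns up mismatching canonicals between the two waves, that’s exactly the kind of conflict that can keep a page out of the index.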

Now, of course, you’d like a way to test whether you’re going to experience some of these problems, right? No worries, I’ve got you sorted. Take, for example, Google’s own URL Inspection Tool, hidden snugly inside one of my favorite tools: Google Search Console.

Using this tool, you can detect crawling issues on demand; it even neatly displays JavaScript console messages, like the warnings and errors Google encountered when trying to crawl your page.
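For those who’d rather automate this, Search Console also exposes the same tool programmatically via its URL Inspection API. The sketch below only builds the request; the endpoint path and body fields are from the public API docs as I recall them, so verify them before relying on this, and note that obtaining a valid OAuth bearer token is out of scope here (`"TOKEN"` is a placeholder).

```python
# Hedged sketch: calling Search Console's URL Inspection API.
# Endpoint and body shape assumed from the public docs -- verify before use.
import json
import urllib.request

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_request(site_url: str, page_url: str, token: str) -> urllib.request.Request:
    """Builds the POST request; sending it requires a real OAuth token."""
    body = json.dumps({"siteUrl": site_url, "inspectionUrl": page_url}).encode()
    return urllib.request.Request(
        ENDPOINT, data=body, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

req = build_request("https://example.com/", "https://example.com/page", "TOKEN")
print(req.get_method(), req.full_url)
# To actually inspect: resp = urllib.request.urlopen(req)  (needs a valid token)
```

That lets you pull the same crawl and indexing verdicts into your own monitoring instead of clicking through the UI page by page.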

ALWAYS MORE TO LEARN

There are loads more tools and tricks to get a better understanding of the use of JavaScript on your pages. But let’s get started first, shall we? Got any more questions about SEO & JavaScript issues? Feel free to shoot me an email, or a Tweet!