Web developers and SEO experts have long debated the impact of JavaScript on search engine visibility. Google has repeatedly reassured site owners that its crawlers can process JavaScript-generated content, but the SEO industry has remained sceptical. To this day, sites that rely heavily on JavaScript can still face indexing problems, leading to lost visibility and reduced traffic.
But the issue now stretches beyond Google. The rise of AI-powered search engines introduces a new challenge—many struggle to interpret JavaScript-triggered content, potentially limiting your website’s reach with the new generation of AI tools.
Of course, optimisation does not end with rendering when it comes to how LLMs retrieve information; we need to discuss layout, content, and even schema requirements. This will have to be another article altogether.
Let’s first explain why JavaScript can still be a roadblock for SEO. I’ll reveal how you can optimise your site to ensure better content discoverability, regardless of the search platform.
To explain the current challenges of JavaScript-powered functionality, it is essential to first discuss the basics of Server-Side Rendering (SSR) and Client-Side Rendering (CSR)…
When building a web application, the choice between server-side rendering (SSR) and client-side rendering (CSR) significantly impacts performance, user experience and SEO.
SSR generates fully rendered HTML on the server before delivering it to the browser, resulting in faster initial page loads. This approach is particularly beneficial for SEO, as search engines can easily index pre-rendered content. However, SSR may require full-page reloads for navigation, making interactions slightly slower compared to CSR, which can in turn affect your Core Web Vitals scores.
In contrast, CSR loads a minimal HTML shell and relies on JavaScript to dynamically render content in the browser. While this can lead to slower initial load times, it enhances the user experience by allowing seamless navigation without refreshing the page. The downside is that CSR can pose challenges for SEO, as some search engine crawlers struggle to interpret JavaScript-generated content.
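To make the difference concrete, here is a minimal, illustrative sketch using Next.js (pages router). The /api/products endpoint and the component names are hypothetical, and in a real project the CSR and SSR versions would normally live on separate pages.

```javascript
import { useEffect, useState } from "react";

// Client-Side Rendering: the server sends a near-empty shell and the
// browser fetches and renders the product list with JavaScript.
// A crawler that does not execute JS sees no product names at all.
export function ProductsCSR() {
  const [products, setProducts] = useState([]);

  useEffect(() => {
    fetch("/api/products") // hypothetical endpoint
      .then((res) => res.json())
      .then(setProducts);
  }, []);

  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}

// Server-Side Rendering: the same data is fetched on the server, so the
// HTML delivered to the browser (and to crawlers) already contains
// every <li> element before any JavaScript runs.
export async function getServerSideProps() {
  const res = await fetch("https://example.com/api/products"); // hypothetical
  const products = await res.json();
  return { props: { products } };
}

export default function ProductsSSR({ products }) {
  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```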
A Note on Other Rendering Types
Dynamic rendering serves different versions of a webpage depending on the user agent: a pre-rendered version is made available to search bot user agents, while regular users receive the JavaScript-driven experience. It was often used for frequently updated content, but Google has since gone on record to say it no longer recommends this approach for maximising search visibility.
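Purely to illustrate the mechanism (not as a recommendation, given the point above), dynamic rendering usually amounts to a user-agent check in front of the application. The sketch below uses Express; the bot pattern and the snapshot function are illustrative placeholders rather than a definitive setup.

```javascript
const express = require("express");
const app = express();

// Illustrative (not exhaustive) list of crawler user agents.
const BOT_PATTERN = /googlebot|bingbot|gptbot|perplexitybot/i;

// Placeholder for whatever produces the static HTML version,
// e.g. a prerender cache or a headless-browser snapshot.
function servePrerenderedSnapshot(req, res) {
  res.send("<html><body><h1>Pre-rendered snapshot of the page</h1></body></html>");
}

// Bots receive the snapshot; everyone else gets the normal
// JavaScript-driven application shell.
app.use((req, res, next) => {
  const userAgent = req.headers["user-agent"] || "";
  if (BOT_PATTERN.test(userAgent)) {
    return servePrerenderedSnapshot(req, res);
  }
  next();
});

app.get("/", (req, res) => {
  res.send('<html><body><div id="app"></div><script src="/app.js"></script></body></html>');
});

app.listen(3000);
```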
Static rendering generates static HTML files with minimal JavaScript functionality. These pre-rendered pages can be stored in CDNs and efficiently crawled by search engines.
Static Site Generation (SSG) with rehydration, on the other hand, delivers static HTML first and then progressively restores JavaScript functionality on the client side (more on this later). While this balances performance and interactivity, delays in hydration can affect user experience, depending on the implementation.
How does this play out with AI-powered search tools?
LLMs (Large Language Models) rely on the data they were trained on rather than independently searching for information. However, they can be equipped with specialised tools to retrieve real-time data from websites when needed, such as the latest weather outlook or stock prices.
Unfortunately, heavy use of JavaScript libraries and similar technologies can obscure content from AI search engines (and traditional search engines alike), making it harder to reach this expanding AI-driven audience.
The main guidelines are straightforward: avoid restricting key content behind logins or paywalls, and ensure a plain HTML version of your page is available.
As AI-driven tools like ChatGPT (AI chatbot) and Perplexity (AI search engine) become more adept at information retrieval, ensuring that your website is accessible and easily ‘indexable’ by these models is increasingly important. One of the most effective ways to achieve this is through Server-Side Rendering (SSR for short), which pre-renders the full HTML on the server before delivering it to the browser.
SSR improves AI and search engine indexation because the content is immediately available in its final form, eliminating the need for JavaScript execution. This can improve page load speeds and ensure that search crawlers can access all relevant content.
Several technologies support SSR, including Next.js for React applications, Nuxt.js for Vue, and Express.js with templating engines like Handlebars or EJS. Frameworks such as Django with Jinja templates and Ruby on Rails (with ERB templates) also provide robust server-side rendering solutions. These technologies ensure that content is fully rendered and structured for discoverability by AI models and search engines alike.
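As a rough sketch of the idea, the Express and EJS example below renders the full HTML on the server before sending it; the /article/:slug route and loadArticle helper are hypothetical, and a real application would normally keep templates in separate view files.

```javascript
const express = require("express");
const ejs = require("ejs");

const app = express();

// Hypothetical data source standing in for a database or CMS call.
async function loadArticle(slug) {
  return { title: "JavaScript and SEO", body: "Pre-rendered article content." };
}

// Inline template for brevity; real projects would use view files.
const template = `
  <html>
    <head><title><%= article.title %></title></head>
    <body>
      <h1><%= article.title %></h1>
      <p><%= article.body %></p>
    </body>
  </html>`;

app.get("/article/:slug", async (req, res) => {
  const article = await loadArticle(req.params.slug);
  // The response is complete HTML: crawlers need no JavaScript to read it.
  res.send(ejs.render(template, { article }));
});

app.listen(3000);
```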
How Static Site Generation (SSG) Helps AI Retrieve Information
Static Site Generation (SSG) takes a different approach to making content easily accessible to AI models. Since SSG pre-builds HTML pages before deployment, the content is fully rendered and ready for indexing, so search engines and AI tools can retrieve it efficiently. JavaScript is then applied in a process referred to as “hydration”, which makes the pages interactive by attaching event listeners to the existing HTML elements. You will commonly find this delivery method in React-based applications.
One of the biggest advantages of SSG for AI retrieval is its strong SEO foundation. SSG delivers fast-loading, structured content as static pages served from a Content Delivery Network (CDN). This results in faster indexing, better search rankings, and easier access for AI models that rely on web crawlers to fetch information. Additionally, SSG ensures consistency in page content, meaning AI tools retrieve the same data every time without being affected by dynamic elements or personalised user experiences.
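As a minimal sketch of how this looks in practice, the Next.js (pages router) page below is generated at build time; the /api/guide endpoint and its field names are hypothetical.

```javascript
// getStaticProps runs at build time, so this page is emitted as a static
// HTML file that a CDN can serve and crawlers can read without running JS.
export async function getStaticProps() {
  const res = await fetch("https://example.com/api/guide"); // hypothetical endpoint
  const guide = await res.json();
  return {
    props: { guide },
    revalidate: 3600, // optional: allow the page to be rebuilt at most hourly
  };
}

export default function GuidePage({ guide }) {
  // After the static HTML loads, React hydrates it in the browser,
  // attaching event listeners so the page becomes interactive.
  return (
    <article>
      <h1>{guide.title}</h1>
      <p>{guide.summary}</p>
    </article>
  );
}
```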
What tools can help SEOs identify rendering issues?
If you have admin access to Google Search Console for your target domain, you can use the URL Inspection tool to test individual pages, checking that the rendered HTML and screenshot are what you expect to see.
A simple way to discover issues quickly is to install the Web Developer extension for Chrome (or Firefox). Load your target page, disable JavaScript via the extension, then reload the page and make a visual comparison. If there are visual discrepancies, it’s time to dig deeper.
Another useful extension is View Rendered Source. This plugin presents a side-by-side comparison of the rendered code (after JavaScript modifications) and the raw source. It is particularly handy if you need to compare the code versions served to different devices.
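The same comparison can be scripted if you prefer a quick command-line check. The sketch below (Node 18+, which provides a global fetch) simply fetches the raw, un-rendered HTML and checks for a key phrase; the URL and phrase are placeholders.

```javascript
// check-raw-html.js — run with: node check-raw-html.js
// Does the raw (un-rendered) HTML already contain the content you care about?
const url = "https://example.com/your-page";    // placeholder URL
const mustContain = "Key product description";  // placeholder phrase

async function main() {
  const res = await fetch(url, {
    headers: { "User-Agent": "raw-html-check/1.0" },
  });
  const html = await res.text();

  if (html.includes(mustContain)) {
    console.log("OK: phrase found in the raw HTML.");
  } else {
    console.log("Warning: phrase not found; it may only appear after JavaScript runs.");
  }
}

main().catch(console.error);
```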
Screaming Frog offers a great way to analyse multiple pages at once; full functionality requires a subscription (the free version is restricted to 500 URLs).
We need to crawl (‘Spider’) the website, but first make sure your crawl is set up properly. Here’s my step-by-step approach:
1. Go to Configuration > Spider > Rendering and select ‘JavaScript’ rendering, using the default settings as a base.
2. In the HTML panel, make sure that ‘Store HTML’ and ‘Store Rendered HTML’ are both checked.
3. Set your user agent to Googlebot (Smartphone) to reflect Googlebot’s mobile-first approach. You can also define a custom user agent for your preferred crawler; for instance, the ChatGPT bots are documented here: https://platform.openai.com/docs/bots
Using Dentsu’s Fetch and Render tool (formerly the Merkle technical SEO tool, https://technicalseo.com/tools/fetch-render/):
- Enter the URL of the page
- Set the user agent to ‘Googlebot Smartphone’ (the tool also offers a comprehensive set of AI crawlers)
- Check the tick box for Rendering
JavaScript is playing a growing role in enhancing website functionality, sometimes in essential ways and other times purely for creating “that cool-looking website”.
To ensure your content remains visible to modern search platforms, it is important to avoid JavaScript that unintentionally masks key information, for example content that only appears in the final DOM after the user triggers a JavaScript event such as a click.
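As an illustration of the problem (element IDs and the endpoint are hypothetical), compare content that is injected only after a click with content that ships in the initial HTML and is merely toggled:

```javascript
// Anti-pattern: the description only enters the DOM after a click,
// so a crawler that never triggers the event will never see it.
document.querySelector("#show-more").addEventListener("click", async () => {
  const res = await fetch("/api/description"); // hypothetical endpoint
  document.querySelector("#details").innerHTML = await res.text();
});

// Friendlier pattern: ship the text in the initial HTML and use
// JavaScript (or a native <details> element) only to toggle visibility.
document.querySelector("#toggle").addEventListener("click", () => {
  document.querySelector("#details-visible").classList.toggle("is-open");
});
```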
Google’s Developer Relations expert, Martin Splitt, urges website developers and SEOs to avoid unnecessary complexity and over-reliance on JavaScript.
Consider that some features can be implemented without JavaScript at all (a simple accordion, for example, can be built with the native HTML details element), leading to a better user and SEO experience.
Simplifying your site architecture not only benefits user accessibility but also makes it easier for search engines and AI tools alike to crawl and understand your pages.