SEO for websites built with JavaScript.
React, Vue, Angular — these tools build beautiful, fast web apps. But they can also make your website completely invisible to Google and AI search engines unless you take specific steps to fix it.
When a beautiful website gets zero search traffic.
Rafael spent four months and a significant portion of his startup budget building a web application in React. The product was a SaaS platform for small restaurants — online ordering, menu management, customer reviews. His developer built a gorgeous single-page application (SPA) with smooth animations, fast page transitions, and a modern feel. Every visitor who landed on the site loved it.
The problem? Almost nobody landed on the site. Three months after launch, Rafael checked Google Search Console and discovered that Google had indexed exactly two pages out of forty-seven. His product pages, pricing page, feature comparisons, blog posts, and customer testimonials — all invisible. When someone searched “restaurant ordering system Philippines” on Google, Rafael’s site was nowhere. When they asked ChatGPT or Perplexity for recommendations, his platform was never mentioned.
Rafael’s site was not broken. His hosting was fine. His domain was properly connected. The content was genuinely good. The problem was how the site was built. His React application loaded as an empty HTML shell and relied entirely on JavaScript running in the visitor’s browser to fill in the actual content. Google’s crawlers arrived, saw a mostly blank page, and moved on. AI crawlers saw even less.
This is the JavaScript SEO problem, and it affects thousands of business websites every day. If your site was built with a JavaScript framework — React, Vue.js, Angular, Svelte, or similar tools — this article explains exactly what is happening and how to fix it.
Why JavaScript makes websites invisible to search engines.
To understand this problem, you need to know how a traditional website works versus how a JavaScript-powered website works. Do not worry — no coding knowledge required. Think of it like two different ways of running a restaurant.
The traditional website (server-side rendering)
A traditional website works like a restaurant where the kitchen prepares your meal completely before the waiter brings it to your table. When you (or a search engine) request a page, the server assembles the full page — headings, text, images, links, everything — and sends the finished product. Google’s crawler arrives, reads the complete page, and indexes every word. Simple. Reliable. This is how WordPress, Shopify, Wix, Squarespace, and plain HTML sites work.
The JavaScript website (client-side rendering)
A JavaScript-powered website works like a restaurant that sends you a plate, a bag of raw ingredients, and a recipe, and asks you to cook the meal yourself at your table. The server sends a nearly empty HTML page with a set of JavaScript instructions. Your browser (the “client”) runs those instructions and builds the visible content right there on your device.
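In concrete terms, the raw response from a client-side-rendered site often looks something like the fragment below. The filenames and ids are illustrative, not taken from any specific site:

```html
<!-- Roughly what a crawler receives from a typical single-page application.
     The entire visible page gets built into the empty div by JavaScript. -->
<!DOCTYPE html>
<html>
  <head>
    <title>My App</title>
  </head>
  <body>
    <div id="app"></div>
    <script src="/static/js/main.a1b2c3.js"></script>
  </body>
</html>
```

Every heading, paragraph, and link your visitors see is produced by that one script file after it downloads and runs. Until then, the page is an empty shell.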
For a human visitor with a modern browser, this works perfectly. The page appears fast and feels smooth. But when Google’s crawler arrives, it gets that bag of raw ingredients instead of a finished meal. Google has to “cook” the page itself — executing all the JavaScript to see what the final result looks like. This process is called rendering, and it introduces several problems:
- It takes extra time. Google does not render every page immediately. It first reads the raw HTML (fast pass), then queues the page for rendering (slow pass). The rendering step can happen minutes, hours, or even days later.
- It is not always accurate. Some JavaScript does not render correctly in Google’s environment. If your code depends on user interactions, specific browser features, or has any errors, Google may see a broken or incomplete page.
- It costs Google resources. Rendering JavaScript is computationally expensive. Google has limited rendering capacity and prioritises popular, authoritative sites. A new startup like Rafael’s goes to the back of the queue.
- Delayed content is delayed indexing. Even when rendering works, the gap between discovery and rendering means your content enters Google’s index later than it would with a traditional site.
Real example — Ivy’s portfolio problem:
Ivy is a graphic designer who built her portfolio website using Vue.js. She chose Vue because it let her create beautiful page transitions and interactive project showcases. Her portfolio looked stunning. But when she checked Google’s cached version of her homepage, she saw what Google actually stored: a page with her site name and a single line of text. All her project thumbnails, descriptions, client testimonials, and service offerings were missing. Google had crawled the raw HTML but had not rendered the JavaScript that built her actual content. Her portfolio was effectively a blank business card in Google’s eyes.
What Google can render — and what it cannot.
Google has invested heavily in JavaScript rendering. Their crawler (Googlebot) uses an evergreen version of Chromium (the technology behind Chrome) to execute JavaScript. In theory, this means Google can render most modern JavaScript frameworks. In practice, there are important gaps.
What Google handles well:
- Standard React, Vue, and Angular applications that do not depend on user interaction to display content
- JavaScript that runs on page load without needing clicks, scrolls, or form submissions
- Pages that do not require authentication or cookies to display content
- Websites with clean, error-free JavaScript code
What Google struggles with:
- Content behind interactions. If a user needs to click a tab, scroll to a section, or expand an accordion to see content, Google may never trigger those interactions. Content hidden behind clicks is often invisible to Google.
- Content loaded after delays. If your JavaScript waits for external data (like pulling from an API) before displaying content, Google may give up before the data arrives.
- JavaScript errors. Any error in your JavaScript can stop rendering entirely. Your visitors might never notice because browsers are forgiving. Google’s renderer is less patient.
- Infinite scroll. Pages that load content as the user scrolls down (like social media feeds) are difficult for Google to fully process. Google does not scroll.
- Content requiring authentication. If your JavaScript needs to check if a user is logged in before showing content, Google’s crawler is never logged in.
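To make the “content behind interactions” and “content loaded after delays” problems concrete, here is an illustrative fragment (the endpoint and markup are hypothetical, not from any real site) where the content never exists in the HTML until a user clicks:

```html
<!-- The review text below is absent from the raw HTML. It only appears
     after a click triggers a network request, and crawlers do not click. -->
<button id="show-reviews">Show customer reviews</button>
<div id="reviews"></div>
<script>
  document.getElementById('show-reviews').addEventListener('click', async () => {
    const res = await fetch('/api/reviews'); // hypothetical endpoint
    const reviews = await res.json();
    document.getElementById('reviews').innerHTML =
      reviews.map((r) => `<p>${r.text}</p>`).join('');
  });
</script>
```

A crawler that reads only the raw HTML sees a button and an empty div. Even a crawler that renders JavaScript will not click the button, so the reviews stay invisible.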
Real example — Anthony’s news aggregator:
Anthony built a news aggregation website using Angular. The site pulled articles from multiple APIs and displayed them in a beautiful, interactive grid. The problem was that all the content loaded asynchronously — the JavaScript had to call three different APIs, wait for responses, and then build the page. Google’s crawler arrived, saw the empty shell, attempted to render the JavaScript, but the API calls timed out in Google’s rendering environment. Result: Google indexed Anthony’s site with zero articles. The site had thousands of articles, but Google could not see a single one.
AI crawlers have an even harder time with JavaScript.
If Google’s crawler struggles with JavaScript, AI crawlers are in far worse shape. Here is the reality in 2026:
The crawlers behind ChatGPT (GPTBot), Perplexity (PerplexityBot), Google’s Gemini, and Microsoft Copilot mostly read raw HTML. They behave more like simple web readers than full browsers. When they visit your JavaScript-powered website, they see the empty HTML shell — not the finished page your visitors see.
This means:
- If your business information, product descriptions, or service details are built by JavaScript, AI search tools literally cannot read them
- When someone asks ChatGPT “what is the best restaurant ordering platform in the Philippines,” your site will never be cited if the AI crawler could not read your content
- Perplexity, which provides source citations with its answers, can only cite pages it has successfully crawled and read — JavaScript-only content is excluded
- Even Google’s own Gemini AI relies on Google’s index, which may have incomplete data if your JavaScript did not render properly
The bottom line: a JavaScript-only website in 2026 is invisible not just to traditional search but to the entire AI search ecosystem. Every AI assistant that might recommend your business, cite your content, or answer questions about your industry cannot see what you have built.
How to check if your JavaScript content is visible to Google.
Before panicking, check whether your site actually has a JavaScript SEO problem. Many business owners do not even know what technology their site uses. Here are simple ways to find out.
Step 1: The “View Source” test
Open your website in Chrome. Right-click anywhere on the page and select “View Page Source.” This shows you the raw HTML that arrives from your server — the same thing Google’s crawler sees on its first pass.
If you see all your text content, headings, product descriptions, and page information in the source code, you are fine. Your site serves content in HTML and does not depend on JavaScript for SEO-critical content.
If you see mostly JavaScript code, a nearly empty <body> tag, or something like <div id="app"></div> with nothing else, your content is being built by JavaScript. You have a potential problem.
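If you want to automate the View Source test, a few lines of code can check whether key phrases appear in the raw HTML. This is a minimal sketch with invented sample HTML; in practice you would pass it the HTML fetched from your own server:

```javascript
// Check whether key phrases appear in the *raw* HTML a server returns:
// the same first-pass view that Googlebot and most AI crawlers get.
function isContentInRawHtml(rawHtml, phrases) {
  const text = rawHtml.toLowerCase();
  return phrases.every((phrase) => text.includes(phrase.toLowerCase()));
}

// A client-side-rendered shell: the content is missing from the HTML.
const csrShell = '<html><body><div id="app"></div></body></html>';

// A server-rendered page: the content ships in the HTML itself.
const ssrPage =
  '<html><body><h1>Restaurant Ordering System</h1>' +
  '<p>Online ordering for small restaurants.</p></body></html>';

console.log(isContentInRawHtml(csrShell, ['restaurant ordering'])); // false
console.log(isContentInRawHtml(ssrPage, ['restaurant ordering'])); // true
```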
Step 2: The Google Search Console test
Go to Google Search Console, use the URL Inspection tool, and enter any page from your site. Click “Test Live URL,” then click “View Tested Page” and look at the screenshot. If the screenshot shows your complete page with all content, Google can render it. If the screenshot is blank, missing content, or broken, Google cannot render your JavaScript properly.
Step 3: The “site:” search test
Type site:yourdomain.com into Google. Look at the results. Do your pages appear with proper titles and meaningful descriptions? Or do they show generic text, your framework name, or the same description repeated across every result? If the snippets look wrong or identical, Google is not reading your actual content.
Step 4: The raw-HTML fetch test
Google retired its public cached-page feature in 2024, so the old “view cached version” trick no longer works. Instead, look at the raw HTML the way a crawler does: open a terminal and run curl -s https://yourdomain.com (or use any online HTTP response viewer) and search the output for your actual content. If the raw response is mostly empty while your live page is full of content, you have confirmed a JavaScript rendering problem.
Real example — Manny’s one-page portfolio:
Manny is a freelance web developer who ironically had his own JavaScript SEO problem. He built a single-page portfolio using a JavaScript framework with fancy scroll animations, dynamic project loading, and a contact form that appeared via JavaScript. When he ran the “View Source” test, his page source showed exactly this: a page title, a single <div> element, and five JavaScript files. Zero readable content. Google had indexed his page with only his name as the title and no description. When potential clients searched for “freelance web developer” in his area, his portfolio was buried under hundreds of competitors whose sites Google could actually read.
Three proven solutions for JavaScript SEO.
If your site has a JavaScript rendering problem, there are three main approaches to fix it. Each has different costs, complexity, and trade-offs. The right choice depends on your situation.
Solution 1: Server-Side Rendering (SSR)
Server-side rendering means your server builds the complete HTML page before sending it to the visitor (or crawler). Instead of sending raw ingredients and a recipe, you send the finished meal. The JavaScript framework still runs in the browser for interactivity, but the initial page load includes all the content in readable HTML.
How it works in practice: If your site uses React, you can switch to Next.js (a React framework that includes SSR by default). If you use Vue.js, you switch to Nuxt.js. For Angular, there is Angular Universal. These tools keep all your existing components and code but change how the first page load is delivered.
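The core idea can be sketched in a few lines of plain Node.js. This is not actual Next.js code, just an illustration of the principle with invented product data: the server assembles the complete HTML, content included, before sending a single byte:

```javascript
// A minimal server-side rendering sketch: the full page, content included,
// exists as HTML before it leaves the server. Product data is illustrative.
function renderProductPage(product) {
  return `<!DOCTYPE html>
<html>
  <head><title>${product.name} | Pricing</title></head>
  <body>
    <h1>${product.name}</h1>
    <p>${product.description}</p>
    <p>Starting at ${product.price}/month</p>
  </body>
</html>`;
}

const html = renderProductPage({
  name: 'Restaurant Ordering',
  description: 'Online ordering and menu management for small restaurants.',
  price: '$29',
});

// A crawler reading this response finds every word without running any JS.
console.log(html.includes('Online ordering and menu management')); // true
```

Frameworks like Next.js and Nuxt.js perform this assembly for you on each request, then “hydrate” the same page in the browser so your interactive features keep working.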
Pros: Best SEO performance. Search engines and AI crawlers receive complete pages. Fastest time to content for visitors. Works perfectly with every crawler.
Cons: Requires significant code changes. Your developer needs experience with SSR frameworks. Server costs may increase because the server does more work per request. Can take weeks to migrate an existing application.
Real example — Celeste’s travel agency migration:
Celeste runs a travel agency that had a beautiful single-page application built with React. Her site featured destination pages, tour packages, pricing calculators, and customer reviews — all rendered client-side. After nine months of low organic traffic, she hired a developer to migrate to Next.js with server-side rendering. The migration took three weeks. Within two months of launching the SSR version, her indexed pages jumped from 8 to 94. Within four months, organic traffic increased by over 300 percent. AI search tools started citing her destination guides in travel-related queries. The migration cost roughly one month of the paid-ad spend that had, until then, been her only source of traffic.
Solution 2: Pre-rendering
Pre-rendering is a middle-ground solution. Your JavaScript application stays exactly as it is. But you add a service that detects when a search engine or AI crawler visits and serves them a pre-built HTML version of each page instead of the JavaScript application.
How it works in practice: A pre-rendering service (like Prerender.io or a self-hosted solution using Puppeteer) visits every page on your site in advance, executes the JavaScript, captures the final HTML output, and stores it. When Google or an AI crawler requests a page, the server checks if the visitor is a crawler. If yes, it serves the pre-rendered HTML. If it is a regular visitor, it serves the normal JavaScript application.
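The crawler-detection step boils down to inspecting the visitor’s User-Agent header. Below is a minimal sketch; the bot names are real crawler identifiers, but a production setup should rely on a maintained list (as services like Prerender.io do) rather than a short hand-rolled regex:

```javascript
// Detect common search and AI crawlers by User-Agent substring.
// The names are real crawler identifiers; the list is deliberately
// short and incomplete. Use a maintained list in production.
const CRAWLER_PATTERN =
  /googlebot|bingbot|gptbot|perplexitybot|claudebot|duckduckbot/i;

function isCrawler(userAgent) {
  return CRAWLER_PATTERN.test(userAgent || '');
}

// In a server middleware you would branch on this result: crawlers get
// the stored pre-rendered HTML, human visitors get the normal SPA.
console.log(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1)')); // true
console.log(isCrawler('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0')); // false
```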
Pros: No changes to your existing JavaScript code. Relatively quick to set up (days, not weeks). Works with any JavaScript framework. Solves the AI crawler problem immediately.
Cons: Pre-rendered pages can become stale if not regenerated regularly. Adds a layer of complexity to your hosting setup. Some pre-rendering services charge monthly fees. Google technically discourages serving different content to crawlers versus users (called “cloaking”), though pre-rendering that shows the same content is generally accepted.
Solution 3: Hybrid Rendering (Static Site Generation)
Hybrid rendering combines the best of both worlds. Pages that rarely change (your homepage, about page, service descriptions, blog posts) are pre-built as static HTML files at build time. Pages that need real-time data (search results, user dashboards, shopping carts) are rendered on the server per request or on the client.
How it works in practice: Frameworks like Next.js, Nuxt.js, and Gatsby support this out of the box. Your developer marks each page or template as either static (built once at deploy time) or dynamic (rendered on each request). Most business websites can make 80 to 90 percent of their pages static, which gives excellent SEO performance and fast load times.
Pros: Best performance for static pages. Excellent SEO. Lower server costs than full SSR because most pages are pre-built. Flexible — dynamic pages still work as needed.
Cons: Requires framework migration if you are not already using a compatible tool. Static pages need to be rebuilt when content changes, which adds a deployment step. Build times can be long for sites with thousands of pages.
Which solution should you choose?
Here is a simple decision guide:
- If you are building a new site: Start with Next.js (React), Nuxt.js (Vue), or SvelteKit (Svelte). These handle SSR and static generation from the beginning.
- If you have an existing SPA and limited budget: Add pre-rendering. It is the fastest fix with the least disruption.
- If you have an existing SPA and can invest in a proper rebuild: Migrate to an SSR framework. The long-term SEO benefits justify the investment.
- If most of your content is static: Use static site generation (hybrid approach). It is the most efficient option for content-focused business websites.
When JavaScript SEO matters — and when it does not.
Not every website has a JavaScript SEO problem. Here is how to know if this article applies to your situation.
JavaScript SEO matters if:
- Your website was custom-built using React, Vue, Angular, Svelte, or another JavaScript framework
- Your developer built a single-page application (SPA)
- When you “View Source” on your site, you see mostly JavaScript code instead of your actual content
- Google Search Console shows far fewer indexed pages than your site actually has
- You rely on organic search traffic to find customers
- You want AI search tools to discover and cite your content
JavaScript SEO does not matter if:
- Your site uses WordPress, Shopify, Wix, or Squarespace — these platforms serve content as traditional HTML
- Your site is built with plain HTML and CSS files
- Your site uses a static site generator like Hugo, Jekyll, or Eleventy (these produce plain HTML)
- Your JavaScript only handles interactive features (menus, sliders, forms) while the main content is in HTML
- Your application is behind a login and you do not need public search visibility (like internal dashboards or admin tools)
The key distinction: if JavaScript is responsible for displaying your core content (the text, descriptions, and information you want Google and AI crawlers to index), then JavaScript SEO is critical. If JavaScript only handles interactive enhancements on top of HTML-delivered content, you have nothing to worry about.
A quick check for business owners:
Ask your developer one question: “Does our website work — showing all our main content — if a visitor has JavaScript disabled in their browser?” If the answer is no, you need to address JavaScript SEO. If the answer is yes, your content is already in the HTML and you are in good shape.
What to do right now.
- Find out what your site is built with. Ask your developer or check the source code. If they mention React, Vue, Angular, or “SPA,” keep reading.
- Run the View Source test. Right-click your homepage, click “View Page Source,” and look for your actual content in the HTML. If it is missing, you have a JavaScript rendering problem.
- Check Google Search Console. Use the URL Inspection tool to see if Google can render your pages. Look at the screenshot to confirm your content is visible.
- Count your indexed pages. Search site:yourdomain.com on Google. Compare the number of results to the number of pages on your site. A big gap signals a rendering problem.
- Choose a solution. For quick wins, implement pre-rendering. For long-term results, migrate to an SSR framework like Next.js or Nuxt.js. For new projects, start with SSR from day one.
- Test after implementing. Re-run the Google Search Console test and the View Source test after your fix is deployed. Confirm that Google can now see your content.
Frequently asked questions about JavaScript SEO.
Why does JavaScript cause SEO problems?
JavaScript causes SEO problems because search engines need to see your content in the raw HTML code that arrives when they request your page. Many JavaScript frameworks build the page content inside the visitor’s browser instead of on the server. When Google requests the page, it receives a nearly empty HTML file with instructions to build the content later. Google can sometimes execute that JavaScript and see the content, but the process is slower, less reliable, and not guaranteed. AI crawlers from ChatGPT, Perplexity, and others often cannot execute JavaScript at all, meaning they see a blank page.
What is the difference between client-side rendering and server-side rendering?
Client-side rendering means the website content is built inside the visitor’s browser using JavaScript after the page loads. The server sends a mostly empty HTML shell, and JavaScript fills it in. Server-side rendering means the server builds the complete HTML page before sending it to the browser. Search engines and AI crawlers receive a fully formed page with all content visible immediately. Server-side rendering is far better for SEO because crawlers can read all your content without needing to execute JavaScript.
Can Google render JavaScript content?
Yes, Google can render JavaScript, but with important limitations. Google uses a two-phase indexing process: first it reads the raw HTML, then it queues the page for rendering (executing the JavaScript). The rendering step can be delayed by seconds, minutes, or even days depending on Google’s resources and your site’s crawl priority. Some JavaScript content may not render correctly if it depends on user interactions, requires authentication, or has errors. For reliable SEO, you should not depend on Google rendering your JavaScript correctly every time.
Do AI search crawlers like ChatGPT and Perplexity render JavaScript?
Most AI crawlers do not render JavaScript. They behave more like simple web scrapers that read the raw HTML response from your server. If your content only exists after JavaScript runs in a browser, most AI crawlers will see a blank or nearly empty page. This means JavaScript-heavy websites are often completely invisible to AI search tools like ChatGPT, Perplexity, Gemini, and Copilot. Server-side rendering or pre-rendering solves this problem by ensuring content is present in the initial HTML.
What is pre-rendering and how does it help JavaScript SEO?
Pre-rendering is a technique where you generate static HTML versions of your JavaScript pages in advance. When a search engine or AI crawler requests a page, the server detects the crawler and serves the pre-rendered HTML version instead of the JavaScript application. Regular visitors still get the interactive JavaScript version. Pre-rendering gives you the SEO benefits of server-side rendering without rebuilding your entire application. Services like Prerender.io can handle this automatically.
How do I check if Google can see my JavaScript website content?
Use the URL Inspection tool in Google Search Console. Enter your page URL and click “Test Live URL,” then view the rendered page screenshot. If the screenshot shows your content, Google can render it. If it shows a blank or broken page, Google cannot see your content. You can also use the “View Tested Page” option to inspect the rendered HTML produced after Google executed your JavaScript. Another quick test: search site:yourdomain.com on Google and check if your pages appear with proper titles and descriptions. Missing or generic snippets suggest rendering problems.
Does my small business website need to worry about JavaScript SEO?
It depends on how your website was built. If your site uses WordPress, Shopify, Wix, Squarespace, or plain HTML, you probably do not need to worry about JavaScript SEO because these platforms serve content as regular HTML. If your site was custom-built using React, Vue, Angular, or another JavaScript framework, then yes, JavaScript SEO is critical. Ask your developer what technology your site uses. If they mention a single-page application, a JavaScript framework, or client-side rendering, you need to ensure proper SEO measures are in place.
Terms used in this article.
- JavaScript framework
- A pre-built collection of JavaScript code that developers use to build websites and web applications. Popular examples include React, Vue.js, Angular, and Svelte.
- Single-page application (SPA)
- A website that loads a single HTML page and dynamically updates content using JavaScript instead of loading new pages from the server. Common in modern web apps but challenging for SEO.
- Client-side rendering (CSR)
- When the visitor’s browser builds the page content by executing JavaScript. The server sends a mostly empty HTML file and the browser fills it in.
- Server-side rendering (SSR)
- When the server builds the complete HTML page before sending it to the visitor or crawler. This ensures all content is visible without needing JavaScript to run.
- Pre-rendering
- A technique that generates static HTML snapshots of JavaScript pages in advance, serving these to search engines and AI crawlers while regular visitors get the interactive JavaScript version.
- Static site generation (SSG)
- Building HTML pages at deploy time rather than on each request. Pages are pre-built and served as plain HTML files, providing excellent SEO performance and fast load times.
- Rendering
- The process of executing JavaScript to produce the final visible page. Google’s crawler can render JavaScript, but the process is delayed, resource-intensive, and not always reliable.
- Hydration
- The process where a server-rendered HTML page becomes interactive in the browser. The server sends complete HTML, then JavaScript activates to add interactivity without rebuilding the content.
Bottom line: If your website was built with a JavaScript framework like React, Vue, or Angular, there is a real chance that Google and AI search tools cannot read your content. Rafael lost months of potential traffic because his React SPA was invisible to crawlers. Ivy’s portfolio was a blank page in Google’s eyes. Celeste fixed the problem by migrating to server-side rendering and saw her organic traffic increase by over 300 percent. The fix is not complicated, but it requires knowing the problem exists. Run the View Source test today. If your content is not in the HTML, talk to your developer about server-side rendering, pre-rendering, or static site generation. Do not let a beautiful website remain invisible.
Is your JavaScript website invisible to search engines?
A properly built website should be discoverable by Google, Bing, and every AI search engine from day one. If yours is not, let’s fix that.