For example, in server-side rendering, the server renders the page, and upon a request, the browser receives the fully-rendered HTML.
The third option is dynamic rendering, where client-side-rendered content is served to browsers, while server-side-rendered content is served to search engine crawlers.
The rendering technique you choose determines how your JavaScript is processed, and hence affects your page rankings.
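Dynamic rendering is typically implemented by inspecting the User-Agent header of each request and routing crawlers to pre-rendered HTML. Here is a minimal sketch of that decision logic; the bot list and function names are illustrative, not a standard API:

```javascript
// Sketch of the dynamic-rendering decision: crawlers get pre-rendered
// HTML, regular visitors get the client-side app.
// The pattern list is illustrative; real setups match a fuller bot list.
const BOT_PATTERNS = [/googlebot/i, /bingbot/i, /duckduckbot/i];

function isCrawler(userAgent) {
  return BOT_PATTERNS.some((pattern) => pattern.test(userAgent || ''));
}

// Hypothetical request handler: chooses which rendering path to serve.
function chooseRendering(userAgent) {
  return isCrawler(userAgent) ? 'server-rendered HTML' : 'client-side app';
}
```

In a real server this check would sit in middleware in front of the app, forwarding crawler requests to a pre-render service.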
Fetches a URL from the crawl queue via an HTTP request
Checks the robots.txt file for URLs the site disallows from crawling
Skips the disallowed URLs, parses the response for other URLs, and adds them to the crawl queue
Queues the pages for rendering, except those marked noindex
Parses the rendered HTML again for links
Queues those URLs for crawling
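The crawl-and-queue loop above can be sketched as a small in-memory simulation. The page data and helper names here are made up for illustration; a real crawler fetches over HTTP and runs a headless renderer instead of reading a map:

```javascript
// In-memory sketch of the crawl-queue logic described above.
// `site` maps URL -> { noindex, links } and stands in for real HTTP fetches.
function crawl(site, disallowed, start) {
  const queue = [start];
  const crawled = new Set();
  const rendered = [];

  while (queue.length > 0) {
    const url = queue.shift();
    if (crawled.has(url)) continue;
    // robots.txt check: skip disallowed URLs entirely
    if (disallowed.some((prefix) => url.startsWith(prefix))) continue;
    crawled.add(url);

    const page = site[url];
    if (!page) continue;
    // queue for rendering unless the page is marked noindex
    if (!page.noindex) rendered.push(url);
    // parse the response for further links and add them to the queue
    for (const link of page.links) queue.push(link);
  }
  return { crawled: [...crawled], rendered };
}
```

This simplification merges the crawl and render queues into one pass; in Google's pipeline, rendering is a separate, deferred stage whose output is parsed again for links.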
Potential SEO issue: Search engines like Google cannot render your page if its resources are blocked in your site's robots.txt file. Google also cannot index or render JS and CSS files that are blocked or hidden.
Possible SEO solution: Use anchor links with the href attribute and descriptive anchor text for the links; pseudo links built from <div> or <span> tags are not crawled.

Potential SEO issue: Unless the site uses packages like vue-meta, search engines may crawl the same metadata for every view or page, or, worse, none at all.
Possible SEO solution: Use packages like react-helmet, vue-meta, or react-meta-tags to set per-page metadata.

Potential SEO issue: The search engine crawler does not pick up content marked for lazy loading. The crawler cannot scroll, so some of that content may never be rendered.
Possible SEO solution: Use the IntersectionObserver API, which detects the visibility and position of DOM elements as they become available. You can also use the browser's native lazy-loading feature (supported in Chrome).

Potential SEO issue: Slow page load times caused by large JavaScript bundles.
Possible SEO solution: Inline critical JS and defer non-critical JS until the main content is rendered, reducing the overall JS payload.
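As a concrete example of the lazy-loading fix above, here is a minimal IntersectionObserver sketch. It is browser-only code; the guard lets it no-op in other environments, and the `data-src` attribute here is the lazy-loader's own marker for deferred image URLs:

```javascript
// Sketch: lazy-load images with IntersectionObserver, swapping in the
// real URL only when an image approaches the viewport.
// Browser-only; the guard makes it a no-op elsewhere (returns false).
function lazyLoadImages() {
  if (typeof IntersectionObserver === 'undefined' || typeof document === 'undefined') {
    return false; // environment without the API (e.g. Node, old browsers)
  }
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src; // load the real image
        obs.unobserve(img);        // each image only needs one load
      }
    }
  });
  document.querySelectorAll('img[data-src]').forEach((img) => observer.observe(img));
  return true;
}
```

For simple cases, the native `loading="lazy"` attribute on an `<img src>` tag achieves the same effect without any script, and keeps the real URL in `src` where crawlers can see it.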
By following some of the best practices, we can get search engines to crawl and render the pages better:
Add links and images as per defined web standards
Add all links using the a href tag rather than onclick, #pageurl, or window.location.href='/page-url'. Google can easily crawl and follow such links:
<a href="http://geekflare.com">Welcome to Geek world</a>
Similarly, add images using the img src attribute and not the img data-src attribute:
<img src="myimg.png" />
Prefer server-side rendering
Make sure your website's content is available on the server, not only in the user's browser.
Ensure your rendered HTML has all the important content you want to show
The rendered HTML should have the correct title, meta robots, meta descriptions, images, structured data, and canonical tags.
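As a rough self-check before reaching for external tools, you can scan your rendered HTML for those tags. The following is a simplified regex sketch, not a full HTML parser, and the tag list is only the subset named above:

```javascript
// Simplified check that rendered HTML contains the important SEO tags.
// Regexes are a rough sketch; a real audit should use an HTML parser.
const REQUIRED_TAGS = {
  title: /<title>[^<]+<\/title>/i,
  description: /<meta\s+name=["']description["']/i,
  canonical: /<link\s+rel=["']canonical["']/i,
};

// Returns the names of required tags that are missing from the HTML.
function missingSeoTags(html) {
  return Object.keys(REQUIRED_TAGS).filter((name) => !REQUIRED_TAGS[name].test(html));
}
```

Running this against the rendered (post-JavaScript) HTML rather than the raw source is what reveals gaps a client-side app leaves for crawlers.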
Check if Googlebot gets all the important content and tags: You can use the Google mobile-friendly test tool or the rich results test tool to check how Googlebot uses the raw HTML to render content.
You can check for the important tags (title, meta description, etc.) on the rendered HTML using the SEO Pro Chrome extension.