Bing has published recommendations on how to optimize websites for search crawlers when they’re built with JavaScript.
Crawling JavaScript sites is more complicated than crawling static HTML sites because they tend to link to many JavaScript files that need to be downloaded from the web server.
This approach is known as client-side rendering, and it requires multiple HTTP requests, compared to the single HTTP request needed to fetch a static HTML page.
Needless to say, requiring dozens of HTTP calls to render a single page is not optimal. Bingbot, however, has ways of dealing with it.
Bingbot can render JavaScript, but it does not support all of the frameworks supported by the latest versions of modern web browsers. Neither does Googlebot, for that matter.
And even when Bingbot can render a page, doing so at scale while keeping HTTP requests to a minimum is still difficult. Googlebot faces the same challenge.
Bing offers the following recommendations to minimize HTTP requests while ensuring its web crawler can render the most complete version of a site every time:
- Program the site to detect the Bingbot user agent
- Prerender the content on the server side and output static HTML
- Utilize dynamic rendering as an alternative to relying heavily on JavaScript
The above recommendations will help increase the predictability of crawling and indexing by Bing and should assist other web crawlers as well.
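To make the recommendations concrete, here is a minimal sketch of dynamic rendering, assuming an Express server written in TypeScript. The bot patterns, file paths, and port are illustrative assumptions, not part of Bing's guidance:

```typescript
// A minimal dynamic rendering sketch (illustrative, not Bing's official code).
import express, { Request, Response } from "express";
import path from "path";

const app = express();

// Substrings that identify known crawler user agents (illustrative list).
const BOT_UA_PATTERNS = [/bingbot/i, /googlebot/i];

function isBot(userAgent: string | undefined): boolean {
  return userAgent !== undefined && BOT_UA_PATTERNS.some((p) => p.test(userAgent));
}

// Dynamic rendering: crawlers get a prerendered static HTML snapshot,
// while regular visitors get the client-side app shell that renders
// in the browser via JavaScript. The content itself is the same.
app.get("*", (req: Request, res: Response) => {
  const file = isBot(req.headers["user-agent"])
    ? path.join(__dirname, "prerendered", "index.html") // static snapshot
    : path.join(__dirname, "public", "index.html"); // client-rendered shell
  res.sendFile(file);
});

app.listen(3000);
```

The important detail, as Bing notes below, is that both branches serve the same content; only where the rendering happens differs.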
JavaScript for Dynamic Rendering = Cloaking?
The inevitable question Bing gets asked about rendering content this way for search crawlers is whether the practice is technically considered cloaking.
As long as the same content is shown to all visitors, it is not considered cloaking, Bing says.
Here is the exact quote:
“The good news is that as long as you make a good faith effort to return the same content to all visitors, with the only difference being the content is rendered on the server for bots and on the client for real users, this is acceptable and not considered cloaking.”
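For context, the server-side snapshot Bing describes as "rendered on the server for bots" is often generated with a headless browser. Here is a sketch assuming Puppeteer; Bing's guidance does not prescribe a specific tool:

```typescript
import puppeteer from "puppeteer";

// Render a URL in a headless browser and capture the resulting HTML,
// so crawlers can be served a fully rendered static snapshot.
async function prerender(url: string): Promise<string> {
  const browser = await puppeteer.launch();
  try {
    const page = await browser.newPage();
    // "networkidle0" waits until the page's JavaScript has finished
    // fetching resources, so the snapshot reflects the rendered content.
    await page.goto(url, { waitUntil: "networkidle0" });
    return await page.content();
  } finally {
    await browser.close();
  }
}
```

Because the snapshot is produced by rendering the same page real users see, serving it to bots stays within the "same content to all visitors" standard Bing describes above.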