Updates to Google Webmaster Tools Make Sites More Crawlable

While JavaScript, CSS, and linked images make websites look good and function properly, they can cause SEO headaches if those resources are blocked from crawling. Now, Google is aiming to remedy that problem by making sure webmasters know exactly which website features are being blocked.
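In practice, resources like these are most often blocked by a site's robots.txt file. A minimal sketch of the problem and the fix (the directory paths here are hypothetical examples, not from the original article):

```
# Problematic: a blanket rule keeps crawlers out of the directories
# that hold the site's CSS and JavaScript, so Googlebot cannot
# render the page the way a visitor sees it.
User-agent: *
Disallow: /assets/css/
Disallow: /assets/js/

# Fixed: explicitly allow those resources so Googlebot can fetch
# and render them while other restrictions stay in place.
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
```

Rules like these are what the new report is designed to surface, pointing webmasters to the specific blocked hosts and files.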

In a blog post this morning, Google said its new reporting feature will begin with the names of blocked hosts. Clicking on a row will diagnose the problem in more detail, showing a list of blocked resources and a step-by-step guide to remedying the issues.

Google is also making it easier to test sites for crawling problems with Fetch and Render, a URL retrieval feature that gives webmasters screenshots of how a page appears to Googlebot and to a typical visitor.

Greater transparency into Googlebot crawling problems affects a number of areas, including whether pages qualify for “Mobile-Friendly” tags.
