
New Google Index Report Doesn’t Show Blocked URLs From Sitemap Report

Alan Bleiweiss shared a screenshot of how the Google Sitemaps report in Google Search Console shows over 11,000 URLs that were blocked by robots.txt, flagged as an error and warning. Alan asked why the new Google Index report in the new Google Search Console is not reporting on these errors.

John Mueller said on Twitter that the new report won't report an error on sample URLs at the sitemap submission level. He said, "those are sample URLs tested before being submitted to indexing – that is done at the sitemap submission, so it wouldn't be in the indexing report in the new SC."

Here is Alan's screenshot:

click for full size

I should note that in this case, I block only one URL via the robots.txt file, and it shows as an error both in my Sitemap submission and in the Index report.
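For context, blocking a single URL like this takes only two lines in robots.txt (the path below is a hypothetical example, not the actual URL from my test):

User-agent: *
Disallow: /example-blocked-page/

Any URL matching that path prefix would then be reported as blocked by robots.txt when submitted in a sitemap.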

Sitemap document:

click for full size

New Index protection document:

Forum discussion at Twitter.