Google Search Console’s Index Coverage report is receiving four updates to keep site owners better informed about indexing issues.
The Index Coverage report is newer than other Search Console reports, as it was first introduced with the revamped version of Search Console that launched in 2018.
Since the launch of the Index Coverage report, site owners have been sharing feedback with Google about improvements they would like to see.
The changes rolling out to the Index Coverage report today are based on that feedback from the webmaster community.
“Based on the feedback we have received from the community, today we are making significant improvements to this report so that you are better informed on issues that may prevent Google from crawling and indexing your pages. This change focuses on providing a more accurate state for existing issues, which should help you resolve them more easily.”
Changes to the Search Console Index Coverage report
The full list of changes to the Index Coverage report in Search Console includes:
- Removal of the generic “crawl anomaly” issue type – all crawl errors should now be mapped to an issue with a finer resolution.
- Pages that were submitted but blocked by robots.txt and got indexed are now reported as “indexed but blocked” (warning) instead of “submitted but blocked” (error).
- Addition of a new issue: “indexed without content” (warning)
- Soft 404 reporting is now more accurate
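For context on the last bullet: a soft 404 is a page that returns an HTTP 200 status code but whose content reads like an error page. The heuristic below is a simplified sketch of that idea in Python — the phrase list and function name are illustrative, not Google’s actual classifier.

```python
# Hypothetical soft-404 heuristic: an HTTP 200 response whose body
# looks like an error page. The phrase list is illustrative only.
NOT_FOUND_PHRASES = ("page not found", "404", "no longer available")

def looks_like_soft_404(status_code: int, body: str) -> bool:
    if status_code != 200:
        return False  # a real 404/410 is not a *soft* 404
    lowered = body.lower()
    return any(phrase in lowered for phrase in NOT_FOUND_PHRASES)

print(looks_like_soft_404(200, "<h1>Page Not Found</h1>"))  # True
print(looks_like_soft_404(404, "<h1>Page Not Found</h1>"))  # False
print(looks_like_soft_404(200, "<h1>Welcome</h1>"))         # False
```

The fix for an actual soft 404 is usually to return a real 404 status, redirect to a relevant page, or flesh out thin content — not to adjust the detection.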
The overarching theme of these updates seems to be accuracy of data.
There will be no more guesswork involved with crawl errors, as the “crawl anomaly” issue type is being replaced with specific issues and resolutions.
Site owners will now know with more certainty whether a page indexed by Google is being blocked by their site, as the report says “indexed but blocked” instead of “submitted but blocked.” Submitting a URL is not the same as having it indexed, and the report is now updated to reflect that.
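If you want to verify on your own whether a URL is disallowed for Googlebot, Python’s standard library can parse robots.txt rules. A minimal sketch, using hypothetical rules and example.com as a placeholder domain:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration.
rules = """
User-agent: Googlebot
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A disallowed path returns False; an open path returns True.
print(parser.can_fetch("Googlebot", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post.html"))     # True
```

In practice you would point `set_url()` at your live robots.txt and call `read()` instead of parsing a string; note that a disallow rule prevents crawling, not indexing — which is exactly the distinction the renamed issue types capture.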
Soft 404 reporting is said to be more accurate, and there’s a brand new issue type called “indexed without content.” Let’s take a closer look at that one.
Here’s how the Search Console help page describes “indexed without content”:
“This page appears in the Google index, but for some reason Google could not read the content. Possible reasons are that the page might be cloaked to Google or the page might be in a format that Google can’t index. This is not a case of robots.txt blocking.”
If you come across the “indexed without content” warning, it means the URL is in Google’s index but its web crawler cannot see the content.
This could mean you have accidentally published a blank page, or there’s an error on the page that is preventing Google from rendering the content.
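One rough way to catch accidentally blank pages before Google does is to check whether a page’s HTML contains any visible text at all. The sketch below uses Python’s built-in HTML parser; it does not execute JavaScript the way Google’s renderer does, so treat it only as a first-pass check.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, ignoring script and style contents."""
    def __init__(self):
        super().__init__()
        self.text = []
        self._skip = 0  # depth inside <script>/<style> tags

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

def has_visible_content(html: str) -> bool:
    """Return True if the HTML contains any non-whitespace visible text."""
    extractor = TextExtractor()
    extractor.feed(html)
    return bool(extractor.text)

print(has_visible_content("<html><body></body></html>"))             # False
print(has_visible_content("<html><body><p>Hello</p></body></html>"))  # True
```

A page that fails this check (or whose content only appears after JavaScript runs) is a candidate for the kind of page Google would flag as indexed without content.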
For further guidance on resolving the indexed without content error, I recommend site owners run the affected pages through Google’s URL Inspection tool.
The URL Inspection tool will render the page as Google sees it, which can help with understanding why the content is not visible to Google’s web crawler.
These changes are now reflected in the Index Coverage report, so site owners may see new types of issues, or changes in the counts of existing issues.
See Google’s official blog post for more information.