Google Search Console (GSC) is a free tool that helps webmasters monitor and optimize their websites for Google search. One of the features of GSC is the Index Coverage report, which shows the status of the URLs that Google has crawled or tried to crawl on your site. The status can be one of the following:
• Valid: The URL is indexed by Google and can appear in search results.
• Valid with warnings: The URL is indexed by Google, but there are some issues that may affect its performance or appearance in search results.
• Excluded: The URL is not indexed by Google, either because it was blocked by the webmaster, redirected to another URL, or not found by Google.
• Error: The URL is not indexed by Google, because there was an error when Google tried to crawl or index it.
Causes
One of the common errors that webmasters encounter in the Index Coverage report is the “Discovered – currently not indexed” status. This means that Google has discovered the URL, either from a sitemap, a link, or another source, but has not crawled or indexed it yet. This can happen for various reasons, such as:
• Google has postponed crawling the URL to avoid overloading your site.
• Google has prioritized crawling other URLs that are more important or relevant for your site.
• Google has encountered some technical issues that prevented it from crawling or indexing the URL.
How to Fix Discovered – Currently Not Indexed
If you see this status for some of your URLs, you may wonder how to fix it and get them indexed by Google. Here are some steps that you can take to troubleshoot and resolve this issue:
1. Request indexing for the URL: This is a simple way to tell Google to crawl and index your URL as soon as possible. To do this, go to GSC and click on URL inspection. Enter the URL that you want to index and click on Request indexing. If there are no issues with your URL, you should see a message that says “URL was added to a priority crawl queue”. However, this does not guarantee that your URL will be indexed immediately, as it depends on Google’s resources and algorithms. Also, there is a limit on how many URLs you can request indexing for per day.
2. Check your robots.txt file: This is a file that tells Google which URLs on your site it can or cannot crawl. If your robots.txt file blocks Google from crawling your URL, it will not be indexed. To check this, go to GSC and click on URL inspection. Enter the URL that you want to index and look for the Crawl allowed? section. If it says No, it means that your robots.txt file is blocking Google from crawling your URL. To fix this, you need to edit your robots.txt file and remove the rule that blocks your URL.
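As a quick offline check, Python's standard urllib.robotparser module can evaluate whether a given set of robots.txt rules blocks Googlebot from a URL. The rules and URLs below are illustrative; substitute your site's actual robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt rules; replace with your site's actual file.
robots_txt = """\
User-agent: *
Disallow: /private/

User-agent: Googlebot
Disallow: /drafts/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot matches its own group, so it is blocked from /drafts/
# but allowed to crawl /blog/.
print(rp.can_fetch("Googlebot", "https://example.com/drafts/post"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
```

Note that a crawler uses the most specific User-agent group that matches it, so a Googlebot-specific group overrides the * rules.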
3. Check your meta tags: These are tags that provide information about your web pages to Google and other search engines. Some meta tags can affect whether your URL is indexed or not, such as:
• <meta name="robots" content="noindex">: This tells Google not to index your URL.
• <meta name="robots" content="nofollow">: This tells Google not to follow any links on your URL.
• <meta name="googlebot" content="noindex">: This tells Googlebot specifically not to index your URL.
If you have any of these meta tags on your URL, it will not be indexed by Google. To fix this, you need to remove these meta tags from your HTML code.
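A small script using Python's built-in html.parser can scan a page's HTML for these blocking directives. The sample HTML below is illustrative; in practice you would feed it the fetched source of your own page:

```python
from html.parser import HTMLParser

class RobotsMetaScanner(HTMLParser):
    """Collects the content values of robots/googlebot meta tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() in ("robots", "googlebot"):
            self.directives.append((attrs.get("content") or "").lower())

# Illustrative page source; fetch your real page's HTML instead.
html = '<html><head><meta name="robots" content="noindex, nofollow"></head></html>'

scanner = RobotsMetaScanner()
scanner.feed(html)
blocked = any("noindex" in d for d in scanner.directives)
print(blocked)  # True: this page asks not to be indexed
```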
4. Check your canonical tag: This is a tag that tells Google which version of a URL is the preferred one to index and show in search results. For example, if you have two versions of a URL, such as https://example.com/page and https://example.com/page/, you can use a canonical tag to tell Google which one is the canonical version and which one is the duplicate version. However, if you use a canonical tag incorrectly, it can cause indexing issues for your URL. For example, if you use a canonical tag that points to a different domain or a non-existent page, Google may not index your URL. To fix this, you need to make sure that your canonical tag points to the correct version of your URL.
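The same html.parser approach can extract a page's canonical tag so you can verify where it points. The HTML and hostname below are illustrative; the example flags the off-site canonical misconfiguration described above:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class CanonicalFinder(HTMLParser):
    """Finds the href of a <link rel="canonical"> tag, if present."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonical = attrs.get("href")

# Illustrative HTML: this canonical points at a different domain,
# a common misconfiguration that can keep the page out of the index.
html = '<head><link rel="canonical" href="https://other-site.com/page"></head>'

finder = CanonicalFinder()
finder.feed(html)

page_host = "example.com"  # the host the page actually lives on
if finder.canonical and urlparse(finder.canonical).netloc != page_host:
    print("Canonical points off-site:", finder.canonical)
```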
5. Check your sitemap: This is a file that lists all the URLs on your site that you want Google to crawl and index. If you have a sitemap, you can submit it to GSC and help Google discover and index your URLs faster. However, if your sitemap has errors or inconsistencies, it can cause indexing issues for your URLs. For example, if your sitemap includes URLs that are blocked by robots.txt, have noindex meta tags, or have incorrect canonical tags, Google may not index them. To fix this, you need to make sure that your sitemap is error-free, up-to-date, and consistent with your site structure.
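One inconsistency you can catch programmatically is a sitemap listing URLs that robots.txt blocks. The sketch below parses a sitemap with Python's standard xml.etree module and flags entries under a disallowed path; the sitemap content and the disallowed prefix are illustrative:

```python
import xml.etree.ElementTree as ET

# Illustrative sitemap content; load your real sitemap.xml instead.
sitemap_xml = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/blog/post-1</loc></url>
  <url><loc>https://example.com/drafts/wip</loc></url>
</urlset>"""

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
root = ET.fromstring(sitemap_xml)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", ns)]

# Flag sitemap entries under paths your robots.txt disallows.
disallowed_prefixes = ("https://example.com/drafts/",)
conflicts = [u for u in urls if u.startswith(disallowed_prefixes)]
print(conflicts)  # ['https://example.com/drafts/wip']
```

Any URL this flags is telling Google two contradictory things: "please crawl me" in the sitemap and "do not crawl me" in robots.txt.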
6. Check your site speed and performance: This is how fast and smoothly your site loads and responds to user actions. If your site is slow or unstable, it can affect Google’s ability to crawl and index your URLs. For example, if your site takes too long to load, has broken links, or returns server errors, Google may not index your URLs. To fix this, you need to optimize your site speed and performance by using tools like PageSpeed Insights, Lighthouse, or Web Vitals. These tools can help you identify and fix the issues that affect your site speed and performance.
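PageSpeed Insights also exposes a public HTTP API (the v5 runPagespeed endpoint), which lets you check pages in bulk rather than one at a time in the browser. The helper below only builds the request URL; fetching it and the API-key placeholder are left to you:

```python
from urllib.parse import urlencode

# Public endpoint of the PageSpeed Insights v5 API.
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_request_url(page_url, strategy="mobile", api_key=None):
    """Builds the query URL for a PageSpeed Insights run.

    strategy is "mobile" or "desktop"; api_key is optional for light
    use but recommended for repeated runs (placeholder here).
    """
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    return f"{PSI_ENDPOINT}?{urlencode(params)}"

print(psi_request_url("https://example.com/"))
# Fetch this URL with urllib.request or requests; the JSON response
# contains a lighthouseResult object with performance scores and audits.
```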
7. Check your content quality and relevance: This is how useful and valuable your content is for your target audience and Google. If your content is low-quality, thin, duplicate, or irrelevant, Google may choose not to index your URLs. For example, content that is poorly written, riddled with spelling or grammar errors, offers no originality or value, or has nothing to do with your site topic is a common reason pages stay unindexed. To fix this, you need to improve your content quality and relevance by following Google’s guidelines for creating high-quality content. These guidelines can help you create content that meets the needs and expectations of your users and Google.
By following these steps, you can troubleshoot and fix the “Discovered – currently not indexed” status in GSC and get your URLs indexed by Google. However, keep in mind that indexing is not a guarantee of ranking or visibility in search results. You still need to optimize your URLs for SEO and make them relevant and valuable for your target keywords and audience.
Conclusion
In summary, the “Discovered – currently not indexed” status in Google Search Console indicates that Google has found the URL but has not crawled or indexed it yet. This can happen for various reasons, such as Google’s crawl prioritization, technical issues, or the webmaster’s own settings. To fix this issue, webmasters can work through the troubleshooting steps above: requesting indexing, then checking robots.txt, meta tags, canonical tags, the sitemap, site speed and performance, and content quality and relevance. However, webmasters should also remember that indexing is not a guarantee of ranking or visibility in search results; the URLs still need to be optimized for SEO and user experience.