
Discovered Currently not Indexed - Causes and Solutions

Discovered Currently not Indexed - During the Google Core Update of July to August 2021, Google's bots seemed to be “busy”, so many blogs (especially BlogSpot blogs) were not crawled and indexed. Most of the time, we even got 5xx errors when we ran a Live Test on our blog URLs, although the same errors did not appear in other URL testers. After the Core Update was completed, a flood of crawl requests hit Google Search Console, so it is normal if your blog posts show up under Excluded, especially as Discovered – currently not indexed.

[Cover image: Discovered currently not indexed in Google Search Console]

However, is that the only reason your blog posts end up in that discouraging place in Google Search Console? It is not. In this post, I want to address the issue and help you identify the possible factors that keep your blog posts from being crawled and indexed.

Here are some possible factors that put your blog posts under the Excluded tab in Google Search Console.

1. Page Redirection

If you enable the mobile version of your blog, each blog post will have two URL versions: the canonical version and the mobile version. The mobile version of a post URL has an additional parameter, like your-blog-url.html?m=1. The ?m=1 declares that the URL is mobile compatible, where m stands for mobile and 1 means true. These URLs are intentionally not indexed because they are served automatically: when users access your canonical URL from a mobile device, they are redirected to the mobile-compatible URL.

Solution: there is nothing to worry about here.
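If you want to confirm this behavior yourself, here is a minimal sketch, assuming the requests library is installed and using a hypothetical post URL; it fetches the ?m=1 version of a post and checks that its rel="canonical" link still points to the base URL, which is what tells Google which version to index:

```python
import re
import requests

# Hypothetical post URL; replace with one of your own blog posts.
POST_URL = "https://your-blog.blogspot.com/2021/08/sample-post.html"

def canonical_of(url):
    """Fetch a page and return the href of its rel="canonical" link, if any."""
    html = requests.get(url, timeout=10).text
    match = re.search(r'rel=["\']canonical["\'][^>]*href=["\']([^"\']+)["\']', html)
    return match.group(1) if match else None

# The mobile variant only adds the ?m=1 parameter.
mobile_url = POST_URL + "?m=1"
canonical = canonical_of(mobile_url)

if canonical == POST_URL:
    print("OK: the mobile URL declares the desktop URL as canonical; nothing to fix.")
else:
    print("Check your template: canonical of the mobile URL is", canonical)
```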

2. Duplicate Content

In Google’s eyes, there is no need to index duplicate content on your blog. Oddly, Google may still index duplicate content when it is submitted from different domains. Check whether your blog posts contain duplicate content and, when you find any, paraphrase it or simply remove it.

Solution: paraphrase or remove duplicate content.
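As a quick self-check, a sketch like the one below (standard library only, with hypothetical file names) can estimate how similar two of your posts are; a very high ratio suggests near-duplicate content that should be paraphrased or removed:

```python
import difflib

# Hypothetical files holding the plain text of two posts you suspect overlap.
with open("post_a.txt", encoding="utf-8") as f:
    post_a = f.read()
with open("post_b.txt", encoding="utf-8") as f:
    post_b = f.read()

# SequenceMatcher returns a 0.0-1.0 similarity ratio between the two texts.
ratio = difflib.SequenceMatcher(None, post_a, post_b).ratio()
print(f"Similarity: {ratio:.0%}")

if ratio > 0.8:  # arbitrary threshold for this sketch
    print("These posts look like near-duplicates; consider paraphrasing or removing one.")
```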

3. Overloaded Page

Googlebot schedules its crawls and only spends a limited amount of time on each check. If your blog posts take too long to load, Googlebot may be forced to skip them. A page typically becomes overloaded when the media (images or videos) on it are too large and too many scripts run behind it.

Solution: consider compressing the images and reducing unnecessary scripts in your blog posts; remove ad scripts from unindexed posts.
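To get a rough idea of how heavy a post is, you could run a sketch like this one, assuming the requests library is installed and using a hypothetical post URL; it reports the HTML size and counts the scripts and images the page pulls in:

```python
import re
import requests

# Hypothetical post URL; replace with the post stuck under "Discovered".
POST_URL = "https://your-blog.blogspot.com/2021/08/sample-post.html"

response = requests.get(POST_URL, timeout=10)
html = response.text

scripts = re.findall(r"<script\b", html, flags=re.IGNORECASE)
images = re.findall(r"<img\b", html, flags=re.IGNORECASE)

print(f"HTML size : {len(response.content) / 1024:.1f} KB")
print(f"Scripts   : {len(scripts)}")
print(f"Images    : {len(images)}")

# Rough rules of thumb for this sketch only, not official Google limits.
if len(scripts) > 20 or len(images) > 30:
    print("This page carries a lot of scripts/images; consider trimming it.")
```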

4. Bad HTML Structure

Many people neglect proper HTML tags in their posts. What search engines crawl and browsers render are HTML pages, so using HTML tags properly keeps the page readable. The more readable the page, the easier it is to crawl and index.

Solution: use HTML tags properly in your posts.
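One way to spot-check a post's structure is a small parser sketch like the one below (standard library only, hypothetical file name); it prints the heading outline and flags images without alt text, two common readability problems:

```python
from html.parser import HTMLParser

class StructureChecker(HTMLParser):
    """Collects the heading outline and counts <img> tags without alt text."""

    def __init__(self):
        super().__init__()
        self.headings = []
        self.images_without_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3", "h4", "h5", "h6"):
            self.headings.append(tag)
        if tag == "img" and not dict(attrs).get("alt"):
            self.images_without_alt += 1

# Hypothetical file containing the post's HTML, e.g. copied from the Blogger HTML editor.
with open("post.html", encoding="utf-8") as f:
    checker = StructureChecker()
    checker.feed(f.read())

print("Heading outline:", " > ".join(checker.headings) or "none")
print("Images without alt text:", checker.images_without_alt)
```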

5. Robots Rules

Your blog has robots settings that you need to handle carefully. If you are using BlogSpot, you have three ways to set robots rules: the first is in your Blogger template, using the meta robots tag; the second is the custom robots.txt in your Blogger settings; and the third is the custom robots header tags for your blog posts and pages, which can also be found in the Blogger settings. Your robots rules may be blocking your pages from being crawled and indexed by Googlebot.

Solution: make sure your meta robots tag contains no noindex or nofollow value; make sure your custom robots.txt is set properly; and make sure the custom robots header tags for your blog posts are set to all.
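A quick way to verify the robots.txt part is the standard-library robot parser, as in this sketch with hypothetical blog and post URLs; it checks whether Googlebot is allowed to fetch a given post:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical URLs; replace with your own blog and post.
BLOG = "https://your-blog.blogspot.com"
POST = BLOG + "/2021/08/sample-post.html"

parser = RobotFileParser(BLOG + "/robots.txt")
parser.read()  # fetches and parses the live robots.txt

if parser.can_fetch("Googlebot", POST):
    print("robots.txt allows Googlebot to crawl this post.")
else:
    print("robots.txt blocks Googlebot; fix the custom robots.txt in your Blogger settings.")
```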

6. Security Issues

Your posts may have security issues. For example, a post may contain a form that asks visitors to enter their login credentials, or your Blogger template may contain malware. If you display ads on your posts, also be careful that the ads do not send visitors to malicious pages. By default, Google Search Console notifies you when security issues are detected so that you can fix them.

Solution: check your posts for any form that asks users to enter their login credentials; scan your blog for malware; replace your Blogger template if needed; check whether the ads you display lead to malicious pages.
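For the login-form part of that check, a small sketch like this (standard library only, hypothetical file name) flags password fields inside a post's HTML; malware and ad destinations still need separate tools:

```python
from html.parser import HTMLParser

class LoginFormFinder(HTMLParser):
    """Counts password fields, which indicate a login form inside a post."""

    def __init__(self):
        super().__init__()
        self.password_fields = 0

    def handle_starttag(self, tag, attrs):
        if tag == "input" and dict(attrs).get("type") == "password":
            self.password_fields += 1

# Hypothetical file containing the post's HTML.
with open("post.html", encoding="utf-8") as f:
    finder = LoginFormFinder()
    finder.feed(f.read())

if finder.password_fields:
    print(f"Found {finder.password_fields} password field(s); remove login forms from the post.")
else:
    print("No password fields found in this post.")
```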


7. Poor Internal Links

When Google Search Console scans your URLs, it also looks at the referring pages. If your URL is not referenced by any indexed page, it is less likely to be indexed because it lacks trust. Ironically, if your URL is referenced by a malicious page, i.e. a bad backlink, it will not be indexed either.

Solution: link to your unindexed URL from indexed posts on your blog; build quality backlinks; disavow malicious backlinks.
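To find out whether any of your indexed posts already point to the unindexed one, a small sketch like this can help, assuming the requests library is installed; the URLs are hypothetical placeholders:

```python
import requests

# Hypothetical URLs; replace with your own posts.
UNINDEXED_URL = "https://your-blog.blogspot.com/2021/08/new-post.html"
INDEXED_POSTS = [
    "https://your-blog.blogspot.com/2021/07/popular-post.html",
    "https://your-blog.blogspot.com/2021/06/another-post.html",
]

referrers = []
for post in INDEXED_POSTS:
    html = requests.get(post, timeout=10).text
    if UNINDEXED_URL in html:  # crude check: the raw URL appears in the page source
        referrers.append(post)

if referrers:
    print("These indexed posts already link to the new post:", *referrers, sep="\n  ")
else:
    print("No internal links found; add a link from an indexed post to the new one.")
```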

8. Google Bot Schedule

Last but not least, your posts may be excluded as Discovered – currently not indexed simply because Googlebot has scheduled them to be crawled later. It is hard to tell when it will return to crawl your blog posts, but if they don't have the issues mentioned above, it will certainly return.

Solution: wait patiently and keep updating your blog with quality content.

I hope this helps you solve your blog post indexing issues. Don't stop writing and blogging, because people need your thoughts.

