The site is finished and launched, but traffic is at zero and it does not appear in Google search results. Most likely, the problem is easy to solve. Let's look at four reasons why a site is missing from the search results, along with tips on how to speed up its indexing.
The site is not indexed
Search robots discover a new web resource by following links and add it to their index. To find out whether your site has been indexed, search for "site:sitename" with no space after the colon. For example: site:star-marketing.com.ua. All indexed pages at that address will appear in the results.
A site gets into the index faster via Google Search Console. Register and submit a request for indexing. In the webmaster's panel, track error notifications, recommendations, indexing status, and search performance statistics.
To submit a page for indexing manually, enter its address in the "URL Inspection" field. After a while, the submitted page should appear in Google search results.
A sitemap helps search robots scan the site quickly. Build internal links to every page you want indexed, and do not link to pages generated by user actions (site search results, filter pages, etc.) — search engines should not index them. The sitemap is usually located at sitemap.xml, for example: https://star-marketing.com.ua/sitemap.xml. Submit it in Google Search Console:
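A sitemap is a plain XML file listing the URLs you want crawled. A minimal sketch (the date below is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://star-marketing.com.ua/</loc>
    <lastmod>2021-01-15</lastmod>
  </url>
</urlset>
```

Most CMS platforms and SEO plugins generate this file automatically, so you rarely need to write it by hand.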
Indexing blocked in the robots.txt file
Developers place a robots.txt file in the site's root directory to control which pages robots may visit. Pages are closed to crawling if the file contains blocking directives:
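A site-wide ban typically looks like this — the asterisk applies the rule to all robots, and "Disallow: /" closes every page:

```
User-agent: *
Disallow: /
```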
When work on the site is completed, the publisher may forget to remove the ban, and the search engine will not see the site. Check whether a crawl block is active by inspecting the site in Google Search Console. If the crawler scanned it and found a block, look for the error "Submitted URL blocked by robots.txt" in the Coverage report.
Found it on your site? We recommend contacting professionals, who will correctly remove the crawl ban. After the block in robots.txt is lifted, submit a new indexing request to Google.
When a page is indexed, a check mark in the URL Inspection report indicates that crawling is allowed:
Blocking indexing on WordPress
Such a ban makes pages invisible to crawlers, which is useful during layout or testing. Blocked pages will not show up in search even if you submit your sitemap to Google.
The code looks like this:
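In the page's head section, the tag typically takes the following form (the exact content attribute varies by WordPress version):

```html
<meta name="robots" content="noindex, nofollow">
```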
WordPress automatically adds a "noindex" meta tag to the HTML, which prohibits indexing, if the "Search Engine Visibility" option is enabled under Settings → Reading:
The checkbox in this block is set during development so that users do not visit the unfinished site. When the work is done, this option is often forgotten. Check whether there is a crawl-ban notification on the report page in Google Search Console:
Sanctions from Google
This is the least likely reason a site is missing from the results. If the site meets the system's requirements and keeps up with algorithm changes, no filters are applied. You can read more about search engine sanctions in our article. Trust experienced webmasters to fix such problems!
Improving the site for indexing
When developing a web resource, take into account Google's requirements and the peculiarities of its algorithms. SEO optimization brings the website in line with them; it requires extensive tuning of the parameters that affect indexing and ranking.
Consider several factors that speed up indexing of an online platform:
- Language of web pages. Content delivered in plain HTML is indexed more reliably than content generated with JavaScript or AJAX.
- External links. If your site is linked from a reputable source, Google will find it faster.
- URL paths with readable page names are easier for algorithms to interpret than URL parameters. For instance, example.com/services/seo is clearer than example.com/?page_id=123.
- SEO optimization — a set of measures that helps a website reach the top of the results. Pages that match search queries rank well. Key phrases in the content, meta description, and URL signal that the resource contains information users are looking for. If they are not used, the site will not appear in the results for those phrases.
- Pages with identical content. There are different ways to detect duplicate pages; one is the Ahrefs Site Audit online service. For each duplicate found, specify the canonical (main) page with the rel="canonical" attribute, or set up a 301 redirect to the desired page.
- Non-unique content. Google penalizes plagiarism. Publish texts that are at least 95% unique and add your own copyrighted images.
- Slow website loading speed. This happens when "heavy" images and videos or complex animation are used, or when the platform and hosting lack resources. Compress images, adjust caching options, and disable unnecessary plugins to speed up the site. For deeper load-speed optimization, contact professionals.
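The rel="canonical" attribute mentioned in the duplicate-pages item is a link tag placed in the head of each duplicate, pointing to the main page. A sketch with a placeholder URL:

```html
<link rel="canonical" href="https://example.com/main-page/">
```

Alternatively, a 301 redirect can be configured on the server; on Apache, for example, a line like `Redirect 301 /old-page/ https://example.com/main-page/` in .htaccess sends both visitors and robots to the main page.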
Often a site is not indexed because the sitemap has not been submitted to Google or robots.txt blocks crawling. These problems are easy to fix. If search engine filters have been imposed, contact specialists so as not to aggravate the situation with erroneous actions. Working on the quality of the site speeds up indexing and promotes rapid advancement to the top.