Critical issues

This section contains solutions to common “Critical” issues detected by Yandex.Webmaster during website diagnostics. If your site has any of these issues, the site or individual pages on it may be excluded from search results.
Tip. Track and fix errors as soon as possible. You can configure notifications for site monitoring results.
  1. Slow server response time
  2. Invalid SSL certificate settings
  3. Duplicate pages with GET parameters were found

Slow server response time

This means that, averaged across all pages of the site, the server took longer than three seconds to respond when the search robot accessed them. This can happen if the server responds especially slowly for certain pages. If the server is now responding quickly, the error message will disappear within a few days.
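
To check the response time yourself before waiting for the robot to recrawl, you can time a page from the command line. A minimal sketch using curl (https://example.com/ is a placeholder for one of your own pages):

  # Print time to first byte and total transfer time for a page
  curl -o /dev/null -s -w "TTFB: %{time_starttransfer}s, total: %{time_total}s\n" https://example.com/

If the total consistently exceeds three seconds, look into server load, slow database queries, and heavy pages, or contact your hosting provider.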

Invalid SSL certificate settings

This message is displayed in the following cases:

  • The certificate has expired.
  • The certificate was issued for a different domain or not for all subdomains where it is used. For example, the certificate was issued for the domain example.com but is used for the domain www.example.com.
  • The root certificate of the issuing certification authority is missing from users' browsers, or the certificate has been revoked.
  • A self-signed certificate is used.

If there are problems with the SSL certificate, the browser warns the user about them. Users may avoid the site because it is flagged as not secure.

To fix this problem, check the SSL certificate and the server settings. You may need to contact your hosting provider.
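
One way to see which certificate the server actually presents is the openssl command-line tool; a minimal check (replace example.com with your domain):

  # Show the subject and validity dates of the certificate served for the domain
  openssl s_client -connect example.com:443 -servername example.com </dev/null 2>/dev/null | openssl x509 -noout -subject -dates

Make sure the domain in the subject (or its Subject Alternative Names) matches the address users visit, including the www subdomain if you use it, and that the notAfter date has not passed.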

The Yandex robot will discover any changes the next time it crawls the site. If it doesn't detect a problem, the message will stop appearing in Yandex.Webmaster.

Duplicate pages with GET parameters were found

Duplicate pages are pages with the same content that are located at different URLs. URLs that differ only in their GET parameters can also be duplicates, because the Yandex robot treats each unique URL as a separate page. Such pages are merged into a group of duplicates.

If your website has duplicate pages:

  • The page you need may disappear from search results if the robot has selected another page from the group of duplicates.
  • In some cases, if there are GET parameters, pages may not be grouped and may participate in the search as different documents. As a result, they compete with each other. This may impact the website's ranking in search results.
  • Depending on which page remains in the search, the address of the document may change. This may affect, for example, the reliability of statistics in web analytics services.
  • It takes the indexing robot longer to crawl the website's pages, so data about the pages that are important to you reaches the search database more slowly. The robot can also place additional load on your website.
Note. If you recently added Clean-param directives, prohibited crawling duplicates using robots.txt, or placed the rel="canonical" attribute, it may take several days for these changes to be taken into account in Site diagnostics in Yandex.Webmaster and for the notification to stop being displayed.

To tell Yandex which page should be included in the search and get rid of duplicates, add the Clean-param directive to the robots.txt file so that the robot doesn't take the URL parameters into account.
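
For example, if /catalog/page?ref=123 and /catalog/page differ only in the ref tracking parameter, a robots.txt along these lines tells the robot to ignore that parameter (ref and /catalog/ here are placeholders for your own parameter and path):

  User-agent: Yandex
  Clean-param: ref /catalog/

Several parameters can be listed in one directive, separated by &, for example Clean-param: ref&utm_source&utm_medium. If editing robots.txt is not an option, you can instead point the robot to the main version of a page with a rel="canonical" link element in the HTML <head>, assuming https://example.com/catalog/page is the version you want in the search:

  <link rel="canonical" href="https://example.com/catalog/page"/>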
