Therefore, we do not see the full benefits of crawling over HTTP/2. But with more websites implementing the server push feature, Googlebot developers may add HTTP/2 support in the future." It should be recalled that in April 2016, John Mueller said that using the HTTP/2 protocol on a website does not directly affect its ranking in Google, but it does improve the user experience thanks to faster page loading.
"I have had it for 4 years already, and I do not have a file named Disavow." Thus, if a website owner previously engaged in buying links or used other prohibited link-building methods, auditing the link profile and disavowing unnatural links is necessary to avoid future manual sanctions.
It is important to remember that disavowing links can lead to a drop in a site's positions in global search results, since webmasters often disavow links that actually help the website rather than harm it.
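For reference, a disavow file submitted through Google Search Console is a plain text file with one entry per line: full URLs to disavow individual links, `domain:` prefixes to disavow every link from a domain, and `#` for comments. A minimal sketch, using hypothetical domains for illustration:

```text
# Links from pages bought through a link network in 2015
http://spammy-directory.example/listing/our-site
http://spammy-directory.example/category/widgets

# Disavow all links from an entire low-quality domain
domain:paid-links.example
```

Disavowing an entire domain with `domain:` is usually safer than listing individual URLs one by one, since link networks tend to spawn many pages on the same host; but as noted above, every entry should be checked first so that helpful links are not rejected by mistake.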
"I talked to a lot of SEO specialists from big enterprises about their business, and their answers differed. Most of the other reports that come to us are just information that we collect and can use to improve our algorithms in the future." At the same time, he noted that small reports about violations on a single page are a lower priority for Google.

Oct 08/2017 During the latest video conference with webmasters, Google representative John Mueller said that Googlebot still refrains from crawling over HTTP/2. The reason is that the crawler already scans content fast enough, so the benefit that a browser receives (reduced page loading time) is not that important. "We are still investigating what we can do about it.