What is HTTP/2

As we said, it's the next major version of HTTP, the protocol the internet primarily uses for transferring data.

Why we're making this change

In general, we expect this change to make crawling more efficient in terms of server resource usage. With h2, Googlebot is able to open a single TCP connection to the server and efficiently transfer multiple files over it in parallel, instead of requiring multiple connections. The fewer connections open, the fewer resources the server and Googlebot have to spend on crawling. In the first phase, we'll crawl a small number of sites over h2, and we'll ramp up gradually to more sites that may benefit from the initially supported features, like request multiplexing. If your server supports h2 and Googlebot already crawls a lot from your site, you may already be eligible for the connection upgrade, and you don't have to do anything. There's no explicit drawback to crawling over this protocol; crawling will remain the same, quality- and quantity-wise. If you want to opt out, you can do that by instructing the server to respond with a 421 HTTP status code when Googlebot attempts to crawl your site over h2.
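The opt-out can be illustrated with a short sketch. Python's built-in http.server only speaks HTTP/1.1, so the example below demonstrates just the status-code logic (answering Googlebot with 421 Misdirected Request); in practice you would configure this in your real web server, and the handler class and user-agent check here are assumptions for illustration, not Google's or any particular server's configuration.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class OptOutHandler(BaseHTTPRequestHandler):
    """Illustrative only: answers Googlebot requests with 421 Misdirected Request."""

    def do_GET(self):
        if "Googlebot" in self.headers.get("User-Agent", ""):
            # 421 tells the crawler this connection is the wrong place for
            # the request, so it falls back instead of crawling over h2.
            self.send_response(421)
            self.end_headers()
        else:
            body = b"ok"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    def log_message(self, format, *args):
        # Keep the example quiet; the base class logs every request to stderr.
        pass
```

To serve it locally you could run `HTTPServer(("127.0.0.1", 8080), OptOutHandler).serve_forever()`; a real deployment would express the same rule in the server's own configuration language.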
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License. For details, see the Google Developers Site Policies.

Wednesday, February 12
Webmaster Level: Advanced

Faceted navigation, such as filtering by color or price range, can be helpful for your visitors, but it's often not search-friendly, since it creates many combinations of URLs with duplicative content.
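The combinatorial growth is easy to see with a toy example. The facet names and values below are made up for illustration; with just three optional facets, a single category page already yields dozens of distinct URLs that all serve essentially the same items.

```python
from itertools import product

# Hypothetical facets for one /shirts category page (made-up values).
facets = {
    "color": ["red", "blue", "green"],
    "price": ["0-25", "25-50"],
    "sort":  ["price", "rating"],
}

# Each facet is optional (None = not selected), so the URL count is the
# product of (len(values) + 1) over all facets: here (3+1)*(2+1)*(2+1) = 36.
choices = [[(name, v) for v in values + [None]] for name, values in facets.items()]
urls = sorted(
    "/shirts?" + "&".join(f"{k}={v}" for k, v in combo if v is not None)
    for combo in product(*choices)
)
```

Every one of those 36 URLs shows some subset of the same shirts, which is exactly the duplicative-content problem: each added facet multiplies the URL space again.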
At the recent Search Engine Strategies conference in freezing Chicago, many of us Googlers were asked questions about duplicate content. We recognize that there are many nuances and a bit of confusion on the topic, so we'd like to help set the record straight. Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar. Most of the time when we see this, it's unintentional or at least not malicious in origin: forums that generate both regular and stripped-down mobile-targeted pages, store items shown and -- worse yet -- linked via multiple distinct URLs, and so on. In some cases, content is duplicated across domains in an attempt to manipulate search engine rankings or garner more traffic via popular or long-tail queries. Though we do offer a handy translation utility, our algorithms won't view the same article written in English and Spanish as duplicate content. Similarly, you shouldn't worry about occasional snippets (quotes and otherwise) being flagged as duplicate content. Our users typically want to see a diverse cross-section of unique content when they do searches.
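Google hasn't published how its duplicate detection works, but the distinction between "completely match" and "appreciably similar" can be sketched with a standard generic technique: word shingling plus Jaccard similarity. This is an illustration of the idea, not Google's algorithm, and the window size of 3 is an arbitrary choice.

```python
def shingles(text, w=3):
    # Break text into overlapping w-word windows ("shingles").
    words = text.lower().split()
    return {" ".join(words[i:i + w]) for i in range(max(1, len(words) - w + 1))}

def jaccard(a, b):
    # Similarity of two texts as the overlap of their shingle sets:
    # 1.0 is an exact match, values near 1.0 are "appreciably similar",
    # values near 0.0 mean the texts share essentially no phrasing.
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)
```

A page copied verbatim scores 1.0, and a lightly edited copy still scores high -- the "appreciably similar" case -- while a genuine rewrite in different words scores near zero.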
Please note: changes to NS servers and DNS records may take effect only after a period of time, from 4 to 72 hours. This is due to how Internet providers store information about domains: they cache users' DNS requests and keep the answers for a certain time. To delegate your domain to the name servers of NIC.UA, you need to place an order for name servers (NS servers) for your domain on the "Name Servers" page. To activate the name servers, you must confirm your contact email: if you have not done this yet, you will receive a letter from NIC.UA.
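That propagation delay is resolver caching at work, which a minimal model can illustrate (this is an illustrative toy, not a real resolver): each answer is stored with an expiry time, and until it expires the cached value keeps being returned even if the authoritative record has already changed.

```python
import time

class TTLCache:
    """Toy model of a caching DNS resolver's behavior."""

    def __init__(self):
        self._store = {}  # name -> (value, expiry timestamp)

    def get(self, name, fetch, ttl):
        now = time.monotonic()
        cached = self._store.get(name)
        if cached is not None and cached[1] > now:
            return cached[0]        # still fresh: upstream is never consulted
        value = fetch(name)         # expired or unknown: query upstream
        self._store[name] = (value, now + ttl)
        return value
```

Until the TTL runs out, every lookup is served from the cache without contacting the authoritative server, which is why an NS change can take from 4 to 72 hours to become visible everywhere.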