Post by account_disabled on Feb 25, 2024 1:45:27 GMT -5
If the staging site's robots.txt file is accidentally carried over into the live site, and it often is, it will prevent search engines from crawling the site. And when search engines cannot crawl an indexed page, the keywords associated with that page will get demoted in the search results and eventually the page will get deindexed. But if the robots.txt file on staging is populated with the new site's robots.txt directives, this mishap can be avoided. When preparing the new site's robots.txt file, make sure that: It doesn't block search engine access to pages that are intended to get indexed.
It doesn't block any JavaScript or CSS resources search engines require to render page content. The legacy site's robots.txt file content has been reviewed and carried over if necessary. It references the new XML sitemaps rather than any legacy ones that no longer exist.

Canonical tags review

Review the site's canonical tags. Look for pages that either do not have a canonical tag or have a canonical tag that is pointing to another URL. Don't forget to crawl the canonical tags to find out whether they return a 200 server response.
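As a rough sketch of that canonical crawl step, the check below pulls the canonical URL out of a fetched page and flags any canonical target that did not return a 200 during the crawl. The class and function names are my own, and it uses only Python's standard library; a real audit tool would wrap this around an actual crawler:

```python
# Illustrative sketch only: extract a page's canonical URL and flag
# canonical targets whose crawl response was anything other than 200.
from html.parser import HTMLParser


class CanonicalParser(HTMLParser):
    """Captures the href of the first <link rel="canonical"> tag."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            attrs = dict(attrs)
            if attrs.get("rel", "").lower() == "canonical":
                self.canonical = attrs.get("href")


def extract_canonical(html):
    """Return the page's canonical URL, or None if it has no canonical tag."""
    parser = CanonicalParser()
    parser.feed(html)
    return parser.canonical


def canonical_needs_fixing(status_code):
    """Anything other than a 200 response at the canonical target needs fixing."""
    return status_code != 200
```

For example, `extract_canonical('<link rel="canonical" href="https://example.com/a">')` returns the href, while a page with no canonical tag returns `None`, which covers both problem cases described above.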
If they don't, you will need to update them to eliminate any 3xx, 4xx, or 5xx server responses. You should also look for pages that have a canonical tag pointing to another URL combined with a noindex directive, because these two are conflicting signals and you'll need to eliminate one of them.

Meta robots review

Once you've crawled the staging site, look for pages with the meta robots properties set to noindex or nofollow. If this is the case, review each one of them to make sure it is intentional, and remove the noindex or nofollow directive if it isn't.

XML sitemaps review

Prepare two XML sitemaps: one that contains all the new site's indexable pages and another that includes all the old site's indexable pages. The former will help make Google aware of the new site's indexable URLs. The.
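The meta robots review above can be scripted roughly like this; the helper names are my assumptions, and in practice you would feed each crawled page's HTML into the function rather than a literal string:

```python
# Rough sketch: scan crawled HTML for meta robots noindex/nofollow
# directives using only the standard library.
from html.parser import HTMLParser


class MetaRobotsParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())


def flagged_directives(html):
    """Return any noindex/nofollow tokens found in the page's meta robots tags."""
    parser = MetaRobotsParser()
    parser.feed(html)
    tokens = set()
    for content in parser.directives:
        for token in content.replace(" ", "").split(","):
            if token in ("noindex", "nofollow"):
                tokens.add(token)
    return sorted(tokens)
```

Any page where this returns a non-empty list is one to review: keep the directive if it's intentional, remove it if it isn't.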
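For the XML sitemaps step, here is a minimal sketch of building one of the two sitemaps with Python's standard library. The URLs are placeholders, and the same function would be called once with the new site's indexable URLs and once with the old site's:

```python
# Minimal sketch: build a sitemap XML string from a list of indexable URLs.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return sitemap XML listing the given URLs, per the sitemaps.org schema."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for url in urls:
        entry = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        loc = ET.SubElement(entry, f"{{{SITEMAP_NS}}}loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")


new_site_sitemap = build_sitemap([
    "https://www.example.com/",           # placeholder URLs for the new site
    "https://www.example.com/products/",
])
```

Once generated, the new site's sitemap is the one to reference from the new robots.txt file, in place of any legacy sitemap URLs that no longer exist.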