
Block GoogleBot Temporarily vs. Permanently


John Mueller explains that there are many ways to block Google from indexing your page, but the right technique depends on how long you want the content kept out of the index. I often see webmasters confused about how to handle this, so here is his advice (a rough server-side sketch of these options follows the list):
  • If you just don't want the content indexed (maybe you're trying something out on the page), then using the robots.txt is a good approach
  • If it's only very temporary, then maybe even a 503 HTTP response code
  • If you want the page actively removed from search, then I'd definitely recommend using a noindex over the robots.txt
  • If you're using a staging server and don't want that indexed, limiting access to just the testers' IP address ranges or using server-side authentication would be good approaches too.
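To make these options concrete, below is a minimal sketch using Python's standard http.server module. The /staging/ prefix, the tester IP range, the /experiments/ path and the MAINTENANCE flag are hypothetical placeholders of my own, not anything John described, so treat this as an illustration of how the approaches differ rather than a drop-in setup.

# Minimal sketch of the four options; paths, IP range and MAINTENANCE flag
# are placeholders chosen for illustration only.
from http.server import BaseHTTPRequestHandler, HTTPServer
from ipaddress import ip_address, ip_network

ALLOWED_TESTER_RANGES = [ip_network("203.0.113.0/24")]  # placeholder testers' IPs
MAINTENANCE = False  # set True to answer everything with a temporary 503

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Staging area: limit access to the testers' IP ranges (or use auth).
        if self.path.startswith("/staging/"):
            client = ip_address(self.client_address[0])
            if not any(client in net for net in ALLOWED_TESTER_RANGES):
                self.send_error(403, "Forbidden")
                return

        # Very temporary: a 503 plus Retry-After tells Googlebot to come back later.
        if MAINTENANCE:
            self.send_response(503)
            self.send_header("Retry-After", "3600")
            self.end_headers()
            return

        # robots.txt: blocks crawling of a section, but is not an active removal.
        if self.path == "/robots.txt":
            body = b"User-agent: *\nDisallow: /experiments/\n"
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
            return

        # noindex (here via the X-Robots-Tag header): actively removes the page
        # from search; the page must stay crawlable so Google can see the header.
        body = b"<html><body>Test page</body></html>"
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("X-Robots-Tag", "noindex")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8000), Handler).serve_forever()

In practice you would normally set these things in your web server or CMS (robots.txt as a static file, noindex via a meta tag or the X-Robots-Tag header, 503 from a maintenance mode, and IP restrictions or authentication at the server level); the sketch only shows how the four approaches behave differently.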
John added that you shouldn't flip-flop back and forth between these techniques on a single page, because it will confuse GoogleBot. He wrote:
One thing I'd try to avoid is quickly fluctuating back & forth. Removing content from search, and then bringing it back can sometimes result in us not recrawling as quickly as you'd like, and it therefore taking a bit longer for the pages to come back into search. If you can avoid that by running your tests (for example) on a separate site that's not indexed, that's generally preferred. Of course, if you're looking to remove something permanently, that's less of an issue.