- If you just don't want the content indexed (maybe you're trying something out on the page), then using robots.txt is a good approach
- If it's only very temporary, maybe even a 503 HTTP response code
- If you want the page actively removed from search, then I'd definitely recommend using a noindex over robots.txt
- If you're using a staging server and don't want that indexed, limiting access to just the testers' IP address ranges or using server-side authentication would be good approaches too.
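For reference, the options above look roughly like this (the paths and values are just illustrative, not something you'd copy as-is):

```
# robots.txt — tells crawlers not to fetch these URLs
User-agent: *
Disallow: /experiments/

<!-- noindex meta tag, placed in the page's <head> — tells search engines
     to drop the page from their index -->
<meta name="robots" content="noindex">

# Temporary unavailability — a 503 status, optionally with a hint
# about when to retry
HTTP/1.1 503 Service Unavailable
Retry-After: 3600
```

Note that these do different things: robots.txt blocks crawling (the URL itself can still be indexed from links), noindex removes the page from search but requires the page to be crawlable so the tag can be seen, and a 503 just signals a temporary outage.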
One thing I'd try to avoid is quickly fluctuating back and forth. Removing content from search and then bringing it back can sometimes result in us not recrawling as quickly as you'd like, so it takes a bit longer for the pages to come back into search. If you can avoid that by running your tests (for example) on a separate site that's not indexed, that's generally preferred. Of course, if you're looking to remove something permanently, that's less of an issue.