Google Robots now blocked from staging sites by default
  • Hey all --

    A few of us have had problems with staging pages getting indexed in Google (e.g., a blog post becoming public sooner than we wanted).

    So, I finally did something about it. By default, Vae will respond with a robots.txt that blocks all search engine robots when on the staging site. It will NOT insert this robots.txt file into your production site.
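
    For reference, a robots.txt that blocks all crawlers from the entire site (the standard way to do what's described above; the exact file Vae serves may differ) looks like this:

    ```
    User-agent: *
    Disallow: /
    ```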

    To disable this behavior, simply create a robots.txt file of your own.
