We have a Drupal website that we maintain in a git repo. Because the site is publicly accessible but intended for a private audience, we want to set the robots.txt to disallow all crawling.
However, Drupal releases core updates from time to time, and these include a robots.txt in the root of the site repo, among many other files. The core update is distributed as a tar.gz file, not through a remote of another repo, so updates to Drupal core show up as ordinary diffs, just as if we had edited the code ourselves.
If someone updates Drupal core in our repo, they might overwrite our custom robots.txt, and we probably would not notice until results started showing up in Google again.
Is there a way to "fix" or pin the state of a file in a git repo? Or at least make it noisy to someone who goes to commit changes to it?
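To make the second option concrete, something like the pre-commit hook below is the sort of "noisy" check I have in mind. This is only a sketch; the hook location and messages are just illustrative, not something we have in place:

```sh
#!/bin/sh
# Sketch of a pre-commit hook (.git/hooks/pre-commit) that refuses a commit
# when robots.txt is among the staged changes.

# List staged file paths and check whether robots.txt is one of them.
if git diff --cached --name-only | grep -qx 'robots.txt'; then
    echo "WARNING: this commit modifies robots.txt." >&2
    echo "If that is intentional, re-run with: git commit --no-verify" >&2
    exit 1
fi
```

The drawback I can see is that hooks live in .git/hooks and are not versioned or shared automatically, so every person updating core would have to install it themselves. Is there something more built-in?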