There are lots of ways to reduce the problem - most of the other answers boil down to making a local mirror of the site, applying the changes there, then switching the mirror in place of the live code. In effect these just reduce the amount of time it takes for the new files to be written - they don't actually eliminate the problem. E.g. suppose you've got 20 files to update, and the file at the top of the list requires one from the bottom of the list? A request arriving mid-copy can load the new version of the first file while still seeing the old (or missing) version of the second.
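For reference, the mirror-and-switch pattern those answers describe looks roughly like this - a minimal sketch only, with hypothetical paths, wrapping rsync in PHP so it reads consistently with the rest of this answer:

<?php
// Minimal sketch of "mirror then switch" (paths are illustrative).
// 1. Build the complete new tree somewhere the web server never serves from...
exec('rsync -a /path/to/new_release/ /var/www/staging/');
// 2. ...then push it into the live docroot in one quick pass.
//    This shrinks the write window but doesn't remove it: while the copy runs,
//    a request can still see a mix of old and new files.
exec('rsync -a --delete /var/www/staging/ /var/www/html/');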
I'm guessing that this is a managed service - a VPS, or a dedicated/shared host - which rules out transferring the publicly accessible IP address to another cluster node while the file update takes place. That approach can completely eliminate downtime, assuming you don't have dependencies on a common storage substrate (e.g. sticky session handling, database structure).
It is possible to suspend activity to a certain extent, while the operation completes, using an auto-prepend script, something like......
<?php
if (file_exists('/somewhere/uploading_new_content')) {
    // the following should be implemented as a separate file
    // which is only included when necessary....
    $targettime = 30; // assuming the update takes 30 seconds....
    // how much of the maintenance window is left, measured from the flag file's mtime
    $sleeptime = $targettime - time() + filemtime('/somewhere/uploading_new_content');
    if ($sleeptime > 0) {
        sleep($sleeptime);
    }
    print "Sorry - your request was paused due to essential maintenance<br />\n";
    // re-submit the original request; escape everything that lands in the markup
    $dest = htmlentities($_SERVER['REQUEST_URI'], ENT_QUOTES);
    if (count($_POST)) {
        // note: assumes scalar POST fields only
        print "<form method='POST' action='$dest'>\n";
        foreach ($_POST as $key => $val) {
            $val = htmlentities($val, ENT_QUOTES);
            $key = htmlentities($key, ENT_QUOTES);
            print "<input type='hidden' name='$key' value='$val'>\n";
        }
        print "<input type='submit' name='uploading_pause' value='continue'>\n";
        print "</form>\n";
    } else {
        print "<a href='$dest'>continue</a>\n";
    }
    exit;
}
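To put that into service you'd save the snippet as, say, /somewhere/pause.php (name is just an example), point PHP at it with auto_prepend_file = /somewhere/pause.php in php.ini (or php_value auto_prepend_file in .htaccess under mod_php), and then raise and drop the flag file around the upload - the sleep calculation keys off filemtime() of the flag, so create it immediately before the upload starts. A rough deploy wrapper, again with hypothetical paths, might be:

<?php
// Hypothetical deploy wrapper: raise the flag, copy the files, drop the flag.
touch('/somewhere/uploading_new_content');   // requests now pause in the prepend script
exec('rsync -a --delete /var/www/staging/ /var/www/html/');
unlink('/somewhere/uploading_new_content');  // requests resume immediately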