
We have two "Linux Deluxe" servers in GoDaddy.

I'm used to working with two identical servers for development and production; however, I have never set up or managed them, so I'm wondering how to do it.

How are files moved between servers? Which tools can I use to automate this process?

lisovaccaro
    There are several problems with this question that might explain the lack of attention. First, it is quite broad. Also, it is bound to attract opinionated answers, because there are many ways to solve this problem. Asking for tool recommendations is also out of scope for Stack Overflow. Finally, you might want to do some research and update your question with early results. Searching for Software Deployment might be a good starting point. Also, don't forget good ol' Rsync. – SirDarius Feb 13 '14 at 21:02

4 Answers


As has been said, we need to know much more about your system to give a good answer.

But I'll point out that in most environments it's worthwhile to set up the production machine as a git host and use git push to manage synchronization from the development environment. (If you've ever used Heroku, this will seem familiar.)

The reasons:

  1. It relieves you of remembering which files have been updated. Just push.
  2. It provides a quick and simple rollback mechanism. If you've pushed a change and it turns out to be broken, just roll back as shown here.
  3. You should have a local repo anyway for change management.
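The rollback in point 2 can be as simple as reverting the broken commit and pushing again. A minimal sketch, assuming a remote named `production` and a branch named `master` (adjust both to your setup):

```shell
# Undo the most recent (broken) commit by creating a new commit
# that reverses it, then redeploy by pushing again:
git revert --no-edit HEAD
git push production master
```

Because `git revert` adds a new commit rather than rewriting history, the push goes through without `--force` and the bad deploy stays in the log for later inspection.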

git makes updating the production machine easy with a post-receive hook. Set up a bare repository in your login home on the production server.

mkdir site.git
cd site.git
git init --bare

Set up the site root, e.g.:

mkdir /var/www/www.mysite.com

Then create a shell script hooks/post-receive (don't forget chmod +x) containing something like:

#!/bin/sh
GIT_WORK_TREE=/var/www/www.mysite.com git checkout --force

Now add site.git as a remote of the development machine repository. Pushing to that target will update the production machine.
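Putting the pieces together on the development machine, it can look like the following sketch (the username `deploy`, the host, and the branch name are example values, not part of the original answer):

```shell
# On the development machine: point a remote at the bare repo
# you created in your login home on the production server.
git remote add production deploy@www.mysite.com:site.git

# Every deployment is then just a push; the post-receive hook
# checks the pushed code out into /var/www/www.mysite.com.
git push production master
```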

NB

I was not advocating git as a complete solution. As @Will I AM said in the comments, you do need to think through sensitive data in light of where your repos are stored. However, even for this, git can be a useful tool if set up the right way, e.g. as explained here. The general idea for sensitive data is to use git and a separate repo or submodule as a smart form of secure FTP. Of course, if the amount and/or complexity of sensitive data is small, a simple copy or remote shell script will do just as well.
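One simple pattern for the sensitive-data point is to keep the secrets file out of the repo entirely and copy it over SSH separately. The filename, user, and host below are invented for illustration:

```shell
# Keep the secrets file out of version control entirely:
echo "config/secrets.yml" >> .gitignore

# Copy it to production once (and again whenever it changes) over SSH;
# user, host, and path are placeholders for your own values:
scp config/secrets.yml deploy@www.mysite.com:/var/www/www.mysite.com/config/
```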

Gene
  • This will work for application files but configuration data and non public facing keys shouldn't be stored in a git repo. This type of data should either be stored in environment variables or in files that are sent from one server to another via SSH. – Will I AM Feb 20 '14 at 17:04

Ways to move files:

  1. FTP: Works on both Windows and Linux, but very insecure (credentials and data travel in plain text) - you should never be using this.

  2. SFTP: Highly secure, but you need to have shell (SSH) access to your Linux servers.

    • If you're using Windows for development, you can install OpenSSH for Windows and you'll get command line SCP.
    • If you're using Linux, it comes bundled with SCP so you can start using it right away.
    • A short syntax for SCP is as follows, more details here. This command is run in your local development server, and it will copy files over to the production server.

      scp -P <prod-port> <your-local-files> <prod-username>@<prod-host>:<remote-path>

    • There are GUIs like WinSCP and PuTTY/SuperPuTTY as well, for performing SCP from Windows to Linux.

  3. rsync: Runs over SSH like SCP, and can be used to efficiently copy a large number of files since it does a lot of things like transferring only changed files, compressing data in transit, etc. Works on Linux, but you might not find it for Windows.

Ways to move data: I assume you're talking about Database data? In that case, it depends on your database. Again the general instructions are:

  1. Refer to your database documentation on how to Export/Snapshot/Backup (terminologies differ but all mean the same) all data into a file (for MySQL, for example, this is mysqldump).

  2. Copy the generated dump file using above file copy methods.

  3. Again refer to your database documentation on how to import this file.
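The three steps above, sketched with SQLite so the example is self-contained (your database's commands will differ; for MySQL step 1 would be `mysqldump`, for example), and with the copy step left as a comment since it needs a real remote host:

```shell
# 1. Export all data into a dump file (sqlite3 used as a stand-in here;
#    for MySQL this would be e.g. `mysqldump mydb > dump.sql`):
sqlite3 app.db .dump > dump.sql

# 2. Copy the dump to the production server (placeholder host/path):
# scp dump.sql deploy@www.mysite.com:/tmp/

# 3. On production, import the dump into the database:
sqlite3 app_restored.db < dump.sql
```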

Tools to automate this process: I assume you're asking on how to automate copying of these files. Here are the options:

  • SCP and RSync are command-line tools, so you can simply use Cron (job scheduler) in Linux to automatically copy the files at some given time each day. If you're using Windows, you need to find some other scheduling tools.

  • You need some way to save the passwords if you're going to automate. But it is recommended that you use password-less key-based authentication which is more secure than hardcoding your password somewhere in the script. You can find a lot of tutorials on SSH Public-key authentication.
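A sketch of both pieces together, with the key name, user, host, paths, and schedule all invented for illustration: generate a key pair once, install it on production, then let cron run the sync with that key.

```shell
# One-time setup: create a key with no passphrase and install it on production.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/deploy_key
ssh-copy-id -i ~/.ssh/deploy_key.pub deploy@www.mysite.com

# Cron entry (add via `crontab -e`): sync every day at 02:30 using that key.
# 30 2 * * * rsync -az -e "ssh -i ~/.ssh/deploy_key" /home/dev/site/ deploy@www.mysite.com:/var/www/site/
```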

And finally, there are various tools and wrappers that simplify (or sometimes complicate) this process. But they will require additional packages to be installed/setup, etc. Generally they come under the realm of Software Deployment, but I'll refrain from explaining this further as it can get very opinionated. Better understand the basic tools first before moving on to higher level abstractions.

Subhas

It's true that you don't give us enough information, but that being said, I would go with an approach like @Gene's, with a few changes:

  1. Have source control on the testing server
  2. Have SSH open and properly set up on the production server, but DON'T USE root to connect to it
  3. Have a valid SSH login on the testing server that is set up to connect to the production server. Ideally you'll use key-based authentication, with the testing server's public key installed in the production server's authorized_keys, so that you can connect from testing to production password-less; check this link on how to do it
  4. Use commit policies on your source control in such a way that you know that certain branches or tags are pre-tested versions or have gone through a QA process. This is important!
  5. Set up a post-commit hook on the testing server (for Git or Subversion) that does the following:
    1. Checks if a tag has been created
    2. If so, copies the tagged files and uploads them through your properly set-up password-less SSH account to the production server
    3. Does whatever you need to do on the production server via SSH in this same post-commit shell script (like restarting Apache on production in case an .htaccess file changed or something like that)
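A minimal sketch of such a hook for git (in git terms this is a post-receive hook on the testing server's repo, since that is where pushed tags show up; the host `production` and all paths are invented for illustration). It deploys only when a tag is pushed:

```shell
#!/bin/sh
# post-receive: stdin receives one "<old> <new> <ref>" line per pushed ref.
while read old new ref; do
  case "$ref" in
    refs/tags/*)
      tag=${ref#refs/tags/}
      echo "Tag $tag pushed; deploying to production..."
      # Export that tag into a temp dir, then copy it over
      # the password-less SSH account set up in step 3:
      tmp=$(mktemp -d)
      git --work-tree="$tmp" checkout --force "$tag" -- .
      rsync -az "$tmp"/ deploy@production:/var/www/site/
      # Post-install steps on production, e.g. reload the web server:
      ssh deploy@production 'sudo service apache2 reload'
      rm -rf "$tmp"
      ;;
  esac
done
```

The `case` pattern is what implements sub-step 1: ordinary branch pushes (refs/heads/*) fall through and nothing is deployed.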

Doing it this way has some advantages:

  • You make sure that the process is automatic, so if you use tags or a certain branch only for production code, you'll be able to attach an automatic deployment script, called by your post-commit SCM hook
  • You do post-install steps on this same script
  • You can control if something goes wrong in this same script and either notify by mail someone or do whatever is needed to rollback on production
  • ...and you don't need to have SCM installed on the production server.

Of course this is just one way to do it, since you don't give much information on what your application is, how you use source control, or what a proper setup for your app's configuration needs looks like, but I think this should apply to most basic setups.

Good luck!

Gustavo Rubio
  • 10,209
  • 8
  • 39
  • 57

I would use these tools to automate file copy:

  1. ftp or sftp - These are used to copy files between the two servers. However, they ask for manual entry of user and password.
  2. expect - Expect can be used to automate the manual entry of user and password.
  3. scp - This is another option and can be set up to automate file copy.

Check out this: linux script to automate ftp operation
