When you are switching hosting, you need a quick yet reliable way to transfer files between two servers.
Rsync is the best tool for the job. Here is how you do it.
First step to import website files: SSH keys
It is not required, but you may want to set up password-less SSH connectivity between the old and new servers first. This will also let you easily run import commands in the background.
Generate SSH key on the new server:
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
Next, make the old server trust this key:
ssh-copy-id -i ~/.ssh/id_rsa.pub root@1.2.3.4
where 1.2.3.4 is the old server’s IP address.
If SSH runs on a non-standard port, refer to our previous tutorial on password-less SSH login.
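For reference, ssh-copy-id accepts the port directly. A quick example, assuming the old server listens on port 2222 (a hypothetical value for illustration):
ssh-copy-id -i ~/.ssh/id_rsa.pub -p 2222 root@1.2.3.4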
Run import with Rsync
Make sure rsync is installed on the new server first:
yum -y install rsync
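If your new server happens to run Debian or Ubuntu rather than a RHEL-based system, the equivalent would be:
apt-get -y install rsync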
Create the directory where you want to have a copy of the remote files first, e.g. mkdir /var/www/html. Copying files with rsync is now easy with a few lines:
REMOTE_PORT=22
REMOTE_USERHOST='root@1.2.3.4'
rsync -e "ssh -p $REMOTE_PORT" -avz $REMOTE_USERHOST:/var/www/html/ /var/www/html/
Run these commands one by one. You can change the default port from 22 to whatever SSH port is configured on the remote machine. The second line specifies the system user and IP address of the remote server. The last command copies all the files from the remote location over to the local directory, including hidden files.
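One rsync detail worth knowing: the trailing slash on the source path matters. With the slash, rsync copies the contents of the remote directory; without it, it copies the directory itself into the destination:
# contents of remote html/ land directly in local /var/www/html/
rsync -avz root@1.2.3.4:/var/www/html/ /var/www/html/
# the html directory itself is copied, ending up as /var/www/html/html/
rsync -avz root@1.2.3.4:/var/www/html /var/www/html/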
If you want the import to keep running in the background, even after you close the SSH session, run:
nohup rsync -e "ssh -p $REMOTE_PORT" -avz $REMOTE_USERHOST:/var/www/html/ /var/www/html/ > import.log 2>&1 &
It is safe to close Terminal / PuTTY now. You can check on the running command by viewing the created import.log file:
tail -f import.log
The command above will display real-time updates on newly transferred files from our background import process.
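To confirm the background transfer is still alive, you can also look for the running rsync process:
pgrep -fl rsync
If it prints nothing, the transfer has either finished or failed; check the tail of import.log to see which.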
Making import reliable
Now, I often find that the reason for changing hosts is reliability. Quite often the original server has an unreliable network, so the import simply fails when the network goes down on the old server.
How do we deal with that? Let’s expand our previous commands into a script that repeatedly checks rsync’s exit status and restarts the transfer until it succeeds.
#!/bin/bash
REMOTE_PORT=22
REMOTE_USERHOST='root@1.2.3.4'
# extract the bare hostname/IP for ssh-keyscan
REMOTE_HOST=${REMOTE_USERHOST#*@}
# accept the remote host key and set up key-based trust
ssh-keyscan -p $REMOTE_PORT -t rsa -T 10 $REMOTE_HOST >> ~/.ssh/known_hosts
ssh-copy-id -i ~/.ssh/id_rsa.pub -p $REMOTE_PORT $REMOTE_USERHOST
# retry the transfer until rsync exits with success
while ! rsync -e "ssh -p $REMOTE_PORT" -avz $REMOTE_USERHOST:/var/www/html/ /var/www/html/
do
    sleep 1
    echo "Restarting transfer..."
done
We also added a few helpful lines:
- ssh-keyscan will accept the remote host key for us. This is insecure, but for our automation purposes we will sacrifice security for convenience (see the fingerprint check below for a more careful alternative)
- ssh-copy-id will make sure there is trust between the two hosts and will let us skip the password prompt on subsequent tries
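If you would rather not trust the key blindly, a more careful variant is to fetch it once, inspect its fingerprint, and only then append it to known_hosts. The fingerprint should match what ssh-keygen -lf /etc/ssh/ssh_host_rsa_key.pub prints on the old server itself (paths may vary by distribution):
ssh-keyscan -p 22 -t rsa 1.2.3.4 > /tmp/oldhost.key
ssh-keygen -lf /tmp/oldhost.key
cat /tmp/oldhost.key >> ~/.ssh/known_hosts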
Save the script as, e.g., import.sh and make it executable by giving it +x permission: chmod +x import.sh. Then run it using the background approach:
nohup ./import.sh > import.log 2>&1 &
This will pick up only the remaining files after each failure, and the script will take care of logging transferred files and failures. It does not have to download all the files every time, which is what makes it great. The script will keep running until it has transferred all of the original files.
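One caveat: if the old server’s network stalls rather than drops outright, rsync can hang instead of exiting, and the retry loop never fires. A possible variant of the loop, keeping partially transferred files and bailing out of stalled connections so they can be retried:
while ! rsync -e "ssh -p $REMOTE_PORT" -avz --partial --timeout=60 $REMOTE_USERHOST:/var/www/html/ /var/www/html/
do
    sleep 5
    echo "Transfer stalled or failed, retrying..."
done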