9. Nov. 2008

Sooner or later we end up in a situation where we want to work with some blogging software or CMS, or we need to upload a larger set of photos to our server because we want to publish them in an online slideshow or album.

Sure, we can upload all those zillion files with a handy FTP client, file by file, but we will soon figure out that it takes ages to complete. Or we can speed things up by combining the best of all worlds.

Most of today's webhosting is done on UNIX systems, and even if we are not familiar with the UNIX operating system itself, knowing a few basic UNIX commands will help us speed up file handling in general. Assuming we DO have shell access to our webserver (often called SSH (Secure Shell) access), the only things we need are an SSH client and a minimal set of UNIX instructions.
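Connecting to the server is then a single command in any terminal (the hostname and user below are placeholders, of course):

    ssh username@yourserver.com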

As you might have guessed already, I am suggesting that you upload all your files in one go, as an archive. Zip or gzip your files locally and upload them to your server as a single chunk. As you will see later on, this has a lot of advantages: it keeps the amount of data to transfer small because it is compressed, and it lowers or even eliminates the transfer problems you usually run into when uploading several hundred individual files instead.
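Creating such an archive locally is a one-liner as well; for example (the folder name is only a placeholder):

    zip -r myfiles.zip myfiles/
    # or, as a gzipped tar archive:
    tar czf myfiles.tar.gz myfiles/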

After uploading our file we log into the server with our SSH client and switch to the folder into which we uploaded the archive earlier. We can do that with a simple command like cd /home/username/htdocs

Depending on the archive format we are dealing with, we can now directly unzip the archive with: unzip filename.zip
(ZIP support is built right into all major UNIX implementations.) But if we are dealing with a .gz file (mostly used in the UNIX world for software distributions), we have to use a different set of commands.

In this case two different tools were used to provide us with the software package. One is the GZIP compression utility, and before that the archive builder TAR was used to bundle all the files and folders that make up the tool we want to use.

While GZIP is just a compressor, TAR can handle both the archiving and the compression in one go (it simply calls gzip for you). But to keep things easy here, I will explain it in two separate steps and point out how the two tools work together, to give you an idea of how it works.

In this second case we usually start with a file called filename.tar.gz. First we decompress the archive with the command gzip -d filename.tar.gz, which leaves us with a decompressed file now simply named filename.tar.

As a final step to get our software at hand, we extract all the content of the archive file with the following command: tar xf filename.tar
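By the way, as mentioned above tar can also take care of the decompression itself, so the two steps can be combined into a single command:

    tar xzf filename.tar.gz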

Doing so extracts the archive content into the folder we are currently sitting in, recreating all the directory structures stored within the archive file. The downside is that we may end up with an unwanted folder like product-version-x.x containing all the files, instead of having them directly in the root folder of our hosted environment as we wanted.

This is not done by the developers to annoy us, as one might assume. Quite the opposite: it is done for safety, preventing us from messing up our file structures by mixing content that does not belong together, and it is simply best practice on UNIX.

To correct this we switch into the new subfolder with cd foldername and issue the command mv * ../. which invokes the UNIX move utility and moves all the content one folder up. (Note that the * wildcard does not match hidden files, so check whether the package contains any dotfiles.)

Getting rid of the now orphaned subfolder is easily done with rmdir foldername, or, but be careful with this one, rm -R foldername, as the latter deletes everything within and below it. Do I even need to mention that we first have to switch back to the folder above with cd .. ?

And finally, in case we are happy with the new subfolder but want to get rid of the version number, we can simply rename the folder using the move utility again: mv oldname newname, e.g. mv wordpress-2.6.3 wordpress
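Put together, and assuming we go for the renaming at the end, the whole session could look roughly like this (paths and the archive name are only examples):

    cd /home/username/htdocs
    gzip -d wordpress-2.6.3.tar.gz
    tar xf wordpress-2.6.3.tar
    mv wordpress-2.6.3 wordpress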

7. Nov. 2008

Some of the low-cost hosting providers do not even offer an interface to back up your webspace.

Personally I find it annoying to download all the previously uploaded files again and again through a slow and not very reliable FTP client, when a simple script executed on the server can do the job so much quicker for you.

To give you an idea of what I am talking about, I have put together a simple example that you may freely use in your own environment.

It can do both: create a local backup in a separate folder on your server, and transfer this backup to your local machine or even to another FTP server on the net:

#!/bin/bash
# FTP Backup by Michael Lohmar
# Script: ftpbackup.sh
# Author: Michael Lohmar
# Contact: info@mikelo.com

if [ $# != 3 ];then
echo ""
echo "Shell script for backing up one given domain."
echo "Usage: $(basename $0) domain_to_backup [FTP/NOFTP] [DEL/NODEL]"
echo ""
exit
fi

version=1.0

##### INSTALL INSTRUCTIONS: STEP 1 #####
##### START ENTER YOUR INFO HERE #####

serverip=yourserver.com
# Your remote servers IP address
# EG: serverip=192.168.1.1

serveruser=youruser
# The FTP login for the remote server
# EG: serveruser=bob

serverpass=yourpassword
# The FTP password for the remote server
# EG: serverpass=mypassword

localdir=/home/your/local/folder
# WHERE LOCAL FILES ARE TO BACKUP
# NO TRAILING SLASH
# EG: localdir=/backup/folder/daily

sourcedir=/home/your/source/folder
# WHERE THE FILES TO BACK UP ARE LOCATED
# NO TRAILING SLASH
# EG: sourcedir=/domain/source/folder

remotedir=your/remote/folder
# FTP directory where you want to save files to
# This directory must exist on the FTP server!
# NO TRAILING SLASH
# EG: remotedir=/serverdirectory

##### END YOUR INFO HERE #####

##### INSTALL INSTRUCTIONS: STEP 2 #####
# CHMOD the script to 755: # chmod 755 ftpbackup.sh

# Add the script to a scheduled cron job to run as often as you like (if wished!)

# In SSH do crontab -e, then paste in the following
# 0 6 * * 0,1,3,5 /home/admin/ftpbackup.sh
# This runs the FTP backup on Sunday, Monday, Wednesday and Friday; look up cron for more info on setting dates and times.
# Don't forget to substitute the path to the script with your own details
##### INSTALL COMPLETE #####
# DO NOT MODIFY ANYTHING BELOW #

host=`hostname`
cd $sourcedir

echo "Starting FTP Backup on " $host

# Creating a local tar.gz Archive
tar cfvz $localdir/$1_`date +%y_%m_%d`.tar.gz $1

# Transfer the tar.gz Archive to remote server
if [ $2 == FTP ];then
cd $localdir
echo "user $serveruser $serverpass
cd $remotedir
bin
verbose
put $1_`date +%y_%m_%d`.tar.gz
" | ftp -i -n $serverip
fi

# Delete local tar.gz Archive again
if [ $3 == DEL ];then
rm $localdir/$1_`date +%y_%m_%d`.tar.gz
fi

echo "Ftp backup complete on " $host
exit 0
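Calling the script is then as simple as this (the domain folder name is just an example; the second argument toggles the FTP transfer, the third one the deletion of the local copy):

    ./ftpbackup.sh www.yourdomain.com FTP DEL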

31. Oct. 2008

Many web developers do excellent work, and as we all know their webpages impress us again and again every day. But while being experts in one area, on other topics, like doing the necessary database backups, they often act like absolute beginners.

Especially when it comes to database-driven sites using some CMS (Content Management System), even basics like taking backups regularly are often neglected. Yet it can be so easy to have well-ordered backups of your CMS' underlying database taken automatically and in a most convenient way. And if you are ever in 'need', you just restore your database backup and you are back in business.

Sure, you can code that backup all by yourself, but today we will show you how the combination of just a few things can do all that backup work automatically for you. At least when it comes to MySQL databases, which are widely used for all kinds of web related database activities.

We first start with a freely available PHP based backup tool in which we will define the backup to take, and later on we will use a few easy to implement tricks to automate the backups for you.

The tool of our choice is called phpMyBackupPro, which is available for free at phpMyBackupPro.net. (The tool itself is free, but the author, who has really done a great job, asks you to consider donating a little money to help him continue his excellent work.)

Installing the tool is easily done by extracting the archive and uploading its content to your webserver; of course it also works on your local system.

[Screenshot: the basic configuration page]
Once uploaded it can be invoked directly, and we start with a few basic configuration questions. There is not much to know or guess when filling out that page; you should know those details already anyway.

[Screenshot: the scheduled MySQL backup page]
From there we head over to the more interesting part of scheduling a backup. Here we first select the databases to back up on the left side. Depending on your environment there might be just one, but in a more complex setup you can also choose several or all of them and back them up together later on.

There is no problem going with all the other settings at their defaults, but feel free to play with them if you wish. Even the dangerous-looking option about adding a 'drop table' command is fine and should be checked: it refers to a possible later restore of the database, and in that case a prior drop table instruction definitely should be executed.

When we now press the button 'show script', we get some PHP code as a result which we can simply copy and paste into a new file (let's call it cron.php) that we then upload into the root folder of our phpMyBackupPro installation.

Depending on our basic setup, invoking this script in any of the usual browsers will now generate the desired database backup for us.

The result can then either be stored in a subfolder inside the tool's folder, emailed to one of our email accounts, or, quite handy, transferred via FTP to a remote server, creating a real off-site backup there.

Now, calling this script again and again from your browser is not what I would really call automated. But honestly, not much more is needed now.

Once we have tested that the script works, we just need some automatism that calls it for us again and again.

This can be done by the OS itself if we help it a bit. What we need is a command line browser that invokes the script. On a UNIX platform this can be done using either lynx or curl.

A simple one-line command will do:

  • for lynx (which is pretty much common on RedHat systems) we use:

    /usr/bin/lynx -source http://www.yourdomain.com/phpMyBackupPro/cron.php > /dev/null 2>&1

  • and with curl (mostly used on Debian distributions) we use:

    /usr/bin/curl http://www.yourdomain.com/phpMyBackupPro/cron.php
   
We then put either one of these lines into another new file (let's call it cron.sh, and don't forget to chmod it properly) which we finally schedule with the usual UNIX cron utility.
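A minimal cron.sh could then look like this (assuming the lynx variant and the example URL from above):

    #!/bin/bash
    # call the phpMyBackupPro cron script and discard the output
    /usr/bin/lynx -source http://www.yourdomain.com/phpMyBackupPro/cron.php > /dev/null 2>&1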

So with the command crontab -e we open our crontab file, and by adding a line like:

0 0 * * * /home/ /cron.sh

the system will run an automated backup for us seven days a week at midnight.

For a higher frequency or other specialties, please refer to the cron documentation itself.
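Just as an example (the path below is only a placeholder), a crontab line like this would run the backup every six hours instead of once a day:

    0 */6 * * * /home/yourname/cron.sh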
