I have been, or can be if you click on a link and make a purchase, compensated via a cash payment, gift, or something else of value for writing this post. Regardless, I only recommend products or services I use personally and believe will be good for my readers.
I’ve got a dedicated server with GoDaddy, and have been pretty happy with things. I know how important backups are, but don’t like how GoDaddy handles it.
First, you have to set it up yourself, which is OK if you know what you’re doing in Plesk, but Plesk backups SUCK. You can’t extract a single file from them, or get the SQL commands to rebuild part of the database.
Furthermore, Plesk restorations SUCK. Plesk removes the current (live) site, then attempts the restore. Oh, and if the restore fails? Sorry – you have nothing now. Talk about back-asswards.
The final nail in the coffin is the fact that you can ONLY access the GoDaddy backup FTP server from your main server. So if your main server fails, don’t worry: you have a backup – you just can’t get to it.
After all that, I decided to write my own backup script. If you want to do something right, you have to do it yourself.
I started with a basic script that saved the files to an FTP server, which was a good first step. But I couldn’t find a cheap FTP server for backups, so I looked at Amazon S3. While the pricing is a bit confusing, I know it’s cheap. It looks like I can back up my entire server, daily, for about $0.02 / day. You’re not going to find a better deal than that!
Enough already! Here’s the script. It’s completely free, and probably has more comments than actual code in it. Just PLEASE don’t ask for support on this one – get a sys. admin if you need help configuring it. Or figure it out with trial & error.
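If you just want the shape of it before downloading, here’s a stripped-down sketch of the approach – not the actual script. The bucket name, credentials, and paths below are placeholders, and it leans on the Amazon S3 PHP class from undesigned.org.za:

<?php
// Stripped-down sketch only -- NOT the actual script. Credentials,
// bucket name, and paths are placeholders; in the real script,
// everything above "Done editing script" needs to be edited.
require_once 'S3.php'; // Amazon S3 PHP class from undesigned.org.za

$nDays      = 5;                    // days of backups to keep
$workingDir = '/root/backups/';     // scratch space for archives
$cVhostsDir = '/var/www/vhosts/';   // Plesk web roots
$bucket     = 'my-backup-bucket';   // example bucket name
$stamp      = date('Y-m-d');

new S3('ACCESS_KEY', 'SECRET_KEY'); // sets auth for the static calls below

// 1. Dump MySQL and compress it
$sqlFile = $workingDir . "db-$stamp.sql.gz";
exec('mysqldump --all-databases -u root -pPASSWORD | gzip > ' . escapeshellarg($sqlFile));

// 2. Tar up the web directories
$tarFile = $workingDir . "vhosts-$stamp.tar.gz";
exec('tar -czf ' . escapeshellarg($tarFile) . ' ' . escapeshellarg($cVhostsDir));

// 3. Upload both archives with a private ACL, then free local disk
foreach (array($sqlFile, $tarFile) as $file) {
    if (S3::putObjectFile($file, $bucket, basename($file), S3::ACL_PRIVATE)) {
        unlink($file);
    }
}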
Storing your data in “the cloud” can be dangerous if you don’t have permissions set right. A webinar by @WilsonMattos cleared that up for me (previous webinar available for download).
So, get your server backed up, and once you do, make sure you can successfully restore the data!
Comments
Rodney
Thanks for sharing your script. Is this set up to only work on Plesk servers, or do you think it might work on a WHM/cPanel server as well?
Eric
If you change the paths (mainly $cVhostsDir), this should work on WHM/cPanel as well.
Dash
Hi Eric, how will this script handle the number of backups in the S3 bucket? Do I have to delete old backups manually? Or is that included in the script?
Eric
@Dash: the very first variable in the script is $nDays, which is “How many days of backups to keep?” It’s set to 5, but you can change that to whatever you’d like. After backing up, the script takes care of cleanup.
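A simplified sketch of that cleanup idea, using the same S3 class ($nDays and $bucket as above; the rest is illustrative):

// Simplified sketch of the cleanup step -- the real script may differ.
// Deletes any object in the bucket older than $nDays.
$cutoff  = time() - ($nDays * 86400);
$objects = S3::getBucket($bucket);
if (is_array($objects)) {
    foreach ($objects as $uri => $meta) {
        if ($meta['time'] < $cutoff) {
            S3::deleteObject($bucket, $uri);
        }
    }
}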
Z
Just wanted to stop by and say thank you very much for sharing this script. It’s very well commented, and worked perfectly. I set it up as a daily cron job. I now have peace of mind. Thanks a lot!
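For anyone else setting this up, a daily crontab entry for a script like this might look something like the following (the php binary and script paths are examples):

# Illustrative crontab entry: run the backup nightly at 3:15 AM
15 3 * * * /usr/bin/php /root/backup.php > /dev/null 2>&1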
Z
RPM
Great Script!
I have a question, if you don’t mind: if I want to run this script from the daily cron folder, should I change $workingDir?
Cheers!
Eric Nagel
@RPM – everything above “Done editing script” should be looked over and probably changed, including $workingDir.
Niro
Eric, thanks for the excellent script.
For some reason it gets stuck when trying to send Amazon a database file that is 400M after compression. The top command shows the CPU at 100% on the httpd process. Smaller files did work.
I tried removing the HTTPS in the S3 initialization, but it did not help.
Does anyone know how to solve this?
Jordi
I have the same problem as Niro. Is there some sort of file limit? The file that it’s uploading is around 2GB.
Eric Nagel
Hi Niro, Jordi – I’m not sure why it’s getting stuck. According to the author of the S3 component:
Known Issues:
Files larger than 2GB are not supported on 32 bit systems due to PHP’s signed integer problem
SSL is enabled by default and can cause problems with large files. If you don’t need SSL, disable it with S3::$useSSL = false;
For the database, you can change the dump to export individual tables rather than the whole database at once, and maybe then you’d be under 2GB. Or, take the .sql file exported from the database and split it up.
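A rough sketch of that per-table idea – the database name and credentials are placeholders:

// Sketch: dump each table separately so no single file nears 2GB.
// Database name and credentials are placeholders.
mysql_connect('localhost', 'root', 'PASSWORD');
mysql_select_db('mydb');

$res = mysql_query('SHOW TABLES');
while ($row = mysql_fetch_row($res)) {
    $table = $row[0];
    $out   = $workingDir . "db-mydb-$table-$stamp.sql.gz";
    exec('mysqldump -u root -pPASSWORD mydb ' . escapeshellarg($table) .
         ' | gzip > ' . escapeshellarg($out));
}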
Jordi
Thanks for the quick response. Unfortunately it didn’t work for me, though. I tried turning off SSL, but that didn’t do the trick. It’s probably the 2GB file limit then, because the tar.gz file is just a little over 2GB.
For me it’s not the database dump that’s causing problems – that’s actually uploading correctly. The problem lies with the complete file dump; I probably have too many images 😀
I’ll have to try something else then. Maybe I’ll have to split it into different files. Too bad automated backups with an option to upload to S3 aren’t baked into Plesk.
Thanks again!
Jordi
I wrote a new version which fixed my large file problems and the timeout issue. Hopefully this will fix your problem too Niro, or anyone else that stumbles on the same issues 🙂
http://www.jor-on.com/blog/2010/01/07/php-amazons3-ftp-server-backup-script
My server is backing up again – I feel much safer now hahaha 😀
Eric Nagel
Nice job, Jordi! Thanks for not only improving on the backup script, but giving it away, too!
@Niro – take a look at what Jordi did, and you should be able to back up your server now.
Adam J
Does this back up the Plesk settings?
i.e., DNS, SSL certs, email accounts, etc.?
Eric Nagel
Hi Adam – no, it does not. Also, Plesk is not required… this script will back up just about any configuration (I just happen to have & use Plesk).
Adam J
Hi Eric,
I use Plesk as well, and recreating and re-entering all those settings would take me a long time, as I have many domains and email accounts.
Brade
Just an FYI: I tried Jordi’s script, and it worked great for DBs but slowed our server way down during the actual web-directory backups. I’ve since found that JungleDisk now has a server-specific version, which I have installed on our server, and it works really well: https://www.jungledisk.com/business/server/features/
$5 a month + $0.15 per GB (the first 10 GB are free), so still a pretty cheap solution. It has functions to automate recovery, etc., and is much less resource-intensive during the backup.
Carlos
Thank you, friend – worked perfectly!
To make a backup from a Linux server, I made these changes:
First, I changed the original values in $mysql_server, $mysql_username, and $mysql_password.
Next, I changed the path (the original from Plesk) to the path on my Linux server:
$workingDir = '/home/myusername/';
I downloaded the S3 class file (S3.php) from http://undesigned.org.za/2007/10/22/amazon-s3-php-class (version 0.4.0 – 20th Jul 2009)
and put the file in a new folder: public_html/backupfiles/s3.php
And later:
include('backupfiles/s3.php');
And Done!
Obviously I’m a newbie user, but I’m sure this information will be useful to other users.
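Gathered in one place, those edits amount to something like this (all values are examples):

// Carlos's edits gathered in one place -- all values are examples
$mysql_server   = 'localhost';
$mysql_username = 'myuser';
$mysql_password = 'mypass';
$workingDir     = '/home/myusername/';
include('backupfiles/s3.php'); // S3 class (0.4.0) from undesigned.org.za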
guilliam
Hello Eric,
Is there a way I can include cPanel’s email in my backup? Thanks for this newbie-friendly solution you shared.
– g
Eric Nagel
Hrm… not really sure. You’d have to either dump the email from cPanel automatically, or find where the messages are stored and tar/gzip them in their native format.
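Untested, but something along these lines might work – assuming cPanel keeps mail under each account’s home directory, which you should verify on your own box:

// Untested sketch: archive the raw mail store and upload it too.
// The ~/mail location is an assumption about cPanel's layout -- verify it.
$mailDir = '/home/myuser/mail/';
$mailTar = $workingDir . "mail-$stamp.tar.gz";
exec('tar -czf ' . escapeshellarg($mailTar) . ' ' . escapeshellarg($mailDir));
S3::putObjectFile($mailTar, $bucket, basename($mailTar), S3::ACL_PRIVATE);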
Chris Seckler
I noticed you mentioned your script will work on Plesk servers and WHM/cpanel servers, but will your script work on Media Temple servers as well?
Eric Nagel
Hi Chris,
This will work with any setup, as long as you set the variables at the beginning of the script properly. You do NOT have to be running Plesk. I just used the basis of this script for a Webmin setup.
Juan Lopez
Hi Eric,
This script:
http://ericnagel.wpengine.com/wp-content/uploads/backup-src.phps
it will dump all databases and public_html files, and compress it all, correct?
Thanks.
JL
Eric Nagel
Yep, that’s the basis of it.
microno
Hi, this script is no longer available. Could you please share it again, or share it with me? I’m really interested – thank you.