  1. #1
    Full Member c4's Avatar
    Join Date
    January 18th, 2005
    Posts
    488
    I'm finishing a new project and it is VERY important to me that the site has zero downtime.

    To achieve this I want to have the site hosted on serverA and a backup on serverB (from another company). Is it possible (and how) to set up some kind of automatic backup system which would copy all the files from serverA (including the SQL databases!) over to serverB at least once per day, so the domain would switch to serverB if serverA went down for some reason.

    Thank you



    YOURsoft affiliate software
    Join YOURsoft "affiliate program for affiliate software"!

  2. #2
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    594
    c4,

    I assume these 2 servers will have domain rights and/or authentication that allows access to one another.

    For the flat files in the web site: one of the more reliable methods is just using an old xcopy command; drop it into a .bat file and use the Task Scheduler to run it each day. This has worked well for me for a long time and is very reliable.

    Example and syntax:
    http://www.robvanderwoude.com/index.html

    For the SQL databases: first, I suggest using the xcopy job above to copy over any and all backups of your databases. Second, I suggest setting up a job within SQL Server Agent to back up the databases at your required interval. (I run this every four hours.) If you are a DBA (or know one), you can set up your databases in a publisher/distributor replication relationship to copy your primary to the secondary. These copies are near-instantaneous and quite reliable.

    IamJaloppy

  3. #3
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    1,916
    yes

  4. #4
    Newbie
    Join Date
    January 18th, 2005
    Posts
    8
    c4:

    It depends on what software you're using. For example, if you're running Windows Server and MS SQL Server and you have a static site (pages and database stuff always stay the same), you can have a "cold server" that simply picks up when the main server(s) fail. Microsoft does not charge you for additional licenses of SQL Server in a case like this.

    However, if you are doing log shipping in SQL Server, Microsoft requires you to buy full licensing for the backup server, as it is constantly in use, even though not used directly. (It is being used to back up the database and files, even though it is not directly serving web pages 99% of the time.) I'm not sure about the technicalities, but you might be able to get away with one SQL license that is per-processor and another that is per-seat...??

    However, if you're using Linux, MySQL, Oracle, etc., pricing will either be free or set up in different ways. I just know that if you're running a hefty database on MS SQL Server (say, 4 processors), that is 4 additional licenses you must purchase. That's an extra $8,800 or so for Standard edition and $14,000 for Enterprise.

    Just trying to help. I've never actually implemented a system like this before but have done quite a bit of reading up on it. Anyone else feel free to correct me on this if I happen to be wrong on something.

    Andrew

  5. #5
    Newbie
    Join Date
    January 18th, 2005
    Posts
    8
    Oh yeah, if you're really interested, check out the book "Strategies for Web Hosting and Managed Services" by Doug Kaye. It is aimed more at web hosting in general, but it does have a good chapter or two about mirroring, load balancing, and cold-standby and hot-standby servers, though it is not specific to any software in particular.

    If you are running Windows and SQL Server I can probably recommend a few good books there too which would be better informational sources about doing this sort of stuff.

    Andrew

  6. #6
    Full Member c4's Avatar
    Join Date
    January 18th, 2005
    Posts
    488
    Thanks for the replies.

    My site is running on a Linux server. The site has static pages with a member area and some other scripts written in Perl, using a MySQL database (thanks for the Windows/MS SQL information anyway, Rebies). It also has an SSL certificate installed, and I'll probably have mod_perl installed too, for better performance when the site grows larger.

    I think I will really go with that kind of copy job, with cron to run it (xcopy itself is Windows-only, so on Linux it would be cp or rsync). Was thinking about something similar myself, but wanted to hear your opinions.

    I'll check out "Strategies for Web Hosting and Managed Services" for any extra info. If anyone has any other ideas please let me know.

    Thank you




  7. #7
    ABW Ambassador
    Join Date
    January 18th, 2005
    Posts
    1,916
    rsync (over ssh)
    mysql replication

    forget all that m$ garbage, you'll need 3 extra servers to account for the m$ failures
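
    For what it's worth, those two lines might translate into crontab entries on serverA along these lines. This is only a sketch: the hostnames, paths, and backup user are made up, the mysqldump line assumes credentials are configured, and MySQL replication proper needs its own server-side setup not shown here.

    ```
    # crontab on serverA -- hostnames and paths are illustrative
    # m  h  dom mon dow  command
    30   3  *   *   *    rsync -az --delete -e ssh /var/www/ backup@serverB:/var/www/
    45   3  *   *   *    mysqldump --all-databases | gzip | ssh backup@serverB 'cat > /backups/all.sql.gz'
    ```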

  8. #8
    Full Member c4's Avatar
    Join Date
    January 18th, 2005
    Posts
    488
    Originally posted by Joseph Monuit:
        forget all that m$ garbage, you'll need 3 extra servers to account for the m$ failures






  9. #9
    Full Member
    Join Date
    January 18th, 2005
    Posts
    388
    If you are running Linux then you can get the following setup:
    - On box A, run a cron job that tars the files you need, does a mysqldump of your databases, and gzips everything. Don't use MySQL replication (if it can even be called that); it's much worse than people think. Just use the mysqldump utility to make snapshots of your DB, and gzip your data.
    - On box B, create a cron job to scp or sftp (don't use ftp or rcp) the files from box A and place them accordingly. Also drop and recreate the database from the freshly downloaded mysqldump'ed SQL command set.
    - On box B, run another cron job that curls (wget, or fetch) a page on box A, and if curl returns an error or the file size is wrong (like your DB connection failed and the test page is not what it's supposed to be), then change the A record in your zone. It's easiest done by having BIND running as master on box B. Set something like 5 seconds as the TTL for your zone (some clients won't obey it, but that's life).
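
    The box-A job in the first step might look like the sketch below. The /tmp paths and demo data are stand-ins so it runs anywhere; the mysqldump line is commented out because it needs a live MySQL server and real credentials.

    ```shell
    #!/bin/sh
    # Nightly backup job for box A -- paths are illustrative stand-ins.
    set -e
    STAMP=$(date +%Y%m%d)
    DOCROOT=/tmp/docroot       # real life: /var/www or similar
    BACKUPS=/tmp/backups       # real life: /var/backups/site

    # Demo data so the sketch runs anywhere:
    mkdir -p "$DOCROOT" "$BACKUPS"
    echo '<html>hello</html>' > "$DOCROOT/index.html"

    # 1. Tar and gzip the site files.
    tar -czf "$BACKUPS/site-$STAMP.tar.gz" \
        -C "$(dirname "$DOCROOT")" "$(basename "$DOCROOT")"

    # 2. Snapshot the database (needs a running MySQL server, hence commented out):
    # mysqldump -u backupuser -p"$DB_PASS" sitedb | gzip > "$BACKUPS/sitedb-$STAMP.sql.gz"

    echo "backup written: $BACKUPS/site-$STAMP.tar.gz"
    ```
    
    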

    Also set up box B to email you whenever such an event occurs, so you can actually check whether everything is OK.
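
    The box-B health check might be sketched like this. The function and file names are invented for illustration, the demo fetches a local file:// URL so it runs without any network, and the real DNS flip (editing the zone on box B's BIND master and reloading) is left as comments:

    ```shell
    #!/bin/sh
    # check_origin URL EXPECTED_SIZE -> prints OK or FAILOVER
    check_origin() {
      page=$(curl -sf "$1") || { echo FAILOVER; return 1; }
      if [ "$(printf '%s' "$page" | wc -c)" -eq "$2" ]; then
        echo OK
      else
        echo FAILOVER
      fi
    }

    # Demo against a local file (stands in for http://serverA/health.html):
    printf 'all good' > /tmp/health.html
    check_origin "file:///tmp/health.html" 8

    # On a real FAILOVER you would point the A record at box B, e.g.:
    #   sed -i 's/serverA_IP/serverB_IP/' /etc/bind/db.example.com
    #   rndc reload example.com
    # ...and mail yourself a notice:
    #   echo "serverA failed health check" | mail -s FAILOVER you@example.com
    ```
    
    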

    Two things to worry about:
    1) mysqldump can dump the DB in an incorrect state (with respect to your application logic).
    2) Don't tar files that might be changed while tar is working on them. Instead, copy them first and then tar from the other location, where you know the files are not being written to.
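
    That second point can be sketched like this, again with /tmp stand-ins for the live tree:

    ```shell
    #!/bin/sh
    # Copy the live tree to a quiet staging area first, then tar the copy,
    # so tar never reads files that are being written to mid-archive.
    set -e
    LIVE=/tmp/live         # stand-in for the live docroot
    STAGE=/tmp/stage

    # Demo data:
    mkdir -p "$LIVE"
    echo 'data' > "$LIVE/file.txt"

    rm -rf "$STAGE"
    cp -a "$LIVE" "$STAGE"                            # 1. fast snapshot of the live files
    tar -czf /tmp/site-snapshot.tar.gz -C /tmp stage  # 2. tar the static copy
    echo "archived from staging copy"
    ```
    
    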

    This is a good el-cheapo solution for running a backup box. If you want something really solid, you'll need to buy some enterprise-level software that was specifically designed for that.

    Konstantin,
    www.GenericGifts.com

  10. #10
    Full Member c4's Avatar
    Join Date
    January 18th, 2005
    Posts
    488
    Hey Konstantin, thanks for the comments. I am actually planning something like this, and I appreciate your warnings about MySQL replication and about tarring the files from a copy instead of the "live" files.

    I hope I can get this baby working



