I am looking for a script that will do the following:
- List all files and folders in a public web server directory and download them to a directory on the local server.
- Run at a set interval to grab/back up that directory (e.g. download from the web server every 10 minutes).
Basically, this needs to download all contents (files and subfolder contents) and place them into the local server's nginx web folder.
So it will:
1. Download the files from the public directory online.
2. Place all of the public directory's files and subfolders into the "C:/nginx/html" folder every 10 minutes.
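The steps above could be sketched roughly as follows (Python, since "any language is fine"). The source URL, destination path, and the assumption that the public directory serves an Apache/nginx-style auto-index page are all placeholders to adapt:

```python
# Minimal sketch of mirroring a public directory-listing page.
# BASE_URL and DEST_DIR are hypothetical; adjust for the real site.
import os
import re
import urllib.parse
import urllib.request

BASE_URL = "http://example.com/public/"   # hypothetical source directory
DEST_DIR = r"C:/nginx/html"               # local nginx web root

LINK_RE = re.compile(r'href="([^"?]+)"', re.IGNORECASE)

def listing_links(html):
    """Extract child links from an auto-index page, skipping
    parent-directory, absolute, and sort-query links."""
    links = []
    for href in LINK_RE.findall(html):
        if href.startswith(("/", "../")) or "://" in href:
            continue
        links.append(href)
    return links

def mirror(url, dest):
    """Recursively download a directory listing at url into dest."""
    os.makedirs(dest, exist_ok=True)
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    for href in listing_links(html):
        child_url = urllib.parse.urljoin(url, href)
        if href.endswith("/"):                      # subdirectory: recurse
            mirror(child_url, os.path.join(dest, href.rstrip("/")))
        else:                                       # plain file: fetch it
            urllib.request.urlretrieve(child_url, os.path.join(dest, href))

if __name__ == "__main__":
    mirror(BASE_URL, DEST_DIR)
```

For the every-10-minutes part on Windows Server 2008, the built-in Task Scheduler can run the script, e.g. `schtasks /Create /SC MINUTE /MO 10 /TN MirrorSite /TR "python C:\scripts\mirror.py"` (task name and script path are illustrative).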
I am paying, but please contact me first! PHP or any language is fine. The server is running Windows Server 2008, Apache + PHP 5.x.
Please send me a message if interested! I need this urgently!!
I’m interested in knowing more details. Since there is no contact information on your profile, please get in touch via mine.
All the best, Imad Jomaa.
I am interested in working on this one too.
Please let me know your thoughts.
Best regards, Noman.
Note that the script as described wouldn't create a proper backup. For example, it would miss all databases, and most sites today use a database of some kind. You'd also need to rotate the backups, because it's entirely possible that the site will be hacked shortly before the files are downloaded; in that case, you'd end up with two copies of garbage. There are about ten other problems with this approach, but you get the idea.
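On the rotation point: instead of overwriting one copy every 10 minutes, the script could write each run into a timestamped snapshot folder and prune old ones. A minimal sketch, assuming a hypothetical backup root and retention count:

```python
# Rotation sketch: timestamped snapshot folders plus pruning.
# BACKUP_ROOT and KEEP are illustrative values, not a recommendation.
import os
import shutil
from datetime import datetime

BACKUP_ROOT = r"C:/backups/site"   # hypothetical backup root
KEEP = 12                          # e.g. 12 snapshots = 2 hours at 10-minute intervals

def new_snapshot_dir(root=BACKUP_ROOT):
    """Create and return a fresh timestamped snapshot directory."""
    path = os.path.join(root, datetime.now().strftime("%Y%m%d-%H%M%S"))
    os.makedirs(path)
    return path

def prune(root=BACKUP_ROOT, keep=KEEP):
    """Delete the oldest snapshots, keeping the most recent `keep`.
    The timestamped names sort chronologically, so a plain sort suffices."""
    snapshots = sorted(
        d for d in os.listdir(root)
        if os.path.isdir(os.path.join(root, d))
    )
    for old in snapshots[:-keep]:
        shutil.rmtree(os.path.join(root, old))
```

This at least ensures a pre-compromise snapshot survives for a while after a hack, though it still doesn't capture databases.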
You may want to look into a professionally designed, time tested solution like Clonebox.