
Discussion on Ultimate Backup

cezarys

cezarys does not currently provide support for this item.


1. Besides the database, can files be backed up to Amazon S3 in different folders?
2. Will it create folders on the fly if needed?
3. Does it have an interface to drop and upload files?

Hi,

@1 – no, the database dump (.sql) goes into the zip file if you place the directory set by setDatabaseDumpDir() inside setBackupPath(). The zip file with the backed-up files and database(s) then goes to S3.
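For example (paths are illustrative; the method names are the ones used in this thread):

$ultimateBackup->setBackupPath('/var/www/public_html'); // everything under this path goes into the zip
$ultimateBackup->setDatabaseDumpDir('/var/www/public_html/db-dump'); // the .sql dump lands inside the backup path, so it ends up in the zip too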

I think this is something I should add in the next version, thank you very much for the suggestion.

@2 – yes, the script creates folders on the fly if needed

@3 – no, the script doesn't have any interface; the main purpose of this script is to add it to a cron job.

Hi,

The new version has arrived. I've added an interface to generate PHP files and to execute a single backup, which means the answer to question 3 is now "yes".

In addition, I fixed a few more things. You can now upload the backup file to a different folder on S3, not only into the root of your bucket.

Hi

Can I back up another website that is not on the same server?

No, the script backs up the website within one server. It can then send the backed-up website to another server via FTP.

How is it possible to back up from the website root folder?

I'm trying to back up quite a large website (files and DB) of about 10 GB.

Well, to be honest, I didn't check for websites that big. I'll check today and let you know what I can do about it. I'll probably release an update afterwards.

I just performed a test. I have a directory with pictures, 977 MB. My machine has a 512 MB memory limit for PHP. I was able to create a proper backup of these files.

Therefore, the backup (and website) size can be higher than the memory limit.

But in your case, 32 MB is a very low value these days.

Please write a message to your hosting admins; they should be able to increase your memory limit.
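Where the host allows it, you can also try raising the limit per script – a hypothetical tweak, since many shared hosts lock this setting down:

ini_set('memory_limit', '512M'); // has no effect on hosts that forbid overriding the limit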

Can this connect to a remote MySQL server and download a backup of a DB?

Yes, that's correct.

Connecting to the database is not part of the class; you need to be connected to the DB before using it.

As long as you have a connection to MySQL, the script can fetch the list of databases and make a copy of them.

The GUI tool asks for DB credentials; however, they are only used to invoke the default mysql_connect right before using the class.
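A minimal sketch of that flow (host and credentials are placeholders):

require_once 'includes/UltimateBackup.php';

mysql_connect('remote-db.example.com', 'db_user', 'db_password'); // connect first – the class won't open the connection for you

$ultimateBackup = new UltimateBackup();
$ultimateBackup->setBackupPath('../');
$ultimateBackup->backup(); // dumps the databases over that connection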

Hi, I would like to clarify: does it do incremental, differential, or full backups? Or synchronization, copying only the new files?

For the databases, can it be set to do hourly or periodic backups?

Thanks a lot in advance.

Hi,

@backup type – it always creates full backups; it doesn't store information about previous backups, therefore it cannot back up "new files only".

@periodic backup – you can add this script to cron; this is how it's resolved.
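For example, a crontab entry along these lines (path and schedule are illustrative) runs the backup script every hour:

0 * * * * /usr/bin/php /var/www/public_html/backup/example.php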

Thank you.

Cezary

Hello,

I have kept the script in public_html/backup folder.

And I am backing up the whole public_html folder.

As usual the zip file is created in the backup folder.

During the next backup, the zip file gets included again, thus increasing the size of the backup and of the upload to Dropbox.

How do I get the script to delete the zip file after it is done uploading to dropbox?

Please write a message to cezary.siwa@gmail.com so I can give you a fix. I'll probably make an official update for it.

Hi

Sorry for making you wait so long.

Anyway, I checked the issue and here is a solution.

In includes/UltimateBackup.php, at the very bottom, you have this line:

$this->log('-----');

Please add

unlink($this->backupFilename); // delete the local zip once it has been uploaded

Right after that line. I'm gonna add it to the next update.

How can you check that it is working?

I have put the script into a subfolder of my root. The subfolder is "backup". Is this correct? $ultimateBackup->setBackupPath('../');

If you're not going to use Amazon S3, and you don't want to modify your server config, you can alter the UltimateBackup.php file by removing this:

require_once dirname(__FILE__) . '/aws.phar';
use Aws\S3\S3Client;

from the top.

OK, thanks. But I got a 500 Internal Server Error.

In the FTP: http://prntscr.com/4y03jy

All the sites together are 5 GB.

Error 500 is mostly related to problems with the server. I can see in your screenshot that the script already started to create the ZIP, but the size is not fully shown. Is it 1 GB? Did you open the zip, and does it work? If not, it means the process was interrupted.

I already posted a test (here, in the comments) of maximum backup size vs. memory limit, and I personally made a backup that was almost 1 GB with a 512 MB memory limit, so in your case it wasn't an out-of-memory error.

To find the cause of error 500 you might need to read the Apache error log.

Can this back up and restore larger databases, and is it a staggered approach?

thanks

This software doesn't restore backups, it only creates them. You can restore the SQL file using phpMyAdmin. Please let me know what "larger database" means to you (how many MB). So far I have tested databases up to 30 MB, but that's probably not really big.

Do you still support this item? Can it be used to back up files and a database to FTP on another server, and can it run via cron? Also, do you provide a script to automatically delete files more than one week old, so we can save more space? Thanks a lot.

Yes, I'm supporting this item. It can be used to back up files and databases, and then send the zipped package via FTP to another server. It can be executed via cron. However, the package doesn't include a file that would delete the old backups. I can create one for quite a low fee; please contact me using the contact form (or e-mail – it's in the documentation if you buy the item).
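In outline, such a cleanup script could look like this (an illustrative sketch, not part of the package; the path is a placeholder):

$dir = '/var/www/backups';
$maxAge = 7 * 24 * 60 * 60; // one week in seconds
foreach (glob($dir . '/*.zip') as $file) {
    if (filemtime($file) < time() - $maxAge) {
        unlink($file); // drop backups older than a week
    }
}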

Hello, I would also like to know if the feature to delete backups after a set time period will be available in a future release, or if it is something you could build/add for me. I also want to confirm that it can back up multiple databases on the same server, not just multiple tables inside one database. Do I have to manually set up a cron job to run the backup, or does this application allow me to set the run time in the interface itself?

As you can see on the product page:

"Software Version PHP 5.3, PHP 5.4, PHP 5.5"

5.1.6 is VERY old (released 24 Aug 2006).

ZipArchive requires at least PHP 5.2.

It's strongly recommended to upgrade your PHP to at least the 5.4.x line.

I don't think my server is up to date enough to run this backup. Can I just get a refund on this program? It is useless to me at this point.

Let me know what the procedure is for a refund.

I'm not really sure – I have never done it, and I'm trying to figure it out now.

As far as I have read on the forum, you might need to write to support; however, I'm not sure they will agree to a refund, because it's clearly written that the product won't work on PHP below 5.3.

Basically it’s not my fault that it doesn’t work on your server.

Hello! A pre-sale question: does it offer any restore option too? Please explain briefly.

Hi. I'm sorry, but this script doesn't restore the backup; it just creates it.

To restore the backup you need to unzip the file and load the database, using for example phpMyAdmin. The database is an SQL file in the same package as the rest of the files.

I just purchased the library and am testing it with my e-commerce site on a shared hosting account. It seems my hosting provider has disabled the set_time_limit() function.

Warning: set_time_limit() has been disabled for security reasons

Is that a problem in terms of using the script on such hosting?

Hi. I investigated the issue some time ago; adding such functionality requires a bigger update. I'll provide it soon, since I already have a list of feedback – like the item from Jurarj (see below).
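In the meantime, a defensive guard like this (an illustrative sketch, not the shipped code) avoids the warning on hosts that disable the function:

$disabled = array_map('trim', explode(',', (string) ini_get('disable_functions')));
if (function_exists('set_time_limit') && !in_array('set_time_limit', $disabled, true)) {
    set_time_limit(0); // remove the time limit only where the host allows it
}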

Hi. Any updates? I am still unable to use the product, as the SQL dump filename cannot contain a timestamp.

Well, it seems you do not reply to many questions lately. I suggest you at least modify the product description – it says the filesystem and DB can be compressed with zip and that you can use a timestamp in the zip filename. That is not true. No update after 5 months of waiting. False advertising. I think this should not be tolerated on CodeCanyon.

Would it be able to back up only specific database tables? Example: a DB "SITE" with 30 tables, but I want to back up only 10 tables.

At this moment there is no such possibility, but this is a great idea and quite easy to implement. I'll try to provide an update with this functionality within days (along with a few more suggestions the buyers had). Thank you!

Hi, do you have any plans to implement it in your script?

Dropbox is not working – issues with the Dropbox API.

Hi.

Yesterday I investigated this issue with another user. It's related to the dynamic naming of the zip file (dates).

It can be resolved by a one-line fix:

includes/UltimateBackup.php

At line 490, right after

function backup() {

add this line:

$this->backupFilename = strftime($this->backupFilename);

After that fix, I tested this and it worked on my side.
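With the fix in place, a strftime pattern in the filename expands at backup time; for example (pattern and path are illustrative):

$ultimateBackup->setBackupFilename('./backups/backup-%Y-%m-%d.zip'); // expands to e.g. backup-2014-11-05.zip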

The update for this will be soon.

The official update is under way. It will be available once it’s accepted by the Reviewer.

One more tip. If you’re using

$ultimateBackup->setBackupFilename('./catalog/test.zip');

with a directory name, please make sure that the "catalog" directory exists.

Looks great. However, can I compress to tar.gz? I think there is a 2 GB limit on ZIP files.

The only limitations for this plugin are the limitations of your hosting. I tested it on multi-GB websites on my local computer. I already proved (above) that you can back up a website bigger than the maximum allowed memory, but if there are other limits imposed by the hosting, we cannot override them.

Hi,

Can this app take a backup from HostGator shared hosting?

Thanks Umair

Hi.

I tested it on GoDaddy; it should work with most shared hosts, but you need to check the requirements:

1) PHP 5.3+
2) set_time_limit() enabled
3) cURL enabled
4) MySQL database

According to this

http://support.hostgator.com/articles/php-modules

you have cURL for sure.

According to this

http://support.hostgator.com/articles/hosting-guide/hardware-software/what-version-of-php-are-you-using

PHP version won't be a problem.

MySQL should also be enabled – however, if you don't have a MySQL database, you just won't back it up :-)

HOWEVER

according to this

http://support.hostgator.com/articles/cpanel/php-settings-that-cannot-be-changed

max_execution_time = 30

So the question is – will your website back up in 30 seconds? If it's up to 30 MB, it should.

Does your software do automatic backups through a cron job and automatically upload to Dropbox or Google Drive, without any clicks?

https://www.youtube.com/watch?v=TBr47SVcymI – here is an example tutorial on how to set up a cron job in cPanel. There are plenty of videos like this.

How does your script make a backup without a cron job? Do you have any demo to check?

No, I don't have a demo to check, but it works the same. You just need to execute the PHP file.

Let’s say your website is http://example.com

You upload the example file from the package (you modify it for your needs, of course), which has the name example.php,

and you just enter in your browser:

http://example.com/example.php

And that's it. This is how you run the script manually.
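A minimal example.php along these lines (an illustrative sketch; credentials and paths are placeholders, class and method names as used in this thread):

<?php
require_once 'includes/UltimateBackup.php';

mysql_connect('localhost', 'db_user', 'db_password'); // the class expects an open DB connection

$ultimateBackup = new UltimateBackup();
$ultimateBackup->setBackupPath('../'); // back up everything above the backup folder
$ultimateBackup->setBackupFilename('./backups/backup-%Y-%m-%d.zip'); // dated name, see the strftime fix above
$ultimateBackup->backup();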

Hello,

For the Google Drive credentials, I found the e-mail address & client ID, but the APPLICATION NAME is not there! Where can I find it?

Thanks

Hi

Please go to

https://code.google.com/apis/console/

Can you see "Project ID:"? If so, this should be the application name.


Hello! I am using it for a client and having difficulties specifying Google Drive's .p12 file. Everything else is right, but I guess the problem is the path to the Google Drive .p12 file. It's in the same directory as my backup.php file, but when I give the path to that file, /home8/webkitme/public_html/yummfood/backup/google.p12, it doesn't work. I even tried giving just ./google.p12, and also ./, and also google.p12, but nothing works. Please guide me on this. The log says "error while uploading file to google drive".

Hi.

Thank you for contacting me via e-mail. I have already asked you to provide the FTP credentials so I can investigate the issue for you directly in your environment.

Thank you

Cezary
