PHP Classes

1) Huge and too complicated, unnecessarily! 2) Empty SQL fil...


PHP Multi MySQLDump  >  All threads  >  1) Huge and too complicated,...
Subject: 1) Huge and too complicated,...
Summary: Package rating comment
Messages: 3
Author: Alekos Psimikakis
Date: 2017-10-15 15:55:37
 

Alekos Psimikakis rated this package as follows:

Utility: Insufficient
Consistency: Not sure
Documentation: Bad
Examples: Bad

1. 1) Huge and too complicated,...
Alekos Psimikakis - 2017-10-15 15:55:37
1) Huge and too complicated, unnecessarily!
2) Empty SQL files are created (in the 'dumpsql' subfolder), even though 'dbConnSettings.php' is set up as per the instructions and index.php reports the "Dump successfully accomplished" message.
3) Clicking the "zip" links to download the SQL files produces a fatal error in 'downloaddump.php':
"Class 'LogDeltaTime' not found in G:\XAMPP\htdocs\Work\temp\x1\downloaddump.php on line 30"

Oh, come on! Dumping a DB is a very simple thing. There's even a 'mysqldump' command! I do it often using a batch file!
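[For reference, the kind of batch-file dump described above can be sketched as a short shell script. Everything here is a placeholder sketch (host, user, and database names are invented), and the command is echoed rather than executed so it illustrates the idea even where mysqldump is not installed:]

```shell
# Minimal sketch of a "dump via batch file" workflow (placeholder names).
DB_HOST="localhost"
DB_USER="root"
DB_NAME="mydb"                              # placeholder database name
OUT_FILE="${DB_NAME}_$(date +%Y%m%d).sql"   # e.g. mydb_20171015.sql

# Build the mysqldump command; echo it instead of running it so this
# sketch is safe to try anywhere.
CMD="mysqldump -h ${DB_HOST} -u ${DB_USER} --single-transaction ${DB_NAME}"
echo "${CMD} > ${OUT_FILE}"
```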

2. Re: 1) Huge and too complicated,...
Alessandro Quintiliani - 2017-10-17 10:00:21 - In reply to message 1 from Alekos Psimikakis
Hello Alekos

About the empty SQL files: check the grants on the server you are dumping the database from.
If the database is the back end of some hosting service (e.g. Aruba), you cannot dump it due to policy restrictions.

As to point 3), check the permissions of the folder and the path where you put Class.LogDeltaTime.php; remember that activating this class is optional (the README file explains how to do it).
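[A quick way to check that point: test whether the class file is present and readable from the package folder. This is a generic sketch; the path below is a placeholder you would adjust to your own install:]

```shell
# Check that the optional logging class file is readable (placeholder path).
CLASS_FILE="./Class.LogDeltaTime.php"

if [ -r "${CLASS_FILE}" ]; then
    echo "OK: ${CLASS_FILE} is readable"
else
    echo "MISSING: ${CLASS_FILE} not found or not readable"
fi
```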

I know that dumping a MySQL database is quite easy and you can do it with the mysqldump command, but if you read the README file carefully you will see that the database tables you choose to dump are each saved to their own SQL file by parallel processes. Also, a table is backed up only if its structure and/or data have changed since the previous run; if nothing has changed, a new dump of that table is not executed (although you can choose to force the dump at each run). This saves time, especially if you have large tables (e.g. 10 GB) whose data seldom change.
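[The skip-if-unchanged idea described above can be sketched generically, not as the package's actual code: keep a checksum of the last dump per table and skip the new one when it matches. The function and file names here are invented for illustration:]

```shell
# Generic sketch of "dump only if changed": store a per-table checksum
# and compare it against the freshly generated dump (placeholder names).
dump_if_changed() {
    table="$1"
    new_dump="$2"                 # freshly generated dump file
    sum_file="${table}.sha256"    # checksum from the previous run

    new_sum=$(sha256sum "${new_dump}" | cut -d' ' -f1)
    old_sum=$(cat "${sum_file}" 2>/dev/null)

    if [ "${new_sum}" = "${old_sum}" ]; then
        echo "unchanged: skipping ${table}"
    else
        echo "${new_sum}" > "${sum_file}"
        echo "changed: keeping new dump of ${table}"
    fi
}
```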

Regards

Alessandro Quintiliani

3. Re: 1) Huge and too complicated,...
Alekos Psimikakis - 2017-10-17 17:06:47 - In reply to message 2 from Alessandro Quintiliani
Hi Alessandro

I don't think the problem is privilege restrictions, because I run this on localhost using standard credentials (host='localhost', user='root', pass=''). Besides, if there were such an error, I believe your code would have trapped it.

As for huge DBs, maybe your class would indeed be useful as you describe, but I couldn't say, could I, since I couldn't make it work with small ones ...

You have done a lot of work on this package; I believe it is worthwhile to test it again against the issues I mentioned, using a small DB, and fix them.