
Split and import a very large database with PHP

I had to import an SQL file of over 20 GB into a MySQL database, but ran into a lot of problems doing this. I don't have the best server in the world, so the import takes about 8 hours to complete, and I noticed that after a few hours the import stopped or my connection broke down, forcing me to start all over again. I tried this five times without success and got a little frustrated, so I decided to write a small and simple PHP script to split the large file into smaller pieces. That way I could import the pieces one by one and easily resume after the connection broke down.

Splitting the SQL file

I just read every line with PHP and, whenever I encounter a comment line (starting with "-- "), I create a new file. In a mysqldump file those comment lines appear before every table structure and data block, so the splits end up roughly one per table section.

//Set the time limit to one hour
set_time_limit(3600);

$dir  = '/path/to/store/splits';
$file = '/path/to/large/sql/file.sql';
$i    = 1000000;

//Open the large sql file for reading
$handle = fopen($file, "r");
if ($handle) {
    while (($buffer = fgets($handle)) !== false) {
        //Read a line and append it to the file named $i.sql
        $newfile = fopen($dir . '/' . $i . '.sql', 'a+');
        fwrite($newfile, $buffer);
        fclose($newfile);
        //If a comment is found, start a new file
        if (substr($buffer, 0, 3) === '-- ') {
            $i++;
        }
    }
    fclose($handle);
}
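A small side note on the counter: starting $i at 1000000 has the (possibly unintended) nice effect that every split gets a filename with the same number of digits, so the plain alphabetical order you later get back from scandir() matches the order in which the pieces appear in the dump.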

Splitting this 20 GB file took about 10 minutes on my server and created approximately 500 files. The largest file was 5 GB, and I knew I could import a 5 GB file without a problem.
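If you want to be sure nothing got lost along the way, a quick sanity check is to compare the combined size of the splits against the original dump; since every line is written out verbatim, the byte counts should match exactly. This is just an optional sketch using the same placeholder paths as above, not part of the original workflow.

$dir  = '/path/to/store/splits';
$file = '/path/to/large/sql/file.sql';

//Add up the sizes of all the split files
$total = 0;
foreach (glob($dir . '/*.sql') as $split) {
    $total += filesize($split);
}

//Compare against the size of the original dump
if ($total === filesize($file)) {
    echo "All good: the splits add up to the original " . $total . " bytes\n";
} else {
    echo "Size mismatch: " . $total . " vs " . filesize($file) . " bytes\n";
}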

Importing the SQL files

Importing the files was pretty simple. I wrote a script that loops over all the split files and imports them into the database one by one.

set_time_limit(3600000);

//The directory containing the split files
$dir = '/path/to/store/splits';
$files = scandir($dir);

ob_start();
//Loop over the files and import them into the database
foreach ($files as $file) {
    //Skip the "." and ".." entries returned by scandir()
    if ($file === '.' || $file === '..') {
        continue;
    }
    echo $dir . '/' . $file . "\n";
    system("mysql -u mysql_username -pmysql_password mysql_databasename < $dir/$file");
    ob_flush();
}
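The script above always starts again from the first file, which was exactly my problem with the single 20 GB import. One possible variation (not something the script above does) is to move each split into a "done" subdirectory once mysql reports success, so that a rerun of the script only picks up the files that are still left. The "done" directory, the exit-code check and escapeshellarg() below are my assumptions on top of the original script.

set_time_limit(0);

$dir  = '/path/to/store/splits';
$done = $dir . '/done';
if (!is_dir($done)) {
    mkdir($done);
}

//glob() returns the .sql files in alphabetical order, so the pieces
//are imported in the same order they were written
foreach (glob($dir . '/*.sql') as $file) {
    echo "Importing " . $file . "\n";
    system("mysql -u mysql_username -pmysql_password mysql_databasename < " . escapeshellarg($file), $exitCode);
    if ($exitCode !== 0) {
        //Stop at the first failure; running the script again resumes here
        echo "Import of " . $file . " failed, stopping\n";
        break;
    }
    //Mark this piece as done so a rerun skips it
    rename($file, $done . '/' . basename($file));
}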

In my case, it still took more than 8 hours to completely import the database, but with the help of these two scripts it worked like a charm.