Splitting massive MySQL dumps

As I posted yesterday, I have a massive MySQL dump to import. I tried BigDump, but one of the tables kept producing errors, which made BigDump exit. I don't need the whole db imported, so I wrote this to split it by table. It produces a new .sql file for every table it finds, numbered sequentially, so processing them in alphabetical order is equivalent to importing the whole dump. USE statements get their own files in the same sequence.

    #! /usr/bin/perl

    use strict;
    use warnings;
    use 5.010;

    my $dump_file = $ARGV[0];
    usage() if !$dump_file;

    say "using " . $dump_file;

    my ($table, @query, $file_name);
    my $line_number = 0;
    my $find_count  = 0;

    open(my $dump_in, '<', $dump_file) or die "Can't open $dump_file: $!";
    while (my $line = <$dump_in>) {
        $line_number++;
        if ($line =~ /^USE\s.(\w+)./) {
            ## A USE statement; the db name is wrapped in backticks in the dump
            say "changing db: " . $1;
            $file_name = make_file_name("USE_$1", $find_count);
            write_USE($file_name, $line);
            $find_count++;
        } elsif ($line =~ /^-- Table structure for table .(.+)./) {
            ## If the current line is the beginning of a table definition
            ## and @query is non-empty, then @query must be full of the previous
            ## table, so we want to write it out now:
            if (@query) {
                $file_name = make_file_name($table, $find_count);
                open(my $output, '>', $file_name) or die "Can't write to $file_name: $!";
                foreach (@query) {
                    print $output $_;
                }
                close $output;
                undef @query;
            }
            $table = $1;
            $find_count++;
        }
        next unless $table;
        push @query, $line;
    }
    close $dump_in;

    ## The last table is never followed by another "Table structure" comment,
    ## so flush whatever is left in @query here:
    if (@query) {
        $file_name = make_file_name($table, $find_count);
        open(my $output, '>', $file_name) or die "Can't write to $file_name: $!";
        foreach (@query) {
            print $output $_;
        }
        close $output;
    }

    say "processed $line_number lines";

    ## Subroutines!
    sub write_USE {
        my ($filename, $line) = @_;
        open(my $output, '>', $filename) or die "Can't write to $filename: $!";
        print $output $line;
        close $output;
    }

    sub make_file_name {
        my ($type, $number) = @_;
        return sprintf("%05d", $number) . "_" . $type . ".sql";
    }

    sub usage {
        say "Error: missing arguments.";
        say "Usage:";
        say "$0 [MYSQL_DUMP]";
        exit 1;
    }

A small downside is that this replaces my 2.5GB file with about 1800 smaller ones. A scripted importer is to follow.
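In the meantime, here's a minimal sketch of what that importer might look like, assuming the split files all sit in one directory and the mysql command-line client is installed. The directory, database name and credentials below are placeholders, not anything from my setup:

    #! /usr/bin/perl
    ## Rough sketch of an importer for the split files. The directory,
    ## database name and credentials here are placeholders.
    use strict;
    use warnings;
    use 5.010;

    my $dir = $ARGV[0] || '.';
    my $db  = $ARGV[1] || 'some_database';

    ## The %05d prefix on each file name means a plain sort is also import order:
    foreach my $file (sort glob("$dir/*.sql")) {
        say "importing $file";
        system("mysql --user=someuser --password=somepassword $db < $file") == 0
            or die "mysql exited non-zero on $file; stopping";
    }

Dying on the first failure is deliberate: a broken file can be fixed or skipped by hand before carrying on, which is exactly the situation that made BigDump give up on my dump.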

Massive dumps with MySQL

hurr. *insert FLUSH TABLES joke here*

I have a 2.5GB SQL dump to import to my MySQL server. MySQL doesn't like me giving it work to do, and the box it's running on only has 3GB of memory. So, I stumbled across BigDump, which is brilliant. It's a PHP script that splits massive SQL dumps into smaller statements, and runs them one at a time against the server. Always the way: ten lines into duct-taping together something to do the job for you, you find that someone else has already done it rather elegantly.1

In short, we extract the archive to a publicly HTTP-accessible location, stick the SQL dump there and tell it to go.

In long, installation is approximately as follows:

    avi@jup-linux2:~$ cd www
    avi@jup-linux2:~/www$ mkdir bigdump
    avi@jup-linux2:~/www$ chmod 777 bigdump
    avi@jup-linux2:~/www$ cd bigdump/
    avi@jup-linux2:~/www/bigdump$ wget -q http://www.ozerov.de/bigdump.zip
    avi@jup-linux2:~/www/bigdump$ unzip bigdump.zip
    avi@jup-linux2:~/www/bigdump$ ls
    bigdump.php  bigdump.zip

Where ~/www is my Apache UserDir (i.e. when I visit http://localhost/~avi, I see the contents of ~/www). We need permissions to execute PHP scripts in this dir, too (which I have already). We also need to give everyone permissions to do everything - don't do this on the internet!2

Configuration involves editing bigdump.php with the hostname of our MySQL server, the name of the DB we want to manipulate and our credentials. The following is lines 40-45 of mine:

    // Database configuration

    $db_server = 'localhost';
    $db_name = 'KBDB';
    $db_username = 'kbox';
    $db_password = 'imnottellingyou';

Finally, we need to give it a dump to process. For dumps of less than 2MB3, we can upload through the web browser; otherwise we need to upload or link our SQL dump into the same directory as bigdump:

    avi@jup-linux2:~/www/bigdump$ ln -s /home/avi/kbox/kbox_dbdata ./dump.sql

Now we visit the PHP page in a web browser and get a pretty interface:

BigDump lists all the files in its working directory, and for any that are SQL dumps provides a 'Start Import' link. To import one of them, click the link and wait.

  1. Yes, you Perl people, it's in PHP. But it's not written by me. So on balance it turns out more elegant.
  2. Those permissions aside - anyone can execute whatever SQL they like with your credentials through this page. Seriously, not on the internet!
  3. Or whatever's smaller out of upload_max_filesize and post_max_size in your php.ini