How to Import Large MySQL Data Files

One Step: Use BigDump

I've been so frustrated by the file-size limits when uploading large MySQL data files, and BigDump is the answer. Works like a charm.

More details below...

Basic Idea

  1. Enter your database and login info into the BigDump PHP script
  2. Upload the script and your MySQL data file to a writable, web-accessible directory on your website
  3. Run the script by navigating to its URL on your site

Result: The script loads the data by breaking the large file into smaller chunks and importing each chunk into the database separately. Very smooth.

NOTE: "BigDump will fail processing large tables containing extended inserts...Turn off extended inserts when exporting database from phpMyAdmin."



Detailed Steps

  1. Open bigdump.php in a text editor and adjust the database configuration
  2. Drop the old tables on the target database if your dump doesn't contain "DROP TABLE" (use phpMyAdmin)
  3. Create the working directory (e.g. dump) on your web-server
  4. If you want to upload the dump files directly from the web browser, give the script write permission on the working directory (e.g. chmod 777 on a Linux-based system). You can upload dump files from the browser up to the size limit set by the web server's current PHP configuration. Alternatively, you can upload any files via FTP.
  5. Upload bigdump.php and the dump files (*.sql or *.gz) via FTP to the working directory (take care to use TEXT mode for bigdump.php and dump.sql but BINARY mode for dump.gz if uploading from MS Windows).
  6. Run bigdump.php from your browser by navigating to its URL. You can then select the file to be imported from the listing of your working directory.
  7. BigDump will start each subsequent import session automatically if JavaScript is enabled in your browser.
  8. Relax and wait for the script to finish. Do not close the browser window!
  9. IMPORTANT: Remove bigdump.php and your dump files from your server when the import is finished
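As a sketch of step 1, the database settings near the top of bigdump.php look roughly like this. The variable names follow BigDump's configuration section, but the values here are placeholders you must replace with your own credentials:

```php
// Database configuration section of bigdump.php (placeholder values)
$db_server   = 'localhost';   // MySQL host; often 'localhost' on shared hosting
$db_name     = 'mydatabase';  // name of the target database
$db_username = 'myuser';      // MySQL account with CREATE/INSERT privileges
$db_password = 'mypassword';
```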

Note 1: BigDump will fail when processing large tables containing extended inserts. An extended insert contains all table entries within one SQL query, and BigDump isn't able to split such queries. In most cases BigDump will stop if a query spans too many lines, but if PHP complains that the allowed memory size is exhausted, or MySQL reports that the server has gone away, your dump probably also contains extended inserts. Please turn off extended inserts when exporting the database from phpMyAdmin.
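To illustrate the difference Note 1 describes (the table name and values below are made up):

```sql
-- Extended insert: many rows in a single statement; BigDump can't split this
INSERT INTO `users` VALUES (1,'alice'),(2,'bob'),(3,'carol');

-- One statement per row, which BigDump can process chunk by chunk
INSERT INTO `users` VALUES (1,'alice');
INSERT INTO `users` VALUES (2,'bob');
INSERT INTO `users` VALUES (3,'carol');
```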

Note 2: Some web servers disallow script execution in directories with write permission, for security reasons. If you changed the permissions on the working directory and you get a server error when running the script, restore the directory permissions to their normal state.

Note 3: If timeout errors still occur, you may need to adjust the $linespersession setting in bigdump.php.

Note 4: If the MySQL server gets overloaded, you can use the $delaypersession setting to make the script sleep for some milliseconds (or more) before starting the next session. This setting only works if JavaScript is enabled.
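A sketch of the two settings from Notes 3 and 4 as they appear in bigdump.php's configuration section; the values below are examples, not recommendations:

```php
// Lines of the dump to process per session; lower this if you still
// hit timeout errors (Note 3)
$linespersession = 3000;

// Milliseconds to sleep between sessions to ease the load on the MySQL
// server (Note 4); only takes effect when JavaScript auto-starts the
// next session
$delaypersession = 500;
```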

Note 5: BigDump is currently not able to restore a single dump file containing multiple databases (switched with the USE statement).

Note 6: If you experience problems with non-Latin characters while using BigDump, adjust the $db_connection_char_set configuration variable in bigdump.php to match the encoding of your dump file.
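In the same configuration section, the charset setting from Note 6 might look like this; the value must match your dump file's actual encoding, and 'utf8' below is only an example:

```php
// Connection character set; set this to the encoding of the dump file,
// e.g. 'utf8' or 'latin1' (Note 6)
$db_connection_char_set = 'utf8';
```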


  • Need help restoring or moving a large MySQL database?
  • Need help converting your dump file for use with BigDump?
  • Need help developing new software that uses BigDump?
  • Need a feature that BigDump doesn't currently provide?
  • Need help getting BigDump running?
  • Need a bug fixed right away?
  • Need someone to ask if you run into a problem?

Support services for BigDump can provide the solution you need. For more information please contact me.


What BigDump Is For

BigDump restores a very large backup of your MySQL database (or a part of it) into a new or the same MySQL database, for situations where you can't access the server shell and can't import the dump using phpMyAdmin or any other script because of the web server's hard memory or runtime limits.


What You Need

1. The bigdump.php script from the download above
2. Dump file(s) of your database created by phpMyAdmin; I'll call it dump.sql from now on. You can also use GZip-compressed dump files if you name them something like dump.gz. GZip support requires PHP 4.3.0 or later.

Note: A huge GZip-compressed dump file can cause the script to exceed the PHP memory or runtime limit, since the dump file has to be unpacked from the beginning every time a session starts. If this happens, use the uncompressed dump; it's your only option.

3. An access account for your MySQL database
4. An account on a web server with PHP 4.1.0 or later installed. This web server must be able to connect to the MySQL database (this is usually the case if your web server and the MySQL server are from the same ISP)

5. A text editor such as Notepad to edit the configuration file
6. An FTP client to upload the files to the web server
7. Common knowledge of files, PHP, MySQL databases, phpMyAdmin, FTP, and HTTP


FAQ

I get an error: "MySQL: Bad syntax near DEFAULT CHARACTER SET...". Why?

You are probably trying to restore a dump file created by a newer MySQL version into an older MySQL database. Sorry, this can't work. You have to recreate the dump file in compatibility mode or remove all incompatible statements by hand.

I get an error: "MySQL: Table 'some_tbl_name' already exists". Why?

Your dump file doesn't contain DROP queries. Before you start the import, use phpMyAdmin to drop all the tables on the target database that are to be restored.

I get an error: "Fatal error: Allowed memory size of xxx bytes exhausted" or "MySQL server has gone away". Why?

Your dump file probably contains extended inserts. An extended insert contains all table entries within one SQL query, and BigDump isn't able to split such queries. Please turn off extended inserts when exporting the database from phpMyAdmin.

Why does BigDump fail with strange SQL errors when I run it from localhost on my Windows PC?

This is IMHO a bug in PHP 4.3.x that crashes BigDump on Win32 when using a dump file with DOS-encoded line breaks (contact me for details). As a workaround, create your dump with UNIX line breaks only, or convert it to UNIX format with a text editor. Then you will also be able to run BigDump on your localhost.

Can I use dump files created by other software than phpMyAdmin?

You can use any text dump file at your own risk. However, it's essential that BigDump find a semicolon followed by a line break at the end of each query; otherwise it won't work, since it doesn't have time to parse the queries to find their proper ends. BigDump's behaviour can also be unpredictable if your dump file uses double quotes to surround strings. It likewise won't work if your dump file contains proprietary comment lines (as in some dumps created by other tools), although you can use the $comment setting in bigdump.php to drop non-standard comment lines by their first characters. Apart from that, BigDump doesn't filter the dump file in any way.
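For example, the $comment setting mentioned above could be used roughly like this; the exact format in your version of bigdump.php may differ, and the character sequences below are examples:

```php
// Lines beginning with any of these character sequences are treated as
// comments and dropped instead of being sent to MySQL
$comment[] = '#';     // shell-style comments
$comment[] = '-- ';   // standard SQL comments
```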

Why don't you provide any facility to input the database configuration from the browser form?

Entering the database configuration through a browser form would be very insecure, since the settings would have to be saved in cookies or returned to your browser every time a session ends.


Comments

— 24 July 2007, 11:30

This program saved my life, thanks a million

Brent — 24 July 2007, 14:48

Same for me...I was so frustrated with my host in that it would not accept large sql files. What a pain--until I found the BigDump program.

Glad the info helped out. Thanks for taking the time to leave the note.

Yiannis — 17 August 2007, 04:42

omg .. Thanks a million .. I couldnt find any other way to import a ~300 mbs sql file ..

Brent — 19 August 2007, 07:16

So glad this helped. The program saved my butt, that's for sure.

Rene — 29 August 2007, 08:58

Its a great utility however I am getting a problem restoring one large database where after about 30% I get an error stating that the "page is not available or the wrong url" then it stops. When I refresh it then gives me the error message that a table already exists despite the fact that the tables were dropped before the restore. Any ideas?

Brent — 01 September 2007, 06:51

Hmmm....sounds like the refresh is restarting the script, which sees the tables created from the first attempt, and balks.

Also sounds like something is timing out. If you are in an apache environment, you may be able to extend the timeout variable. Or you may have to break up the file up a bit (though I know that defeats the purpose of the script). Perhaps breaking the file into 3 would avoid the timeout. Make sure the table creation and data inserts for the tables in each file are complete. That is, don't break up the file where you create table X in one file and insert data to table X in the next file. The script won't like that.

Author — 20 October 2007, 10:22

I found another solution to this problem here:

Web Musings: How do I upload large SQL files to MySQL?

Brent — 20 October 2007, 21:17

Thanks for sharing that! Looks like a very comprehensive tool.

Jason — 27 October 2007, 07:43

Thanks a million!!!!

My 450MB dump took about an hour and everything went smooth.

My website has been restored.

Brent — 27 October 2007, 09:06

Delighted to hear that! Good luck with your web endeavors.

Adam — 13 November 2007, 08:47

Thanks for the Big Dump link. Worked like a charm and saved me a bunch of time.

Jitendra — 06 April 2008, 06:26

Great script.

Thanks a lot :)

Erik Hansen — 13 May 2008, 02:19

Thank you so much for linking to this resource! I spent 6 hours over the past few days trying to import a large database using tools like PHPMyAdmin, and others, but they kept timing out and having other errors. I'm going to donate $10 to the author of the script because it was so helpful.

Brent — 13 May 2008, 17:28

Yeah, I had similar trouble with a large db, and it took quite a while to find the right solution. Glad Big Dump worked for you. Cheers to making the donation! It's well deserved.