
I am working on a PHP application that uses MySQL as the database, hosted on my local server. The database is around 10 GB. As part of the deployment process I need to import the database into the live server.

While importing the SQL file, the import fails with a maximum-upload-size error (the limit is 2 MB). Hence I need to split the SQL file into partitions so that I can import the database's tables in chunks. How can I accomplish this?
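
For reference, the kind of per-table chunking I have in mind would look something like this on the local server (hypothetical credentials and database name):

    # Hypothetical names throughout. Dump each table into its own file
    # so the tables can be imported on the live server one chunk at a time.
    for t in $(mysql -u myuser -p'secret' -N -e 'SHOW TABLES' mydb); do
        mysqldump -u myuser -p'secret' mydb "$t" > "table_$t.sql"
    done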

Rubin Porwal
  • Does https://stackoverflow.com/questions/3958615/import-file-size-limit-in-phpmyadmin help? – Nigel Ren Jun 24 '17 at 11:20
  • @NigelRen Thanks for your answer. But I need a solution to partition the SQL file logically by grouping the queries of multiple tables – Rubin Porwal Jun 24 '17 at 12:05

1 Answer


This is a late answer, but I had a similar issue and found a resolution. Hope this helps someone.

You'll need Git for this. If you have Git Bash installed, you can go ahead. If not, you can download it here: https://git-scm.com/download

Use the split command in Git Bash to split the file:

  1. 100000 lines each: split sqlfile.sql -l 100000 --additional-suffix=.sql
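
One caveat: a fixed line count can cut a multi-line INSERT statement in half, leaving a chunk that fails to import on its own. If the dump was produced by mysqldump, each table's section begins with a "-- Table structure for table" comment, so csplit (also included with Git Bash) can split on those boundaries instead, which is closer to the per-table grouping asked about. A sketch under that assumption:

    # Assumes a mysqldump-style file where each per-table section starts
    # with a "-- Table structure for table" comment line.
    # -z drops empty pieces; '{*}' repeats the pattern to the end of file.
    csplit -z sqlfile.sql '/^-- Table structure for table/' '{*}'
    # Output files are named xx00, xx01, xx02, ... one per table section.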

This should do the trick. The other thing to take care of is encoding: you need to convert the files to UTF-8 for them to import correctly.
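
In Git Bash, iconv can do the conversion. The source encoding below is an assumption (UTF-16 is common when a Windows tool produced the dump) and should be adjusted to match your files:

    # Assumption: the chunks are UTF-16; adjust -f to the actual encoding.
    # The x*.sql pattern matches the default output names from split above.
    for f in x*.sql; do
        iconv -f UTF-16 -t UTF-8 "$f" > "utf8_$f"
    done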

P.S. cd to the correct folder before running split in Git Bash; otherwise your split files will be created in the Program Files\Git folder.
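
Once the chunks are on the live server, they can be imported in order with the mysql command-line client, which sidesteps the 2 MB upload limit entirely (hypothetical credentials and database name):

    # Hypothetical credentials and database name; the glob sorts the
    # chunks lexicographically, so they are imported in split order.
    for f in x*.sql; do
        mysql -u myuser -p'secret' mydb < "$f"
    done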

Sherwin