
I need a database table with "topics", so I downloaded Wikipedia's SQL dump of categories (enwiki-latest-category.sql.gz from http://dumps.wikimedia.org/enwiki/latest/).

The file is 74MB (unzipped), but the MySQL import limit is 32MB. How can I import this file?

By the way: I tried bigdump (http://www.ozerov.de/bigdump/) but it also seems to have an import limit of 32MB.

Tomi Seus
  • I would open the file in a text editor and delete a bunch of it. Problem solved in about 15 seconds. – goat May 21 '12 at 19:39
  • Can't you change the limit? I'm sure there is an option for it somewhere in the MySQL configuration. – svick May 21 '12 at 19:49
  • How are you trying to import the file? Are you importing into a local MySQL? – Nesim Razon May 21 '12 at 19:59
  • I'm trying to import the SQL-file into my existing database by creating a new table. – Tomi Seus May 22 '12 at 10:10
  • chris, editing the file in a text editor is a nightmare, as it is just too big! – Tomi Seus May 22 '12 at 10:11
  • Does this answer your question? [How do I import an SQL file using the command line in MySQL?](https://stackoverflow.com/questions/17666249/how-do-i-import-an-sql-file-using-the-command-line-in-mysql) – Martin Urbanec Jul 04 '20 at 17:20

2 Answers


You could split it into pieces of under 32MB each and import them individually. It shouldn't be too time-consuming.
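A minimal sketch of the splitting approach (file names here are illustrative stand-ins, not the real dump). Since mysqldump output puts each statement on its own line, splitting with `--line-bytes` never cuts a statement in half, and the pieces concatenate back losslessly:

```shell
# Build a small stand-in dump for the demo (one statement per line),
# then split it on line boundaries; use e.g. --line-bytes=30M for the
# real 74MB file to stay under the 32MB limit.
seq 1 1000 | sed 's/^/INSERT INTO category VALUES (/; s/$/);/' > demo.sql
split --line-bytes=4k demo.sql part_
cat part_* > rejoined.sql    # suffixes aa, ab, ... sort back into order
```

Each `part_*` piece can then be imported separately, e.g. `mysql -u user -p mydb < part_aa` (credentials and database name are placeholders).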

Boelensman1

If you have mysql installed on your Windows desktop, I have something crazy you may want to try.

Please perform the following steps on your local mysql box

  • STEP 01) Unzip the file enwiki-latest-category.sql.gz to enwiki-latest-category.sql

  • STEP 02) CREATE DATABASE mycat;

  • STEP 03) mysql -u... -p... -Dmycat < enwiki-latest-category.sql

  • STEP 04) Export the CREATE TABLE for category

    mysqldump -u... -p... --no-data mycat category > category_00.sql

  • STEP 05) Dump the data in 5 sections. Since the AUTO_INCREMENT is 134526529, round it up to 135000000 (135 million) and dump 20% (27 million) at a time

Just run 5 mysqldumps using the --where option against cat_id

mysqldump -u... -p... --no-create-info mycat category --where="cat_id <=  27000000"                         | gzip > category_01.sql.gz
mysqldump -u... -p... --no-create-info mycat category --where="cat_id  >  27000000 AND cat_id <=  54000000" | gzip > category_02.sql.gz
mysqldump -u... -p... --no-create-info mycat category --where="cat_id  >  54000000 AND cat_id <=  81000000" | gzip > category_03.sql.gz
mysqldump -u... -p... --no-create-info mycat category --where="cat_id  >  81000000 AND cat_id <= 108000000" | gzip > category_04.sql.gz
mysqldump -u... -p... --no-create-info mycat category --where="cat_id  > 108000000"                         | gzip > category_05.sql.gz

Upload these 6 files (category_00.sql plus the five gzipped data files), unzip the gzipped ones, and load them in order.
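The load step can be sketched as a loop; the demo below substitutes tiny stand-in files for the real dumps, and the mysql credentials in the comment are placeholders. The shell glob sorts category_01 .. category_05 lexically, so a plain loop preserves the cat_id order:

```shell
# Create stand-in section files for the demo:
for i in 1 2 3 4 5; do
    printf 'INSERT %d;\n' "$i" | gzip > "category_0${i}.sql.gz"
done

# Load each section in order; in a real run you would pipe into mysql,
# e.g.: gunzip -c "$f" | mysql -u... -p... mycat
for f in category_0*.sql.gz; do
    gunzip -c "$f"
done > loaded.sql
```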

Give it a Try !!!

RolandoMySQLDBA