16

I know that this question has been asked (and answered) many times, but none of them appear to be the same problem that I am seeing...

The table that is giving me problems only has two columns: the first field is an integer, the second field is longtext. Here is a portion of a dump file from MySQL 5.5.30:

1 -- MySQL dump 10.13  Distrib 5.5.30, for Linux (x86_64)
2 --
3 -- Host: localhost    Database: mydatabasename
4 -- ------------------------------------------------------
5 -- Server version   5.5.30-log

32 DROP TABLE IF EXISTS `large_file`;
33 /*!40101 SET @saved_cs_client     = @@character_set_client */;
34 /*!40101 SET character_set_client = utf8 */;
35 CREATE TABLE `large_file` (
36   `id` int(11) NOT NULL AUTO_INCREMENT,
37   `data` longtext,
38   PRIMARY KEY (`id`)
39 ) ENGINE=InnoDB AUTO_INCREMENT=59 DEFAULT CHARSET=latin1;
40 /*!40101 SET character_set_client = @saved_cs_client */;

43 -- Dumping data for table `large_file`
44 --
45 
46 LOCK TABLES `large_file` WRITE;
47 /*!40000 ALTER TABLE `large_file` DISABLE KEYS */;
48 INSERT INTO `large_file` VALUES(38,'GyUtMTIzNDVYQ... <large data> ...);
49 /*!40000 ALTER TABLE `large_file` ENABLE KEYS */;
50 UNLOCK TABLES;

As you can see, this dump file came from MySQL 5.5.30, and I can import this data back into 5.5.30. But when I try to import it into 5.6.x, I get the ERROR 1118 (42000) "Row size too large" error.

The data going into the large_file table is relatively large: values range in size from 15 MB to about 25 MB. The data is all ASCII (base64 encoded).
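(As a quick sanity check of those sizes, a query along these lines can be run on the 5.5 server; it is just a sketch using the column names from the CREATE TABLE above:)

SELECT id, LENGTH(data) / 1024 / 1024 AS size_mb
FROM large_file
ORDER BY size_mb DESC;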

Other posters have had issues with a very large number of columns, but I only have two columns in this table.

The longtext type should be capable of storing approximately 4 GB, and that has been the case on 5.5.30, but I am finding the migration to 5.6.x difficult.

Can anyone offer insight into why this is happening? Or, how I can work around it?

Thanks in advance!

Dee
  • Try to see here: http://stackoverflow.com/questions/13283381/row-size-too- – Davide Rossi Aug 06 '14 at 15:07
  • Thanks for the pointer @PrincipeDade. I think that issue is more related to a large number of columns. I only have two columns, one of which is longtext. It is my understanding that only a small part of a longtext field is stored in the row and the remainder is stored separately [link](http://dev.mysql.com/doc/refman/5.6/en/column-count-limit.html). – Dee Aug 07 '14 at 15:38
  • Here is a test file that demonstrates the issue. It contains one database (name: testdatabase) and one table (name: testtable). There are 2 columns in the table, and there is one row in the table. [test file](http://tiny.cc/5lsfkx). – Dee Aug 11 '14 at 16:46
  • To test with the file linked in the last comment, do: `gunzip -c db.sql.gz | mysql -u root` – Dee Aug 11 '14 at 17:27
  • Related (as yet unanswered) issue: http://stackoverflow.com/questions/25299370/mysql-error-1118-row-size-too-large-when-restoring-django-mailer-database – Greg Humphreys Aug 14 '14 at 03:05

5 Answers

11

Check that the innodb_log_file_size setting is sufficiently large: at least 10 times the largest BLOB data size found in the rows of the table, plus the length of the other variable-length fields.

The following is from the MySQL 5.6 Release Notes:

InnoDB Notes

  • Important Change: Redo log writes for large, externally stored BLOB fields could overwrite the most recent checkpoint. The 5.6.20 patch limits the size of redo log BLOB writes to 10% of the redo log file size. The 5.7.5 patch addresses the bug without imposing a limitation. For MySQL 5.5, the bug remains a known limitation.

    As a result of the redo log BLOB write limit introduced for MySQL 5.6, innodb_log_file_size should be set to a value greater than 10 times the largest BLOB data size found in the rows of your tables plus the length of other variable length fields (VARCHAR, VARBINARY, and TEXT type fields). Failing to do so could result in “Row size too large” errors. No action is required if your innodb_log_file_size setting is already sufficiently large or your tables contain no BLOB data. (Bug #16963396, Bug #19030353, Bug #69477)
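As a rough sketch of applying that rule to the table in the question (the numbers are illustrative, based on the ~25 MB maximum value mentioned above):

-- current redo log file size; the 5.6 default is 48M, well under 10 x 25 MB
SHOW VARIABLES LIKE 'innodb_log_file_size';

-- largest stored value in the problem table, in bytes
SELECT MAX(LENGTH(data)) FROM large_file;

Then set innodb_log_file_size under [mysqld] in my.cnf to something comfortably larger than 10 times that figure (for example, 256M or 512M for the ~25 MB values here) and restart the server.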

Brian Ridgeway
  • Brian, thank you so much! That did, indeed, resolve the issue. – Dee Sep 06 '14 at 15:29
  • excellent +1 this has solved my issue when dumping from 5.1.60 and loading into 5.6.x – Daryl B Sep 17 '14 at 07:24
  • I increased it to 20GB; unfortunately it is not working for me. What is the maximum value we can give to innodb_log_file_size? – Ram Sep 21 '16 at 12:30
  • The combined size of your log files (innodb_log_file_size * innodb_log_files_in_group) cannot exceed a maximum of just under 512GB. The high end of the log file size is 1/Nth the size of the buffer pool, where N is the number of log files in the group. http://dev.mysql.com/doc/refman/5.6/en/innodb-parameters.html#sysvar_innodb_log_file_size – Brian Ridgeway Sep 22 '16 at 19:34
8

I had this issue with MySQL 5.7 (OS X 10.11).

The following worked, although it may not be ideal.

In my.cnf add:

innodb_strict_mode = 0    
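After restarting mysqld, a quick way to confirm the setting took effect is, for example:

SHOW VARIABLES LIKE 'innodb_strict_mode';

It should report OFF.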
Solomous
8

The full error in my case was:

ERROR 1118 (42000) at line 1852: Row size too large (> 8126). Changing some columns to TEXT or BLOB may help. In current row format, BLOB prefix of 0 bytes is stored inline.

The fix was to add the following to the MySQL configuration:
[mysqld]
innodb_log_file_size = 512M
innodb_strict_mode = 0

On Ubuntu 16.04, the file to edit is /etc/mysql/mysql.conf.d/mysqld.cnf (for example: nano /etc/mysql/mysql.conf.d/mysqld.cnf). After restarting MySQL, it worked.


Martin
1

I ran into an identical problem importing BLOB data from 5.5 into 5.6.22, where the blobs were approximately 70 MB at maximum. Increasing innodb_log_file_size did the trick, but in my case I had to increase it to 10 GB (I tried in increments of 1 GB, starting from 1 GB), which is far more than 10 times the maximum BLOB size.
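One thing worth checking in a case like this: the total redo log capacity is innodb_log_file_size multiplied by innodb_log_files_in_group (see Brian Ridgeway's comment above for the limits), so a quick look at both can help, for example:

SHOW VARIABLES LIKE 'innodb_log_file%';

which lists both values.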

1

If InnoDB strict mode is enabled, this error can appear. Disabling it can help:

SET GLOBAL innodb_strict_mode=OFF;
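Note that SET GLOBAL only affects the running server and does not survive a restart; to make the change permanent, put innodb_strict_mode = 0 under [mysqld] in the config file, as in the answers above. A quick check of the current value, for example:

SELECT @@GLOBAL.innodb_strict_mode;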

For more detail, read the MySQL documentation on innodb_strict_mode.

Nanhe Kumar