I've seen many threads about this error, but the solutions I've found don't seem to be applicable in my case.
I've received a rather large (~150 GB) dump file from an Oracle database. I converted it to a MySQL one using OraDump. However, when I try to import it into my MySQL server, I get the infamous error:
ERROR 1118 (42000) at line 162936: Row size too large. The maximum row size for the used table, not counting BLOBs, is 65535.
This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs.
I tried increasing the innodb_log_file_size parameter, disabling strict mode, and switching from InnoDB to MyISAM; nothing worked.
In my last attempt, I tried adding the -f parameter to the dump import, hoping to just squeeze past the error, but now it seems stuck.
I don't think I can change the table schemas, since they are created within the 150 GB dump file, and I don't even know which tables/columns are at fault.
Is there any way around this?
EDIT: I managed to find the table responsible for the error; it happens when the table is declared:
#
# Table structure for table 'F_TABLE_EXAMPLE'
#
DROP TABLE IF EXISTS `F_TABLE_EXAMPLE`;
CREATE TABLE `F_TABLE_EXAMPLE` (
`COL_1` BIGINT,
`COL_2` VARCHAR(10) CHARACTER SET utf8,
`COL_3` BIGINT,
`COL_4` BIGINT,
`COL_5` DECIMAL(16,2),
`COL_6` DECIMAL(16,2),
`COL_7` VARCHAR(5) CHARACTER SET utf8,
`COL_8` DATETIME,
`COL_9` VARCHAR(50) CHARACTER SET utf8,
`COL_10` VARCHAR(4000) CHARACTER SET utf8,
`COL_11` VARCHAR(4000) CHARACTER SET utf8,
`COL_12` VARCHAR(4000) CHARACTER SET utf8,
`COL_13` VARCHAR(4000) CHARACTER SET utf8,
`COL_14` VARCHAR(4000) CHARACTER SET utf8,
`COL_15` VARCHAR(4000) CHARACTER SET utf8
) ENGINE=InnoDB;
If I remove COL_15 there's no error, but with it included I get the usual error. (I only showed the declaration up to COL_15 because that's where the error starts; the real table has a bunch more columns.)
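Doing the arithmetic, I think I can see why this declaration overflows the limit. Assuming the dump's `utf8` is MySQL's 3-byte `utf8mb3`, each `VARCHAR(4000)` column counts up to 3 × 4000 = 12,000 data bytes (plus 2 length bytes) toward the 65,535-byte row-size limit, so the six `VARCHAR(4000)` columns alone come to about 72,012 bytes. A rough sanity check (the `DECIMAL` and `DATETIME` sizes are approximate, and this only covers the columns shown above):

```python
# Rough worst-case row-size accounting for the columns shown above.
# Assumption: "utf8" is MySQL's 3-byte utf8mb3, so VARCHAR(n) counts
# up to 3*n data bytes toward the 65,535-byte row-size limit, plus
# 1 or 2 length-prefix bytes.

UTF8_BYTES_PER_CHAR = 3
ROW_SIZE_LIMIT = 65535

def varchar_cost(n_chars: int) -> int:
    """Worst-case bytes a utf8 VARCHAR(n) contributes to the row size."""
    data = n_chars * UTF8_BYTES_PER_CHAR
    return data + (2 if data > 255 else 1)

columns = (
    [8] * 3                                    # COL_1, COL_3, COL_4: BIGINT, 8 bytes each
    + [8] * 2                                  # COL_5, COL_6: DECIMAL(16,2), ~8 bytes each
    + [5]                                      # COL_8: DATETIME, 5 bytes
    + [varchar_cost(n) for n in (10, 5, 50)]   # COL_2, COL_7, COL_9
    + [varchar_cost(4000) for _ in range(6)]   # COL_10 .. COL_15
)

total = sum(columns)
print(total, total > ROW_SIZE_LIMIT)  # already over the limit
```

If that arithmetic is right, the usual fix would be to declare the long columns as `TEXT` instead of `VARCHAR(4000)` (a `TEXT` column only counts a small pointer toward the row-size limit, since its data can be stored off-page), for example by rewriting the dump on the fly with something like `sed 's/VARCHAR(4000)/TEXT/g'` before piping it into `mysql`, rather than editing a 150 GB file by hand.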