ERROR 1118 (42000) Row size too large

Problem description

I know that this question has been asked (and answered) many times, but none of them appear to be the same problem that I am seeing...

The table that is giving me problems only has two columns: the first field is an integer, the second field is longtext. Here is a portion of a dump file from MySQL 5.5.30:

-- MySQL dump 10.13  Distrib 5.5.30, for Linux (x86_64)
--
-- Host: localhost    Database: mydatabasename
-- ------------------------------------------------------
-- Server version   5.5.30-log

DROP TABLE IF EXISTS `large_file`;
/*!40101 SET @saved_cs_client     = @@character_set_client */;
/*!40101 SET character_set_client = utf8 */;
CREATE TABLE `large_file` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `data` longtext,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=59 DEFAULT CHARSET=latin1;
/*!40101 SET character_set_client = @saved_cs_client */;

-- Dumping data for table `large_file`
--

LOCK TABLES `large_file` WRITE;
/*!40000 ALTER TABLE `large_file` DISABLE KEYS */;
INSERT INTO `large_file` VALUES(38,'GyUtMTIzNDVYQ... <large data> ...);
/*!40000 ALTER TABLE `large_file` ENABLE KEYS */;
UNLOCK TABLES;

As you can see, this dump file came from MySQL 5.5.30, and I can import this data into 5.5.30. But when I try to import into 5.6.x, I get the ERROR 1118 (42000) Row size too large error.

The data going into the large_file table is (relatively) large: values range in size from 15 MB to about 25 MB. The data is all ASCII (base64 encoded).

Other posters have had issues with a very large number of columns, but I only have two columns in this table.

The longtext type should be capable of storing approx 4 GB, and this has been the case with 5.5.30, but I am finding migration to 5.6.x to be difficult.

Can anyone offer insight into why this is happening? Or, how I can work around it?

Thanks in advance!

Answer

Check that the innodb_log_file_size setting is sufficiently large -- at least 10 times the largest BLOB data size found in the rows of the table, plus the length of the other variable-length fields.
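That sizing rule can be sketched as a quick back-of-the-envelope calculation. This is just an illustration of the arithmetic (the `min_redo_log_size` helper is hypothetical, and the 25 MB figure is the largest value size mentioned in the question):

```python
# Sketch of the redo-log sizing rule described above:
# innodb_log_file_size should exceed 10x the largest BLOB in any row,
# plus the combined length of the other variable-length fields.
# (Whether the 10x factor also covers the other fields is not spelled
# out in the release note; this sketch applies it to the BLOB only.)

def min_redo_log_size(largest_blob_bytes, other_varlen_bytes=0):
    """Minimum innodb_log_file_size suggested by the 10x rule (bytes)."""
    return 10 * largest_blob_bytes + other_varlen_bytes

largest = 25 * 1024 * 1024          # ~25 MB, the biggest value in the question
needed = min_redo_log_size(largest)
print(needed)                        # 262144000 bytes
print(needed / (1024 * 1024))        # 250.0 MB
```

So for this table the redo log files would need to be larger than roughly 250 MB.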

The following is from the MySQL 5.6 release notes:

InnoDB Notes

  • Important Change: Redo log writes for large, externally stored BLOB fields could overwrite the most recent checkpoint. The 5.6.20 patch limits the size of redo log BLOB writes to 10% of the redo log file size. The 5.7.5 patch addresses the bug without imposing a limitation. For MySQL 5.5, the bug remains a known limitation.

As a result of the redo log BLOB write limit introduced for MySQL 5.6, innodb_log_file_size should be set to a value greater than 10 times the largest BLOB data size found in the rows of your tables plus the length of other variable length fields (VARCHAR, VARBINARY, and TEXT type fields). Failing to do so could result in "Row size too large" errors. No action is required if your innodb_log_file_size setting is already sufficiently large or your tables contain no BLOB data. (Bug #16963396, Bug #19030353, Bug #69477)
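One way to apply this is in the server's option file. This is a sketch, not a definitive setting: the 256M value is an assumption chosen to exceed 10 x ~25 MB = 250 MB for the data described in the question, and changing the redo log size requires a clean server shutdown first (MySQL 5.6 can recreate the log files at the new size on restart).

```ini
# my.cnf sketch (assumed value): raise the redo log size above
# 10x the largest BLOB stored in any row (~25 MB here).
[mysqld]
innodb_log_file_size = 256M
```

The current value can be checked with `SHOW VARIABLES LIKE 'innodb_log_file_size';`.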
