sqlite3 bulk insert from C?


Question

I came across the .import command to do this (bulk insert), but is there a query version of this that I can execute using sqlite3_exec()?

I would just like to copy the contents of a small text file into a table.

That is, a query version of the command below:

".import demotab.txt mytable"

Solution

SQLite's performance doesn't benefit from bulk insert. Simply performing the inserts separately (but within a single transaction!) provides very good performance.

You might benefit from increasing SQLite's page cache size; that depends on the number of indexes and/or the order in which the data is inserted. If you don't have any indexes, then for a pure insert the cache size is likely not to matter much.
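
If you do want to experiment with that, the cache can be enlarged per connection with a PRAGMA; a minimal sketch, assuming an already-opened sqlite3 *db handle and using an arbitrary illustrative value rather than a tuned one:

/* Sketch only: enlarge this connection's page cache before the bulk load.
 * The value is a page count and an arbitrary example, not a recommendation. */
sqlite3_exec(db, "PRAGMA cache_size = 10000", NULL, NULL, NULL);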

Be sure to use a prepared query, as opposed to regenerating the query plan in the innermost loop. It's extremely important to wrap the statements in a transaction, since this avoids the need for the filesystem to sync the database to disk after every insert - after all, a partially written transaction is atomically aborted anyhow, meaning that all fsync()'s are delayed until the transaction completes.
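
To make that concrete, here is a minimal C sketch that answers the original question: it loads demotab.txt line by line using one prepared INSERT inside a single transaction. The two TEXT columns (col1, col2) and the tab-separated file format are assumptions, since the post doesn't give the schema, and error handling is abbreviated.

#include <stdio.h>
#include <string.h>
#include <sqlite3.h>

int main(void)
{
    sqlite3 *db;
    sqlite3_stmt *stmt;
    char line[1024];

    if (sqlite3_open("demo.db", &db) != SQLITE_OK)
        return 1;

    /* Prepare the INSERT once; the query plan is reused for every row. */
    if (sqlite3_prepare_v2(db,
            "INSERT INTO mytable (col1, col2) VALUES (?1, ?2)",
            -1, &stmt, NULL) != SQLITE_OK) {
        fprintf(stderr, "prepare failed: %s\n", sqlite3_errmsg(db));
        sqlite3_close(db);
        return 1;
    }

    FILE *fp = fopen("demotab.txt", "r");
    if (!fp) {
        sqlite3_finalize(stmt);
        sqlite3_close(db);
        return 1;
    }

    /* One transaction around all the inserts: syncing to disk is deferred to COMMIT. */
    sqlite3_exec(db, "BEGIN TRANSACTION", NULL, NULL, NULL);

    while (fgets(line, sizeof line, fp)) {
        line[strcspn(line, "\n")] = '\0';   /* strip the trailing newline */
        char *tab = strchr(line, '\t');
        if (!tab)
            continue;                       /* skip malformed lines */
        *tab = '\0';

        sqlite3_bind_text(stmt, 1, line,    -1, SQLITE_TRANSIENT);
        sqlite3_bind_text(stmt, 2, tab + 1, -1, SQLITE_TRANSIENT);

        if (sqlite3_step(stmt) != SQLITE_DONE)
            fprintf(stderr, "insert failed: %s\n", sqlite3_errmsg(db));

        sqlite3_reset(stmt);                /* reuse the prepared statement */
        sqlite3_clear_bindings(stmt);
    }

    sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);

    fclose(fp);
    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}

Preparing the statement once and calling sqlite3_reset() per row is what avoids re-planning the query, and the single BEGIN/COMMIT pair is what defers all the fsync()s to one commit, as described above.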

Finally, indexes will limit your insert performance, since their creation is somewhat expensive. If you're really dealing with a lot of data and start off with an empty table, it may be beneficial to add the indexes after the data has been loaded - though this isn't a huge factor.
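
If you take that route, the index is just one more statement issued after the bulk load commits; a sketch, with a hypothetical index and column name:

/* Hypothetical index, created only after the bulk load has finished. */
sqlite3_exec(db, "CREATE INDEX idx_mytable_col1 ON mytable (col1)",
             NULL, NULL, NULL);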

Oh, and you might want to get one of those Intel X25-E SSDs and ensure you have an AHCI controller ;-).

I'm maintaining an app with SQLite DBs containing about 500,000,000 rows (spread over several tables) - much of which was bulk inserted using plain old begin-insert-commit: it works fine.

