What is an efficient way of inserting thousands of records into an SQLite table using Django?

Question

I have to insert 8000+ records into a SQLite database using Django's ORM. This operation needs to be run as a cronjob about once per minute.
At the moment I'm using a for loop to iterate through all the items and then insert them one by one.
Example:

for item in items:
    entry = Entry(a1=item.a1, a2=item.a2)
    entry.save()

What is an efficient way of doing this?

A little comparison between the two insertion methods.

Without commit_manually decorator (11245 records):

[nox@noxdevel marinetraffic]$ time python manage.py insrec

real    1m50.288s
user    0m6.710s
sys     0m23.445s

Using commit_manually decorator (11245 records):

[nox@noxdevel marinetraffic]$ time python manage.py insrec                

real    0m18.464s
user    0m5.433s
sys     0m10.163s

Note: The test script also does some other operations besides inserting into the database (it downloads a ZIP file, extracts an XML file from the ZIP archive, and parses the XML file), so the time needed for execution does not necessarily represent the time needed to insert the records.

Answer

You want to check out django.db.transaction.commit_manually.

http://docs.djangoproject.com/en/dev/topics/db/transactions/#django-db-transaction-commit-manually

So it would be something like this:

from django.db import transaction

@transaction.commit_manually
def viewfunc(request):
    ...
    for item in items:
        entry = Entry(a1=item.a1, a2=item.a2)
        entry.save()
    transaction.commit()

This will commit only once, instead of at each save().
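
Note that with commit_manually every code path has to end in an explicit commit() or rollback(); if it does not, Django raises a TransactionManagementError when the function exits. A minimal sketch of that pattern, assuming the pre-1.6 transaction API shown above (insert_entries is just a placeholder name; items and the Entry model come from the question):

from django.db import transaction

@transaction.commit_manually
def insert_entries(items):
    try:
        for item in items:
            Entry(a1=item.a1, a2=item.a2).save()
    except:
        # Something went wrong: roll back so the transaction is not left open.
        transaction.rollback()
        raise
    else:
        # All rows saved: commit them in a single transaction.
        transaction.commit()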

In Django 1.3, context managers were introduced, so now you can use transaction.commit_on_success() in a similar way:

from django.db import transaction

def viewfunc(request):
    ...
    with transaction.commit_on_success():
        for item in items:
            entry = Entry(a1=item.a1, a2=item.a2)
            entry.save()

In Django 1.4, bulk_create was added, allowing you to create lists of your model objects and then commit them all at once.

NOTE: the model's save() method will not be called when using bulk_create().

>>> Entry.objects.bulk_create([
...     Entry(headline="Django 1.0 Released"),
...     Entry(headline="Django 1.1 Announced"),
...     Entry(headline="Breaking: Django is awesome")
... ])
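
One SQLite-specific caveat: a single INSERT statement can only carry a limited number of bound parameters (999 in older SQLite builds), so a very large list may need to be chunked. A minimal sketch, assuming Django 1.5+ where bulk_create accepts a batch_size argument (Entry, a1/a2 and items come from the question; 500 is an arbitrary chunk size):

# Chunk the insert so each INSERT stays under SQLite's bound-parameter limit.
entries = [Entry(a1=item.a1, a2=item.a2) for item in items]
Entry.objects.bulk_create(entries, batch_size=500)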

In Django 1.6, transaction.atomic was introduced, intended to replace the now-legacy functions commit_on_success and commit_manually.

From the Django documentation on atomic:

atomic is usable both as a decorator:

from django.db import transaction

@transaction.atomic
def viewfunc(request):
    # This code executes inside a transaction.
    do_stuff()

and as a context manager:

from django.db import transaction

def viewfunc(request):
    # This code executes in autocommit mode (Django's default).
    do_stuff()

    with transaction.atomic():
        # This code executes inside a transaction.
        do_more_stuff()
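
Applied to the original problem, a minimal sketch of the cronjob body on Django 1.6+ could look like this (insert_entries is a placeholder name; items and the Entry model come from the question):

from django.db import transaction

def insert_entries(items):
    # One transaction for the whole batch, so SQLite commits once
    # instead of once per row.
    with transaction.atomic():
        for item in items:
            Entry(a1=item.a1, a2=item.a2).save()

You could also combine this with bulk_create from above so that the batch is written with a handful of INSERT statements instead of one per row.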
