What is an efficient way of inserting thousands of records into an SQLite table using Django?


Problem description

I have to insert 8000+ records into a SQLite database using Django's ORM. This operation needs to be run as a cronjob about once per minute.
At the moment I'm using a for loop to iterate through all the items and then insert them one by one.
Example:

for item in items:
    entry = Entry(a1=item.a1, a2=item.a2)
    entry.save()

What is an efficient way of doing this?


EDIT: A little comparison between the two insertion methods.


Without commit_manually decorator (11245 records):

[nox@noxdevel marinetraffic]$ time python manage.py insrec

real    1m50.288s
user    0m6.710s
sys     0m23.445s


Using commit_manually decorator (11245 records):

[nox@noxdevel marinetraffic]$ time python manage.py insrec                

real    0m18.464s
user    0m5.433s
sys     0m10.163s


Note: The test script also does some other operations besides inserting into the database (downloads a ZIP file, extracts an XML file from the ZIP archive, parses the XML file) so the time needed for execution does not necessarily represent the time needed to insert the records.

Recommended answer

You want to check out django.db.transaction.commit_manually:

http://docs.djangoproject.com/en/dev/topics/db/transactions/#django-db-transaction-commit-manually

So it would be something like:

from django.db import transaction

@transaction.commit_manually
def viewfunc(request):
    ...
    for item in items:
        entry = Entry(a1=item.a1, a2=item.a2)
        entry.save()
    transaction.commit()


Which will only commit once, instead of at each save().


In Django 1.3, context managers were introduced, so now you can use transaction.commit_on_success() in a similar way:

from django.db import transaction

def viewfunc(request):
    ...
    with transaction.commit_on_success():
        for item in items:
            entry = Entry(a1=item.a1, a2=item.a2)
            entry.save()


In Django 1.4, bulk_create was added, allowing you to create lists of your model objects and then commit them all at once.


NOTE: the save() method will not be called when using bulk_create.

>>> Entry.objects.bulk_create([
...     Entry(headline="Django 1.0 Released"),
...     Entry(headline="Django 1.1 Announced"),
...     Entry(headline="Breaking: Django is awesome")
... ])
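
Applied to the model from the question, the per-item loop collapses to a single bulk INSERT. This is only a sketch, assuming the same Entry model (fields a1, a2) and the items iterable from the question:

entries = [Entry(a1=item.a1, a2=item.a2) for item in items]
# One query instead of thousands of individual INSERTs; note that save()
# and the pre_save/post_save signals are skipped for these objects.
Entry.objects.bulk_create(entries)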


In Django 1.6, transaction.atomic was introduced, intended to replace the now-legacy functions commit_on_success and commit_manually.

From the Django documentation on atomic:

atomic is usable both as a decorator:

from django.db import transaction

@transaction.atomic
def viewfunc(request):
    # This code executes inside a transaction.
    do_stuff()

and as a context manager:

from django.db import transaction

def viewfunc(request):
    # This code executes in autocommit mode (Django's default).
    do_stuff()

    with transaction.atomic():
        # This code executes inside a transaction.
        do_more_stuff()
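
Applied to the insertion loop from the question, a minimal sketch (again assuming the same Entry model and items iterable) looks like this:

from django.db import transaction

# All inserts run inside one transaction, so SQLite commits once at the
# end instead of once per save().
with transaction.atomic():
    for item in items:
        Entry(a1=item.a1, a2=item.a2).save()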

