python postgres: can I fetchall() 1 million rows?
Question
I am using the psycopg2 module in Python to read from a Postgres database, and I need to perform some operation on every row of a column in a table that has more than 1 million rows.
I would like to know: would cur.fetchall() fail or cause my server to go down? (My RAM might not be big enough to hold all that data.)
q = "SELECT names FROM myTable;"
cur.execute(q)
rows = cur.fetchall()
for row in rows:
    doSomething(row)
What would be the smarter way to do this?
Recommended answer
fetchall() returns all remaining rows of the result set at once (the arraysize attribute only affects fetchmany()), so on a million-row result it can take a massive memory hit. To avoid that, either fetch rows in manageable batches, or simply step through the cursor until it is exhausted:
row = cur.fetchone()
while row:
    # do something with row
    row = cur.fetchone()
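The batched approach can be sketched as a small helper built on the DB-API fetchmany() call. This is a minimal sketch: the helper name process_in_batches is my own, and for a runnable demo it uses Python's built-in sqlite3 module (which exposes the same DB-API cursor interface) rather than a live Postgres connection; the same loop works unchanged with a psycopg2 cursor.

```python
import sqlite3

def process_in_batches(cur, handle, size=2000):
    """Call handle(row) for every row, fetching `size` rows at a time
    so client memory stays bounded. Works with any DB-API cursor."""
    processed = 0
    while True:
        batch = cur.fetchmany(size)
        if not batch:          # empty list -> result set exhausted
            break
        for row in batch:
            handle(row)
            processed += 1
    return processed

# Runnable demo with an in-memory SQLite table (a stand-in for Postgres).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE myTable (names TEXT)")
cur.executemany("INSERT INTO myTable VALUES (?)",
                [(f"name{i}",) for i in range(10000)])
cur.execute("SELECT names FROM myTable")

seen = []
n = process_in_batches(cur, seen.append, size=2000)
print(n)  # 10000
conn.close()
```

One caveat worth knowing: with psycopg2's default client-side cursor, the whole result set is transferred to the client at execute() time anyway, so batched fetching bounds only your Python-level working set. To truly stream rows from the server, open a named (server-side) cursor, e.g. conn.cursor(name="names_cursor"), and optionally tune its itersize; the fetch loop above stays the same.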