Zend_DB: what's the right strategy to export large amounts of data to CSV? – chunks – fetchAll – fetchRow


Question


I have to export a huge amount of data, and I have to transform every record a little bit through PHP. What's the right strategy to export large amounts of data?

  • Do I split the Zend_Db request into multiple chunked queries with limit(1000, x)?
  • Do I use fetchAll or fetchRow?
  • Which of fetchRow or fetchAll performs better for large result sets?

I cannot use SQL OUTFILE, since I have to interpret the XML/HTML coming from one column, and as far as I know MySQL cannot do that. This means I can use either fetchRow or fetchAll, but I cannot do the processing on the MySQL server. Since I'm fetching a huge amount of data, fetchAll may lead to an out-of-memory error in PHP. I'm not sure whether I can avoid this by using fetchRow, or whether I have to use chunks anyway. Is fetchRow slower than fetchAll?
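One way to keep memory bounded without issuing chunked LIMIT queries is to stream rows one at a time from the statement object instead of calling fetchAll. A minimal sketch, assuming an existing Zend_Db adapter in $db; the table, column names, and the strip_tags transform are hypothetical placeholders for the real per-record processing:

```php
<?php
// Stream rows one at a time and write CSV with fputcsv().
// Assumes $db is a connected Zend_Db adapter (e.g. Pdo_Mysql);
// table and column names here are illustrative only.
$out = fopen('/tmp/export.csv', 'w');

// query() returns a Zend_Db_Statement; fetch() pulls one row per call,
// so PHP never has to hold the whole result set (unlike fetchAll()).
$stmt = $db->query('SELECT id, payload_xml, created_at FROM your_table');
$stmt->setFetchMode(Zend_Db::FETCH_ASSOC);

while ($row = $stmt->fetch()) {
    // Transform the record in PHP before writing, e.g. strip markup
    // from the XML/HTML column (strip_tags is just an example).
    $row['payload_xml'] = strip_tags($row['payload_xml']);
    fputcsv($out, $row);
}
fclose($out);
```

Note that with the PDO MySQL driver the result set may still be buffered on the client by default; disabling PDO::MYSQL_ATTR_USE_BUFFERED_QUERY on the connection makes this a true streaming read, at the cost of not being able to run other queries on the same connection until the statement is exhausted.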

Solution

SELECT field, field1, field2 
INTO OUTFILE '/home/user/out.csv' 
FIELDS TERMINATED BY ',' 
OPTIONALLY ENCLOSED BY '"'
ESCAPED BY '\\'
LINES TERMINATED BY '\n'
FROM your_table WHERE 1=1
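If the OUTFILE approach above is ruled out (as in the question, because each record needs PHP-side processing), the chunked strategy can be sketched as follows. This is an assumption-laden illustration, not the answerer's code: $db is an existing Zend_Db adapter, and the table, columns, and strip_tags transform are placeholders:

```php
<?php
// Hypothetical chunked export: repeated LIMIT/OFFSET queries of 1000 rows,
// so only one chunk is held in memory at a time.
$out = fopen('/tmp/export.csv', 'w');
$chunkSize = 1000;
$offset = 0;

do {
    $select = $db->select()
                 ->from('your_table', array('id', 'payload_xml', 'created_at'))
                 ->order('id')  // a stable ORDER BY is required for consistent paging
                 ->limit($chunkSize, $offset);
    $rows = $db->fetchAll($select);

    foreach ($rows as $row) {
        // Per-record PHP transform (illustrative).
        $row['payload_xml'] = strip_tags($row['payload_xml']);
        fputcsv($out, $row);
    }
    $offset += $chunkSize;
} while (count($rows) === $chunkSize);
fclose($out);
```

Be aware that large OFFSET values get progressively slower on MySQL, since the server still scans the skipped rows; keyset pagination (WHERE id > :lastId ORDER BY id LIMIT 1000) avoids that if the table has a sequential key.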

