Read large CSV in Java
Question
I want to read a huge amount of data from a CSV containing around 500,000 rows. I am using the OpenCSV library for it. My code is like this:
CsvToBean<User> csvConvertor = new CsvToBean<User>();
List<User> list = null;
try {
    list = csvConvertor.parse(strategy, new BufferedReader(new FileReader(filepath)));
} catch (FileNotFoundException e) {
    e.printStackTrace();
}
Up to 200,000 records, the data is read into the list of User bean objects. But for more data than that I get
java.lang.OutOfMemoryError: Java heap space
I have these memory settings in the "eclipse.ini" file:
-Xms256m
-Xmx1024m
I am considering splitting the huge file into separate files and reading those files one by one, but that seems like a lengthy solution.
Is there any other way by which I can avoid the OutOfMemoryError exception?
Solution
Read the file line by line, something like this:
CSVReader reader = new CSVReader(new FileReader("yourfile.csv"));
String[] nextLine;
while ((nextLine = reader.readNext()) != null) {
    // nextLine[] is an array of values from the line
    System.out.println(nextLine[0] + nextLine[1] + "etc...");
}
reader.close();
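The key point is that only one record is held in memory at a time, so heap usage stays flat no matter how many rows the file has. As a minimal self-contained sketch of the same streaming idea using only the standard library (the file here is a tiny generated sample, and the naive comma split is a placeholder; a real parser such as opencsv's CSVReader is still needed for quoted fields containing commas):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Arrays;

public class StreamingCsvDemo {
    public static void main(String[] args) throws IOException {
        // Create a small sample file; in practice this would be the large CSV.
        Path csv = Files.createTempFile("users", ".csv");
        Files.write(csv, Arrays.asList("1,alice", "2,bob", "3,carol"));

        long rows = 0;
        try (BufferedReader reader = new BufferedReader(new FileReader(csv.toFile()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Naive split as a placeholder for a real CSV parser.
                String[] fields = line.split(",");
                // Process the record here, then let it become garbage-collectable
                // instead of accumulating it in a List.
                rows++;
            }
        }
        System.out.println("processed " + rows + " rows");
        Files.delete(csv);
    }
}
```

Because each line is discarded after processing, the memory footprint is bounded by one row rather than by the whole file.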