Deadlock issues while converting bulk of records from DBF to CSV


Problem Description



Hi, I am trying to convert a Visual FoxPro DBF file into a CSV file using a Visual Studio 2008 C# Windows application. I succeeded in opening the Visual FoxPro connection and reading the file. I first used an OleDbDataAdapter to fill a DataSet, but that failed with a deadlock issue because I am working with around 800000+ records. So I switched to an OleDbDataReader, reading each row and writing it to the CSV file. It works fine at first, but after writing 400000+ records I face the deadlock issue again. I need to write the whole 800000+ records into a single CSV file. I have attached my code here. Please help me write 800000+ records into one CSV.

OleDbCommand oleDbCommand = new OleDbCommand(query, oleDbConnection);
oleDbCommand.CommandTimeout = 60000;
using (OleDbDataReader oleDbDataReader = oleDbCommand.ExecuteReader())
{
    using (streamWriter1)
    {
        int fieldCount = oleDbDataReader.FieldCount;
        string columns = string.Empty;
        StringBuilder fullFile = new StringBuilder();
        for (int i = 0; i < fieldCount; i++)
        {
            columns += oleDbDataReader.GetName(i);
            if (i != fieldCount - 1)
            {
                columns += ",";
            }
        }
        fullFile.Append(columns);
        fullFile.Append("\r\n");
        streamWriter1.Write(fullFile.ToString());
        int j = 0;
        while (oleDbDataReader.Read())
        {
            j++;
            string rows = string.Empty;
            for (int i = 0; i < fieldCount; i++)
            {
                rows += oleDbDataReader.GetValue(i).ToString().Trim();
                rows += i != fieldCount - 1 ? "," : "";
            }
            fullFile = new StringBuilder();
            fullFile.Append(rows);
            fullFile.Append("\r\n");
            streamWriter1.Write(fullFile.ToString());
        }
    }
    oleDbDataReader.Close();
    oleDbCommand.Dispose();
    oleDbConnection.Close();  
}



What I have tried:

I have tried writing every 200000 records to a different file, but this logic also failed with the deadlock issue after 400000 records were written.

Solution

I don't know if this is the reason, but you should not create a new StringBuilder object inside the while loop under the name of an already existing object. It does not leak memory (the garbage collector reclaims the abandoned instances), but it allocates one short-lived buffer per row, which for 800000+ rows puts heavy pressure on the garbage collector. Just reuse the existing one:

while (oleDbDataReader.Read())
{
    // ...
    // Avoid this:
    //fullFile = new StringBuilder();
    // Use this instead (StringBuilder.Clear() requires .NET 4.0;
    // on .NET 3.5 / VS2008 reset the Length property instead):
    fullFile.Length = 0;
    fullFile.Append(rows);
    fullFile.Append("\r\n");
    streamWriter1.Write(fullFile.ToString());
}


If this is not the source of the error, it should at least improve performance.
For the same reason you should also change how the rows string is generated: the repeated rows += ... concatenation allocates a new string for every field of every record.
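For instance, the loop body could append each field straight to one reused StringBuilder and flush it to the writer in batches. This is only a sketch: the batch size of 5000 is my own choice, and it assumes the oleDbDataReader, fieldCount and streamWriter1 variables from the question.

```csharp
// Sketch: build each row directly in a single reused StringBuilder and
// flush it to the StreamWriter every few thousand rows, so neither the
// buffer nor the per-row allocations grow with the record count.
StringBuilder buffer = new StringBuilder();
int rowsBuffered = 0;
while (oleDbDataReader.Read())
{
    for (int i = 0; i < fieldCount; i++)
    {
        buffer.Append(oleDbDataReader.GetValue(i).ToString().Trim());
        if (i != fieldCount - 1)
        {
            buffer.Append(',');
        }
    }
    buffer.Append("\r\n");

    // Write and reset periodically so the buffer stays small.
    if (++rowsBuffered == 5000)
    {
        streamWriter1.Write(buffer.ToString());
        buffer.Length = 0;   // .NET 3.5-compatible way to clear
        rowsBuffered = 0;
    }
}
streamWriter1.Write(buffer.ToString());   // flush the final partial batch
```

Note that this, like the original code, does not quote field values; if a field can contain a comma or a newline, the CSV output will need proper quoting as well.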

[EDIT]
From the comments it seems that executing this task in a worker thread may solve the problem. When such a long-running task is performed on the GUI thread, the thread cannot respond to messages while the task executes.

Some links about worker threads:
BackgroundWorker Class (System.ComponentModel)
BackgroundWorker Class Sample for Beginners
[/EDIT]
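The minimal shape of that approach might look like this. It is only a sketch: ExportToCsv is a hypothetical method name standing in for the OleDbDataReader loop from the question.

```csharp
using System.ComponentModel;
using System.Windows.Forms;

// Sketch: run the DBF-to-CSV export on a worker thread so the GUI
// stays responsive while 800000+ records are written.
BackgroundWorker worker = new BackgroundWorker();

worker.DoWork += delegate(object sender, DoWorkEventArgs e)
{
    ExportToCsv();   // runs on the worker thread, not the GUI thread
};

worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
{
    // Raised back on the GUI thread: safe to touch controls here.
    MessageBox.Show(e.Error == null ? "Export finished." : e.Error.Message);
};

worker.RunWorkerAsync();
```

Any progress reporting would go through worker.ReportProgress and the ProgressChanged event rather than touching controls directly from DoWork.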

