How to export more than 1 million rows from SQL Server table to CSV in C# web app?


Problem description

I am trying to export a SQL Server table with 1 million rows and 45 columns to a .csv file for users to download through a web interface, but it takes so long that I eventually have to stop the process manually.

I use SqlDataReader and write to the file as the reader reads, to avoid memory problems. The code works for small tables (fewer than 3k rows), but for large tables it keeps running and the size of the destination file stays at 0 KB.

using (spContentConn)
{
    using (var sdr = sqlcmd.ExecuteReader())
    using (CsvfileWriter)
    {
        DataTable Tablecolumns = new DataTable();

        for (int i = 0; i < sdr.FieldCount; i++)
        {
            Tablecolumns.Columns.Add(sdr.GetName(i));
        }

        CsvfileWriter.WriteLine(string.Join("~", Tablecolumns.Columns.Cast<DataColumn>().Select(csvfile => csvfile.ColumnName)));

        while (sdr.Read())
            for (int j = Tablecolumns.Columns.Count; j > 0; j--)
            {
                if (j == 1)
                    CsvfileWriter.WriteLine("");
                else
                    CsvfileWriter.Write(sdr[Tablecolumns.Columns.Count - j].ToString() + "~");
            }
    }
}

I used the same answer recommended in this thread, but it still doesn't work. Please help: export large datatable data to .csv file in c# windows applications

Solution

It is not clear from the .NET documentation whether the file writer has efficient buffering, therefore I always use a BufferedStream instead when I need to read/write large volumes of data. With a stream, you would have to write byte data instead of strings, but that requires only a minor adaptation of your code.
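A minimal sketch of that idea follows. The file path and the 64 KB buffer size are illustrative assumptions, not values from the question; the point is only that strings must be encoded to bytes before writing through a BufferedStream:

```csharp
// Sketch: write CSV lines through a BufferedStream instead of a plain writer.
// "export.csv" and the 64 KB buffer size are assumptions for illustration.
using System;
using System.IO;
using System.Text;

class BufferedWriteSketch
{
    static void Main()
    {
        using (var fs = new FileStream("export.csv", FileMode.Create, FileAccess.Write))
        using (var bs = new BufferedStream(fs, 64 * 1024)) // 64 KB buffer
        {
            string line = "col1~col2~col3" + Environment.NewLine;
            byte[] bytes = Encoding.UTF8.GetBytes(line); // strings -> bytes for a stream
            bs.Write(bytes, 0, bytes.Length);
        } // disposing the BufferedStream flushes any buffered bytes to disk
    }
}
```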

It also looks like you are reading and writing the columns of a DataTable in a loop, which would affect performance. Since the number and order of the columns would not change during an export operation, consider using the positional index to access the column values instead. It would also be better to write one row at a time instead of one column at a time.
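As a sketch of the row-at-a-time, positional-index approach (assuming `sdr` is an open SqlDataReader and `writer` is a TextWriter over your output file; both names are placeholders):

```csharp
// Sketch: read each row's values by position in one call, build the line
// once, and write one line per row instead of one write per column.
int fieldCount = sdr.FieldCount;        // fixed for the duration of the export
var values = new object[fieldCount];
var sb = new StringBuilder();

while (sdr.Read())
{
    sdr.GetValues(values);              // copies all column values of the current row
    sb.Clear();
    for (int i = 0; i < fieldCount; i++)
    {
        if (i > 0) sb.Append('~');
        sb.Append(values[i]);
    }
    writer.WriteLine(sb.ToString());    // single write per row
}
```

Note that this also avoids the off-by-one in the original loop, which wrote a newline in place of the last column's value rather than after it.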

Finally, you are using a data-reader, so that should provide the best throughput of data from your SQL Server (limited by your server and bandwidth, obviously). This would also suggest that the performance bottleneck is in the way that your data is being written to file.

For comparison, I just wrote 1,000,000 rows of 45 columns to a text file in under 60 seconds. Granted that my code does not read from a database, but that should still provide a good enough baseline for you.
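A baseline of that kind can be reproduced with a self-contained sketch like the one below (dummy data, no database; file name and buffer size are assumptions). Timings will vary with hardware:

```csharp
// Sketch: time writing 1,000,000 rows x 45 columns of dummy data to disk
// to establish a pure file-write baseline, independent of SQL Server.
using System;
using System.Diagnostics;
using System.IO;
using System.Text;

class WriteBaseline
{
    static void Main()
    {
        const int rows = 1_000_000, cols = 45;
        var sw = Stopwatch.StartNew();

        using (var fs = new FileStream("baseline.csv", FileMode.Create, FileAccess.Write))
        using (var bs = new BufferedStream(fs, 64 * 1024))
        using (var writer = new StreamWriter(bs, Encoding.UTF8))
        {
            var sb = new StringBuilder();
            for (int r = 0; r < rows; r++)
            {
                sb.Clear();
                for (int c = 0; c < cols; c++)
                {
                    if (c > 0) sb.Append('~');
                    sb.Append("value").Append(c); // dummy cell content
                }
                writer.WriteLine(sb);
            }
        }

        sw.Stop();
        Console.WriteLine($"Wrote {rows:N0} rows in {sw.Elapsed.TotalSeconds:F1} s");
    }
}
```

If this baseline is fast on your machine but the real export is not, the bottleneck is most likely in how the rows are read and formatted rather than in the disk write itself.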
