C# SQL export (OutOfMemory issue)


Problem description



I am trying to bulk export out of a SQL table, and the code works fine; nothing is wrong with it at all. The problem resides in exporting 50 columns at 2.7 million rows. Can anybody help me with the OutOfMemory exception? I know why it's happening, because it tries to read it all into memory and then write, but I'm not sure how to break it up to read row by row and then write out without taking forever.

string contents = File.ReadAllText(textBox3.Text);

string tableid = Regex.Match(contents, @"CoreDBCaseID=(?<id>\d+)").Groups["id"].Value;
string server = Regex.Match(contents, @"Server=(?<Server>[^;]+)").Groups["Server"].Value;
string security = Regex.Match(contents, "Security=(?<Security>[^;]+)").Groups["Security"].Value;
string database = Regex.Match(contents, "Database=(?<Database>[^\r]+)").Groups["Database"].Value;

string[] data = new string[] {
    string.Format("Table={0}", tableid),
    string.Format("Server={0}", server),
    string.Format("Security={0}", security),
    string.Format("Database={0}", database),
};
string sqltable = ("TDS_C" + tableid + "_table");
String cols = String.Join(",", listBox1.Items.Cast<String>().ToArray());
string sql = "select " + cols + " from " + sqltable + " where *";
SqlConnection con = new SqlConnection("Data Source=" + server + ";Initial Catalog=" + database + ";Integrated Security=" + security);
con.Open();
SqlDataAdapter tabadapter = new SqlDataAdapter(sql, con);
DataSet dataset = new DataSet();
tabadapter.FillSchema(dataset, SchemaType.Source, sqltable);
tabadapter.Fill(dataset, sqltable);
DataTable tbltarget = dataset.Tables[0];
string output_text =
    tbltarget.Columns.Cast<DataColumn>().ToList()
    .Select(col => col.ColumnName)
    .Aggregate((current, next) => current + "|" + next) + "\r\n"
+
    tbltarget.Rows.Cast<DataRow>().ToList()
    .Select(row => row.ItemArray.Aggregate((current, next) => current.ToString() + "|" + next.ToString().Replace("\n", "")))
    .Cast<string>().Aggregate((current, next) => current + "\r\n" + next);

File.WriteAllText(@"C:\Export.txt", output_text);

con.Close();

Solution

Don't fill a dataset. Instead, use a data reader, then you can stream that data out as you read it in, meaning a much smaller memory footprint.

You could look into doing async reads from the db and have each read trigger an async write to the file. This would stop each read/write from being a fully blocking operation (a sketch of this variant follows the example below).


Simple example:

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (FileStream strm = new FileStream(filePath, FileMode.Create))
    using (TextWriter wrt = new StreamWriter(strm))
    using (SqlCommand cmd = new SqlCommand(sql, connection))
    using (IDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
        {
            wrt.WriteLine(rdr[0].ToString() + "|" + rdr[1].ToString()); // change for your manipulation of the columns
        }
    }
}
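
To export all of the question's columns rather than just the first two, the same pattern can carry the header row and every field without changing its memory behaviour. The following is only a sketch under the same assumptions (connectionString, sql and filePath stand in for your own values); it builds the header from GetName, reads each row with GetValue, and keeps the pipe-separated layout and the Replace("\n", "") from the original code:

using (SqlConnection connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlCommand cmd = new SqlCommand(sql, connection))
    using (SqlDataReader rdr = cmd.ExecuteReader())
    using (StreamWriter wrt = new StreamWriter(filePath))
    {
        // Header row: pipe-separated column names, matching the original output layout.
        string[] names = new string[rdr.FieldCount];
        for (int i = 0; i < rdr.FieldCount; i++)
            names[i] = rdr.GetName(i);
        wrt.WriteLine(string.Join("|", names));

        // Data rows: only one row is held in memory at a time instead of the whole table.
        string[] fields = new string[rdr.FieldCount];
        while (rdr.Read())
        {
            for (int i = 0; i < rdr.FieldCount; i++)
                fields[i] = rdr.GetValue(i).ToString().Replace("\n", "");
            wrt.WriteLine(string.Join("|", fields));
        }
    }
}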
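
And a rough sketch of the asynchronous variant suggested above, so the thread is not blocked while waiting on the database or the file. This assumes .NET 4.5 or later; ExportAsync is a hypothetical method name, and connectionString, sql and filePath are again placeholders rather than anything from the original answer:

static async Task ExportAsync(string connectionString, string sql, string filePath)
{
    using (SqlConnection connection = new SqlConnection(connectionString))
    {
        await connection.OpenAsync();
        using (SqlCommand cmd = new SqlCommand(sql, connection))
        using (SqlDataReader rdr = await cmd.ExecuteReaderAsync())
        using (StreamWriter wrt = new StreamWriter(filePath))
        {
            while (await rdr.ReadAsync())
            {
                // Each row is written as soon as it is read, so only one row is held in memory.
                await wrt.WriteLineAsync(rdr[0].ToString() + "|" + rdr[1].ToString());
            }
        }
    }
}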
