Bulk copy a DataTable into MySQL (similar to System.Data.SqlClient.SqlBulkCopy)
Problem description
I am migrating my program from Microsoft SQL Server to MySQL. Everything works well except for one issue with bulk copy.

In the solution with MS SQL, the code looks like this:
connection.Open();
SqlBulkCopy bulkCopy = new SqlBulkCopy(connection);
bulkCopy.DestinationTableName = "testTable";
bulkCopy.WriteToServer(rawData);
Now I am trying to do something similar for MySQL. Because I think there would be bad performance, I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class.

Any help would be highly appreciated.
Solution

"Because I think there would be bad performance I don't want to write the DataTable to a CSV file and do the insert from there with the MySqlBulkLoader class."
Don't rule out a possible solution based on unfounded assumptions. I just tested the insertion of 100,000 rows from a System.Data.DataTable into a MySQL table using a standard MySqlDataAdapter#Update() inside a Transaction. It consistently took about 30 seconds to run:

using (MySqlTransaction tran = conn.BeginTransaction(System.Data.IsolationLevel.Serializable))
{
    using (MySqlCommand cmd = new MySqlCommand())
    {
        cmd.Connection = conn;
        cmd.Transaction = tran;
        // The SELECT supplies the table schema; MySqlCommandBuilder uses it
        // to generate the INSERT statements that da.Update() executes in batches.
        cmd.CommandText = "SELECT * FROM testtable";
        using (MySqlDataAdapter da = new MySqlDataAdapter(cmd))
        {
            da.UpdateBatchSize = 1000;
            using (MySqlCommandBuilder cb = new MySqlCommandBuilder(da))
            {
                da.Update(rawData);
                tran.Commit();
            }
        }
    }
}
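One thing to be aware of with this approach: DataAdapter.Update() only generates INSERTs for rows whose RowState is Added. If rawData was populated in a way that ended with AcceptChanges(), its rows will be Unchanged and Update() will skip them. A minimal sketch of re-marking such rows, assuming rawData is the table from the question:

// Assumption: rawData may contain rows left in the Unchanged state.
// Re-mark them as Added so that da.Update(rawData) inserts them.
foreach (DataRow row in rawData.Rows)
{
    if (row.RowState == DataRowState.Unchanged)
    {
        row.SetAdded();   // SetAdded() is only valid on Unchanged rows
    }
}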
(I tried a couple of different values for UpdateBatchSize, but they didn't seem to have a significant impact on the elapsed time.)

By contrast, the following code using MySqlBulkLoader took only 5 or 6 seconds to run ...

string tempCsvFileSpec = @"C:\Users\Gord\Desktop\dump.csv";
using (StreamWriter writer = new StreamWriter(tempCsvFileSpec))
{
    Rfc4180Writer.WriteDataTable(rawData, writer, false);
}
var msbl = new MySqlBulkLoader(conn);
msbl.TableName = "testtable";
msbl.FileName = tempCsvFileSpec;
msbl.FieldTerminator = ",";
msbl.FieldQuotationCharacter = '"';
msbl.Load();
System.IO.File.Delete(tempCsvFileSpec);
... including the time to dump the 100,000 rows from the DataTable to a temporary CSV file (using code similar to the sketch below; the original answer linked to an external example), bulk-loading from that file, and deleting the file afterwards.
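The Rfc4180Writer helper is not part of Connector/NET; only the class name and the WriteDataTable(DataTable, TextWriter, bool) signature can be inferred from the call above, so everything else in the following sketch is an assumption. Such a DataTable-to-CSV writer might look roughly like this:

using System;
using System.Data;
using System.IO;
using System.Linq;

// Illustrative sketch only, not the code the original answer linked to:
// writes a DataTable as RFC 4180-style CSV.
public static class Rfc4180Writer
{
    public static void WriteDataTable(DataTable table, TextWriter writer, bool writeHeaderRow)
    {
        if (writeHeaderRow)
        {
            writer.WriteLine(string.Join(",",
                table.Columns.Cast<DataColumn>().Select(c => Quote(c.ColumnName))));
        }
        foreach (DataRow row in table.Rows)
        {
            // DBNull becomes an empty field; everything else is converted to text.
            writer.WriteLine(string.Join(",",
                row.ItemArray.Select(v => Quote(v == DBNull.Value ? "" : Convert.ToString(v)))));
        }
    }

    private static string Quote(string value)
    {
        // RFC 4180: if a field contains a comma, quote, or line break,
        // wrap it in double quotes and double any embedded quotes.
        if (value.IndexOfAny(new[] { ',', '"', '\r', '\n' }) >= 0)
        {
            return "\"" + value.Replace("\"", "\"\"") + "\"";
        }
        return value;
    }
}

The boolean parameter controls whether a header row is written; the answer passes false, presumably so that the load does not treat a header line as data. Note also that MySqlBulkLoader issues a LOAD DATA INFILE statement under the hood, so depending on where the file lives relative to the server you may need to set the loader's Local property to true and enable local infile on the server.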