Bulk insert is not working properly in Azure SQL Server


Problem Description


I'm not able to insert a bulk amount of data into an Azure SQL Server database using a C# Web API.

Consider the following:

I want to insert more than 60K rows into SQL. On my local SQL Server there is no problem, but on Azure SQL the connection times out.

My approaches (all of them work on the local SQL Server, but not on Azure SQL Server):

1) Tried EF, inserting records one by one (roughly 10 minutes for 10,000 rows, usually ending in a timeout)

2) Tried a bulk insert extension along with EF

3) Tried SqlBulkCopy

4) Tried increasing the connection timeout in the connection string

5) Tried increasing the command timeout on the DbContext (see the sketch after this list for 4 and 5)
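
A minimal sketch of what attempts 4 and 5 look like in code; the timeout values are illustrative, and DBModel is the EF DbContext from the code further down:

    // 4) Longer connect timeout via the connection string (value in seconds):
    //    "Server=...;Database=...;Connect Timeout=120;..."

    // 5) Longer command timeout on the EF6 DbContext (value in seconds):
    var db = new DBModel();
    db.Database.CommandTimeout = 300;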

Exception StackTrace

Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
System.Data.SqlClient.SqlException (0x80131904): Execution Timeout Expired.  The timeout period elapsed prior to completion of the operation or the server is not responding. ---> System.ComponentModel.Win32Exception (0x80004005): The wait operation timed out
   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniError(TdsParserStateObject stateObj, UInt32 error)
   at System.Data.SqlClient.TdsParserStateObject.ReadSniSyncOverAsync()
   at System.Data.SqlClient.TdsParserStateObject.TryReadNetworkPacket()
   at System.Data.SqlClient.TdsParserStateObject.TryPrepareBuffer()
   at System.Data.SqlClient.TdsParserStateObject.TryReadByte(Byte& value)
   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)
   at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
   at System.Data.SqlClient.SqlBulkCopy.RunParser(BulkCopySimpleResultSet bulkCopyHandler)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinuedOnSuccess(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsyncContinued(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.CopyBatchesAsync(BulkCopySimpleResultSet internalResults, String updateBulkCommandText, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestAsync(CancellationToken cts, TaskCompletionSource`1 source)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalAsync(CancellationToken ctoken)
   at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServerAsync(Int32 columnCount, CancellationToken ctoken)
   at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)

Is there any solution for this, or any configuration that needs to be changed in Azure?

Update

Code used for bulk insert

    using (var dbConnection = new DBModel().Database.Connection as SqlConnection)
    {
        dbConnection?.Open();
        using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
        {
            try
            {
                // Column mappings: each DataTable column is mapped to the
                // corresponding database column (omitted here for brevity).
                sqlBulkCopy.EnableStreaming = true;
                sqlBulkCopy.BulkCopyTimeout = 500; // seconds
                sqlBulkCopy.DestinationTableName = "LogTable";
                // dt is the DataTable holding the rows to insert
                sqlBulkCopy.WriteToServer(dt);
            }
            catch (Exception ex)
            {
                // Log and rethrow instead of silently swallowing the error
                Console.WriteLine(ex);
                throw;
            }
        }
    }

Solution

I'd suggest you set sqlBulkCopy.BatchSize to a reasonable value instead of inserting everything in one batch. Depending on the data you're inserting, try starting with 10,000 and work your way up or down until you're satisfied with the performance.

Edit, for some extra clarification: when you consider your batch size, you need to take into account that SqlBulkCopy not only has to insert the data, but also read and send it. The last part is probably why it works on your local SQL Server but not on Azure. It also means that, with a large dataset, you will need either a lower batch size or a considerably higher BulkCopyTimeout setting, to give each batch a chance to finish before hitting the timeout limit.
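
As a rough illustration, here is the copy from the question with a batch size and a more generous timeout applied. This is a minimal sketch: the 10,000 batch size is just a starting point to tune, and DBModel, dt and LogTable are taken from the question's code:

    using (var dbConnection = new DBModel().Database.Connection as SqlConnection)
    {
        dbConnection?.Open();
        using (var sqlBulkCopy = new SqlBulkCopy(dbConnection))
        {
            sqlBulkCopy.EnableStreaming = true;
            // Send the rows in chunks so each round trip can finish
            // well inside the timeout window.
            sqlBulkCopy.BatchSize = 10000;
            // 0 means no timeout; alternatively use a generous value in seconds.
            sqlBulkCopy.BulkCopyTimeout = 0;
            sqlBulkCopy.DestinationTableName = "LogTable";
            sqlBulkCopy.WriteToServer(dt);
        }
    }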

You can read more about batch sizes in this post: What is the recommended batch size for SqlBulkCopy?

Another option:
I was reading up on this, and it could simply be that your insert reaches a critical DTU (Database Transaction Unit, basically a measure of the server's combined resources) usage point.

Performance levels are calibrated and governed to provide the needed resources to run your database workload up to the max limits allowed for your selected service tier/performance level. If your workload is hitting the limits in one of CPU/Data IO/Log IO limits, you will continue to receive the resources at the maximum allowed level, but you are likely to see increased latencies for your queries. These limits will not result in any errors, but just a slowdown in your workload, unless the slowdown becomes so severe that queries start timing out.

Taken from this link: https://azure.microsoft.com/da-dk/blog/azure-sql-database-introduces-new-near-real-time-performance-metrics/
Try starting the copy again while monitoring the DTU usage, and see whether it stays at 100% for longer periods. If it does, you might want to scale the database up to a higher pricing tier.
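
If you'd rather check DTU pressure from code than from the portal, a sketch along these lines should work. sys.dm_db_resource_stats is the Azure SQL Database DMV that records recent resource usage (roughly one hour of history at 15-second intervals); connectionString is a placeholder for your own connection string:

    // Requires: using System; using System.Data.SqlClient;
    const string query = @"
        SELECT TOP (20) end_time,
               avg_cpu_percent,
               avg_data_io_percent,
               avg_log_write_percent
        FROM sys.dm_db_resource_stats
        ORDER BY end_time DESC;";

    var connectionString = "<your Azure SQL connection string>";
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(query, connection))
    {
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                // Print the most recent resource usage samples
                Console.WriteLine($"{reader.GetDateTime(0)}: CPU {reader.GetDecimal(1)}%, " +
                                  $"Data IO {reader.GetDecimal(2)}%, Log IO {reader.GetDecimal(3)}%");
            }
        }
    }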

