SQL Server - Merging large tables without locking the data


Question

I have a very large set of data (~3 million records) which needs to be merged with updates and new records on a daily schedule. I have a stored procedure that actually breaks up the record set into 1,000-record chunks and uses the MERGE command with temp tables in an attempt to avoid locking the live table while the data is updating. The problem is that it doesn't exactly help. The table still "locks up" and our website that uses the data receives timeouts when attempting to access the data. I even tried splitting it up into 100-record chunks and even tried a WAITFOR DELAY '000:00:5' to see if it would help to pause between merging the chunks. It's still rather sluggish.
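For context, the chunked approach described above might look roughly like the sketch below. The object and column names (dbo.LiveTable, #Staging, #Chunk, Id, Value) are placeholders rather than the asker's actual schema, and the batch size and delay are only examples.

-- Minimal sketch of a chunked MERGE; all names are placeholders.
-- #Staging is assumed to already hold the day's incoming rows (Id, Value).
DECLARE @BatchSize int = 1000;

CREATE TABLE #Chunk (Id int PRIMARY KEY, Value nvarchar(100));

WHILE 1 = 1
BEGIN
    -- Take the next batch of pending changes from the staging table.
    INSERT INTO #Chunk (Id, Value)
    SELECT TOP (@BatchSize) Id, Value
    FROM #Staging
    ORDER BY Id;

    IF @@ROWCOUNT = 0 BREAK;   -- nothing left to merge

    MERGE dbo.LiveTable AS target
    USING #Chunk AS source
        ON target.Id = source.Id
    WHEN MATCHED THEN
        UPDATE SET target.Value = source.Value
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (Id, Value) VALUES (source.Id, source.Value);

    -- Remove the processed rows and reset the chunk before the next pass.
    DELETE s
    FROM #Staging AS s
    WHERE EXISTS (SELECT 1 FROM #Chunk AS c WHERE c.Id = s.Id);

    TRUNCATE TABLE #Chunk;

    -- Optional pause between chunks, as mentioned in the question.
    WAITFOR DELAY '00:00:05';
END;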

I'm looking for any suggestions, best practices, or examples on how to merge large sets of data without locking the tables.

Thanks.

Answer

Change your front end to use NOLOCK or READ UNCOMMITTED when doing the selects.

You can't NOLOCK a MERGE, INSERT, or UPDATE, as the records must be locked in order to perform the update. However, you can NOLOCK the SELECTs.
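To illustrate, the read side can either apply the hint per table or set the isolation level for the whole session; dbo.LiveTable and its columns are placeholder names here.

-- Per-query table hint (placeholder names):
DECLARE @Id int = 42;

SELECT Id, Value
FROM dbo.LiveTable WITH (NOLOCK)
WHERE Id = @Id;

-- Or set it once for the session/connection:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT Id, Value
FROM dbo.LiveTable
WHERE Id = @Id;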

Note that you should use this with caution. If dirty reads are okay, then go ahead. However, if the reads require the updated data then you need to go down a different path and figure out exactly why merging 3M records is causing an issue.

I'd be willing to bet that most of the time is spent reading data from the disk during the MERGE command and/or working around low-memory situations. You might be better off simply stuffing more RAM into your database server.

Ideally, you would have enough RAM to pull the whole database into memory as needed. For example, if you have a 4 GB database, then make sure you have 8 GB of RAM, in an x64 server of course.
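If it helps, the database size and the instance's memory configuration can be checked with standard commands such as these, to see how far off that rule of thumb you are:

-- Total size (data + log) of the current database.
EXEC sp_spaceused;

-- Physical memory visible to the instance (MB).
SELECT total_physical_memory_kb / 1024 AS total_physical_memory_mb
FROM sys.dm_os_sys_memory;

-- Current 'max server memory' cap (an advanced option, so expose it first).
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)';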
