Which one can be good in terms of performance...


Problem description


Hello everyone,

I have to process a file containing more than 11 lakh (1.1 million) rows of data.

The problem: each row holds data that I need to split, and each row will produce 5 rows to insert or update. From a performance standpoint, which approach is advisable:

1. Write a CTE that splits each row into 5 rows, then use the CTE result in a MERGE statement (a minimal sketch follows this list).
2. Create a table, dump the entire data into it, run logic/a stored procedure that generates 5 rows from each row into another table, and then use that table with MERGE.
3. Create a temp table and split the data into it, then use that temp table in the MERGE statement.
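
For reference, a minimal sketch of option 1, assuming hypothetical names: a staged table dbo.RawRows(Id, V1..V5) in which SSIS has already landed the five values as columns, and a target dbo.Target(SourceId, Position, Value). The CTE unpivots each staged row into five rows and feeds them straight into MERGE:

-- Unpivot each staged row into five (Position, Value) rows, then merge them
-- into the target in a single statement.
;WITH SplitRows AS
(
    SELECT r.Id AS SourceId, x.Position, x.Value
    FROM dbo.RawRows AS r
    CROSS APPLY (VALUES (1, r.V1), (2, r.V2), (3, r.V3),
                        (4, r.V4), (5, r.V5)) AS x (Position, Value)
)
MERGE dbo.Target AS t
USING SplitRows AS s
      ON t.SourceId = s.SourceId
     AND t.Position = s.Position
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value
WHEN NOT MATCHED THEN
    INSERT (SourceId, Position, Value)
    VALUES (s.SourceId, s.Position, s.Value);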

Is there any better idea than these?

Please note that I am using SSIS to read the files, as we have multiple files.

Thanks,

Recommended answer

Each one has its advantages and disadvantages. It depends entirely on the situation in which you are using it. Refer to the following links to clear up your confusion:

http://www.dotnet-tricks.com/Tutorial/sqlserver/X517150913-Difference-between-CTE-and-Temp-Table-and-Table-Variable.html[^]

Table Variable V/S Temporary Table[^]
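
To make the comparison those links discuss concrete, here is a rough sketch, reusing the hypothetical column names from the question's sketch:

-- A CTE (WITH ...) is not materialised at all; it is just an inline
-- definition expanded into the single statement that follows it.

-- Temporary table: lives in tempdb, can be indexed, and keeps column
-- statistics, so the optimizer gets reasonable row estimates for large sets.
CREATE TABLE #SplitRows
(
    SourceId INT          NOT NULL,
    Position TINYINT      NOT NULL,
    Value    VARCHAR(100) NULL,
    PRIMARY KEY (SourceId, Position)
);

-- Table variable: also stored in tempdb, but it has no column statistics,
-- so the optimizer tends to assume very few rows; usually a poor fit for
-- millions of rows like this workload.
DECLARE @SplitRows TABLE
(
    SourceId INT          NOT NULL,
    Position TINYINT      NOT NULL,
    Value    VARCHAR(100) NULL,
    PRIMARY KEY (SourceId, Position)
);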


While the first approach seems fine, the second and third do not seem logical. Why? Imagine you have already got the data into a table and you then want to "copy" it into a temporary table before performing the split/merge operation. Why? Note that every operation on a huge portion of data takes some time, so the more operations you queue up, the more time is needed. Got it? The most efficient way is to perform the "copy" directly, in a single operation.
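
To see where the extra cost in options 2 and 3 comes from, here is the same split done as a separate staging pass (hypothetical names as above, and assuming the #SplitRows temp table from the previous sketch). Every row is written once into #SplitRows and then read again by the MERGE, which is exactly the extra work the single-statement version in the question avoids:

-- First pass: materialise the five-way split into the temp table.
INSERT INTO #SplitRows (SourceId, Position, Value)
SELECT r.Id, x.Position, x.Value
FROM dbo.RawRows AS r
CROSS APPLY (VALUES (1, r.V1), (2, r.V2), (3, r.V3),
                    (4, r.V4), (5, r.V5)) AS x (Position, Value);

-- Second pass: the MERGE has to re-read everything that was just written.
MERGE dbo.Target AS t
USING #SplitRows AS s
      ON t.SourceId = s.SourceId
     AND t.Position = s.Position
WHEN MATCHED THEN
    UPDATE SET t.Value = s.Value
WHEN NOT MATCHED THEN
    INSERT (SourceId, Position, Value)
    VALUES (s.SourceId, s.Position, s.Value);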


