How to process 2M records and save them into a DataTable


Problem Description




Hello,

I have 2M records in a table which I am fetching through a stored procedure. I then want to convert them to JSON format. When I fetch them and store them into a DataTable, I get the error below. How can I fetch 2M records and save them in JSON format?

Error: The memory could not be written.

What I have tried:

DataTable dt = new DataTable();

using (var connection = new SqlConnection(ConfigurationManager.ConnectionStrings["CRM"].ConnectionString))
{
    using (var cmd = new SqlCommand("SPGetList", connection))
    {
        cmd.CommandType = System.Data.CommandType.StoredProcedure;
        using (var da = new SqlDataAdapter())
        {
            da.SelectCommand = cmd;
            da.Fill(dt); // This line throws the memory exception.
        }
    }
}
return DataTableToJSONWithStringBuilder(dt);

Recommended Answer






How should we know?
We have no idea what your DataTableToJSONWithStringBuilder method is doing - other than the broad outline the method name provides - nor exactly which line is causing the problem.

So, it's going to be up to you.
Put a breakpoint on the first line of that code, and run your app through the debugger. Then look at your code and at your data, and work out manually what should happen. Then single-step each line, checking that what you expected to happen is exactly what did. When it isn't, that's where you have a problem, and you can back-track (or run it again and look more closely) to find out why.

Sorry, but we can't do that for you - time for you to learn a new (and very, very useful) skill: debugging!


Quote:

I was thinking of that too. But I am not a SQL expert. How can I fetch the records in batches, say the first 50K and then the next, and so on...?








Depends on your SQL Server version.
The old way is to use ORDER BY with ROW_NUMBER():

SELECT MyColumns
FROM (SELECT MyColumns, ROW_NUMBER() OVER (ORDER BY UniqueKeyColumn) AS RowNum FROM MyTable) s
WHERE s.RowNum BETWEEN @StartRow AND @EndRow


This works from SQL 2005 onwards.
SQL 2012 introduced OFFSET-FETCH:

SELECT MyColumns
FROM MyTable
ORDER BY UniqueKeyColumn
OFFSET @StartRow ROWS FETCH NEXT @RowCount ROWS ONLY
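Putting the OFFSET-FETCH approach together on the client side, a batched fetch loop might look like the sketch below. The stored procedure name SPGetListPaged, its parameter names, and the 50K batch size are all assumptions for illustration - it stands in for a procedure built around the OFFSET-FETCH query above. The key point is that each page's DataTable goes out of scope after processing, so the full 2M rows never sit in memory at once.

```csharp
// Sketch only: assumes a stored procedure SPGetListPaged(@StartRow, @RowCount)
// wrapping the OFFSET-FETCH query above. Names are illustrative.
using System;
using System.Data;
using System.Data.SqlClient;

class PagedFetch
{
    const int BatchSize = 50000; // fetch 50K rows per round trip

    static void FetchAllInBatches(string connectionString)
    {
        int startRow = 0;
        while (true)
        {
            var dt = new DataTable();
            using (var connection = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SPGetListPaged", connection))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.AddWithValue("@StartRow", startRow);
                cmd.Parameters.AddWithValue("@RowCount", BatchSize);
                using (var da = new SqlDataAdapter(cmd))
                {
                    da.Fill(dt); // only one 50K page in memory at a time
                }
            }

            if (dt.Rows.Count == 0)
                break; // no more rows

            // Process this page here, e.g. append its JSON to a file,
            // then let dt go out of scope so the memory can be reclaimed.
            startRow += BatchSize;

            if (dt.Rows.Count < BatchSize)
                break; // short page means this was the last one
        }
    }
}
```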


Can you try using a SqlDataReader instead of the adapter and see if fetching the data succeeds?

If the data reader works for fetching the data, then you can convert/store the returned result set into a DataTable or a complex type as required for further processing (i.e. converting to JSON).

You can also use a TimeSpan to compare the time required to fetch records with the adapter versus the reader. E.g. do this exercise for 50K, 150K, and 500K records. I am sure the data reader will be faster.
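A minimal sketch of the reader-based idea, streaming rows straight to a JSON file with SqlDataReader so the full result set never lands in a DataTable at all. The output path is illustrative, and the hand-rolled JSON here does no escaping of quotes or special characters in values - a real implementation should use a JSON library such as Json.NET:

```csharp
// Sketch only: streams one row at a time to disk, O(1) memory in the row count.
using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class StreamToJson
{
    static void WriteJson(string connectionString, string outputPath)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SPGetList", connection))
        using (var writer = new StreamWriter(outputPath))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            connection.Open();

            using (var reader = cmd.ExecuteReader())
            {
                writer.Write("[");
                bool first = true;
                while (reader.Read())
                {
                    if (!first) writer.Write(",");
                    first = false;

                    // Build one row's JSON object; only this row is in memory.
                    var sb = new StringBuilder("{");
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        if (i > 0) sb.Append(",");
                        // NOTE: no escaping - use a JSON library in production.
                        sb.AppendFormat("\"{0}\":\"{1}\"",
                            reader.GetName(i), reader[i]);
                    }
                    sb.Append("}");
                    writer.Write(sb.ToString());
                }
                writer.Write("]");
            }
        }
    }
}
```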


