How to delete mass records using a Map/Reduce script?


Problem description


I have created a Map/Reduce script which fetches customer invoices and deletes them. If I create a saved search in the UI with the criteria below, it shows 4 million records. If I run the script, execution stops before the "getInputData" stage completes, because the maximum storage limit of this stage is 200 MB. So I want to fetch the first 4,000 records out of the 4 million, process them, and schedule the script to run every 15 minutes. Here is the code of the first stage (getInputData):

            // getInputData: page through the search in 1,000-row chunks and
            // stop after 4,000 results. Requires the N/search module.
            var count = 0;
            var result = [];
            var testSearch = search.create({
                type: 'customrecord1',
                filters: [ 'custrecord_date_created', 'notonorafter', 'startOfLastMonth' ],
                columns: [ 'internalid' ]
            });
            var resultSearch;
            do {
                resultSearch = testSearch.run().getRange({
                    start: count,
                    end: count + 1000   // getRange returns at most 1,000 results per call
                });
                for (var arr = 0; arr < resultSearch.length; arr++) {
                    result.push(resultSearch[arr]);
                }
                count += 1000;          // advance to the next page; the original never
                                        // incremented count, so it looped forever
            } while (resultSearch.length >= 1000 && count < 4000);
            return result;


Creating the saved search is taking a long time. Is there any workaround to filter down to the first 4,000 records during saved-search creation?

Answer


Why not a custom mass update?


It would be a 5-10 line script that grabs the internal id and record type of the current record matched by the mass update's criteria and then deletes the record.
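A minimal sketch of such a Mass Update script in SuiteScript 2.x is below. In a Mass Update script, the `each` entry point is invoked once per record matched by the criteria the user sets on the mass-update page, and receives the record's type and internal id. (The script-type annotations are standard; how you deploy it and which record types you make it available for are up to you.)

```javascript
/**
 * @NApiVersion 2.x
 * @NScriptType MassUpdateScript
 */
define(['N/record'], function (record) {
    /**
     * each() runs once for every record matched by the
     * criteria entered on the mass-update page; params
     * carries the current record's type and internal id.
     */
    function each(params) {
        record.delete({
            type: params.type,
            id: params.id
        });
    }
    return { each: each };
});
```

Because NetSuite drives the iteration itself, there is no 200 MB input-stage limit to work around, and the deletion criteria stay editable in the UI rather than hard-coded in the script.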

