Is this a safe way for inserting and updating an array of objects in mongodb?


Problem description

The following code updates an array of specified objects, or inserts an object if it is not yet in the database. It works fine, but I'm new to mongodb and I'm not sure whether this is a safe or fast way to do it.

Maybe I should use updateMany? I tried to use it, but I couldn't get the same behaviour as the following code.

mongodb.connect(mongo_url, function(err, db) {
    if(err) console.log(err)
    else {
        var mongo_products_collection = db.collection("products")

        mongoUpsert(mongo_products_collection, data_products, function() {
            db.close()
        })
    }
})

function mongoUpsert(collection, data_array, cb) {
    var data_length = data_array.length

    for (var i=0; i < data_length; i++) {
        collection.update(
            {product_id: data_array[i].product_id},
            data_array[i],
            {upsert: true}
        )
    }

    return cb(false)
}

Answer

Using the bulkWrite API to carry out the updates handles this better.

mongodb.connect(mongo_url, function(err, db) {
    if(err) console.log(err)
    else {
        var mongo_products_collection = db.collection("products")

        mongoUpsert(mongo_products_collection, data_products, function() {
            db.close()
        })
    }
})

function mongoUpsert(collection, data_array, cb) {

    // Build one upsert operation per element of the array
    var bulkUpdateOps = data_array.map(function(data) {
        return {
            "updateOne": {
                "filter": { 
                    "product_id": data.product_id,
                    "post_modified": { "$ne": data.post_modified }
                },
                "update": { "$set": data },
                "upsert": true
            }
        };
    });

    collection.bulkWrite(bulkUpdateOps, function(err, r) {
        // do something with result, then signal completion here so the
        // caller doesn't close the connection before the write finishes
        cb(err);
    });
}


If you're dealing with larger arrays, i.e. > 1000 elements, then consider sending the writes to the server in batches of 500. This gives better performance because you are not sending a request per document, just one request for every 500 operations.

For bulk operations MongoDB imposes a default internal limit of 1000 operations per batch, so choosing 500 documents is good in the sense that you keep some control over the batch size rather than letting MongoDB impose the default, i.e. for larger operations in the magnitude of > 1000 documents. So for the case above, in the first approach one could just write the whole array at once since it is small, but the choice of 500 is for larger arrays.

var ops = [],
    counter = 0;

data_array.forEach(function(data) {
    ops.push({
        "updateOne": {
            "filter": { 
                "product_id": data.product_id, 
                "post_modified": { "$ne": data.post_modified } 
            },
            "update": { "$set": data },
            "upsert": true
        }
    });
    counter++;

    if (counter % 500 == 0) {
        collection.bulkWrite(ops, function(err, r) {
            // do something with result
        });
        ops = [];
    }
})

// flush any remaining operations that didn't fill a full batch of 500
if (counter % 500 != 0) {
    collection.bulkWrite(ops, function(err, r) {
        // do something with result
    });
}
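The counter-based batching above can also be expressed as a small, standalone chunking helper. This is just a sketch; `chunkArray` is a hypothetical name, not part of the MongoDB driver:

```javascript
// Split an array into consecutive chunks of at most `size` elements.
function chunkArray(array, size) {
    var chunks = [];
    for (var i = 0; i < array.length; i += size) {
        chunks.push(array.slice(i, i + size));
    }
    return chunks;
}

// Each chunk can then be sent as its own bulkWrite call, e.g.:
// chunkArray(bulkUpdateOps, 500).forEach(function(batch) {
//     collection.bulkWrite(batch, function(err, r) { /* ... */ });
// });
```

Separating the chunking from the write logic makes the batch size easy to change and the helper easy to test on its own.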
