Architecture for a lot of data logging, DB or file?


Problem Description

I'm working on a Python app that I want to scale to accommodate about 150 writes per second, spread out among about 50 different sources.

Is MongoDB a good candidate for this? I'm torn between writing to a database and just making a log file for each source and parsing them separately.
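As an illustration of the file-per-source option, here is a minimal sketch using Python's standard logging module with one rotating file per source (the source names and size limits are purely illustrative):

```python
import logging
from logging.handlers import RotatingFileHandler

def make_source_logger(source_id: str) -> logging.Logger:
    """Return a logger that writes to its own rotating file for one source."""
    logger = logging.getLogger(f"source.{source_id}")
    if not logger.handlers:  # avoid adding duplicate handlers on repeated calls
        handler = RotatingFileHandler(
            f"{source_id}.log", maxBytes=50_000_000, backupCount=5
        )
        handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
        logger.addHandler(handler)
        logger.setLevel(logging.INFO)
    return logger

# Each of the ~50 sources gets its own file, to be parsed separately later.
make_source_logger("sensor-01").info("value=42")
```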

Any other suggestions for logging a lot of data?

Recommended Answer

I would say that MongoDB is a very good fit for log collection, because:

  1. MongoDB has amazingly fast writes.
  2. Logs are not that important, so it's okay to lose some of them in case of a server failure. That means you can run MongoDB without the journaling option to avoid write overhead.
  3. In addition, you can use sharding to increase write speed; at the same time, you can move the oldest logs into a separate collection or into the file system.
  4. You can easily export data from the database to JSON/CSV.
  5. Once you have everything in a database, you will be able to query the data to find the logs you need (see the sketch after this list).
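Below is a minimal sketch of points 2 and 5 using the pymongo driver against a local mongod. The database and collection names (logging_demo, log_entries) are illustrative; note that journaling itself is a mongod server setting, so on the client side the sketch only relaxes the write concern (w=0) for inserts:

```python
from pymongo import MongoClient, ASCENDING, DESCENDING
from pymongo.write_concern import WriteConcern

client = MongoClient("mongodb://localhost:27017/")  # assumes a local mongod
db = client["logging_demo"]                         # hypothetical database name

# Unacknowledged writes (w=0) trade durability for insert throughput,
# in the spirit of point 2 above.
fast_logs = db.get_collection("log_entries", write_concern=WriteConcern(w=0))
fast_logs.insert_one({"source": "sensor-01", "ts": 1700000000.0, "msg": "value=42"})

# Point 5: index once, then query back just the logs you need.
logs = db["log_entries"]
logs.create_index([("source", ASCENDING), ("ts", DESCENDING)])
for doc in logs.find({"source": "sensor-01"}).sort("ts", DESCENDING).limit(10):
    print(doc["msg"])
```

For point 4, MongoDB's stock mongoexport tool can dump a collection to JSON or CSV without any extra code.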

So, in my opinion, MongoDB is a perfect fit for things like logs. You don't need to manage a lot of log files in the file system; MongoDB does that for you.
