Python - How to stream large (11 gb) JSON file to be broken up


Question


I have a very large JSON (11 gb) file that is too large to read into my memory. I would like to break it up into smaller files to analyze the data. I am currently using Python and Pandas for the analysis and I am wondering if there is some way to access chunks of the file so that it can be read into memory without crashing the program. Ideally, I would like to break the years worth of data into smaller manageable files that span about a week, however there isn't a constant data size, although it doesn't matter as much if they are a set interval.
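If the 11 GB file is actually a sequence of activities with one JSON object per line (newline-delimited JSON, which is a common export format for this kind of feed — an assumption, since the question doesn't say), the split can be done in constant memory with only the standard library. A minimal sketch, bucketing records into one file per ISO week based on the `object.postedTime` field shown in the sample; `split_by_week` is a hypothetical helper name:

```python
import json
from datetime import datetime
from pathlib import Path

def split_by_week(src_path, out_dir):
    """Stream an NDJSON file line by line and append each record to
    one output file per ISO week, keyed on object.postedTime."""
    out_dir = Path(out_dir)
    out_dir.mkdir(parents=True, exist_ok=True)
    handles = {}  # ISO week label -> open file handle
    try:
        with open(src_path, encoding="utf-8") as src:
            for line in src:
                line = line.strip()
                if not line:
                    continue
                record = json.loads(line)
                # e.g. "2012-10-17T19:13:50Z" in the sample record
                posted = record["object"]["postedTime"]
                ts = datetime.strptime(posted, "%Y-%m-%dT%H:%M:%SZ")
                year, week, _ = ts.isocalendar()
                label = f"{year}-W{week:02d}"
                if label not in handles:
                    handles[label] = open(out_dir / f"{label}.json",
                                          "w", encoding="utf-8")
                handles[label].write(line + "\n")
    finally:
        for fh in handles.values():
            fh.close()
```

Only one line is ever held in memory at a time, so the 11 GB input is not a problem; each weekly output file can then be loaded into Pandas separately. If the file is instead one giant JSON array or a concatenation without newlines, an incremental parser (e.g. the `ijson` package, or jq's streaming mode below) is needed first.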

Here is the data format:

{
"actor" : 
{
    "classification" : [ "suggested" ],
    "displayName" : "myself",
    "followersCount" : 0,
    "followingCount" : 0,
    "followingStocksCount" : 0,
    "id" : "person:stocktwits:183087",
    "image" : "http://avatars.stocktwits.com/production/183087/thumb-1350332393.png",
    "link" : "http://stocktwits.com/myselfbtc",
    "links" : 
    [

        {
            "href" : null,
            "rel" : "me"
        }
    ],
    "objectType" : "person",
    "preferredUsername" : "myselfbtc",
    "statusesCount" : 2,
    "summary" : null,
    "tradingStrategy" : 
    {
        "approach" : "Technical",
        "assetsFrequentlyTraded" : [ "Forex" ],
        "experience" : "Novice",
        "holdingPeriod" : "Day Trader"
    }
},
"body" : "$BCOIN and macd is going down ..... http://stks.co/iDEB",
"entities" : 
{
    "chart" : 
    {
        "fullImage" : 
        {
            "link" : "http://charts.stocktwits.com/production/original_10047145.png"
        },
        "image" : 
        {
            "link" : "http://charts.stocktwits.com/production/small_10047145.png"
        },
        "link" : "http://stks.co/iDEB",
        "objectType" : "image"
    },
    "sentiment" : 
    {
        "basic" : "Bearish"
    },
    "stocks" : 
    [

        {
            "displayName" : "Bitcoin",
            "exchange" : "PRIVATE",
            "industry" : null,
            "sector" : null,
            "stocktwits_id" : 9659,
            "symbol" : "BCOIN"
        }
    ],
    "video" : null
},
"gnip" : 
{
    "language" : 
    {
        "value" : "en"
    }
},
"id" : "tag:gnip.stocktwits.com:2012:note/10047145",
"inReplyTo" : 
{
    "id" : "tag:gnip.stocktwits.com:2012:note/10046953",
    "objectType" : "comment"
},
"link" : "http://stocktwits.com/myselfbtc/message/10047145",
"object" : 
{
    "id" : "note:stocktwits:10047145",
    "link" : "http://stocktwits.com/myselfbtc/message/10047145",
    "objectType" : "note",
    "postedTime" : "2012-10-17T19:13:50Z",
    "summary" : "$BCOIN and macd is going down ..... http://stks.co/iDEB",
    "updatedTime" : "2012-10-17T19:13:50Z"
},
"provider" : 
{
    "displayName" : "StockTwits",
    "link" : "http://stocktwits.com"
},
"verb" : "post"
}

Answer


jq 1.5 has a streaming parser (documented at http://stedolan.github.io/jq/manual/#Streaming). In one sense it's easy to use, e.g. if your 1G file is named 1G.json, then the following command will produce a stream of lines, including one line per "leaf" value:

jq -c --stream . 1G.json


(The output is shown below. Notice that each line is itself valid JSON.)


However, using the streamed output may not be so easy, but that depends on what you want to do :-)


The key to understanding the streamed output is that most lines have the form:

[ PATH, VALUE ]


where "PATH" is an array representation of the path. (When using jq, this array can in fact be used as a path.)
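The same path array is just as easy to follow from Python: walking it with a `reduce` over dict keys and list indices reaches the value directly. A small illustration (`record` below is a trimmed stand-in for the full activity, and `getpath` is a hypothetical helper name):

```python
from functools import reduce

# Trimmed stand-in for one parsed activity from the file
record = {"actor": {"links": [{"href": None, "rel": "me"}]}}

def getpath(obj, path):
    """Follow a jq-style path array (dict keys and list indices) into obj."""
    return reduce(lambda node, key: node[key], path, obj)

value = getpath(record, ["actor", "links", 0, "rel"])
# value == "me"
```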

[["actor","classification",0],"suggested"]
[["actor","classification",0]]
[["actor","displayName"],"myself"]
[["actor","followersCount"],0]
[["actor","followingCount"],0]
[["actor","followingStocksCount"],0]
[["actor","id"],"person:stocktwits:183087"]
[["actor","image"],"http://avatars.stocktwits.com/production/183087/thumb-1350332393.png"]
[["actor","link"],"http://stocktwits.com/myselfbtc"]
[["actor","links",0,"href"],null]
[["actor","links",0,"rel"],"me"]
[["actor","links",0,"rel"]]
[["actor","links",0]]
[["actor","objectType"],"person"]
[["actor","preferredUsername"],"myselfbtc"]
[["actor","statusesCount"],2]
[["actor","summary"],null]
[["actor","tradingStrategy","approach"],"Technical"]
[["actor","tradingStrategy","assetsFrequentlyTraded",0],"Forex"]
[["actor","tradingStrategy","assetsFrequentlyTraded",0]]
[["actor","tradingStrategy","experience"],"Novice"]
[["actor","tradingStrategy","holdingPeriod"],"Day Trader"]
[["actor","tradingStrategy","holdingPeriod"]]
[["actor","tradingStrategy"]]
[["body"],"$BCOIN and macd is going down ..... http://stks.co/iDEB"]
[["entities","chart","fullImage","link"],"http://charts.stocktwits.com/production/original_10047145.png"]
[["entities","chart","fullImage","link"]]
[["entities","chart","image","link"],"http://charts.stocktwits.com/production/small_10047145.png"]
[["entities","chart","image","link"]]
[["entities","chart","link"],"http://stks.co/iDEB"]
[["entities","chart","objectType"],"image"]
[["entities","chart","objectType"]]
[["entities","sentiment","basic"],"Bearish"]
[["entities","sentiment","basic"]]
[["entities","stocks",0,"displayName"],"Bitcoin"]
[["entities","stocks",0,"exchange"],"PRIVATE"]
[["entities","stocks",0,"industry"],null]
[["entities","stocks",0,"sector"],null]
[["entities","stocks",0,"stocktwits_id"],9659]
[["entities","stocks",0,"symbol"],"BCOIN"]
[["entities","stocks",0,"symbol"]]
[["entities","stocks",0]]
[["entities","video"],null]
[["entities","video"]]
[["gnip","language","value"],"en"]
[["gnip","language","value"]]
[["gnip","language"]]
[["id"],"tag:gnip.stocktwits.com:2012:note/10047145"]
[["inReplyTo","id"],"tag:gnip.stocktwits.com:2012:note/10046953"]
[["inReplyTo","objectType"],"comment"]
[["inReplyTo","objectType"]]
[["link"],"http://stocktwits.com/myselfbtc/message/10047145"]
[["object","id"],"note:stocktwits:10047145"]
[["object","link"],"http://stocktwits.com/myselfbtc/message/10047145"]
[["object","objectType"],"note"]
[["object","postedTime"],"2012-10-17T19:13:50Z"]
[["object","summary"],"$BCOIN and macd is going down ..... http://stks.co/iDEB"]
[["object","updatedTime"],"2012-10-17T19:13:50Z"]
[["object","updatedTime"]]
[["provider","displayName"],"StockTwits"]
[["provider","link"],"http://stocktwits.com"]
[["provider","link"]]
[["verb"],"post"]
[["verb"]]
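If you consume this stream from Python rather than from jq itself, the two-element `[PATH, VALUE]` lines can be folded back into a nested structure; the one-element lines are "close" markers and can be skipped for this purpose. A minimal sketch — `setpath` and `fromstream` are hypothetical helper names here (jq 1.5 also ships a `fromstream` builtin that does the same job inside jq):

```python
import json

def setpath(root, path, value):
    """Insert value at a jq-style path, creating dicts for string keys
    and lists for integer indices along the way."""
    node = root
    for i, key in enumerate(path[:-1]):
        empty = [] if isinstance(path[i + 1], int) else {}
        if isinstance(key, int):
            while len(node) <= key:   # pad the list up to this index
                node.append(None)
            if node[key] is None:
                node[key] = empty
            node = node[key]
        else:
            node = node.setdefault(key, empty)
    last = path[-1]
    if isinstance(last, int):
        while len(node) <= last:
            node.append(None)
        node[last] = value
    else:
        node[last] = value
    return root

def fromstream(lines):
    """Fold jq --stream output lines back into one nested object,
    ignoring the one-element close markers."""
    root = {}
    for line in lines:
        event = json.loads(line)
        if len(event) == 2:
            setpath(root, event[0], event[1])
    return root
```

Fed a chunk of the stream shown above, this rebuilds the corresponding fragment of the original activity, which can then be handed to Pandas.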
