How to maintain the integrity of JSON when a program ends abruptly (Python)


Problem Description


I'm running multiple threads in one script that opens 10 children. Each child script queries an API and logs some data into a JSON file. I keep a log of each JSON file saved in another JSON file; however, the code sometimes ends abruptly and destroys the integrity of the JSON so it can't be re-opened, e.g.:

["2016_02_21_18_46_41", 1], ["2016_02_21_18_46_42", 1], ["2016_02_21_18_46_4


How can I make it so it only writes the data to the JSON file once it's complete? Or how else can I fix this problem?


For context, here is my parent script:

from threading import Thread
import sys

sys.path.append('/python/loanrates/master')

names =  ['BTS', 'ETH', 'CLAM', 'DOGE', 'FCT', 'MAID', 'STR', 'XMR', 'XRP', 'BTC' ]
threads = []
for name in names: 
    sys.path.append('/python/loanrates/'+name)

import Master

for name in names:
    T = Thread(target=Master.main, args=(name,))
    print T
    threads.append(T)

for thread_ in threads:
    thread_.start()

for thread_ in threads:
    thread_.join()

And this is my child script:

import arrow
import json
import os      # needed by os._exit in the __main__ block below
import pickle
import sys     # needed by sys.exit in the __main__ block below
import time
import urllib

def main(name):

    date = arrow.utcnow().format('YYYY_MM_DD_HH_mm_ss')
    previous_date = "2016_02_18_09_02_52"
    previous_second = date[-2:]
    count = 0

    print name,'has started'

    while True:
            date = arrow.utcnow().format('YYYY_MM_DD_HH_mm_ss')
            second = int(date[-2:])
            url = 'https://poloniex.com/public?command=returnLoanOrders&currency='+name
            try:
                response = urllib.urlopen(url)
                data = json.loads(response.read())
            except:
                data = 'error'
                print 'error with', name, 'has occurred, probably been blocked by polo'
                time.sleep(10)

            #open previous data
            with open( 'D:/python/loanrates/'+name+'/'+previous_date+'.json', 'r') as f:
                previous_data = json.load(f)

            #open date log
            with open( 'D:/python/loanrates/'+name+'/date_store.json', 'r') as f:
                date_store = json.load(f)

            #compare new to old data
            # if new != old, the new data is saved and that date receives a 1 in the 'date_store' dict,
            # signifying there is new data for that date
            if previous_data != data and previous_second != second and second%10 == 0:

                date_store.append((date,1))

                with open( 'D:/python/loanrates/'+name+'/'+date+'.json', 'w') as f:
                    json.dump(data, f)

                with open( 'D:/python/loanrates/'+name+'/date_store.json', 'w') as f:
                    json.dump(date_store, f)

                previous_date = date
                previous_second = second

                count += 1
                if count == 1000: print name, 'has logged 1000'
            # if new == old, the new data hasn't changed and isn't saved; that date receives a 0 in 'date_store',
            # signifying that the previous date of value 1 can be substituted for this date
            elif previous_second != second and second%10 == 0:
                date_store.append((date,0))

                with open( 'D:/python/loanrates/'+name+'/date_store.json', 'w') as f:
                    json.dump(date_store, f)

                previous_second = second

if __name__ == '__main__':
    try:
        main(name)
    except KeyboardInterrupt:
        print 'Interrupted'
        try:
            sys.exit(0)
        except SystemExit:
            os._exit(0)
    except:
        pass


(as you can see there is only one child script but it's in a for loop)

Recommended Answer


A try/except should work: each time the file loads successfully (i.e. the data is not corrupted), keep a backup copy of it. Then, when loading the JSON raises an exception, fall back to loading the backup file.
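The backup idea above, combined with an atomic write (dump to a temporary file, then rename it into place so the real file is never half-written), can be sketched like this. This is a minimal sketch assuming Python 3; `safe_dump` and `safe_load` are hypothetical helper names, not part of the original code:

```python
import json
import os
import shutil

def safe_dump(data, path):
    """Write JSON atomically: dump to a temp file, keep a backup of the
    old known-good copy, then rename the temp file into place."""
    tmp_path = path + '.tmp'
    with open(tmp_path, 'w') as f:
        json.dump(data, f)
    if os.path.exists(path):
        shutil.copy(path, path + '.bak')  # last known-good copy
    os.replace(tmp_path, path)            # atomic on POSIX and Windows

def safe_load(path):
    """Load JSON; if the main file is corrupt or missing, fall back
    to the backup copy."""
    try:
        with open(path) as f:
            return json.load(f)
    except (ValueError, OSError):
        with open(path + '.bak') as f:
            return json.load(f)
```

Because `os.replace` swaps the file in atomically, a crash mid-write leaves the old file intact; the `.bak` copy guards against any corruption that slips through anyway.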
