Python 3.6 pickling custom procedure


Problem Description


I have some objects of class A, which has its own method to be pickled; call it custom_module.customPickle(A), which takes an instance of A and returns a serialization string. I also have a list of objects, each of class B, and each B contains an A.

I need to pickle the list, but pickling A gives errors that are difficult to solve. However, A has its own method to be pickled.

I can implement the __reduce__() method in class B so that it calls custom_module.customPickle(A). But how can I do this so that pickle is able to serialize B efficiently?


Object A is a music21.stream.Stream and object B is a custom object. The custom serialization function is music21.converter.freezeStr(streamObj, fmt=None), and the unpickling function should be music21.converter.thawStr(strData).
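
For illustration, a minimal sketch of the __reduce__() idea mentioned above could look like the following, assuming B keeps its Stream in an attribute named stream alongside some ordinary picklable state (the attribute names and the _rebuild_b helper are hypothetical, not part of music21):

import music21.converter


def _rebuild_b(frozen_stream, label):
    # Rebuild B from the frozen Stream string plus the remaining state.
    return B(music21.converter.thawStr(frozen_stream), label)


class B:
    def __init__(self, stream, label):
        self.stream = stream   # a music21.stream.Stream instance
        self.label = label     # ordinary picklable state

    def __reduce__(self):
        # Let music21 serialise the Stream; pickle handles the rest.
        return _rebuild_b, (music21.converter.freezeStr(self.stream), self.label)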

Solution

You can use the copyreg module to register custom functions for pickling and unpickling; the function you register acts like a __reduce__ method on the class.

If you return a tuple of (unpickle_function, state), then the registered unpickle_function callable will be called to unpickle it again, with state as the argument, so you can use your music21.converter.thawStr() function there:

import copyreg
import music21.converter
import music21.stream


def pickle_music21_stream(stream_obj):
    return music21.converter.thawStr, (music21.converter.freezeStr(stream_obj),)

copyreg.pickle(music21.stream.Stream, pickle_music21_stream)

(the constructor argument to copyreg.pickle() is ignored in recent Python versions)

This registers a global handler for those objects. You can also use a dispatch table per pickler; see the Dispatch Tables section of the pickle documentation for how to register one.

Now, when pickling, whenever an instance of Stream is encountered, the pickle_music21_stream() function is used to produce a serialisation, and the thawStr() function will be used to unpickle that data again.

However, the music21.converter functions use pickle themselves. They effectively pack and clean up the stream, and then pickle the resulting Stream instance. This will then call the custom handler, and you have an infinite loop.

The work-around is to use a custom dispatch table to handle pickling and unpickling. Avoid using copyreg in this case, as it sets a global hook that'll be called recursively each time a Stream object is being pickled.

Your own pickle infrastructure needs to use a custom pickler:

import copyreg
import io
import pickle
import music21.converter
import music21.stream


def pickle_music21_stream(stream_obj):
    return music21.converter.thawStr, (music21.converter.freezeStr(stream_obj),)


def dumps(obj):
    f = io.BytesIO()
    p = pickle.Pickler(f)
    # Give this pickler its own dispatch table (a copy of the global copyreg
    # table) so the Stream handler applies only here, not to music21's own
    # internal pickle calls.
    p.dispatch_table = copyreg.dispatch_table.copy()
    p.dispatch_table[music21.stream.Stream] = pickle_music21_stream
    p.dump(obj)
    return f.getvalue()


def loads(data):
    return pickle.loads(data)  # hook is registered in the pickle data

Here the custom function is only called when a Stream instance is found in your own data structure. The music21 routines use the global pickle.dumps() and pickle.loads() functions and won't use the same hook.
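
As a usage sketch (the Song container class and its attributes below are hypothetical stand-ins for the custom class B from the question), the dumps() and loads() helpers above would be used like this:

import music21.stream


class Song:
    # Hypothetical container that holds a Stream plus ordinary state.
    def __init__(self, stream, title):
        self.stream = stream
        self.title = title


song = Song(music21.stream.Stream(), 'example')
data = dumps(song)       # the Stream attribute goes through freezeStr()
restored = loads(data)   # and is rebuilt via thawStr() on the way back
assert isinstance(restored.stream, music21.stream.Stream)

Note that a pickler's dispatch table is keyed on the exact class, so Stream subclasses such as music21.stream.Score would need their own entries in the table.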
