A timeout decorator class with multiprocessing gives a pickling error


Problem description

So on Windows, the signal and thread approaches are in general bad ideas / don't work for timing out functions.

I've made the following timeout code, which throws a timeout exception from multiprocessing when the code takes too long. This is exactly what I want.

from multiprocessing import Pool

def timeout(timeout, func, *arg):
    with Pool(processes=1) as pool:
        result = pool.apply_async(func, (*arg,))
        return result.get(timeout=timeout)
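For reference, here is a self-contained sketch of how that helper behaves on both the success and the timeout path (the `quick` and `slow` functions and the concrete limits are illustrative; the helper itself is repeated so the snippet runs on its own, and the `__main__` guard is required on Windows, where child processes re-import the module):

```python
import time
from multiprocessing import Pool, TimeoutError

def timeout(timeout, func, *arg):
    with Pool(processes=1) as pool:
        result = pool.apply_async(func, (*arg,))
        return result.get(timeout=timeout)

def quick(x):
    # Finishes immediately, well within any limit.
    return x + 1

def slow(x):
    time.sleep(5)  # longer than the 1-second limit used below
    return x

if __name__ == "__main__":
    print(timeout(2, quick, 41))  # → 42
    try:
        timeout(1, slow, 0)
    except TimeoutError:
        # result.get raises multiprocessing.TimeoutError once the
        # limit expires; leaving the Pool context terminates the worker.
        print("timed out")
```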

I'm now trying to get this into a decorator style so that I can add it to a wide range of functions, especially those where external services are called and I have no control over the code or duration. My current attempt is below:

from multiprocessing import Pool

class TimeWrapper(object):

    def __init__(self, timeout=10):
        """Timing decorator"""
        self.timeout = timeout

    def __call__(self, f):
        def wrapped_f(*args):
            with Pool(processes=1) as pool:
                result = pool.apply_async(f, (*args,))
                return result.get(timeout=self.timeout)

        return wrapped_f

It gives a pickling error:

@TimeWrapper(7)
def func2(x, y):
    time.sleep(5)
    return x*y

File "C:\Users\rmenk\AppData\Local\Continuum\anaconda3\lib\multiprocessing\reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
_pickle.PicklingError: Can't pickle <function func2 at 0x000000770C8E4730>: it's not the same object as __main__.func2

I suspect this is due to multiprocessing and the decorator not playing nicely together, but I don't actually know how to make them cooperate. Any ideas on how to fix this?

PS: I've done extensive research on this site and elsewhere but haven't found any answer that works, be it with pebble, threading, a function decorator, or otherwise. If you have a solution that you know works on Windows and Python 3.5, I'd be very happy to just use it.

Recommended answer

What you are trying to achieve is particularly cumbersome on Windows. The core issue is that when you decorate a function, you shadow it. This happens to work just fine on UNIX because it uses the fork strategy to create new processes.

On Windows, though, the new process is a blank one in which a brand-new Python interpreter is started and your module is loaded. When the module is loaded, the decorator hides the real function, making it hard for the pickle protocol to find.

The only way to get it right is to rely on a trampoline function set up during the decoration. You can take a look at how it is done in pebble, but as long as you're not doing this as an exercise, I'd recommend using pebble directly, as it already offers what you are looking for.

from pebble import concurrent

@concurrent.process(timeout=60)
def my_function(var, keyvar=0):
    return var + keyvar

future = my_function(1, keyvar=2)
future.result()
