Python Global Variables in multiple files


Problem description

I have 2 daemons which should access the same variable. I've created a third file for global variables, and each daemon can access the variable. But when one changes the variable, the other one still sees the default value.

example:

glob.py

time = 0

daemon a:

import datetime
import time
import glob

while(True):
    glob.time = datetime.datetime.now()
    time.sleep(30)

daemon b:

import glob

while(True):
    print(glob.time)

It would print 0 every time. I hope I've made my problem clear and that someone can help me. If you need more information, please feel free to ask.

Solution

It looks like (although you don't say so explicitly) you are running your programs in a completely independent way: two different invocations of the Python interpreter.

There is no such magic as the one you are hoping for: since you have two instances of the same program running, each one has its own instance of every variable (global or otherwise).

If you are performing some simple task, the easier way to go is to have each process write its output to a text file, and have the other process read the information from the file generated by each process it wants to know about (on Unix you could even use named pipes).
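A minimal sketch of this file-based approach, assuming the value is exchanged through a file named shared_time.txt (the file name and read interval are assumptions, not part of the original answer):

daemon a (writer):

    # daemon a: write the current timestamp to a shared file every 30 seconds
    import datetime
    import time

    while True:
        # overwrite the file with a single short line each time
        with open("shared_time.txt", "w") as f:
            f.write(datetime.datetime.now().isoformat())
        time.sleep(30)

daemon b (reader):

    # daemon b: print whatever value daemon a last wrote
    import time

    while True:
        try:
            with open("shared_time.txt") as f:
                print(f.read().strip())
        except FileNotFoundError:
            # daemon a has not written anything yet
            print(0)
        time.sleep(5)

In a real setup you would write to a temporary file and rename it over shared_time.txt, so the reader never sees a half-written value.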

The other way is to have a Python script coordinate the starting of your daemons using the multiprocessing stdlib module, and then create a multiprocessing.Manager object to share variables directly between processes. This can be more complicated to set up at first, but it is the clean thing to do. Check the docs on the Manager class here: https://docs.python.org/3/library/multiprocessing.html
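A minimal sketch of this Manager-based approach, assuming both loops are rewritten as functions and started from a single coordinating script (the function names and the 5-second read interval are illustrative):

    # coordinator script: starts both daemons and shares a dict between them
    import datetime
    import time
    from multiprocessing import Manager, Process

    def daemon_a(shared):
        # periodically update the shared value
        while True:
            shared["time"] = datetime.datetime.now()
            time.sleep(30)

    def daemon_b(shared):
        # read the value written by daemon_a; 0 is the default before the first write
        while True:
            print(shared.get("time", 0))
            time.sleep(5)

    if __name__ == "__main__":
        with Manager() as manager:
            shared = manager.dict({"time": 0})
            a = Process(target=daemon_a, args=(shared,))
            b = Process(target=daemon_b, args=(shared,))
            a.start()
            b.start()
            a.join()
            b.join()

Unlike a module-level variable, the manager.dict proxy lives in a separate server process, so both child processes read and write the same underlying value.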
