Overhead of individual python apps


Problem Description


I'm setting up a system that consists of several small python
applications that all communicate amongst each other on the same pc.

When running in Windows, launching each application generates a
process, and each of those processes ends up taking up > 4MB of system
memory. This memory usage is as reported by the Windows Task manager
for the python.exe image name.

My Question: Is there any way to reduce this per-process overhead? eg:
can you set it somehow so that one python.exe instance handles multiple
processes?

One possibility considered is to run them as threads of a single
process rather than multiple processes, but this has other drawbacks
for my application and I'd rather not.
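For comparison, a minimal sketch of that threads-of-a-single-process idea (modern Python 3 shown; the two "apps" and the queue-based protocol here are illustrative stand-ins, not part of the original system):

```python
import queue
import threading

def producer_app(out_q):
    # One small "app": produce a few values, then a None sentinel
    # telling the peer to shut down.
    for i in range(3):
        out_q.put(i)
    out_q.put(None)

def consumer_app(in_q, results):
    # The other "app": consume until the sentinel arrives.
    while True:
        item = in_q.get()
        if item is None:
            break
        results.append(item * 2)

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer_app, args=(q,))
t2 = threading.Thread(target=consumer_app, args=(q, results))
t1.start()
t2.start()
t1.join()
t2.join()
print(results)  # → [0, 2, 4]
```

Both "apps" now share one python.exe image, at the cost of losing process isolation, which is exactly the trade-off being weighed above.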

Another possibility I considered is to strip out all but the most
essential imports in each app, but I tested this out and it has
marginal benefits. I demonstrated to myself that a simple one-liner
app consisting of 'x = raw_input()' still eats up > 2.7MB.
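That marginal benefit is consistent with most of the footprint being the interpreter itself: even before any application-level import runs, a sizeable set of modules is already loaded at startup. A quick way to see that baseline (shown on a modern Python; 2.3 behaves the same in kind, if not in count):

```python
import sys

# Modules already present at interpreter startup. Application imports
# only add on top of this baseline, which is why stripping them from
# a one-liner saves so little.
baseline = sorted(sys.modules)
print(len(baseline), baseline[:5])
```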

I also tried -O but it, not surprisingly, did nothing for the
one-liner.

I'm simply running the .py files and I am still on v2.3.

All help appreciated!

Thanks,
Russ

Recommended Answer

" Qopit" < RU ************ @ gmail.com>写道:
"Qopit" <ru************@gmail.com> writes:
> When running in Windows, launching each application generates a
> process, and each of those processes ends up taking up > 4MB of system
> memory. This memory usage is as reported by the Windows Task manager
> for the python.exe image name.
The first step is to clarify what's being reported. If WTM is
reporting the total memory usage for each process, then it's
overestimating the total usage by a considerable amount. In particular,
all the Python executable code should be shared by all the
processes. Unless you load compiled extensions, anyway - those will
only be shared by the ones that use them. You'll need something that
will give you the stack, heap and code segment sizes separately to
work out what the memory usage really is.
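As a rough illustration of the per-segment breakdown being suggested, here is a sketch that reads the kernel's accounting from /proc (Linux-only, and not part of the original thread; on Windows the analogous check is comparing "Working Set" against "Private Bytes" in a tool such as Process Explorer):

```python
def memory_segments():
    """Return selected memory fields for this process, in kB (Linux-only).

    VmExe/VmLib cover code segments the OS can share across processes;
    VmData and VmStk are the private heap and stack each python.exe
    genuinely pays for on its own.
    """
    wanted = ("VmRSS", "VmData", "VmStk", "VmExe", "VmLib")
    sizes = {}
    with open("/proc/self/status") as f:
        for line in f:
            key = line.split(":", 1)[0]
            if key in wanted:
                sizes[key] = int(line.split()[1])  # second field is the kB value
    return sizes

print(memory_segments())
```

If VmExe + VmLib dominate VmData + VmStk, most of the Task Manager number is shareable code rather than true per-process cost.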
> My Question: Is there any way to reduce this per-process overhead? eg:
> can you set it somehow so that one python.exe instance handles multiple
> processes?
The OS should do that for you by default.
> One possibility considered is to run them as threads of a single
> process rather than multiple processes, but this has other drawbacks
> for my application and I'd rather not,






That shouldn't help memory usage - the data that isn't shared across
processes would need to be thread-private in any case.

The reason for the uncertainty is that I'm not positive that Windows
behaves sanely in this area. It may be that Windows doesn't have
shared executables. In this case, one solution is to move to a modern
OS - like v6 Unix :-).

<mike
--
Mike Meyer <mw*@mired.org> http://www.mired.org/home/mwm/
Independent WWW/Perforce/FreeBSD/Unix consultant, email for more information.


Several apps using 4Mb each shouldn't be very much
memory (maybe 20Mb at most). You didn't say how
much memory was in your machine, but 256Mb of memory
will cost you no more than 50. Not really worth a
lot of effort.

-Larry



