What is the highest scale of time precision that can be reached via Python?


Problem Description

Consider a very simple timer;

import time

start = time.time()
end = time.time() - start
while end < 5:
    end = time.time() - start
    print(end)

How precise is this timer? I mean, compared to a real-time clock, how synchronized and real-time is it?

Now for the real question:

What is the smallest scale of time that can be measured precisely with Python?

Solution

This is entirely platform dependent. Use the timeit.default_timer() function; it returns the most precise timer available for your platform.

From the documentation:

Define a default timer, in a platform-specific manner. On Windows, time.clock() has microsecond granularity, but time.time()‘s granularity is 1/60th of a second. On Unix, time.clock() has 1/100th of a second granularity, and time.time() is much more precise.

So, on Windows, you get microseconds, on Unix, you'll get whatever precision the platform can provide, which is usually (much) better than 1/100th of a second.
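Note that the documentation quoted above describes Python 2. On Python 3 (3.3+), timeit.default_timer is an alias for time.perf_counter(), the highest-resolution clock available, and time.clock() was removed in Python 3.8. A minimal sketch of inspecting and empirically probing clock resolution (smallest_tick is a hypothetical helper written for this example):

```python
import time
import timeit

# On Python 3, timeit.default_timer is an alias for time.perf_counter().
start = timeit.default_timer()
elapsed = timeit.default_timer() - start
print(elapsed)

# The platform's advertised resolution for this clock, in seconds:
print(time.get_clock_info('perf_counter').resolution)

# Empirically probe a clock's granularity by looking for the smallest
# nonzero step between successive readings.
def smallest_tick(clock, samples=1_000_000):
    smallest = float('inf')
    prev = clock()
    for _ in range(samples):
        now = clock()
        if now != prev:
            smallest = min(smallest, now - prev)
            prev = now
    return smallest

print(smallest_tick(time.time))             # granularity of time.time()
print(smallest_tick(timeit.default_timer))  # usually much finer
```

The exact numbers vary by OS and hardware, but on most modern platforms perf_counter ticks in the nanosecond-to-microsecond range, which is the practical answer to "the smallest scale of time that can be measured precisely".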

