How to create a high resolution timer in Linux to measure program performance?

Problem Description

I'm trying to compare GPU to CPU performance. For the NVIDIA GPU I've been using the cudaEvent_t types to get a very precise timing.

For the CPU I've been using the following code:

#include <time.h>   // for clock() and CLOCKS_PER_SEC

// Timers
clock_t start, stop;
float elapsedTime = 0;

// Capture the start time
start = clock();

// Do something here
.......

// Capture the stop time
stop = clock();

// Retrieve time elapsed in milliseconds
elapsedTime = (float)(stop - start) / (float)CLOCKS_PER_SEC * 1000.0f;

Apparently, that piece of code is only good if you're counting in seconds. Also, the results sometimes come out quite strange.

Does anyone know of some way to create a high resolution timer in Linux?

Recommended Answer

Check out clock_gettime, which is a POSIX interface to high-resolution timers.
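For illustration, here is a minimal sketch (not part of the original answer) of how clock_gettime is typically used with CLOCK_MONOTONIC to measure elapsed wall-clock time; on glibc versions before 2.17 you may need to link with -lrt. The complete CPU-time example from the answer follows further below.

#include <time.h>
#include <stdio.h>

int main()
{
    timespec start, end;

    clock_gettime(CLOCK_MONOTONIC, &start);
    // ... workload to be measured ...
    clock_gettime(CLOCK_MONOTONIC, &end);

    // Convert both timestamps to nanoseconds before subtracting
    long long elapsed_ns = (end.tv_sec - start.tv_sec) * 1000000000LL
                         + (end.tv_nsec - start.tv_nsec);
    printf("elapsed: %lld ns\n", elapsed_ns);
    return 0;
}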

If, having read the manpage, you're left wondering about the difference between CLOCK_REALTIME and CLOCK_MONOTONIC, see Difference between CLOCK_REALTIME and CLOCK_MONOTONIC?
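As a side note beyond the original answer: if you want to check what resolution each clock actually provides on your system, POSIX also offers clock_getres. A small sketch:

#include <time.h>
#include <stdio.h>

int main()
{
    timespec res;

    // Resolution of the settable wall-clock timer
    clock_getres(CLOCK_REALTIME, &res);
    printf("CLOCK_REALTIME:  %ld s %ld ns\n", (long)res.tv_sec, (long)res.tv_nsec);

    // Resolution of the monotonic timer, which is never stepped by clock adjustments
    clock_getres(CLOCK_MONOTONIC, &res);
    printf("CLOCK_MONOTONIC: %ld s %ld ns\n", (long)res.tv_sec, (long)res.tv_nsec);
    return 0;
}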

See the following page for a complete example: http://www.guyrutenberg.com/2007/09/22/profiling-code-using-clock_gettime/

#include <iostream>
#include <time.h>
using namespace std;

timespec diff(timespec start, timespec end);

int main()
{
    timespec time1, time2;
    long long temp = 0;  // initialized to avoid undefined behaviour (the original left it uninitialized)

    clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &time1);
    // Busy-work loop to have something to time
    for (int i = 0; i < 242000000; i++)
        temp += temp;
    clock_gettime(CLOCK_PROCESS_CPUTIME_ID, &time2);

    timespec elapsed = diff(time1, time2);
    cout << elapsed.tv_sec << ":" << elapsed.tv_nsec << endl;
    return 0;
}

// Subtract two timespecs, borrowing from tv_sec when tv_nsec underflows
timespec diff(timespec start, timespec end)
{
    timespec temp;
    if ((end.tv_nsec - start.tv_nsec) < 0) {
        temp.tv_sec = end.tv_sec - start.tv_sec - 1;
        temp.tv_nsec = 1000000000 + end.tv_nsec - start.tv_nsec;
    } else {
        temp.tv_sec = end.tv_sec - start.tv_sec;
        temp.tv_nsec = end.tv_nsec - start.tv_nsec;
    }
    return temp;
}
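As an aside beyond the original answer: if you can use C++11 or later, std::chrono::steady_clock provides a portable wrapper over the same monotonic source. A hedged equivalent of the timing loop above might look like this:

#include <chrono>
#include <iostream>

int main()
{
    auto start = std::chrono::steady_clock::now();

    volatile long long temp = 0;  // volatile so the compiler cannot drop the busy loop
    for (int i = 0; i < 242000000; i++)
        temp = temp + 1;

    auto end = std::chrono::steady_clock::now();
    auto ns = std::chrono::duration_cast<std::chrono::nanoseconds>(end - start);
    std::cout << ns.count() << " ns\n";
    return 0;
}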
