OpenGL: render time limit on linux


Question

I'm implementing a computation algorithm with OpenGL and Qt. All computations are executed in a fragment shader.

Sometimes, when I try to execute a heavy computation (one that takes more than 5 seconds on the GPU), OpenGL aborts the computation before it finishes. I suppose this is something like TDR on Windows.

I think I should split the input data into several parts, but I need to know how long a computation is allowed to run.
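Splitting the work into chunks can be sketched like this (a hypothetical helper, not from the original post; only the strip arithmetic is shown — in the real renderer each `(y, rows)` pair would become one `glScissor`/`glViewport` region drawn as its own short draw call, with `glFinish()` between submissions so each one stays well under the watchdog limit):

```python
def split_into_strips(height, max_rows):
    """Split a 0..height row range into strips of at most max_rows rows.

    Each strip is meant to be rendered as a separate draw call, keeping
    every individual GPU submission short enough for the driver watchdog.
    """
    strips = []
    y = 0
    while y < height:
        rows = min(max_rows, height - y)  # last strip may be shorter
        strips.append((y, rows))
        y += rows
    return strips

# A 1080-row render target split into strips of at most 256 rows:
print(split_into_strips(1080, 256))
# → [(0, 256), (256, 256), (512, 256), (768, 256), (1024, 56)]
```

Since the actual time limit is unknown (see the answer below), `max_rows` would have to be tuned empirically, e.g. halved whenever a chunk still gets killed.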

How can I obtain the render time limit on Linux (a cross-platform solution would be even better)?

Answer

I'm afraid this is not possible. After a lot of scouring through the documentation of both X and Wayland, I could not find anything mentioning GPU watchdog timer settings, so I believe this is driver-specific and likely inaccessible to the user (that, or I am terrible at searching).

It is, however, possible to disable this watchdog under X on NVIDIA hardware by adding a line to your xorg.conf, which is then passed on to the graphics driver.

Option "Interactive" "boolean"

This option controls the behavior of the driver's watchdog, which attempts to detect and terminate GPU programs that get stuck, in order to ensure that the GPU remains available for other processes. GPU compute applications, however, often have long-running GPU programs, and killing them would be undesirable. If you are using GPU compute applications and they are getting prematurely terminated, try turning this option off.
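For example, the option goes into the `Device` section of xorg.conf that binds the NVIDIA driver (the `Identifier` below is illustrative; your section may differ):

```
Section "Device"
    Identifier "nvidia-gpu"
    Driver     "nvidia"
    Option     "Interactive" "false"
EndSection
```

Note that disabling the watchdog means a genuinely stuck shader can hang the GPU for all processes, so this is best reserved for dedicated compute setups.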

Note that even the NVIDIA docs don't mention a numeric quantity for the timeout.

