
>> No.8227854
File: 51 KB, 863x681, render queue.png

Regarding latency...

Taken from the vsync patch's .ini:
[quote][Option]
Vsync = 0
SleepType = 1
BltPrepareTime = 0
AutoBltPrepareTime = 1
AllowShortDelay = 0
GameFPS = 60
ReplaySkipFPS = 240
ReplaySlowFPS = 30
LockBackBuffer = 0
D3DMultiThread = 0
HookDirectInput = 0
[/quote]
Can someone explain the theory behind this patch? Why is it needed? Assuming that input is read at the same rate as frames are written to the screen, if your FPS is 60 I don't see how there could be more lag than 1 frame (16.6 ms).
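To spell out the model I'm assuming (this is NOT the actual game code, just how I imagine the loop works):
[code]
// Not the actual game loop, just the model I'm assuming: input is read once
// per frame, right before that frame is rendered and shown, so at 60 FPS the
// worst case from key press to screen should be about 1/60 s = 16.6 ms.
#include <cstdio>

void ReadInput()        { /* poll the keyboard here */ }
void UpdateGame()       { /* game logic uses this frame's input */ }
void RenderAndPresent() { /* draw and show the frame, ~16.6 ms per loop at 60 FPS */ }

int main()
{
    for (int frame = 0; frame < 3; ++frame)
    {
        ReadInput();
        UpdateGame();
        RenderAndPresent();
    }
    std::printf("assumed worst-case latency: %.1f ms\n", 1000.0 / 60.0);
    return 0;
}
[/code]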

I think people are ignoring a certain other setting that may affect your latency much more.
If my logic doesn't fail, the render queue (NVIDIA: "maximum pre-rendered frames", ATI: "flip queue") should generate much more input latency. If you don't change the render queue setting, the default is 3, so it should cause input latency of 3 frames (50 ms).
http://www.tweakguides.com/Oblivion_13.html
The render queue is sometimes (somewhat erroneously) called triple buffering
http://en.wikipedia.org/wiki/Triple_buffering#Triple_buffering (Touhou games are DirectX games),
but proper triple buffering doesn't have this kind of problem.
Have I misunderstood something? Is the render queue exactly what the vsync patch tries to correct? At least I don't think so.
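Rough numbers for what I mean, assuming each queued frame adds one frame of delay (that's my understanding of the queue, not something I've measured):
[code]
// Back-of-the-envelope arithmetic behind the 50 ms figure above, assuming
// each pre-rendered frame in the queue adds one frame of delay between the
// CPU reading input and that frame reaching the screen.
#include <cstdio>

int main()
{
    const double frame_time_ms = 1000.0 / 60.0; // 16.6 ms per frame at 60 FPS

    for (int queue_depth = 1; queue_depth <= 3; ++queue_depth)
    {
        std::printf("queue depth %d -> about %.1f ms of input latency\n",
                    queue_depth, queue_depth * frame_time_ms);
    }
    return 0;
}
[/code]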

Also, there may or may not be a way for a game programmer to manually disable or change the value of the render queue. However, if I had to make a guess I'd say there isn't, since, for example, the NVIDIA control panel's "Maximum pre-rendered frames" setting doesn't have an "Application-controlled" option. Is anyone here experienced enough with DirectX programming to answer this last question?
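For what it's worth, the only trick I've seen described for plain D3D9 is stalling after Present() with an event query so the driver can't run frames ahead; D3D9Ex apparently also has SetMaximumFrameLatency(). No idea whether the vsync patch actually does either of these, so treat this as a guess:
[code]
// A guess at how an application could keep the driver from queueing frames
// in plain D3D9: after Present(), issue an event query and spin until the
// GPU has finished, so at most one frame is ever in flight. I don't know if
// the vsync patch works this way; it's just the technique I've seen described.
// "device" is assumed to be a valid IDirect3DDevice9* created elsewhere.
#include <windows.h>
#include <d3d9.h>

void PresentAndDrainQueue(IDirect3DDevice9* device)
{
    device->Present(NULL, NULL, NULL, NULL);

    IDirect3DQuery9* query = NULL;
    if (SUCCEEDED(device->CreateQuery(D3DQUERYTYPE_EVENT, &query)))
    {
        query->Issue(D3DISSUE_END);
        // GetData() returns S_FALSE until the GPU has reached the query;
        // busy-wait so everything submitted so far (including the Present)
        // is finished before the next frame starts.
        while (query->GetData(NULL, 0, D3DGETDATA_FLUSH) == S_FALSE)
            ;
        query->Release();
    }
}
[/code]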
