Input delay experiment
I was under the assumption that su_GetSDLInput was called at a regular time interval. So, since the timer didn't work, I just put a counter into the function and figured that would be good enough.
My code works fine, as my inputs are in fact delayed, but the delay seems to be a bit inconsistent. I can detect a large variance in how much the inputs are delayed. This makes me think the function is not called at a regular time interval; so using Time(), if enabled, wouldn't work reliably either.
Is inside su_GetSDLInput the wrong place to pull my delayed turns out of the Q?
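For reference, the counter-based approach described above can be sketched roughly like this (all names here are hypothetical stand-ins, not Armagetron's actual API). The problem is built in: if the input function isn't called at a regular interval, a fixed number of calls translates to a variable amount of real time.

```cpp
#include <queue>

// Hypothetical turn event; in the real game this would be an SDL event
// or a cycle turn command.
struct Turn { int direction; };

// Counter-based delay queue: a turn is released after the input
// function has been called a fixed number of times, NOT after a
// fixed amount of real time.
class CounterDelayQueue {
public:
    explicit CounterDelayQueue(int callsToWait) : callsToWait_(callsToWait) {}

    void push(const Turn& t, int currentCall) {
        queue_.push({t, currentCall + callsToWait_});
    }

    // Called once per invocation of the input function.
    // Returns true and fills 'out' if a delayed turn is due.
    bool poll(int currentCall, Turn& out) {
        if (!queue_.empty() && currentCall >= queue_.front().releaseCall) {
            out = queue_.front().turn;
            queue_.pop();
            return true;
        }
        return false;
    }

private:
    struct Entry { Turn turn; int releaseCall; };
    std::queue<Entry> queue_;
    int callsToWait_;
};
```

Since the function is called in a loop until the event queue is empty, the call counter advances at a rate that depends on how many events happen to be pending, which would produce exactly the inconsistent delay observed.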
It's probably a bad idea after all to use that Timer() function there. Use tSysTimeFloat() instead (warning: it returns a double). Time() is the current game time and is reset to zero at the beginning of each round.
Edit: the function is the perfect place to delay all input. It's indeed called more often at times; specifically, it's called in loops as long as the return value is true to empty the event queue completely (don't want to process just one event per frame).
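A time-based version keys the release off a real-time clock instead of a call counter, so the delay stays constant no matter how irregularly the input function is called. In this sketch the `now` parameter stands in for tSysTimeFloat() (which, per the above, returns seconds as a double); the queue structure itself is hypothetical:

```cpp
#include <queue>

struct Turn { int direction; };

// Time-based delay queue: turns are released a fixed number of
// seconds after they were received, regardless of how often the
// input function happens to be called. In the game, 'now' would
// come from tSysTimeFloat(); any monotonic seconds-as-double
// clock works.
class TimeDelayQueue {
public:
    explicit TimeDelayQueue(double delaySeconds) : delay_(delaySeconds) {}

    void push(const Turn& t, double now) {
        queue_.push({t, now + delay_});
    }

    // Returns true and fills 'out' if the front turn's delay has elapsed.
    bool poll(double now, Turn& out) {
        if (!queue_.empty() && now >= queue_.front().releaseTime) {
            out = queue_.front().turn;
            queue_.pop();
            return true;
        }
        return false;
    }

private:
    struct Entry { Turn turn; double releaseTime; };
    std::queue<Entry> queue_;
    double delay_;  // seconds
};
```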
- Jonathan
- A Brave Victim
- Posts: 3391
- Joined: Thu Feb 03, 2005 12:50 am
- Location: Not really lurking anymore
gss wrote: That's pretty much what I want to drill down on. I'm guessing 100 ms is reasonable and 200 ms is unreasonable. The big question is: does everybody feel the same way, or does the line between reasonable and unreasonable move around for different observers... and if so, by how much?

It really depends on what you're doing, but if you're pushing it with fast games, 100 ms is a pretty long delay. I found that even the response-time difference between two LCDs, both of which respond in well under 100 ms, makes a noticeable difference in my play. When I switched to the faster LCD after a while on the slower one, things suddenly 'just worked', which explains why I felt the opposite when playing on the slower one. This is reaction time; I think timing isn't affected as much.
ˌɑrməˈɡɛˌtrɑn
Jonathan wrote: ... if you're pushing it with fast games 100 ms is a pretty long delay ...

My initial tests agree with you. My cycle starts to feel sticky at around 30-35 ms of forced input delay.
Questions I need to find answers to:
1) What is the max latency between hitting a key and that keypress registering as an event in the SDL functions? (I'm using Linux.)
2) How long does it take between dequeuing the turn from my delay queue and getting a rendered frame out of my graphics card?
Adding those numbers to the frame latency through my LCD should give me my worst-case input latency.
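For question 2, one rough software-side measurement is to timestamp the turn as it leaves the delay queue and again after the frame has actually been presented. Below is a minimal sketch with std::chrono; in the real renderer the callback would be the draw calls plus the buffer swap, ideally followed by a glFinish() so the second timestamp isn't taken while the GPU still has frames queued. The function name and structure are illustrative, not from the game's code.

```cpp
#include <chrono>
#include <thread>

// Measure the elapsed time from dequeuing an input to the end of
// frame presentation. renderFrame is a stand-in for the real
// draw + swap (+ glFinish) sequence.
template <typename RenderFn>
double measureDequeueToFrameMs(RenderFn renderFrame) {
    auto start = std::chrono::steady_clock::now();  // turn leaves the queue here
    renderFrame();                                  // draw, swap buffers, glFinish()
    auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}
```

A dummy "renderer" that just sleeps shows the mechanics: `measureDequeueToFrameMs([]{ std::this_thread::sleep_for(std::chrono::milliseconds(5)); })` should report at least 5 ms.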
- Jonathan
gss wrote: 1) What is the max latency between hitting a key and registering that as an event in the SDL functions? (I'm using Linux.)

I have noticed that that part can really mess up on Mac OS X. Not just SDL, but any input between the hardware and the final software. More load, lower framerates, and more simultaneously 'active' inputs all contribute to it. Turning keyboard repeat off helps a bit when keys are pressed. Both latency and the time between updates increase. I suspect this only happens on Mac OS X, and it's one of the main reasons I don't see it as a gaming platform. I'd be interested in confirmation that other platforms don't suffer from this, or that I'm not the only one to notice this problem.
gss wrote: 2) How long does it take between deque'ing the turn from my delay Q to getting a rendered frame out of my graphics card.

Setting the swap mode to "finish" should prevent the GPU from queueing multiple frames; that trades faster framerates for slightly lower latency (older versions without the setting always act as if it were set to "finish"). In that case the delay shouldn't exceed one to two frames.
- Jonathan
I found a good example of what some TFT displays apparently do (called "input lag"?): http://www.youtube.com/watch?v=pi2OE6hSh00. You might want to take that into account if you try to get absolute numbers.
Yeah, it's an obscure marketing trick. What counts in marketing is the time it takes for the pixels to switch color. For a while, "overdrive" technology has been around: to achieve faster switches, more voltage is applied to the display than would be required to reach the new color, giving the crystals an extra "kick" to get them moving. Apparently, to get even shorter switch times, some manufacturers now delay the displayed image by one or two frames so they can shape the voltage signal in an optimal way. Of course, you get one or two frames of extra delay, which isn't bad for video playback, but very bad for games. Buying a new monitor has gotten more complicated again.
The display brightness also has a surprising effect on the felt latency; our eyes process brighter signals faster, so a dim display will add "eye lag". I experienced this firsthand just yesterday when I dug out my old CRT monitor and hooked it up, hoping for a better game experience. Unfortunately, it has gotten rather dim over the years, not to mention the other defects and quirks of 12+ year old monitors.
I am taking this into account. My LCD monitors (Dell 2001FP) seem to be on the slow end, if random research pages on the net can be trusted. Supposedly the processing lag is around 55-76 ms. Other LCDs are much faster, some might be a bit slower. CRTs are much faster.
To z-man's point, the advertised input latency of the Dell 2001FP is 16 ms. I guess 'input latency' is the term with the loose definition.
- Jonathan
z-man wrote: Of course, you'll get one or two frames of extra delay, which isn't bad for video playback, but very bad for games.

Until you add uncorrected audio to the video. Leading audio supposedly increases tension, and I can imagine some people would notice the timing difference itself.
z-man wrote: The display brightness also has a surprising effect on the felt latency; our eyes process brighter signals faster, so a dim display will add "eye lag".

I notice that when I dim my backlight a lot in the dark, with some bright things like status LEDs and street lights present (pupil dilation should be enough to counter it if it gets a chance).
- Jonathan
To add some input: I noticed that games felt rather sluggish in Mac PCSX but could adjust to it without too much trouble. When I measured the keyboard input → video output delay it turned out to be on the order of 1/5 second!
- Code hunter
- Average Program
- Posts: 51
- Joined: Sat Dec 23, 2006 3:15 am