Input delay experiment
I have a project for which I need to research the effect of input delay. As an experiment, I'd like to add a mechanism into armagetronad whereby I can force a programmable delay (in milliseconds) between when a turn key is pressed and when the screen shows the bike moving in the new direction.
I've pulled and compiled the 0.2.8 branch and will experiment with that. Being a total noob to this code, I was hoping somebody could suggest a way to do this.
- gss
Interesting. A simple, but imperfect, way is to abuse the mechanics of CYCLE_DELAY: where user input is processed, you can make the cycle pretend it has done a turn a short time ago, so the next turn will be delayed. The downside is that consecutive turns can't be delayed further, so if the desired input delay is larger than CYCLE_DELAY, some turns are executed too early. Anyway, here's the code for that:
It goes into gCycle.cpp; you need to modify the start of gCycle::Act to look like this. You'll get a new configuration variable called CYCLE_INPUT_DELAY that controls the delay.
Code: Select all
REAL sg_inputDelay = 0;
static tSettingItem<REAL> c_st("CYCLE_INPUT_DELAY",
                               sg_inputDelay);

bool gCycle::Act(uActionPlayer *Act, REAL x){
    // don't accept premature input
    if (se_mainGameTimer && ( se_mainGameTimer->speed <= 0 || se_mainGameTimer->Time() < -1 ) )
        return false;

    // crude input delay: pretend the last turn happened later than it
    // really did, so the next turn gets held back by sg_inputDelay
    if ( pendingTurns.size() == 0 )
    {
        REAL lastTurnTime = se_GameTime() - sg_delayCycle + sg_inputDelay;
        if ( lastTurnTimeRight_ < lastTurnTime )
            lastTurnTimeRight_ = lastTurnTime;
        if ( lastTurnTimeLeft_ < lastTurnTime )
            lastTurnTimeLeft_ = lastTurnTime;
    }
    // ... rest of the original gCycle::Act body follows unchanged
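With that patch applied, the new setting should be controllable like any other CYCLE_* setting, e.g. from the console or a config file. Since sg_inputDelay is added to se_GameTime(), the value is presumably in game-time seconds rather than milliseconds (the example value below is hypothetical):

```
CYCLE_INPUT_DELAY 0.15
```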
- Tank Program
- Forum & Project Admin, PhD
This might be of some help maybe:
http://www.lfsforum.net/showthread.php? ... post319451
It's about input lag in racing sims
That's interesting, too, but a different subject: Gss wants to purposefully introduce additional input lag, while that other poster measures one form of natural input lag.
There were a couple of games I didn't play because they had such big amounts of input lag. Most notable here: Tron 2.0. With my current PC, the lag is gone, but back when it was new and fresh and you actually had a chance to meet other players online, it was unbearable unless I turned down the resolution. The worst one was Aquanox; upgrading my PC didn't help there. Bummer. In both cases, I'm talking about 0.1 to 0.2 seconds of lag. I tend to blame it on Direct3D's lack of a glFinish-like function.
z-man wrote: In both cases, I'm talking about 0.1 to 0.2 seconds of lag. I tend to blame it on Direct3D's lack of a glFinish-like function.
That's pretty much what I want to drill down on. I'm guessing 100 ms is reasonable and 200 ms is unreasonable. The big question is: does everybody feel the same way, or does the line between reasonable and unreasonable move around for different observers... and if so, by how much?
Tank wrote:I wonder if some sort of temporary turn input queue would be practical. That way when you press a key it's put in the queue, with a marker for the time it was, and it's only read off the queue after your x amount of time has passed.
z-man wrote: Yeah, the perfect way would be to modify su_GetSDLInput() to keep a queue of input events and delay them. The function already conveniently records the time an event came in, you only need to put them into a std::deque< std::pair< SDL_Event, REAL > > and pull them out as desired.
This is exactly the methodology I want to use. Now that you've mentioned the place to insert the queue, what's eluding me is the best way to service the queue. Ideally I'd implement some sort of timer programmed to pop every x milliseconds, but this seems a bit complicated. It looks like there is already a timer used to get the SDL inputs. Can you give me some hints on how I can use it to dequeue?
- gss
- Tank Program
OK, I think I understand how to do this now. For times when su_GetSDLInput would have returned an event, it should now just queue that event. Whenever the function is entered and the queue is non-empty, it should only return the next event from the queue if that event happened more than x milliseconds previously, where x is the number of milliseconds of input lag to model.
Correct?
- Tank Program