Just wondering why the FPS isn't above 100 or 200 in this game...
An excerpt from a chat on #armagetron (EFnet/Freenode/QuakeNet):
<DRaTRePvS> im gonna ask stupid questions now
<DRaTRePvS> does atron use direct-x ?
<DRaTRePvS> or is it like all openGL
<armrelay> <guru3-efnet> uses sdl
<armrelay> <guru3-efnet> so whatever sdl uses
<DRaTRePvS> sdl? i thot that was an acryonym for software deveolpers licence
<armrelay> <guru3-efnet> no
<armrelay> <ElronMacB-efnet> lol
<armrelay> <guru3-efnet> http://www.libsdl.org
<DRaTRePvS> somethign dada & lighting?
<armrelay> <ElronMacB-efnet> simple directmedia layer
<DRaTRePvS> o
<armrelay> <guru3-efnet> lol
<DRaTRePvS> directmedia = directx
<armrelay> <ElronMacB-efnet> nopes
* DRaTRePvS wonders why he doesnt get 100+ fps
<armrelay> <ElronMacB-efnet> directmedia != directx
<armrelay> <guru3-efnet> win/lin?
<DRaTRePvS> is v-sync untogglable in atron & turned on by default?
<armrelay> <guru3-efnet> i never figured out why it was different on windows
<armrelay> <guru3-efnet> i suspect sdl is behind it tho
<armrelay> <ElronMacB-efnet> hmmm
<DRaTRePvS> might have something to do with vsync
<armrelay> <ElronMacB-efnet> no
<DRaTRePvS> if vsync is on, on any 3d game, you wont get higher fps than your refresh rate
<armrelay> <ElronMacB-efnet> must be opengl
<armrelay> <ElronMacB-efnet> that makes the differenxe
<armrelay> <ElronMacB-efnet> difference
<DRaTRePvS> i play some games that display ALOT more polygons, in alot higher FPS
<DRaTRePvS> does the atron engine self-cull polygons that arent in camera view?
<DRaTRePvS> locally
<armrelay> <ElronMacB-efnet> guru3: sdl doesn't make the difference between win and lin, q3 for exapmle runs way better in linux too and it's not using sdl
<armrelay> <guru3-efnet> mm hmm
<armrelay> <guru3-efnet> well i dunno
<DRaTRePvS> like, its not trying to display stuff behind the bike, that is away from current camera view, is it?
<armrelay> <guru3-efnet> and supertard, i have no clue
<DRaTRePvS> it shoudlnt even calculate polygons that are off-camera
<DRaTRePvS> mebbe thats why its slow
<DRaTRePvS> something seems like holding it back
<DRaTRePvS> keepign it slow
<armrelay> <ElronMacB-efnet> it probably does render everything at once...
<armrelay> <guru3-efnet> who knows
<DRaTRePvS> yeh i get the feeling it uses no culling tricks
<armrelay> <guru3-efnet> well, z-man probably does
<armrelay> <ElronMacB-efnet> why isn't he in here?
<DRaTRePvS> and linux can just ... dunno calculate all that sh!t faster
<DRaTRePvS> yes, he really shoudl be in here..
<armrelay> <ElronMacB-efnet> yup
<DRaTRePvS> all the dev should
<armrelay> <guru3-efnet> he has been at least once
<DRaTRePvS> thigns would move faster & with better communication
<DRaTRePvS> IMO
<armrelay> <ElronMacB-efnet> yeah, seen 'im
<DRaTRePvS> im gonna ask in guru3 about polygon culling behidn the camera
<armrelay> <guru3-efnet> u try that
<DRaTRePvS> & mebbe suggest more devs hang out on #armagetron
Do polygons behind a local camera cull themselves?
Yeah, like, we haven't got anything better to do than hang out in a chat room. If we were all there all the time, nothing would get done. And have you heard of that thing called ... sleep? Or work? Some people have strange concepts of reality.
On the topic at hand: it is paranoid to think OGL has any different performance than DirectX, and SDL is used basically only for input handling, sound and initialization stuff. AA runs crappy because the rendering code is crappy: it does not use vertex buffers (it uses display lists, which were the recommended way back when I started this, but only on request, since several cards didn't even support those). It does not sort the things it renders by the required render state (the texture used). It just blows out every polygon in the scene and lets OGL handle the culling. The worst sin, however, is probably that it uses LINES to draw the sparks and explosions; since nobody else uses them, they are SLOW on current cards. And I think there may be a bug that causes massive multiple renderings of the same wall in some situations.
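For the curious: the culling the chat asks about is normally done per object on the CPU, before any polygons are submitted to the card. This is not Armagetron code, just a minimal sketch of the idea under simple assumptions (objects with bounding spheres, a normalized view direction): skip an object entirely if its sphere lies behind the camera plane.

```cpp
struct Vec3 { float x, y, z; };

static float dot(const Vec3 &a, const Vec3 &b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Returns true if a bounding sphere (center, radius) lies entirely
// behind the camera, i.e. behind the plane through camPos whose
// normal is viewDir (viewDir assumed normalized). Such objects can
// be skipped before any polygons are sent to OpenGL.
bool behindCamera(const Vec3 &camPos, const Vec3 &viewDir,
                  const Vec3 &center, float radius) {
    Vec3 toObj = { center.x - camPos.x,
                   center.y - camPos.y,
                   center.z - camPos.z };
    // Signed distance of the sphere center along the view direction.
    float d = dot(toObj, viewDir);
    return d < -radius;  // the whole sphere is behind the camera plane
}
```

A full view-frustum cull tests the sphere against all six frustum planes the same way; this single-plane test is just the "behind the camera" part the chat was wondering about.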
Nevertheless, with sparks disabled, rendering speed is fillrate-limited for me in every configuration. All the sins, except for the bug, lower geometry throughput, not fillrate, so I figure it's not that bad at all. It's just not high tech. Hey, it's five years old.
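One of the sins listed above, not sorting draws by render state, is cheap to illustrate: if draw calls are grouped by texture, the driver has to switch textures far less often. This is a hypothetical sketch, not the real engine code; `DrawCall`, `countTextureBinds` and `sortByTexture` are invented names.

```cpp
#include <algorithm>
#include <vector>

// Hypothetical draw request: which texture it needs and which mesh to draw.
struct DrawCall {
    int textureId;
    int meshId;
};

// Count how many texture binds a sequence of draw calls would cost:
// a bind happens whenever the texture differs from the previous call's.
int countTextureBinds(const std::vector<DrawCall> &calls) {
    int binds = 0, current = -1;
    for (const DrawCall &c : calls) {
        if (c.textureId != current) { ++binds; current = c.textureId; }
    }
    return binds;
}

// Sorting by texture id minimizes binds without changing what is drawn,
// valid as long as draw order doesn't matter (e.g. opaque geometry with
// depth testing enabled; transparent stuff still needs back-to-front order).
void sortByTexture(std::vector<DrawCall> &calls) {
    std::stable_sort(calls.begin(), calls.end(),
                     [](const DrawCall &a, const DrawCall &b) {
                         return a.textureId < b.textureId;
                     });
}
```

As z-man notes, this mostly helps geometry/driver throughput; it does nothing for a fillrate-limited scene.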
AA uses the VSync setting from the operating system, since there's no cross-platform way to set it.
If someone here is a rendering guru, I'd be glad to hand over the rendering code. It should be abstracted away first, of course, since ATM, rendering and game code are quite spaghetti-like...
- Jonathan
- A Brave Victim
- Posts: 3391
- Joined: Thu Feb 03, 2005 12:50 am
- Location: Not really lurking anymore
That chat is total chaos.
z-man wrote: AA uses the VSync setting from the operating system since there's no crossplatform way to set it.
Should be either this or a config option. The user should be able to choose between better grinding (or change that?) and much less annoying/confusing rendering...
z-man wrote: If someone here is a rendering guru, I'd be glad to hand over the rendering code. It should be abstracted away first, of course, since ATM, rendering and game code are quite spaghetti-like...
Too much spaghetti for me at the moment, so I don't know how everything fits together, and I'm not a rendering guru, but I'll think about possible speedups.
Thanks for the reply, that explains a lot...
BTW, has anyone noticed that if a player leaves the game right as a round starts, their explosion causes a graphics "hiccup": a nice long pause, and everyone's camera jumps momentarily towards the person who exploded?
I'm pretty sure the FPS takes a mighty hit when this happens too, but then it fixes itself quickly and the round starts...
blah