[Gambas-user] sdl Draw event overhead is killing frame rate

Kevin Fishburne kevinfishburne at ...1887...
Thu Jan 16 06:03:42 CET 2014


On 01/15/2014 10:07 AM, Benoît Minisini wrote:
> On 15/01/2014 06:17, Kevin Fishburne wrote:
>> On 01/14/2014 10:19 PM, Benoît Minisini wrote:
>>> On 15/01/2014 03:33, Kevin Fishburne wrote:
>>>> I noticed a month or two ago that the frame rate on the server side of
>>>> my game went from around 300 to 50. I assumed it was an OpenGL driver
>>>> issue or that I'd added some really inefficient code. It appears this is
>>>> not the case, but rather it seems the Draw event raised by opening a
>>>> screen using SDL is executing some heavy (and hopefully unnecessary)
>>>> lifting that it wasn't previously.
>>>>
>>>> I've attached a project which is essentially an empty SDL screen draw
>>>> event loop that calculates a frame rate once per second and outputs the
>>>> results to the console. Any idea what's going on here? Any empty loop on
>>>> my reasonably modern PC and brand new video card running the binary
>>>> NVIDIA drivers should be producing frame rates in the thousands, at least.
>>>>
>>> You must know that the event loop of the SDL library cannot be shared,
>>> so you get instead a busy waiting loop that repeatedly checks SDL events
>>> and calls one loop of the Gambas event loop with a 10 ms timeout.
>>>
>>> Each time that loop is run, the "Draw" event is raised. Unless you call
>>> "Stop Event" during the Draw event handler, gb.sdl assumes that
>>> something has been drawn, and so refreshes the window.
>>>
>>> Refreshing a window takes a lot of CPU, depending on its size and what
>>> your window manager is.
>>>
>>> If the window is refreshed, your framerate may be limited by the GPU
>>> (through the monitor vsync, which is now emulated on LCD monitors),
>>> check that.
>>>
>>> If you call Stop Event, the Draw event will be raised again and again
>>> without refreshing, so most of the CPU time will be spent there.
>>> Note that a small amount of CPU time is given back to the OS, so that
>>> not 100% of the CPU is used.
>>>
>>> Just check the CPU usage and the framerate in the different cases (small
>>> window, big window, with a 3D window manager, without...), and report!
>>>
>> That is interesting. I'm trying to wrap my head around it. My question
>> at this point is whether something changed on my system or in Gambas. I've
>> tried this on two different PCs with two different video cards (with and
>> without the binary NVIDIA driver and with vsync and desktop compositing
>> disabled), and even ran the server code from several months ago with the
>> same effect.
>>
>> I placed a "Stop Event" after the "Inc FPS_Frames" statement in the
>> Screen_Draw procedure in the test project and it actually reduced the
>> frame rate from 236 to 157. That's the opposite of what I would have
>> expected based on your description of what is happening.
>>
>> I think I understand the logic. Basically, if stuff doesn't need to be
>> drawn on screen, call "Stop Event" at the beginning of the Screen_Draw
>> procedure (the SDL draw event) to avoid its unneeded overhead. It
>> doesn't work as expected unfortunately (unless I'm too thick to
>> understand what you're saying of course!).
>>
>> I just installed the mesa-utils package and ran glxgears, which reports
>> about 677 FPS at approximately 1920x1080. Running GambasGears at
>> resolutions from 128x128 to 1920x1080 produces a frame rate of 233 in
>> every case. Running it with everything but the frame rate calculation
>> removed from the Screen_Draw procedure produces a frame rate of 237
>> (similar to my test project), again tested at resolutions from 128x128
>> to 1920x1080. BeastScroll maxes out at 227 FPS at any resolution.
>>
>> In order to see if it's something in Gambas or on my system I decided to
>> try some native games using SDL and OpenGL. I installed OpenArena from
>> the repositories and it runs at between 800 and 1000 FPS at 640x480
>> and around 200-300 FPS at 1920x1080 with all settings on high. I
>> installed Nexuiz and on default settings at 1920x1080 get between 150
>> and 200 FPS, which is pretty damn good for Nexuiz.
>>
>> I think something funky is going on with gb.sdl at this point, as it
>> seems to affect any SDL/OpenGL application including the example
>> programs, while non-Gambas SDL/OpenGL applications achieve remarkable
>> frame rates even under heavy load.
>>
>> I don't need Gambas to change, but some sort of documented workaround
>> may be in order if I'm correct in my suspicions. All these tests were
>> performed using NVIDIA driver 304.108 with vsync disabled on a GeForce
>> GTX 650 with 1024 MB of VRAM in Kubuntu 13.04 ia64, latest daily Gambas
>> build.
>>
> My problem is that nothing has changed for a long time in the way gb.sdl
> manages things...
>
> Which version of Gambas do you use?
>
> Can you make me a little project with something actually drawn, and give
> me your stats both when you call Stop Event and when you don't?
>
> Can you try replacing the NVIDIA driver with Nouveau on your machine?
>
> Thanks!
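
Before the numbers, here's my reading of what the gb.sdl loop does, 
based on your description above (a simplified sketch in pseudo-Gambas, 
not the actual gb.sdl source):

' Busy-waiting loop, as I understand it:
Do
  ' 1. Poll any pending SDL events.
  ' 2. Run one iteration of the Gambas event loop with a 10 ms timeout.
  ' 3. Raise the Draw event on the SDL window.
  ' 4. Unless the handler called "Stop Event", refresh the window.
Loop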

The results are in, and they don't tell me much. :( I attached the 
updated test project, which includes the test results in comments at the 
top. They are:

' No Stop Event | No Geometry | Binary NVIDIA driver | 236 FPS
' Stop Event    | No Geometry | Binary NVIDIA driver | 158 FPS
' No Stop Event | No Geometry | Nouveau driver       | 238 FPS
' Stop Event    | No Geometry | Nouveau driver       | 159 FPS
' No Stop Event | Geometry    | Binary NVIDIA driver | 236 FPS
' Stop Event    | Geometry    | Binary NVIDIA driver | 155 FPS
' No Stop Event | Geometry    | Nouveau driver       | 218 FPS
' Stop Event    | Geometry    | Nouveau driver       | 161 FPS
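
For reference, the draw handler in the test project is essentially the 
following (a minimal sketch; the exact variable names in the attached 
project may differ, the drawing code for the "Geometry" rows is omitted, 
and the Stop Event line is what I toggled between the runs above):

Public FPS_Frames As Integer
Public FPS_Time As Float

Public Sub Screen_Draw()

  Inc FPS_Frames
  If Timer - FPS_Time >= 1 Then
    Print FPS_Frames; " FPS"
    FPS_Frames = 0
    FPS_Time = Timer
  Endif
  Stop Event ' Present for the "Stop Event" rows, removed for the others.

End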

Adding "Stop Event" reduces thread CPU usage from 100% to 84%. The frame 
rate is consistently lower when using Stop Event. That 16% difference in 
CPU also doesn't seem to correlate with the difference in frame rates, 
as the frame rate difference has a significantly larger discrepancy. 
Factors unknown to me could be contributing to that of course...

Perhaps this has nothing to do with SDL, but with events in general? I 
just modified the project to use a Do...Loop containing only the FPS 
calculation, and it executes over 6.6 million iterations per second. 
What's a quick way to create an event that fires as fast as possible? 
I'd like to measure its rate to see how it compares to SDL's Draw event 
and the Do...Loop; then we'd know whether the limit applies to all 
events or to SDL's Draw event in particular.
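
The Do...Loop version is roughly just this (a minimal sketch of what I 
mean; it never returns to the event loop):

Public Sub Main()

  Dim FPS_Frames As Integer
  Dim FPS_Time As Float

  FPS_Time = Timer
  Do
    Inc FPS_Frames
    If Timer - FPS_Time >= 1 Then
      Print FPS_Frames; " loops per second"
      FPS_Frames = 0
      FPS_Time = Timer
    Endif
  Loop

End

If a Timer object with a very small Delay is the quickest way to get an 
event firing as often as the event loop allows, that would make a good 
comparison point, though I'm not sure what the minimum usable Delay is.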

Also (and this needs a new thread), I just discovered that if you 
comment out the ".Show()" line, signal 11 is raised upon executing the 
"Glu.Build2DMipmaps(TextureImage)" line.

-- 
Kevin Fishburne
Eight Virtues
www: http://sales.eightvirtues.com
e-mail: sales at ...1887...
phone: (770) 853-6271

-------------- next part --------------
A non-text attachment was scrubbed...
Name: SDL_Draw_Event_Test.tar.gz
Type: application/gzip
Size: 41211 bytes
Desc: not available
URL: <http://lists.gambas-basic.org/pipermail/user/attachments/20140116/f1e7498a/attachment.gz>

