[Gambas-user] sdl Draw event overhead is killing frame rate
Kevin Fishburne
kevinfishburne at ...1887...
Wed Jan 15 06:17:09 CET 2014
On 01/14/2014 10:19 PM, Benoît Minisini wrote:
> On 15/01/2014 03:33, Kevin Fishburne wrote:
>> I noticed a month or two ago that the frame rate on the server side of
>> my game went from around 300 to 50. I assumed it was an OpenGL driver
>> issue or that I'd added some really inefficient code. It appears this is
>> not the case, but rather it seems the Draw event raised by opening a
>> screen using SDL is executing some heavy (and hopefully unnecessary)
>> lifting that it wasn't previously.
>>
>> I've attached a project which is essentially an empty SDL screen draw
>> event loop that calculates a frame rate once per second and outputs the
>> results to the console. Any idea what's going on here? Any empty loop on
>> my reasonably modern PC and brand new video card running the binary
>> NVIDIA drivers should be producing frame rates in the thousands, at least.
>>
> You must know that the event loop of the SDL library cannot be shared,
> so instead you get a busy-waiting loop that repeatedly checks SDL events
> and runs one iteration of the Gambas event loop with a 10 ms timeout.
>
> Each time that loop runs, the "Draw" event is raised. Unless you call
> "Stop Event" during the Draw event handler, gb.sdl assumes that
> something has been drawn, and so refreshes the window.
>
> Refreshing a window takes a lot of CPU, depending on its size and on
> which window manager you use.
>
> If the window is refreshed, your framerate may be limited by the GPU
> (through the monitor vsync, which is now emulated on LCD monitors);
> check that.
>
> If you call Stop Event, the Draw event will be raised again and again
> without refreshing, and so most of the CPU time will be spent there.
> Note that a little CPU time is then given back to the OS, so that the
> CPU is not 100% used.
>
> Just check the CPU usage and the framerate in the different cases (small
> window, big window, with a 3D window manager, without...), and report!
>
That is interesting. I'm trying to wrap my head around it. My question
at this point is whether something changed on my system or in Gambas.
I've tried this on two different PCs with two different video cards
(with and without the binary NVIDIA driver, and with vsync and desktop
compositing disabled), and even ran the server code from several months
ago with the same result.
I placed a "Stop Event" after the "Inc FPS_Frames" statement in the
Screen_Draw procedure in the test project and it actually reduced the
frame rate from 236 to 157. That's the opposite of what I would have
expected based on your description of what is happening.
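For clarity, the relevant part of the test project is roughly this (a sketch from memory; the Timer event name is an assumption):

```gambas
Public FPS_Frames As Integer

Public Sub Screen_Draw()
  Inc FPS_Frames
  Stop Event  ' the line I added for this test
End

Public Sub FPS_Timer()  ' a Timer firing once per second (Delay = 1000)
  Print "FPS: "; FPS_Frames
  FPS_Frames = 0
End
```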
I think I understand the logic. Basically, if nothing needs to be
drawn on screen, call "Stop Event" at the beginning of the Screen_Draw
procedure (the SDL draw event handler) to avoid its unneeded overhead.
Unfortunately it doesn't work as expected (unless I'm too thick to
understand what you're saying, of course!).
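In other words, I read your suggestion as something like this (a sketch; the NeedRedraw flag is hypothetical):

```gambas
Public Sub Screen_Draw()
  If Not NeedRedraw Then
    Stop Event  ' tell gb.sdl nothing was drawn, so the refresh is skipped
    Return
  Endif
  ' ...actual drawing here...
End
```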
I just installed the mesa-utils and ran glxgears, which is reporting
about 677 FPS at approximately 1920x1080. Running GambasGears at
resolutions between 128x128 and 1920x1080 produces a frame rate of
233. Running it with everything except the frame rate calculation
removed from the Screen_Draw procedure produces a frame rate of 237
(similar to my test project), again tested at resolutions between
128x128 and 1920x1080. BeastScroll's FPS maxes out at 227 at any
resolution.
To see whether it's something in Gambas or on my system, I decided to
try some native games using SDL and OpenGL. I installed OpenArena from
the repositories and it runs at between 800 and 1000 FPS at 640x480
and around 200-300 FPS at 1920x1080 with all settings on high. I
installed Nexuiz and on default settings at 1920x1080 I get between
150 and 200 FPS, which is pretty damn good for Nexuiz.
I think something funky is going on with gb.sdl at this point, as it
seems to affect any SDL/OpenGL application including the example
programs, while non-Gambas SDL/OpenGL applications achieve remarkable
frame rates even under heavy load.
I don't need Gambas to change, but some sort of documented workaround
may be in order if I'm correct in my suspicions. All these tests were
performed using NVIDIA driver 304.108 with vsync disabled on a GeForce
GTX 650 with 1024 MB of VRAM in Kubuntu 13.04 ia64, latest daily Gambas
build.
--
Kevin Fishburne
Eight Virtues
www: http://sales.eightvirtues.com
e-mail: sales at ...1887...
phone: (770) 853-6271