Timer Comparison

Started by bobjob, October 03, 2010, 12:32:02



Indeed, notice the DEBUG thing where I use Thread.sleep(1) instead of yield(). That's to stop Chaz's laptop in Spain overheating and shutting down. But in the released game yield() gives me an absolutely flat 60 fps; sleep(1) occasionally drops 5 frames at random for no explained reason.

Cas :)


Aye, same here, tho' my other laptop is not in Spain but downstairs.

My solution over Thread.yield() is... v-sync on, so far.


Have you tried mixing the two methods?
If the wait needed is large, use sleep(1); if it's small, use yield().
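A sketch of that hybrid wait (the class name and the 2 ms threshold are my own choices, not from this thread):

```java
// Hybrid wait: sleep(1) while plenty of time remains, then yield()
// for the final stretch where sleep's granularity is too coarse.
public final class HybridSleep {
    // Below this threshold, sleep(1) is likely to overshoot, so yield instead.
    private static final long SLEEP_THRESHOLD_NANOS = 2_000_000; // 2 ms (assumed)

    public static void waitUntil(long deadlineNanos) throws InterruptedException {
        long remaining;
        while ((remaining = deadlineNanos - System.nanoTime()) > 0) {
            if (remaining > SLEEP_THRESHOLD_NANOS) {
                Thread.sleep(1);   // coarse wait, frees the CPU
            } else {
                Thread.yield();    // fine-grained, burns a little CPU
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        long start = System.nanoTime();
        waitUntil(start + 16_666_667); // roughly one 60 fps frame
        System.out.printf("waited %.2f ms%n", (System.nanoTime() - start) / 1e6);
    }
}
```

The right threshold depends on the machine; the idea is just that sleep handles the bulk of the wait and yield handles the precision.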


Yep, it's probably the best trade-off.
As here: http://andy-malakov.blogspot.com/2010/06/alternative-to-threadsleep.html. It's about finding the "sleep precision".
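In the spirit of that post, a quick way to measure the actual sleep precision on a given machine (my own sketch, not code from the linked article):

```java
// Measure how long Thread.sleep(1) actually takes on this machine.
public final class SleepPrecision {
    public static void main(String[] args) throws InterruptedException {
        final int samples = 50;
        long worst = 0, total = 0;
        for (int i = 0; i < samples; i++) {
            long start = System.nanoTime();
            Thread.sleep(1);
            long took = System.nanoTime() - start;
            total += took;
            worst = Math.max(worst, took);
        }
        System.out.printf("avg %.2f ms, worst %.2f ms%n",
                total / (double) samples / 1e6, worst / 1e6);
    }
}
```

On a coarse-timer system the worst case can be far above 1 ms, which is exactly the overshoot being discussed here.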


I've been trying to get smooth timing for a little while, and finally got it working very smoothly (on Windows XP).

If the game loop is something like

<advance logic, render>
<sleep for 1/desired_fps minus however long the above took>

it will be fairly imprecise. I was experiencing very frequent jumps where a frame would take 2-3x longer than it was supposed to, simply because the sleep() call overshot by that much. So Thread.sleep(5) would return after 15+ ms.
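For reference, the naive loop described above looks roughly like this (a sketch; the 60 fps target and iteration count are just examples):

```java
// Naive fixed-rate loop: do the frame's work, then sleep off the remainder.
public final class NaiveLoop {
    public static void main(String[] args) throws InterruptedException {
        final long frameNanos = 1_000_000_000L / 60; // target 60 fps
        for (int frame = 0; frame < 10; frame++) {
            long start = System.nanoTime();
            // <advance logic, render> would go here
            long elapsed = System.nanoTime() - start;
            long sleepMs = (frameNanos - elapsed) / 1_000_000;
            if (sleepMs > 0) {
                Thread.sleep(sleepMs); // can overshoot badly on a coarse timer
            }
        }
        System.out.println("done");
    }
}
```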

The interesting part is that calling Thread.sleep(1) often greatly improves timer precision. Changing literally nothing else, just adding a new daemon thread that calls Thread.sleep(1) in an infinite loop, with no other interaction with the rest of the code, completely eliminated any jerkiness in the main game loop.

Weird, I know.  What's even more interesting is that this will help even if it's done in a separate process, not just a separate thread.
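A sketch of that daemon-thread hack (class and thread names are mine). One documented explanation is that on Windows the HotSpot VM raises the system-wide timer resolution, via timeBeginPeriod, while a Thread.sleep whose millisecond argument isn't a multiple of 10 is pending, which would make every sleep in the process more accurate:

```java
// Background daemon thread whose only job is to keep a "non-round" sleep
// pending, which on Windows keeps the OS timer at ~1 ms resolution.
public final class TimerResolutionHack {
    public static void install() {
        Thread t = new Thread(() -> {
            try {
                while (true) {
                    // Integer.MAX_VALUE is not a multiple of 10 ms, so on
                    // Windows HotSpot keeps the high-resolution timer active.
                    Thread.sleep(Integer.MAX_VALUE);
                }
            } catch (InterruptedException ignored) {
                // exit quietly if interrupted
            }
        }, "timer-resolution-hack");
        t.setDaemon(true); // must not keep the JVM alive on its own
        t.start();
    }

    public static void main(String[] args) throws InterruptedException {
        install();
        long start = System.nanoTime();
        Thread.sleep(5);
        System.out.printf("sleep(5) took %.2f ms%n",
                (System.nanoTime() - start) / 1e6);
    }
}
```

The effect is Windows-specific; on other platforms this thread should be harmless but also pointless.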


the daemon Thread.sleep(Integer.MAX_VALUE) or similar does the trick too.  ???


Quote from: basil on November 12, 2010, 21:31:07
the daemon Thread.sleep(Integer.MAX_VALUE) or similar does the trick too.  ???

Hmm.  I've tried setting it to Thread.sleep(2) and it was noticeably worse than sleep(1).  Haven't tried Integer.MAX_VALUE though, perhaps it behaves differently - I'll give it a shot tonight.

There's a native method you can call on Windows to increase timer precision (timeBeginPeriod, in winmm.dll). So you usually call that, call getTime, and then call the matching timeEndPeriod to indicate you no longer want the increased precision. The timer resolution is system-wide, though, not just for your process.

So perhaps the timer precision isn't increased immediately, and takes a bit to kick in?  That would sort of explain this situation.  Sort of.