setNativeCursor's delay, what is the expected delay behavior? [solved]

Started by MadJack, June 08, 2012, 11:38:59


MadJack

I'm implementing custom cursors in jME3, but something seems to be wrong with the animated cursors' speed.

I've written an importer for animated cursors as defined by the RIFF standard. In this format, the DisplayRate (what LWJGL calls the delay) is defined as:

Quote
DisplayRate is the default frame rate in jiffies (one jiffy equals 1/60th of a second). The rate is actually how long an animation frame remains on the screen before the next frame is displayed. If no Rate subchunk is present in an ANI file, this value is used as the duration that all frames will appear on the display. A rate of 0 indicates that there is no delay between frames.

The rate subchunk is defined as:
Quote
Rates is an array of JIF rate values. The number of values is equal to the value of the NumSteps field in the Header subchunk. Each value corresponds, by position, to a step in the animation (value 0 corresponds to the first step, value 1 to the second step, and so on).

Essentially, the two values act the same way: one is global and applies to all frames (DisplayRate), while the other provides a value for each individual frame (the rate subchunk).
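As a rough sketch, this is how my importer resolves the per-frame delays; the class and parameter names below are placeholders from my own code, not LWJGL or jME3 API, and the values are still in jiffies (1/60 s) at this point:

Code:
// Placeholder helper from my importer, not LWJGL/jME3 API.
final class AniDelays {
    // Picks a delay (in jiffies, 1/60 s) for each animation step.
    static int[] frameDelaysInJiffies(int numSteps, int displayRate, int[] rates) {
        int[] delays = new int[numSteps];
        for (int step = 0; step < numSteps; step++) {
            // If the rate subchunk is present it has one value per step;
            // otherwise the header's DisplayRate applies to every frame.
            delays[step] = (rates != null) ? rates[step] : displayRate;
        }
        return delays;
    }
}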

Following that logic, when I import a cursor and pass the same values from either DisplayRate or Rate straight to LWJGL, the animation plays faster than expected.

My questions are: what is the expected delay behavior for animated cursors? Does it use jiffies? Or is it an arbitrary time unit separate from Microsoft's implementation?

Hopefully this is clear.

Thanks.

MadJack

Took the devil by the horns and wrangled it down to the ground. ;)

So, after going through the sources, I found out the delay is in milliseconds.
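For anyone who hits the same thing: converting the jiffy values to milliseconds before building the cursor is what fixed the animation speed for me. A minimal sketch follows; the helper class and parameter names are my own placeholders, while org.lwjgl.input.Cursor and org.lwjgl.BufferUtils are the standard LWJGL classes:

Code:
import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;
import org.lwjgl.LWJGLException;
import org.lwjgl.input.Cursor;

// Hypothetical helper; only the Cursor constructor call is actual LWJGL API.
final class AniCursorBuilder {
    // Converts ANI delays from jiffies (1/60 s) to the milliseconds LWJGL expects,
    // then builds the animated cursor. 'images' holds one ARGB image per frame.
    static Cursor buildAnimatedCursor(int width, int height, int xHotspot, int yHotspot,
                                      IntBuffer images, int[] delaysInJiffies)
            throws LWJGLException {
        IntBuffer delaysMs = BufferUtils.createIntBuffer(delaysInJiffies.length);
        for (int jiffies : delaysInJiffies) {
            // jiffies * 1000 / 60 -> milliseconds (e.g. 6 jiffies = 100 ms)
            delaysMs.put(jiffies * 1000 / 60);
        }
        delaysMs.flip();
        return new Cursor(width, height, xHotspot, yHotspot,
                          delaysInJiffies.length, images, delaysMs);
    }
}

With that conversion in place, passing the resulting Cursor to Mouse.setNativeCursor() animates at the expected speed.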

It would be very much appreciated if the Javadoc reflected that.

Thanks.