Adaptive Sync

Adaptive Sync

adi
Hello

Does somebody know which function I can use to set adaptive sync
in JOGL?
On the Internet I have found something like wglSwapIntervalEXT,
but there is nothing in JOGL.
Thanks


Re: Adaptive Sync

Sven Gothel
Administrator
On 09/20/2014 10:59 AM, adi [via jogamp] wrote:
> Hello
>
> Does somebody know which function I can use to set adaptive sync
> in JOGL?
> On the Internet I have found something like *wglSwapIntervalEXT*,
> but there is nothing in JOGL.
> Thanks

xxxSwapInterval*(..) has been aliased via
  GL gl;
  gl.setSwapInterval(swapInterval);
see GearsES2 for an example.
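
For illustration, here is a minimal sketch of a GLEventListener that sets the
swap interval (class name and structure are my own example, not taken from
GearsES2; imports are com.jogamp.opengl in current JOGL, javax.media.opengl in
older builds):

  import com.jogamp.opengl.GL;
  import com.jogamp.opengl.GLAutoDrawable;
  import com.jogamp.opengl.GLEventListener;

  public class SwapIntervalExample implements GLEventListener {
      @Override
      public void init(GLAutoDrawable drawable) {
          // Must be called while the GL context is current, e.g. in init() or display().
          GL gl = drawable.getGL();
          // 1 = sync buffer swaps to the display refresh (vsync on),
          // 0 = swap as fast as possible (vsync off).
          gl.setSwapInterval(1);
      }

      @Override public void display(GLAutoDrawable drawable) { /* render here */ }
      @Override public void reshape(GLAutoDrawable drawable, int x, int y, int w, int h) { }
      @Override public void dispose(GLAutoDrawable drawable) { }
  }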

However, 'adaptive sync' IMHO is something different,
i.e. more like an 'adaptive framerate/rendering'
where one wants to guarantee a certain framerate
and hence drops elements from rendering or reduces detail
according to the continuously measured frame duration ..
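
As a rough sketch of that idea (all class and method names here are made up
for illustration, this is not an existing JOGL utility): measure each frame's
duration in display() and lower a detail level whenever frames run over budget.

  import com.jogamp.opengl.GLAutoDrawable;
  import com.jogamp.opengl.GLEventListener;

  public abstract class AdaptiveDetailRenderer implements GLEventListener {
      private static final double TARGET_FRAME_MS = 1000.0 / 60.0; // aim for 60 fps
      private long lastFrameStart = 0;
      private int detailLevel = 10; // 10 = full detail, 0 = minimum

      @Override
      public void display(GLAutoDrawable drawable) {
          long now = System.nanoTime();
          if (lastFrameStart != 0) {
              double frameMs = (now - lastFrameStart) / 1e6;
              if (frameMs > TARGET_FRAME_MS * 1.1 && detailLevel > 0) {
                  detailLevel--;        // over budget: drop some detail
              } else if (frameMs < TARGET_FRAME_MS * 0.5 && detailLevel < 10) {
                  detailLevel++;        // plenty of headroom: add detail back
              }
          }
          lastFrameStart = now;
          renderScene(drawable, detailLevel);
      }

      /** Render the scene at the given detail level (fewer objects / coarser meshes when lower). */
      protected abstract void renderScene(GLAutoDrawable drawable, int detailLevel);
  }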

~Sven



Re: Adaptive Sync

adi
Hi

In my understanding, adaptive vsync eliminates frame rate stuttering and screen tearing.
The graphics card signals when to show the next frame, instead of the monitor refreshing at a fixed rate.
For Nvidia I have found this page about the feature:
http://www.geforce.com/hardware/technology/adaptive-vsync

   
Re: Adaptive Sync

Sven Gothel
Administrator
On 09/20/2014 12:02 PM, adi [via jogamp] wrote:
> Hi
>
> In my understanding, *adaptive vsync* eliminates frame rate stuttering and
> screen tearing.
> The graphics card signals when to show the next frame, instead of the monitor
> refreshing at a fixed rate.
> For Nvidia I have found this page about the feature:
> http://www.geforce.com/hardware/technology/adaptive-vsync

Thx for the link.
Glancing over it .. yes, the purpose is the same as I described earlier.

The methodology may be different, i.e. it also involves toggling vsync etc.,
whereas the method I described is platform agnostic.

If you would like to point to the coding details, i.e. an OpenGL vendor extension
with an example - we may add it.

~Sven




Re: Adaptive Sync

Quizzical
In reply to this post by adi
There are some different things that it is important to keep straight.  Adaptive Sync is a technology to basically tell monitors to refresh whenever it makes sense, rather than at fixed intervals.  For example, if you have a new frame finish every 25 ms, the monitor would refresh every 25 ms instead of refreshing every 17 ms and sometimes doubling frames.  It's very new and requires monitors that don't exist yet, but should come out around the end of the year.  It looks to me like the sort of thing that is handled entirely in video driver and monitor firmware magic, so I'd expect it to work with JOGL.

What you seem to be talking about is Nvidia's Adaptive VSync, which is Nvidia's proprietary implementation of vertical sync that they brought with their Kepler and later video cards.  One common way that vertical sync is done is that when you finish rendering a frame, the GPU completely stops until the monitor grabs a frame, and then starts rendering the next frame.  This intrinsically caps your rendering speed at the monitor refresh rate.  A simple frame time counter demonstrates that JOGL does not do this.
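
For example, a simple probe along these lines (a sketch; the window and animator
setup is whatever you already use) shows whether display() calls are being held
to the refresh period:

  import com.jogamp.opengl.GLAutoDrawable;
  import com.jogamp.opengl.GLEventListener;

  public class FrameTimeProbe implements GLEventListener {
      private long lastFrame = 0;
      private long frames = 0;
      private double totalMs = 0;

      @Override
      public void display(GLAutoDrawable drawable) {
          long now = System.nanoTime();
          if (lastFrame != 0) {
              totalMs += (now - lastFrame) / 1e6;
              frames++;
              if (frames % 300 == 0) {
                  // If the average sits at ~16.7 ms on a 60 Hz monitor even for a
                  // trivial scene, swaps are being blocked on the refresh; if it is
                  // much lower, they are not.
                  System.out.printf("avg frame time: %.2f ms%n", totalMs / frames);
              }
          }
          lastFrame = now;
          // ... actual rendering would go here ...
      }

      @Override public void init(GLAutoDrawable drawable) { }
      @Override public void reshape(GLAutoDrawable d, int x, int y, int w, int h) { }
      @Override public void dispose(GLAutoDrawable drawable) { }
  }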

Another implementation is what used to be called triple buffering.  The idea here is that when you finish rendering a frame, you always immediately start rendering another frame.  But whenever the monitor is ready to display a new frame, it switches to the most recent complete frame available to it.  This means that you don't display half of one frame and half of another, which is what causes "tearing".  I've tried very hard to get tearing behavior in JOGL and couldn't, so I suspect that JOGL does something along these lines.

One advantage of traditional vertical sync over triple buffering is that you reduce the GPU load (and hence temperatures and power consumption) if you're rendering at much faster than the monitor refresh rate.  If you have a refresh rate of 60 Hz and are rendering 180 frames per second, only every third frame shows up anyway and the other two get discarded.  It also requires fewer framebuffers, which saves on video memory.  That used to be significant, but with recent video cards having in the ballpark of 1 GB of video memory, saving 8 MB or so doesn't matter anymore.

The advantage of triple buffering is what happens if you're rendering at somewhat below the monitor refresh rate.  If you have a 60 Hz monitor and take 20 ms to render each frame, then vertical sync will slow your frame rate to 30 frames per second, not 50, as you'll be idle 2/5 of the time.  Triple buffering lets you keep the full 50 frames per second by doubling every fifth frame rather than every other one and going as fast as the video card can go.
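
To make that arithmetic explicit (a small illustrative calculation, not library
code): under strict vertical sync the swap waits for the next refresh boundary,
so the effective frame interval is the render time rounded up to a multiple of
the refresh period.

  public class VsyncMath {
      /** Effective frame interval under strict vsync: render time rounded up
       *  to the next multiple of the refresh period. */
      static double vsyncFrameIntervalMs(double renderMs, double refreshMs) {
          return Math.ceil(renderMs / refreshMs) * refreshMs;
      }

      public static void main(String[] args) {
          double refreshMs = 1000.0 / 60.0;  // 60 Hz -> ~16.7 ms
          double renderMs  = 20.0;           // 50 fps worth of rendering work
          double interval  = vsyncFrameIntervalMs(renderMs, refreshMs); // ~33.3 ms
          System.out.printf("strict vsync:     %.0f fps%n", 1000.0 / interval); // 30 fps
          System.out.printf("triple buffering: %.0f fps%n", 1000.0 / renderMs); // 50 fps
      }
  }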

The goal of Nvidia's Adaptive VSync is to have the best of both worlds.  The idea is to say, when you're done rendering a frame that took longer than the monitor refresh rate, don't stop rendering, but start the next frame immediately, like with triple buffering.  But when you're done rendering a frame that took less time than the monitor refresh rate, wait for the monitor to grab that frame before starting another.  That way, you get the power savings of traditional vertical sync at very high frame rates, but the maximal frames per second of triple buffering at low frame rates.
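
An application-level approximation of that logic could look something like this
(my own sketch using only the standard setSwapInterval() call; the class name
and thresholds are illustrative, and this is not how Nvidia's driver actually
implements it):

  import com.jogamp.opengl.GL;
  import com.jogamp.opengl.GLAutoDrawable;

  public class AdaptiveVsyncHelper {
      private final double refreshMs;
      private long lastFrame = 0;
      private int currentInterval = 1;

      public AdaptiveVsyncHelper(double monitorRefreshHz) {
          this.refreshMs = 1000.0 / monitorRefreshHz;
      }

      /** Call once per frame from display() (GL context must be current);
       *  toggles vsync based on how long the last frame took. */
      public void update(GLAutoDrawable drawable) {
          long now = System.nanoTime();
          if (lastFrame != 0) {
              double frameMs = (now - lastFrame) / 1e6;
              // Slow frame: disable vsync so the next frame starts immediately.
              // Fast frame: re-enable vsync to cap at the refresh rate and save power.
              int wanted = (frameMs > refreshMs) ? 0 : 1;
              if (wanted != currentInterval) {
                  GL gl = drawable.getGL();
                  gl.setSwapInterval(wanted);
                  currentInterval = wanted;
              }
          }
          lastFrame = now;
      }
  }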

I'm not sure if Nvidia's offer of Adaptive VSync can be implemented purely through video drivers or if it requires some help from the game engine.  I do know that Champions Online had this behavior more than two years before Nvidia announced that they were offering Adaptive VSync with the launch of their Kepler cards two years ago, so it can also be done through the game engine.

Regardless, the whole "what do we do about slow frame rates" question will be mostly resolved by adaptive sync once people buy new hardware that supports it.  (Nvidia also offers G-Sync, which does basically the same thing but requires proprietary, expensive hardware.)  After that, the only thing to be gained from Nvidia's Adaptive VSync would be power savings from drawing fewer frames at very high frame rates--and even this would come at the expense of some milliseconds of display latency.
Re: Adaptive Sync

adi
Hi

Thanks for your hints.
The main goal is eliminating frame rate stuttering and
screen tearing. AMD calls it FreeSync.
It would be better if the Khronos Group made an open standard from that.
Here is a link about the first monitor:
http://www.kitguru.net/peripherals/monitors/anton-shilov/amd-demonstrates-the-first-freesync-display-prototype/