Full-page flash animations of a cube     (borrowed from https://boallen.com/fps-compare.html)

15 FPS   

30 FPS                

60 FPS


60hz mouse vs 120hz mouse (cursor pictures borrowed from user ToastyX on various forums)
Note that the desktop is not demanding, so it supplies an abundance of "fps" and each refresh is filled
with a new movement state. In a game you would have to supply a minimum of 60fps at 60hz, or 120fps at 120hz,
to show a new position at every screen update the way the mouse cursor does.
Otherwise you will keep showing the last scene update/mouse cursor position through the next screen update(s),
for a longer "freeze-frame" gap. I.e. if the desktop were updated at 60fps on a 120hz screen, both would look like the
first mouse picture: 60 distinct cursor positions per second.
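A quick sketch (mine, not from the original page) of the fill behavior described above: each screen refresh shows the newest frame finished by that point, so at 60fps on a 120hz display every frame is held through two refreshes, while 120fps fills every refresh.

```python
def frames_per_refresh(fps, hz, refreshes=12):
    """For each refresh, the index of the newest frame completed by that refresh.
    Integer math assumes perfectly even frame pacing (the idealized case here)."""
    return [(r * fps) // hz for r in range(refreshes)]

print(frames_per_refresh(60, 120))   # each frame repeats through two refreshes
print(frames_per_refresh(120, 120))  # a new frame on every refresh
```

With 60fps@120hz the output repeats every frame index twice (the "freeze-frame" gap); with 120fps@120hz each refresh gets a fresh index.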


120fps+120hz vs 60, 30, 15.. out of 1000hz ~ 1 second

120hz vs 60hz representation attempting to show the greater motion and animation definition.
It would actually be even more defined at 120hz+120fps (6 frames vs 3, not 5 vs 3).

Note that to fill each refresh with a more recent frame of action/animation, as shown in
this particular 5:3 example, you would have to supply 100fps at 100hz (on a 100hz-120hz monitor) and 60fps at 60hz.

More Frame Rate Comparisons

(Of course, the timing/rendering of frames could be much more imperfect than explained below, including glitches, judder, etc.; these are just the raw number comparisons.)

30fps would have action slices "freeze-framed" for 33.3ms each, while the 120hz+120fps user would have seen 4 game world action state updates in the same period,
had 4 times smoother motion definition displayed, and would be allowed more chances, and sooner chances, to react to and/or input different actions
(adjust trajectories/paths of travel, duck/jump/shoot, initiate defenses, initiate healing, etc).
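The arithmetic behind that comparison, spelled out as a small illustrative check: a 30fps frame persists ~33.3ms, a span in which the 120fps user receives four ~8.3ms updates.

```python
def frame_ms(fps):
    """Duration each frame persists on screen, in milliseconds (idealized pacing)."""
    return 1000 / fps

print(round(frame_ms(30), 1))               # ~33.3ms per 30fps frame
print(round(frame_ms(30) / frame_ms(120)))  # 4 updates at 120fps in that same span
```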

60fps would have action slices "freeze-framed" for 16.6ms each, while the 120hz+120fps user would have seen 2 game world action state updates in the same period,
had twice the smoothness of motion definition displayed, and would be allowed more chances, and sooner chances, to input different actions.

Misc frame rates below 120 would have varying ratio comparisons, but would still be freeze-framing often compared to high fps at 120hz.
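For any fps between half the refresh rate and the refresh rate, the split between 1:1 frames and doubled ("freeze-framed") frames can be computed directly. This is my own sketch of the counting used in the worked examples below, assuming the evenly interleaved 2:1/1:1 pattern those examples describe:

```python
def freeze_frame_split(fps, hz):
    """Split fps frames across hz fixed refreshes (valid for hz/2 <= fps <= hz).
    Returns (frames shown for one refresh, frames held through two refreshes)."""
    doubled = hz - fps       # frames that must cover the "empty" refreshes
    single = fps - doubled   # frames that land 1:1 on a refresh
    assert single + doubled * 2 == hz  # all refreshes accounted for
    return single, doubled

print(freeze_frame_split(80, 120))  # 80fps on a 120hz monitor
print(freeze_frame_split(40, 60))   # 40fps on a 60hz monitor
```

Both results match the hand counts worked through in the 80fps and 40fps paragraphs: half the frames land 1:1, half are held through two refreshes.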

80fps would attempt to send a new scene update every 12.5ms.
On a g-sync monitor, the monitor's refresh rate would drop to 80hz and would actually update the screen every 12.5ms, 1:1.
The 120fps+120hz user would obviously be shown 40 more frames in the same period.
The 120fps+120hz user would also see each frame 1/3 sooner (8.3ms vs 12.5ms) and have 50% more motion definition (120 frames vs 80) than the 80fps g-sync user.
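Checking those numbers directly (simple arithmetic, for illustration):

```python
# Over one second: 120fps delivers 40 more frames than 80fps.
print(120 - 80)

# Frame interval ratio: 8.3ms is 2/3 of 12.5ms, so frames arrive 1/3 sooner.
print(round((1000 / 120) / (1000 / 80), 2))
```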

Comparing 80fps on a 120hz monitor (non-gsync) to a 120fps+120hz user:
120hz monitors update every 8.3ms. 80 frames sent would leave 40 updates "empty", requiring frames to frequently be shown more than once ("freeze-framed").
1/2 (40) of the 80 frames would be shown 1:1 at 8.3ms each.
1/2 (40) of the 80 frames would have to be "freeze-framed" for 16.6ms each.
40 frames held through 2 updates (80 updates) + 40 frames at 1 update each (40 updates) = 120 screen updates (120hz).
2/3 of the time, the 80fps user at 120hz is seeing 16.6ms "freeze-frames" persist through two of the 120fps+120hz user's 8.3ms screen updates.
1/3 of the time, the 80fps user at 120hz is seeing the other 40 frames at 8.3ms.

40fps would attempt to transmit a new scene update every 25ms; a g-sync monitor would dynamically switch to 40hz.
That would be the same frame shown through 3 screen updates of the 120fps+120hz user by comparison.

40fps on a 60hz monitor without variable hz (g-sync) results in a 40:60 ratio, which would mean that:
1/3 (20, to be exact) of the 60hz screen updates would be "un-filled", so they would have to be filled with "frozen frames" held through more than one screen update.
These "doubles" or "frame freezes" would be intermixed somehow throughout the 60 screen updates.
20 of the 40 frames could be shown for 16.6ms (1 frame per 1 update of the 60hz).
The remaining 20 of the 40 frames would have to be shown for 33.2ms (16.6ms x 2) to fill all 60 screen updates (20 frames through 2 updates + 20 frames through 1 update).
2/3 of the time, the 40fps user at 60hz is seeing 33.2ms "freeze-frames" persist through four of the 120fps+120hz user's 8.3ms screen updates.
1/3 of the time, the 40fps user at 60hz is seeing the other 20 frames "frozen" for 16.6ms, through two of the 120fps+120hz user's 8.3ms screen updates.

100fps would attempt to transmit a new scene update every 10ms.
This would be 1/6 less motion definition than 120fps+120hz (the 120fps user sees 1/5 more frames).
With g-sync at 100hz it would be just that much less defined, with the durations spread evenly across 1000ms/second;
without g-sync, 20 of the 100 frames would be "freeze-framed" through two 120hz updates (16.6ms) instead of one 8.3ms update.

120fps would attempt to transmit a new scene update every 8.3ms

144fps+144hz would attempt to transmit a new scene update every 6.94ms
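The frame intervals quoted throughout this page come straight from 1000ms divided by the frame rate. A one-liner to reproduce the table (note the page rounds some values down, e.g. 16.6 and 8.3):

```python
# Idealized frame persistence at each rate, in milliseconds.
for fps in (30, 40, 60, 80, 100, 120, 144):
    print(f"{fps:>3} fps -> {1000 / fps:.2f} ms per frame")
```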

Blur amount comparisons

Blur amount using a pursuit camera (which is a very close approximation of what our eyes see with sample-and-hold blur at different hz).
Images borrowed from www.blurbusters.com
Note that in a modern game you would not just be seeing a single simple cel-shaded object blurring; you would be seeing the entire viewport,
full of high-rez geometry, high-rez textures, etc.: the whole game world blurring during your continual/repeated movement keying, mouse-look
flow pathing, and viewport-affecting ability usage or other triggered events.



Lightboost (hack)

The blur in the images below would look even worse full screen; shrunken images look more defined.
The game in the example is also not using very high detail textures, but it still provides some insight.
120hz without strobing still blurs out all high-detail textures (incl. shaders, depth via bumpmapping, etc.)
and text, just more within the "shadow mask" of onscreen objects. The "240hz" example is actually 120hz + backlight strobing.

ULMB mode (Ultra Low Motion Blur) / Low persistence mode:
"It actually gets it’s own hardware button on the kit’s monitor, as well, which makes it very convenient to enable/disable – it’s just to the left of the power button. I generally preferred G-Sync on, which meant ULMB wasn’t available, but on older games where I hold a solid huge FPS (or all of my 2D usage) it was quick and easy to enable. Hopefully a future update or revision allows for both G-Sync and ULMB, as that would truly be the best of both worlds.

ULMB is noticeably better than the lightboost hack on the same monitor, it’s still somewhat dimmer but it doesn’t wash out the colors or give any kind of tint to it. I don’t have a high-speed camera handy, or I’d take some sample pictures that could be compared against lightboost, however I’m sure blurbusters will jump in with these once the media ban is lifted. I’m quite impressed the improvements they made over straight lightboost, especially considering the kit has no settings for color tone or contrast. "

G-SYNC Mode:
Nvidia G-sync technology which, for the first time, enables perfect synchronization between the GPU and the display. NVIDIA G-SYNC technology eliminates tears, stutters and lag in gaming monitors. The result is consistently smooth frame rates and ultrafast response not possible with previous display technologies. Several years in the making, G-SYNC™ technology synchronizes the monitor's refresh rate to the GPU's render rate, so images display the moment they are rendered. -- Basically, it matches the monitor's hz to the fps dynamically, eliminating screen aberrations like judder and tearing.

ULMB mode and G-Sync Mode are mutually exclusive

Even if they were ever able to combine ULMB mode with g-sync mode sometime in the future, the ULMB mode would have to cut out on the fly whenever you hit sub-100hz/fps, because the strobing would be too slow then. For now, very high settings on demanding games that cause an fps roller coaster down to sub-60fps frame rates would benefit more from g-sync dynamic hz, and games/configs/gpu power setups that can handle ~120fps consistently would benefit more from ULMB mode for zero-blur FoV movement/zero-blur viewport motion pathing.

G-sync, Free-sync, and non-proprietary monitor tech
*note: There is an Eizo FG2421 VA monitor that uses a backlight strobing tech for zero blur that is not tied to nvidia gpus like g-sync is. It doesn't have the dynamic hz function though. The current prototype of the Oculus Rift VR headset also utilizes high refresh rate OLED screens with some sort of backlight strobing/screen blanking for blur elimination. AMD also has a dynamic hz technology called FreeSync, but nvidia seems to be ahead in the game and release schedule for this year.

"It’s entirely possible that both companies are telling the truth on this one. AMD may be able to implement a G-Sync-like technology on supported panels, and it could work with the manufacturers of scalar ASICs if G-Sync starts catching on for Nvidia. Nvidia, meanwhile, is probably telling the truth when it says it had to build its own hardware solution because existing chips for desktop displays weren’t doing the job."

"In the long run, if panel makers start building variable refresh rates into their own displays, than the need for Nvidia-specific G-Sync technology may fade out — but that doesn’t mean the company can’t make a pretty penny off the concept while it lasts. And since it’ll take time for panel manufacturers to adopt the capability if they choose to do so, it means Nvidia has a definite window of opportunity on the technology."