HDR10+ Advanced joins Dolby Vision 2 in trying to make you like motion smoothing

Fluppeteer

Ars Tribunus Militum
1,708
Subscriptor++
Don't count on it. HDMI doesn't even do 48Hz, at minimum it'd have to be sped up to 50Hz.

Could you cite your source for that? There were early FHD screens that could only do 30Hz (and manufacturers claiming that 60Hz was impossible - while WUXGA monitors using the same underlying signalling were fine). The original single link HDMI connector was effectively single-link DVI-D with an audio stream added, and that 13Hz (WQUXGA) panel I referred to was running over DVI; it could hit 48Hz with dual-link and two cables. I don't think the custom EDID entries are limited (even if there are no low frame rate default entries - I've not looked, and DisplayID is after my time), and you could always send a random signal to the display and see whether it gets accepted even if it's not in the EDID (often fine with CRTs, and there were CRT monitors that had DVI-D inputs).

I'm less familiar with HDMI than DVI, so maybe I'm about to learn something, but I wasn't aware of any such limit (other than Apple deciding some display modes aren't worthy of being presented).
 
Upvote
2 (2 / 0)

excelleron

Smack-Fu Master, in training
6
Long-term, one thing that could pre-empt everything, including this Ars article:

A framerateless video file format!

(Early research is already ongoing! ETA 2030s)

...All frame rates are native in a framerateless file. No discrete frames.
...You can still output perfect Hollywood Filmmaker Mode 24fps out of it.
...Yet, you can also output native 23.976fps or 25fps out of it too, with no audio slowdown/speedup!
...You can still prioritize cinematography at a specific framerate (lighting, pan tables, etc)
...You can have director preferred framerate metadata (e.g. 24fps).
...From the same master video file, you can still export multiple print files at multiple different frame rates (e.g. .MP4 or whatever at 23.976, 24, 25, 48, 50, 59.94, 60, 120, 240, etc) for compatible distribution purposes.

A framerateless file format is metaphorically a temporal RAW format (the temporal equivalent of a spatial RAW format). Many ways to record frameratelessly are being researched, via sensors, via codecs, and by other methods. This is superior to black-box interpolators, in that all frame rates are natively built into the same file.

(More than 6 different possible approaches are being evaluated to pull this off, ranging from the rough equivalent of timecoded photons to other sensor-level tweaks, all deferring the framerate decision to a post-process task and/or real-time playback task and/or distribution prints. Sufficient optimizations have been found that file sizes are predicted to be smaller than a discrete 500fps-1000fps file, making framerateless files a manageable format for the mastering stage.)

(And... Wrong exposure length? Too much blur? Too little blur? Long after you recorded, camera shutter speed can even be edited in post-process in a framerateless file format, since the camera shutter is just metadata during continuous photon capture into a file. As long as you captured enough photons into the file, you can fix underexposed/overexposed scenes, even without artificial processing if you want to avoid it, or improve things with algorithms that benefit from the extra firehose of data. Many kinds of processing are optional! You can use newly available algorithms that thrive on the extra data to combine the brightness of long exposures with the temporal accuracy of short exposures, getting bright high-speed video out of a framerateless file if you wanted. But obviously, this is optional.)
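The "shutter speed as metadata" idea above can be sketched in a few lines. This is a purely illustrative toy, assuming a hypothetical capture format of densely timestamped luminance samples; no real codec or sensor API is implied:

```python
import numpy as np

def render_frames(samples, timestamps, fps, shutter_s):
    """Integrate densely timestamped luminance samples into frames.

    samples:    array of shape (N, H, W), brightness at each capture instant
    timestamps: array of shape (N,), seconds, monotonically increasing
    fps:        desired output frame rate (any value, no divisor rules)
    shutter_s:  desired exposure per frame, chosen *after* capture
    """
    duration = timestamps[-1]
    n_frames = int(duration * fps)
    frames = []
    for i in range(n_frames):
        t0 = i / fps                    # frame start
        t1 = t0 + shutter_s             # end of the synthetic exposure
        mask = (timestamps >= t0) & (timestamps < t1)
        frames.append(samples[mask].mean(axis=0))  # averaging = exposure
    return np.stack(frames)
```

The same capture could then be rendered as 24fps with a 1/48s shutter, or 120fps with a 1/240s shutter, with no interpolation involved — only the integration window changes.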


And do other framerates like 59.94fps or 60fps, as well as 120fps or 240fps, if the user wishes (or the director wishes, via "preferred framerate" metadata that can still be overridden by users for accessibility's sake). High and low framerates in one file, and all odd divisors accommodated, because there are no divisors in a framerateless file.

This would be superior to fake frame interpolation, since all frame rates are native frame rates (an analog continuum) in a framerateless video file. This is all very early talk (for the 2030s), and it would solve quite a few problems: no need for integer divisors, no audio speedup/slowdown, and fewer fake-frame problems. A far lesser evil than the current status quo.

There will be some new workflows needed, but you can still prioritize a 24fps workflow with a framerateless file format. This may not affect your generation of cinematographers, but the next generation. This is early lab-and-researcher stuff, so it's not really easily googleable yet.
I mean, this all sounds cool in the way that I'd love to eat cake all day and not have to go to the gym afterwards. But at that point, just render it all in Unreal, because we aren't making movies anymore. And I promise, I'm not a "celluloid or die" person; I worked with the Viper Filmstream and have embraced new tech the whole way. But at some point, y'all, this is just what a film looks like and how it works to be made. But I'll keep a weather eye out for this.
 
Upvote
4 (5 / -1)
Don't count on it. HDMI doesn't even do 48Hz, at minimum it'd have to be sped up to 50Hz.

This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
 
Upvote
2 (2 / 0)

Barleyman

Ars Tribunus Militum
2,266
Subscriptor++
Modern HDMI supports VRR, so it can practically do 48 or 96 Hz.
But that requires your TV to support VRR. Many/most panels will likely sync to both 50Hz and 60Hz as a bare minimum, though. No idea about 48Hz; it's difficult to say. It's close to 50Hz, so maybe the TCON will sync to it, maybe not.
 
Upvote
0 (1 / -1)
This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
I'm nearly 100% sure both HDMI and Blu-rays support native 48 fps. Worst case, HDMI and most modern panels support VRR, which certainly goes down to 48 fps.

The bigger problem is streaming boxes, actually. Apple TV as usual is the front runner and supports full mode switching, but only if you dig into Settings and enable it. AFAIK, Apple TV even supports instant framerate changes if the TV does. Though color mode changes still require a full renegotiation which is why they're disabled by default. (And annoyingly the screen saver no longer runs in the native video mode, because I guess people thought that looked worse than blanking the screen every time it starts or stops.)
 
Upvote
4 (4 / 0)

Barleyman

Ars Tribunus Militum
2,266
Subscriptor++
Could you cite your source for that? There were early FHD screens that could only do 30Hz (and manufacturers claiming that 60Hz was impossible - while WUXGA monitors using the same underlying signalling were fine). The original single link HDMI connector was effectively single-link DVI-D with an audio stream added, and that 13Hz (WQUXGA) panel I referred to was running over DVI; it could hit 48Hz with dual-link and two cables. I don't think the custom EDID entries are limited (even if there are no low frame rate default entries - I've not looked, and DisplayID is after my time), and you could always send a random signal to the display and see whether it gets accepted even if it's not in the EDID (often fine with CRTs, and there were CRT monitors that had DVI-D inputs).

I'm less familiar with HDMI than DVI, so maybe I'm about to learn something, but I wasn't aware of any such limit (other than Apple deciding some display modes aren't worthy of being presented).
30Hz is a standard refresh rate. From the actual HDMI.org spec, FHD@48Hz isn't a thing, but 4K@48Hz is.

It bears mentioning that those are the standard rates; your equipment may grok different framerates, but there's no guarantee.

https://s3-us-west-2.amazonaws.com/...Speed-HDMI-Cable-Data-Rate-Table-20250102.png

 
Upvote
1 (1 / 0)

Barleyman

Ars Tribunus Militum
2,266
Subscriptor++
This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
Because nobody made that a standard; there's basically nothing to view at that framerate, so no one is asking for it. I'm not familiar enough with BD players to guess or know what they'd do encountering an h264 file with 48fps encoding; "undefined" probably covers it.

I'm nearly 100% sure both HDMI and Blu-rays support native 48 fps. Worst case, HDMI and most modern panels support VRR, which certainly goes down to 48 fps.
4K 48-60 is a thing in the HDMI spec, so theoretically that should work. However, since there literally isn't a single BD with that framerate AFAIK, I wouldn't really count on it.

You certainly can have resolutions over HDMI which are not standard, e.g. the list doesn't cover 1440p at all.
 
Upvote
3 (3 / 0)

mdrejhon

Ars Praefectus
3,122
Subscriptor
Don't count on it. HDMI doesn't even do 48Hz, at minimum it'd have to be sped up to 50Hz.
You are thinking of HDTV standards, including ATSC or official VESA modes, etc., as well as HDMI recommendations - not the HDMI protocol.

Some HDMI paper sheets (PDFs) may rubber-stamp certain modes but, fundamentally, the HDMI logical protocol layer is a bitbucket that can do any refresh rate, if both ends support it. HDMI 1.0 is based on DVI, which supported any fractional refresh rate. Newer specs extended the protocol.

Almost all computers and HDMI computer monitors are able to do it, e.g. 1080p at 75Hz and such. Some TVs, but not all, support these additional modes.

Even old-fashioned HDMI 1.0 was able to unofficially do any Hz within its bandwidth specifications.

I did custom Hz from a PC over HDMI back in 2012! It is up to the display to support it.

Today, you can easily do custom refresh rates from a computer over HDMI via NVIDIA's Custom Resolution feature or the ToastyX CRU EDID-override utility.

In another tech spin, for displays that only support custom refresh rates during VRR, there is an unconventional EDID tweak: you can use a fixed Hz encapsulated inside a VRR transport; some devices support that. I have successfully output 48Hz and 96Hz from a Retrotink 4K to an ordinary LG 4K OLED HDTV.

FWIW, I got odd refresh rates such as 225Hz and 73.5Hz (non-VRR) working over HDMI from my computer - just tested now on an OLED computer monitor. It's as easy as creating the mode via your favourite Custom Resolution Utility (CRU) app that's compatible with your GPU.
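As a back-of-the-envelope check on why such modes are easy for the link itself: the pixel clock a custom mode needs is just total pixels times refresh rate. A rough sketch — the blanking figures below are placeholder values roughly in line with reduced-blanking modes, not real CVT-RB formula output:

```python
def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank=160, v_blank=56):
    """Pixel clock needed for a custom display mode.

    h_blank/v_blank are illustrative blanking totals; a real CRU entry
    computes these from the CVT reduced-blanking formula.
    """
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# A hypothetical 1080p48 mode:
# pixel_clock_mhz(1920, 1080, 48) -> ~113.4 MHz, comfortably inside
# even early single-link HDMI bandwidth.
```

That's why a 48Hz mode is trivial for the link layer; whether the display's TCON accepts it is the real question.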

But that requires your TV to support VRR..
Not necessarily.

There are pre-VRR multisync fixed-Hz flat panels that don't support VRR, yet support an analog continuum of vertical refresh rates.

These situations can happen:

1. Display firmware or scaler sticks to the minimum HDMI support despite almost all HDMI chips supporting custom Hz. Out of sync.
2. Display supports custom fixed-Hz non-VRR
3. Display supports VRR (which can be used for varying and non-varying Hz)
4. Display supports only (2) or (3)
5. Display supports both (2) AND (3)
 
Last edited:
Upvote
4 (4 / 0)

TheBaconson

Ars Scholae Palatinae
920
NTSC speed-down, anyone? PAL speed-up is when they run 24fps slightly faster for 25fps transmissions, in case you're not familiar. I've seen people claim you can't do that, but dude, you've been watching that content your whole life. Life is just 4% faster in Europe.
I remember waaaay back in the days of 3D Studio 3.0, we used to repeat the 24th frame to get smooth 25fps.
I forget the reasoning behind it; it was 30-ish years ago and not something I was terribly into.
 
Upvote
0 (0 / 0)

mdrejhon

Ars Praefectus
3,122
Subscriptor
It's part of why I posted the ergonomic consideration.
@Aurich I noticed direct-to-post permalinks are currently broken.

It semi-works, but when there is an AJAX page-load latency of over 50ms (I'm in Canada), there is a race condition in the code.

Any plans to fix?

Perhaps move the midpage jump into a setTimeout() inside an .onload event, so the jump happens after the AJAX render plus one pass of the main loop. Or, the hacky way: you could poll for Ajax-complete if you're forced to rely on external plugins to render the comments AJAX.

Either way, it doesn't work 90%+ of the time for users in a different city than the web server. It seems to be a comments-render, AJAX-latency-triggered race condition.

One of three things happens, suggestive of a race condition:

Upon clicking the post permalink:
1. It does nothing (goes to top only)
2. It works (jumps to post). Mainly if AJAX is fast.
3. It works then jumps back to top. Mainly if AJAX is slightly lagged.

Usually (3) happens.

To reproduce:
A. Test permalinks INSIDE Ars posts (on the main website, not the forums).
B. Then try intentionally delaying comments load slower, or using some kind of network throttling utility (e.g. Network Link Conditioner) to force (3) or (1) to happen more often than (2).
 
Last edited:
Upvote
3 (3 / 0)

DanNeely

Ars Legatus Legionis
16,119
Subscriptor
Long-term, one thing that could pre-empt everything, including this Ars article:

A framerateless video file format!

(Early research is already ongoing! ETA 2030s)

...All frame rates are native in a framerateless file. No discrete frames.
...You can still output perfect Hollywood Filmmaker Mode 24fps out of it.
...Yet, you can also output native 23.976fps or 25fps out of it too, with no audio slowdown/speedup!
...You can still prioritize cinematography at a specific framerate (lighting, pan tables, etc)
...You can have director preferred framerate metadata (e.g. 24fps).
...From the same master video file, you can still export multiple print files at multiple different frame rates (e.g. .MP4 or whatever at 23.976, 24, 25, 48, 50, 59.94, 60, 120, 240, etc) for compatible distribution purposes.

A framerateless file format is metaphorically a temporal RAW format (the temporal equivalent of a spatial RAW format). Many ways to record frameratelessly are being researched, via sensors, via codecs, and by other methods. This is superior to black-box interpolators, in that all frame rates are natively built into the same file.

(More than 6 different possible approaches are being evaluated to pull this off, ranging from the rough equivalent of timecoded photons to other sensor-level tweaks, all deferring the framerate decision to a post-process task and/or real-time playback task and/or distribution prints. Sufficient optimizations have been found that file sizes are predicted to be smaller than a discrete 500fps-1000fps file, making framerateless files a manageable format for the mastering stage.)

(And... Wrong exposure length? Too much blur? Too little blur? Long after you recorded, camera shutter speed can even be edited in post-process in a framerateless file format, since the camera shutter is just metadata during continuous photon capture into a file. As long as you captured enough photons into the file, you can fix underexposed/overexposed scenes, even without artificial processing if you want to avoid it, or improve things with algorithms that benefit from the extra firehose of data. Many kinds of processing are optional! You can use newly available algorithms that thrive on the extra data to combine the brightness of long exposures with the temporal accuracy of short exposures, getting bright high-speed video out of a framerateless file if you wanted. But obviously, this is optional.)


And do other framerates like 59.94fps or 60fps, as well as 120fps or 240fps, if the user wishes (or the director wishes, via "preferred framerate" metadata that can still be overridden by users for accessibility's sake). High and low framerates in one file, and all odd divisors accommodated, because there are no divisors in a framerateless file.

This would be superior to fake frame interpolation, since all frame rates are native frame rates (an analog continuum) in a framerateless video file. This is all very early talk (for the 2030s), and it would solve quite a few problems: no need for integer divisors, no audio speedup/slowdown, and fewer fake-frame problems. A far lesser evil than the current status quo.

There will be some new workflows needed, but you can still prioritize a 24fps workflow with a framerateless file format. This may not affect your generation of cinematographers, but the next generation. This is early lab-and-researcher stuff, so it's not really easily googleable yet.

Do you have a high level writeup on how this would work? The only thing that's coming to mind is storing it as 3d rendering instructions; but while that might work in theory for CGI the gap between even a top end consumer GPU and a professional render farm that doesn't need to output in real time has me questioning the feasibility even for deep pocketed videophiles, never mind people trying to watch on their phone.
 
Upvote
2 (2 / 0)

DanNeely

Ars Legatus Legionis
16,119
Subscriptor
Now there's something that could actually use some attention... many European broadcasts/shows are so janky when viewed on platforms delivering 60Hz (or 30p).

At least in theory, 600 FPS displays should be able to display 24, 25, and 30 FPS content natively (along with a mess of faster modes).
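The arithmetic behind the 600 figure is just a least common multiple; a quick sketch:

```python
from math import lcm  # Python 3.9+

# A panel whose refresh rate is the least common multiple of the target
# frame rates can hold every source frame for a whole number of refreshes.
rates = [24, 25, 30]
panel_hz = lcm(*rates)
repeats = {r: panel_hz // r for r in rates}
print(panel_hz)  # 600
print(repeats)   # {24: 25, 25: 24, 30: 20}
```

So a 600Hz panel holds each 24fps frame for exactly 25 refreshes, each 25fps frame for 24, and each 30fps frame for 20 — no uneven cadence for any of them.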

I know I've seen esports displays approaching that level, so we're probably not that far off from being able to do it on displays that maintain other forms of visual quality while still pushing those refresh rates.
 
Upvote
0 (0 / 0)

DanNeely

Ars Legatus Legionis
16,119
Subscriptor
@Aurich I noticed direct-to-post permalinks are currently broken.

It semi-works, but when there is an AJAX page-load latency of over 50ms (I'm in Canada), there is a race condition in the code.

Any plans to fix?

Perhaps move the midpage jump into a setTimeout() inside an .onload event, so the jump happens after the AJAX render plus one pass of the main loop. Or, the hacky way: you could poll for Ajax-complete if you're forced to rely on external plugins to render the comments AJAX.

Either way, it doesn't work 90%+ of the time for users in a different city than the web server. It seems to be a comments-render, AJAX-latency-triggered race condition.

One of three things happens, suggestive of a race condition:

Upon clicking the post permalink:
1. It does nothing (goes to top only)
2. It works (jumps to post). Mainly if AJAX is fast.
3. It works then jumps back to top. Mainly if AJAX is slightly lagged.

Usually (3) happens.

This has been an issue since the most recent major site redesign. At the time neither Aurich or the web developer were able to reproduce it while I had (and still have) the first behavior almost universally. AFAIK they weren't sure what was causing it at the time.
 
Upvote
1 (1 / 0)

Marlor_AU

Ars Tribunus Angusticlavius
7,734
Subscriptor
NTSC speed-down, anyone? PAL speed-up is when they run 24fps slightly faster for 25fps transmissions, in case you're not familiar. I've seen people claim you can't do that, but dude, you've been watching that content your whole life. Life is just 4% faster in Europe.
Well, our movies were faster, but our games were slower.

NTSC games targeting 60Hz became 50Hz in PAL territories. 30Hz became 25Hz.

While the 4% speedup on films was unnoticeable (apart from a slightly shorter runtime), the 17% slowdown in games actually had an impact. The first time I fired up an NTSC ROM in a PC emulator, I thought something was wrong: the pitch of the audio was different, and the music seemed more hectic.
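The numbers in this thread check out; a quick sketch of the PAL/NTSC arithmetic:

```python
import math

# 24 fps film played out at 25 fps (PAL speed-up)
film_factor = 25 / 24
print(f"film runs {100 * (film_factor - 1):.1f}% fast")   # 4.2% fast

# resulting audio pitch shift, in 12-TET semitones
semitones = 12 * math.log2(film_factor)
print(f"pitch rises {semitones:.2f} semitones")           # ~0.71

# NTSC 60 Hz game logic driven at PAL 50 Hz
game_factor = 50 / 60
print(f"game runs {100 * (1 - game_factor):.1f}% slow")   # 16.7% slow
```

So the film speed-up is barely a quarter tone of pitch shift, while unadapted PAL games lost a sixth of their speed — which is exactly why the games were noticeable and the films weren't.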
 
Upvote
1 (1 / 0)

DanNeely

Ars Legatus Legionis
16,119
Subscriptor
This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
I think you may be slightly confused here.

The baseline behavior of modern games is that they just don't generate frames at the same rate they're output to the display. They either repeat frames if they're running slower than the output or discard some if running faster, with the main difference being whether they switch immediately when a new frame is generated (aka vsync off, which can produce visible tearing in games with rapid changes in what's being shown) or wait until the start of the next frame on the display (vsync on, which can result in noticeable stuttering if the game falls below the display's operational framerate). These fixed frame-rate modes were the only ones available for LCD/OLED displays when the DVD and Blu-ray standards were created.
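The repeat/hold behavior described above is easy to sketch. This toy function (hypothetical, not from any real player) shows why 24fps maps onto a fixed 60Hz display with a regular cadence while 48fps does not:

```python
def cadence(source_fps, display_hz, n_frames=8):
    """How many display refreshes each source frame is held for when a
    fixed-rate display repeats frames vsync-style (no VRR)."""
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        # refresh index at which frame i's display window ends
        end = int(i * display_hz / source_fps)
        counts.append(end - shown)
        shown = end
    return counts

print(cadence(24, 60))  # [2, 3, 2, 3, 2, 3, 2, 3] -- the classic 3:2 pulldown
print(cadence(48, 60))  # [1, 1, 1, 2, 1, 1, 1, 2] -- uneven hold times
```

With 48fps on a fixed 60Hz panel, every fourth frame is held twice as long as the others, which is exactly the judder this subthread is complaining about; VRR sidesteps it by letting the display follow the source.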

Newer systems support frame-by-frame variable refresh rates, but require support at both the output and display ends. On PC monitors, Variable Refresh Rate is still treated as a premium feature for gaming displays (I don't know how available it is on TVs). In theory a video player could use VRR mode to do native 48Hz output, but for disc-based media that would probably require a new generation of hardware and supported modes. With discs in what appears to be terminal decline, the odds of that happening are unlikely IMO. Maybe some streaming platforms will eventually add support for 48Hz content, although if it's a small enough niche they may all decide it's not worth bothering with, even at a future point when VRR is ubiquitous.
 
Upvote
1 (1 / 0)

Barleyman

Ars Tribunus Militum
2,266
Subscriptor++
You are thinking of HDTV standards, including ATSC or official VESA modes, etc., as well as HDMI recommendations - not the HDMI protocol.

Some HDMI paper sheets (PDFs) may rubber-stamp certain modes but, fundamentally, the HDMI logical protocol layer is a bitbucket that can do any refresh rate, if both ends support it. HDMI 1.0 is based on DVI, which supported any fractional refresh rate. Newer specs extended the protocol.

Almost all computers and HDMI computer monitors are able to do it, e.g. 1080p at 75Hz and such. Some TVs, but not all, support these additional modes.

Even old-fashioned HDMI 1.0 was able to unofficially do any Hz within its bandwidth specifications.

I did custom Hz from a PC over HDMI back in 2012! It is up to the display to support it.

Today, you can easily do custom refresh rates from a computer over HDMI via NVIDIA's Custom Resolution feature or the ToastyX CRU EDID-override utility.

In another tech spin, for displays that only support custom refresh rates during VRR, there is an unconventional EDID tweak: you can use a fixed Hz encapsulated inside a VRR transport; some devices support that. I have successfully output 48Hz and 96Hz from a Retrotink 4K to an ordinary LG 4K OLED HDTV.

FWIW, I got odd refresh rates such as 225Hz and 73.5Hz (non-VRR) working over HDMI from my computer - just tested now on an OLED computer monitor. It's as easy as creating the mode via your favourite Custom Resolution Utility (CRU) app that's compatible with your GPU.


Not necessarily.

There are pre-VRR multisync fixed-Hz flat panels that don't support VRR, yet support an analog continuum of vertical refresh rates.

These situations can happen:

1. Display firmware or scaler sticks to the minimum HDMI support despite almost all HDMI chips supporting custom Hz. Out of sync.
2. Display supports custom fixed-Hz non-VRR
3. Display supports VRR (which can be used for varying and non-varying Hz)
4. Display supports only (2) or (3)
5. Display supports both (2) AND (3)
Yup, you can do all kinds of things if the TCON plays ball. Computer monitors are usually not that picky; I've got a freesync-capable 3440x1440p75 monitor in my bedroom that goes down to 30Hz if my memory serves, enabling the full freesync experience although it's not on Nvidia's list (so you have to force it).

I'm familiar with EDID tweaks; I poinked the useless 4096 resolution off the TV's list, as it was confusing games that thought it was the preferred resolution. It can also be handy to remove resolutions you don't care about, so games don't show a bazillion different non-native resolutions.

WRT the HDMI chip supporting variable refresh, that's one thing, but the TCON has to do it as well. There's also supporting and "supporting": I've got a Denon AVR that says it supports all kinds of things, but in reality it glitches all the time if you try to run VRR sources through it. Thankfully, new TVs give you several fast HDMI ports. Beyond the TCON, the lower-level panel hardware needs to play ball too; I can't remember anymore what the actual panel LCD driver is called, it's been a while since I worked on displays.

..

Anyways, the BD spec does not do 48Hz, so if you want your Hobbit on BD, you either need to speed it up to 50Hz or convert it to 60Hz, which is not going to look good. I spent a little time digging into what "soap operas" are actually available for home viewing. I'm happy to report you have three Cameron movies to pick from on Disney+: Titanic (obviously "remastered"), Avatar (also remastered), and Avatar WoW (native!). There's a gotcha, though: you gotta watch them on Apple Vision Pro, or no HFR for you.

The buzzword these days seems to be "TrueCut Motion"; it allows the filmmaker fine control over the framerate, so you can do camera pans at a not-eye-bleeding 48Hz and drop down to 24Hz for static shots. I'm sure some neckbeards would complain, and in fact I think most of the titles are not actually shot in 48Hz, but they do motion interpolation on camera pans and screen it at 48Hz. Apparently Kung Fu Panda 4 was the first feature animation to be presented in HFR, but mostly in China, as the Chinese like their films judder-free. You'd have to be a sleuth to find one in Europe, though; apparently Dolby cinemas are able to do it, but that hardly means they actually do.

Here's a FlatpanelsHD article about 48Hz; it seems it only appeared in HDMI 2.1, and it's optional, not mandated.
https://www.flatpanelshd.com/focus.php?subaction=showfull&id=1734507498
 
Upvote
2 (2 / 0)

The Kaleideion

Wise, Aged Ars Veteran
115
Why can't they just shoot and distribute in 60fps. 24fps isn't enough.
Peter Jackson made some people sick with The Hobbit, and now the rest of us can't have nice things.

Absolutely can’t stand how bad 24FPS looks either at the cinema or in the living room. If I wanted to watch a slideshow, I’d cue up the family photo album.
 
Upvote
6 (10 / -4)

mdrejhon

Ars Praefectus
3,122
Subscriptor
Do you have a high level writeup on how this would work? The only thing that's coming to mind is storing it as 3d rendering instructions; but while that might work in theory for CGI the gap between even a top end consumer GPU and a professional render farm that doesn't need to output in real time has me questioning the feasibility even for deep pocketed videophiles, never mind people trying to watch on their phone.
It's still way too early for high-level writeups, since this is still researcher-grade stuff that's incubating in papers.

Some of the ongoing research hasn't reached preprint yet, but elements of the technologies behind framerateless video are already being actively developed.

If you want high-level writeups, they're in scattered posts, such as this thread by Google researcher @LaurieWired on social media, and others.



But I can confirm that the GPU geometry method is one of the multiple possible methods.

However, I also refer to custom camera sensors:

(More than 6 different possible approaches are being evaluated to pull this off, ranging from the rough equivalent of timecoded photons to other sensor-level tweaks, all deferring the framerate decision to a post-process task and/or real-time playback task and/or distribution prints. Sufficient optimizations have been found that file sizes are predicted to be smaller than a discrete 500fps-1000fps file, making framerateless files a manageable format for the mastering stage.)
There is already an experimental framerateless camera sensor (0.92 megapixels): the IMX 636, developed by Sony x Prophesee. It simply timestamps per-pixel changes in brightness, rather than using a traditional global framerate. You can use three of these sensors, one per R/G/B channel, to get full color frameratelessly into a timestamped firehose. Relevant spec data:
  • Pixel latency @ 1000 lux (μs): <100
  • Pixel latency @ 5 lux (μs): <1000
  • Maximum clock frequency (MHz): 100
So for normal-brightness scenery (outdoors), that's good enough timestamp precision to produce perfect 23.976fps, 24fps, or 25fps from a sensor that's already sampling today. See the PDF datasheet. The per-pixel, event-based nature of the sensor unlocks a ginormous dynamic range, far bigger than that of typical camera sensors:
  • Dynamic Range* (dB): >86 (5 lux – 100 klux) / >120 (80 mlux – 100klux)
Right now, framerateless camera sensor technology (experimental sampling today) is already ahead of the framerateless video file format (not yet extant). Currently, the main difficulty is distilling the firehose into an efficient framerateless video file format.

There are multiple framerateless approaches being researched simultaneously in research circles.
  • Event based sensors (neuromorphic);
    This is the approach most analogous to human visual processing. There are already sensors that do this; see Dynamic Vision Sensor (DVS) on Google Search, which timestamps changes in brightness per pixel instead of using a discrete frame rate.
  • Photon-timestamped sensors;
    Scattered related line-item research (Google Scholar) keywords include "subframe timestamping" and "single-photon imaging" - single-pixel or few-pixel cameras capable of extreme frame rates (a billion frames per second), sufficient for literal photon timestamping. Not usable for framerateless video currently; these are niche subcategories of related research that may someday combine (across multiple related research lines) into a full-frame framerateless sensor with sufficient bandwidth.
  • High sample rate captures into continuous integration
    Example: use brute-force CMOS readout rates, like a 10,000fps sensor, and simply convert that into de facto photon timestamping via ASICs layered directly underneath the camera sensor to tame the firehose - converting classic massively oversampled framerates to a framerateless format in real time, in a reversible way (reliably outputting high-quality low framerates).
  • Temporal neural encoding approaches (camera sensors mimicking retinas + some addons to add fine timestamping) - might be combined with above approach.
  • Waveform encoded light fields;
  • Geometry approaches (yes, your GPU idea)
    Example: encode the whole scene as GPU geometry so detailed that it's perceptually lossless when downconverted to flat 4K, requiring a GPU render in order to play back a video/movie. This can have some benefits for creating better 3D, as well as easier imports into games/interactive content, by using the same camera to produce content for both a movie and a game (etc).
Etc.
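To make the event-based approach concrete, here is a toy sketch (not any real DVS SDK; the event tuple layout is invented for illustration) of how a single timestamped event stream can be rendered at any requested frame rate:

```python
import numpy as np

def frames_from_events(events, h, w, fps, duration_s):
    """Toy DVS-style reconstruction. Each event is (t, y, x, polarity),
    with polarity +1/-1 representing a log-brightness step at time t.
    Integrating events up to each frame time yields an image at *any*
    requested fps -- the frame rate is a playback decision, not a
    property of the capture.
    """
    img = np.zeros((h, w))
    frames = []
    ei = 0
    events = sorted(events)                  # order by timestamp
    for f in range(int(duration_s * fps)):
        t_frame = (f + 1) / fps              # end time of this frame
        while ei < len(events) and events[ei][0] <= t_frame:
            t, y, x, pol = events[ei]
            img[y, x] += pol                 # apply the brightness step
            ei += 1
        frames.append(img.copy())
    return frames
```

The same event list can be passed with fps=24, 25, or 240 and produces each rate natively; no frame ever existed in the file to interpolate between.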

Some of these are way "out there" - like the MEMS fairytale in 1980s Scientific American that eventually became commoditized as motion sensors in smartphones. In the 1980s, I grew up reading articles about MEMS innovation, which sounded very science fiction. But those 1980s fairy tales are now in every single smartphone on the market, as gyroscopes, accelerometers, and magnetometers built into chips with silicon lithography fabrication methods. I simply mention this as an example of "an outlandish concept that eventually became reality," now that researchers are talking about how to pull off framerateless in the laboratory.

Two or three of the above approaches actually look likely to become viable within a decade, while others will be dismissed as too compute-heavy or bandwidth-heavy. Early research is showing massive promise.

While Laurie mentioned ~2038 for framerateless video standardization, I think file format experiments will arrive around 2030, but standardization (H.268, H.269) would probably be closer to 2040.

There are over a dozen researchers talking about framerateless technology, and those are only the ones I've noticed so far.

Even though I grew up with 24fps and still enjoy it (and prefer it for Hollywood, though not for games), it's noteworthy that real life effectively has no discrete frame rate.

The concept of a frame rate, while technologically convenient, is an artificial human invention. It creates various side effects: stroboscopics, motion blur, the wagon-wheel effect, the "fake frame" requirement if you want more frames, conversion inflexibility, loss of quality when converting between frame rates, audio sync problems, and speedups/slowdowns (23.976 vs 24 vs 25). These artifacts necessarily make footage look different from real life, whether intentionally or not (even if you try your best to match real life), such as the stroboscopic effect of finite frame rates, which is still visible even at 500fps.

As time passes, creators want more and more flexibility and control over the source material, driven by booming content consumption (movies, television, streaming, AR, VR, gaming, all at different frame rates). A framerateless master offers the ability to generate perfect Hollywood Filmmaker Mode 24fps (or 25fps, or 23.976fps with no audio reprocessing needed), as well as perfect triple-digit frame rates without classic black-box interpolation, all sourced from the same file.

NOTE: For new thread joiners not following the concept of a framerateless video file format, refer to my earlier big post on Page 1 of the thread. You can still output Hollywood Filmmaker Mode 24fps from a framerateless video file. Or 23.976fps (with no audio slowdown), or 25fps (with no audio speedup), or 120fps without needing your TV's fake-frame feature. You can even change camera shutter speed in post. There are many superpowers a framerateless video file format enables.
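Purely as an illustrative toy (not any proposed format): if a framerateless master stores light as a continuous-time function rather than discrete frames, any distribution frame rate can be rendered by integrating that function over a virtual shutter. The `luminance()` signal, the `render()` helper, and all parameters below are my own hypothetical sketch in Python:

```python
import numpy as np

def luminance(t):
    """Hypothetical continuous-time luminance of one pixel:
    a light source passing by, modeled as a Gaussian pulse."""
    return np.exp(-((t - 0.5) ** 2) / 0.01)

def render(fps, shutter_angle=180, duration=1.0, samples=64):
    """Render discrete frames from the continuous signal.
    A virtual shutter integrates light over (shutter_angle/360)/fps
    seconds per frame, so motion blur becomes a post-process choice."""
    frame_time = 1.0 / fps
    open_time = frame_time * shutter_angle / 360.0
    frames = []
    for n in range(int(duration * fps)):
        t0 = n * frame_time
        ts = np.linspace(t0, t0 + open_time, samples)
        frames.append(luminance(ts).mean())  # average = exposure
    return frames

# The same "master" yields any distribution frame rate:
print(len(render(24)))   # 24 frames
print(len(render(120)))  # 120 frames
```

The key point the sketch makes: 24, 25, 48, or 120fps outputs all sample the same underlying signal, so no pulldown or interpolation step ever happens.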

I guess this reply is a de facto writeup, maybe even one of the first Internet posts to list multiple framerateless approaches on a mainstream forum outside researcher circles. I've posted about framerateless formats in other venues (e.g. "Cinematography of the 2030s" on the RED Camera forums) as early as 2018-2019, but I didn't deep-dive into the technological components as I have in this comment.

I should convert this post into an article on my website (Blur Busters) at some point in 2026.
 
Last edited:
Upvote
9 (9 / 0)
Why would a 120hz screen need to interpolate a 24p source given that 120 is evenly divisible by 24? Am I misunderstanding something?
On a 120Hz OLED, panning at 24fps is more juddery than on an LCD or a projector.

Google "oled movie judder" or "oled 24fps judder" for years of complaints.
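For reference on the quoted question: 120 is evenly divisible by 24, so each film frame simply repeats 5 times and no interpolation is required; the judder described here is the base 24fps sample-and-hold judder, not pulldown cadence judder. A small illustrative Python sketch of the cadence arithmetic (my own, not from any product):

```python
def cadence(display_hz, film_fps):
    """How many refreshes each film frame occupies on a fixed-Hz
    display. When display_hz divides evenly (120Hz for 24fps), every
    frame repeats the same number of times and no interpolation is
    needed; otherwise the repeat count alternates (3:2 pulldown on
    60Hz), adding cadence judder on top of plain 24fps judder."""
    counts, shown = [], 0
    for f in range(film_fps):
        # refresh index where the next film frame should begin
        next_start = ((f + 1) * display_hz) // film_fps
        counts.append(next_start - shown)
        shown = next_start
    return counts

print(set(cadence(120, 24)))  # {5} -> uniform cadence, no interpolation
print(cadence(60, 24)[:4])    # [2, 3, 2, 3] -> 2:3 (a.k.a. 3:2) pulldown
```

So the 120Hz OLED shows 24fps with a perfectly even 5:5:5 cadence; the remaining complaint is that each frame is held crisply for 1/24sec with no pixel-response fade to soften the jump.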
 
Upvote
0 (0 / 0)
Marlor_AU

Ars Tribunus Angusticlavius
7,734
Subscriptor
Absolutely can’t stand how bad 24FPS looks either at the cinema or in the living room. If I wanted to watch a slideshow, I’d cue up the family photo album.
Unfortunately, I don't think there's an answer for cinema projection.

If you're used to 24Hz projection, high frame rates can look "stagey", fake and unnatural... or at worst, cause motion sickness.

If you're used to motion interpolation and high frame rates, 24Hz projection can look jittery... or at worst, cause motion sickness.

Frankly, I'm used to both and can bear with either, but I generally find films look more natural at 24Hz, and that's what I choose when I'm in control.

For home use, allowing users to choose what suits them is fine. But cinemas are in a lose/lose situation. Their existing audience is mostly used to 24Hz projection, but many younger viewers, used to motion interpolation, want faster frame rates. They can't satisfy both.
 
Upvote
3 (4 / -1)

Marlor_AU

Ars Tribunus Angusticlavius
7,734
Subscriptor
Why cant we just start shooting films and stuff in 120hz and be done with it.
24fps is what ... sixties-era tech.
The 24fps standard was established when films moved from silent to synchronized sound.

Most silent films were filmed at 16-20fps. The exact speed was determined by how fast the cameraman cranked the handle of the camera. In fact, cameramen would often undercrank to make the action more intense. It was then up to the cinema projectionist to determine the best speed to play the film back at.

However, when synchronized sound was introduced, a precise frame rate was needed to ensure the audio played back at the correct pitch and with correct synchronization. Western Electric - which developed the Vitaphone system used in the first talkies - landed on 24fps as a good frame rate to use, so this became the standard.

The Jazz Singer launched with 24fps synchronized sound in 1927, and it has been the standard ever since.
 
Upvote
15 (15 / 0)

Larry-Burns

Seniorius Lurkius
35
Subscriptor++
Samsung S90D, to be exact. When it detects that it is connected to a PC laptop via HDMI, it announces that it is "optimizing" the display for best picture quality.

It then locks me into just two viewing modes: Gaming and Enhanced.

When hooked to the PS5, the TV has a much wider range of modes, including Standard, Gaming, and Filmmaker.

If I switch the HDMI port I can sometimes make the TV forget that it is connected to a PC and restore those modes while using it, but it was simpler to just rename the port "Game Console." As soon as the name didn't have "PC" in the title, the full range of options reappeared.

I have a 2020 LG CX and a 2025 S90D. The S90D has the edge on image quality, but the LG CX actually gets out of my way more effectively.
When the TV is connected to your computer, do those settings become available on your PC? Like, can you change them from Control Panel?
 
Upvote
0 (0 / 0)

Polykin

Wise, Aged Ars Veteran
112
Subscriptor
As mentioned by many posters, motion smoothing is an important option for accessibility for medical reasons.

I have epilepsy, and even though it is only slightly flicker-sensitive, discomfort is still triggered by low frame rates. The UFO test becomes epilepsy-uncomfortable within seconds at 30fps or lower, and would send me spiraling within a minute.

EDIT: Removed less relevant paragraphs
 
Last edited:
Upvote
4 (4 / 0)

Demento

Ars Legatus Legionis
15,477
Subscriptor
Is it going to detect what sort of television you have?

For point of reference, I have motion smoothing off on the old LCD television. LCD refresh smear/blurring does an adequate job all on its own of making quick pans not too awful to look at. I also have an OLED screen. OLED pixels have as close to zero response time as is possible. Quick pans look awful without smoothing set to 3/10 or so. There's a "cinema mode" that does basically the same thing, though I'm sure some would be horrified to see "cinema" and motion smoothing next to each other.

Clearly you wouldn't want the same smoothing settings for the two different screen techs.
 
Upvote
3 (3 / 0)

highly-erratic

Wise, Aged Ars Veteran
142
I'm sorry that you have experienced headaches while watching films in a theater. When we break those rules, there are usually other factors that make the calculations moot. Paul Greengrass's frantic editing on the Bourne films he directed is an example of throwing the rules out the window, but there it works for persistence of motion.

And hey, this is all anecdotal on a thread post, but in all the years I have seen films and visual media (spots, shows) projected in a public theater or theater mastering room, I have never once experienced this outside of a server-side issue in a mastering suite, where either the projector and server were not talking to each other correctly or it was rendered improperly for the DCI. Incidentally, I happened to watch the 4K Blu-ray extended edition of RotK this past week, and those pans are buttery smooth to my eye.

To add my anecdote: I'm not sure I have ever seen a 24fps film with smooth pans. The Star Wars films seem particularly egregious examples of stuttery messes I have to look away from.

With some 4k content even small camera movements following a character through a scene can have detail vanish from the scenery, temporarily lost in the motion blur, only to snap back once the camera stops moving. Wish I could unsee that after noticing it.

Everything just looks artificial. Ditching this archaic 100 year old 24fps nonsense can't come soon enough!
 
Upvote
8 (11 / -3)
I've been trying to keep up with the two new formats and have come away with one question. These all seem to be TV-led from what I've read. I'm planning on buying an Apple TV whenever they launch the new model. Does it also have to support these formats, and if so, can they be added in updates if not there at launch? Or is the device agnostic, with the format simply metadata on the video that any device can pass through to the new TVs that support it?
 
Upvote
0 (0 / 0)

Achilles

Ars Scholae Palatinae
941
Subscriptor
Sadly. I hate that.

I hate pulldown judder too. I've been hired to work on the firmware of many home theater video processors (Runco, Faroudja, Key Digital, etc.), and most recently the Retrotink 4K.

Recently, I assisted with the 3:2 pulldown deinterlacer and dejudderer in the Retrotink 4K (a retrogaming video processor). It can even convert 60fps to 48/72/96/120, removing judder from 24fps material.

It's silky smooth if you're playing VHS or DVD material through a Retrotink 4K; that's my contribution. It even supports "pseudo VRR" for LG OLEDs, so you can do 72Hz and 96Hz fixed-Hz via "fixed VRR".

We even have an easter-egg enhancement in the Retrotink 4K. Those TVs don't support custom refresh rates without VRR, so we added a modification that piggybacks a fixed-Hz signal inside a VRR transport, allowing VRR TVs to run at custom fixed refresh rates. You can even optionally add BFI to simulate a 35mm projector's double or triple strobe, if desired (48Hz flicker for 24fps material, simulating a 180-degree shutter, inside a 96Hz transport using software BFI plus fixed-Hz-in-VRR-transport).

Note that the Retrotink 4K was designed for retro boxes (e.g. 8-bit/16-bit consoles), but users are also using it to enhance VHS/DVD material, as well as early 1080i HDTV material (e.g. D-VHS).
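As a rough illustration of what the dejudder step above accomplishes, here is a toy Python sketch of inverse telecine under simplifying assumptions (frames compare exactly; real hardware like the Retrotink 4K must detect cadence on noisy interlaced fields, so this is not its actual algorithm):

```python
def inverse_telecine(frames_60):
    """Reverse a 3:2 pulldown: a 24fps source shown at 60Hz repeats
    frames in a 3,2,3,2,... cadence. Collapsing each run of identical
    frames recovers the 24 unique source frames, which can then be
    re-flashed at 48/72/96/120Hz with a uniform cadence (no judder)."""
    unique = []
    for frame in frames_60:
        if not unique or frame != unique[-1]:
            unique.append(frame)
    return unique

# Simulate one second of 24fps content telecined to 60Hz (3:2 cadence):
telecined = []
for n in range(24):
    telecined += [n] * (3 if n % 2 == 0 else 2)

recovered = inverse_telecine(telecined)
print(len(telecined), len(recovered))  # 60 24
```

Once the 24 unique frames are recovered, outputting them at 120Hz (5 repeats each) or 72Hz (3 repeats each) yields an even cadence with no pulldown judder.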


Low-framerate motion looks harsher on OLEDs than on LCDs, because there's no slow GtG pixel response to provide a micro-fade between refresh cycles. See high-speed video of refresh cycles.

Instead of interpolation, one better approach would be an optional slow-GtG emulator filter for those who dislike the harsh motion of low frame rates on fast-response panels but don't want fake frames.

Imagine the equivalent of a ~2/1000sec scene fade between 24fps frames (which LCDs got "by accident" from their slow 2ms GtG pixel response); it helps a lot to soften the abruptness. It's sort of like a 1/500sec camera shutter, except added display-side via a micro cross-fade between refresh cycles.
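A minimal sketch of that micro cross-fade idea (my own illustration with a hypothetical function, not the actual shader): blend each pixel exponentially from its previous refresh value toward the new one, with a time constant around 2ms:

```python
import math

def gtg_fade(prev, curr, t_since_refresh_ms, tau_ms=2.0):
    """Emulate slow LCD GtG on a fast OLED: instead of snapping to
    the new frame value, exponentially approach it with a ~2ms time
    constant, creating a micro cross-fade between refresh cycles."""
    # fraction of the transition completed after t_since_refresh_ms
    k = 1.0 - math.exp(-t_since_refresh_ms / tau_ms)
    return prev + (curr - prev) * k

# Black-to-white step: shortly after the refresh the pixel is still
# mid-transition, like a real LCD's GtG curve.
print(round(gtg_fade(0.0, 1.0, 2.0), 3))   # 0.632 after one time constant
print(round(gtg_fade(0.0, 1.0, 10.0), 3))  # 0.993, essentially settled
```

Evaluated per pixel per output refresh, this softens the abrupt 24fps frame transitions without inventing any new frames.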

The open-source display shader initiative is currently attempting to address this; one of my works in progress is an LCD GtG simulator shader for OLEDs.

However, even without interpolation, 60fps sometimes looks smoother on LCDs than on OLEDs. This is because at GtG=0, the stutter visibility threshold (for regular stutter, not judder or erratic stutter) equalizes with the flicker fusion threshold, so motion doesn't look smooth on OLEDs until around ~85fps (depending on your individual flicker fusion threshold). There's a great stutter-to-blur animation demo that shows how your eyes and display blend stutter into blur and back at different frame rates.

I think more display-processing-algorithm control should be put into users' hands, with additional options that are more organic.
I don't have anything to contribute other than a thank you. I cherish subject matter experts coming out to play at Ars. We appreciate you!
 
Upvote
7 (7 / 0)
On OLED, the issue is that 24fps panning is simply way more juddery than on an LCD or in a movie theater.
Then why do you choose to break them so much? Basically every movie I've ever seen in a theater has constant visible jumps during panning. Half of LOTR is sweeping vistas and they look terrible and give me a headache.

Only video games have actual smooth panning shots. Movies are uniformly terrible. (48Hz films were fine but everyone else hated them.)
The old films shown in a cinema at 24fps were fine; it's the new digital stuff in theaters that is bad. It was really bad at the beginning of the digital transition, almost unwatchable back then.
 
Upvote
1 (1 / 0)