Don't count on it. HDMI doesn't even do 48Hz, at minimum it'd have to be sped up to 50Hz.
I mean, this all sounds cool, in the way that I'd love to eat cake all day and not have to go to the gym afterwards. But at that point, just render it all in Unreal, because we aren't making movies anymore. And I promise, I'm not a "celluloid or die" person; I worked with the Viper FilmStream and have embraced new tech the whole way. But at some point, y'all, this is just what a film looks like and how it's made. But I'll keep a weather eye out for this.

Long-term, one thing that could pre-empt everything, including this Ars article:
A framerateless video file format!
(Early research is already ongoing! ETA 2030s)
...All frame rates are native in a framerateless file. No discrete frames.
...You can still output perfect Hollywood Filmmaker Mode 24fps out of it.
...Yet, you can also output native 23.976fps or 25fps out of it too, with no audio slowdown/speedup!
...You can still prioritize cinematography at a specific framerate (lighting, pan tables, etc)
...You can have director preferred framerate metadata (e.g. 24fps).
...From the same master video file, you can still export multiple print files at multiple different frame rates (e.g. .MP4 or whatever at 23.976, 24, 25, 48, 50, 59.94, 60, 120, 240, etc) for compatible distribution purposes.
A framerateless file format is metaphorically a temporal RAW format (the temporal equivalent of a spatial RAW format). Many ways to record frameratelessly are being researched: at the sensor level, via codecs, and by other methods. This is superior to black-box interpolators, in that all frame rates are natively built into the same file.
(More than six possible approaches are being evaluated to pull this off, ranging from the rough equivalent of timecoded photons to other sensor-level tweaks, deferring the framerate decision to a post-process task, a real-time playback task, and/or distribution prints. Sufficient optimizations have been found that file sizes are predicted to be smaller than a discrete 500fps-1000fps file, making framerateless files a manageable format for the mastering stage.)
(And... wrong exposure length? Too much blur? Too little blur? Long after you recorded, camera shutter speed can even be edited in post in a framerateless file format, since camera shutter becomes metadata during continuous photon capture into a file. As long as you captured enough photons into the file, you can fix underexposed/overexposed scenes, even without artificial processing if you want to avoid it. Or improve things with algorithms that thrive on the extra firehose of data; many kinds of processing are optional. You can use newly available algorithms that like the extra data to combine the brightness of long exposures with the temporal accuracy of short exposures, to get bright high-speed video out of a framerateless file, if you wanted. But obviously, this is optional.)
And do other framerates like 59.94fps or 60fps, as well as 120fps and 240fps, if the user wished (or the director wished, via "preferred framerate" metadata that can still be overridden by users for accessibility's sake). High and low framerates in one file, and all odd divisors accommodated, because there are no divisors in a framerateless file.
This would be superior to fake frame interpolation, since all frame rates are native frame rates (in an analog continuum) in a framerateless video file. This is in very early talks (for the 2030s), and would solve quite a few problems: no need for integer divisors, no audio speedup/slowdown, and fewer fake-frame problems. A far lesser evil than the current status quo.
There will be some new workflows needed, but you can still prioritize a 24fps workflow with a framerateless file format. This may not affect your generation of cinematographers, but rather the next generation. This is early lab-and-researcher stuff, so it's not really easily googleable yet.
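To make "all frame rates native in one file" concrete, here's a toy Python model (all names hypothetical, not from any real research project): pretend the master holds a continuous-time luminance signal per pixel, and an exporter integrates it over whatever shutter window each output frame needs.

```python
import math

def export_frames(sample_fn, duration_s, fps, shutter_deg=180, samples=32):
    """Toy framerateless exporter: integrate a continuous-time signal
    over each output frame's shutter window (motion blur included).

    sample_fn(t) stands in for the timestamped photon stream a real
    framerateless file would hold; shutter_deg=180 means the shutter
    is open for half of each frame period.
    """
    frame_period = 1.0 / fps
    exposure = frame_period * (shutter_deg / 360.0)
    frames = []
    for k in range(round(duration_s * fps)):
        t0 = k * frame_period
        # Average the signal across the open-shutter interval.
        acc = sum(sample_fn(t0 + exposure * i / samples) for i in range(samples))
        frames.append(acc / samples)
    return frames

# One "capture", three native exports, no interpolation anywhere:
scene = lambda t: 0.5 + 0.5 * math.sin(2 * math.pi * 2.0 * t)  # 2 Hz pulse
f24 = export_frames(scene, 1.0, 24)                   # Filmmaker Mode print
f25 = export_frames(scene, 1.0, 25)                   # PAL print, no speed-up
f60 = export_frames(scene, 1.0, 60, shutter_deg=360)  # shutter re-chosen in post
print(len(f24), len(f25), len(f60))  # 24 25 60
```

The point of the sketch: 23.976, 24, 25, 48, 60 and so on become mere parameters of the export step, and shutter angle becomes an editable knob rather than a baked-in decision.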
Huh. I thought it was just me - my SO doesn't notice anything to do with visual fidelity, so anything I notice I have to run past me, myself, and I to validate.

On OLED, the issue is that 24fps panning is simply far more juddery than on LCD or in a movie theater.
But that requires your TV to support VRR. Likely many/most panels will sync to both 50Hz and 60Hz as a bare minimum, though. No idea about 48Hz; difficult to say. It's close to 50Hz, so maybe the TCON will sync to it, maybe not.

Modern HDMI supports VRR, so it can practically do 48 or 96 Hz.
I'm nearly 100% sure both HDMI and Blu-rays support native 48 fps. Worst case, HDMI and most modern panels support VRR, which certainly goes down to 48 fps.

This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
30Hz is a standard refresh rate. From the actual HDMI org spec, FHD@48Hz isn't a thing, but 4k@48Hz is.

Could you cite your source for that? There were early FHD screens that could only do 30Hz (and manufacturers claiming that 60Hz was impossible, while WUXGA monitors using the same underlying signalling were fine). The original single-link HDMI connector was effectively single-link DVI-D with an audio stream added, and that 13Hz (WQUXGA) panel I referred to was running over DVI; it could hit 48Hz with dual-link and two cables. I don't think the custom EDID entries are limited (even if there are no low-frame-rate default entries; I've not looked, and DisplayID is after my time), and you could always send a random signal to the display and see whether it gets accepted even if it's not in the EDID (often fine with CRTs, and there were CRT monitors that had DVI-D inputs).
I'm less familiar with HDMI than DVI, so maybe I'm about to learn something, but I wasn't aware of any such limit (other than Apple deciding some display modes aren't worthy of being presented).
Because nobody made that a standard; basically there's nothing to view at that framerate, so no one is asking for it. I'm not conversant enough with BD players to guess or know what they'd do encountering an h264 file with 48fps encoding; probably "undefined" covers it.

This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
4k48-60 is a thing on HDMI spec, so theoretically that should work. However, since there literally isn't a single BD with that framerate AFAIK, I wouldn't really count on it.

I'm nearly 100% sure both HDMI and Blu-rays support native 48 fps. Worst case, HDMI and most modern panels support VRR, which certainly goes down to 48 fps.
You are thinking of HDTV standards, including ATSC or official VESA modes, etc., as well as HDMI recommendations, not the HDMI protocol.

Don't count on it. HDMI doesn't even do 48Hz, at minimum it'd have to be sped up to 50Hz.
Not necessarily.

But that requires your TV to support VRR.
I remember waaaay back in the days of 3D Studio 3.0, we used to repeat the 24th frame to get to a smooth 25fps.

NTSC speed-down, anyone? PAL speed-up is when they run 24fps slightly faster for 25fps transmissions, in case you're not familiar. I've seen people claim you can't do that, but dude, you've been watching that content your whole life. Life is just 4% faster in Europe.
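For anyone who wants to check the "life is 4% faster" claim, the PAL speed-up arithmetic is tiny:

```python
import math

film_fps, pal_fps = 24.0, 25.0
speedup = pal_fps / film_fps                 # playback speed factor
print(f"{(speedup - 1) * 100:.2f}% faster")  # 4.17% faster
# Uncorrected audio rises in pitch by:
print(f"{12 * math.log2(speedup):.2f} semitones")
# A 120-minute film shrinks to:
print(f"{120 / speedup:.1f} minutes")        # 115.2 minutes
```

Hence roughly 0.7 of a semitone sharp when broadcasters didn't pitch-correct, which is why un-corrected PAL releases of films sound slightly high.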
@Aurich I noticed direct-to-post permalinks are currently broken.

It's part of why I posted the ergonomic consideration.
Now there's something that could actually use some attention... many European broadcasts/shows are so janky when viewed on platforms delivering 60Hz (or 30p).
@Aurich I noticed direct-to-post permalinks are currently broken.
It semi-works, but when there is an AJAX page-load latency of over 50ms (I'm in Canada), there is a race condition in the code.
Any plans to fix?
Perhaps attach the midpage jump to a setTimeout() inside an .onload event, so the jump happens after the AJAX render plus one pass of the main loop. Or, the hacky way: poll for AJAX-complete, if you are forced to rely on external plugins to render the comments AJAX.
Either way, it doesn't work 90%+ of the time for users in a different city than the web server. It seems to be a race condition triggered by comments-render AJAX latency.
Upon clicking the post permalink, one of three things happens, suggestive of a race condition:
1. It does nothing (goes to top only)
2. It works (jumps to post). Mainly if AJAX is fast.
3. It works then jumps back to top. Mainly if AJAX is slightly lagged.
Usually (2) happens.
Well, our movies were faster, but our games were slower.

NTSC speed-down, anyone? PAL speed-up is when they run 24fps slightly faster for 25fps transmissions, in case you're not familiar. I've seen people claim you can't do that, but dude, you've been watching that content your whole life. Life is just 4% faster in Europe.
I think you may be slightly confused here.

This is what I don't understand. Video games consoles can run games at lower FPS than the 60Hz refresh rate. In fact, the FPS can fluctuate. Not to mention TVs can display 24 FPS from a 60Hz refresh rate. So why can't BluRays display at 48 FPS?
Yup, you can do all kinds of things if the TCON plays ball. Computer monitors are usually not that picky; I've got a FreeSync-capable 3440x1440@75Hz monitor in my bedroom that goes down to 30Hz, if my memory serves, enabling the full FreeSync experience although it's not on Nvidia's list (so you have to force it).

You are thinking of HDTV standards, including ATSC or official VESA modes, etc., as well as HDMI recommendations, not the HDMI protocol.
Some HDMI paper sheets (PDFs) may rubber-stamp certain modes but, fundamentally, the HDMI logical protocol layer is a bit bucket that can do any refresh rate, if both ends support it. HDMI 1.0 is based on DVI, which supported any fractional refresh rate. Newer specs extended the protocol.
Almost all computers and HDMI computer monitors are able to do it, e.g. 1080p at 75Hz and such. Some TVs, though not all, support these additional modes.
Even old fashioned HDMI 1.0 was able to unofficially do any Hz within its bandwidth specifications.
I did custom Hz from a PC over HDMI in year 2012! It is up to the display to support it.
Today, you can easily do custom refresh rates from a computer over HDMI via NVIDIA's Custom Resolution feature or the ToastyX CRU EDID-override utility.
In another tech spin, for displays that only support custom refresh rates during VRR, there is an unconventional EDID tweak: you can use a fixed Hz encapsulated inside a VRR transport, which some devices support. I have successfully output 48Hz and 96Hz from a Retrotink 4K to an ordinary LG 4K OLED HDTV.
FWIW, I got odd refresh rates such as 225Hz and 73.5Hz (non-VRR) working over HDMI from my computer - just tested now to an OLED computer monitor. It's as easy as creating it via your favourite Custom Resolution Utility (CRU) app that is compatible with your GPU.
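As a sanity check on the bandwidth point above: HDMI 1.0's TMDS pixel clock tops out at 165 MHz, and the standard CEA-861 1080p raster is 2200x1125 total (blanking included), so whether a custom refresh rate "fits" is just a multiplication:

```python
# Does a 1080p mode fit within HDMI 1.0's 165 MHz pixel clock?
# 2200 x 1125 is the CEA-861 total raster (active 1920x1080 + blanking).
H_TOTAL, V_TOTAL = 2200, 1125
HDMI_1_0_LIMIT_MHZ = 165.0

def pixel_clock_mhz(hz, h_total=H_TOTAL, v_total=V_TOTAL):
    return h_total * v_total * hz / 1e6

for hz in (24, 48, 50, 60, 75):
    clk = pixel_clock_mhz(hz)
    verdict = "fits" if clk <= HDMI_1_0_LIMIT_MHZ else "needs a later HDMI spec"
    print(f"1080p@{hz}Hz -> {clk:.1f} MHz ({verdict})")
```

1080p@48 needs only ~118.8 MHz, well inside even the original spec's budget; the gatekeeper was always the display's TCON/EDID, not the cable protocol.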
Not necessarily.
There are pre-VRR, multisync, fixed-Hz flat panels that don't support VRR but do support an analog continuum of vertical refresh rates.
These situations can happen:
1. Display firmware or scaler sticks to the minimum HDMI support despite almost all HDMI chips supporting custom Hz. Out of sync.
2. Display supports custom fixed-Hz non-VRR
3. Display supports VRR (which can be used for varying and non-varying Hz)
4. Display supports only (2) or (3)
5. Display supports both (2) AND (3)
Peter Jackson made some people sick with the Hobbit and now the rest of us can’t have nice things.

Why can't they just shoot and distribute in 60fps. 24fps isn't enough.
It's still way too early for high-level writeups, since this is researcher-grade stuff that is still incubating papers.

Do you have a high level writeup on how this would work? The only thing that's coming to mind is storing it as 3d rendering instructions; but while that might work in theory for CGI, the gap between even a top-end consumer GPU and a professional render farm that doesn't need to output in real time has me questioning the feasibility even for deep-pocketed videophiles, never mind people trying to watch on their phone.
There is already an experimental framerateless camera sensor (0.92 megapixels), the IMX 636, developed by Sony x Prophesee. It simply timestamps (per-pixel) changes in brightness, rather than using a traditional global framerate. You can use three of these sensors, one per R/G/B channel, to get full color frameratelessly into a timestamped firehose. Relevant spec data:

(More than 6 different possible approaches to this, are being evaluated to pull this off, ranging from the rough equivalent of timecoded photons to other sensor-level tweaks, to delay the framerate requirement to a post-process task and/or real-time playback task and/or distribution prints. Sufficient optimizations have been found that file sizes are predicted to be smaller than a discrete 500fps-1000fps file, making framerateless files a manageable format for mastering stage)
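To give a feel for the data such a sensor produces, here's a minimal model of an event stream and a naive integration into an image over an arbitrary time window (the tuple layout is illustrative only, not the IMX 636's actual output format):

```python
# Toy event-camera stream: (x, y, t_us, polarity) means pixel (x, y)
# got brighter (+1) or darker (-1) at timestamp t_us. No global framerate.
# (Illustrative layout, not the IMX 636's real event format.)

def accumulate(events, width, height, t_start_us, t_end_us):
    """Integrate events inside [t_start, t_end) into a brightness-delta
    image: the 'choose your exposure window in post' idea."""
    img = [[0] * width for _ in range(height)]
    for x, y, t_us, pol in events:
        if t_start_us <= t_us < t_end_us:
            img[y][x] += pol
    return img

events = [
    (0, 0, 100, +1),   # pixel (0,0) brightens at t = 100 us
    (1, 0, 250, +1),
    (0, 0, 900, -1),   # ...then (0,0) darkens later
]

# Two different "shutters" applied after the fact to the same stream:
short = accumulate(events, 2, 1, 0, 500)     # [[1, 1]]
longer = accumulate(events, 2, 1, 0, 1000)   # [[0, 1]]
print(short, longer)
```

Real event-based reconstruction is far more sophisticated, but the key property is visible even here: the exposure window is a query over timestamps, not a property of the recording.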
On 120Hz OLED, panning at 24fps is more juddery than on LCD or a projector.

Why would a 120hz screen need to interpolate a 24p source given that 120 is evenly divisible by 24? Am I misunderstanding something?
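For context on the divisibility point: at 120Hz each 24fps frame is held for exactly 5 refreshes, while 60Hz forces the uneven 3:2 cadence, and non-multiples like 25-on-60 are worse still. A quick way to compute the hold pattern for any fps/Hz pair:

```python
from math import floor

def cadence(fps, hz, n=8):
    """Number of refresh cycles each source frame is held on screen."""
    ratio = hz / fps
    return [floor((k + 1) * ratio) - floor(k * ratio) for k in range(n)]

print(cadence(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even, judder-free
print(cadence(24, 60))   # [2, 3, 2, 3, ...]        -> 3:2 pulldown judder
print(cadence(25, 60))   # uneven mix of 2s and 3s  -> PAL-on-60Hz jank
```

The OLED complaint stands separately from this: even a perfectly even 5x hold looks harsher on OLED, because there's no slow GtG response fading between refreshes.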
You can, when upping the fps only for certain scenes, like panning scenes.

The soap opera effect is the appearance of the movie having been filmed with a high frame rate (like soap operas often are). You can't have frame interpolation and avoid it.
Unfortunately, I don't think there's an answer for cinema projection.

Absolutely can’t stand how bad 24FPS looks either at the cinema or in the living room. If I wanted to watch a slideshow, I’d cue up the family photo album.
Twenties.

Why can't we just start shooting films and stuff in 120Hz and be done with it.
24fps is what ... sixties-era tech.
The 24fps standard was established when films moved from silent to synchronized sound.

Why can't we just start shooting films and stuff in 120Hz and be done with it.
24fps is what ... sixties-era tech.
When the TV is connected to your computer, do those settings become available on your PC? Like, can you change them from Control Panel?

Samsung S90D, to be exact. When it detects that it is connected to a PC laptop via HDMI, it announces that it is "optimizing" the display for best picture quality.
It then locks me into just two viewing modes: Gaming and Enhanced.
When hooked to the PS5, the TV has a much wider range of modes, including Standard, Gaming, and Filmmaker.
If I switch the HDMI port I can sometimes make the TV forget that it is connected to a PC and restore those modes while using it, but it was simpler to just rename the port "Game Console." As soon as the name didn't have "PC" in the title, the full range of options reappeared.
I have a 2020 LG CX and a 2025 S90D. The S90D has the edge on image quality, but the LG CX actually gets out of my way more effectively.
Can I ask what movies you prefer it on?

You say that, but I certainly prefer it on certain genres.
Point being - it's my TV and my content being displayed. If I want this stuff enabled, it shouldn't be disabled by some stupid half-baked embedded tech.
It's 4%, just like it's not 0.02 cents ;-)

For 25p in 60Hz, this is usually done by just slowing and conforming the whole master down by .04%
I'm sorry that you have experienced headaches while watching films in a theater. When we break those rules, there are usually other factors that make the calculations moot. Paul Greengrass's frantic editing on the Bourne films he directed is an example of throwing those rules out the window, but there it works for persistence of motion.
And hey, this is all anecdotal on a thread post, but in all the years I have seen films and visual media (spots, shows) projected in a public theater or theater mastering room, I have never once experienced this outside of a server-side issue in a mastering suite, where either the projector and server were not talking to each other correctly or it was rendered improperly for the DCI. Incidentally, I happened to watch the 4K Blu-ray extended edition of RotK this past week and those pans are buttery smooth to my eye.
Depends on your definition of "decent"; not everything supports variable framerates. Not even nearly everything.

Any decent TV can match the frame rate of the source. We don't need a new standard! This is the most ridiculous thing I've heard.
or just do 120Hz

The true fix would be variable refresh-rate TV. It should not be hard,
I don't have anything to contribute other than a thank you. I cherish subject matter experts coming out to play at Ars. We appreciate you!

Sadly. I hate that.
I hate pulldown judder too. I've been hired to work on the firmware of many home theater video processors (Runco, Faroudja, Key Digital etc), and most recently Retrotink 4K.
Recently, I assisted with the 3:2 pulldown deinterlacer + dejudder in Retrotink 4K (retrogaming video processor). It can even convert 60fps to 48/72/96/120 removing judder from 24fps material.
Silky smooth if you're playing VHS or DVD material through a Retrotink 4K -- that's my contribution. It even supports "pseudo VRR" for LG OLEDs, so you can do 72Hz and 96Hz fixed-Hz via "fixed VRR".
We even have an easter egg enhancement to the Retrotink 4K video processor. Those TVs don't support custom refresh rates without VRR, so we added a modification to piggyback a fixed-Hz inside a VRR transport, to allow VRR TVs to work at custom fixed-Hz refresh rates. You can even, optionally, add BFI to simulate 35mm projector double strobe or triple strobe, if desired (48Hz flicker for 24fps material simulating 180-degree shutter, inside a 96Hz transport using software-BFI + fixed-Hz-in-VRR-transport)
Note that the Retrotink 4K was designed for retro boxes (e.g. 8-bit/16-bit consoles), but users are also using it to enhance VHS/DVD material, as well as early 1080i HDTV material (e.g. D-VHS).
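For the curious, the core of 3:2-pulldown removal plus retiming to a multiple of 24 is simple in principle. This sketch assumes a perfectly clean cadence; real firmware has to lock onto the cadence phase and survive edits and broken patterns:

```python
def inverse_telecine(frames_60):
    """Collapse 3:2-pulldown frame repeats back to the unique film frames.
    Assumes a clean cadence with no cuts mid-pattern."""
    unique = []
    for f in frames_60:
        if not unique or f != unique[-1]:
            unique.append(f)
    return unique

def retime(film_frames, multiple):
    """Hold each 24fps frame evenly at 24*multiple Hz (72/96/120...)."""
    return [f for f in film_frames for _ in range(multiple)]

# Four film frames telecined to 60Hz with a 3:2 cadence:
telecined = ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
film = inverse_telecine(telecined)   # back to ['A', 'B', 'C', 'D'] at 24fps
smooth72 = retime(film, 3)           # every frame held 3x -> judder-free 72Hz
print(film, len(smooth72))
```

Comparing raw frames for equality also breaks on static shots and noise, which is one reason real dejudder hardware tracks the cadence phase rather than deduplicating pixels.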
Low-framerate motion on OLEDs looks harsher than on LCDs, because there's no slow GtG pixel response to create a micro-fade between refresh cycles. See high-speed video of refresh cycles.
Instead of interpolation -- one better approach would have been to use an optional slow-GtG emulator filter for those who dislike the harsh motion of "low-framerates-on-fast-response", but don't want fake frames.
Imagine a ~2/1000sec scene fade (a happy "accident" of slow 2ms GtG pixel response) between 24fps frames; it apparently helps a lot to soften the abruptness. It's sort of like a 1/500sec camera shutter, except added display-side via a micro cross-fade between refresh cycles.
The open source display shader initiative currently attempts to address this -- One of my works in progress is an LCD GtG simulator shader for OLEDs.
However, even without interpolation, 60fps sometimes looks smoother on LCDs than on OLEDs. This is because at GtG=0, stutter visibility (regular stutter; zero judder/erratics) equalizes with the flicker fusion threshold. So motion doesn't look smooth on OLEDs until around ~85fps (depending on your individual flicker fusion threshold). There's a great stutter-to-blur animation demo that shows how your eyes/display blend stutter into blur and back at different frame rates.
I think more display-processing-algorithm control should be put into user's hands, for additional options that are more organic.
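The LCD GtG simulator described above is, at its core, a first-order low-pass applied per pixel across refresh cycles. Here's a rough sketch of just the math; the 2ms time constant and the sub-refresh sampling are illustrative, and a real shader works per channel against the panel's actual response curve:

```python
import math

def gtg_simulate(frame_values, refresh_hz=120, tau_ms=2.0, substeps=4):
    """Emulate slow LCD GtG on an instant (GtG~0) panel: the displayed
    level exponentially approaches each new frame's value instead of
    snapping, giving a micro cross-fade between refresh cycles."""
    dt_ms = 1000.0 / refresh_hz / substeps
    alpha = 1.0 - math.exp(-dt_ms / tau_ms)   # per-substep blend factor
    out, level = [], frame_values[0]
    for target in frame_values:
        for _ in range(substeps):
            level += alpha * (target - level)  # ease toward the new frame
            out.append(level)
    return out

# A harsh 0 -> 1 frame change becomes a short ramp instead of a snap:
trace = gtg_simulate([0.0, 1.0, 1.0, 1.0])
print([round(v, 3) for v in trace])
```

The appeal over interpolation is that no frames are invented; the filter only softens the transition edge, much like the accidental fade a 2ms-GtG LCD gives you for free.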
On OLED, the issue is that 24fps panning is simply far more juddery than on LCD or in a movie theater.
The old films shown in a cinema at 24fps were fine; it is the new digital stuff in the theaters that is bad. Really bad at the beginning of the digital transition. Almost unwatchable back then.

Then why do you choose to break them so much? Basically every movie I've ever seen in a theater has constant visible jumps during panning. Half of LOTR is sweeping vistas and they look terrible and give me a headache.
Only video games have actual smooth panning shots. Movies are uniformly terrible. (48Hz films were fine but everyone else hated them.)