Motion smoothing has a bad reputation among most cinephiles, as well as many home theater enthusiasts and content creators. Also known as motion or video interpolation, motion smoothing is available in virtually every modern TV today. It’s supposed to remove judder from films and TV shows shot at 24p (24 frames per second) or 25p when they’re displayed on 60Hz or 120Hz TVs. But motion smoothing often results in the dreaded soap opera effect and unwanted visual artifacts.
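To see where that judder comes from, consider the arithmetic of putting 24p film on a 60Hz panel. The sketch below is a back-of-the-envelope illustration in Python, not any TV's actual pipeline: 60 is not an integer multiple of 24, so the classic "3:2 pulldown" has to hold alternating film frames for unequal durations.

```python
# Purely illustrative sketch of 24p-on-60Hz judder (hypothetical, not a real TV pipeline).

FILM_FPS = 24
DISPLAY_HZ = 60

# 60 / 24 = 2.5 refreshes per film frame -- not a whole number, so frames
# can't all be held for the same duration. Classic 3:2 pulldown alternates
# holding each frame for 3 refreshes, then 2.
cadence = [3 if i % 2 == 0 else 2 for i in range(FILM_FPS)]
assert sum(cadence) == DISPLAY_HZ  # 12*3 + 12*2 = 60 refreshes per second

for frame, repeats in enumerate(cadence[:4]):
    hold_ms = repeats * 1000 / DISPLAY_HZ
    print(f"film frame {frame}: held for {repeats} refreshes ({hold_ms:.1f} ms)")
# Holds alternate between 50.0 ms and 33.3 ms; that uneven motion is judder.
```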
Two upcoming HDR standards, HDR10+ Advanced and Dolby Vision 2, are looking to change how we perceive motion smoothing and more closely align motion interpolation with a creator’s vision. However, it’s unclear if these standards can pull that off.
HDR10+ Advanced’s Intelligent FRC
Today, Samsung provided details about the next version of the HDR10+ format, which introduces six new features. Among HDR10+ Advanced’s most interesting features is HDR10+ Intelligent FRC (frame rate conversion), which is supposed to improve motion smoothing.
A TV using motion smoothing analyzes each video frame and tries to determine what additional frames would look like if the video were playing at a frame rate that matched the TV’s refresh rate. The TV then inserts those frames into the video. A 60Hz TV with motion smoothing on, for example, would attempt to remove judder from a 24p film by inserting frames so that the video plays as if it were shot at 60p. For some, this appears normal and can make motion, especially camera panning or zooming, look smoother. However, others report that movies and shows look more like soap operas, as if they were shot on higher-frame-rate video cameras instead of film cameras. Critics, including some big names in Hollywood, argue that motion smoothing looks unnatural and deviates from the creator’s intended vision.
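To make the conversion step concrete, here is a deliberately naive Python sketch of 24p-to-60p frame rate conversion using simple linear blending. It is purely illustrative, and the function and variable names are invented for this example; real TV FRC engines instead estimate per-block motion vectors and warp pixels along them, which is both how they achieve smoothness and where many of their artifacts come from.

```python
# Naive 24p -> 60p frame rate conversion via linear blending (illustrative only;
# names are hypothetical, and real FRC engines use motion-vector interpolation).
import numpy as np

def interpolate_to_60p(frames_24p: list[np.ndarray]) -> list[np.ndarray]:
    """Resample one second of 24p frames to 60p by blending neighboring frames."""
    out = []
    for i in range(60):                      # 60 output frames per second
        t = i * 24 / 60                      # position on the 24p timeline
        a = int(t)                           # nearest earlier source frame
        frac = t - a                         # how far toward the next frame
        b = min(a + 1, len(frames_24p) - 1)
        # Weighted average of the two nearest source frames. Blending produces
        # ghosting rather than the "soap opera" sharpness of motion-vector
        # interpolation, but the resampling arithmetic is the same.
        blended = (1 - frac) * frames_24p[a] + frac * frames_24p[b]
        out.append(blended)
    return out

# Example: 24 dummy 4x4 grayscale frames (one second of 24p) -> 60 output frames.
src = [np.full((4, 4), k * 10.0, dtype=np.float32) for k in range(24)]
print(len(interpolate_to_60p(src)))  # 60
```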

Yes, it is apparently happening more with younger people than older ones. Here's why:
A minority of people can't stand 24fps, and that percentage is bigger among younger people than older people, because their brains are no longer being conditioned to tolerate low frame rates on big screens.
TL;DR Reason: Fewer movie theater visits, more TVs with interpolation on by default, heavy exposure to high frame rate gaming, continual exposure to higher refresh rate screens, and other things like that conspire to reduce the 24fps acclimation factor.
While it's still a minority of the population (e.g. 1% vs 5%; exact numbers are still unknown, but it's a big gap)...
...a very clear pattern is definitely emerging. It is noticeably bigger with the younger generation now than with, say, the Gen X or boomer generations.
In my experience, more 24fps discomfort seems to be occurring with the new generation, who did not grow up with low frame rates or interpolation-free TVs. While 24fps on a phone is usually no problem, large-FOV 24fps (big-screen TVs, theaters) seems to trigger motion sickness in more of the younger generation than the older generation.
Many also haven't gone to movie theaters often enough to acclimate to Hollywood Filmmaker Mode. While I am not a filmmaker, I'm considered an industry expert on this topic (see earlier Google Scholar link). The pandemic lockdowns did not help, acclimating many people to interpolation-riddled TVs.
I make it a point to manually configure my TV to Hollywood Filmmaker Mode, as I usually prefer the original 24fps for my movies, even though I prefer extreme frame rates for my videogames/VR. But I have noticed that these accessibility considerations come up more and more, due to reduced exposure and acclimation to large-FOV 24fps as of late.
It's really annoying -- but this is a data point in my business.
Most famous filmmakers are older than 40, so there's a slight ergonomics generation gap.
It's a pretty muddy rabbit hole. Ugh.