Whatever happened to 4K? The rise of “Ultra HD” TV


xoa

Ars Legatus Legionis
12,392
Subscriptor
Topevoli":2ydfr7m8 said:
When do we get uncompressed HD? Most "HD" stations look like utter crap. I shouldn't be seeing artifacts at the price they charge for cable/FiOS.
Never, because that would be dumb. Visual transparency can be achieved for most 1080p sources at somewhere between 10 and 20 Mbps with Hi10p H.264 done with a good encoder (x264) by someone who knows what they're doing. Even MPEG-2 becomes effectively transparent at around 40-50 Mbps.

As BananaBonanza said, the issue is that most HD stations are greedy and/or run by morons: they bit-starve their streams to cram in as many channels as possible while simultaneously using pathetic encoding. Compression is good.
BananaBonanza":2ydfr7m8 said:
Compression is fine, we'd just need slightly higher bitrates... :(
It'd help if they had competent encoders too, or for that matter even used up-to-date stuff. I've seen plenty of "HD" broadcasts that still use MPEG-2, never mind H.264 High Profile. The stations don't want to spend the money to change that, though, and to an extent it could be worse. People who really care will probably just get the Blu-rays later anyway (or, in a dream future, might be able to buy full-quality MKVs online).
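To put those numbers in perspective, here's a rough back-of-the-envelope sketch; the 8-bit 4:2:0 assumption and the specific bitrates are only illustrative:

```python
# Rough comparison of uncompressed 1080p vs. typical compressed bitrates.
# Assumes 8-bit 4:2:0 video (~1.5 bytes per pixel), purely for illustration.

def uncompressed_mbps(width, height, fps, bytes_per_pixel=1.5):
    """Raw video bitrate in megabits per second."""
    return width * height * bytes_per_pixel * fps * 8 / 1e6

def gb_per_hour(mbps):
    """Storage needed for one hour of video at a given bitrate."""
    return mbps * 3600 / 8 / 1000

raw = uncompressed_mbps(1920, 1080, 24)  # ~600 Mbps for plain 1080p24
print(f"Uncompressed 1080p24: {raw:.0f} Mbps (~{gb_per_hour(raw):.0f} GB/hour)")

for label, mbps in [("x264 Hi10p, 'transparent'", 15), ("MPEG-2, 'transparent'", 45)]:
    print(f"{label}: {mbps} Mbps (~{gb_per_hour(mbps):.1f} GB/hour)")
```

Uncompressed 1080p works out to hundreds of GB per hour, which is why nobody is ever going to broadcast it; the point is that a competently encoded 10-20 Mbps stream already gets you to "can't tell the difference".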
 
Upvote
7 (7 / 0)

inpher

Well-known member
3,697
cervier":aeg1w2qq said:
VashTheStampede":aeg1w2qq said:
We need more than just higher resolutions for TVs. The Hobbit is being shown at 48 fps, while the Avatar sequels will be 60 fps. People *can* see the differences, and many will prefer the higher frame rates for their programming.

Whatever the updated delivery mechanism (physical or streaming) for these higher resolutions turns out to be, it should allow for higher frame rates as well.

I thought the human eye couldn't see the difference above a frame rate of 25-30 fps? In the case of The Hobbit, I think the 48 fps is for the 3D.

There are three things (possibly more, but three that I can remember) to consider with regard to the classic choice of frame rate.

Cost and weight of film. Human perception and persistence of vision. And movement on film.

They are all related in this context. 24 frames per second was chosen because it was the lowest frame rate at which enough people did not experience any negative effects from the projector's frame flicker/strobing. It was optimised this way for cost and logistics: 30 frames per second would cost 25% more and weigh 25% more than 24 fps, so the lower rate made both accountants and film handlers happy. You can go as low as 12-15 fps while still retaining persistence of vision, but it is not going to look good.

The third thing is that cinematography has been somewhat limited in what it can do with 24 fps. For example, panning and action scenes have never really looked good unless the pan is done either very slowly or extremely fast; the middle ground has been avoided because it produces too many obvious artefacts. A higher frame rate will allow for more types of camera work.

…eh, I think the RED guys can explain it better than I can; I recommend this link.
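To make the cost/weight point concrete, here's a tiny sketch. It assumes standard 4-perf 35mm stock at 16 frames per foot and only compares footage consumed, not actual prices or weights:

```python
# Film consumed per minute of runtime at different frame rates.
# Assumes standard 4-perf 35mm film, which runs 16 frames per foot.
FRAMES_PER_FOOT = 16

def feet_per_minute(fps):
    """Feet of 35mm film needed for one minute of footage."""
    return fps * 60 / FRAMES_PER_FOOT

for fps in (24, 30, 48, 60):
    ft = feet_per_minute(fps)
    print(f"{fps:2d} fps: {ft:6.1f} ft/min ({ft / feet_per_minute(24):.2f}x the stock of 24 fps)")
```

90 feet per minute at 24 fps versus 112.5 at 30 fps is exactly that 25% the accountants cared about.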
 
Upvote
11 (11 / 0)

ScifiGeek

Ars Legatus Legionis
19,001
lepoete73":2ekc8tsm said:
I think the term 4K is confusing. Until very recently, I thought that 4K meant a resolution of around 4,000 lines; since we always refer to current resolutions by the number of lines (720p, 1080p), I simply assumed it also referred to the number of lines and not to 4 times the area in pixels, as is actually the case.


Actually, you were closer to correct the first time. It is NOT about 4 times the area.

It is about being 4K pixels (approx.) across:

The actual digital cinema standards are:

2K: 2048x1080 (1920x1080 at home)
4K: 4096x2160 (3840x2160 at home)
8K: 8192x4320 (7680x4320 at home)

Perhaps they are worried about some nitwit suing them for false advertising because the 4K screens for the home are not actually 4K pixels across?
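Purely as an illustration of where the names come from, using the standard dimensions listed above (the "home" labels are just shorthand):

```python
# The "K" names refer to the approximate horizontal pixel count,
# not directly to multiples of 1080p's area (even though the pixel count
# does happen to quadruple with each step).
formats = {
    "2K DCI":      (2048, 1080),
    "1080p home":  (1920, 1080),
    "4K DCI":      (4096, 2160),
    "UHD home":    (3840, 2160),
    "8K DCI":      (8192, 4320),
    "8K UHD home": (7680, 4320),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    print(f"{name:12s} {w}x{h}: ~{w / 1000:.1f}K across, "
          f"{w * h / hd_pixels:.1f}x the pixels of 1080p")
```

The consumer panels land at ~3.8K across, which is exactly the "not actually 4K pixels across" quibble.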
 
Upvote
2 (3 / -1)

alxx

Ars Praefectus
4,997
Subscriptor++
aardman":3plxcagx said:
I predict 4K or ultra HD or whatever else it is called will never catch on. Very few people want their living rooms dominated by a monstrous screen. And even fewer people watch a movie and focus on the weave of the leading lady's knickers or the grain on the protagonist's leather sofa.

It's coming whether we want it or not.

I just wish they'd hurry up and get all this screen tech into PC monitors.
I want a full-aperture 4K-capable monitor (4096 × 3112), though I'd happily settle for 4096 × 2160.
 
Upvote
3 (3 / 0)
aardman":10p7e0ua said:
I predict 4K or ultra HD or whatever else it is called will never catch on. Very few people want their living rooms dominated by a monstrous screen. And even fewer people watch a movie and focus on the weave of the leading lady's knickers or the grain on the protagonist's leather sofa.

I pretty much focus exclusively on the leading lady's knickers.
 
Upvote
7 (7 / 0)
xoa":2jffyi1g said:
Topevoli":2jffyi1g said:
When do we get uncompressed HD? Most "HD" Stations looks like utter crap. I shouldn't be seeing artifacts at the price they charge for cable/fios.
Never, because that would be dumb. Visual transparency can be achieved for most sources of 1080p at somewhere between 10-20 Mbps with Hi10p H.264 done with a good encoder (x264) by someone who knows what they're doing. Even MPEG2 becomes pretty lossless towards 40-50 Mbps.

As BananaBonanza said, the issue is that most HD Stations are morons/greedy and bit starve their streams in order to pump out as much as possible while simultaneously using pathetic encoding. Compression is good.
BananaBonanza":2jffyi1g said:
Compression is fine, we'd just need slightly higher bitrates... :(
It'd help if they had competent encoders too, or for that matter even used up-to-date stuff. I've seen plenty of "HD" broadcasts that still use MPEG2, let alone H.264 high profile. The stations don't want to spend the money though to change that, and to an extent it could be worse. People who really care will probably just get the Blu-rays later anyway (or in a dream future might be able to buy full quality normal MKVs online).


MKV is pretty much the only acceptable format for me these days. I had a couple of files with bad audio, and I was able to trivially extract it and put a new track in without conversion.
 
Upvote
1 (2 / -1)

alxx

Ars Praefectus
4,997
Subscriptor++
xoa":3k5y0mz7 said:
Topevoli":3k5y0mz7 said:
When do we get uncompressed HD? Most "HD" Stations looks like utter crap. I shouldn't be seeing artifacts at the price they charge for cable/fios.
Never, because that would be dumb. Visual transparency can be achieved for most sources of 1080p at somewhere between 10-20 Mbps with Hi10p H.264 done with a good encoder (x264) by someone who knows what they're doing. Even MPEG2 becomes pretty lossless towards 40-50 Mbps.

As BananaBonanza said, the issue is that most HD Stations are morons/greedy and bit starve their streams in order to pump out as much as possible while simultaneously using pathetic encoding. Compression is good.
BananaBonanza":3k5y0mz7 said:
Compression is fine, we'd just need slightly higher bitrates... :(
It'd help if they had competent encoders too, or for that matter even used up-to-date stuff. I've seen plenty of "HD" broadcasts that still use MPEG2, let alone H.264 high profile. The stations don't want to spend the money though to change that, and to an extent it could be worse. People who really care will probably just get the Blu-rays later anyway (or in a dream future might be able to buy full quality normal MKVs online).

.red is an option; supposedly 2.5 MB a second for streaming 4K (currently up to 4096 by 2160).

Posting the links again so you don't have to go back a page in the comments:

http://www.reduser.net/forum/showthread ... RAY-ODEMAX
http://www.red.com/news/red-announces-p ... ith-odemax
http://odemax.com/information.html
 
Upvote
0 (1 / -1)

DelvenDarcaine

Wise, Aged Ars Veteran
141
For me, one of the things HD brought that made upgrading from SD exciting was the widescreen format (plus the space-saving flatness). Since I already have a widescreen flat TV, just bumping up the resolution isn't nearly as exciting as the jump from my old CRT TV was.

I have no doubt that I will eventually have an Ultra HDTV, but I won't be upgrading because of it the way I did for HDTV.
 
Upvote
2 (2 / 0)

crhilton

Ars Tribunus Militum
2,304
inpher":1cvoymzq said:
cervier":1cvoymzq said:
VashTheStampede":1cvoymzq said:
We need more than just higher resolutions for TVs. The Hobbit is being shown at 48FPS, while Avatar sequels will be 60FPS. People *can* see the differences and many will prefer the higher frame rates on their programming.

Whatever the updated delivery mechanism (physical or streaming) to these higher resolutions should also include higher frame rate possibilities as well.

I thought that human eyes were not able to see the difference with framerate over 25-30 fps? In the case of the Hobbit I think the 48 fps is for 3D.

There are three things (possibly more, but three that I can remember) to consider with regards to the classic choice of frame rate.

Cost and weight of film. Human perception and persistence of vision. And movement on film.

They are all related in this context. The choice of 24 frames per second was chosen because it was the lowest frame rate at which enough people did not experience any negative effects of the frame/flicker/strobing film projector. It was optimised this way because of cost and logistics. 30 frames per second would be 25% more expensive and weigh 25% more than 24 fps, something which made both accountants, and film handlers happy. You can go as low as 12-15 fps while still retaining persistence of vision, but it is not going to look good.

The third thing is that cinematography has been sort of limited in what it can do with these 24 fps, for example panning and action scenes have never really looked good unless the panning is done very slowly or extremely fast, the middle ground has been avoided because it has had too many obvious artefacts. A higher frame rate will allow for more types of camera work.

…eh, I think the RED guys can explain better than I do, I recommend this link.


I know quite a few people who get motion sickness watching video games, ostensibly because of all the camera motion. They can watch action movies, though. If filmmakers made more fast camera pans in action flicks, I wonder whether there would be a lot more people who couldn't watch those movies.
 
Upvote
0 (3 / -3)
aardman":1bn9n4mp said:
I predict 4K or ultra HD or whatever else it is called will never catch on. Very few people want their living rooms dominated by a monstrous screen. And even fewer people watch a movie and focus on the weave of the leading lady's knickers or the grain on the protagonist's leather sofa.
Wow. This is what all my elderly relatives deep in the Appalachians said about HD in 2005 (without the bit about the knickers).
Now even they have HD, and most are even mainlining it straight into their smart 60"+ 1080p sets via Netflix/Hulu/Vudu/Amazon/wtfbbq.
I am currently under contract to Sony, Samsung, and Vizio to do in-home warranty repairs on flat screens and computers as a major part of my business, and I assure you that almost everybody does, in fact, want the most "monstrously" huge screen they can afford and/or transport home, and they are all amazed at the detail of "full HD". "Ultra HD" will be an easy sell, especially with screen prices plummeting. Walmart sold tons of 60" 1080p Vizio smart TVs on Black Friday for less than $700, and there was even an uptick in corresponding router sales for Wi-Fi.
 
Upvote
2 (2 / 0)

grimmethod

Wise, Aged Ars Veteran
153
inpher":yh9jr6b7 said:
cervier":yh9jr6b7 said:
VashTheStampede":yh9jr6b7 said:
We need more than just higher resolutions for TVs. The Hobbit is being shown at 48FPS, while Avatar sequels will be 60FPS. People *can* see the differences and many will prefer the higher frame rates on their programming.

Whatever the updated delivery mechanism (physical or streaming) to these higher resolutions should also include higher frame rate possibilities as well.

I thought that human eyes were not able to see the difference with framerate over 25-30 fps? In the case of the Hobbit I think the 48 fps is for 3D.

There are three things (possibly more, but three that I can remember) to consider with regards to the classic choice of frame rate.

Cost and weight of film. Human perception and persistence of vision. And movement on film.

They are all related in this context. The choice of 24 frames per second was chosen because it was the lowest frame rate at which enough people did not experience any negative effects of the frame/flicker/strobing film projector. It was optimised this way because of cost and logistics. 30 frames per second would be 25% more expensive and weigh 25% more than 24 fps, something which made both accountants, and film handlers happy. You can go as low as 12-15 fps while still retaining persistence of vision, but it is not going to look good.

The third thing is that cinematography has been sort of limited in what it can do with these 24 fps, for example panning and action scenes have never really looked good unless the panning is done very slowly or extremely fast, the middle ground has been avoided because it has had too many obvious artefacts. A higher frame rate will allow for more types of camera work.

…eh, I think the RED guys can explain better than I do, I recommend this link.

I believe one of the reasons was also that old film reels would melt if they were run at the high frame rates the new Hobbit and Avatar 2 are being shown at; maybe not Hobbit speeds, but Avatar 2 speeds.
 
Upvote
1 (1 / 0)

frankie1969

Ars Scholae Palatinae
895
cervier":3h9016h7 said:
I thought the human eye couldn't see the difference above a frame rate of 25-30 fps? In the case of The Hobbit, I think the 48 fps is for the 3D.
Some human eyes can perceive over 60 fps. Perhaps yours aren't among them. Do fluorescents flicker for you?

I used to be at risk of migraines from sitting in a room with CRTs at a 60 Hz refresh rate (ugh, the pulsating horror). I was so happy when LCDs became the norm.
 
Upvote
7 (7 / 0)

inpher

Well-known member
3,697
crhilton":40xbhy4k said:
inpher":40xbhy4k said:
[…] The third thing is that cinematography has been somewhat limited in what it can do with 24 fps. For example, panning and action scenes have never really looked good unless the pan is done either very slowly or extremely fast; the middle ground has been avoided because it produces too many obvious artefacts. A higher frame rate will allow for more types of camera work.

…eh, I think the RED guys can explain it better than I can; I recommend this link.


I know quite a few people who get motion sickness watching video games, ostensibly because of all the camera motion. They can watch action movies, though. If filmmakers made more fast camera pans in action flicks, I wonder whether there would be a lot more people who couldn't watch those movies.

I was actually suggesting the possibility of fewer scenes with very fast camera panning, because at 24 fps the choice is between very slow and very fast pans. With higher frame rates, some cinematographers/directors might opt to slow down some of the very fast movements, because those "medium-speed" movements will no longer feel like the movie is stuttering.
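To see why the mid-speed pan is the awkward case, here's a rough sketch; the 1920-pixel frame width and the pan durations are arbitrary, illustrative numbers:

```python
# How far the image shifts between successive frames during a horizontal pan.
# Bigger per-frame jumps read as strobing/judder at low frame rates.
FRAME_WIDTH_PX = 1920  # illustrative frame width

def pixels_per_frame(pan_seconds, fps):
    """Per-frame displacement for a pan that crosses the full frame width."""
    return FRAME_WIDTH_PX / (pan_seconds * fps)

for pan_seconds in (1, 3, 6, 12):  # very fast -> very slow pans
    at24 = pixels_per_frame(pan_seconds, 24)
    at48 = pixels_per_frame(pan_seconds, 48)
    print(f"{pan_seconds:2d} s pan: {at24:5.1f} px/frame at 24 fps, "
          f"{at48:5.1f} px/frame at 48 fps")
```

Doubling the frame rate halves the per-frame jump, so a "medium" pan at 48 fps moves about as smoothly as a slow pan does at 24 fps.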
 
Upvote
0 (0 / 0)
Ultra HD on a TV seems impractical, but it's a great idea for projectors. I can really see the pixelation in long shots in movies on Blu-ray/1080p on our setup at home.

It is curious, however, that just going up to 48 or 60 fps gives you a perception of higher resolution. And 3D becomes amazingly clear and non-taxing, for me anyway. I'd rather that upgrade happen before Ultra HD ... I think.

A big problem is that MANY movies had all their final effects work finished at 2K to make deadlines. Yep. Hopefully, that isn't happening too much anymore.
 
Upvote
1 (1 / 0)

tigas

Ars Tribunus Angusticlavius
7,397
Subscriptor
Pubert":1qinqdm2 said:
so-ooo....
is there an Ultra HDi and an Ultra HDp? ;)

Frankly, I liked 4K. Simple. Frank. To the point.

(And half the syllables.)

i (interlaced) is dead. Good riddance.

(edit) - it should be, but since OTA broadcast can't hack 1080p, some stations are doing 1080i. :facepalm:
 
Upvote
1 (1 / 0)

beebee

Ars Tribunus Angusticlavius
8,865
Spungy":1eomjmxp said:
I still can't tell the difference between 720p and 1080p. My eyes are crap.

I was at CES when the first 1080-line plasmas were displayed. You could hardly tell the difference over 720 lines if the scene was moving; on a still it was discernible. These were small displays by today's standards, maybe 45 inches.

The content producers are generally aware of the "set" and try to avoid anything that will moiré due to insufficient resolution, more so in TV than in "film". (What fool in wardrobe brought a slanted striped tie!)

I just don't see 4K and 8K being a big deal unless you have a projector and a large screen. The consumer thinks DVDs are quite good, and those are just very good SD that gets interpolated.
 
Upvote
0 (1 / -1)
xryancat":2ej6xusu said:
Ars wrote an excellent piece on 4k resolution TVs and the perceivable difference it has on movies and television last summer.
I'm surprised more people don't challenge the chart and the conclusions being drawn from it. The basic assumptions about human visual acuity are pretty well known and accepted, but there's a lot more to it than that.
 
Upvote
4 (4 / 0)

ScifiGeek

Ars Legatus Legionis
19,001
Kalessin":25lhjyfk said:
xryancat":25lhjyfk said:
Ars wrote an excellent piece on 4k resolution TVs and the perceivable difference it has on movies and television last summer.
I'm surprised more people don't challenge the chart and the conclusions being drawn from it. The basic assumptions about human visual acuity are pretty well known and accepted, but there's a lot more to it than that.

IMO the chart works rather well. My TV is at the 720p distance, and I can't tell the difference between 720p and 1080p video; even decent 540p looks very nice.

The only thing I have seen that contradicts the normal visual acuity distance numbers is a very dubious report out of NHK, from a group pushing 8K.
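For anyone curious where those distance numbers come from, they mostly boil down to the point where one pixel subtends about one arcminute (the textbook 20/20 figure). A rough sketch, assuming a 16:9 panel and using a 50-inch set as the example:

```python
import math

ONE_ARCMIN_RAD = math.radians(1 / 60)  # rough 20/20 visual acuity limit

def max_useful_distance_ft(diagonal_in, vertical_pixels, aspect=(16, 9)):
    """Distance beyond which adjacent pixels blur together for 20/20 vision."""
    w, h = aspect
    height_in = diagonal_in * h / math.hypot(w, h)   # panel height in inches
    pixel_pitch_in = height_in / vertical_pixels     # size of one pixel
    return pixel_pitch_in / math.tan(ONE_ARCMIN_RAD) / 12

for rows in (720, 1080, 2160):
    d = max_useful_distance_ft(50, rows)
    print(f'50" {rows}p: extra resolution stops mattering beyond ~{d:.1f} ft')
```

That gives roughly 6.5 feet for a 50" 1080p set, which is where the familiar charts put it; the simplification is assuming acuity is the whole story.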
 
Upvote
1 (2 / -1)

Pubert

Ars Tribunus Militum
2,266
I saw Douglas Trumbull's Showscan when I lived in Dallas about 25 years ago.
He had a demo theater there in the back of a Chuck E. Cheese, no less.

It absolutely blew me away. As in, I watched the entire movie slack-jawed.

70mm @ 60 fps.

It was so sharp that the audience was totally fooled into thinking a person being projected (supposedly changing reels behind the screen) was actually real and in the theater.

There was a collective gasp when we all realized that we had been fooled as part of the demo, and that what we had been seeing wasn't real. I'll never forget it.
http://en.wikipedia.org/wiki/Showscan

It's a shame the tech was only ever used for some rides in Las Vegas.
 
Upvote
3 (3 / 0)

xoa

Ars Legatus Legionis
12,392
Subscriptor
Duncan MacLeod":3ch6vxnq said:
MKV Is pretty much the only acceptable format for me these days. I had a couple with bad audio and I was able to trivially extract it and put a new track in without conversion.
Yeah, in general I'm unwilling to pay money for unreasonably inferior tech nowadays.
alxx":3ch6vxnq said:
.red is an option; supposedly 2.5 MB a second for streaming 4K (currently up to 4096 by 2160).
You mean for end distribution or production? RED is a candidate for the latter, although ProRes also seems somewhat common. No idea how that will all shake out in the end.

As for the former, though, I expect distribution will in the end almost 100% boil down to HEVC (H.265). Broadcom's just-announced 4Kp60 SoC (the BCM7445) is supposed to support both H.264 and H.265, and just as with the HD transition, we'll probably see a lot of initial content in the old codec before a transition to the new one.
 
Upvote
0 (0 / 0)

inpher

Well-known member
3,697
Kalessin":nasdsgic said:
xryancat":nasdsgic said:
Ars wrote an excellent piece on 4k resolution TVs and the perceivable difference it has on movies and television last summer.
I'm surprised more people don't challenge the chart and the conclusions being drawn from it. The basic assumptions about human visual acuity are pretty well known and accepted, but there's a lot more to it than that.
I partly agree that there is more to it than that, but from my point of view it has more to do with things that are not primarily about resolution.

I have trouble distinguishing 720p from 1080p at a distance that is consistent with the chart. So my subjective experience tells me the calculations that produced that chart are solid enough. The threshold where returns become increasingly small has been reached at 1080p; I'd much rather see improvements in colour accuracy, frame rate, and compression (roughly in that order).
 
Upvote
0 (1 / -1)

CenterLess

Ars Tribunus Militum
2,587
Subscriptor++
VashTheStampede":1onjy2qa said:
We need more than just higher resolutions for TVs. The Hobbit is being shown at 48 fps, while the Avatar sequels will be 60 fps. People *can* see the differences, and many will prefer the higher frame rates for their programming.

Whatever the updated delivery mechanism (physical or streaming) for these higher resolutions turns out to be, it should allow for higher frame rates as well.

I'm in agreement with you there. There's a point of diminishing returns on uHD, but I'd actually like it if they upped the frame rate from the current 30 to 60 and made it standard.
 
Upvote
0 (1 / -1)

azazel1024

Ars Legatus Legionis
15,090
Subscriptor
Mmm, I tend to disagree with that chart that gets thrown around a lot.

Is there going to be a big noticeable difference between 720p and 1080p at 20 ft on a 40-inch television? No. Is it maybe possible to perceive a slight difference if you stuck both next to each other? Possibly, and possibly not something you'd necessarily be able to call with full confidence.

As a photographer, I've made a lot of prints: both higher- and lower-DPI prints, as well as prints from more initial pixels down to the same final image resolution.

I can tell you that, looking at a pair of 8x10s held next to each other at arm's length, one printed at 240 dpi and one printed at 300 dpi, a lot of people will probably notice that the 300 dpi print looks just the slightest bit better; or, as a better case, one printed at 240 dpi and the other at, say, 600 dpi. Our eyes shouldn't be able to perceive a difference, but they generally can, to some degree.

Now, I'll fully agree that unless you are going HUGE for your TV, or sitting really close, the difference between full HD (1080p) and Ultra HD (2160p) is probably going to be trivial. However, I'd still bet that if you put identical TVs next to each other, even at, say, 40 inches in size, and at a reasonable distance, say 10 ft, most people could perceive at least a very small difference between the two.

According to the chart, you probably shouldn't even really notice a difference between 720p and 1080p at that size and distance.

However, I do think for the most part it is simply status seeking. Until the cost difference is VERY modest and there is also a good content delivery method for 2160p, you won't find me buying one.

If the price increment were maybe only 20% or so for a 2160p display over a 1080p display, I'd really consider it, especially if it was going to be my "big 'un TV for the basement".

If Blu-ray, or some other physical media spec, comes along that can deliver 2160p no more highly compressed than Blu-ray manages with 1080p at 24/30 fps, then I'd consider that a win and a good reason for moving to an Ultra HD display as well (though the price increment is still a requirement).

A lot of stuff in the last few years has been shot at 2160p or higher resolution, so it isn't as though there are NO movies or shows out there that could be shown in Ultra HD. However, for a lot of old media shot on 35mm film, or god forbid a smaller format, there is really no point in ever converting to anything higher than 1080p; rescanning the original or a transfer isn't really going to get you much of anything. Some really high-quality cinema 35mm film MIGHT just be capable of showing an improvement scanned at 2160p rather than 1080p, but the difference is probably going to be VERY small. 35mm photographic film is lucky to resolve much more than maybe the equivalent of 2160p for a really good film, and 35mm cinema film is a crop of 35mm photographic film: photographic 35mm is 36x24mm, while cinema 35mm is basically 22x11.8mm for widescreen (the image is oriented perpendicular to that of 35mm photographic film and is also a narrower ratio).

So if you have a hard time managing something like 2160p with good 35mm photographic film... what do you think is going to happen with something that has roughly half the height? Yeah, that's what I thought: 1080p is pushing it for decent 35mm cinema film transfers, and 2160p is basically wishful thinking.

Now, as for digital cinema cameras, which most productions have been using for the last few years: most of those have handled 4K with aplomb for the last year or two (and plenty did it stretching back several years, but they were not industry standard, per se). A number will do 6K, and a very, very small handful will now do 8K.

Frankly, I am a lot more interested in seeing a spec that can deliver 1080p at 48/60 fps (and not for 3D) than 4K at 24/30 fps. Though, why not 4K at 48/60 fps? I wouldn't mind that one bit.
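For reference, the "shouldn't be able to perceive a difference" part is the same arcminute arithmetic applied to prints. A rough sketch, assuming a 25-inch arm's-length viewing distance and the textbook ~1 arcminute acuity limit (both simplifying assumptions, which is rather the point of the post above):

```python
import math

ARMS_LENGTH_IN = 25  # assumed viewing distance, inches

def dot_angle_arcmin(dpi, distance_in):
    """Angle subtended by a single printed dot, in arcminutes."""
    return math.degrees(math.atan((1 / dpi) / distance_in)) * 60

for dpi in (240, 300, 600):
    a = dot_angle_arcmin(dpi, ARMS_LENGTH_IN)
    verdict = "above" if a > 1 else "below"
    print(f"{dpi} dpi at {ARMS_LENGTH_IN} in: {a:.2f} arcmin per dot ({verdict} the ~1 arcmin limit)")
```

All three land below the nominal acuity limit, so in theory the prints should be indistinguishable at arm's length, yet people often do pick the higher-DPI print.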
 
Upvote
5 (5 / 0)

CenterLess

Ars Tribunus Militum
2,587
Subscriptor++
alxx":bv90d4dl said:
JTD121":bv90d4dl said:
Awesome. No one really cares, because there is no content, and the ridiculous re-tooling of all our infrastructure is just not going to happen nearly as fast as the ITU (or whomever) is bumping TV set resolutions.....

There is content but not from the major studios and usual providers yet.

The studios are already complaining about losses due to piracy and are DRMing the hell out of their content on Blu-ray. Do you think the prospect of a near-perfect reproduction of the theatre-quality experience is going to make them more enthusiastic about releasing content in these new uHD formats? I doubt it. They're happiest selling us the worst quality at the most obscene price they can gouge out of us.
 
Upvote
0 (1 / -1)
CenterLess":1f43o5hf said:
alxx":1f43o5hf said:
JTD121":1f43o5hf said:
Awesome. No one really cares, because there is no content, and the ridiculous re-tooling of all our infrastructure is just not going to happen nearly as fast as the ITU (or whomever) is bumping TV set resolutions.....

There is content but not from the major studios and usual providers yet.

The studios are complaining now about loss due to piracy and DRMing the hell out of their content on blu ray. Do you think they're going to be more enthused when a near perfect reproduction of theatre quality experience is going to spur them on to releasing them on these new uHD formats? I doubt it. They're happiest selling us the worst quality at the most obscene price they can gouge out of us.

That explains why new movies are available on Blu-ray and Blu-ray 3D....

It takes a mighty living room to provide an experience on par with a movie theater.
 
Upvote
1 (1 / 0)

CenterLess

Ars Tribunus Militum
2,587
Subscriptor++
MalnarThe":16jo5a2v said:
CenterLess":16jo5a2v said:
alxx":16jo5a2v said:
JTD121":16jo5a2v said:
Awesome. No one really cares, because there is no content, and the ridiculous re-tooling of all our infrastructure is just not going to happen nearly as fast as the ITU (or whomever) is bumping TV set resolutions.....

There is content but not from the major studios and usual providers yet.

The studios are complaining now about loss due to piracy and DRMing the hell out of their content on blu ray. Do you think they're going to be more enthused when a near perfect reproduction of theatre quality experience is going to spur them on to releasing them on these new uHD formats? I doubt it. They're happiest selling us the worst quality at the most obscene price they can gouge out of us.

That explains why new movies are available in BluRay, and BluRay 3D....

It takes a mighty living room to provide an experience on par with a movie theater.

I'm talking about the new uHD format, not the 1080p stuff we have now.
 
Upvote
-1 (0 / -1)
I hope these manufacturers are acutely aware of the fact that, in order to sell these things, they need UHD content, and in order for consumers to access UHD content, internet speeds in this country need to go up by a lot, and quickly (along with dropping the caps). If these guys can put more $$$ and muscle behind efforts to nationalize, or at least make competitive, our internet infrastructure, it could go a long way in this almost impossible battle against incumbent ISPs.
 
Upvote
0 (0 / 0)

alxx

Ars Praefectus
4,997
Subscriptor++
lvlln":2yqu23ie said:
I hope these manufacturers are acutely aware of the fact that, in order to sell these things, they need UHD content, and in order for consumers to access UHD content, internet speeds in this country need to go up by a lot, and quickly (along with dropping the caps). If these guys can put more $$$ and muscle behind efforts to nationalize, or at least make competitive, our internet infrastructure, it could go a long way in this almost impossible battle against incumbent ISPs.

Here's hoping Odemax gives the indie producers a chance to shine and distribute their content, and that they don't get shut out again by the big studios.
 
Upvote
0 (0 / 0)

ScifiGeek

Ars Legatus Legionis
19,001
azazel1024":1upopbg7 said:
I can tell you that, looking at a pair of 8x10s held next to each other at arm's length, one printed at 240 dpi and one printed at 300 dpi, a lot of people will probably notice that the 300 dpi print looks just the slightest bit better; or, as a better case, one printed at 240 dpi and the other at, say, 600 dpi. Our eyes shouldn't be able to perceive a difference, but they generally can, to some degree.

If you are using a Bayer-sensor camera, your effective DPI is lower than that, which can account for you seeing differences at greater distances than you would expect.

My own testing indicates the numbers work quite well when starting with sharp 1:1 sources. If you are starting with soft Bayer images, you really aren't providing the full DPI to test.
 
Upvote
-1 (0 / -1)

Biggiesized

Ars Tribunus Militum
1,755
alxx":16jx0hlc said:
Is Ars going to review the Odemax 4K distribution service when it launches at Sundance?
And the REDRAY player?

http://www.red.com/news/red-announces-p ... ith-odemax
http://odemax.com/information.html

http://www.reduser.net/forum/showthread ... RAY-ODEMAX

They are going to be using the REDRAY codec; supposedly they have had a breakthrough and got the bandwidth way down, to 2.5 MB a second for streaming 4K video.
http://www.redgrabs.com/up/1354328970.jpg


Also, in Europe there is a dedicated 4K demo satellite channel that started on Tuesday:
http://www.prnewswire.com/news-releases ... 93172.html

I'll wait to see it in person. They've made numerous claims that have not panned out, one of which is their description of REDCODE as visually lossless (it's not even close).
 
Upvote
1 (1 / 0)

imaca

Wise, Aged Ars Veteran
155
mkuch90":2zhjhb80 said:
The problem I'm going to have with the transition to Ultra HD / 4K is that the required viewing distance is uncomfortable. I have a 55" TV, and I would need to be prohibitively close to be able to see the pixels.
http://www.marseilleinc.com/recommended ... resolution
I guess it's all down to user preference and eyesight. I bought a 40" rather than a 55" because with the 55" I had to sit too far away before the picture stopped looking like pixelated crap.
 
Upvote
-1 (1 / -2)
The more pixels the better, I think. 1080p looks great on 24" screens.

The real issue is that I want content; if there's any lesson to be learned from 720p/1080p screens, it's content. The biggest issue, IF (and it's a big IF) we get content, is delivering it to homes.
The amount of data is massive. We would need to scrap optical discs for hot-swap SSD-type players, which would be loaded at the point of sale, as no one anytime soon will have the internet speed to stream/download the movies.
It's TBs of data, or we need a great new lossless encoding method for optical discs.
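To put very rough numbers on "TBs of data", here's a sketch of a two-hour 2160p film, uncompressed versus at a couple of purely illustrative bitrates (the 35 and 100 Mbps figures are assumptions, not any real spec):

```python
# Back-of-the-envelope storage for a two-hour 2160p movie.
# Assumes 8-bit 4:2:0 (~1.5 bytes/pixel) for the uncompressed case; bitrates are illustrative.
RUNTIME_S = 2 * 3600

def uncompressed_gb(width, height, fps, seconds, bytes_per_pixel=1.5):
    """Raw video size in gigabytes."""
    return width * height * bytes_per_pixel * fps * seconds / 1e9

def compressed_gb(mbps, seconds):
    """Size in gigabytes at a constant bitrate in Mbps."""
    return mbps * seconds / 8 / 1000

print(f"Uncompressed 3840x2160 @ 24 fps: ~{uncompressed_gb(3840, 2160, 24, RUNTIME_S):,.0f} GB")
for label, mbps in [("Blu-ray-like 1080p bitrate", 35), ("hypothetical 2160p bitrate", 100)]:
    print(f"{label} at {mbps} Mbps: ~{compressed_gb(mbps, RUNTIME_S):.0f} GB")
```

So it really is TBs only if you ship it uncompressed; with any sane codec a two-hour 2160p film is tens of GB, which is still a stretch for most home connections today.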
 
Upvote
1 (1 / 0)