Why won’t Steam Machine support HDMI 2.1? Digging in on the display standard drama.

Status
You're currently viewing only OptimusP83's posts. Click here to go back to viewing the entire thread.
This got me wondering whether Valve could get away with "one little trick" and just make it internally a DP port (as far as the driver is concerned), wired to an external port that is just an HDMI 2.1 dongle in disguise?

To the user, it just looks like an HDMI port on the back of the case...

Although that probably means that in OS management tools, two DisplayPort ports show up and zero HDMI ports, which might be a technical support issue as users get confused.

Although it's probably also too late in the product development cycle to make such a change. If it's even theoretically possible.
While this could maybe work, good luck explaining to a random consumer that "that HDMI port? Yeah, it's not technically an HDMI port..." when it doesn't play nicely with their random model of TV. Shit like that is why so many people are frustrated with USB-C; throw in an HDMI port that isn't an HDMI port and you're asking for all sorts of support headaches, and likely legal trouble from the HDMI Forum. I bet they even stipulate that an exposed HDMI port on a backplane MUST be actual HDMI, not DisplayPort Alt Mode or something. Which, honestly, is a good idea. Nothing destroys trust and goodwill like things not working as advertised because a manufacturer tried to get cute with standards...

That all being said, the HDMI Forum are being the assholes here by requiring certification for 2.1 without providing the tools for proper OS support. Seems like if they can't/won't support a platform due to their own lack of a proper SDK, there should be some route for third parties to bring that support to customers. Hell, at least work with Valve to build that SDK. Sounds like they're just being protectionist jackholes...
Well, no. The DP signal can carry audio and video, which for "output" is all you really need. They can probably even hack in the "wake on activity" signal to turn the TV on automatically. HDCP might be trickier for a Linux box to be "trusted" with, and stuff like ARC is really optional coming from a PC.
I doubt it's possible to implement broadly functional CEC via some DP->HDMI hack, at least without creating a support nightmare for themselves. I'd consider reliably functional CEC to be an absolute must-have for the audience Valve is likely targeting with the Steam Machine.

And that's not even touching on things like ARC/eARC, VRR (which, you may recall, is the topic of this article), and others. Negotiating resolution, color depth, and refresh rate is doable for most modern HDMI/DP devices, but that's not even the bare minimum of functionality for a device like the Steam Machine.
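For the curious, the "doable" negotiation part really is mostly arithmetic once the source has read the sink's EDID. A minimal sketch of how a refresh rate falls out of a detailed timing descriptor (the timing numbers are the standard CEA-861 values for 1080p60; actual EDID byte parsing is omitted):

```python
# Sketch only: deriving a sink's refresh rate from EDID-style timing values.
# Refresh rate = pixel clock / total pixels per frame (active + blanking).

def refresh_hz(pixel_clock_hz: int, h_active: int, h_blank: int,
               v_active: int, v_blank: int) -> float:
    h_total = h_active + h_blank   # e.g. 1920 + 280 = 2200 for 1080p
    v_total = v_active + v_blank   # e.g. 1080 + 45  = 1125 for 1080p
    return pixel_clock_hz / (h_total * v_total)

# Standard CEA-861 1080p60 timing: 148.5 MHz clock, 2200x1125 total raster.
print(refresh_hz(148_500_000, 1920, 280, 1080, 45))  # -> 60.0
```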

Aren't we always harping on and on around here about companies who work around, hack, or eschew standards to bring some buggy functionality to market without needing certification? How is this any different? And why should Valve take it on themselves to create said buggy workaround when they really should be pressuring the HDMI Forum to build a fully functional toolchain that actually works on non-Windows OSes?
The board traces for 2.0 and 2.1 are identical, so no, Valve did nothing.

Your concept of "cobbled together" is alarmingly strict. Motherboards regularly have supporting chips that add or change functionality. Is an ASMedia SATA chipset for extra ports "cobbled together" too? Is a Lenovo laptop cobbled together because it needs a tweaked keyboard driver?

Sometimes the base platform needs to be augmented. If the manufacturer open-sources and mainlines the device tweaks, it is as maintainable as any desktop, and more so than most laptops.
There is a LARGE gulf between "theoretically working" and "working in practice", PARTICULARLY when it comes to anything touching HDMI.

You mention CEC... Do you have any practical experience using CEC? I do. It still doesn't always work properly for me, which is a HUGE improvement over not working AT ALL with previous devices only a few years ago. And let me refresh your memory: CEC was defined in HDMI ONE POINT FUCKING ZERO, which was released in 2002. That was TWENTY-THREE YEARS AGO...

So what you're telling me is that Intel was able to implement a workaround for higher framerates at 4K on hardware where they control the FULL STACK, both software and hardware, and indeed specialize in building highly complex chips and interconnects to facilitate new standards and whatnot. Notably absent from that link you sent was any mention of VRR in any form; it's just about negotiating a higher, out-of-spec interconnect bandwidth. Do you have any hands-on experience with Intel's workaround? Does it reliably function with both consumer TVs and PC monitors? Does it even try to support VRR? Does this out-of-spec negotiation all work flawlessly under Linux?

Meanwhile Valve, who had VERY LITTLE experience in the hardware world until fairly recently with the Steam Deck (and a few earlier projects I'm sure they'd like us all to forget), is going to just crank out a custom GPU? Or rather, pay AMD to crank out custom GPU silicon to support an out-of-spec bandwidth increase, with no guarantee (or documentation) that it delivers all the HDMI 2.1 features, and with AMD doing much of the heavy lifting software-wise (since their drivers would need to support this under Windows, Linux, etc.)? Are you nuts? Valve certainly isn't, which is why they're pressuring HDMI to figure their shit out on Linux rather than shipping a hack that they'd be on the hook to support for a long time...

Nothing in what you've said or linked to shows that this issue is "solved" at all, even by Intel's hack.
Point 1: yes, VRR works great. ALLM doesn't, though that could be the software stack.
Point 2: if this is a hack then computers are a hack and we should ship none of them. It's not custom silicon. It's adding a converter chip to an already-custom board where there are already many similar-scope chips, and validating the output. There are hundreds of these hacks going into every level of every motherboard already and the idea that this consumer-visible one would be the one that breaks it all is hilarious. No. No it would not. It would work great.

Your CEC comment suggests a lack of reading comprehension or an unwillingness to consume the source material provided in an exchange. CEC does not even travel on the same pins as frame and audio data. It is literally its own protocol on its own pin of the connector, and as long as it gets from point A (source) to point B (kernel module in Linux) and back again, there is quite literally no distinction between whether it was natively supported in hardware or tunneled over DisplayPort data pins.
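To make that framing concrete, a CEC message really is just a few bytes on its own wire. An illustrative sketch (the logical addresses and the <Image View On> opcode come from the CEC spec; whichever transport carries the bytes is invisible at this layer):

```python
# Sketch of the CEC wire format: one header byte carrying the initiator and
# destination logical addresses in its high/low nibbles, then an opcode and
# any operands. Values below are from the HDMI-CEC specification.

TV = 0x0              # logical address of the TV
PLAYBACK_1 = 0x4      # logical address of a playback device (e.g. a console)
IMAGE_VIEW_ON = 0x04  # opcode: ask the TV to wake up and show this source

def cec_frame(initiator: int, destination: int, opcode: int,
              *operands: int) -> bytes:
    header = ((initiator & 0xF) << 4) | (destination & 0xF)
    return bytes([header, opcode, *operands])

# A playback device telling the TV to turn on:
print(cec_frame(PLAYBACK_1, TV, IMAGE_VIEW_ON).hex())  # -> "4004"
```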
VRR over Intel's custom DP->HDMI converter works great? On what, exactly? Does it work on hundreds (thousands?) of consumer TVs? Color me very skeptical. Even if it does work well, it's a non-trivial burden for a company like Valve, which doesn't have vast in-house GPU expertise or a full in-house silicon design stack to work with.

And it IS a hack, let's be clear. It's doing stuff outside the established specs of both protocols. That it is done by a company as large as Intel and evidently works is completely irrelevant; it's still a hack and will require a non-trivial amount of effort for Intel to keep supporting via testing of new devices and code revisions into the future.

And my comment on CEC? What does my alleged lack of reading comprehension have to do with your demonstrated lack of reading comprehension? I wasn't saying VRR is part of CEC; that is obvious from my post. I brought up CEC as an example of the HDMI Forum's inability to deliver on their own features, CEC being a data-channel-over-HDMI implementation for sending commands to attached devices and synchronizing state. And my point wasn't whether it was supported over DisplayPort; it was that HDMI was unable to figure their shit out for a literal decade to get their own feature working on their own connector over their own protocol. So you pretending it's simple because it's "data going over its own separate pins" is hilariously reductive and glosses over complexities that were clearly difficult for the creators of the spec to overcome.

TL;DR: don't come at me about reading comprehension when you're misreading my post and switching context to something I wasn't actually saying. I brought up HDMI's challenges with CEC as an example of a) their relative incompetence at implementing functional software, or b) their specs being far more complicated than they seem on the surface. I stand by my assertion that what you're suggesting is far more complex than it seems, and Valve has nowhere near the resources and subject-matter expertise that Intel does. Intel has been building GPUs and chipsets for decades; Valve has ZERO experience with either aside from working with drivers on the software side with Proton or the engine side with Source. Even if they could duplicate Intel's accomplishment, why would they?

ETA: And just because I was feeling generous and it's been a while since I thought about this topic, I reread your link (which was in a response to someone else, not me) and it has ZERO mention of VRR. There is a LOT of talk about EDIDs, VESA modes, Fixed Rate Link negotiation, and refresh rate and BPP support, but nothing about VRR. And while there are further reference materials in that very technical response in that thread, nothing seems to directly reference variable refresh or more advanced HDMI, and I'm not about to fall down a rabbit hole of VESA specs and EDIDs just to refute an already very downvoted forum post.

ETA2: OK, I read further down that thread:

"ALLM is an HDMI specification that is not available due to the lack of a native FRL PHY. VRR is also an HDMI specification that cannot be provided via (this) PCON. For gaming, you should use a DisplayPort interface and turn on Adaptive Sync to adjust the refresh rate to the game's frame rates. This is of course not possible on a TV (no DisplayPort connector)."

So VRR is explicitly NOT supported by Intel's DP->HDMI 2.1 solution, which is the opposite of what you're stating. Not only is VRR NOT supported by Intel's solution, neither is HDMI 2.1. It's literally just a workaround to support higher resolutions at higher BPP and fixed refresh rates using DSC. You really need to check your sources, friend...
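The back-of-envelope numbers show why that workaround leans on DSC at all. A rough sketch (assumes the standard CTA timing for 4K120 with its 1188 MHz pixel clock, HDMI 2.0's ~14.4 Gbit/s effective data rate after 8b/10b encoding overhead, and a nominal 3:1 DSC compression ratio):

```python
# Rough bandwidth math: uncompressed 4K120 at 8 bits per channel far
# exceeds what an HDMI 2.0 TMDS link can carry; DSC (assumed ~3:1 here)
# is what squeezes it through without an FRL-capable HDMI 2.1 PHY.

PIXEL_CLOCK_4K120 = 1_188_000_000    # Hz, standard CTA timing incl. blanking
BITS_PER_PIXEL = 24                  # 8 bits per channel, RGB
HDMI_2_0_DATA_RATE = 14_400_000_000  # bit/s effective (18 Gbit/s raw, 8b/10b)

raw = PIXEL_CLOCK_4K120 * BITS_PER_PIXEL  # ~28.5 Gbit/s
compressed = raw // 3                     # nominal 3:1 DSC

print(raw > HDMI_2_0_DATA_RATE)         # uncompressed doesn't fit -> True
print(compressed < HDMI_2_0_DATA_RATE)  # with DSC it does -> True
```

Note the fixed-rate nature of the trick: nothing in this arithmetic involves varying the refresh rate, which matches the quoted explanation of why VRR is off the table.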