> Last time I plugged in an HDMI source and the darn "smart" television showed the image for 0.5 seconds before displaying a menu that asks me to press a button on the remote to show the image.
That's entirely the fault of your crappy smart display with some crappy OS and has entirely nothing to do with HDMI as a standard.
I would think that, as a plug-and-play standard for A/V gear, HDMI is one of the farthest along the "just works" spectrum for the vast majority of people. Occasionally I see a device that does something stupid, like not switching the audio source when you switch to a different HDMI source, so you have to dig through a dumb OSD menu with several nested levels to get to the audio sources, but again, that's not HDMI's fault.
I have had quite a few broken HDMI cables in lecture halls at uni and in meeting rooms at various workplaces, but I think that's the reality of any connector that gets plugged and unplugged tens of times per day (especially by people who don't care and don't have to pay for them when they break). They just need to replace the cables more often.
> That's entirely the fault of your crappy smart display with some crappy OS and has entirely nothing to do with HDMI as a standard.
Sure yeah, but I don't buy it. If you create a standard that is so complicated or feature-creeped that it can't be implemented fully, and that lack of full implementation means the fundamental role of the standard breaks down, then the standard itself might be part of the problem.
I too could envision a solution that works perfectly in theory, where everyone who hits problems is simply "doing it wrong". But such standards have to be made with reality in mind. USB-C is another one of those. Cool, now I have a ton of USB-C cables, none of which tell me on the cable itself what capabilities they have. One can't do USB Power Delivery, another doesn't work with video above a certain resolution, etc.
I get that more data means higher frequency and that this directly translates to more problems, but nobody (at least no consumer) asked for the complexity of the HDMI spec. We want to connect a cable and see the picture in 99.99% of cases. If that doesn't work 100% of the time, the standard is at fault. The base functionality needs to be so dumb and so clear that it just works, even if the other side doesn't know what an EDID is. That was the task, and the result is catastrophic failure.
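For what it's worth, the "knowing what an EDID is" part boils down to reading and parsing a small block of bytes the display hands over. A minimal sketch (Python, illustrative only; the sysfs path is just an example and varies per machine) of pulling the preferred mode out of a raw EDID base block:

```python
# Minimal sketch (not a full parser): extract the preferred video mode
# from a raw EDID base block. Assumes `edid` holds at least the 128-byte
# base block, e.g. read from /sys/class/drm/card0-HDMI-A-1/edid on Linux
# (connector name varies per machine).

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def preferred_mode(edid: bytes):
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        raise ValueError("not a valid EDID base block")
    # The first 18-byte detailed timing descriptor (offset 54) describes
    # the display's preferred mode.
    dtd = edid[54:72]
    pixel_clock_khz = int.from_bytes(dtd[0:2], "little") * 10
    if pixel_clock_khz == 0:
        raise ValueError("first descriptor carries no timing data")
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)  # upper nibble of byte 4
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)  # upper nibble of byte 7
    return h_active, v_active, pixel_clock_khz

if __name__ == "__main__":
    with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:  # example path
        w, h, clk = preferred_mode(f.read())
    print(f"preferred mode: {w}x{h}, pixel clock {clk} kHz")
```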
I think an awful lot of this could be solved by requiring ports to export the information they receive to the device, and requiring that devices which can reasonably display that information actually do so. PCs, phones, and tablets would all tell you about the cable and the connection. Things without screens or interfaces would not be required to add them, though.
It's not that the cables support varying specs (which I actually have no problem with; you shouldn't have to pay for features you don't need, and some features trade off against cable length), but that we have no easy way to find out what a given cable supports or to test it.
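On the PC side, at least, much of that connection information already exists; on Linux, for example, the DRM subsystem exposes per-connector status in sysfs. A rough sketch (illustrative only; connector directory names vary per machine) of surfacing it:

```python
# Sketch: print what the kernel already knows about each HDMI connector,
# via the DRM sysfs interface. Connector directory names (card0-HDMI-A-1,
# ...) vary per machine; illustrative only.
from pathlib import Path

def read(path: Path) -> str:
    try:
        return path.read_text().strip()
    except OSError:
        return "unknown"

for conn in sorted(Path("/sys/class/drm").glob("card*-HDMI-A-*")):
    status = read(conn / "status")      # "connected" or "disconnected"
    enabled = read(conn / "enabled")    # "enabled" or "disabled"
    modes = [m for m in read(conn / "modes").splitlines() if m and m != "unknown"]
    summary = f"{conn.name}: {status}, {enabled}, {len(modes)} modes advertised"
    if modes:
        summary += f" (first listed: {modes[0]})"
    print(summary)
```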