Next-Gen HDMI specs to be announced in January

Could the GeForce RTX 50 and Radeon RX 8000 series not only support DisplayPort 2.1 but also a new HDMI standard? It seems possible, given that the HDMI Forum has just announced it will reveal new specifications for the standard at CES 2025, coinciding with AMD and […]
I wish my eyes were constantly upgraded to keep up with these resolutions
I wish manufacturers would bother to mark the capabilities of their otherwise identical-looking ports and cables, so we could figure out what the hell we were looking at when holding a Device That Does Not Work in one hand and a cable in the other.
I think this is the reason a number of standards are going to active cables, so the device will know the cable isn’t up to standard.
Or in the case of USB-C, so it doesn’t catch fire after having five amps cranked through it.
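That USB-C case is a concrete example: in USB Power Delivery, a source may only offer 5 A if the cable carries an e-marker chip advertising that rating; an unmarked cable is assumed to handle 3 A at most. A minimal sketch of that rule (illustrative names, not a real PD stack):

```
# Sketch of the USB PD rule that motivates e-marked (active) cables:
# a source may only offer 5 A if the cable's e-marker advertises it;
# unmarked cables are assumed to be rated for 3 A at most.

from dataclasses import dataclass
from typing import Optional

@dataclass
class CableIdentity:
    """Subset of what a cable e-marker reports (illustrative)."""
    max_current_amps: float  # 3.0 or 5.0 in real cables

def negotiate_current(requested_amps: float,
                      emarker: Optional[CableIdentity]) -> float:
    """Cap the offered current at what the cable is known to handle."""
    if emarker is None:
        # No e-marker: the cable's rating is unknown, so assume the
        # 3 A default rather than risk overheating it at 5 A.
        cable_limit = 3.0
    else:
        cable_limit = emarker.max_current_amps
    return min(requested_amps, cable_limit)

print(negotiate_current(5.0, None))                                 # 3.0
print(negotiate_current(5.0, CableIdentity(max_current_amps=5.0)))  # 5.0
```

Which is the point of active/marked cables: the device can only protect a cable it can identify.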
It’s bully for the device if it knows, but that doesn’t help the user who has just pulled one identical-looking cable out of many from the drawer and will have no idea until they plug it in whether they’ll get a picture, nothing, near-undiagnosable partial functionality, or smoke.
I’m thinking that the user will get a notification that the cable they’re using isn’t the correct one.
On the screen that doesn’t work?
HDMI is backward compatible a long way back; almost every device will fall back to a standard that doesn’t require such an expensive cable.
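For what it’s worth, that fallback is roughly how HDMI 2.1 link training behaves: the source tries the fastest FRL configuration both ends claim to support, steps down through the lower FRL rates when training fails, and lands on legacy TMDS signaling as the last resort. A simplified sketch (real link training does per-lane equalization, not a simple aggregate-rate check):

```
# Simplified sketch of HDMI 2.1-style link fallback: try the fastest
# FRL configuration, step down on failure, and land on legacy TMDS
# (HDMI 2.0-era signaling) as the last resort.

# (lanes, Gbps per lane) for FRL rates 6..1, best first.
FRL_RATES = [(4, 12), (4, 10), (4, 8), (4, 6), (3, 6), (3, 3)]

def train_link(lanes: int, gbps: int, cable_limit_gbps: float) -> bool:
    """Stand-in for real link training: succeeds only if the cable
    can actually carry the aggregate bit rate."""
    return lanes * gbps <= cable_limit_gbps

def pick_link(cable_limit_gbps: float) -> str:
    for lanes, gbps in FRL_RATES:
        if train_link(lanes, gbps, cable_limit_gbps):
            return f"FRL {lanes} lanes x {gbps} Gbps = {lanes * gbps} Gbps"
    return "TMDS fallback (HDMI 2.0-era modes only)"

print(pick_link(48))   # good Ultra High Speed cable: FRL 4x12
print(pick_link(10))   # older cable: lands on FRL 3x3
print(pick_link(4))    # very old cable: TMDS fallback
```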
Come on man, this is simple stuff.
I can definitely see the difference between a 1440p 27-inch display and a 5K 27-inch display. Add in a high refresh rate and HDR, and you’re already close to exceeding the DP 2.0 maximum bandwidth (without Display Stream Compression). I wish we could finally get decent high-DPI desktop monitors that aren’t made by or for Apple Macs.
Though that’s not where you would use HDMI. I would argue that for TVs, 4K is generally enough, and HDMI 2.1 already has enough bandwidth for 4K 120 Hz at 12 bits per color, uncompressed.
But DisplayPort, yeah, that could use a bit more.
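To put rough numbers on both claims above (a high-refresh 5K display pushing DP 2.0, and 4K 120 Hz 12-bit fitting in HDMI 2.1): uncompressed bandwidth is roughly horizontal × vertical pixels × refresh rate × bits per pixel, plus blanking overhead. A quick sketch, where the 12% blanking figure and the effective payload rates are approximations:

```
# Back-of-envelope uncompressed video bandwidth. Blanking overhead is
# approximated at 12% (real CVT-RB2 timings vary per mode), so treat
# these as rough numbers.

BLANKING_OVERHEAD = 1.12  # assumption: ~12% extra for blanking intervals

def data_rate_gbps(h: int, v: int, hz: int, bits_per_color: int) -> float:
    bpp = 3 * bits_per_color  # RGB / 4:4:4, no chroma subsampling
    return h * v * hz * bpp * BLANKING_OVERHEAD / 1e9

# Approximate effective payload rates after line coding:
DP20_UHBR20 = 77.37  # Gbps, 4 lanes x 20 Gbps with 128b/132b coding
HDMI21_FRL6 = 42.67  # Gbps, 48 Gbps FRL with 16b/18b coding

print(f"5K @ 144 Hz, 10-bit: {data_rate_gbps(5120, 2880, 144, 10):.1f} Gbps "
      f"(DP 2.0 limit ~{DP20_UHBR20} Gbps)")   # ~71.3 Gbps
print(f"4K @ 120 Hz, 12-bit: {data_rate_gbps(3840, 2160, 120, 12):.1f} Gbps "
      f"(HDMI 2.1 limit ~{HDMI21_FRL6} Gbps)")  # ~40.1 Gbps
```

So 5K at 144 Hz with HDR sits right under DP 2.0’s ceiling, while 4K 120 Hz 12-bit fits in HDMI 2.1 with a little headroom, which matches both comments.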
I don’t believe in anything over 1080p. Waste of bandwidth.
The point is that resolution alone doesn’t matter; it’s viewing distance plus pixel density that matters. This is what Apple calls Retina: the point at which we stop seeing individual pixels (jagged edges) at a normal viewing distance. This means a phone needs a much higher pixel density than a desktop or a TV. But the low-DPI displays we still have are unacceptable in 2024; the icons and text look so ugly…
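The usual back-of-envelope version: the eye resolves about one arcminute, so a display stops showing visible pixels once its pixel pitch subtends less than that at your viewing distance, which works out to roughly 3438 / distance-in-inches PPI. A quick sketch, using the standard one-arcminute simplification for 20/20 vision:

```
import math

# "Retina" back-of-envelope: a pixel becomes invisible once it subtends
# less than ~1 arcminute (the usual 20/20 acuity simplification).
ARCMINUTE_RAD = math.radians(1 / 60)

def retina_ppi(viewing_distance_inches: float) -> float:
    """Minimum pixel density at which pixels subtend < 1 arcminute."""
    return 1 / (viewing_distance_inches * math.tan(ARCMINUTE_RAD))

for device, dist in [("phone", 12), ("desktop monitor", 24), ("TV", 96)]:
    print(f"{device} at {dist} in: ~{retina_ppi(dist):.0f} PPI needed")
# phone at 12 in: ~286 PPI; desktop at 24 in: ~143 PPI; TV at 96 in: ~36 PPI
```

Which is exactly why a 27-inch 1440p desktop panel (~109 PPI) at two feet still shows its pixels, while a TV across the room gets away with far less density.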
The thing is, I prefer actually owning my media; I don’t use streaming services for the most part. But even with my 40 TB of media storage, I just don’t have the space for 5K content. If it’s worthwhile, it gets 1080p; if it matters less (kids’ shows or anything that came from a DVD), it gets 720p at best.
1080p gang 🤙
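For a sense of scale on the storage point above: at typical bitrates (the figures below are assumptions about common encodes, not authoritative numbers), a 40 TB library holds an order of magnitude fewer movies at the higher resolutions:

```
# Rough library-size math. The bitrates are assumptions about typical
# encodes; adjust for your own encoder settings.

LIBRARY_TB = 40
MOVIE_HOURS = 2

def movies_that_fit(bitrate_mbps: float) -> int:
    size_gb = bitrate_mbps / 8 * 3600 * MOVIE_HOURS / 1000  # GB per movie
    return int(LIBRARY_TB * 1000 / size_gb)

for label, mbps in [("720p", 4), ("1080p", 8), ("4K/5K HDR", 60)]:
    print(f"{label} @ {mbps} Mbps: ~{movies_that_fit(mbps)} two-hour movies")
# 720p: ~11111 movies; 1080p: ~5555 movies; 4K/5K HDR: ~740 movies
```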
Higher resolution costs more money and requires better hardware to drive it (more money). Overpowering a lower resolution only means your hardware is relevant for longer.
It’s just tech creep.
My eyes have been on an update binge lately; unfortunately, it’s just been downgrades.