Of course you need a new cable. You’re not getting massive upgrades in fidelity on your old crappy cable.
But… but… it has gold-plated connectors 😟
Remind me in like 6+ years when the standard is actually widely adopted. Many high-end OLED monitors today, in the year 2025, still ship with fucking HDMI 2.0 and DisplayPort 1.4.
DisplayPort over HDMI!!
After 1080p60 I kind of still notice a difference, but I’m not willing to pay much more for increasing that further.
I need 4K to be happy. With 1080p you get giant windows in your OS (most apps are only usable in fullscreen) even at 100% scaling, and you can still clearly see individual pixels…
Straight-up unusable for me. Maybe on a phone of 5” max, 1080p is a good middle ground (battery vs. resolution vs. not seeing individual pixels).
Yeah, 1080p is fine on a small laptop screen, or a small TV on the other side of the room, but it’s unusable for desktop applications. Even 1440 is noticeably low res. I disagree about phones, though. I think 1080p is overkill and 720p is fine.
I think with phones it’s very important to factor in the screen size.
720p is fine, but on 7”+ phones I think you’d want a pixel density similar to the smaller 720p phones.
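Rough pixel-density math to back that up, as a sketch (the diagonal sizes and the helper function here are just illustrative assumptions):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch for a panel of the given resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Illustrative sizes: a 5" 720p phone vs. a 7" phone.
print(round(ppi(1280, 720, 5.0)))   # ~294 PPI: 720p on a 5" screen
print(round(ppi(1280, 720, 7.0)))   # ~210 PPI: the same resolution stretched to 7" is much coarser
print(round(ppi(1920, 1080, 7.0)))  # ~315 PPI: 1080p at 7" roughly matches the 5" 720p phone
```

So a 7” panel needs roughly 1080p to match the density of a 5” 720p screen.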
You sound like you’ve never gamed at 240p
Joke’s on you 😁 the first game I ever played was on a 240 x 160 screen
(2.9”)
I mostly want displays to not be something I worry about. Even if I just have a single port, being able to connect 3 4K monitors without worrying about their refresh rate is convenient.
Is the 480Hz support “just because”, or is there any kind of use case for it?
I think it’s more like: the bandwidth needed to support 12K at 120Hz also allows for 4K at 480Hz, so… ¿por qué no los dos?
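Back-of-the-envelope bandwidth, as a rough sketch (assuming 10 bits per channel, ignoring blanking intervals and DSC, and taking “12K” to mean 11520x6480, which is an assumption):

```python
def raw_gbps(width, height, hz, bits_per_pixel=30):
    """Uncompressed video data rate in Gbit/s (ignores blanking intervals and DSC)."""
    return width * height * hz * bits_per_pixel / 1e9

# "12K" taken as 11520x6480 here -- an assumption, vendors count it differently.
print(f"12K @ 120 Hz: {raw_gbps(11520, 6480, 120):.0f} Gbit/s")   # ~269 Gbit/s
print(f" 4K @ 480 Hz: {raw_gbps(3840, 2160, 480):.0f} Gbit/s")    # ~119 Gbit/s
print(f" 4K @ 144 Hz: {raw_gbps(3840, 2160, 144):.0f} Gbit/s")    # ~36 Gbit/s
```

So a link provisioned for 12K at 120Hz has headroom to spare for 4K at 480Hz, even before any compression.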
It doesn’t top out below 144Hz. There are benefits with diminishing returns up to at least 1000Hz, especially for sample-and-hold displays (like all modern LCD/OLED monitors). 240Hz looks noticeably smoother than 144Hz, and 360Hz looks noticeably smoother than 240Hz. Past that it’s probably pretty hard to tell unless you know what to look for, but there are a few specific effects that continue to be reduced.
https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
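The rule of thumb from that article: on a sample-and-hold display each frame stays on screen for the full refresh interval, so an eye-tracked moving object smears across roughly (pan speed × frame time) pixels. A quick sketch (the 1920 px/s pan speed is just an illustrative figure):

```python
def sample_and_hold_blur_px(refresh_hz, pan_px_per_s):
    """Approximate perceived motion blur in pixels on a sample-and-hold display:
    each frame is held for 1/refresh_hz seconds, so an eye-tracked object smears
    across (pan speed * frame time) pixels."""
    return pan_px_per_s / refresh_hz

pan = 1920  # px/s: a screen-width pan per second at 1080p width, illustrative only
for hz in (60, 144, 240, 360, 480, 1000):
    print(f"{hz:4d} Hz -> ~{sample_and_hold_blur_px(hz, pan):.1f} px of blur")
# 60 Hz -> ~32 px, 240 Hz -> ~8 px, 480 Hz -> ~4 px, 1000 Hz -> ~1.9 px
```

Halving persistence halves the smear, which is why the gains shrink but don’t disappear; you need on the order of 1000Hz before a fast pan stays within a couple of pixels.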
Yeah, I think the limits are going to top out around the 300Hz mark; it’s going to be really hard to convince people they can see or feel a difference between 300Hz and 480Hz. I already have no preference between 240Hz and 300Hz.
For computer monitors, I also wouldn’t be surprised if we top out at 4K for regular consumers, with a few niche 8K products available.
12K. Brought to you by the Hollywood Face Makeup and CGI Alliance.
I hate HDMI with a passion that can not be explained.
Why?
Well, one very good reason would be that the specification is closed, and as such not even HDMI Forum partner AMD can implement it in their open-source driver.
https://www.phoronix.com/news/HDMI-Closed-Spec-Hurts-Open
DisplayPort spec is fully open btw.
Thanks for a great answer.
I don’t know, there’s just something about it.
For a long time we had VGA for video cables. There was no VGA version 2.1.9, now supporting 1024x768 mode with 16-bit colour. Cables did not cost $29. There were no rent-seeking patent holders charging license fees, or at least they weren’t obnoxious enough that we knew about them. It didn’t have five different types of connectors. There was no VGA consortium constantly keeping itself in the news with periodic press releases. Companies didn’t need to sign away their soul to write drivers for it. There was no VGA copy protection trying to keep us from decoding our own video streams. Cables didn’t include enough microelectronics to power a space shuttle.
Somehow I think we could do better.
I get it now. Thanks.
Great. What’s the max length? 6”?