It’s nitpicking; whether it runs at 3840x2160 or 4096x2160 does not matter. The same goes for calling it 4K or UHD, even when one is technically incorrect.
If even Sony calls their 3840x2160 Blu-rays “4K UHD”, I’m fine with the average person using the terms interchangeably.
I had to go digging, but 3840x2160 is both 2160p AND 4K UHD. 4096x2160 is something called 4K DCI, which is more of a camera or film industry thing and is rarely used for things like TVs or video games.
1080p, 1080i, and 720p (i.e. the i/p suffix) denote a SMPTE resolution and timing.
HD/FHD/UHD (720, 1080, and 2160 respectively) also denote SMPTE resolutions and timings.
These are SMPTE ST 2036-1 standards, which are 16:9 and have defined (but not arbitrary) frame rates up to 120fps.
4K DCI is still a SMPTE timing, but it’s used for cinema and is generally 24fps (though it can be 48fps for 2K DCI).
It’s SMPTE 428-1.
There are other “4k” standards, but not nearly as common.
If you have arbitrary resolutions or timings outside of the SMPTE standards, they generally fall into VESA standard resolutions/timings or custom EDID resolutions/timings.
Chances are your computer is actually running 1920x1080@60 CVT-RB rather than 1080p60.
Whilst 1080p60 and 1920x1080@60 seem like they should be the same, some displays (and devices) might only support SMPTE timings or VESA timings.
So, although a display is 1920x1080, it might expect SMPTE timings while the device can only output VESA timings.
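To make that concrete, here’s a rough sketch in Python comparing the two timings, using the commonly published CEA-861 and VESA CVT-RB figures (treat the exact numbers as illustrative, not a spec quote):

```python
# Illustrative comparison of two common "1920x1080 @ ~60 Hz" timings.
# Figures are the widely published ones for CEA-861 1080p60 (VIC 16) and
# VESA CVT Reduced Blanking; a sketch, not a spec quote.

timings = {
    "SMPTE/CEA-861 1080p60": dict(h_total=2200, v_total=1125, pixel_clock_mhz=148.5),
    "VESA CVT-RB 1920x1080": dict(h_total=2080, v_total=1111, pixel_clock_mhz=138.5),
}

for name, t in timings.items():
    refresh = t["pixel_clock_mhz"] * 1e6 / (t["h_total"] * t["v_total"])
    print(f"{name}: {t['h_total']}x{t['v_total']} total, "
          f"{t['pixel_clock_mhz']} MHz -> {refresh:.3f} Hz")

# Both describe a 1920x1080 picture at roughly 60 Hz, but the blanking and
# pixel clock differ, so a sink that only accepts one set of timings can
# reject the other even though the "resolution" matches.
```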
Wow. This was very informative. Thanks!
No problem.
Displays, resolutions, framerates, and EDIDs are all very complex. And marketing muddies the water!
I’ve encountered this issue before when using Blackmagic equipment.
What I was plugging into was described to me as “1080p”.
Plugging a laptop directly into it would work, and it looked like 1080p in Windows display management.
Going through Blackmagic SDI converters (SDI is a SMPTE standard protocol, so these boxes went HDMI->SDI, SDI cable, SDI->HDMI, and would only support SMPTE resolutions/timings), the display wouldn’t work.
Because the display was VESA only.
I then read a lot about SMPTE, VESA, and EDIDs!
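If anyone wants to see which timings their own display advertises, the EDID is where they live. Here’s a rough Python sketch that pulls the detailed timing descriptors out of an EDID blob (on Linux the raw EDID is exposed under /sys/class/drm/; the connector name below is just an example). It follows the standard EDID detailed-timing-descriptor byte layout but skips extension blocks and sync details, so treat it as a sketch rather than a full parser:

```python
# Rough sketch: print the detailed timing descriptors from an EDID blob.
# On Linux the raw EDID is exposed at /sys/class/drm/<connector>/edid
# (the connector name below is just an example). Byte offsets follow the
# standard EDID detailed-timing-descriptor layout; extension blocks and
# sync details are skipped.

def detailed_timings(edid: bytes):
    for offset in (54, 72, 90, 108):          # four 18-byte descriptors in the base block
        d = edid[offset:offset + 18]
        pixel_clock = int.from_bytes(d[0:2], "little") * 10_000   # Hz
        if pixel_clock == 0:
            continue                           # not a timing (monitor name, serial, ...)
        h_active = d[2] | ((d[4] & 0xF0) << 4)
        h_blank  = d[3] | ((d[4] & 0x0F) << 8)
        v_active = d[5] | ((d[7] & 0xF0) << 4)
        v_blank  = d[6] | ((d[7] & 0x0F) << 8)
        refresh = pixel_clock / ((h_active + h_blank) * (v_active + v_blank))
        yield h_active, v_active, h_active + h_blank, v_active + v_blank, refresh

with open("/sys/class/drm/card0-HDMI-A-1/edid", "rb") as f:
    for ha, va, ht, vt, hz in detailed_timings(f.read()):
        print(f"{ha}x{va} (total {ht}x{vt}) @ {hz:.3f} Hz")
```

If the 1080p mode in there shows a 2200x1125 total, that’s the SMPTE/CEA-861 timing; 2080x1111 would be VESA CVT-RB.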
Correct, but both can be called 2160p just because of their vertical resolution. Overall, neither term matters much in gaming because the aspect ratio can be changed on the fly (on PC) depending on the output device. I haven’t touched a console in years, but I assume they are stuck with a 16:9 aspect ratio no matter what they are playing on?
Why is it incorrect? 4k isn’t a formal standard. It just means you have approximately 4k horizontal pixels.
Calling 3840x2160 “4k” makes sense since 3840 is so close.
On a different note, sometimes I’ve heard people call 2560x1440 “2k”, but neither 2560 nor 1440 is close to 2k, so that makes little sense to me.
1920x1080 is closer to 2K if anything.
Yep, Full HD is a 2K resolution: https://en.wikipedia.org/wiki/2K_resolution
Heh, I never knew this and I am tech savvy. TIL.
I think people call 1440p 2k because they know 4k exists and associate 1080p with 1k.
The logic of some people goes that anything under 4000 horizontal pixels is not “real” 4k. But as mentioned, I don’t care and also call 3840x2160 “4k” simply because it’s shorter than “2160p”.
Huh TIL
I thought it was because 4k has 4x the pixel count of 1080p
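The 4x arithmetic itself checks out, for what it’s worth, since both dimensions double:

```python
# Quick check of the 4x figure: UHD doubles both dimensions of 1080p,
# so the total pixel count quadruples.
fhd = 1920 * 1080   # 2,073,600 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(uhd / fhd)    # 4.0
```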
4K is definitely a formal standard
Ok, can you formally define it or link me to it?
And I don’t want a definition for “4k DCI” or “4k UHD” … just a formally accepted definition of “4k” (in the context of a display resolution). We can all agree that it colloquially means the number 4000, I hope.
There is not one definition; if you hear “4K” you can use the context of the conversation to determine whether they’re talking about the consumer 4K UHD format or cinematic 4K, neither of which has a horizontal resolution of exactly 4000px. UHD standards are maintained by the ITU; DCI standards were developed by the DCI group and are now maintained by SMPTE.
I agree. But then it’s not a formal standard.
4K UHD (along with 8K UHD and 16K UHD) is one of the consumer format standards; it’s the 3840x2160 image format, which includes Blu-ray. Full 4K, True 4K, or DCI 4K is the cinematic 4K standard shown at 4096x2160, which many TVs support via slight letterboxing.
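As a rough illustration of how slight that letterboxing is (assuming a simple fit-to-width scale onto a 3840x2160 panel):

```python
# Rough sketch: fitting DCI 4K (4096x2160, ~1.90:1) onto a 16:9 UHD panel
# by scaling to the panel width while preserving the aspect ratio.
src_w, src_h = 4096, 2160
panel_w, panel_h = 3840, 2160

scale = panel_w / src_w              # 0.9375
scaled_h = src_h * scale             # 2025 lines of picture
bars = (panel_h - scaled_h) / 2      # ~67 lines of black top and bottom
print(scaled_h, bars)                # 2025.0 67.5
```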