I like this announcement, because it means that there's a manufacturing line for proper HiDPI [1] displays running in some LG factory somewhere that third-party manufacturers like LG/Dell/Iiyama can hopefully use to give us some fresh good-looking 27" 5K desktop monitors. It boggles my mind how little attention very high pixel density displays have been getting from PC display manufacturers. I would also be first in line for a PC monitor that uses the M1 iMac display, but I suppose nobody sees a market for higher-end 24" monitors anymore.
[1]: HiDPI displays that work correctly with macOS's and Linux desktops' naive HiDPI implementations, which require 2x scaling for good results.
Nobody in 2022 will sell you a monitor that does that, except for Apple's expensive stuff that is hard to use with regular PCs, and one over-the-top Dell display. I wish everyone did what ChromeOS or modern Windows apps do. I need that extremely crisp font rendering in my life.
Apple have always been the first to push higher resolution devices for as long as I've been alive.
Laptops in the early 2010s were stuck on 1366x768 until Apple kicked up a fuss about having "retina", same with phones, which had comically low resolutions until Apple made a fuss about it with the iPhone 4.
Sadly my eyes aren't as good as they used to be so I can't make a lot of use of the extra real-estate, but it always seems as if they're ahead when it comes to resolutions on consumer devices.
The ThinkPad R50p had a QXGA screen (2048x1536) option back in 2003. Granted, Windows had zilch support for it, which is probably why it died. Even the base model was a 1600x1200 screen. And there were plenty of phones with 200+ ppi before the iPhone 4. I think it has more to do with higher-PPI screens getting cheaper and Apple just being able to get more supply first...
Of course macOS supports these higher resolutions, but there's also the color calibration, the support for wider color gamuts beyond sRGB, and having all of that integrated together.
For example, the iMac had Display P3 (a color space about 25% bigger than sRGB) support starting in 2015.
And Apple still hasn't shipped a laptop with a higher refresh rate than 120Hz when competitors have been shipping 144Hz or even 240Hz for years (a 100% bigger value than 120Hz!).
Apple doesn't ship stuff just because it's available; there has to be some appreciable value.
Faster refresh doesn't necessarily mean a better display. I suspect Apple's displays are higher quality than most competitors even if they max out at 120Hz.
Here are the specs for the latest MacBook Pros:
16.2-inch (diagonal) Liquid Retina XDR display; 3456-by-2234 native resolution at 254 pixels per inch
- XDR (Extreme Dynamic Range)
- 1,000,000:1 contrast ratio
- XDR brightness: 1000 nits sustained full-screen, 1600 nits peak (HDR content only)
- SDR brightness: 500 nits
- 1 billion colors
- Wide color (P3)
- True Tone technology
- Refresh rates: ProMotion technology for adaptive refresh rates up to 120Hz; fixed refresh rates of 47.95Hz, 48.00Hz, 50.00Hz, 59.94Hz, 60.00Hz
In games with fast motion, 120Hz vs 240Hz is easily perceptible (outside first-person shooters it's not going to provide much value). I would be fine with 120Hz for desktop work, but 60Hz displays have been a gripe of mine about Macs for a long time. 120+Hz was available on CRTs 25 years ago…
I can see benefits even while scrolling a webpage or moving windows around. It's very comfortable.
That being said, I can remember CRTs at 60Hz with side scrollers and demos at 60fps with perfect image stability while scrolling. You could read tiny text scrolling up and down perfectly. This is not the case anymore with LCD panels...
Try reading a webpage while scrolling; it's impossible.
60Hz or 120Hz, I would pay for a stable image in motion. It would bring tremendous comfort.
IMAX did tests back in the day with footage of a baseball being pitched directly at the camera. They found that the intensity of emotional response tailed off after 60hz. It's not that the higher frame rate isn't perceptible, but that raising the frame rate doesn't really have any appreciable impact unless people expect higher frame rates.
There's a notable difference between passively consumed content and interactive content.
We're fine with 24FPS for films and TV shows, but try that for an FPS and it's considered nigh unplayable. I remember when the line was "the human eye can't see past 30FPS"; now it's at 60. The treadmill keeps going.
Random cheapo laptop: 1366x..
One better: Apple
Even better: higher-priced laptops had 1920x.... Oh, and those had IPS displays, unlike Apple with its (halfway decent) TN panels.
So everybody else pushed, then Apple not only caught up but jumped ahead. And now they are again behind, with 3K vs 4K, 5K vs 8K, and mini-LED vs OLED, in the same order: cheap < Apple < expensive.
Same for the MacBook Air/ultrabook form factor: invented by Sony, but it only got popular once Apple jumped on the train years later.
Apple is really good at "sherlocking" and getting out a polished/well-integrated version of something that has been around for years and making it popular.
> Sadly my eyes aren't as good as they used to be so I can't make a lot of use of the extra real-estate, but it always seems as if they're ahead when it comes to resolutions on consumer devices.
With proper 2x scaling as intended by Apple there is no extra screen real estate. 5K at 2x scaling gives you the real estate of a 2560 x 1440 display, just with doubled pixel density and thus much sharper. This is the actual value of a HiDPI display.
I bought a Mac during this period and I have a distinct memory of having to choose between a 1080p laptop or a much lower resolution Mac. The MBPr was a leap forward, but I mean laptops that WEREN'T Apple's, at Apple's price range, had already moved on from 1366x768.
352 × 416 pixels at 2.1" is about 259 pixels per inch.
When introducing the iPhone 4, Steve Jobs said the number of pixels needed for a Retina display is about 300 PPI for a device held 10 to 12 inches from the eye.
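Those PPI figures are just the pixel diagonal divided by the physical diagonal; here's a quick back-of-the-envelope check using the sizes mentioned in this thread (the iPhone 4 figure is from memory):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(352, 416, 2.1)))     # ~259, the old phone screen mentioned above
print(round(ppi(960, 640, 3.5)))     # ~330, iPhone 4 (Apple quotes 326)
print(round(ppi(5120, 2880, 27)))    # ~218, a 27" 5K panel like this one
```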
They've had the iMac line using these displays forever and it hasn't filtered down to other display makers. Only LG, via the UltraFine line, has used these densities (also, Windows and Linux support is lacking or janky).
Iiyama [1], Dell [2] and LG used a 27" 5K iMac display for a little while, but as production at Apple wound down you can no longer really buy those in most places.
A portion of that could be related to lack of protocol support. 5K at 60Hz is more bandwidth than HDMI or DisplayPort could provide until very recently. Apple got around it by essentially making it two separate displays bound together in software using a single Thunderbolt cable, but that is only really feasible when you control the entire ecosystem.
Graphics cards that can output 5K60 over a single DisplayPort cable out of the box have been available since mid 2016. If anything, availability of HiDPI PC monitors has gone down since then.
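To put rough numbers behind that (a back-of-the-envelope sketch; the 5% blanking allowance is an assumption, real timings vary):

```python
def gbit_per_s(w, h, hz, bits_per_px, blanking=0.05):
    """Approximate uncompressed video bandwidth, with a rough blanking allowance."""
    return w * h * hz * bits_per_px * (1 + blanking) / 1e9

need = gbit_per_s(5120, 2880, 60, 24)   # 5K60 at 8 bits/channel, ~22.3 Gbit/s

links = {"HDMI 2.0": 14.4, "DP 1.2 (HBR2)": 17.28, "DP 1.3/1.4 (HBR3)": 25.92}
for name, rate in links.items():
    print(f"{name}: {'fits' if need <= rate else 'does not fit'}")
# Only HBR3 (GPUs from roughly 2016 on) fits, hence the dual-link tricks before that.
```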
Looking at the uproar over at Reddit, it seems like people care far more about refresh rate than resolution or color depth. This also seems to be borne out by the current monitor market - most monitors seem to be 1440p high-refresh rate monitors.
The vast majority of people with 120Hz+ displays do not play video games. They're standard in mid-range TVs. They're standard in mid- to high-end phones. They're standard in high-end tablets. Gamers are a minority of the 120Hz panel market. They're certainly also a minority of the people discussing this panel on Reddit; 5K is too high a resolution for gaming and 120Hz is too slow. People weren't wanting a 5K 120Hz panel to play games; they wanted it for productivity and at most to play games on the side.
Low refresh rate is bad for pros for the same reason it's bad for gamers: it makes it harder to quickly and accurately perceive things in motion, and it slows down your ability to accurately execute inputs. I feel the reality distortion field has caused everybody to believe that quite simply no real professional could dislike working on 60Hz LCDs, and that the disappointment is all coming from those silly gamers.
The PC market seems to be driven by gamers, and the word on the gamer street is that it's all about latency.
I actually like extra-wide displays. There are a few interesting options like that, but the rest seems to be dominated by low-color, low-resolution, low-latency stuff.
I wanted to build a gaming PC that would double as a Windows dev machine, so I wanted more pixels than 1080p.
Even with my 3090, I can only reasonably do 1440p @ 240Hz, and even then I lose some frames on Fortnite with graphics settings turned down. 4k was out. Thankfully Alienware makes a very nice 1440p 240Hz monitor.
Do you mean the AW2721D from 2020? If so, I was looking to purchase one myself.
How color accurate is the screen?
How well does it handle HDR?
Does it ever get blurry?
If you've ever tried running Linux, either bare metal or in a VM, did it handle DPI correctly?
I haven't tested for color accuracy or HDR. No Linux. Does not get "blurry" while I'm gaming.
My only recent comparisons are my new 16" M1 MBP and my 1440p Dell U2719 monitor (provided by work). Colors are wildly better than the Dell (of course) and it seems close to the MBP, although the MBP looks better out of the box of course.
At time of purchase, the only viable 240Hz 1440p options were this Alienware or the Samsung Odyssey G7. The Alienware offered better latency, and it was 40% off on the Dell website for some reason, so that's the one I went with.
Once you get used to low-latency or high-refresh-rate displays, you can't not notice the subtle mouse cursor drag of an "early days" 4K display (had one at work), or the teeny delay of (most) Dell monitors. Honestly considered at some point just asking work to get me a high-Hz display.
I have a 24"(61cm) 4k Dell monitor with Ubuntu... it is a bit unique these days, don't think there are many others around. Mostly happy with it, but...
I'd rather have higher density like the laptop it is connected to, with 4k. Perhaps 200dpi 3:2 or 16:10 around ~22"(56cm) diagonal that can do portrait would be my preferred monitor. Haven't seen that around unfortunately.
I made this comment elsewhere, but all I really want is a 96x2 PPI monitor because I'm mainly on Windows and that's what widgets look their best at. 24", 3840x2560 (for 96 x 2 PPI, 3:2 aspect ratio), 120Hz, 10 bits per color, topping out bandwidth at exactly two lanes of UHBR 20 (or four lanes of UHBR10) of DisplayPort 2.0 would be my holy grail.
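As a rough sanity check of that "topping out" claim (approximate numbers; the blanking overhead and 128b/132b efficiency figures are assumptions):

```python
# 3840x2560 @ 120 Hz, 10 bits per channel, with ~5% allowance for blanking
need = 3840 * 2560 * 120 * 30 * 1.05 / 1e9       # ≈ 37.2 Gbit/s

# Two lanes of UHBR 20: 2 x 20 Gbit/s raw, ~97% usable after 128b/132b encoding
link = 2 * 20 * 128 / 132                        # ≈ 38.8 Gbit/s
print(round(need, 1), round(link, 1), need <= link)
```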
I could deal with it being around 22" for higher density for Mac users.
It seems there are a few people in our same camp, but we haven't been heard by manufacturers yet.
There was an LG Ultrafine 21.5" 4k display which was the same DPI as the MacBook's screen, but it's been long discontinued (along with that model of the iMac, which was what the display was originally destined for)
I've been searching for those, but they're unobtainium even second-hand. I suppose no one wants to get rid of these monitors once they have them, because there are no replacements you can buy.
Yes. My desktop is using a janky 5k display with dozens of dead subpixels, and still it was the best option available at the time (and it now seems to be discontinued). It's impressive how the supposedly diverse PC ecosystem completely fails to deliver in certain areas; see also reasonably sized Android phones.
This is likely a mini-LED screen that Apple has been putting into the iPad Pro and MBPs, which is not a technology LG possesses. This is likely manufactured in Taiwan or China or Germany, using the licensed technology from Taiwan's Epistar.
There is no evidence for that. It is more than likely an LG display, as Apple has been rumored to be working with LG on Apple-branded displays, and they are the only producer (so far) of 27-inch 5K displays.
> I like this announcement, because it means that there's a manufacturing line for proper HiDPI [1] displays running in some LG factory somewhere that third party manufactures like LG/Dell/Iiyama can hopefully use to give us some fresh good-looking 27" 5K desktop monitors.
LG, Dell, and Iiyama all made such monitors; the only survivor is the LG one. They didn't sell well, apparently.
Personally it's still too much of a hassle dealing with HiDPI on Windows, especially if you mix HiDPI with regular-DPI displays. They also seem like overkill. I don't know about you, but 1440p at 27" is the perfect DPI for me.
Awesome to see them finally putting out displays with almost consumer-friendly pricing. A few things I'm disappointed about though:
- 60hz. For this price point I'd expect higher.
- Thunderbolt 3. Interesting that they didn't bump to 4, given the Mac Studio is Thunderbolt 4. This means you won't be able to daisy chain the displays.
- Lack of size options. Would love to see more variety here. After moving to an ultrawide format, I can't see myself moving back to standard format monitors from a productivity standpoint.
Overall though, I'm excited for this and keen to see how it'll evolve. It'll be a miss for me this cycle, but I'm looking forward to future releases in their monitor line.
DisplayPort 1.2 cannot even do 5k at 60Hz at 10 bits per channel. You might be thinking about DisplayPort 2.0, which is not yet widely supported in video cards or monitors, and even then it needs the (essentially non-existent) UHBR 20 data rate to get to 120Hz.
HDMI 2.1 does support 5K@120Hz@10bpc but requires DSC as you say, which is not very commonly found either.
(Note that unlike what a sibling comment states, DSC is lossy. Some argue that the loss is not noticeable to the human eye, but not everyone agrees.)
It's not commonly found on monitors. DSC is included starting with DisplayPort 1.4, and a good portion of monitors even today still ship with DP 1.2
Nvidia and AMD have not been shipping DSC with HDMI simply because HDMI only included DSC starting with HDMI 2.1, which only the current generation of graphics cards and very few monitors support.
Display Stream Compression, a lossy compression algorithm optimized for the specific use case of device-to-display transfers (i.e. relatively low compression ratio == very high quality that's claimed to be imperceptible even if it can't run lossless, not needing to work on full frames for speed/sub-frame latency, ...)
I didn't know either. DSC is Display Stream Compression, and has been part of the spec since 1.4. According to Wikipedia [0],
> DSC is a compression algorithm that reduces the size of the data stream by up to a 3:1 ratio.... Using DSC with HBR3 transmission rates, DisplayPort 1.4 can support 8K UHD (7680 × 4320) at 60 Hz or 4K UHD (3840 × 2160) at 120 Hz
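A rough sketch of why that ratio is enough (the uncompressed figures ignore blanking overhead, and the full 3:1 ratio is assumed):

```python
HBR3 = 25.92  # usable Gbit/s on a DisplayPort 1.3/1.4 link

def gbps(w, h, hz, bpp):
    """Uncompressed video bandwidth in Gbit/s, ignoring blanking."""
    return w * h * hz * bpp / 1e9

for name, raw in [("8K60 @ 8bpc", gbps(7680, 4320, 60, 24)),
                  ("4K120 @ 10bpc", gbps(3840, 2160, 120, 30))]:
    compressed = raw / 3  # assuming the full 3:1 DSC ratio
    print(f"{name}: {raw:.1f} -> {compressed:.1f} Gbit/s, fits HBR3: {compressed <= HBR3}")
```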
DisplayPort 2.0 supports up to 77.37 Gbit/s of bandwidth. Wikipedia's DisplayPort article has excellent tables about bandwidths of display formats: HDR 5K@120hz (57 Gbit/s) is totally possible with DP 2.0.
One could use DisplayPort 2.0 alt-mode to carry that signal over a USB-C cable. With the main disadvantage that the USB ports on the monitor would drop to USB 2.0 speeds [1]. And alt-mode 2.0 is only available on the most recent hosts with USB4 ports.
[1] Native USB4 tunneling only supports DisplayPort 1.4a (for now). That's a huge flaw imo, cause falling back to alt-mode takes over the cable and blocks USB3/PCIe tunneling from working. In fact, that flaw is even present on this monitor:
> When connected to iPad Pro 12.9-inch (3rd and 4th generation), iPad Pro 11-inch (1st and 2nd generation), or iPad Air (5th generation), Studio Display USB-C ports deliver USB 2 data transfer speeds.
Those models don't support TB3 or USB4 on their USB-C port, so they have to use alt-mode, with either compression or without HDR.
There's also the fact that nearly all content has compression as part of its path to display anyway. Video & images are obvious examples, but also GPUs almost always render to a lossy compressed framebuffer as well. It just saves way too much memory bandwidth not to do.
So as long as 'transcoding' from GPU framebuffer compression to DSC is itself a mostly lossless process, there's realistically not even a quality hit from it anyway
Samsung has 5120x1440 @ 240hz today. That is literally half of a 5K panel at double the speed. It amazes me how many people I've seen say this exact same thing about how 5k120hz is impossible.
Displays using MST to achieve higher resolutions than a single link allowed worked out more or less fine. The problem there is MacOS doesn't support MST.
That's a common misunderstanding. DisplayPort MST allows multiple display streams to be multiplexed onto a single DP link. So you are still bound by the bandwidth of a DP connection (25.92 Gbit/s for HBR3).
The 5K Ultrafine (which I presume you are talking about) is connected via two separate DP links which are tunneled over a single cable using TB3.
That sounds like nitpicking, but it is a crucial difference. MST provides daisy chaining of displays, but cannot help you exceed bandwidth limits. Whereas the two DP links are completely independent and theoretically allow you to double the max bandwidth. (Well, 2xHBR3 exceeds the max TB3 bandwidth, so the 5K uses 2xHBR2.)
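Putting approximate numbers on that (the link rates are the usable data rates after encoding; the 5% blanking allowance is an assumption):

```python
HBR2 = 17.28   # usable Gbit/s per DP 1.2 link
HBR3 = 25.92   # usable Gbit/s per DP 1.3/1.4 link
TB3  = 40.0    # total Thunderbolt 3 link rate, Gbit/s

need = 5120 * 2880 * 60 * 24 * 1.05 / 1e9    # 5K60 at 8 bpc, ~22.3 Gbit/s

print(need <= HBR2)        # False: one HBR2 link (all MST could offer on that hardware) is too slow
print(need <= 2 * HBR2)    # True:  two tunneled HBR2 links cover it
print(2 * HBR3 <= TB3)     # False: two HBR3 links would exceed TB3 itself
```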
TB4 has the same bandwidth as TB3, so you wouldn't be able to daisy chain 5K monitors anyway. Also, as the other commenter mentioned, TB doesn't have enough bandwidth for 120Hz (53Gbps vs 40Gbps).
Try doing your same math with the Pro Display XDR or the Samsung Odyssey G9 and your calculations will come up with the monitors being impossible especially accounting for overhead. 5k120hz is very possible with today's tech, you just need to use DSC or 2 links.
Yep. The LG 5Ks are the same way too. Moreover, if you have an Intel MacBook Pro you have to use ports on opposite sides of the laptop if you have 2 displays. You can’t plug them in on the same side.
Yeah, I don't know what it is with everyone prioritizing pixels over refresh rate... once you start using 100+Hz, 60Hz starts to feel like a slideshow. You can see mouse trails, and scrolling is jerky and uncoordinated; it's painful and insanely distracting.
I mainly just look at text all day, so the "slow" transitions between different bits of text just don't matter all that much to me. Resolution helps that text look nice and crisp though, which I really do appreciate.
I don't think I'm reading the text while it's scrolling... But maybe I am more than I realize? I actually think I'm more looking at the shape of the text when it's scrolling rather than the actual text itself.
To this day, 27" 1440p, 144hz IPS gaming monitors are still my favorite daily driver for any desktop productivity. And when I say favorite, I mean by an extremely large margin.
The amount of real estate you get with 2560x1440 is fantastic, and you can actually game on it at native resolution on older gen graphics. The pixel density of 27" @ 1440p is the epitome of goldilocks. Every single pixel is just the right amount of useful.
I switched from 27in 1440p to 27in 4K recently, and I am enjoying it. I previously used 100% scaling, but now I'm using 200%. I have less screen real estate, but everything is more crisp, and I find myself leaning forward at my monitor less and with less eye strain.
Although I agree that 100+ Hz matters and the difference is huge, pixel density is also very important, and for some (myself included), it matters more.
For a while I went down the rabbit hole of getting high end CRTs for retro games. I’m in a PAL region though, and the first time I booted up a 50Hz game I wondered how the hell I didn’t spend my entire childhood with a migraine. 60 I’m fine with but 50 I can literally see flickering.
I don't know what it is with everyone claiming a 60 HZ panel is unusable. Movies are in 24Hz and people panic if you make them 60Hz because it's too smooth for them, but somehow 60Hz isn't enough to browse the web?
Being able to see my screen's pixels annoys me a lot more than moving my mouse at 60 Hz.
Meaningless comparison. You also can't run most VR games at anything less than 90-120fps without introducing motion sickness in the user. So what does that say about "usability" of 60fps on a monitor? Nothing. You can't compare two visual mediums directly by the number of "frames per second".
Cameras introduce blur through the use of the shutter, which acts as a form of "interpolation" and smooths out the image. Displays do no such thing; image strobing is extremely noticeable at different framerates. The difference in Destiny 2 between 60FPS and 100FPS is quite literally shocking, to the point I can't go back.
TVs have techniques to deal with this, either motion smoothing[1] or black frame insertion. PC displays can't do the first for latency reasons but I think BFI is just down to expense, or because we don't have OLED/mini-LED displays yet.
There are two kinds on LG OLEDs, 24->60 and 60->120. The second one doesn't have the objectionable artifacts.
The first one people say you shouldn't turn on to respect the artist's vision, but I sure hope they don't mean it, because that'd mean they're respecting Harvey Weinstein. Clearly leaving it on is the moral option.
I've been running an LG non-curved 3440x1440 ultrawide monitor as my main work monitor for several years now. It's a great form factor, but the resolution is really sad. It's essentially a wider version of the Dell 1440p monitor that I had ten years ago. My ideal monitor now would be a pixel-doubled version of this ultrawide monitor, but I'm still tempted to lose the extra width and upgrade to this 5k display.
The Dell U4021QW is probably the best monitor I've ever used. I, too, got sad with 1440 vertical pixels on previous ultrawides, so I was pretty excited when they announced this one. 5120 x 2160 is really a dream. It's limited to 60hz, which bothers some folks, but I haven't noticed. I don't game or anything on it, though.
I actually forgot that that monitor configuration existed. It looks nice, although I assume you would still be running at the same UI scaling as a 1440p monitor, and the higher pixel density compared to a 34" ultrawide 1440p display (around 25% higher) means you put the monitor closer to your eyes. So unless you're doing non-integer scaling (which I want to avoid if at all possible), you get a lot more real estate than the 1440p display, but not 2x density. Or you could do 2x scaling, but then you've just got the real estate of an ultrawide 1080p display. Personally I have plenty of real estate with 3440x1440, and what I'm really interested in is jumping to 2x scaling.
Same. I have a 32" 1440p display and am considering grabbing one of these when I get back to the states in a few months. I could live without the speakers, but honestly if they're "good enough" they might kick my powered studio monitors off of my desk. I like my powered speakers but I don't use them anywhere near the volume they thrive at so it might be worth it to re-examine my workflow as a hobbyist.
I would really love an ultrawide for the extra space and immersive gaming option. But Retina display has spoiled me a lot. I regret getting used to a 4K configuration scaled to 150% which is effectively 2560x1440 display with higher PPI.
Yeah. Same dilemma here: splitting into two or three panes when coding doesn't look good on my LG 27" 4K monitor. It's either full screen or nothing. In that case, instead of buying two of the 32" Apple Pro displays, buying three of the 27" would work better.
I think I'll buy one and maybe buy two more later if it proves to be good, almost $5k though, ouch... Wish they made a simpler version with no ports, no camera, no speakers, etc - why would I need 3 cameras and 3 sets of speakers?..
It's kind of ridiculous how slow the progress in screen resolutions has been. It's disappointing that Apple is only going for 5k, not 8k; they're well positioned to fix the ecosystem and bring high-resolution large displays to everyone, if only they tried.
I'm finally sitting in front of an 8K 65" screen. This gives me a nice combination of decent pixel density in the center and lots of peripheral vision in which to put secondary windows, plus I can sit across the room and watch a movie on it. But every component of the ecosystem introduced problems and friction.
I have an M1 Macbook Pro on the same desk. The Macbook can't drive the 8k TV. I have a separate desktop running Windows with an nVidia GPU for that. Every component of the video ecosystem has given me friction in getting to 8k. I had to swap my $1k nVidia GPU for a different $1k nVidia GPU that wasn't any faster, to get HDMI 2.1 support. Had to use special HDMI cables, because cables that aren't specially marked as HDMI2.1 compatible don't have enough bandwidth. And then the display itself has a ridiculous postprocessing bug (I wrote about it at https://www.rtings.com/tv/discussions/IyO2wLLsNnJCMT-_/firmw...) which makes me think the firmware engineers didn't have a working 8k source to test with.
Maybe I'm just old and blind, but I can BARELY even notice the difference between 1440p and 4K if I very explicitly and intentionally look for it, let alone the difference between 5K and 8K. This seems so beyond unnecessary that it's absurd. I might be able to be convinced that you could notice on your 8K 65 inch TV if you were standing right in front of it, but on a 27 inch display that seems incredibly unlikely to me. Also, what world do we live in where a resolution that is literally above 4K isn't considered high resolution?
I sit 2ft from the 8k screen; it fills much more of my peripheral vision than most people's monitors do, and I position most things in the middle and never full-screen anything. Think of it like a multimonitor setup without the seams between monitors.
Are you talking about for watching video? In that case I'm with you, but for text (using it as a computer monitor as the OP is) the difference is night and day.
I have a 1440p 31 inch screen from which I sit about 1m away, and while it's finer than my previous 1080p one, in text or other fine features I can still notice the pixel boundaries. I would expect to notice them up to about 5K at this distance, but anything more would probably just be feel-good-ness rather than an actual measurable difference, for me at least.
Depends on the screen size. On a 22" screen, the difference between 4k and 1440p equates to 200dpi and 133dpi respectively - if you can't see the difference between 200dpi and 133dpi, I'm sorry, but you have either poor eyesight or glasses aren't doing their job.
This 5K 27" screen blows your 8K 65" screen out of the water at any reasonable viewing distance for a computer monitor...
Nearly no one wants TVs as monitors; the evidence is LG's OLEDs, which gamers pass over in favor of 1440p ultrawides with far worse picture quality, just for the more reasonable form factor.
I use a 48" 4k 120hz HDR LG OLED. It's by far the best electronic purchase I've ever made in terms of productivity and entertainment.
It has a bulky monitor stand, but an Ergotron will allow you to mount it on an arm secured to the desk with enough range to move it right to the edge of the desk. You could also mount it on the wall.
So as long as you have a deep enough desk (27"-deep ones are available at Ikea), you're good to go. With a good window manager you can do magic with this monitor.
Setting the 8K screen farther away so it fills the same field of view will make it strictly better than the smaller 5K if you can manage it. Same view, more angular resolution and your eyes are focused farther away which reduces eye strain.
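Rough numbers for that, treating both as 16:9 panels and assuming the 27" sits about 60 cm away (the distances are illustrative only):

```python
import math

def width(diagonal_in, aspect=(16, 9)):
    """Horizontal width of a panel from its diagonal, for a given aspect ratio."""
    w, h = aspect
    return diagonal_in * w / math.hypot(w, h)

d27 = 0.60                               # assumed viewing distance for the 27", metres
d65 = d27 * width(65) / width(27)        # same horizontal field of view -> ~1.44 m

print(round(d65, 2))                     # ~1.44
print(7680 / 5120)                       # 1.5x the pixels across the same field of view
```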
No it's not, which is why people take $2,000 1440p monitors still.
The "if you can manage it" part is an awkward setup usually with terrible ergonomics (a monitor too far away is worse for posture because you subconsciously tilt forward), it takes a ton of space, it looks hideous
-
Forgive me for my bluntness, but I am soooo tired of every conversation about improving monitors being sidetracked by people acting like using a TV-sized TV as a computer monitor is strictly better than anything else.
It's like saying using your hand saw as a flathead screwdriver is strictly better because it has a larger handle... as long as you can line it up with the screw.
You may not like it yourself but other people have the opposite experience. Setting the screen farther away and having your eyes focus at that distance is more comfortable for them. That other people choose other solutions doesn't invalidate that this solution exists and is valued by people. And I used the term strictly precisely. If you can manage that setup physically you get the same angle of view, more resolution and less eye strain. The 8K large screen dominates the 5K small screen solution for that set of criteria. If you add other criteria you care about instead the evaluation changes.
You're using strictly in a way that's the opposite of precise when you need to preface it with a barn sized caveat.
In fact your suggestion is worse than the standard one; most people accept that you'll never sit far enough away from a 65" TV to have the same apparent size as a 27" monitor, and settle for a "small enough" apparent size.
-
I can't be bothered to do the exact math here, but even at normal TV viewing distances meant to fill your entire view, the recommended distance is 5.5ft to 8ft.
Getting it down to the same apparent size of a 27" monitor would mean sitting across a very large room...
> You're using strictly in a way that's the opposite of precise when you need to preface it with a barn sized caveat.
It's not a caveat, it's the whole point. To not have your eyes straining to focus up close, the screen needs to be far away; you can't have one without the other. The people that value that start at that point, figuring out how to get the screen far enough away. For them it's not a caveat, it's a requirement. So for them, when they go shopping for screens, the 8K 65" option totally dominates the 5K 27" and is a strictly better option.
> most people accept you'll never sit far enough away from a 65" TV to have the same apparent size as a 27" monitor and settle for "small enough" apparent size
The cases I'm referring to do put it far enough away; that's the point, it's how you get a 65" image down to the equivalent of 27" on a desk. That suggestion shows up once in a while on HN when people discuss eye strain. You've now gotten another answer from someone that does something completely different and uses the 8K TV as a multi-screen setup on a single screen. That's the problem with these discussions. A TV or projector used 5+ meters away, an external screen on your desk, and a 14" laptop at keyboard distance are completely different use cases. Yet often these discussions end up on ">X DPI is essential" without considering this factor. It seems no one can imagine that other people use their screens very differently from them, and it turns into a religious war.
That's the wrong approach. When I want a centered 27" rectangle, I use software to put a 27" rectangle in the spot where I want it. I never fullscreen anything, and put secondary things around the edges, the way one would use extra monitors. There are some minor software issues to work through, it definitely requires some customization in the window manager department, but it's worth it imo.
8k/65in has the same pixel density as a 4k/32.5in, which is mid-range as computer monitors go. 16k would be nice if that was possible to buy, but this pixel density is good enough for my purposes.
You're talking to someone who upgraded from a 4K/32" equivalent ("5K" 40" ultrawide) to the XDR just for the pixel density increase, and it's no contest at all: even the most casual PC user would immediately appreciate the bump.
I'm more annoyed by slow progress in refresh rates. Resolutions have progressed pretty nicely IMO, but the top-end monitors from Dell, Apple etc. are still all running at 60Hz.
Also feel like 27 inches is pretty small these days, for high productivity type of work. Wish Apple went for a 34 inch
Why do they keep releasing these incremental specs/generations with minor improvements? I mean they should just define a standard that is actually somewhat futureproof.
But I'm sure there are economic reasons why that's not done.
I've seen this sentiment a lot but it doesn't track with my experience. 4K@60Hz is now common and very cheap (<300€ for an LG IPS 27'' screen). It definitely wasn't just a few years ago, people even bought weird 4k@30Hz screens as a compromise. 5K is an intermediate step most manufacturers didn't bother with and we're getting 8K now at which point we've pretty much maxed out human vision for almost all applications. As far as I can tell we live in the future and nobody is happy. Maybe it is because very high end screens did 4K and 5K early and stopped there for a while because at that price it was a niche. Meanwhile all the innovation has been on making the cheaper ones reach that same level.
The key feature is "HDMI 2.1 display stream compression", which is only present in very recent GPUs, and without which a computer can't output 8k at all. If it's new enough to have that feature, it's fast enough to handle 8k no problem. I don't typically run videogames at 8k, but I'm sitting close enough that I want them centered in a ~4k window anyways rather than filling my peripheral vision.
For me, it's frustrating to not see an option without all the webcam, microphone, octuple speaker array, A13, 100W charging, TB hub madness. I have to imagine that adds significant cost, and at $1600 we're not in the territory where this stuff can be included just 'cause.
I like it, but I imagine some professionals would like to do what Apple's marketing images all show; buy two, maybe even three. I'd love to have one with all that stuff, but not all of them need the bells and whistles; and there's value in having all your displays be identical, especially in work that needs color calibration (not to mention, it looks nice).
So, maybe I grab one if the reviews look solid. And hopefully in the future they release a version without all that extra stuff for closer to the $1200-$1300 an LG 5K ultrafine display can be had for.
Except if you compare it to the LG 5K, it's worth the extra cost in build quality alone. The LG monitors are notorious pieces of junk with flaky connectors that come loose like clockwork.
The new Studio display is expensive, but I think the features somewhat justify the price compared to the "competition".
It's not hard to beat the notoriously bad build quality and reliability of the LG UltraFine and Apple has a track record of building solid displays (I have an LED Cinema Display that's going on 13 years old with nary a single issue).
I think your experience has been a constant of most Apple products for the last 22 years or so. Some people have no issues ever, some people have constant issues all the time. That said, I loved the Cinema displays and they did have a good track record, but if memory serves, the Thunderbolt Display variant was notoriously flaky [0]. Since they've put out so few external displays, it does stain their reputation a bit.
Everything I've heard about the Pro display XDR's has been that all the things like USB/Thunderbolt connectivity are rock solid.
This suggests Apple's display group has their goods together, and the Studio Display is likely to be solid too.
My personal take: my two 27" 4K LG monitors run GREAT here. I like the extremely small vertical bezels on both of them (one is a 144Hz and the other is a 60Hz). Connectors are fine. The monitors don't move at all, so build quality issues are imperceptible here. I think I paid something like $600 for each one at different points in time.
Yeah I know its just anecdata but I've been using mine daily for over 4 years now, all day every day, and never had an issue. I love the single cable + built in peripherals and am puzzled at the lack of competition around it. I had really questioned whether I was reaching when I purchased it for ~$1200 but 4+ years later, there's still very little to compare it to. (Tons of great options if you don't want built-in peripherals / single cable of course)
I've had 5 UltraFines since they were released around 2015/16: first 3x 4K 21.5", and now 2x 5K 27". Never experienced any issues; build quality is quite high in my opinion and the stand is surprisingly solid, much better than my Dell UltraSharps and LG UltraWide.
Of course not at the level of the Apple Displays but they are really outliers in the industry.
Apple is not interested in releasing a cheap dumb commodity PC-compatible monitor and never will be. It just doesn't make sense for them. For a lot of reasons.
I don't feel that's a reasonable take, in this case.
Apple's own marketing images show tons of these displays side-by-side with one-another. And that makes total sense; the M1 Ultra can run four of them IIRC. Apple customers, especially big businesses, will buy multiple of these just for the sake of them looking good next to one-another. Not unreasonable.
But what's the point of having three webcams & microphones? 18 speakers? It's just needless cost overhead; not to mention, I'm calling it here, when these things start shipping we're going to get a Verge article or something of people with dual+ displays complaining about how the webcam selection dialog in Zoom/etc is confusing because they're all ambiguously named like "Studio Display Webcam (1) (2) (3) etc". I can see it now.
I think it behooves Apple to offer one of these without all the bells & whistles. Knock $200 off the price or not, I'm not sure that's relevant; but as a secondary display, the bells & whistles actually get in the way. It's not about being a cheap dumb commodity PC-compatible monitor; it's just about serving what their customers are overwhelmingly going to want.
It doesn't matter whether you think it's reasonable. It makes no sense for Apple to try to do this.
You're entirely right that it makes no sense to have multiple sets of Dolby Atmos sound systems and cams and mics. But that doesn't mean it makes sense for Apple to do commodity displays. They're clearly betting that one 5K display will be enough in many cases.
Stocking multiple SKUs of large displays just doesn't make sense in their retail stores. I suspect the profit margins are iffy as well compared to, say, iPhones obviously, where the margin is circa 40% and the size of the box is tiny.
This entire comment just screams "it is how it is, until it isn't". Whether or not they decide to do it has nothing to do with anything you've said.
They're already stocking six Studio Display SKUs in-store (if they decide to stock them all in-store, that is). Two variations of glass finish times three variations of stands. Yes, that's right: "Each stand or mount adapter is built in. They are not interchangeable, so it’s important to consider your workspace needs at the time of purchase." They're not a separate component, and not even easily swappable outside the factory.
Their profit margins are fine. They can charge whatever they want for it; it's not relevant because, clearly, if they were concerned about being price competitive with other displays they would not be charging $1600 for this one. Even the LG 5K UltraFine is $400+ cheaper. In one comment you assert that Apple isn't interested in being a commodity monitor manufacturer, and in another you assert that the profit margins are low; so, which is it?
You could be right if they end up actually stocking those in all stores. I'm not sure whether that will be the case.
One small counterpoint to that part of what you said is that the 27" iMac has apparently now been pulled from the lineup, so, that should clear up some space in the stockroom.
As for the "which is it?" question, I don't feel those two points were mutually exclusive, so therefore I don't think your question is really valid. I think both of my arguments were true and valid; that's why I made them to begin with. You might disagree, of course. :)
Would be nice to have >5K resolutions to have more pixel density.
Alas, macOS doesn't support any native scaling other than 100% and 200%. So if they did release e.g. a 27" 8K monitor, the text would either be too small, or they'd have to use bitmap scaling to make it bigger in which case there'd be no advantage of having an 8K monitor.
(EDIT: To clarify, all other scaling factors are done by rendering at either 100% or 200% and doing bitmap scaling up or down. By bitmap scaling 200% up up to e.g. 250%, things are bigger so that's good, but there's no extra detail being displayed, so you're wasting the resolution of your monitor. You might as well buy a cheaper monitor with fewer but larger pixels.)
I really don't understand why they don't either (a) adopt Windows' approach of allowing rendering directly to any arbitrary scale, or (b) at least introduce a 300% mode with bitmap scaling analogous to their 100% and 200% modes.
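A rough sketch of the backing-store arithmetic described above (the mode sizes and the 5K panel are just examples; actual macOS behavior has more nuance):

```python
def scaled_mode(looks_like, native):
    """Render at 2x the 'looks like' size, then resample that buffer to the panel."""
    backing = (looks_like[0] * 2, looks_like[1] * 2)
    resample = native[0] / backing[0]   # 1.0 means pixel-perfect; anything else is bitmap scaling
    return backing, round(resample, 2)

native_5k = (5120, 2880)
print(scaled_mode((2560, 1440), native_5k))  # ((5120, 2880), 1.0): true 2x, no resampling
print(scaled_mode((2880, 1620), native_5k))  # ((5760, 3240), 0.89): drawn large, scaled down
print(scaled_mode((1920, 1080), native_5k))  # ((3840, 2160), 1.33): drawn small, scaled up
```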
Your suggestion was tried for years and failed every time because it doesn’t work. HiDPI shipped because it was 2x or nothing. (Later there was 3x.)
The reason Windows developers still think it might work is they have no taste and don’t care about localized UI, pixel cracks, or blurry bitmap controls.
Android has been shipping arbitrary DPI support for longer than iOS or MacOS has supported the limited 2x HiDPI, and it doesn't have the issues you're talking about.
It's definitely possible to do, although retrofitting an existing UI toolkit & ecosystem is a massive challenge.
Isn't the Android SDK older than iOS? Either way, it can't escape blurriness. There's bitmaps on this very page and if I zoom in the reply button misrenders.
Actually it can't escape pixel cracks either, as I seem to remember OpenGL ES 1.0 including GL_LINES where the default width is 1px.
Android's pixels are display pixels; there's no post-render scaling. So no pixel cracks, no blurriness, etc... Density is instead handled mostly at layout, as it should be.
This is also best practice on many other platforms, too (eg, 'pt' and 'em' units). This is far less commonly followed on most toolkits, though, but it's the norm on Android.
nah, hidpi on Windows is fine if you know what you're doing as an application developer. it's those who haven't bothered to make the changes recommended or required that make windows hidpi support look worse than it is in single-monitor setups.
multiple monitors on Windows with varying dpi scales is not great though, and probably can't be until some backwards compatibility promises are broken or expired.
> nah, hidpi on Windows is fine if you know what you're doing as an application developer.
Many (most?) don't or can't be bothered so the end result in practice is that it doesn't work. Microsoft has been chasing the HiDPI fairy for decades and the situation hasn't really gotten better. They still ship software (both in and outside the OS) that isn't HiDPI aware.
Retina prioritizes making it easy for developers to adopt: Double the size of artwork, points are 2 pixels, Done. Are there quibbles? Sure. But millions of people are looking at HiDPI screens where every single thing in the OS and all third party apps they use fully support 2x.
No one lives in the magical world where arbitrary scaling factors are supported. Doing that as a developer is just too damn complicated when @1x + @2x (and maybe @3x) makes 98% of people happy and is vastly less work.
> Microsoft has been chasing the HiDPI fairy for decades and the situation hasn't really gotten better.
Well, I want to say it has gotten better. Their Metro design is what a UI has to look like to work with arbitrary scales and still solve those problems I mentioned. The downsides are it uses simple geometric shapes and has tons of whitespace everywhere so text can reflow in longer languages.
This is probably another big reason behind "flat design" in modern websites; the old skeuomorphic stuff would be hard to do with vectors.
(As for vectors in UI, they have other performance issues which are important if you like live resizing windows.)
The vast majority of apps on modern Windows (10+) just work. I've been running all kinds of weird scaling factors like 175% for the past several years with no issues.
>nah, hidpi on Windows is fine…
Exactly, it’s fine.
Apple has never settled for "fine"; either it's great or it gets cut at some point.
>if you know what you’re doing as an application developer.
Apple also tends to avoid wading into footgun waters to keep quality (for 3rd party devs) as high as possible across the board.
> I really don't understand why they don't either (a) adopt Windows' approach of allowing rendering directly to any arbitrary scale
Windows is badly broken in this regard. I bought and returned a Microsoft Surface laptop because its rendering is broken: If you display a web page containing horizontal lines (like grid lines of an HTML table) then the lines will appear to have varying thickness even though they are all set to 1px. That's crap; I couldn't believe Microsoft is shipping this. If Windows scaling is set to anything other than 100% or 200% you will have this issue. Both 150% and 300% have this issue. I have never seen such issues on a Mac.
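That artifact is easy to reproduce on paper: at a non-integer scale a 1px line lands on fractional device pixels, and once its edges get snapped to whole pixels, identical lines come out 1 or 2 device pixels thick depending on where they fall. A toy model of the rounding (not any browser's actual rasterizer):

```python
import math

scale = 1.5  # 150% display scaling

# Ten 1px-thick horizontal grid lines, one every 15 CSS pixels,
# with edges snapped down to whole device pixels (a simplified model).
for i in range(10):
    top_css = i * 15
    top_dev = math.floor(top_css * scale)
    bottom_dev = math.floor((top_css + 1) * scale)
    print(i, bottom_dev - top_dev)   # alternates between 1 and 2 device pixels
```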
Macs do support non-integer scaling. In fact, macOS currently ships non-integer scaling by default in certain MBP models. It gets criticized from time to time, though the newer 14/16" MBPs ship 200% as default again.
Unless things changed since last I checked ~2 years ago, macOS’s 150% scaling actually renders at 300% and downscales. It looks pretty bad (visible aliasing on any kind of text) and is wasteful performance-wise.
As far as I know, it's worse, it renders at 200% and downscales.
macOS draws only at 1x or 2x. 3x is iOS only.
Apple's decision to support only integer scaling was what allowed them to adopt Retina displays very quickly. Unfortunately it led to a suboptimal solution in the long term.
I upgraded from a 2015 13" MacBook Pro to a 2021 14" MacBook Pro, and it's remarkable how similar the devices are. The physical dimensions are almost identical. Neither has a touch bar. And they both have a similar selection of ports (HDMI, SD card, aux, MagSafe), in very similar locations. The difference is basically improvements in quality of pretty much everything: better screen resolution and size, better webcam, better keyboard, better speakers, USB-C/Thunderbolt instead of Thunderbolt 2 and USB-A, better battery life, and of course a much, much faster processor and thermals.
However this decision does mean Apple can't produce monitors with more than about 200 PPI without the text being too small (at 200%). i.e. they can't go beyond 5K at 27".
Which is a shame, you can use 8K monitors with Windows at e.g. 300% just fine.
They can easily add support for rendering at 3x. iOS has supported that since iPhone 6 Plus. The only thing that's missing from macOS is a switch to turn it on, and produce all the UI assets at 3x.
For textual work, 4K is crisp enough; for gaming, you can't run 8K at 120Hz as it's way too demanding and there's no content that dense; for movies or TV there's barely any 8K content; so maybe it's just for photo editors.
Having 120Hz for all the display's baseline is far more important than going beyond 4k.
> (a) adopt Windows' approach of allowing any arbitrary scale
Apple is all about controlling and curating the user experience. They would much rather force you into some lane than allow you to go wild configuration wise.
Personally, no, but the GP did say "allowing any arbitrary scale" and that's what I was referring to. Why Apple does not allow 150% is beyond me because it is quite reasonable. But I can understand why they don't allow users to enter a value like 168%.
> Why Apple does not allow 150% is beyond me because it is quite reasonable.
It is absolutely not reasonable. HTML tables with horizontal grid lines will appear to have varying line thicknesses when in reality they are all set to the same width.
I wish they wouldn't make me scroll through two pages of animated backgrounds just to see the actual monitor. That said, if I were to spend this much on a monitor I would prefer the Samsung Odyssey Neo, which has a 5K resolution on a curved 49" display with a 240 Hz refresh rate, HDR 2000 and G-Sync/FreeSync. I guess the color space coverage is not as good though, and it's not as bright at 420 nits, though that's more than bright enough for me.
The 300 nits brightness kills this model for me. With my desk sitting next to a window, even with indirect sunlight anything below 400-500 nits starts having usability issues. I could lower the shades but I'd much rather let the sunlight in.
I have several Dell monitors (writing this on a 27" 4K S2721QS); they're absolutely awesome for office work. Sometimes I crave a higher refresh rate though; I have another ultrawide monitor with a 144 Hz refresh rate and it really makes a difference, I find.
I learned a long time ago to hit "Tech Specs" on the top right instead of sitting through the ridiculous marketing pages (especially since they have awful performance on Firefox for some reason).
I've worked off Apple Cinema displays since the '90s and Apple Studio displays before that. I am a monitor snob. I care deeply how things look.
I currently have an Apple XDR Pro monitor. It's not a great monitor - it is indeed big, the outside is as cool as anything ever made, and the USB-C hub is nice. But in EVERY other way it is badly inferior to the stock 5K monitor that comes with even the cheapest 27" iMac from a few years ago. I've been wishing for just an 27" iMac monitor that I can plug into my laptop, since that's the best non-laptop monitor I've found - even contemplated building a Frankenstein one. I'm excited I can now just buy one.
Just to be clear, the XDR is not a _terrible_ monitor. It's certainly better than a random $200 monitor. Most people in the world wouldn't notice the following issues.
- There is a noticeable (to me) brightness shift with viewing angle changes. When the monitor is on the back of my desk and displaying a flat color across the monitor, I see a noticeable gradient instead of a flat color because of the different viewing angles between looking head-on at the center pixels vs being off-axis to the side pixels.
- The adaptive brightness backlighting doesn't do a very good job. Full-bright white pixels in a dark area will be noticeably darker compared to full-bright white pixels surrounded by other brighter pixels.
- If you have some bright pixels in an otherwise dark area, the backlight cell built into the monitor there noticeably brightens, putting an approximately one-inch-by-one-inch halo around the bright pixels. For example, this halo shows up around the mouse cursor when moving across a black area.
- I tend to work with light text on a dark background. The monitor makes the darks too bright (from the backlight bleeding through) and the brights too dark (because the backlight around there isn't on all the way). Everything is muted. It's noticeably less contrasty, colorful and alive when used with light text on a dark background than the two iMac 27"s I have had.
- Although the PPI specs are the same, there's something about the pixels on the XDR that makes them more visible and less smooth to me than on the 27.
Ah, I haven't noticed much difference between my 27" iMac and my LG 5k UF's (there's a little), and been tempted to get the XDR, mainly because I was hoping for even better and bigger panel than the 5k ones. With your comment in mind I'll decommission that idea. I appreciate your comment, it's exactly the kind of stuff I'm picky about while others think I'm mad.
I'm not the OP but I've had an XDR for about a month now, which replaced a 27" LG UltraFine 5K (same panel as the one used on 27" iMac for years). My issue with the XDR is the lack of built-in camera, mic, and speakers that are serviceable for zoom calls.
I don't need audiophile stuff, but it's remarkable how bad the built-in AV was on the LG UltraFine. Mic quality is bad enough that I won't inflict it on my co-workers. The camera is angled too low, resulting in the top of my head usually being cut off. And the speakers go from quiet to really loud, with no in-between.
Can you actually connect HDMI input to a thunderbolt only display? I have the LG 5K thunderbolt-only display and cannot use it with a desktop PC due to this issue.
You'd still use a 5K display in a non-native 4K resolution (surely the Xbox doesn't support higher resolutions?), which might be OK for video and most games, but it's not a good solution. I'd just get a half-decent, affordable 27" / 4K display for gaming and keep the Studio Display for actual work.
Plus, you can use the additional 4k display as an extended macOS display when not gaming. I mean there's never enough screen real estate.
Hard to say. Those are actually designed for the other way around: plugging a USB-C device into a DisplayPort monitor, by extracting the alternate mode out of the USB-C connection.
Benq's is pretty good but whoever stuck those checkbox-filling speakers into the design is making a cruel joke at everyone else's expense. I get better sound out of my phone. Way, way better sound.
True, there are QA issues reported with it. (FWIW I've had mine since 2017 and only starting to hit random connectivity issues now, but 5 years isn't a bad run for a heavily-used monitor)
For these types of displays, you shouldn't really ever be using the stand they come with anyway. Monitor arms are ubiquitous and well understood. They do a much better job of holding a monitor than the stands that come with most monitors.
That's a "you are holding it wrong" kinda argument. If you pay 1.5k for a monitor you should be able to expect a stand that doesn't shake if you bump into your table a bit.
Yeah there was no mention of local dimming or refresh rate so it seems more like a refresh of that display with the new Apple design. Though for refresh rate I kind of understand since I don't think there are any 5k 120hz displays on the market. I also noticed that it only has a single thunderbolt port, so I'm not sure if you can daisy chain the displays. Even just adding local dimming would have been a huge difference maker but it likely would cannibalize the Pro Display XDR sales.
No webcam or microphone in the LG displays. Well the 4k at least. I assume the 5k is the same. It does have speakers. They're ok for the price point but not notable. It's also basically unsupported as far as software is concerned. You can't connect an iPad to it for example. Or at least I can't anymore. Sound would play but not screen mirroring.
Interesting. Thanks for the info. I guess I should have been less lazy and just looked that up before posting. When I was looking at both the 4k and 5k it was completely lost on me that it had those features. Dang! I don't need 5k (or 4k really but w/e) but would have liked to have had the webcam and mic.
I wish Apple would use their clout to move up in size. I've been waiting for a nice 32" monitor or larger for a while at a reasonable price. Currently using a 4K 39" TV as a monitor (had to contort the settings to get the text readable, etc.). Probably has something to do with panel yields and profit, sadly (but understandably). That said (and acknowledged, I'm not generally an Apple person), I see very little reason to buy Apple monitors; from a price perspective I just don't see them as a value, and I see many extremely comparable monitors that work great with Apple computers.
...although fair warning it does have occasional "flickering maroon blankouts" (1-2x...3-4x per day?). AFAIK they're basically two panels side-by-side so sometimes my right-side-one will "blip" for half a second. (search flicker in the reviews/questions).
Find one you're happy with, and check the reviews! The difference between "monitor" and "TV" is massive w.r.t. latency.
For all the work Apple is doing to improve their environmental impact (which I applaud) I'm completely flabbergasted as to why this is a single device monitor. It's quite the impact on the environment to require users who have a Mac mini/Studio and a MacBook Pro to buy two monitors (no matter how great the production chain is).
Sure, you can plug/unplug the cable, but that works so-so on a Studio with the ports on the back.
Everything else I could've lived with, this is a major omission :(
I use an IOGEAR Thunderbolt 2 KVM[0] that lets you switch thunderbolt between two computers. I use this in conjunction with three TB2 -> TB3 adapters[1] and a CalDigit TS3+[2] (which drives my DisplayPort monitor). It's janky, but it works, and I don't really notice the speed difference dropping down to TB2 speeds since the only TB3 peripheral I hang off the CalDigit is an audio interface.
At some point I actually had a TB2 only computer I wanted to KVM switch with a TB3 laptop so this weird setup made more sense then.
Do they actually work well (genuinely asking)? I once started looking into something like that and it seemed to be all pain and suffering to get it to work, or it had weird behaviours.
If you have a product that you know works, I'm all ears; it would make my life a lot easier.
I currently do this with HDMI (I somehow make do with measly 4K monitors).
I assumed such a thing existed for thunderbolt but hadn't looked into it; it looks like you're right that this market is underdeveloped at present. Hopefully somebody does it! Or, the wisdom of HN will yield us some options.
Last I looked, Thunderbolt switches were prohibitively expensive and only made by oddball companies. My solution is a Thunderbolt dock (CalDigit TS3+/TS4+, ThinkPad Thunderbolt 4 Dock, etc) along with a TB cable for each machine that I swap the connection to the dock with as needed. Since it's a single cable swap, it's not much of an inconvenience.
Use the remote control technology in the latest MacOS. It will allow the keyboard and mouse from the Studio to control the laptop. Even more environmentally friendly than having an extra cable.
I would need to share the screen too between my laptop and my desktop Mac, so, no. Home office is a thing these days, and that means you need to connect a screen to your work computer. I would rather give up the additional USB ports on the screen (though they are nice) than go without a second input. And given that the screen has an A13 of its own, it would have the ability to handle multiple inputs nicely, e.g. PIP or clean switching of the attached keyboard/mouse.
As already mentioned, people working from home would benefit from this a lot, and KVM monitors for two machines have been a thing for at least a decade (Dell, Eizo, and BenQ among others have been offering them, so there's clearly a market). And even if the feature weren't used much, the environmental cost of just one extra monitor outweighs that of many, many unused ports.
Honestly - I think it's pretty niche. I get that on HN - everyone will chime in and say it's frequent but I've seen thousands of monitors in my day... and all of them were connected to just one device and never more than one device.
Yes - sometimes things get connected to multiple devices but again - it's a niche.
No mention of refresh rate, so I'm assuming a 60Hz panel; only USB-C and Thunderbolt, no HDMI or DisplayPort; and no mention of an HDR rating or mini/micro-LED backlight zones. Apple's gonna Apple, I guess. People will still buy it.
I've still got my old 27" Apple Thunderbolt Display (a puny 2560 by 1440 pixels), which I use daily. (The convenience of built-in video/mic/USB hub, with one cable, is indispensable and more important to me than resolution, and it's been really unclear to me what else could do that with a MacBook.) But I might be ready to upgrade to this guy.
I also have a Thunderbolt Display, which I got last year. It's been great and, as you say, having webcam/mic/speakers/usb hub _and an Ethernet port_ is absolutely fantastic. And the quality of the thing overall is just superb.
Also for me those "add-ons" are more important than pixel density, and I'm saying this as a graphic designer. If there were a version of this with a bigger size (but lower pixel density), and maybe a more 'squared' form factor, I'd jump to it without even thinking, but alas for Apple pixel density comes before everything else.
I am interested in comparison shopping, but I can't find many other vendors that offer displays with a built-in camera, mic, and speakers, as well as an Ethernet port and USB hub, all connected with a single USB-C/Thunderbolt cable.
Do they exist but I'm just bad at shopping? I may be! Please help out?
Great news. I use an LG ultrawide but it took three LGs to find one that MacOS could reliably detect with a supported resolution. It was a ridiculously complicated and painful process.
I had a 2014 RMBP with the Thunderbolt monitor and Apple wireless keyboard and mouse. Everything “just worked”. I never had any issues. I just spent my time working instead of fighting with my workstation. It was glorious.
Glad to see Apple back to form here. I’m willing to pay the premium for a complete solution.
There's a good reason for this: Thunderbolt 4 doesn't have enough port bandwidth. TB4 can carry a max of 30Gbps but 5K @ 120hz is 32Gbps. There's no way Apple would do two cables for this, and the next gen of Thunderbolt is a while off, so the choice makes sense.
Doesn't Display Stream Compression support higher bitrates? Also, I think Thunderbolt 4 supports DisplayPort Alt Mode 2.0, which can handle up to something like 80Gbps. But I guess that comes at the cost of attaching any devices other than the monitor.
DisplayPort 1.4 (which TB3/TB4 support as an alternate mode) has enough bandwidth for 5k at up to 144Hz using display stream compression (which Apple already uses for the Pro Display XDR): https://en.wikipedia.org/wiki/DisplayPort#Refresh_frequency_...
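(A rough back-of-envelope sketch, in Python, of the arithmetic behind these bandwidth claims. The exact Gbps figures depend on bit depth, blanking overhead, and the assumed DSC ratio, so treat the numbers as ballpark rather than spec-sheet values.)

    # Raw (uncompressed) video data rates, ignoring blanking/encoding overhead.
    # The ~3:1 DSC column assumes a typical compression ratio, not a measured one.
    def raw_gbps(width, height, refresh_hz, bits_per_pixel):
        return width * height * refresh_hz * bits_per_pixel / 1e9

    for hz in (60, 120):
        for bpp in (24, 30):  # 8-bit and 10-bit per channel RGB
            rate = raw_gbps(5120, 2880, hz, bpp)
            print(f"5K @ {hz} Hz, {bpp} bpp: ~{rate:.1f} Gbps raw, "
                  f"~{rate / 3:.1f} Gbps with ~3:1 DSC")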
Whatever Apple is doing for the XDR (which I own) is not DSC, or not standard.
Because DSC has been broken ever since the Big Sur betas. Catalina supported 1.4 and DSC and my cheese grater Mac Pro happily drove 2x27" 4K monitors in HDR @ 144Hz.
Big Sur? Nope, you get 95Hz SDR or 60Hz HDR. If you set the monitors to DP1.2 you get 120Hz SDR.
Running a program with fixed-rate 60hz (a lot of games) would cause judder. It's the same sort of reason they don't do 150% scaling; you want integer multiples of the "low-fi" version so that you can upscale without artifacting.
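(A toy illustration of that integer-multiple point, assuming plain frame repetition with no adaptive sync; the panel rates below are just examples.)

    # How evenly 60 fps content maps onto various panel refresh rates.
    # Non-integer ratios force an uneven repeat cadence (2,3,2,3,...), which is
    # perceived as judder; integer ratios repeat every frame the same number of times.
    from fractions import Fraction

    def repeat_ratio(panel_hz, content_fps=60):
        ratio = Fraction(panel_hz, content_fps)
        verdict = "even (no judder)" if ratio.denominator == 1 else "uneven (judder)"
        return ratio, verdict

    for hz in (60, 120, 144, 240):
        ratio, verdict = repeat_ratio(hz)
        print(f"{hz} Hz panel / 60 fps content = {ratio} refreshes per frame -> {verdict}")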
I hardly notice when switching between my M1 Pro's display and my larger 27" monitor. Sure, my monitor does support 144Hz, but I'm running it at 60Hz... because of the docking station.
I disagree completely. My main monitor is at 144hz, and it's very jarring to switch to my (slightly older) MBP running at 60hz. Not having high refresh rate means this new monitor won't ever be on my short list.
I feel that when changing from 30Hz to 60Hz, and the latter feels smooth (enough). I've never used a higher refresh rate device, but I wonder how much further this can go with humans still noticing.
Having used displays from 30Hz to 300Hz, I believe 120Hz vs 60Hz is easily noticeable for anyone who is aware of the differences. Higher than that is a steep point of diminishing returns in my experience.
I wish they would just make a normal display. I’m currently using a 27” 1440p display which has some nice features, USB power delivery for single-cable video/charging/data, and really excellent colour accuracy, but the pixel density is just garbage compared to my MacBook. Using them both at the same time is jarring (especially since macOS dropped subpixel antialiasing. Text looks noticeably less blocky when hooked up to a Windows machine).
I don’t need a monitor with an embedded iPhone CPU and six speakers, I just need the panel out of an old 27” iMac that doesn’t cost more than the damned computer driving it. I can’t even get an LG UltraFine anymore, they were discontinued outside of the US ages ago.
You're in company with all the people who've said "I wish they would just make a netbook" "I wish they would just make a budget desktop" "I wish they would just make a <insert favorite product missing from Apple's lineup here>" over the past 30 years.
Wow, 600 nits, that's bright. If you have a MacBook Pro you already have a screen that does 500 nits, which easily lets you see the screen in the sun. If you're inside without glare you can get away with just 250 nits. For smartphones the nits are typically much higher because people use them in the sun and need good visibility outside.
A 600-nit screen inside would almost be too bright for some people. I know people who already turn down their laptops because 500 is too bright, but I think it's good to have the option to turn it down rather than a screen that's too dim.
I'd disagree. I have a 16" 2021 MacBook Pro with the XDR display (1000 avg / 1600 peak brightness) and I constantly have it maxed.
It's very annoying having it next to my BenQ 4K monitor, which feels so much flatter in comparison. It just makes any other monitor seem boring.
No, it only sustains 1000 nits for HDR content. For SDR content (including normal desktop usage), it provides 500 nits.[0] You may need to reevaluate your perception of what a 600 nit display would look like.
The purchase page seems really confusing compared to how they typically do things. Typically, when purchasing something like a MacBook, you select the base model and then some of the upgrades are listed at + $XXX.00. That, combined with the fact that the XDR display stand costs so much, made me think they were charging an additional $1599 for the VESA adapter. I was momentarily absolutely furious until I realized that the VESA adapter is a no-additional-cost option.
I hope it has better Windows support than the Pro Display XDR (it won't).
I managed to get the XDR hooked up to Windows at 6K (with a very unusual cable) but you can't adjust it below max brightness so you get a sustained 1000 nits in your face at all times. Also HDR doesn't work. But gaming at 6K is cool af!
Also, this having speakers, camera, and mic is a big step up over the XDR (unfortunately for me).
Speaking as someone who's in the market for a new display, I haven't been able to find a quality 27" 5K screen. LG has one, but it gets poor reviews. Everybody else is selling 4Ks, which is lower effective resolution than what I'm using now (a 23" 1920x1200 Apple Cinema display). So this is definitely appealing to me.
The effective resolution of 4K is 2x in both directions because to match the size you just set the 27" screen a bit farther away so it fills the same field of view. The jump in resolution from going 4K will be very noticeable and a 27" 4K IPS screen is less than 300€. Well worth the upgrade.
Because I'm on a Mac, everything will be rendered at 2x resolution, giving me the same usable area as a 1920x1080 display, even though the physical area is larger. That's a step backwards; I'm not willing to compromise on usable area or 2x resolution, so a 4K display isn't good enough.
(Edit: my Cinema Display is something like 15 years old, so I’m confident I’ll get my money’s worth out of a new purchase.)
OSX does have intermediate steps so you can use those instead. It's not true fractional scaling but as the resolution increases the artifacts are less noticeable. Don't know if there are any 2400px tall 4K screens if you want to go for exactly the same dimensions.
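(To make the scaling arithmetic concrete, here's a rough sketch of how macOS-style HiDPI maps a chosen "looks like" size onto a physical panel; the panel and point sizes below are illustrative, not an exhaustive list of modes.)

    # The backing store renders at 2x the "looks like" point size; if that matches
    # the panel exactly you get crisp integer scaling, otherwise the image is
    # downsampled to fit, which is where fractional-scaling softness comes from.
    def scaling_mode(panel, looks_like):
        backing = (looks_like[0] * 2, looks_like[1] * 2)
        return backing, "exact 2x (crisp)" if backing == panel else "downsampled (softer)"

    panel_4k, panel_5k = (3840, 2160), (5120, 2880)
    for panel, looks_like in [(panel_4k, (1920, 1080)),
                              (panel_4k, (2560, 1440)),
                              (panel_5k, (2560, 1440))]:
        backing, verdict = scaling_mode(panel, looks_like)
        print(f"panel {panel}, looks like {looks_like}: renders at {backing} -> {verdict}")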
I am in the same bucket. The LG UltraFine is the only option that seems available, but it's an old model and I am surprised at the lack of availability from other manufacturers.
I have the LG UltraFine 5K and it's a very good monitor. It has a small defect: occasionally there is a small flickering on the central vertical line. It goes away if you put the screen to sleep and wake it again (it takes a few seconds, and I've had to do it a handful of times in a year).
That really is a shame. Mac screens have been amazing for a long time now, but as soon as the computer inside them is outdated and slow, the whole device, including the still-awesome display, is obsolete. That's the main reason I'd probably never buy an iMac.
Most of the issues people have with the LG are resolved by this new model; namely, build quality, speaker quality and webcam quality. For an extra $300 it seems like a no-brainer to pick this over the much-complained-about LG.
I'm so tempted, but I've been on 32" and 34" for years now, I just don't think I could go to 27" again. I am dying for a great high end monitor with charging for my laptop, integrated camera and speakers. I loved my old cinema display. I really want this, just a bit bigger.
Pity there isn't a version with a 24" panel like on the M1 iMac. 27" is great but it actually might be a little too big, but I'll live. I guess no product is ever ideal.
Can anyone help me understand how I'd connect this to a PC? Would I need a GPU with Thunderbolt 4 or does the motherboard need Thunderbolt 4 and the PC somehow magically utilises the GPU via that motherboard display anyway?
If the Pro Display XDR is anything to go by, you wouldn't. That could only connect to PCs with Thunderbolt, and because the monitor controls were built into macOS, you couldn't change any settings or use it at its native resolution.
I was waiting a long time for the release of an Apple "ProMotion" display and again they disappointed.
There is no reason why they are not offering an ultra wide with 120+hz and retina.
They are Apple. They don't need standards to be ready because they control the whole chain and create their own standards.
Steve Jobs never would have let them release that display with only 60hz. It's the same panel I'm looking at for almost 10 years (in different iMacs).
I think about giving up on retina and going ultrawide with high refresh.
Finally an AR coating on a monitor without the grainy sparkly look! I bought 3 highly rated monitors last year and returned all but one because the anti reflective coatings were visually distracting and made the screen look dirty and unsharp. The one I kept still has the effect, but to an “acceptable” degree relative to the others and I was tired of returning monitors.
Hopefully other manufacturers will follow suit - or at least start releasing glossy monitors as an option, for people who don’t want to view things through an ugly grainy sparkly coating.
The thing is, for most people a glossy monitor with an AR coating is a good compromise, since reflections are a lot more distracting than a slightly matte finish. For a studio monitor this isn't a consideration, but in general the finish is mentioned in the specifications.
The problem is, some monitors aren't "slightly matte"; many are very matte, and you can't tell just how matte they are until you receive them, since the finish is always just called "matte".
Glossy isn’t ideal but I’d definitely take it over spending $500 on a 4K ultra sharp display only to view everything through a fine layer of grainy rainbow colored dust.
Apple's nano-texture coating seems like it could be a wonderful albeit expensive compromise, and I know I and many others are willing to pay the premium.
Unfortunately I bought an LG 4k display from Apple last year (or was it the year before?) for home office work so it'll be a tough sell to buy something new, but this solves a couple of my problems with my home office setup, namely having to put in headphones with a mic for every video call since I use my Mac in clamshell mode, and having an external webcam mounted on my monitor. This is very close to "one cable to rule them all" setup. Maybe I'll just have to save some of my allowance for a bit :P
awkwardly small / awkwardly positioned second display when compared to a regular second display. main display has its own cam. and the external magic trackpad can go front and center on my desk.
the wasted parts get to come out to play when undocked :)
Oh sure, they put the anti-reflective coating on the device that lives inside on a desk, but the portable stuff all has super reflective surfaces making them hell to use outside.
As a person who works on a Mac but uses a Windows PC for gaming: a pity this only has USB-C inputs, so you won't be able to connect a PC with an RTX 30xx card to it.
Wow! Didn't know those cables can go the other direction as well. That's huge!
Thank you! Another example that writing something wrong on the internet is the best way to find the truth :)
That said, it looks like there's just one input port and three output ports, so I'd need to do some hack there as well to avoid constantly swapping cables.
Several Intel and AMD motherboards have both Thunderbolt ports along with a DisplayPort in port that lets you pipe the output of a discrete GPU (like an RTX 3080) through the Thunderbolt connection. These will probably work just fine with the Studio display. Some PC laptops with Thunderbolt and more capable GPUs probably work with it too.
There are bidirectional DisplayPort <-> USB-C cables, as well as DisplayPort+USB-A -> USB-C cables/adapters that work with the XDR display at 6K, or there are PCIe cards that add a USB-C port with DisplayPort alt-mode support.
Only problem is that you'll probably need to extract a BootCamp driver to control brightness/volume, assuming one will exist for this monitor.
(Well, if it supports HDR input, you'd use Windows' SDR brightness to control it that way instead.)
Amazingly, NVidia cards had this port on previous generation models, but it got removed for some reason. (Though it's actually not clear to me it would work; they did some non-standard stuff to get USB 3.1 instead of USB 2.0 alongside DisplayPort lanes.)
It most likely wouldn't work with a Windows PC anyway due to the tight macOS integration; my iMac in Boot Camp can't use any of the USB-C ports on a connected LG UltraFine 5K at all.
I'm happy that they finally released this monitor; it felt weird that you could get an iMac that was sleeker than any existing screen. But does anyone know of more affordable alternatives?
This may be a bit weird/niche, but I care a lot about the thickness of the monitor itself, the depth of the stand, and the bezel around the screen. I'd like it if the object I spend 8 hours a day staring at weren't aesthetically horrendous.
I'm disappointed with the ergonomics of it. First, you have to pay extra for height adjustability, and then the height adjustment is via an arm that changes the viewing distance when you adjust the height. Why can't it just be like Dell monitors (and many others) where the screen can be moved up or down on a fixed axis, and it's included in the product?
Any mention of the refresh rate? I've been waiting for a 4K/5K display that supports 120 Hz to pair nicely with my 2021 MBP, but it looks like the only available options so far are funky gaming displays: https://tonsky.me/blog/monitors-mac/
Assuming this is the same panel as in the 5k iMac, it's a good panel. It doesn't matter that it's old. What matters is that it works, and that you can actually buy it. There are very few high res displays on the market.
It would be nice if it cost less than 1000€, but I guess you can't have everything.
So happy they finally made this. Ever since I got my first retina MacBook Pro a decade ago, I’ve wanted a good companion display for it. The LG UltraFine worked, but was never good. Iiyama and the other knock-offs were worse.
Basically instapurchased this. Now just have to wait two weeks for it to show up…
Most of the features announced are quite nice, but strangely (for an Apple product), the primary feature, the display itself, simply is not. As far as I can tell, it doesn't even have 120Hz, let alone a good local dimming implementation.
Different priorities. People buying these are looking for good specs in color reproduction, contrast (as far as is possible without FALD backlighting), and perhaps most importantly consistent QC.
If you look at reviews for just about any model of monitor released in the past 5-6 years QC has been atrocious, with dead pixels, backlight bleed, and other odd issues being commonplace, making it a challenge to get a unit that's good all around. This has been especially true for the display that this is most directly replacing (LG Ultrafine 5k).
Nobody is buying this monitor for contrast; it has about the worst contrast available at this price point. Its failure to actually serve that priority genuinely makes me wonder what the target audience is.
Show me a display with the same or higher resolution and a higher refresh rate. I'm genuinely curious. I've been searching for a new monitor for a few weeks and can't find anything with 4K and more than a 60Hz refresh rate.
As the other comment pointed out, that is not an apples to apples comparison. The monitor you linked is a UHD one (3840 x 2160 pixels, around 8.3 megapixels) with a pixel density of ~200 ppi. The display showcased by Apple has a resolution of 5K (5120 x 2880 pixels, around 14.75 megapixels) with a pixel density of 217 ppi. Also, based on my experiences with LG gaming monitors I would assume that the Apple display also has a significantly better color accuracy.
If you had taken a second to look at the specs and search for actually comparable products, you would find that there are, at least to my knowledge, no displays with the same resolution and higher refresh rates. This makes sense because 5K@60Hz already has incredibly high bandwidth requirements.
The best actually comparable product is the LG UltraFine 5K, which comes in at 1,499€ MSRP, 246€ cheaper than Apple's Studio Display at 1,745€ MSRP. Oh, and to no one's surprise, it also has a 60 Hz refresh rate.
So to answer your initial question: Yes, people do spend that kind of money on displays with "only" 60Hz.
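(For anyone checking the arithmetic in this comparison: megapixels and PPI fall straight out of the resolution and the diagonal. The 27" diagonal below is the Studio Display's; the linked monitor's diagonal isn't shown in the thread, so the ~200 ppi figure implies a smaller panel, roughly as computed here.)

    import math

    # Pixels-per-inch from resolution and diagonal size (inches).
    def ppi(width, height, diagonal_inches):
        return math.hypot(width, height) / diagonal_inches

    print(f'5K 27": {5120 * 2880 / 1e6:.2f} MP, ~{ppi(5120, 2880, 27):.1f} ppi')
    print(f'UHD:    {3840 * 2160 / 1e6:.2f} MP')
    # Diagonal at which a UHD panel reaches ~200 ppi:
    print(f'UHD diagonal for ~200 ppi: ~{math.hypot(3840, 2160) / 200:.1f} in')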
The Studio Display runs the full version of iOS 15.4, with the exact same build used by the iPhone and iPad, meaning that updates to the display's functionality will come as part of iOS updates.
Probably something like that. Apple has a lot of products that use their custom silicon that aren't Macs or iOS devices — think Homepods, AirPods, etc. I'm sure they have a whole team whose job it is just to create respins of their CoreOS stack + whatever happens to be relevant to the hardware device.
A few years ago I would have been very excited about this, and would have immediately upgraded the Thunderbolt Display I had at the time. These days I prefer a 32" display, and I don't even need 4K (I have two 32" QHD LG displays at both home and work).
Wow that's a painful website... 5 seconds in and I've scrolled to the bottom as fast as possible to get past the fluffy marketing bullshit, in the hope that there were some tech specs of some sort there, then left...
How is this news? Apple has had the exact specs (27 in, 5K res) available since something like 5-7 years ago, both as a stand-alone monitor and as an iMac.
Sure, it may be a middle ground between the existing Pro Display XDR and LG UltraFine 5K displays, but this is something that was just announced today, hence "news".
I realize that as a staunch Linux user I'm not in their target demographic, but... wow that website is slightly motion sickness inducing. Is it as bad to scroll through this on a mac as it is on linux? I use a mouse with an actual wheel, maybe they don't optimize for that kind of legacy device...?
Edit: it's also just so hard to scroll exactly to the points where the information is presented. I just mostly end up at points that are meaningless transitions... So confusing...
Mac pointing devices have step-less, inertial scrolling, so the scrolling itself is effortless, but the framerate of the "video" is too low, so it's not fluid unless you scroll like crazy.
I don't think (for me) the issue is how smooth the scrolling is. It's the fact that these animations are there at all. Like... who wants to see these? They add nothing and add huge visual confusion...
I feel like everyone immediately realized what a terrible UX scrolljacking provides when the trend started, but Apple has pressed on despite their (in my opinion) otherwise thoughtful UX design.
If you middle-click and scroll down smoothly, it's a better experience (provided you nail the right speed). However, there's a bunch of text you miss doing it that way.
> Is it as bad to scroll through this on a mac as it is on linux
It is. Despite their investment in desktop hardware (finally!) they completely forgot how to do desktop software. Ten years ago all those animations were buttery smooth, even though they used weird custom "video" code that assembled them out of separate PNGs.
FYI the Studio Display website was outsourced to an agency, and not developed in-house by Apple engineers. Outsourcing rarely improves the quality of software.
I hate this trend in ‘ultra modern’ websites with the infinite scroll pulling you through some sort of animation. The website is near unusable as a result.
Hey, it's marginally more usable than unskippable Flash intros, which is what we used to get.
Although, unlike other similarly effect-heavy Apple landing pages I've seen in the recent past, this one doesn't appear to offer a decent alternative version when JavaScript is disabled, which is a disappointment.
I might be ignorant here, but what trend? Scrolljacking was popular for about 5 minutes back around 2010 during the HTML5 craze before people realized how awful a UX it is. The only place I still see it being used regularly is Apple sites. Have you seen this pattern used elsewhere?