14-inch MacBook Pro review: A Mac Pro in your backpack (sixcolors.com)
215 points by jdminhbg on Oct 25, 2021 | hide | past | favorite | 239 comments


Much has already been said about this, but the achievement of shipping a MBP that pro users seem to love is a triumph of user feedback over design purity. I suspect that Jony Ive's departure may have facilitated this. I suspect it will be a breakout success and I aspire to contribute to their sales figures some day. I know that among Apple's revenue streams the pro products are now merely an afterthought, but they remain an important part of the brand's ethos and I'm delighted to see them shipping a truly superior product again.


My guess is that Apple bet on something that did not materialise; it wasn't about Jony Ive not caring about user feedback.

What was the bet, you ask? I think they bet that the shift to USB-C/Thunderbolt/Wireless would happen quickly, and that they could help with the push and increase their margins.

IMHO it wasn't a design decision but a business one, and I think the removal of MagSafe is a giveaway. Every little thing they add makes the machine more complicated to manufacture, and thus more expensive. Every non-off-the-shelf piece puts multiple teams to work on multiple continents, and this costs money. I think they aimed for simplicity at the management level, not at the design level. They did some calculation based on how well people are locked in and how many would buy a new Mac anyway, and went with the most profitable option until the time for the great transition came. Until very recently the PC industry was racing to the bottom and Apple did not have a reason to compete on features. Nice Windows laptops are a very new phenomenon; up until very recently the best trackpad on a PC laptop wasn't even in the same league as Apple's.


USB-C adoption mostly went fine, though. Or as fine as could be expected.

I think the past five years can be more easily explained with — ironically — a Steve Jobs quote:

> When we were an agrarian nation, all cars were trucks, because that's what you needed on the farm. But as vehicles started to be used in the urban centers, cars got more popular … PCs are going to be like trucks. They're still going to be around, they're still going to have a lot of value, but they're going to be used by one out of X people.

And that's what the Mac team lost sight of; they tried to make Pro Macs that were sports cars. Maybe the MacBook Air is an affordable sports car of sorts (do they still make Mazda Miatas?), but the MacBook Pro is and should be a truck.


>(do they still make Mazda Miatas?)

They're called MX-5s here in the UK and I see them everywhere, unsurprisingly because it's in the spirit of an old British two seater but unlike the Triumph Spitfire or MG Midget it doesn't break down all the time and isn't essentially hygroscopic.


The original model was even a British design by Lotus, I believe.


Not quite. Mazda were inspired by the design of the Lotus Elan, but the MX-5 was never designed in any part by Lotus.


> do they still make Mazda Miatas

It's gone through some name changes in the US (sometimes MX-5 Miata, sometimes just MX-5), but they still sell it.


Really? USB-C adoption went fine? I have never seen a USB-C thing that I found superior to what it replaced. I see a lot of dongles though.


I don't get the dongle people. You can just buy USB-C to USB-B cords. They cost about the same as dongles. I replaced the cords on all my devices years ago and never looked back.

https://www.amazon.com/Cable-Matters-Micro-Braided-Jacket/dp...

https://www.amazon.com/JSAUX-Charger-Braided-Compatible-Exte...

https://www.amazon.com/Cable-Matters-Printer-USB-C-Black/dp/...


It would be really nice if that was the case. However when you pick up a USB C cable, you can't tell if it is suitable for a particular purpose because there are so many optional features in the spec. Data transfer speed, PD, Display Port, HDMI, Thunderbolt 3/4, and probably others that I'm not remembering.

I can understand not requiring a heavy gauge wire for a cable that is just transferring data and a bit of power, but it would be really nice if the standard required cables to label the cable ends with symbols representing that cable's capabilities.

Oh, and if you have a cable with USB-C on both ends, you still can't charge an iPhone, since apparently USB-C isn't good enough for the phones yet. What I really don't understand is that even after all these years Apple still hasn't fully standardized on the USB-C connector across their devices.


I agree that the cases of power and displays were very badly handled, but how often does this happen in practice? Your laptop came with a power cord, and if you pick up the wrong one, most retailers are really good about returns. And who needs a collection of display cords?

For 90% of use cases any old cord will work. The edge cases sting for sure but overall the design of the plug is a huge win.


The big thing with power for me is phones and other similar devices. I have one phone that supports a quick charge capability, but I need not only their power brick but also their USB cable (something about a pull up resistor that signals the charger / phone combo to go into fast charging mode). Same thing with a tablet, it has its own fast charging standard.

I gave my Pixel C to my Mom, and the charger that came with it seemed to work well on her original Pixel phone. But after a week of using it, the phone went out and had to be replaced. And look up reviews of almost any USB-C power delivery hub; they are filled with accounts of Macs getting fried from using them.


This has been true of USB basically since its 1.0 release. I have micro and mini USB cables; some can charge phones and some cannot. Some can charge my PS3 controllers and some cannot. I bought a USB-A male to USB-A male cable at some point and never got it working with anything.

Sounds like you have a problem with USB and are just noticing it now with usb-c.


Or they could have just put both on the laptop like Lenovo has been doing for years. Maybe that tech is too futuristic though, Apple's liable to leave it for the groundbreaking 2026 Macbook Pro that brings back pro ports like USB-A and good old RJ-45.


5 years after the 2016 MBP and the only device I have with USB-C is my Android phone.

All my PC peripherals still use USB-A.


> IMHO it wasn't a design decision but a business one, and I think the removal of MagSafe is a giveaway. Every little thing they add makes the machine more complicated to manufacture, and thus more expensive. Every non-off-the-shelf piece puts multiple teams to work on multiple continents, and this costs money. I think they aimed for simplicity at the management level, not at the design level. They did some calculation based on how well people are locked in and how many would buy a new Mac anyway, and went with the most profitable option until the time for the great transition came.

The touch bar invalidates a lot of these assumptions.

My personal opinion is that Ive's vision for Apple's design future was of ever slimmer and slicker pieces of hardware, and USB-C helped this vision a lot by simplifying design lines, streamlining the port layout, and so on. The simplicity-for-cost-cutting take falls apart when you consider that supporting 4 USB-C ports that behave differently, due to limitations of the power envelope and the Thunderbolt chipset, doesn't really simplify the design.


What's amusing about the "design purity" is this is basically (from an externals point of view) an upgraded PowerBook G4. Good design isn't limited to "make it look as much like a thin sheet of metal as possible".


Personally I dislike this bit. Thinness isn't everything, but it's something. I don't know that the new models really needed to look as brick-ish as they do, and the previous design was certainly sexier.


This is why you have a full product line, right? You can make one laptop that prioritises weight and thinness, and you can make another one that prioritises power, features, etc.

The problem with the previous Macbook product line was that all products had the same compromises. Design purity, and thinness over everything else. There was no choice for a "pro" machine that had a different set of tradeoffs to achieve pro features.


I'm just asking for a tapered edge :D The new bulbous rounded edges almost seem passive-aggressive from Apple. Yes, prioritize power, and with whatever remaining ability to make things pretty that you have, utilize it. When you can but don't, that's unattractive. Is there a specific reason for the brick look? I'm not sure that we know there is.


I'd say that, given no teardowns have been released yet, we also don't know that there isn't a specific reason for the brick look. They crammed a lot of battery in there, plus big, slow, quiet fans, six speakers, and an SoC something like 4x larger than the base M1.


I think the rounded edge looks nice, but it is unbalanced compared to the top; the thing should be MORE brick-ish in my opinion.


Speaking of backpacks, I was expecting that M1 energy efficiency would allow for less weight to carry rather than +10 hours of standalone time. A few hours less battery life but 0.5 kg less weight would be a better tradeoff for me. (I'm talking weight, not thinness. A thicker body with better heat dissipation is fine, as long as the extra space is taken up by air.)


21 hours when watching movies. When doing real work, I get a good five hours out of my MacBook Air, whereas I'd get maybe one hour out of my Dell. If you actually have work to do, and you're stuck in the arse end of a coffee shop or airport, then you can still do meaningful work for hours.


Yeah like I get that Apple went too far but thinness is a really important quality. People love thin, sleek objects. For the average consumer that may be a bigger factor than an SD card reader.


I agree. For me thinness is portability and that is very important. I'm willing to trade quite a bit of performance for that.


I actually like the throwback, personally. I'm guessing they did that to differentiate it in the market.

What I have trouble getting over is the flipping notch to the top. Just why. The bezel on my 2019 macbook looks the same, if not skinnier.

Do people really not watch fullscreen videos anymore or something?


You should look into the actual implementation of the notch and I think you'll be pleased! Imagine the exact same screen as your 2019 macbook pro, but instead of black bezel on the left and the right of the camera, you have the menu bar instead.

Everything beneath the notch is the same size as the screen on previous laptops, so the notch allows for extra space beyond the 16:10 that you're used to. This means when not using fullscreen apps, you gain some screen real estate because the menu bar now has dedicated space for it. However, when you're doing something fullscreen (like watching a video) the screen on either side of the notch is blacked out so you're left with a standard sized screen for viewing your content (with no visible notch). It seems like a win-win.
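For the curious, the arithmetic checks out against the published 14-inch panel resolution (3024×1964 is the figure on Apple's spec page; treat it as an assumption here): the area below the notch is exactly 16:10, and the menu bar strip beside the notch is what's left over.

```python
# Published 14" MacBook Pro panel resolution (assumed from Apple's spec page).
total_w, total_h = 3024, 1964

below_notch_h = total_w * 10 // 16       # height of an exact 16:10 canvas
menu_strip_h = total_h - below_notch_h   # extra rows gained beside the notch

print(below_notch_h, menu_strip_h)  # 1890 74
```

So the panel is a standard 3024×1890 (16:10) screen plus a 74-pixel strip that previous models spent on bezel.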

I much prefer this implementation to what they did on the iPhone with the notch cutting into fullscreen images and videos. The only downside I can see is if you have a light colored menu bar or background and don't like the look of it when not fullscreen, however my menubar is always black anyways so I don't see that being an issue at all.


Aside from thinking it's visually distracting, I get the argument, but if I was really concerned with screen space I'd just get a model with a bigger screen.

For my own workflow, I use almost the whole top bar for tool/automation shortcuts, so this is cutting a chunk of my space for such a tiny amount of space added.

Right now it's hanging past the left side of my webcam.

If it does black out the notch for videos though, then that's a lot better, ty for that info.

Does keeping the top bar black essentially hide the bezel? Or does it just make it not stand out as much?


> For my own workflow, I use almost the whole top bar for tool/automation shortcuts, so this is cutting a chunk of my space for such a tiny amount of space added.

Yeah, I can see how that would be an issue. I think I remember reading that the menu icons would be truncated when there are too many of them, but who knows how that actually works in practice (i.e. are they scrollable, or is there an additional menu that you can open).

> Does keeping the top bar black essentially hide the bezel? Or does it just make it not stand out as much?

I'm not entirely sure to be honest. Based on the reviews I've seen when the top is blacked out for a video the notch is basically invisible. So I assume if you have a black menu bar it would be similar to that. That's just based on pictures on the internet though so I'm not actually sure how visible it will be in person.


I’m wondering whether the notch was supposed to be for Face ID, but they couldn’t make it work due to current supply chain constraints, and that notched displays were already ordered.


It already had a front-facing camera, of course. Does a camera for Face ID require a bigger notch/bigger area? (Real question, I don't know!)


Face ID has a separate IR camera and a 3D IR dot projector, so I can imagine it requires a bit more space than a regular webcam.


It could also be they planned it this way and will add it to next year’s model. That way they can advertise new “must-have” features for upgraders without redesigning the case/OS


> Do people really not watch fullscreen videos anymore or something?

In fullscreen, it puts a black bar at the top and the display becomes 16:10

I don't see the big deal. I'm looking at my 2019 16-inch MBP right now and it looks about the size of the notch from what I've seen. The camera has to go somewhere, and this solution just pushes up the menu bar to the sides of the camera instead of taking up part of the 16:10 section beneath it.


>Do people really not watch fullscreen videos anymore or something?

How many of the videos you watch are taller than 16:10? Are any of them even taller than 16:9?


... All 4:3 videos are taller than 16:9 certainly, and there's a non-trivial number of those.


Absolutely. I run into this all the time with my phone's screen (likely a different ratio), where the notch almost always ends up covering video.

If there's a program that will literally add black padding until the notch goes away I'd probably use it, at least when watching media.


There is. It's called macOS. The screen is 16:10, and the menu bar, with the notch, is extra space on top of that.

I can't even be frustrated with you. This is so humanity. "I hate what they did! If only they had done this instead!" They did do that. They talk about that a lot. The article talks about that. They solved the problem in a way so obvious that it's what random people on the internet come up with, but of course the same random people on the internet are so convinced of their own superiority and so sure of Apple's inferiority and also unable to read TFA.


I'm not saying apple is inferior and everyone else is wrong. I'm saying that for my own individual workflow, on my current macbook, this is a bad compromise.

I keep a large amount of tools/macros/shortcuts in my status bar, and this will cut through the middle of that.

I'd rather have thin black bars around my 16:10 video than lose that screen real estate for the status bar.

It is a bad compromise for how I use my current macbook.

I'm certainly in the minority, and that's fine, but I'm just trying to argue my side and understand other people's, not scream till I'm red in the face until I get my way.


Well, I couldn't find a setting to drop the menu into the 16:10 area, but the notch really doesn't take up much space, at least on the 16" screen. I haven't found an app that has enough menus to hit it yet. IntelliJ had the most. So if you are actually considering dropping $3k on a laptop soon, then I'd say go to an Apple store and check it out.


Reports are the OS already does this for you when watching fullscreen media.

Apparently Apple still has some design sense after all.


Yeah, I'm not a big fan of the new design. That being said, I really, really want one.


Which again is weird for someone like Ive considering that "form follows function" is one of those mantras that people like Rams and the Bauhaus folk based their entire work on.


Even if that was the basis at some point, I suspect much of it is now just an argument for whichever parts the designer likes, since "function" isn't necessarily completely defined.


That makes me wonder what a PowerBook G3 (my favorite one but it’s surely nostalgia) inspired design could be.


While I think Ive should get a lot of criticism for "form over function", he helped push the barriers of what is possible and in a way pushed the industry forward as well.

I think without Steve to temper that, it ran over a bit. You want someone to push the boundaries and then someone to rein it in a bit, not the other way around.

IMO, this is probably going to be a very special time for the Macs as we see the pendulum swinging back and getting all the "function" we missed. I fear, though, that in a generation or two it will be too "function over form" and we'll get an Apple basically producing bricks again, lol. You can already see it with the iPhones.

Sure, increase the thickness this year, but what prevents you from taking that thought further? Compromise a little here or there, whatever. This is where someone with an obsession with, say, thinness/lightness helps. This is to say nothing about the hardware; they are far ahead there.


Whilst an afterthought, the kinds of people who buy Pros are the kind of people who decide on the tech for their wider families.


You don't really even have to decide for the rest of the family. Basically all of my family migrated over to MacBooks after I bought my first MacBook Pro within like a two year span.


> I suspect that Jony Ive's departure may have facilitated this.

I'm sure that is a very accurate statement. I thoroughly enjoy the improvements to the keyboard and the return of the ports. But I wonder what the future of the MacBook will be without the visionary design from Ive. Are we going to see a melding of PCs and Macs, or will Apple keep design innovation at the forefront?


I'm not trying to be mean-spirited, but it was clear that his time was up. That's totally fine. He made some great contributions, but that doesn't mean he can or will keep making them through his last waking moment.


Yeah, this feels like a pyrrhic victory. Yes, we've gotten our ports back, but is this indicative of a trend of Apple moving away from design? It's too soon to say, but I'd argue it's likely Apple will experience some regression to the mean. Considering that Apple's design is several orders of magnitude away from the mean, that'd be a loss.


I’d say the new iMac’s argue against that. Very ‘designed,’ very thin, to the point of compromise in some ways. It seems they are letting the designers have their way in the non-pro arena.


Haven't we already seen the post-Ives era products? I imagine Apple will continue to be design centric, just without a Jobs-esque demigod-like figure that could dictate his will above and beyond anyone else - i.e. form over function.


I’m not sure if their pro-lines were an afterthought, rather they weren’t in a good position to strongly differentiate their pro devices. “Disruption” in the pro space is going to be about performance, Apple did leverage their technologies for better performance and made some choice software acquisitions - but this involved pros switching tools and workflows to realise a moderate speed jump. A more powerful disruption is simply having much faster hardware, which at the time was unavailable - but clearly in development for quite some time. I think moving forward we should expect the pro lines to have benchmarks that are not just impressive, but custom silicon that makes changing software/workflows a huge advantage. I will definitely be moving over my video workflows for example - the performance gains make the time savings worth it. I also feel that Adobe has to really pull their finger out now, as software that has been built on Apple’s tech is seeing insane performance jumps.


>I suspect that Jony Ive's departure may have facilitated this.

This is a popular statement but Jony didn't have any design oversight for years before he formally left.


Products also take "years" to design and develop, so I'd say the statement may still hold true.


It's just a laptop in the same old rectangular shape and the same old colors, one that is actually just catching up; there's nothing at all special about this one… Try to catch your breath.

Nobody anywhere likes the notch either. Triumph? No.


The most striking benchmark for me as a developer is the disk read/write speed. I wonder why Apple didn't highlight that. Nearly 2x as fast as last year's M1 Air or Intel Macs - that sounds amazing to me. As a simple web dev, I thought I would never use any of the extra power available in an M1 Pro/Max, as I never do anything GPU-intensive. However, disk speed is a noticeable factor in day-to-day workloads, even when I'm running an IDE like IntelliJ and indexing a large repo.

Interesting, because Apple chose to focus on the 200 vs. 400 GB/s "memory bandwidth" figure in their marketing to differentiate the M1 Pro and Max, but as this review [1] suggests, the 400 GB/s rate cannot be reached even in theoretical workloads, so it might as well be meaningless. Disk speed seems more important. Can anyone comment?
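For what it's worth, the 200/400 figures look like plain bus-width arithmetic, assuming the memory configuration reported in third-party analysis (LPDDR5 at 6400 MT/s, 256-bit bus on the Pro, 512-bit on the Max — treat those numbers as assumptions):

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate).
# Assumed configuration: LPDDR5 at 6400 MT/s, per third-party analysis.
MT_PER_SEC = 6400e6

m1_pro_bw = (256 // 8) * MT_PER_SEC   # 256-bit bus -> 204.8 GB/s ("200")
m1_max_bw = (512 // 8) * MT_PER_SEC   # 512-bit bus -> 409.6 GB/s ("400")

print(m1_pro_bw / 1e9, m1_max_bw / 1e9)  # 204.8 409.6
```

That's a theoretical ceiling on the bus, not something any one CPU cluster can actually saturate, which fits the review's observation.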

1) https://www.anandtech.com/show/17024/apple-m1-max-performanc...


I can walk out right now and get as fast an SSD and slot it into my Dell. Next year, I'll be able to put in a faster one, and the following year one that is yet faster.

I am surprised that the soldered-in SSD speed is merely on par. And of course, for roughly 2x the price. So maybe that's why they aren't shouting about it.


> However, disk speed is a noticeable factor in day-to-day workloads, even when I'm running an IDE like IntelliJ and indexing a large repo.

You may want to check IOPS benchmark rather than throughput.
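The difference in a nutshell: an IOPS test issues many small random reads, while a throughput test streams large sequential ones. Real tools (e.g. fio, installable via Homebrew) control caching and queue depth; as a rough illustration only, a minimal sketch of what an IOPS measurement does might look like this (the OS page cache will inflate the number):

```python
# Minimal illustration of an IOPS-style test: many small random reads.
# Illustrative only -- not a calibrated benchmark.
import os
import random
import tempfile
import time

BLOCK = 4096                  # 4 KiB, the customary IOPS block size
FILE_SIZE = 16 * 1024 * 1024  # 16 MiB test file

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(FILE_SIZE))
    path = f.name

fd = os.open(path, os.O_RDONLY)
blocks = FILE_SIZE // BLOCK
ops = 0
start = time.perf_counter()
while time.perf_counter() - start < 1.0:     # measure for ~1 second
    offset = random.randrange(blocks) * BLOCK
    os.pread(fd, BLOCK, offset)              # one small random read = one I/O op
    ops += 1
elapsed = time.perf_counter() - start
os.close(fd)
os.unlink(path)

print(f"~{ops / elapsed:,.0f} 4K random reads/s")
```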


Thank you, that's the term I was looking for (IOPS benchmark). Do you know how I can test the IOPS on a Mac? I'm looking for an online benchmark but can't find one.



I ordered mine on release day and am eagerly awaiting its delivery.

Glad to hear that about the notch. I couldn't for the life of me think why it mattered. When the presentation showed it, I looked at the very display of the Mac laptop I was watching it on and sure enough, there was a gap in the menu bar where the notch would be, with the menu items on the left and the clock and all those little widget icons on the right. I was streaming it on my personal 2013 MacBook Air (which is a great little machine but ready for an upgrade), and then glanced at my work MacBook Pro and its external display. On both those, there was also empty space where the notch would be. I thought, wow, what a great idea.

But then I saw all these comments about how it was horrible and a deal breaker, and some designer pompously talking about how when you're like him and pay close attention to detail it will be distracting, I thought either everyone had lost their collective mind, or I was missing something. I realize there are some apps with a lot of menu items which might have to oddly flow to the other side, but I don't think that's super common, and I guess I'll just have to see how it feels in the end. But glad that this reviewer isn't bothered by it.


> But then I saw all these comments about how it was horrible and a deal breaker [...] I thought either everyone had lost their collective mind, or I was missing something

I think people dialled the outrage back again when they realised the area below the notch is a full 16:10 display—i.e. they added display where bezel used to be, rather than cutting into the display area.


Yes, once I heard that I was OK with the notch. Not only that, it makes much more sense this way.


It limits how many menus your apps can have or makes them draw around it. If you enable compatibility mode you'll be back to an even fatter bezel which looks like dog shit.

Rationalize that away.


Enable compatibility mode and you'll be back to the same "dog shit" as the previous screens which people didn't seem to be that outraged by.


I understand you may be in that 5-10% of the market that the notch impacts. The vast, vast majority aren't running Photoshop and will spend 90% of their time in a browser, which likely handles the notch fine.


You think pro users spend 90% of their time on a browser?


There may even be an option in System Preferences somewhere to lower the menu bar when there's overflow and just treat the extra area as a bezel. Since that's what it does for full-screen apps that aren't updated, it seems like it would be a good compromise in those situations.


> It will probably take a little while for Mac apps to be updated to support the ProMotion display. I found that some apps scrolled text in the same buttery-smooth way that apps do on my iPhone and iPad, but others didn’t. I was able to find a few Catalyst-based apps that have already been written to support 120Hz displays on iPad and iPhone, and they looked spectacular.

It surprised me that an app would have to be updated to support a better display. Anyone know the details, and nature of the update that has to be done?


I think it depends on the framework. ProMotion isn't about automatically running everything at 120 FPS but about dynamically scaling the framerate up and down based on demand, so that the system can balance the additional power consumption that 120fps drives.

Because of this, apps that don't update for ProMotion might be capped at 60Hz (or something like that) so that the laptop doesn't have to run everything at 120Hz just because one app isn't updated.


It's not just a better display, it's a software system that dynamically adapts the framerate based on what's happening with the content at a given moment

I am still a little surprised it requires app support, but not shocked


For iPhone 13, you need to 'opt-in' for 120 FPS by adding a key to your app's Info.plist file. This is probably something similar.
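If anyone's curious, the iOS key in question (per Apple's iOS 15 documentation for iPhone 13's ProMotion display; I'd assume the Mac mechanism differs) is an Info.plist fragment along these lines:

```xml
<!-- Info.plist fragment: opts a UIKit app in to frame rates above 60 Hz
     on ProMotion iPhones. -->
<key>CADisableMinimumFrameDurationOnPhone</key>
<true/>
```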


I don't know about macOS, but on the Web introduction of 120Hz has been a compatibility issue. Lots and lots of implementations assumed `requestAnimationFrame` fires 60 times per second (treating it as a "tick" for game loops or anims), so the event frequency had to be capped.
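The classic bug: treating each `requestAnimationFrame` callback as a fixed 1/60 s tick makes animations run twice as fast on a 120 Hz panel. The frame-rate-independent fix is to scale by the measured timestamp delta. A sketch (the `tick*` function names are hypothetical, for illustration):

```javascript
// Buggy pattern: assumes rAF fires exactly 60 times per second.
// On a 120 Hz ProMotion display this animation runs twice as fast.
function tickFixedStep(state) {
  state.x += state.velocity * (1 / 60); // hard-coded 60 Hz assumption
  return state;
}

// Frame-rate-independent pattern: scale movement by the real elapsed
// time, so 60 Hz and 120 Hz displays produce identical motion.
function tickDeltaTime(state, dtSeconds) {
  state.x += state.velocity * dtSeconds;
  return state;
}

// In a browser, the delta comes from the rAF timestamp argument:
// let last = null;
// function frame(now) {
//   if (last !== null) tickDeltaTime(state, (now - last) / 1000);
//   last = now;
//   requestAnimationFrame(frame);
// }
// requestAnimationFrame(frame);
```

With delta timing, 60 ticks of 1/60 s and 120 ticks of 1/120 s cover the same distance, which is why capping the event frequency was only a stopgap.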


I don't know the details, but the idea behind ProMotion is that the display dynamically changes refresh rate based on need. So, apps would need to tell the OS when they want a higher refresh rate to display smoother motion.


> Anyone know the details, and nature of the update that has to be done?

I don’t know what is happening specifically with these apps (we’d have to look at their source code on a case by case basis) but to hypothesize:

Since 60Hz has been a de facto standard on the Mac for decades, I can imagine that a lot of apps may have a hard-coded 60Hz timer for updating their window contents. This would completely negate the advantages of a 120Hz display on things like scrolling smoothness. These apps would need to be updated to not assume 60Hz and instead update their window contents when requested to do so by the system. Furthermore, some applications which generate their window contents on demand (based on their internal data structures) may need new algorithms if they are too slow to update at 120Hz. I know that many applications (including web browsers) I've used over the years can't even keep up with 60Hz updates when scrolling through a complicated document.


It is about power consumption.

They could, if they wanted, automatically opt existing applications to use 120 Hz refresh. But the power usage would be terrible, and it would be up to application authors to manually go in and tune it down (which many would not do, or would take too long).

So instead they keep them at the current refresh, and have them "opt-in" to higher. Which I'm imagining Apple is hoping application developers will do responsibly and only ask for 120 Hz when it is beneficial (although big YMMV).


Apple still won't take some accessibility issues seriously, which is unfortunate given they are a leader in this space. They don't test their devices on people with binocular vision dysfunction, although I've asked them to.

There is a whole community of people on LEDStrain.org who suffer from severe eyestrain, vertigo, headaches and migraines to the point where these devices are unusable for a minority of people.

One possible cause is temporal dithering which can be seen as the dancing pixels on a slow e-ink display: https://youtu.be/0y-I3hqQgCQ. Windows also uses this technique, but Apple's causes more strain for some reason.

It would be great if apple could look into this.

Some of us are desperate and will gladly pay for help. I am at risk of losing my IT job because of these companies. I have a binocular vision dysfunction but treatment is not working well.

Any ideas from the community as the technical root cause?

Edit: not sure why I am being downvoted. Please have some empathy!


I think it is unfair to generally state 'Apple still refuses to take accessibility issues seriously' because the one issue you care about most isn't addressed to your satisfaction.

I think you'll find that generally speaking, Apple cares a lot about accessibility.


I agree, apple has done great with things like VoiceOver.

There is other stuff they can do, like a toggle to disable temporal dithering. Or actually test their devices with people with binocular vision dysfunction.


> Some are unable to work in IT or as programmers because of it.

This point is very forced. Just don't buy the most expensive laptop on the market.

I have never had any apple devices, and yet I'm a software developer.


Can you identify a system that doesn’t cause these issues for you?


Older systems like a ThinkPad from 2013 work. Something changed around then for the display tech.


IPS vs TN?

There are many, many differences between display techs: higher/lower contrast, brightness, refresh rate (which these days can be controlled at finer granularity than the whole display at once), colour response, and blue light and its presence/absence.

There is also simply screen size. Large screens (‘external’ monitors) are much more commonplace. Having more screen real-estate in field of view could be a factor.

How close we sit to screens. That many people also look at phone screens for the time they aren’t looking at a ‘computer’ screen. Maybe just total screen viewing time is important.

Perhaps it’s possible to eliminate many of these factors? Go back to 2013 screen size, amount of phone screen use, TV screen size, etc.

It might be easier to just go cold turkey and stop using anything but one small 2013 era monitor with a TN screen (no mobile, no TV, no tablet etc.) and see if all is well, then start re-introducing.


I thought CCFL vs LED, but maybe CCFL was already dead by 2013.


Ongoing related threads:

MacBook Pro 14-inch and 16-inch review (2021): Apple’s mighty Macs - https://news.ycombinator.com/item?id=28987380 - Oct 2021 (81 comments)

Apple's M1 Pro, M1 Max SoCs Investigated - https://news.ycombinator.com/item?id=28987276 - Oct 2021 (78 comments)

Previously:

New 16-inch MBP with M1 Max to feature High Power Mode for intensive workloads - https://news.ycombinator.com/item?id=28950733 - Oct 2021 (105 comments)

Apple M1 Max Geekbench Score - https://news.ycombinator.com/item?id=28933663 - Oct 2021 (870 comments)

MacBook Pro 14-inch and MacBook Pro 16-inch - https://news.ycombinator.com/item?id=28908383 - Oct 2021 (2056 comments)

Apple’s new M1 Pro and M1 Max processors - https://news.ycombinator.com/item?id=28908031 - Oct 2021 (981 comments)


> The "butterfly" mechanism offered reduced key travel and keyboard reliability.

I couldn't quite tell whether this was a subtle joke or not. I didn't know whether the butterfly design was actually intended to improve reliability or not. Regardless, the effect was actually worse reliability.

I decided to get a second opinion from a machine [1] that probably was trained on input from humans. It said that the noun phrases were {reduced key travel} and {keyboard reliability} and did not suggest that {reduced} modified both.

[1] https://explosion.ai/demos/displacy?text=The%20%E2%80%9Cbutt...


Interesting input from the ai, but reduced is definitely intended to apply to both "key travel" and "keyboard reliability" in that sentence.


So it is a subtle joke? What was the intent of the butterfly design change, then? Just reduced key travel?


Yep, I think the writer intended it as a joke, or at least a comment on how bad the butterfly keys were. The butterfly keyboards didn’t do much except help make the laptops a bit thinner. That had the effect of reduced key travel (often seen as a negative) and massive reliability problems (universally bad).


Oh, my mistake. I assumed that the "reduced key travel" was actually a feature.

If I re-read it as two negatives, it doesn't seem like a joke so much as a straightforward criticism.


I thought the joke was pretty explicit as I read it, and pretty funny.


reduced (key travel and keyboard reliability), not (reduced key travel) and keyboard reliability


> the Touch Bar. It was introduced in 2016 and… never really went anywhere. A more aggressive set of software updates from Apple might have turned it into something, but that never happened. > Now it’s gone, replaced instead by a very traditional row of function keys.

Well to be honest I liked the Touch Bar and I used it quite often. On the other hand I never used the "Function Keys".


How did you change volume or brightness?


You can set the touch bar to always show the "control strip" instead of keeping it minimized on the right side of the touch bar.


This is a game changer, thank you!

For anyone else looking to configure this:

 > "System Preferences" > "Keyboard" > "Touch Bar shows" => "Expanded Control Strip"


I was replying to the comment of a person that "never used the function keys" (in non-touchbar macs).


Can anyone recommend a good Thunderbolt dock they've been using? Ideally I would like a dock that at least provides DisplayPort, Ethernet, audio, and 1 or 2 USB-A ports. I'm fine with using the MagSafe if power through the dock is not available.


Another +1 from me for the CalDigit TS3+ dock. I've been using it for about 1 1/2 years now with dual 4K UHD @ 60Hz displays, and haven't had any issues whatsoever. I also use it for a wired network connection and several USB peripherals at the same time. I really appreciate its layout where most ports are rear-facing so that you don't have cables on your desk coming out towards you and then looping back. The additional front-facing ports are great for popping in the occasional flash drive when needed. For some people the 87W output may not be enough, but it's never been a problem for me.


I have two that I've used for 3-4 years, and both are rock solid. Very rarely do I need to unplug and replug (e.g. if a display doesn't get identified properly).

With the other TB3 docks I tried, they'd all have really annoying issues or even require reboots from time to time.


Great point about needing to unplug and replug displays back in. I had issues with that at first, but it turned out to be a result of a bad DisplayPort->HDMI adapter. Now that I'm using normal DisplayPort and Thunderbolt cables direct to the displays from the dock I haven't had any issues. And if I need to change the input on one of the displays it's all handled seamlessly and macOS flips over to recognizing only one external display.


I've been a happy user of the CalDigit TS3 Plus for a couple years: https://www.caldigit.com/ts3-plus/

Based on their reputation and my experience with them, I assume their TB4 Element Hub is a winner as well: https://www.caldigit.com/thunderbolt-4-element-hub/


I've been using the CalDigit TS3+ for the last ~year, and it's pretty good. The only real issue I have with it is that it only supports DisplayPort 1.2, so I have to run another cable for my 4k 120hz display.


What version of DisplayPort is needed to support that? I thought that 2560x1440 was the highest resolution you could run over DisplayPort with 120hz or 144hz.


Displayport 1.4 (with DSC enabled) can support 4k 144hz+ with HDR enabled
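Back-of-the-envelope numbers (active pixels only, ignoring blanking and audio) show why DSC is needed; the link-rate figures below are the standard DisplayPort values, and ~3:1 is DSC's usual "visually lossless" compression target:

```python
def payload_gbps(width, height, hz, bits_per_component=10):
    """Uncompressed RGB payload in Gbit/s (active pixels only, no blanking)."""
    return width * height * hz * bits_per_component * 3 / 1e9

# Usable payload after 8b/10b line coding, four lanes:
DP12_HBR2 = 4 * 5.4 * 0.8   # ~17.28 Gbit/s (DisplayPort 1.2)
DP14_HBR3 = 4 * 8.1 * 0.8   # ~25.92 Gbit/s (DisplayPort 1.4)

need = payload_gbps(3840, 2160, 144)  # 4K 144Hz 10-bit HDR: ~35.8 Gbit/s
assert need > DP14_HBR3               # too much even for DP 1.4 uncompressed
assert need / 3 < DP12_HBR2           # with ~3:1 DSC it fits easily
```

(2560x1440 at 144Hz is only about 16 Gbit/s at 10-bit, which is why it was the practical ceiling before DSC.)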


I also use the CalDigit dock, but I am annoyed that it has compatibility issues with Catalina (I am holding out from Big Sur as long as I can). Unplug the dock and, on next plug-in, macOS can no longer see the dock's USB bus. External keyboard, mouse, backup drive, etc. won't work unless you reboot macOS.

I've heard mixed reports on whether Big Sur has fixed this entirely.


FWIW I never had any issues with either Catalina or Big Sur


Does anyone have any insight into how many GPU cores to choose with these machines? I don’t do much in the way of video work; but I’d like to be able to run a couple of 4k external displays as well as the internal display. And ideally have video calls not slow down my entire machine.


The higher-end options are really only for intense video encoding, like if you're working on a movie (I wonder how many people choose MacBooks for that task). The basic option should be more than fast enough for running displays.


Pro can do 2 external 6k monitors and Max can do 3


The minimum. Even the smallest M1 GPU in MacBook Air has performance better than most Intel Macs.


Have any of the reviews yet compared a similarly specced 14" vs 16"? Since the primary difference in performance between the MB and MBA was thermals, I wonder if the extra size allows for better thermal performance and better overall performance in the bigger version.


Two charging questions for owners:

1. can you still charge over usb-c or are you forced to use MagSafe?

2. Is the other side of the magsafe cable usb-c or is it glued to the brick?

I rather like not having extra cables/charging blocks.


1. You can still charge over USB-C. (The 16" cannot charge at full speed over USB-C, because it charges at 140W which is more than USB PD can currently deliver. But it will still charge to full, just slower.)

2. The other side of the cable is USB-C.


USB-PD can currently deliver 140W, but it's fairly recent and Apple probably didn't have the lead time to implement that in these machines. Should be there in macbooks soon, I hope?


If 140W is more than USB PD can deliver, then how can the other side of the cable be USB-C?


Because the 140W charging adapter supports USB-C EPR (extended power range) [0], which allows compatible USB-C chargers, cables, and devices to work at up to 240W. (Obviously Apple's 140W charger only supports up to 140W.) However, the spec is so new that there's currently no USB-C to USB-C cable that supports 140W. I don't think we know yet if the MacBooks can charge at 140W with an EPR USB-C cable, not least because nobody is selling that cable yet. But I wouldn't be surprised if it doesn't work, because that'd require all three USB-C ports to support it.

The MagSafe 3 cable goes from 140W USB-C to MagSafe, but the USB-C charger is 100% standards-compliant USB-C EPR.

[0]: https://www.reddit.com/r/UsbCHardware/comments/qat7ej/new_ap...


That's really good news.

140W charging on the 16" had me worried that Apple was using a proprietary/non-conforming extension to USB-C, just like Nintendo did for the Switch. The reports of bricked Switches from standard USB-C chargers was an unfortunate demonstration of why this is a bad idea, and why it's important that Apple is using a standard (even if it's very new) method of increasing power delivery.


I really doubt that Apple would have gone out of their way to invent their own power profiles for the 16-inch MacBook Pro if one didn't already exist.

The 16-inch MacBook Pro certainly doesn't need 140 W to run. The Intel i9 MacBook Pro comes with a "measly" 96 W charger and it tops up just fine. The 140 W is there for fast charging and if they had to stick to 100 W, they could've made a professional machine just fine. It's not like the most common complaint about the MacBooks between 2016 and 2021 was "it charges up too slow".


Well, Apple did go out of their way to invent the new power profiles - they just did it via the USB-IF.


FWIW, Apple pretty much writes the USB-C standard.



It’s non-conformant, presumably.

There’s no real problem making a charger that can deliver more than the spec allows, but you wouldn’t be able to use it with a C-C cable. This isn’t one.


USB PD 3.0 can do up to 100 W, USB PD 3.1 can do up to 240 W.

Guessing they can only do 3.0 on the data ports for now, and the MagSafe connector is just a USB PD 3.1 implementation without any possibility for data.
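For reference, the fixed-voltage maxima line up like this; the voltage/current pairs below are the standard published values (20 V x 5 A was the old 100 W ceiling, and 28 V is the first of the new EPR voltage rungs, which is what a 140 W charger advertises):

```python
# USB Power Delivery fixed-supply maxima: name -> (volts, amps).
# Values per the USB PD 3.0 / 3.1 specifications.
PD_PROFILES = {
    "PD 3.0 SPR max": (20, 5.0),     # 100 W; needs a 5 A e-marked cable
    "PD 3.1 EPR @ 28 V": (28, 5.0),  # 140 W; the 16" MacBook Pro charger level
    "PD 3.1 EPR max": (48, 5.0),     # 240 W ceiling of the EPR spec
}

watts = {name: volts * amps for name, (volts, amps) in PD_PROFILES.items()}
assert watts["PD 3.0 SPR max"] == 100.0
assert watts["PD 3.1 EPR @ 28 V"] == 140.0
assert watts["PD 3.1 EPR max"] == 240.0
```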


The MagSafe port uses USB PD 3.1 EPR. The USB-C ports on the MacBooks don't actually support EPR, so they're limited to 100W.


With the other end being MagSafe, Apple has no need to adhere to USB PD specs.


makes sense, thank you


Presumably the charger has a proprietary feature where it will provide more power over the magsafe charging cable.


If you finish reading the article, the answers you seek are therein.


1. You can still use ANY USB-C port to charge the laptop. 2. The power adapter is USB-C to MagSafe 3.


Not an owner, but from the announcement:

1. You can charge over usb-c.

2. Yes, the MagSafe cable has USB-C at the other end.


This was answered in the article you're commenting on. But what wasn't addressed was that the charging cable costs FIFTY QUID (I don't know what that is in BUCKS) but ... that's a lot for a stupid cable.


From the announcement video, they said you can charge over both USB-C and MagSafe.

On the website for the notebook the "In the box" section shows a separate MagSafe -> USB-C cable.


Don't own one but the reporting has been solid that USB-C charging is still available and that the magsafe cable just has USB-C on the other end.


1. yes, you can charge with usb-c

2. it is a usb-c plug


1. Yes 2. It’s a USB c cable


I assumed that 10gbit ethernet was only quietly released as an option for the M1 Mac Mini some time after the initial announcement, because Apple didn't want to commit to components where supply constraints could limit their ability to scale up production in the case of high demand.

Likewise, I wonder if there will be options for HDMI 2.1 (for 4K@120Hz), SD slot with UHS-III, and wifi 6E in the next few months for the new MacBook Pros.


I expect the lack of HDMI 2.1 is bandwidth related. Full-bandwidth HDMI 2.1 is 48Gbps but Thunderbolt 4 is "only" 40.

I'm not sure if they still do, but at one point each pair of Thunderbolt ports shared a bus, so you could have something on the left running at 40Gbps and something on the right running at 40Gbps, but not two things on one side.

If that's still the case, the SD card slot and the HDMI port are probably sharing bandwidth with the one Thunderbolt port on that side of the machine.


That's an interesting point. What interface do they use for the integrated display? It looks like there are multiple sets of chips involved in the physical ports for Thunderbolt or USB depending on what is connected.


Do ARM compatibility issues still exist?


Yes, but it's very rare nowadays. I live in JVM land and there are quite a few native JNA libraries that I have run into issues with. One example is vlcj [1], which I just ran into yesterday. I have yet to dig deeply into the issue, but it's giving me the fun "no compatible architecture found" error. SWT and JavaFX both fully work with ARM now, so I am pretty happy on that front. GraalVM native-image also does not support building for ARM on Mac yet. On the C/C++ side of things I have not run into any incompatibility issues in the last 5 months. I'd say 99% of my time is smooth sailing at this point.

[1] https://github.com/caprica/vlcj
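When a JNA/JNI load fails with "no compatible architecture found", the quickest diagnosis is to list which CPU slices the native dylib actually contains (`lipo -archs foo.dylib` does this). As a rough sketch, the universal-binary header is simple enough to read directly; the magic numbers and CPU types below are from Apple's `<mach-o/fat.h>` and `<mach/machine.h>`, and error handling is omitted:

```python
import struct

# Mach-O magics and CPU types (from <mach-o/fat.h> / <mach/machine.h>)
FAT_MAGIC_BE = 0xCAFEBABE             # universal ("fat") binary, big-endian header
MH_MAGIC_64_LE = b"\xcf\xfa\xed\xfe"  # thin 64-bit Mach-O, little-endian
CPU_NAMES = {0x01000007: "x86_64", 0x0100000C: "arm64"}

def macho_archs(data: bytes):
    """Return the CPU architectures contained in a Mach-O blob."""
    if data[:4] == MH_MAGIC_64_LE:                     # thin 64-bit binary
        (cputype,) = struct.unpack_from("<I", data, 4)
        return [CPU_NAMES.get(cputype, hex(cputype))]
    magic, nfat = struct.unpack_from(">II", data, 0)
    if magic == FAT_MAGIC_BE:                          # fat binary: walk the arch table
        archs = []
        for i in range(nfat):                          # each fat_arch entry is 20 bytes
            (cputype,) = struct.unpack_from(">I", data, 8 + i * 20)
            archs.append(CPU_NAMES.get(cputype, hex(cputype)))
        return archs
    return []
```

If the returned list lacks "arm64", that library is the problem: a native arm64 JVM can't load an x86_64-only dylib, since a single process can't mix architectures (and Rosetta translates whole processes, not individual libraries).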


They really need to get on and support macOS ARM for GraalVM. It's clearly not going away - it just needs doing.


While we're asking, would it really be that much trouble getting Vulkan working? It falls in the same category of 'not going away but needs doing'.


Vulkan already works ok through MoltenVK.


It's far from a complete solution though, and it's still going to have worse performance than a native implementation. Apple needs to prioritize Vulkan support on these devices if they don't want to repeat the last 15 years of Mac GPU issues.



I wonder about games, and sadly this website doesn't list any.

I want a new laptop and those new M1 MBPs look interesting, with low TDP high performance. I primarily use it for development and browsing when I'm not home (I have a family overseas), but I also want to play games now and then.

I know Steam has native ARM support and some games I play are ported as well, but I really wonder how much I can get from double translation layers (x86 to ARM + Wine or derivatives).


Games are decent to good, I had a good experience using CrossOver--though I was playing 2d (spelunky, omori at the time). https://www.codeweavers.com/compatibility/ Retro emulation is phenomenal, PS1 PSP DS no problem, everything earlier great. YMMV as I have separate gaming machines


Of course. Any time you try to run a non-ARM binary on an ARM CPU without some form of translation or emulation, it will fail.

This is also true for any other CPU architecture.

If you want to know about specific compatibility issues, I'm happy to try to help as I've been using an M1 Air as my daily driver for many months now.


> without some form of translation or emulation

IIRC that's why Rosetta2 exists. The question is more whether there are issues despite that
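One practical wrinkle: it isn't always obvious whether a given process is running natively or under translation. A hedged sketch of the usual check, via the documented macOS `sysctl.proc_translated` key (the function simply returns False on other platforms or when the key is absent):

```python
import platform
import subprocess

def running_under_rosetta() -> bool:
    """True if this process is x86_64 code being translated by Rosetta 2."""
    if platform.system() != "Darwin":
        return False  # not macOS, so certainly not Rosetta
    try:
        out = subprocess.run(
            ["sysctl", "-n", "sysctl.proc_translated"],
            capture_output=True, text=True, check=True,
        ).stdout.strip()
        return out == "1"  # "1" = translated, "0" = native
    except (subprocess.CalledProcessError, FileNotFoundError):
        return False  # key missing on Intel Macs / older macOS
```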


I don't think I've ever had an issue related to Rosetta2. Everything I had to run on it worked flawlessly and without any noticeable performance hit.

This is only my own anecdote though.


I also had everything working fine but performance is really bad when you go through rosetta.

For example, discord client is still x86 and while it works without any issues, the heaviness is noticeable.

Also, .NET 5 does not have an arm build (.NET 6 does but it is still in beta). The entire toolchain works but compiling a binary can be 3-4 times slower on the emulated one.

Things work without any issues though. They are just slow.


I don’t notice many performance differences comparing GUI apps. Discord’s PTB is M1 optimized and, guess what: It’s just as shit.


PTB still shows up as Intel binary: https://cln.sh/5KA9Vh


I was running the main ruby on rails app I work on via Rosetta since the first M1 MacBooks came out. The performance hit was there for sure, but not huge. Running local tests or seeding my database ranged from "about the same" to 10-20% slower than my 10-core iMac Pro, which is still a pretty fast machine.

Since I've upgraded the app to support Arm natively, almost everything runs faster on the m1 13" macbook compared to my 10 core iMac, with the exception of long running tasks that will use all the cores you can throw at them.


There have been some font rendering issues in Ultimaker Cura (which is written in Python) that ran as x86 on M1. These have been resolved in the Python library.


In fairness, I've seen font rendering issues in Cura on Linux as well.


I've seen font rendering issues in Cura on Intel Macs as well. Cura's GUI and UX are pretty bad in general IMO though, so I don't use it.


Did they finally get Docker working on the M1 chips?



Native ARM containers work but there are various things that break with Docker x86 emulation, eg with Node.


Mostly. Docker for Mac's internal VM consistently kernel panics the first time I launch it after a reboot. I also have containers freeze up every now and then, something that never happened on my Intel machine.


Docker works mostly fine. ARM images are pretty common, and for those that don't have it, it runs a QEMU emulated x86 machine.

I have hit issues with it but they're rare enough that I don't care.


It’s somewhat unreliable and file system interaction is slow, but from what I’ve heard that means it’s no different from x86 Docker Desktop.


Docker has been working for months.


You’re missing the point, it’s an ecosystem question. I personally don’t experience many issues at all. If I do they’re usually solved by switching to Rosetta (but that does introduce some hairiness for command line stuff).


Full version of MS SQL Server doesn't work. The X86-64 Docker image won't start under QEMU emulation and the installer won't install it in the ARM64 insider builds of Windows 10/11 running in Parallels. You can use the ARM64 build of Azure SQL Edge, but it lacks many features that may or may not be important depending on your use.

Here's the GH issue related to the Docker issue if curious: https://github.com/microsoft/mssql-docker/issues/668


> Full version of MS SQL Server doesn't work

Stay tuned. (to avoid emulation)

> the installer won't install it in the ARM64 insider builds of Windows 10/11

Yeah, that was a bug: https://support.microsoft.com/en-us/topic/kb4530084-fix-erro...

The solution that works is to run the SQLS installer, see that error, install that CU update, then restart the SQL Server installer, which _then_ continues.

But that's an unreliable process, would need new install media to be fixed properly.


I have an M1, and I was unable to install tensorflow in a docker container on the M1 host (as of last time I tried, a few months ago).


I run TensorFlow's YOLOv4 no problem on an M1 MacBook Air; the cool thing is that you can fit larger models in memory than if you had an 8GB NVIDIA card.


Training or inference? How's training performance compared to 8GB NVIDIA, if you have one?


Yes, and they will never go away entirely

You can never run x86 Docker containers as easily and as quickly as on Intel. If you have a bunch of scripts in some deployment/development pipeline that depend on Docker being x86, they will fail.

Other things…I’m not sure about October 2021, but last time I looked, R was still only available through Rosetta, because parts of R are in FORTRAN and that does not have native M1 support (yes FORTRAN), but honestly it’s so fast with emulation that it does not matter.

Same with SciPy… or was it NumPy? One of those two has parts in FORTRAN and can only be run through Rosetta (so it can wreak havoc, as you might need python pipeline parts in ARM and parts emulated). But I have not hit that personally as I don't use Python directly.

You cannot run Windows either dual-boot or in a virtual machine. Same with x86 Linux.


There's no problem with either scipy or numpy running natively on Apple Silicon. The problem is with the 3rd-party package managers, which are all junk, and that's not new: they've always been hot garbage, especially homebrew. Just don't use homebrew, and then after you get used to that, don't use it on x86, either.


There were issues with numpy/scipy because of lapack and fortran. I guess they solved it somehow.


There were problems with homebrew finding a fortran compiler for this platform, there were not problems with fortran on macos/arm64 itself. fortran for macos/arm64 has existed since the platform was available. OpenBLAS has worked fine as far as I can tell with `make FC=gfortran`. The only problems have been with package managers.


You can run Windows ARM in emulation, as the MS Surface uses ARM and has a version of Windows available.

I have run Windows ARM in Parallels on an M1 on a client's machine.


In my experience the main issue with the M1+macOS experience is not the ISA, it's the big-little thread scheduling. I'm not sure what they were thinking but there's some heuristic that schedules background or batch work on the little cores. This includes manually-triggered software updates which unfortunately can take forever on those little cores. I waited almost 4 hours for an Xcode update to complete. All four little cores were maxed out the whole time, and these new chips only have two of them.


Doesn't an Xcode update _always_ take about 4 hours?


I haven't run into any issues. Homebrew seems to be (mostly) working, and most vendors have created universal images at this point. I think the only thing I'm personally waiting on is VMware Fusion (currently in beta).

Swift still doesn't have an Arm docker image from the official Swift.org group.


I mean, yeah. You're not going to be running anything 32-bit, and you can't use OpenGL or Vulkan, which already rules out a decent handful of workflows (including but not limited to my own). On top of that, there are the legendary Docker issues that plague pretty much anything running macOS, and there still isn't a competitive, complete package manager. eGPUs are almost completely off the table, and if they do work you won't be allowed to run an Nvidia card, you can't use CUDA, and OpenCL is fully deprecated.

So yeah, there are still quite a few holdouts on the macOS and ARM side of things. Unless you're confident that your workflow functions perfectly on ARM, you should probably stick with a nice x86 machine for the foreseeable future. Apple will definitely improve on this model, so if you are dead-set on buying one I probably wouldn't spring for the first gen anyway. Just my two cents though.


OpenGL still works on the M1 (Apple designed their own GL-over-Metal driver), albeit with the OpenGL version limitations from modern macOS in general. Vulkan works over MoltenVK.

Other than old games, I'm not sure why 64-bit only is an issue.

I've not hit any Docker-on-macOS issues with what I've used it for, at least.


Is MoltenVK usable right now? I was under the impression that it's a loooooooong ways behind DXVK in its current state.


I've seen plenty of projects that use MoltenVK as a drop-in backend for macOS. It seems complete enough for general use, but I haven't used it directly so I'm not sure what's missing.


They're largely resolved -- my Air no longer panics, and I cannot recall when it last did, though it did panic frequently for the first several point releases.

My web development is largely unaffected since around the 2nd quarter of 2021, and actively developed software either has a release for Apple Silicon or, when that's not the case, mostly works with occasional crashes (e.g. Inkscape). Homebrew, node modules, Deno, Rust, etc. are all working well, among various long lists of items that are less relevant beyond my generalization here. I intend to use applications running in Windows and QEMU but haven't taken the time yet.


Yes. Most user applications have been updated but plenty of libraries have not been.

Most of the python packages that have significant C deps often have issues compiling (e.g. Scipy, igraph, etc). It's almost always possible to work around these issues but it gets tiring since it's compounded with the usual macOS breaking changes every year. Any binary compiled with AVX-512 instructions (e.g. Tensorflow, MuJoCo, etc.) will also need to be recompiled.

So tl;dr most (but far from all) popular closed source programs have been updated/recompiled for ARM (or at least run decently with Rosetta), but many libraries/open-source projects require some painful recompiling.


youtube-dl didn't work when I tried it 3 weeks ago


I’ve used youtube-dl for the better part of a year now on M1 without issues. If it didn’t work for you, then you need to look into what’s broken on your specific machine.


youtube-dl is just Python; whether or not it runs just depends on whether Python's interpreter runs on ARM, which it definitely does.


> the minimalist design mid-2010s Apple, which achieved design simplicity by forcing complexity and frustration on users

This is a really insightful, succinct take on how they strayed from the path.


The body text of this is very hard to read on Windows (Firefox & Edge) since the text appears to be bolded.

I would change to font-weight 300 or something like this.


Indeed. It's all wrapped in one huge <strong> tag, which causes browsers to apply the bold font style.


I think it's pretty cool that video is getting such a performance boost; however, there are more developers than video editors out there who could have used a few more general-purpose cores on the silicon. If you look at compile times, the M1 Max is just slightly better than AMD mobile CPUs produced on an inferior process node. Nothing to write home about.


The big thing to look at here is perf-per-watt and heat output.

For a little bit I had a laptop with a Ryzen 5900HS and though its performance was great, utilizing that performance meant turning the laptop into a cooking surface with fans raging while being plugged into a barrel-style power brick that delivered more power than USB PD could handle. A laptop delivering that same performance without the crazy heat and fan noise while on battery while still delivering good battery life is a big deal.


I'm curious what specs software developers are planning on getting. From what I've seen, the M1 MBA/MBP was good enough for a lot of use cases, so I don't see the point of getting some specced-out MBP. I am currently leaning towards the base model 14" but with 1TB of space.


Minimum config that allows 64 GB RAM which unfortunately means the M1 Max, 2 TB SSD. Prices out just under 4k for the 14”, over 4k for the 16”. Not sure I care about the screen size, I’m almost always at my desk with a 43” display. But either way it’s gonna be a few months before I can budget it.


Getting the 10/16 CPU/GPU with 32GB of RAM and 1TB of storage.

The only real uncertainty for me was the GPU, whether to get 14/16 but the difference in cost was small and I figured it might be nice for some physics sims + tensorflow work.


M1 is one of the best things that happened to the MacBook. This is one of the best Macs to buy right now.


If it were above board to download and run macOS in VMware as a guest under my Linux host (slowdowns because of arch differences forgiven), I'd be way more willing to dip a toe into the ecosystem and see how I like it...


Aside from going into an Apple store, probably the cheapest way to try out the M1 Macs is to rent one from Scaleway (€0,10/hour + tax, minimum 24h - https://www.scaleway.com/en/hello-m1/ ).


If you were really curious you would use MacOS bare metal as shipped out of the box, with Linux running in a VM instead of wishing for premises you know won't happen.


Yes, I really am curious. And NO, your assumptions about what I would do if I really was curious aren't correct.

I had a MacBook Pro around 2011 (OS X Snow Leopard). Half a dozen happy years with it until it was stolen. But these days, since Apple is forbidding running macOS under virtualization, that's a deal breaker for me.


Apple has not forbidden running macOS under virtualization for years...


which is the right one to get, 14 or 16?


I'm wondering that too. As usual, the pros/cons of physical footprint and screen size cancel each other out.

Unlike previous years, Apple allows you to configure the 14" model with the highest CPU/GPU configuration, so it changes the calculus a little bit.

The 16" should have better battery life and cooling, but it starts at $500 more. And it has "high power" mode, but I doubt many would need it.

This is probably the first time where it comes down to only which size you want to carry around.


Depends on your workflow.

Do you find yourself working away from your monitors a lot? Do you move from place to place, or do you set up and not move for a while? How's your eyesight? Do you use the native screen at all while docked?

I find 14" to be good for mostly docked work, or very mobile work. I use reading glasses anyway, so the smaller screen doesn't bother me much, but I find it difficult to side-by-side 2 windows. It's way lighter and fits in almost any bag.

I like the 16" because when working, I use the native screen with my monitors. I don't use this laptop anywhere except in 2 places (home office, and work office). I can definitely fit multiple windows on the screen. I can scale the display up, and it doesn't wind up eating a bunch of my screen real-estate. Apple is quoting more battery life for the 16", so not sure if that factors into your workflow or not.


>side-by-side 2 windows

I never use that even on 40" display. In general where size matters: terminal window is much easier to use if you write long one-liners, reading logs, editing in vim; I do set fonts to be large to have 240x70 characters on the full screen.

Even 16" is too small for any productive work, not to mention the ugly Apple German keyboard layout is absolutely terrible for any serious programming (square and round brackets are in very hard-to-reach locations).


I have a 13 and 16 (one is mine, one was paid for by my office). All else being equal, I almost always choose the 13 when I need to be portable. YMMV. If your workflow benefits significantly from the extra screen size, that could tip the scales the other direction. It does not for me.


If you mostly use your laptop at your desk or docked, get the 16 (unless you literally clamshell it).

If you travel a lot or otherwise are moving around with it, get the 14.

16 was a bit big on airplane tray tables, but this one might be a bit smaller.


One (small) consideration is weight. Both versions weigh more than previously -- so if the previous 16" was already pushing what you find comfortable to carry, then the 14" might be a better choice.


I am debating. Thinking of going to the Apple store to check it out.

14: pro - lighter; con - less screen for multiple windows

16: pro - multiple windows side by side, longer battery life; con - 5 pounds, won't fit in backpack


I used to go 15-16", but now I feel that is too big for a laptop.


I'd love to buy it for the better screen, but since it is my personal laptop and I don't need the performance, I don't know if I can justify the cost :/


Can someone tell me whether their 14in gives enough real estate when debugging in VS code with the terminal open?

A screenshot would be greatly appreciated, thanks.


I almost never work without an external monitor (or two), so the laptop screen is really just for chat and email for me. Honestly anything below 20 inches is too small to be usable for serious work in my opinion. Hunched over a little screen is unergonomic and tiring. The 16” is only $200 more than the 14”, I think, for comparable hardware setups, so probably just go with the 16”.


For me 14" is enough to do work, but I definitely would not want to stare at it all day. These days you can just connect a second (portable) display via USB-C, so that helps a little. Though I don't recommend doing that on battery alone.


Only you can answer this question. If you're used to a larger screen then it's quite the adjustment


No, especially with the terminal.


I wonder how the fan noise is going to be. My old (late 2013) MBP is dead silent unless I'm running something heavy like a video game.


I watched the abridged intro video and I swear I thought it was a parody... Everything introduced seemed like a joke.


If I had to choose between an SD card slot and a USB-A port, I definitely would have chosen a USB-A port. I would take both, if possible. I would even give up a USB-C port for a USB-A port, but that might be a little backward-thinking. The USB-A to USB-C adapter is small and inexpensive, but it is still something I will need, mainly for the occasional flash drive.


I disagree. I don’t purposefully keep A and C devices. I’m switching everything over to C and have dongles to do that.

Supporting both wastes the space for when I’ve moved to C.

I’m more likely to have an A->C dongle than an SD card reader.


Maybe Logitech will make a USB-C dongle one day; until then I'm stuck with an adapter that sticks out and is constantly in danger of being knocked and bent.


We may live in the future, but many of us still work with people who live in the past. I know Apple always tries hard to be forward-thinking, but USB-A is one of those ubiquitous standards that won't disappear for quite some time.


That's why they went with USB-C. USB-C is backwards compatible, USB-A is not forwards compatible.


I’ve been migrating to usb-c since my MacBook four years ago. I don’t have any A devices and just keep adapter dongles for people who have thumb drives or something.

I don’t care that much because it’s the size of a quarter and goes in my bag with HDMI and other adapters.


The Magic Keyboard still comes with a USB-A plug. Also, my low-profile USB-A YubiKey is still fantastic and there's no viable replacement. It has been 6 years since Apple removed USB-A. I think we've found the line between forcing painful but necessary change (remember when Apple removed the floppy drive in 1998?) and trying to force change, but failing. In the transition to USB-C they tried to replicate their success killing the floppy. I think we can all agree that they failed.



Is everything still glued together without being able to repair yourself? /s


A fucking notch!!!


I can fit a 16" MBP in my backpack. Your mileage may vary.


The title says Mac Pro, not MacBook Pro.


I'm aware. 16" is even more powerful though!


I would have gone with the 16" MBP if it weren't for the weight.

I am interested in how different the performance is between the 14" and the 16".


All the info I've found suggested no real delta, so I went with the 14". I run external monitors at home, and having the smaller size for travel was appealing.


Considering the wattage and thickness of both designs, I bet they are almost identical when specced the same. I don’t think the 16" has significant extra cooling, or that the M1 Max throttles much.


Other than that stupid notch it looks pretty good.



