
Apple is known for taking something others invented and making it better/more user-friendly. That's great, but it isn't being bleeding edge. Take the first few iPhones: smartphones made better, yet missing simple functions like copy and paste for generations. I find it surprising anyone thinks Apple's strength is being bleeding-edge innovative. The times they have been can be counted on one hand.


Just in recent weeks:

- M1 [0]

- A14 in the iPhone 12 Pro (first 5-nanometer chip with 11.8 billion transistors) [1]

Previously, they introduced the first 64-bit CPU in a mobile device, which stunned competitors at the time [2].

Not to mention their excellence in hi-DPI displays. In 2012, Apple launched the MacBook Pro with a "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent). However, the MacBook Pro 16" remains superior when it comes to sound quality, form factor (i.e. weight), touchpad precision and latency, thermal performance, etc., despite significant investments from Dell to catch up in those areas. [3]

[0]: https://www.apple.com/mac/m1/

[1]: https://www.apple.com/iphone-12-pro/

[2]: https://www.forbes.com/sites/ewanspence/2015/01/21/iphone-5s...

[3]: https://www.youtube.com/watch?v=72vjKOSwf3M


That's cool, but nothing on that list strikes me as a revolutionary kind of development. To me, revolutionary is the invention of GPS, or the transistor, or the combustion engine, or the lightbulb - those inventions changed the world.

With everything on that list, they made a better version of something that already existed, especially with displays - it's not like they were the ones developing and manufacturing the displays. Even the ARM architecture and instruction set wasn't created by them.

I think it's important not to get carried away.


> In 2012, Apple launched the MacBook Pro with a "retina" display. It took _years_ for non-Apple alternatives to materialize.

Ironically this is now somewhere they could stand to improve.

The MacBook displays are excellent, particularly when it comes to colour reproduction, but for the past several years they have defaulted to a scaled display mode. For anyone not familiar, the frame buffer is rendered at a higher resolution and then scaled down for the display, trading sharpness for screen space.

Evidently the drop in sharpness is imperceptible to most people, but I can certainly tell, to the point where I forgo the extra space and drop it back to the native resolution.

For a company that generally prides itself on its displays, I think the right option would be to just ship higher res panels matching the default resolution.

They have also done this with certain iPhone displays over the years, but at 400+ppi it’s well within the imperceptible territory for most people. For the 200-something ppi display on the MacBooks, not so.
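
For a rough sense of the numbers, here's a quick Swift sketch of the ppi arithmetic (the resolutions are published specs; the 6.06" and 16.0" diagonals are my assumption of the actual panel sizes, so treat the output as approximate):

    // Pixels per inch from panel resolution and diagonal size.
    func ppi(widthPx: Double, heightPx: Double, diagonalInches: Double) -> Double {
        (widthPx * widthPx + heightPx * heightPx).squareRoot() / diagonalInches
    }

    print(ppi(widthPx: 2532, heightPx: 1170, diagonalInches: 6.06))  // iPhone 12 Pro: ~460 ppi
    print(ppi(widthPx: 3072, heightPx: 1920, diagonalInches: 16.0))  // 16" MacBook Pro: ~226 ppi

At ~226 ppi there is far less headroom to hide the resampling than at ~460 ppi.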


> […] but for the past several years they have defaulted to a scaled display mode. For anyone not familiar, the frame buffer is rendered at a higher resolution and then scaled down for the display, trading sharpness for screen space.

My understanding of how scaled resolutions in macOS work is that graphics are always rendered at the display's native resolution. The scaling factor only decides the sizing of the rendered elements. Can you point to some documentation that supports your view? I'd like to learn if I'm wrong and understand all the details.


deergomoo is correct. Apple's "Retina" displays work by rendering all screen elements (images/icons/text) at 2x the linear number of pixels of their non-Retina counterparts. Since it's a fixed 2x scaling, the only way to get anything other than the native panel resolution (with elements at 2x their non-Retina linear pixel count) is to render to a frame buffer larger than the actual screen. That frame buffer is then scaled by the GPU to fit the actual screen. Because it's usually scaling down rather than up, this theoretically results in only very minor blurring that most people don't notice.

It used to be that this non-native scaling was only an option, and by default the MacBooks ran at the exact native panel resolution. But at some point that changed, so the default is now one "notch" along on the "more space" slider. I presume most people preferred it that way, since you don't get a lot of text on the screen at the native "Retina" resolution, but the sharpness is worse than when running unscaled.
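
To make that concrete, here's a rough Swift sketch of the arithmetic (not actual macOS APIs; the numbers assume a 2880x1800 panel like the 2012 15" model, with a "looks like 1680x1050" scaled mode selected):

    struct Size { let width: Int; let height: Int }

    // Physical pixels of the panel.
    let nativePanel = Size(width: 2880, height: 1800)

    // "Looks like" resolution chosen in the display settings, in points.
    // The unscaled 2x default for this panel would be 1440x900.
    let looksLike = Size(width: 1680, height: 1050)

    // Retina UI is always rendered at 2x the point size...
    let frameBuffer = Size(width: looksLike.width * 2,    // 3360
                           height: looksLike.height * 2)  // 2100

    // ...and the GPU then scales that frame buffer down to the panel.
    let scale = Double(nativePanel.width) / Double(frameBuffer.width)
    print("Render at \(frameBuffer.width)x\(frameBuffer.height), scale by \(scale) to fit the panel")
    // scale ~= 0.857, i.e. a non-integer downscale, hence the slight softness.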


It's easy enough to set them to the native resolution first thing. They probably noticed a lot of people don't like small text, and so they set the default to scaled.


The default is smaller text than native, not larger.


> In 2012, Apple launched the MacBook Pro with a "retina" display. It took _years_ for non-Apple alternatives to materialize. In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+ (which I own, and it's excellent).

Uhh, both Sony and Dell had 1080p, 1200 vertical and then QHD laptops in form factors down to 13" before Apple. I owned both before I moved to Apple myself.

You can read people here talking about how their laptops have had high res displays when Apple announced "Retina" back in 2012: https://news.ycombinator.com/item?id=4099789


1080, 1200, and 1440 are all of course smaller than the 1800 vertical resolution on the 15” MBP.

But it’s not just the resolution, it’s that Apple made such a high resolution usable via 2x rendering, and did so immediately for the entire system and all applications.


They are. But 1440 on a 12" laptop (like the Sony Vaio) competed well.

You can also get a 4K UHD Dell at 13".

> In fact, the only comparable laptop display in existence today would probably be the Dell XPS 17" UHD+

> But it’s not just the resolution

It was, above. Now it's the resolution and the ecosystem. "Apple did it first." "No they didn't." "Well, they were the first to do it right" (for varying definitions of "right").

I have no particular horse in the game. In fact, my entire home ecosystem from Mac Pro to MBP to iPad, iPhone, Watch would more lean me in one particular direction, but ...


Wow, people talking about getting a 512GB SSD in their laptop for $2.5K as a good deal is kind of amazing - I think my $450 HP Pavilion with a 1050 and AMD 3500H might just be faster in every single way (but it definitely has lower build quality...)

It's amazing how far we've come


I paid a fortune for flash storage from 2009 to 2016 and it was totally worth it.

There are some technological improvements that are so transformative (wifi, flash storage, high-resolution/"retina" display, LTE data, all-day battery life) that once you try them you never want to go back.

Then there are the changes that make you go "hmm..." (butterfly keyboard, touchbar without a hardware escape key, giant trackpad with broken palm rejection...)


Not sure about bleeding edge, because for the most part their products just work, but at different times they have defined:

- Phones (Original iPhone way ahead of competitors)

- MP3 players (Original iPods)

- Tablets (Pretty much the only serious tablet as far as I can see)

- Smart watches (Apple Watch still defines the category)

- Ultrabooks (First MacBook Air)

- All-in-one desktops (iMac)

- Mobile CPUs (Apple silicon has been way ahead for years)

- Laptop CPUs (M1)

This doesn't all just happen by making existing things more "user friendly". It takes real innovation to pull off.


> Not sure about bleeding edge, because for the most part their products just work, but at different times they have defined:

I'm ex-Apple and an Apple fan as much as anyone, but I also have the benefit of being old. Not to take anything away from Apple's collective accomplishments, but in many of these categories I'd say they "redefined" more than "defined".

There were many smartphones before the iPhone (the Palm Treos were great), many MP3 players before the iPod, many tablets before the iPad (the Microsoft Tablet PC came out about a decade before the first iPad), all-in-one PCs go back 40 years now, etc.


> I'd say they "redefined" more than "defined"

Pretty much, and they did a pretty good job too.

It's kind of the difference between invention and innovation.

Apple certainly does invent things, but they are a superlative innovator.


There were many computers before the Xerox Alto.

Xerox never made the Macintosh because they missed key innovations such as regions, the Finder, consistent UI guidelines, and the ability to put a usable GUI OS in an affordable package.


PARC was also bound to Xerox which had trouble seeing beyond its photocopier business.


Doesn't matter what PARC's limitations were, what matters was the Macintosh was a huge innovation that created what the Xerox Alto and Star were missing.


As someone who adopted smartphones years before the iPhone: anyone who thinks the iPhone was just “incremental” is deluding themselves. The capacitive display alone was a game changer, let alone everything else.



