Seems like Microsoft's modus operandi for the last few years has been:
Make anti-consumer move -> get backlash -> repackage same egregiousness while stalling & deflecting -> repeat cycle
Steamrolling their users then getting rewarded with their stock going stratospheric. Excellent!
> Steamrolling their users then getting rewarded with their stock going stratospheric
Welcome to the world of modern capitalism. I'm seriously starting to question if a company can survive on the stock market by creating a solid product and caring about the users of that product.
Perhaps manufacturers could unlock bootloaders & whatnot after the 7 years so tinkerers et al. can load 3rd-party software and give the devices even longer lifelines?
I think they should have a larger responsibility than that, but yeah I absolutely agree that once devices are no longer "supported", the manufacturer should be required to relinquish control to owners in a safe manner (e.g. Android unlockable and relockable bootloaders).
Same goes for multiplayer game developers. If they want to stop hosting servers, they should be required to release the server software in a form that lets me set it up myself and keep playing the game.
They should be required to place the keys and the initial software build into escrow, to be released either when a fixed time has elapsed or when they have been found culpable of breaching an applicable law (e.g. if they stop providing software updates).
Ah... There's a pattern here. Soon enough, just like with Facebook pages eons ago, they will nerf the reach of WhatsApp channels then prod channel owners to pay for more eyeballs.
It should be a law of nature that whatever Meta/Facebook acquires will surely become ad-riddled & 'spyware'-infested, regardless of the "we won't" promises they swear to abide by.
Tangent: I find Instagram to be one of the most egregious battery drainers among mainstream apps. A former IG employee claimed here on HN that the inefficiency was due to ghastly experiments being run on clients' devices. Shame I can't seem to find that comment.
Meanwhile, Mercedes is still charging for full rear-wheel steering in Europe after electronically nerfing cars [0], despite shipping the cars with fully capable hardware.
It would be nice if Nvidia did not impose artificial driver and legal kneecaps on consumer GeForce cards for cloud usage to prop up their enterprise ones... but shareholder rights come before anyone.
But then what's stopping cloud customers from buying up all the consumer GeForce stock on the cheap and putting those cards in the data center, like in the crypto mining days?
Cloud customers can afford to pay more for those GPUs than gamers because they generate revenue with them; gamers don't.
So it makes sense to have some product segmentation in place to prevent one market from completely cannibalizing the other and leaving Nvidia with less profit.
The current situation is still caused by manufacturing constraints at TSMC on the cutting-edge nodes, which both the consumer and data center parts occupy, so it makes sense for Nvidia to prioritize the higher-margin parts.
There have been good arguments made that Nvidia should split into two companies: Nvidia, the general-compute company oriented toward data center customers with deep pockets, and GeForce, the gaming GPU company with access to all of Nvidia's cutting-edge tech but scrappier, optimizing designs for rasterization performance rather than generic compute and chasing smaller die sizes on cheaper nodes to stay price competitive. That way the data center compute market would stop cannibalizing the consumer gaming one, and we'd be back to having better GPUs at competitive prices.
There are some debatable licensing terms in various Nvidia driver releases that prohibit consumer cards from being hosted in "datacenters".
But the real issue is physical form factor and power. As has been noted in the press, something like an RTX 3090 (and even more so a 4090) is literally designed to push frames as fast as possible, power and heat be damned. They're multi-slot (which results in poor density) and have card design/cooling challenges, power configuration issues, etc.
There's a story out there about the only dual-slot RTX 3090. Gigabyte came up with one (I have several - they're great), but supposedly Nvidia pressured them to pull it from the market [0] because people were putting them in x8 server configurations and using them instead of Nvidia's much more expensive datacenter products.
They're just trying to eat the consumer surplus from enterprise customers, who are higher up the demand curve. Everyone does that.
An individual developer is happy to charge a larger corporation a higher salary for their services than they would an SME, simply because in a large org those services generate more value, allowing the developer to capture more of it.
You have long since missed the boat on changing that. This is how business is done: "well we can charge you 5x the market price for the RAM/SSD upgrade, so we will!"
I don’t disagree, but I think that’s a poor analogy. I don’t think devs take into account the business value their future job will bring their employer when negotiating salary. And if they do, they only do so when the balance is in their favor and they definitely wouldn’t lower their salary if they think the job has less impact than another job.
They do when they decide to interview for large orgs. They do it because they get better pay. It's the same service. Why not work for a small org that pays less?
They didn't just come out on top, they revelled in it. What brought us back to some relative normalcy was the crypto crash and Ethereum's switch away from PoW; even after that, the 40-series pricing and range looks like Nvidia cashing in on scalper prices.
Nvidia maintained the MSRP of 30-series cards during the WFH boom and did not allow AIBs to increase prices; this was one of the main complaints from EVGA and ended with them pulling out of the GPU market. The scalping was done by third parties.
You could always use a GeForce card at home. Are you saying the cloud should use those GeForce cards and completely distort the price of GPUs for home use?
> Apple will reportedly receive 20 billion USD for having Google as the default search engine. Presumably, this is why they are fighting so hard against having any meaningful competition on iOS.
I have a naive question: what would happen if Google simply stopped paying this money and let the user decide? I guess it wouldn't affect them much; Google is still the best search engine.
It's also 20 billion Apple has to do almost nothing for, except ban the other browsers and set Google as the default search engine in Safari. It's a significant proportion of their yearly profit.
This doesn’t make sense. There are other browsers on the App Store and, although they have to use Safari’s browser engine, they’re free to prioritise search engine preferences as they see fit.
The real versions of Firefox, Chrome, Edge, etc. have been banned. Instead, Apple forces them to create new browsers built on its controlled, locked, and unmodifiable WebView, removing the majority of the ways these browsers can differentiate themselves, while keeping features exclusive to Safari (like the ability to install web apps).
This ensures that none of the "browsers" can compete on iOS, and this is obvious when comparing the market share of the same browser between iOS and Android.
To be precise, the thing that makes a browser meaningfully different (the engine: Firefox has Gecko, Chrome has Blink, Edge also uses Blink, although Blink is often identified as Chromium) isn't allowed to exist on iOS.
As you say, Apple only allows their own WebView to exist on iOS, which is an engine they both control entirely and is heavily locked down. Not helping matters is that WebView runs on WebKit (Safari uses WebKit as well), which is these days pretty much the equivalent of Internet Explorer in terms of browser shenanigans[0].
The result is that about the only real thing you get from Firefox/Chrome/Edge on iOS is access to your synchronized bookmarks. Apple doesn't offer these browsers any form of WebExtension implementation either (instead rolling their own version, which they confusingly also call WebExtension), and none of the previously mentioned browsers are even allowed to add the universal form of WebExtension support to WebKit. As a result, iOS remains one of the few platforms where meaningful adblocking is a crapshoot (entirely beneficial to Apple, of course).
[0]: To be somewhat fair here, WebKit *is* very useful for more embedded/low-powered devices that aren't intended to access a lot of websites to begin with. There are some uses for WebKit; IE had none near the end.
You couldn't set any of these alternative browsers as the default until fairly recently. That ensured their market share was kept to a minimum, and people have gotten so used to Safari that few find it worth switching now.
Also, Safari has some superpowers compared to the others on iOS (things the other browsers are simply not allowed, or even able, to do, like Add to Home Screen).