> computers are more complicated than they were 20 years ago.
Feels like Windows (or maybe modern PCs in general) just doesn't work as well as it used to. I ran Windows 7 x64 for years with very little complaint; 8.1 wasn't great UI-wise, but it worked.
10 onward just seems to lag & crash more. A totally crazy theory would be that binary compatibility (and dependencies) stops working well once the original author of a piece of code has retired (or died).
You'd need an emulated macOS GPU. The only way I could see this working is if there were a way, through the macOS hypervisor on Big Sur or later, to pass Metal through.
That's like saying that to emulate an ARM or Alpha processor, you need an ARM or Alpha processor to do the emulating... that's what emulation is about... running one CPU/architecture on a different CPU/architecture...
I think you're mistaken; virtualization and emulation are two different things. Metal is not low-level enough to perform high-speed virtualization. What the parent is suggesting is that you'd need to use Metal to accelerate an emulated GPU, which would result in pretty substantial performance degradation. On top of that, most virtual machines wouldn't recognize the M1 APU, which means you'd probably also need to emulate interfaces to complement it. When all is said and done, you'd have an incredibly fragile and slow stack that has a good chance of breaking with every Metal update and almost certainly breaks with each new GPU.
The struggle is not worth it, certainly not for $1,000. Probably not for $500,000.
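To make the emulation-vs-virtualization point concrete, here's a minimal sketch of an interpreting emulator for a made-up toy ISA (the opcodes and register names are invented purely for illustration): every guest instruction pays for a full fetch/decode/dispatch round trip in host code, which is exactly the overhead virtualization avoids by running guest code natively.

```python
# Minimal interpreting emulator for an invented 3-instruction ISA.
# Each guest instruction costs a full Python-level fetch/decode/dispatch
# round trip -- the overhead hardware virtualization avoids by letting
# guest code run natively on the host CPU.

def emulate(program, regs):
    pc = 0
    while pc < len(program):
        op, a, b = program[pc]          # fetch + decode
        if op == "ADDI":                # regs[a] += b
            regs[a] += b
            pc += 1
        elif op == "JNZ":               # jump to b if regs[a] != 0
            pc = b if regs[a] != 0 else pc + 1
        elif op == "HALT":
            break
        else:
            raise ValueError(f"unknown opcode {op}")
    return regs

# Count down from 1,000,000: dozens of host operations per guest instruction.
prog = [
    ("ADDI", "r0", -1),   # r0 -= 1
    ("JNZ",  "r0", 0),    # loop while r0 != 0
    ("HALT", None, None),
]
print(emulate(prog, {"r0": 1_000_000}))   # {'r0': 0}
```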
Consistently not being able to run code that was just fine 2 years ago. Consistently having "quality assurance checks" that somehow always entail paying more rent to Apple.
I'm not sure why this was ever dropped. If I remember right, back in the early 2010s this ran XP very usably on a C2D laptop. I'm pretty sure I even got 7 working at some point; the only real limit was that it absolutely could not run 64-bit VMs.
CPU as in core or socket? These days most CPUs are "many-CPU-cores-in-1-socket", and having X CPU cores spread over 1 or 2 sockets makes a small difference, but software does not care about sockets.
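As a rough illustration (Linux-specific; reading /proc/cpuinfo is an assumption about the platform), what software normally asks for is the logical CPU count, and the socket count has to be dug out separately:

```python
# What software usually asks for: logical CPUs, regardless of socket layout.
import os
print("logical CPUs:", os.cpu_count())

# Socket count has to be dug out separately (Linux-only: parse /proc/cpuinfo).
try:
    with open("/proc/cpuinfo") as f:
        sockets = {line.split(":")[1].strip()
                   for line in f if line.startswith("physical id")}
    print("sockets:", len(sockets) or 1)
except FileNotFoundError:
    print("no /proc/cpuinfo on this platform")
```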
As a trader, I can say that at least in finance, the dead internet theory does apply. Type any stock ticker into Google, especially a small one (random example: MRAM): not a single article for "mram stock" is written by a human. You get AI-generated pages such as "stocknews.com" and dashboards like FinViz (which, fwiw, is a good and useful website). The articles are generally generated from quantitative (and therefore easy-to-automate) aspects of stocks, for example price to earnings, or the fact that the price has grown a lot.
A second type of website exists, which I learnt of from searching tweets in Google and finding that algorithmic spinning (replacing words to avoid plagiarism detection) was clearly being employed. The grammar is often laughably bad, but as far as Google is concerned it's apparently either the best content available (as their algos see it) or the only content available.
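To illustrate what I mean by spinning, something as crude as this sketch (the synonym table is invented for illustration) already produces the kind of slightly-off wording you see on those pages:

```python
import random

# Invented synonym table -- real spinners use far larger dictionaries,
# which is why the word choices end up sounding slightly "off".
SYNONYMS = {
    "stock":     ["share", "equity"],
    "rose":      ["climbed", "moved up"],
    "investors": ["market participants", "shareholders"],
    "strong":    ["robust", "solid"],
}

def spin(text):
    # Swap each known word for a random synonym to dodge
    # naive duplicate-content / plagiarism detection.
    return " ".join(random.choice(SYNONYMS.get(w, [w])) for w in text.split())

print(spin("the stock rose on strong earnings as investors piled in"))
```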
One more note: Google Trends data is 'deflated' relative to the actual number of searches (you can verify this by using Google Ads, which gives absolute search counts, and comparing against the Trends data). I assume the method is similar to what is used for the transparency report's "censorship / outage" feature (explained here: https://transparencyreport.google.com/traffic/overview?hl=en)
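For the curious, here's roughly how you could 'un-deflate' a Trends-style 0-100 index back into absolute counts once you have a single month's real search volume from Google Ads (all numbers below are made up, and simple linear scaling is my assumption, not something Google documents):

```python
# Google Trends gives a 0-100 index, not search counts. With one anchor
# month's absolute volume from Google Ads you can rescale the whole series.
# All numbers here are invented; linear scaling is an assumption on my part.

trends_index = {"2021-01": 40, "2021-02": 100, "2021-03": 55}
anchor_month, anchor_searches = "2021-02", 12_000   # from Google Ads (made up)

scale = anchor_searches / trends_index[anchor_month]
estimated = {m: round(v * scale) for m, v in trends_index.items()}
print(estimated)   # {'2021-01': 4800, '2021-02': 12000, '2021-03': 6600}
```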