It's not the basic mail and calendar functionality that drives large businesses to Microsoft (and to a lesser degree Google). It's really not anything that a normal user would see in an average role.
Email in a large organization requires things like central management, compliance with retention policies and other regulations, data loss prevention, encryption standards, auditing and eDiscovery capabilities, etc.
Probably the lack of external stimuli. Generative AI only continues generating when prompted. You can play games with agents and feedback loops but the fundamental unit of generative AI is prompt-based. That doesn’t seem, to me, to be a sufficient model for intelligence that would be capable of “pondering”.
My take is that an artificial model of true intelligence will only be achieved through emergent complexity, not through Frankenstein algorithms and heuristics built on generative AI.
Generative AI does itself have emergent complexity, but I'm bearish: even if we hooked it up to a full human sensory input network, I doubt it would be anything more than a 21st century reverse mechanical Turk.
Edit: tl;dr Emergent complexity is a necessary but insufficient criterion for intelligence
this is such a weird take to me. every piece of evidence I've seen shows that AI is quickly becoming better at writing code, debugging, and finding security issues. my own experience, benchmarks, studies, news articles... everything points to progress
So why was Moltbook full of security holes? I don't doubt that you can use AI to fix some bugs, but that probably requires at least someone writing the prompts who cares about and understands the bugs.
It's like the old story about hiring a carpenter to fix a squeaky floor, who just hammers in a single nail. The difficult part was finding where the nail needed to go, not the hammering itself.
The business version of Windows doesn't have ads in the start menu. That's the consumer/home version. The "Pro" flavors of Windows are quite a bit more pleasant and I don't think there is any downside even on a home computer.
below-zero days are really really crazy cold, and above-100 days are really really crazy hot. I don't think the fact that things occasionally exceed the 0-100 "normal" range makes it less useful; if anything, the out-of-bounds numbers emphasize the severity of the temperature. it's common where I grew up in the midwest US to hear "wow, it's going to be BELOW ZERO" as a way to express extreme cold
For me personally, "really really cold" starts below -30°C and crazy hot is above +30°C. It's very subjective, and outside of the US many areas have climates where Fahrenheit doesn't make sense at all.
maybe that's why it's popular in the US? for most of this country the 0-100 range works quite well to describe the normal range of outdoor temperature. we seem to like 0-100 ranges; for instance, speed in MPH works out nicely... "over 100 MPH!" is a common expression among drivers for extreme speed. school grades are often a value out of 100, etc. which makes you wonder why we don't prefer metric lol
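For concreteness, here's a minimal sketch of how the two scales line up, using the standard conversion formulas (the function names are just for illustration):

```python
def c_to_f(c):
    """Convert Celsius to Fahrenheit: F = C * 9/5 + 32."""
    return c * 9 / 5 + 32

def f_to_c(f):
    """Convert Fahrenheit to Celsius: C = (F - 32) * 5/9."""
    return (f - 32) * 5 / 9

# The 0-100 F "normal outdoor" range maps to roughly -17.8 C to 37.8 C:
print(round(f_to_c(0), 1))    # -17.8
print(round(f_to_c(100), 1))  # 37.8

# The -30 C / +30 C subjective extremes mentioned above, in Fahrenheit:
print(c_to_f(-30))  # -22.0
print(c_to_f(30))   # 86.0
```

So the Fahrenheit 0-100 band is a bit wider on the cold end and narrower on the hot end than a -30 to +30 Celsius band, which is roughly why each side finds its own scale "natural."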
And then Gary Kildall also seemed to like it with CP/M and PL/M, but those were after IBM had used it and I'd guess Gary was just copying IBM.
Between just those two influences you cover a huge portion of the mainframe and microcomputer worlds during the 60s-80s.