Beyond some specific use cases, they don't seem to scale cognitively. The tangle of connections is a problem.
Tons of things like this were built in Smalltalk. (Including a UI->Domain model connection layer in the IBM VisualAge Smalltalk IDE.) They all had scaling problems, especially of the "they don't seem to scale cognitively" kind.
It's not as if the problem doesn't exist in most codebases. It's more that the problem is invisible without such tools. Tools making the tangle visible make themselves seem unusable.
The fundamental problem is that we don't have ways of introspecting these horrendous relationship graphs for specific contexts. If IDEs and other programming tools could generally create custom browsers/IDE windows based on queries like:
"All of the methods that contain references to ClassA.Member1 and ClassB.Member2 which also call function Y."
...where this query can be modified or further specified at runtime, then there could be specific built-in queries that cover everything touched by canned refactorings, and these could then be further intersected or unioned.
EDIT: Forgot to complete my thought. If the graphical diagram could show contextually relevant slices of the system, it would greatly cut down the confusing web aspect of the diagrams.
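Here's a rough sketch of what composing such a query could look like, in Go for concreteness; the index record, type names, and the query function are all invented for illustration, not any real IDE's API:

    package main

    import "fmt"

    // MethodRef is a hypothetical record from a code index: one method plus the
    // members it references and the functions it calls. A real IDE would build
    // this index itself; here it's hand-written sample data.
    type MethodRef struct {
        Method     string
        References map[string]bool // e.g. "ClassA.Member1"
        Calls      map[string]bool // e.g. "Y"
    }

    // query returns the methods that reference every symbol in refs and call fn.
    // Result sets from several such queries could then be intersected or unioned.
    func query(index []MethodRef, refs []string, fn string) []string {
        var out []string
        for _, m := range index {
            ok := m.Calls[fn]
            for _, r := range refs {
                ok = ok && m.References[r]
            }
            if ok {
                out = append(out, m.Method)
            }
        }
        return out
    }

    func main() {
        index := []MethodRef{
            {"Foo.bar", map[string]bool{"ClassA.Member1": true, "ClassB.Member2": true}, map[string]bool{"Y": true}},
            {"Foo.baz", map[string]bool{"ClassA.Member1": true}, map[string]bool{"Y": true}},
        }
        // "All of the methods that contain references to ClassA.Member1 and
        // ClassB.Member2 which also call function Y."
        fmt.Println(query(index, []string{"ClassA.Member1", "ClassB.Member2"}, "Y")) // [Foo.bar]
    }

A custom browser window would then just be a live view over a result set like this, re-run as the query is modified.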
What I've found is that many times, people like the perceived confidence that obstinacy can bring.
The problem with that method of evaluation is that it's not First Principles. Basically, pg's essay in this case just reduces to, "Is that person steered by First Principles thinking?"
Most people ain't steered by first-principles thinking, though, and that's the problem. To most people, first-principles-driven thinking lacks sufficient actionability; they just want definite answers, and first-principles-driven thinking tends to produce answers that are anything but definite.
First principles are great in principle, but what really makes for greater thinking is focusing on the reality and details of a problem, then picking applicable first principles. Often when I hear principled stances they’re entirely devoid of links between the real world and the utopia in the person’s head.
Each level up increases the concentration of toxin because the n-th level is eating the (n-1) level which has a higher concentration than the (n-2) level that the (n-1) ate.
Similarly, if we posit that, all else being equal [1], a sociopath is more likely to move up a hierarchical level, then the nth level is promoted from the (n-1) level, which is already more sociopathic than the (n-2).
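A toy formalization of that compounding (my own simplification, assuming a constant magnification/selection factor k per level, which the real situation certainly doesn't have exactly):

    c_n = k \, c_{n-1} = k^n \, c_0, \qquad k > 1

So whatever is being concentrated, toxin or temperament, grows geometrically with height in the hierarchy under that assumption.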
I also believe that presidential democracies are more prone to this concentration of sociopaths, because voting for the higher offices is more divorced from the voter (i.e. you don't know who you are voting for personally and are more easily misled). Parliamentary (or congressional-seat) democracy is more resistant to psychos. Monarchies are immune (except via genetic inheritance), but of course come with their own set of problems.
This risk should be evaluated iteratively. It won't kill Apple if some small fraction of the time, their releases are delayed. Overall, if their record is that they have stuff their competitors can't even get their hands on, they win.
Even though it's not technically illegal, there's something about this that feels unfair and bad for the market.
They're certainly the leading player in terms of cutting-edge capabilities, but there are plenty of companies working on fabs. It wasn't long ago that Intel had the leading fab and simply got complacent/slow, opening the door for TSMC to surpass them.
In the macro sense there seems to be more widespread awareness and money going towards fab competition than ever before. I suspect we'll trend closer to performance parity between competitors in the coming years.
Nobody will be able to compete against Apple CPUs while they're 1 process node ahead of competitors though.
I recently moved to Linux from macOS, but I'll probably have to go back to Apple on the M4 for local LLM capabilities (high RAM specs are much cheaper than commercial GPUs). Too bad Asahi is still missing some pretty critical capabilities (microphone, USB-C monitor connection, etc.).
> Nobody will be able to compete against Apple CPUs while they're 1 process node ahead of competitors though.
If this artificial advantage allows Apple to slack off in their other areas of competitive advantage, then this is bad for the consumer overall. It creates an environment of cynicism, where there's even less incentive to try to beat Apple with a better product.
If Apple slacks off in any way, some other company can step in with innovation, and consumers benefit from increased choice and innovation. We shouldn't worry when a company slacks off, except when its moats are illegal. If Apple's business power were properly regulated, they'd be forced to compete more of the time, and when they didn't, they'd face more consumer-friendly, Apple-unfriendly results. Fix the antitrust enforcement and we'll fix the problem.
TSMC being the only game in town is what's bad for the market.
Beyond some threshold, any meta-gaming of the market is bad for the market. It's like patents. When they were first instituted, they did some good by incentivizing innovation. Then, people started meta-gaming them, and used them as legal weapons to suppress competition well beyond the original intention.
I don't think there's a good argument that monopolizing a feature is good for the market, even if the means that allow it are technically legal.
What could be argued is whether legal remedies to this kind of meta-gaming might also be meta-gamed in return and become worse than the thing they were trying to remedy. This seems to always happen, in a variety of forms.
> What makes Lisp macros so intuitive is not the prefix notation
The culture is the ultimate "reality distortion field." Prefix notation would be seen as intuitive, if that were culturally established. We'd see something like PEMDAS as arbitrary and silly.
Just look at how much content there is around PEMDAS and the interpretation of math problems. Clearly, it really isn't "intuitive"; we just have it enshrined in the culture. (That said, one of the biggest UX mistakes the designers of Smalltalk made was to eschew operator precedence: 3 + 4 * 5 in Smalltalk evaluates strictly left to right to 35, not 23!)
My intention wasn't to argue whether or not prefix notation is intuitive. The point I wanted to make was that the intuitiveness of Lisp macros is mostly unrelated to the use of prefix notation. Homoiconicity matters a lot more for macros.
We could potentially get over the bikeshedding by letting everyone configure their IDEs per their own taste for syntax.
We Smalltalkers were discussing doing this at Camp Smalltalks in the 2000s.
I'm currently working in golang, and I've noticed that the Goland IDE expends quite a bit of compute indexing and parsing source files. Not only that, but a significant portion of its bug fixes have to do with this, and the primary motivation for restarting Goland is stale indexing.
Wouldn't tools like git simply work better if they were working off some kind of direct representation of the semantic programming-language structures? Merging could become 100% accurate, for one thing. (It currently isn't, for some edge cases, though many might mistakenly think otherwise.)
> Merging could become 100% accurate, for one thing.
How so? Merge conflicts don't arise from the inability to locate the proper change, but from the inability to decide which of the changes, if any, would be proper.
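As a rough sketch of what "working off the semantic structures" could mean, here's a toy Go program (my own illustration, not any real merge tool) that parses two versions of a file with go/parser and reports which top-level functions each side changed. When the sets are disjoint, a declaration-level merge is unambiguous; when they overlap, you hit exactly the "which change is proper?" problem raised above:

    package main

    import (
        "fmt"
        "go/ast"
        "go/parser"
        "go/printer"
        "go/token"
        "strings"
    )

    // funcBodies parses src and returns each top-level function's printed source,
    // keyed by name, so versions can be compared structurally instead of by line.
    func funcBodies(src string) map[string]string {
        fset := token.NewFileSet()
        file, err := parser.ParseFile(fset, "x.go", src, 0)
        if err != nil {
            panic(err)
        }
        out := map[string]string{}
        for _, d := range file.Decls {
            if fn, ok := d.(*ast.FuncDecl); ok {
                var b strings.Builder
                printer.Fprint(&b, fset, fn)
                out[fn.Name.Name] = b.String()
            }
        }
        return out
    }

    // changed lists function names whose printed form differs between base and edited.
    func changed(base, edited map[string]string) []string {
        var names []string
        for name, body := range edited {
            if base[name] != body {
                names = append(names, name)
            }
        }
        return names
    }

    func main() {
        base := "package p\nfunc A() int { return 1 }\nfunc B() int { return 2 }\n"
        ours := "package p\nfunc A() int { return 10 }\nfunc B() int { return 2 }\n"
        theirs := "package p\nfunc A() int { return 1 }\nfunc B() int { return 20 }\n"

        b := funcBodies(base)
        fmt.Println("ours changed:  ", changed(b, funcBodies(ours)))   // [A]
        fmt.Println("theirs changed:", changed(b, funcBodies(theirs))) // [B]
        // Disjoint sets => a declaration-level merge is unambiguous here;
        // overlapping sets would still need a human (or a policy) to decide.
    }

Even this sketch concedes the point above: structure awareness can shrink the set of apparent conflicts, but it can't decide between two genuine edits to the same declaration.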
I think there's some connection between your comment:
For example, if outside developers had more capability to explore different ways...
...and these 2 sentences from the article:
For the sake of this argument, let’s posit that there exist tens of millions — perhaps 100 million — users who love the iPad for what it is. People who feel empowered, not hamstrung, by how it works, and who have no or very little need for a computer that exposes the complexity of a desktop OS like MacOS or Windows. And that there exist tens of millions more people who enjoy having an iPad to complement, not replace, their desktop computer.
If someone did come up with an iPad competitor that had all of the ease of use, UX simplicity, and ecosystem benefits, but also better enabled power users with something nearer a desktop replacement, then this would simply kill the iPad.
The fact that Apple seems reticent to do this themselves indicates a kind of organizational blind spot, driven by self-interest. Either that, or there are typical Apple long-term plans that we're not fully aware of, and they're taking their own sweet time.
> I tried everything to make it more tolerable, and hot water was by far the best. The effect didn't last forever, but it was remarkable how it a) was actually pleasurable and b) muted the itchiness
I've been thinking of a "low fantasy" story, which is actually sci-fi under the covers. In it, the "fey" characters are just indigenous people who have immunity to a plant similar to poison oak, but one that grows in nigh-impenetrable, hedge-like clumps and walls. Your mention of hot water for relief gave me an idea for a story beat, where another character discovers the hot-water effect and, at the same time, discovers how to infiltrate the "fey" characters' territory and their bathing practices, which resemble Japanese and Finnish bathing.
https://www.youtube.com/watch?v=l3whaviTqqg