Sometimes, in the interest of having something rather than nothing, I have to press publish. This entails getting things wrong, which is regrettable.
I will say that I'm trying to steelman the code-as-assembly POV, and I don't think the exact historical analogy is critical to it being right or wrong. The main thing is that "we've seen the level of abstraction go up before, and people complained, but this is no different" is the crux. In that sense, a folk history is fine as long as the pattern is real.
This is an interesting distinction, but it ignores the reasons software engineers do that.
First, hardware engineers are dealing with the same laws of physics every time: materials have known properties, and so on.
Software: there are few laws of physics (mostly performance and asymptotic complexity). Most software isn't anywhere near those boundaries, so you get to pretend they don't exist. If you get to invent your own physics each time, yeah, the process is going to look very different.
For most generations of hardware you're correct, but not all. For example, high-k dielectrics were introduced to mitigate gate tunneling. Sometimes, as geometries shrink, the physics involved does change.
This just doesn't explain things by itself. It doesn't explain why humans would care about reasoning in the first place. It's like explaining all life as parasitic while ignoring where the hosts get their energy from.
Think about it: if all reasoning is post-hoc rationalization, reasons are useless. Imagine a mentally ill person on the street yelling at you as you pass by: you're going to ignore those noises, not try to interpret their meaning and let them influence your beliefs.
This theory is too cynical. The real answer has got to have some element of "reasoning is useful because it somehow improves our predictions about the world."
Skills won't use less context once invoked; the point is that MCP in particular frontloads a bunch of stuff about the entire API surface area into your context. So even if the agent never invokes the MCP, it's costing you.
That's why it's common advice to turn off MCPs for tools you don't think are relevant to the task at hand.
The idea behind skills is that they're progressively unlocked: they only take up a short description in the context, relying on the agent to expand things if it decides they're relevant.
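A minimal sketch of the context-cost difference (all names here are illustrative, not any real MCP or skills API): the MCP style injects every tool's full schema up front, while the skill style injects only a one-line summary and leaves the full instructions on disk until the agent asks for them.

```python
# Illustrative sketch: context cost of MCP-style vs skill-style registration.
# These dicts and names are hypothetical, not a real MCP or skills API.

MCP_TOOLS = {
    # MCP frontloads every tool's full schema into the context,
    # whether or not the agent ever calls it.
    "create_invoice": "create_invoice(customer_id: str, amount_cents: int, "
                      "currency: str, due_date: str, line_items: list) -> Invoice",
    "refund_invoice": "refund_invoice(invoice_id: str, amount_cents: int, "
                      "reason: str) -> Refund",
}

SKILLS = {
    # A skill contributes only a short summary up front...
    "invoicing": {
        "summary": "Create and refund invoices.",
        # ...and the full instructions live on disk, loaded only if
        # the agent decides the skill is relevant.
        "body_path": "skills/invoicing/SKILL.md",
    },
}

def mcp_context() -> str:
    """Tokens paid unconditionally under MCP-style registration."""
    return "\n".join(MCP_TOOLS.values())

def skills_context() -> str:
    """Tokens paid unconditionally under skill-style registration."""
    return "\n".join(f"{name}: {s['summary']}" for name, s in SKILLS.items())

print(len(mcp_context()), len(skills_context()))
```

The point of the sketch is just that the unconditional cost of the skills dictionary stays small no matter how large each skill's body grows.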
What Stripe did for payments, Pylon is doing for the mortgage industry: We're taking a sleepy industry with backward technology and re-building the stack from the ground up. We're first-principles thinkers, and our team is small, talented and ambitious.
I'm hiring generalists who love coding and want to build something beautiful in an industry where technology written in the 90s is the norm. We're Series A, well funded, and we have traction with customers. Come to Menlo Park and help us turn the $13 trillion US mortgage industry into a set of APIs.
And how hard it is shows that ZFS didn't make a bad choice in not attempting the same. (Though it would be interesting if either had aimed to be a clone, that is, with the same on-disk data structures. Interesting, but probably a bad decision: I have no doubt there is something about ZFS its authors regret today, simply because the project is more than 10 years old.)
We did this at Stripe when deprecating TLS 1.0, and called it a brownout (I don't know the origin of the term in software).
You do it when you have a bunch of automated integrations with you and you have to break them. The lights aren't on at the client: their dev teams are focused on other things, so you have to wake them up to a change that's happening (either by causing their alerting to go off, or by causing their customers to complain because their site is broken).
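The mechanics can be as simple as rejecting a small (and gradually growing) fraction of requests that still use the deprecated protocol, so client-side alerting fires well before the hard cutoff. A sketch of that idea, with hypothetical names (this is not Stripe's actual implementation):

```python
import random

# Sketch of a brownout for a deprecated protocol version.
# Names and rates are illustrative, not Stripe's actual implementation.

BROWNOUT_REJECT_RATE = 0.05  # start small, ramp up over weeks

def handle_request(tls_version: str, reject_rate: float = BROWNOUT_REJECT_RATE) -> int:
    """Return an HTTP status code for an incoming request."""
    if tls_version == "TLSv1.0" and random.random() < reject_rate:
        # Intermittent 503s trip the client's alerting without
        # taking their integration fully down.
        return 503
    return 200
```

In practice you'd run the brownout in announced windows and raise the rate over time, so a client who misses the email still gets an unmistakable (but recoverable) signal.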