
I agree that the word “just” carries that connotation, but I disagree that it’s a bad thing. When I ask if we can just do something, my intent is exactly to communicate that I think the thing is simple, that the details are unimportant (to me, to us), and that it ought to be easy to do (and if it’s not, that’s a problem in and of itself).

A lot of things are like this, and so to excise the word “just” would be to stop using a word that often concisely and accurately conveys what I’m trying to say.

It would be better if the article just said “this is rude.”


There are nuances to this, too, though: You can be communicating

> I think the thing is simple, that the details are unimportant (to me, to us)

…and simply be ignorant of a lot of the details that make this not simple for the person who has to "just".

I had such a "discussion" with someone who did exactly this and then refused to even acknowledge that there were technical details in their "just", and that their "just" involved multiple wasted person-days of effort (in this case for little benefit, as their "just" was to paper over them having to do something themselves). It's infuriating.

Now, the "just" isn't the only part of the problem here, but it will most likely be the part where any useful discussion breaks down.

And, while I can blame my encounter on a person with… problematic particularities, it isn't obvious to me that one would always be able to discern easily if one is walking into the same trap.


I've never seen so many epicycles in my life...


Back when I was building an IDE with a custom text editor, I initially used embedded Neovim, thinking I would get the entire Vim feature set out of the box for free. Unfortunately, this became a never-ending source of bugs. I think the fundamental problem was that my application was structured like a game, with a main loop in a single thread, while Neovim turned text editing into an async operation: I had a separate thread reading events from Neovim and then trying to safely update the global buffer.
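
Roughly the shape of the thing (a minimal sketch, not my real code; EmbeddedEditor and its next_event() are hypothetical stand-ins for the Neovim RPC connection). The point is that the reader thread only enqueues, and the single-threaded main loop is the only place the buffer gets mutated:

    import queue
    import threading

    class EmbeddedEditor:
        """Hypothetical wrapper around the embedded editor's RPC channel.
        next_event() blocks until the editor emits a (kind, payload) tuple."""
        def next_event(self):
            raise NotImplementedError

    events = queue.Queue()  # thread-safe handoff point

    def reader(editor):
        # Worker thread: reads from the editor and enqueues, nothing else.
        while True:
            events.put(editor.next_event())

    def main_loop(editor, buffer):
        threading.Thread(target=reader, args=(editor,), daemon=True).start()
        running = True
        while running:  # the game-style main loop
            # Drain everything that arrived since the last frame; only this
            # thread ever mutates the buffer.
            while True:
                try:
                    kind, payload = events.get_nowait()
                except queue.Empty:
                    break
                if kind == "lines_changed":
                    first, last, lines = payload
                    buffer[first:last] = lines
                elif kind == "quit":
                    running = False
            # ... handle input, render the frame, etc.

Even with everything funneled through one queue like this, the editor's idea of the buffer and the application's copy can drift between frames, which is roughly the kind of synchronization problem I kept hitting.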

Also, I was constantly fighting/reverse engineering Neovim to get the granular level of control over behavior that I needed for a seamless integration. It’s just a type of programming that’s extremely frustrating and not fun.

In the end I implemented custom vim emulation from scratch and surprisingly it wasn’t that hard to get the “20% of features that people actually use 80% of the time,” except it’s more like 5% and 95%, and in exchange I could own the whole stack instead of depending on a third party black box. Never been happier to delete a whole subsystem of code in my life.


I've always heard "auth" to mean authentication and "perms" to mean authorization.
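
(Toy illustration, not anyone's real API; the USERS table and function names are made up.) Authentication establishes who the caller is, authorization/permissions decide what they may do:

    # Toy sketch: "auth" (authentication) vs. "perms" (authorization).
    USERS = {"alice": {"password": "hunter2", "perms": {"read", "write"}}}

    def authenticate(username, password):
        """Authentication: establish *who* the caller is."""
        user = USERS.get(username)
        return username if user and user["password"] == password else None

    def authorize(username, action):
        """Authorization / permissions: decide *what* they may do."""
        return action in USERS.get(username, {}).get("perms", set())

    who = authenticate("alice", "hunter2")
    if who and authorize(who, "write"):
        print(f"{who} may write")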


I dunno. On the one hand I hate “web dev” more than anyone. I think it has led to such an astronomical decline in software quality that if you described it to someone from the days when computers were 1000x slower, they straight up wouldn’t believe you.

That said… the article doesn’t really ring true to me. What he is saying about the complexity of each part of the stack (HTTP, HTML/DOM, CSS) is technically true, but that’s not really how it washes out in practice. This whole “CSS is a complex graphics engine!” “HTTP is a protocol you could write a whole dissertation about!” sounds like an argument being made by someone trying to make a rhetorical point about the web. In practice, for most of web dev you don’t need to understand the deep nuances of CSS or HTTP or whatever. Yes, there is a large breadth of material you have to learn, but the depth you actually need in any one area is much less than the author is trying to imply.

And yes, web is trash, but for different reasons. In fact some of those reasons are the opposite of what the author is saying. He says that each part of the stack is so complex it should be a separate specialty. But the real problem is the very fact that things are so complex. Rather than accept that complexity and subdivide the field into different disciplines, we should get rid of all this unneeded complexity to begin with.


He does also point out that CSS Grid or HTML tables haven't changed. The web still mostly works the same.

You are yet another perfect example of raw antagonism against the web, a body of hate. You are legion. But, if we look at the arguments here, look at where complexity dwells, the things that are hard and changing aren't the fundamentals, aren't the essentials. They are not so complex.

What is hard/changing is state management. What is hard/changing is handling state in client-server or other connected architectures. What is hard/changing is being smart about offloading work to threads. And it's not like anyone else has conquered this complexity. None of the other ecosystems are particularly far in advance. The complexity of these cases seems to be inherent, not accidental.

The reason for so much complexity is because we change & improve & progress. This makes some people very upset. People drastically over-ascribe the woes of the software development world to the web, when really it's just that the web is now the default place for making software & most companies would bungle up these concerns no matter what platform they were building atop.


> On the one hand I hate “web dev” more than anyone. I think it has led to such an astronomical decline in software quality that if you described it to someone from the days when computers were 1000x slower, they straight up wouldn’t believe you.

Nearly all of this, IMO, can be explained by a lack of passion.

I grew up on computers, starting in the 90s. I didn't have internet access until near the end of the decade, and it was slow dial-up. If you broke the family computer, you had to figure out not only how you had broken it, but how to fix it. When I found Linux (Gentoo, obviously, because it's way more fun to spend days tweaking CFLAGS than to use the software), I was also thrust into forum culture, which was rife with RTFM. You quickly learn either to search and read docs and demonstrate a modicum of capability and effort, or you lose interest and do something else.

This is not the case now. Even before the advent of LLMs, it wasn't that hard to find the answer to most of your questions on SO. The rise of cloud computing means that you don't have to know how to run a computer, you just have to know how to talk to an API – and even then, only at a surface level. You can pretend that TCP doesn't exist, have no idea how disks magically appear on-demand (let alone what a filesystem is), etc. Databases, which used to be magical black boxes that you accessed via queries carefully written by wizened graybeards, are now magical black boxes that you abuse with horrifying schema and terrible queries. Worse, you don't even have to know their lingua franca, and can commit your crimes via an ORM, which probably sucks.

And for all of this abstraction, you are paid handsomely. The difficulty in landing your first job is tremendously high, sure, but the payoff is enormous. Once you're in, you'll find that the demand of most businesses is not to upskill, but to push features out faster. Grow the user base, beat others to market, and dazzle VCs. No one has time for doing things right, because that slows down velocity.

This is aided and abetted by Agile, specifically Scrum. Aside from maintaining the cottage industry of Agile consultancies, it's designed to turn software production into a factory, where no one person really needs to know how to do anything tremendously complex. Instead of insisting that people learn how to do difficult things, we spend hours per week breaking down tasks into bite-sized increments with absurd abstractions of time.

"Thought leaders" deserve a callout here as well for their contributions to this mess. Microservices are a great example. A potentially useful architecture in some circumstances has been turned into gospel that people buy into without question, and apply everywhere regardless of its utility or fit. If you're lucky, someone eventually notices that rendering a page seems to take a lot longer than it should, but more often than not this is met with a shrug, or at best blaming the DB and paying AWS for a larger instance. Multiple network calls that each have to traverse N layers of routing and queues is slower than calls within a single process? Color me surprised.

When you combine the allure of a high-paying job that has little barrier to entry with no business incentives to do things differently, you get what we have today.


Why are financial crimes morally exempt in a way that, say, violent crime is not, even if the aggregate damage from the financial crime is greater? Unless you’re arguing that this isn’t possible?


I agree. In my corner of the universe (the US) I think the SEC, FINRA, DOJ, and FBI need the pendulum to swing back 35% to correct for being too nice.

What would happen if we had two kinds of C-corps? One fairly, if strictly, regulated, leading to fines, civil cases, and criminal cases in exceptional situations.

And option two ...

Very light regulation, but if a regulatory agency or the FBI finds malfeasance, only criminal charges: jail time, loss of law licenses, inability to serve at public companies, and more, i.e. hardball.

Maybe the elites should choose their risk level. I think it'd say a lot.


What’s wrong with that argument? Doesn’t getting more money from the user mean that the user is choosing to pay more, which reveals their preference?


*In a healthy, competitive market with a consumer base that's well educated on the potential consequences of their purchases.

I do agree with the principle that a space-hogging, marginally profitable business is detrimental to a community. Just that the converse is not necessarily true; profitability does not imply benefit to the community.

Humans do not fit the model of "rational self-interested agent" commonly applied for economic models. Gambling and addictive substances are two hugely profitable business sectors that would not exist if it were remotely accurate.

I'll also preempt someone's inevitable assertion that the burden of verification should lie on the consumer. In an informationally antagonistic environment, it's absurd to expect each individual to individually vet every service and product. That's a phenomenal waste of labor that favors only well-funded organizations practiced in deception. Any rational group would pool resources and have a single org do the research and share it with everyone. Oops we've reinvented a government.


> Humans do not fit the model of "rational self-interested agent" commonly applied for economic models.

They generally do -- the misalignment comes from analyzing people's behavior according to presumptive interests which have been externally attributed to them, instead of observing behavior in order to ascertain what people's interests actually are.

> Gambling and addictive substances are two hugely profitable business sectors that would not exist if it were remotely accurate.

No, gambling and addictive substances exist because people enjoy them. Large numbers of people exhibit a manifest preference for short-term pleasure over long-term stability; expecting such people to act in ways that pursue long-term stability over short-term pleasure is itself irrational.

> I'll also preempt someone's inevitable assertion that the burden of verification should lie on the consumer. In an informationally antagonistic environment, it's absurd to expect each individual to individually vet every service and product.

Unfortunately, your attempt at preemption has failed. Only the consumer has the relevant criteria necessary to determine how well a given good or service fits his own particular needs or desires. Being rational, most other people intuitively use the experiences and advice of others as Bayesian indicators of product suitability or unsuitability (even if they don't know what Bayesian indicators are), but they're still using those external resources as tools with which to make their own decisions.

> Any rational group would pool resources and have a single org do the research and share it with everyone. Oops we've reinvented a government.

No, you've reinvented Consumer Reports. Except for the "single org" part, anyway -- there's no single determination that could be applicable to all people all the time, so people will naturally develop a variety of parallel solutions that apply different criteria to the evaluation process.


I take it you view drug / gambling addicts not as people with a mental health issue making irrational decisions, but rather as fully rational people who prefer "short-term pleasure over long-term stability"?


Absolutely. People make rational decisions to fulfill the motivations they actually have. But sometimes people, being complex creatures, have multiple conflicting motivations, where fulfilling one impedes another, which leads to psychological and emotional distress. So mental health does come into it, but as a matter of reconciling conflicting parts of one's own psyche, not as a matter of overcoming irrationality.

Or, to put it another way, the irrationality is a matter of having contradictory desires in the first place; choosing to act upon one and dismiss the other resolves the irrationality. The fact that some people make the trade-off in the opposite direction that you would doesn't make them irrational, it just demonstrates that people are different.


> Humans do not fit the model of "rational self-interested agent"

> Any rational group would pool resources and have a single org do the research and share it with everyone. Oops we've reinvented a government.

The first quote precludes the second.


I disagree... the difference lies in the definition of "humans" vs. "group". It's like Kay's line in the movie MIB: "A person is smart. People are dumb, panicky, dangerous animals and you know it."


"Any rational group would pool resources and have a single org do the research and share it with everyone. Oops we've reinvented a government."

I thought you were talking about Google Maps reviews, but OK.


No, it does not. That users end up paying more in no way means, nor should it be taken to imply, that they _choose_ to pay more.

If cheaper options are made less accessible or less clear, and customers are intentionally misled toward more expensive products, they will end up paying more as a result.


Also, there is a cost associated with searching. Consumers may intentionally forgo the effort for perceived low marginal gains (especially in nominal rather than percentage terms, e.g. "I'm not going to waste my time to save a quarter", even if the quarter is a significant percentage difference). This is one of the factors in the success of Amazon. People "value" convenience.


But people will not pay more than a product is worth to them. The fact that they are willing to purchase the product at a higher price point indeed does imply that that price point is still lower than the consumption utility of the product for them.


Do you always click on paid ads before the organic results? No? Oh, because you personally don't prefer them? Oh, you mean they're preferred by the minority that does click ads, over alternative paid ads?

Gee I wonder what's wrong with that argument.


I take it you have never used the healthcare system in the US?


Doesn't being scammed reveal your preference for getting scammed?

Sure it does. For some definitions of the word preference.


Do the cops “breaking procedure” know for certain if the accused are guilty? How can they?


Politicians know the electoral vote is what matters, so they adjust their campaign strategy to optimize for it.

Theoretically, after everything washes out, the electoral vote is what the people want, because it reflects which of the two candidates, both of whom are optimizing intensely for the electoral vote, is able to get it.


Having an existing example showing that something difficult is possible causes everyone else to replicate it much faster, like how a bunch of people started running sub-4-minute miles after the first guy did it.

