Hacker News | barrkel's comments

Incentives rule everything.

For the Romans, winning wars was the main source of elite prestige. So the Empire had to expand to accommodate winning more wars.

Today, the stock market and material wealth dominate. If elite dominance of the means of production requires the immiseration of most of the public, that's what we'll get.


> For the Romans, winning wars was the main source of elite prestige. So the Empire had to expand to accommodate winning more wars.

That's almost 100% backwards. The Republic expanded. The Empire, not so much.


Isn't that burying the lede on a technicality?

GP appears to be using "empire" as in "imperialistic" instead of as in "emperor".

Look here: https://vulert.com/vuln-db/CVE-2025-48633

It has to do with setting the device owner and gaining those powers: enabling/disabling apps, remote wipe, etc. It's a local privilege escalation and doesn't require user interaction.


How do you e.g. validate that a database product works with all the different cloud databases? Every time you change up SQL generation you're going to want to make sure the SQL parses and evaluates as expected on all supported platforms.

Those tests will need creds to access third party database endpoints.
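A rough sketch of how that cross-platform validation might be wired up. The env-var names and the `run_query` callable are illustrative, not any real product's API; the idea is just that the suite runs against every backend whose credentials are configured and skips the rest:

```python
import os

# Hypothetical env-var names: each would hold a DSN with credentials
# for a live test endpoint on that cloud platform.
BACKENDS = ["POSTGRES_TEST_DSN", "MYSQL_TEST_DSN", "BIGQUERY_TEST_DSN"]

def backends_with_creds():
    """Return only the backends whose credentials are configured, so CI
    runs the cross-database suite where secrets exist and skips where
    they don't."""
    return [name for name in BACKENDS if os.environ.get(name)]

def check_sql_everywhere(sql, run_query):
    """Run one piece of generated SQL against every configured backend.
    `run_query(dsn, sql)` stands in for the product's driver layer; the
    point is that the same generated SQL must parse and evaluate as
    expected on every supported platform."""
    return {name: run_query(os.environ[name], sql)
            for name in backends_with_creds()}
```

In CI, the DSN secrets would only be injected on trusted branches, which is exactly why these tests need third-party credentials.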


Transitive dependencies?

Yeah, that only works if all the Actions you use pin SHAs too, which is not the case.

Positive example: https://github.com/codecov/codecov-action/blob/96b38e9e60ee6... Negative example: https://github.com/armbian/build/blob/54808ecff253fb71615161...


I've also found many Actions that do other dodgy stuff, like pulling and executing unpinned scripts from external websites, or installing unpinned binaries from GitHub releases. Pinning an Action isn't enough, you have to audit it.

Specifying the top-level hash does nothing to pin transitive dependencies, and as the article points out, transitive dependencies - especially dependencies common to a lot of Actions - would be the juiciest target for a supply chain attack.
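As a sketch of what a top-level pin audit might look like, here's a small checker that flags `uses:` refs in a workflow that aren't pinned to a full 40-character commit SHA (the example snippet and its SHA are made-up placeholders):

```python
import re

# An action ref counts as pinned only if it ends in @<40-hex-char SHA>;
# tags like @v4 or branches like @main are mutable and can be retargeted.
PINNED = re.compile(r"@[0-9a-f]{40}\s*(#.*)?$")
USES = re.compile(r"^\s*-?\s*uses:\s*(\S.*)$")

def unpinned_uses(workflow_text):
    """Return action refs in a workflow that are not SHA-pinned.
    Note the limitation from the thread: this only audits the top
    level; a pinned Action can still pull unpinned transitive actions,
    scripts, or binaries itself, so pinning is no substitute for an
    audit."""
    hits = []
    for line in workflow_text.splitlines():
        m = USES.match(line)
        if m and not PINNED.search(m.group(1).strip()):
            hits.append(m.group(1).strip())
    return hits

# Illustrative workflow snippet (the SHA below is a placeholder):
example = """\
jobs:
  build:
    steps:
      - uses: actions/checkout@v4
      - uses: some/action@0123456789abcdef0123456789abcdef01234567
"""
```

Running `unpinned_uses(example)` reports only the `@v4` ref; the SHA-pinned one passes, even though, per the point above, that says nothing about what it does internally.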

Ah, I see it now. Thanks!

No, it's more like they sell to a bunch of verticals and have checklist-oriented market segmentation, where they can eke out a little extra profit selling a specific line to a specific industry because it needs a specific doodad or certification.

This has got to be part of it at least for some industries. I never understood why marketing teams would keep things so convoluted on purpose. Seemed like the exact opposite of what they should be doing.

Of course, different strokes for different folks. My favorite laptop at this point is a terminal for my home workstation, which is far more powerful than anything mobile. That means I prioritize decent graphical performance, an OLED screen, and long battery life, and I don't really mind using WSL on Windows. My development doesn't really happen locally any more.

It's got that AI smell. The word choice. Antithesis ("not this, but that"). Persuasion words.

There's meat in it, it's not pure slop, but it was definitely fed through the slop factory.

And as some have mentioned, the facts are a bit dubious.


I feel this vibe. It's part of the reason I invested in a monster home workstation: a Threadripper 9995WX, 768 GB of ECC RAM, a 96 GB Blackwell Pro. I expect it may easily be the last proper home PC I buy that is scaled in line with the scaling I grew up with in the 80s and 90s.

Increasingly, what we have are mobile terminals - possibly with a dock for monitor connections - for remote big iron. And the continuous push from governments for more control - seemingly synchronous demands for age gating (i.e. requiring IDs) and chat snooping - makes me think this remote hardware won't really be yours before long.

Windows, caught up in the LLM mania, is to be left by the wayside too.


The Terry Davis tinfoil-hat version of me has a theory that this wider industry trend of pushing consumers away from general-purpose home computers, towards remote datacenters accessed from locked-down mobile/thin edge devices, is supported by both industry and governments because:

Number one, you become a recurring subscription instead of a one-and-done sale, making it incredibly profitable for industry

And number two, the government can more easily snoop on your data when it's all in the cloud versus an HDD box in your closet.

Granted, I think we're far from that future, but I do feel it's the future the powers that be desire, and they can use various mechanisms to force that behavior: convenience and pricing, for example making PC parts too expensive for consumers while subsidizing cloud and mobile devices to accelerate the move. Once enough consumers only know how to use Apple or Google devices, they'll be less inclined to spend more money to build a PC and learn what a Linux is.


I'm cynical enough to not argue against your number two theory.

But the first one? I'm less convinced. I think the underlying assumption is that companies look to make the most money off consumers. I can get behind that.

But just the other day I was looking at a new GPU and considered running a local LLM. I was looking at spending no more than 1000 €. That would get me a 5070 Ti 16 GB. Not enough to reasonably run anything interesting. I'm not looking to "tinker with things", I want to actually use them, mostly for coding. A JetBrains subscription would run less than 10 € per month [0], and keep me up to date with the evolution of things. My 5070 Ti would be stuck at its mostly useless level forever, since I don't see requirements going down any time soon. If prices didn't change, I'd need more than 100 months, over 8 years, to break even. And during those 8 years, I'd never have a decent LLM experience by buying outright.

Sure, this would be a capable GPU for other uses. But in my case, it would just sit around under my desk heating up my room.

---

[0] You'd get a 20 € discount if paying for a whole year upfront (10/month, 100/year). I'm also excluding VAT, since I'd buy this for work and have a VAT registered company.
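The break-even arithmetic above, spelled out as a quick sketch using the comment's own ex-VAT figures (not current quotes):

```python
gpu_cost = 1000        # € for a 5070 Ti 16 GB, bought outright
sub_per_month = 10     # € JetBrains subscription, ex VAT

breakeven_months = gpu_cost / sub_per_month   # 100 months
breakeven_years = breakeven_months / 12       # ~8.3 years
print(breakeven_months, round(breakeven_years, 1))
```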


My suspicion is it’s mostly a result of the slowing pace of computing growth.

When computers go obsolete in 2 years, nobody wants them on their balance sheets; when they last 6, sales go down and you get more years of profit from owning them. This hasn't been an instantaneous transition, but a slow shift in how the industry operates.


Whoa, that definitely sounds like a monster rig! Out of curiosity, how much did that cost? A Blackwell alone can be $10k+! It's definitely an investment, especially since it may soon become a relic, not in terms of specs, but in terms of what manufacturers will actually end up making.

A little over 20k CHF. I delegated building (and ensuring the parts work together and testing) to Dalco, a Swiss company that specializes in academic compute clusters and the like. The price was competitive with building it myself.

The CPU is more expensive than the workstation Blackwell card. The 8x 96 GB DIMMs (96 GB was at a corner in the price-per-GB curve; 128 GB was more expensive per GB) are also more expensive now than the GPU. In fact, prices for that kind of package on eBay seem to be approaching the price of the entire box.


The non-convex side mirror almost got me into an accident in the first rental car I drove in the US. I was expecting to see more of the road than I did.
