How did RC4 become so widespread when it came from a leak? Additionally, why was it the de facto standard stream cipher in the 90s, even though it was known to be flawed? Just the speed?
In addition to the other sibling comments, I think there's also a factor of greatly increased computing power. Back in the 90s and earlier, we generally just didn't have the computing power to encrypt everything with super-strong algorithms in real time. This probably also affected who could practically do development work on state-of-the-art algorithms.
I recall that, when it was originally created, SSL was a rarity, a thing only for your bank account and the payment page for online stores, because nobody could afford the CPU load to encrypt everything all the time. Now it's no big deal to put streaming video behind TLS just to ensure your ISP can't mess with it.
RSA was still selling RC4 as a product into the mid-2000s. While open source variants of RC4, often trying to avoid the RSA trademark by calling themselves things like ARCFOUR, started circulating in the 1990s, there was still a sense that RC4 was backed by a security company.
Also, even though flaws were discovered almost as soon as the open source variants reverse engineered the RC4 algorithm, it was one of those "flaws exist, but exploiting them needs things outside our current threat model" problems. It took a multi-stage, multi-year effort to get from the earliest flaw discoveries in the 90s to the most devastating exploits, developed around 2013-2015, that took advantage of those flaws in reproducible ways.
I also remember that in the 90s the reverse engineered, open source efforts felt like shining beacons of hope, much like PGP, for freeing "enterprise grade" security algorithms from trade secret-protected corporate and governmental interests and giving them to "the common people". RC4 was simple to implement and easy to reason about, but gave "good enough" security for a lot of uses, certainly far better than "no security unless you pay a company like RSA, and only if you don't plan to export your software outside of the US". That's why RC4 was the basis of a 90s idea called CipherSaber [1]: that you could implement your own security suite, one that you controlled and companies couldn't take from you.
Of course, things have shifted so much since the 90s, when security suites were trade-protected and export-controlled. The security through obscurity of keeping algorithms behind trade secret law is no longer seen as an advantage; treating the algorithm as public knowledge is now part of security suite threat models. Today's advice is to never write your own security suite, because there are several well regarded open source suites with many eyes on them (and, subsequently, vulnerability plans/mitigations). Governments in the internet age have had to greatly relax their import/export controls on cryptography. We live in a very different world from the one RC4 was originally intended to secure.
It's fast, easy to implement, has very concise code, takes any key length up to 256 bytes, comes from a famous cryptographer, and there weren't a lot of alternatives.
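Concise enough that a sketch fits in a comment. Here's a minimal C version of the key schedule and keystream loop (names like rc4_state and rc4_crypt are just illustrative, and RC4 is broken, so this is for reading, not for use):

    #include <stddef.h>
    #include <stdint.h>

    /* Minimal RC4 sketch -- illustration only, RC4 is broken. */
    typedef struct { uint8_t S[256]; uint8_t i, j; } rc4_state;

    /* Key-scheduling algorithm; keylen can be anything from 1 to 256 bytes. */
    static void rc4_init(rc4_state *st, const uint8_t *key, size_t keylen) {
        for (int i = 0; i < 256; i++) st->S[i] = (uint8_t)i;
        uint8_t j = 0;
        for (int i = 0; i < 256; i++) {
            j = (uint8_t)(j + st->S[i] + key[i % keylen]);
            uint8_t t = st->S[i]; st->S[i] = st->S[j]; st->S[j] = t;
        }
        st->i = st->j = 0;
    }

    /* PRGA: XOR the keystream into buf; the same call encrypts and decrypts. */
    static void rc4_crypt(rc4_state *st, uint8_t *buf, size_t len) {
        for (size_t n = 0; n < len; n++) {
            st->i = (uint8_t)(st->i + 1);
            st->j = (uint8_t)(st->j + st->S[st->i]);
            uint8_t t = st->S[st->i]; st->S[st->i] = st->S[st->j]; st->S[st->j] = t;
            buf[n] ^= st->S[(uint8_t)(st->S[st->i] + st->S[st->j])];
        }
    }

That's the whole cipher: a 256-byte state table, a swap, and an XOR, which is a big part of why it spread so widely.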
Because "everybody uses RC4" (the sibling comment from dchest is correct). There was a lot of bad cryptography in that period and not a lot of desire to improve. The cleanup only really started in 2010 or thereabouts. For RC4 specifically, its was this research paper: https://www.usenix.org/system/files/conference/usenixsecurit... released in 2013.
I think this is a really good question, for what it's worth. Best I can come up with is that, at the time, our block cipher blocks were mostly 8 bytes wide, which doesn't leave a lot of headroom for CTR.
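To make the headroom point concrete, here's a back-of-the-envelope sketch (assuming, hypothetically, that the 8-byte block is split into a 4-byte nonce and a 4-byte counter):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Hypothetical split: 4-byte nonce, 4-byte counter in an 8-byte block. */
        uint64_t blocks_per_nonce = 1ULL << 32;          /* counter wraps after 2^32 blocks */
        uint64_t bytes_per_nonce  = blocks_per_nonce * 8;
        printf("%llu GiB of keystream per nonce\n",
               (unsigned long long)(bytes_per_nonce >> 30));   /* prints 32 */
        /* And with a 64-bit block, birthday effects start to bite around
         * 2^32 blocks anyway, so the practical limit is even tighter. */
        return 0;
    }

Compare that with a 16-byte block, where neither the counter space nor the birthday bound is a practical concern.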
It's on the front page; that means it attracted attention and was upvoted. If what you are saying was true, these posts would die very quickly and we would never see them.
I guess if everyone thinks mocking people's projects and efforts is funny, it's okay!
My opinion, weakly held, is that this is tiring and borderline insulting to people who are genuinely looking for feedback and community. Clever once a year or so, but the creator has leaned into it and posted a lot of meta in a short span of time.
I already made my point. If the community agrees with you, then we won't see these on the front page anymore. If not, then you will either need to be okay with seeing more of them, or not read HN.
So he's saying we should be replacing the seniors with fresh grads who are good at using AI tools? Not a surprising take, given Amazon's turnover rate.
This is intentional: make people think there's nothing online except harmful content, then propose a regulatory solution, which creates a barrier to entry. It's Meta trying to stop any insurgent network.
It's also Meta overstating the power of influence. Why would they do that? Because it's good marketing for them to sell a story about how running ads on their services can be used for highly effective mass influence.
Yes, like the "Cambridge Analytica scandal." The "scandal" was people using their ad marketplace tools and a 100-line Python script to "hack" a presidential election. Also, the "Russian Election Interference" aka someone doing a $50k ad buy and "hacking" the presidential election.
They're squeezing their customers to juice their margins now that they've locked them in and become a monopoly/monopsony. This is the classic enshittification playbook.
Nobody is locked in (unless they made some incredibly bad decisions) and this is a tiny fee in exchange for a useful service. I’m just baffled by the response to this.
It's not baffling if you read his Enshittification book. This is phase 2.
In 2010, people were saying it was very reasonable to start prioritizing publishers' ability to reach you over your organic contacts. After all, Facebook is providing this utility for free; shouldn't they be able to extract some additional revenue from their platform? And here we are in 2025...
This proposal is worse because all the valuable regions of code will be clearly annotated for static analysis, either explicitly via a library/function call, or heuristically using the same boilerplate or fences.
Makes sense: basically creating an easy-to-point-out pattern for static analysis to find everything security related.
As another response pointed out, it's also possible that said secret data is still sitting in a register, which could persist no matter what we do to the curr value.
> Makes sense basically creating an easy to point out pattern for static analysis to find everything security related.
This is essentially already the case whenever you use encryption, because there are tell-tale signs you can detect (e.g., the AES S-box constants). But this will make it even easier, and will also tip you off to critical sections that are sensitive yet don't involve encryption (e.g., secure strings).
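As a concrete example of the kind of boilerplate that becomes a signature, here's a common C pattern for wiping secrets (secure_wipe and memset_v are just illustrative names; platform helpers like explicit_bzero, memset_s, or SecureZeroMemory do the same job where available):

    #include <string.h>

    /* Route memset through a volatile function pointer so the compiler
     * can't prove the call is dead code and optimize it away. */
    static void *(*const volatile memset_v)(void *, int, size_t) = memset;

    static void secure_wipe(void *buf, size_t len) {
        memset_v(buf, 0, len);
        /* Note: this only clears the buffer itself. As the comment above
         * says, copies of the secret may still live in registers or spilled
         * stack slots, and scrubbing the buffer doesn't touch those. */
    }

Any call to a helper like this is exactly the kind of annotation that tells an analyzer "the interesting bytes were here."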
I stopped in the dark on a December night on the shoulder of an interstate junction to change a tire after a blowout. Under normal circumstances, I probably could have handled it myself, but I was getting about four hours of sleep a night because of tinnitus.
I was very nervous when a random guy stopped. My initial thought was, "Am I about to be robbed?" But it turned out that he was just a local aerospace engineer, and it was his hobby to help stranded motorists.