It's akin to using Plex v Jellyfin for serving media files to others. If they're not technically savvy they really don't want to have to think about networking protocols and their fragility... so you just deal with Plex because at least it turns down the volume on constant calls to your support desk / personal mobile phone.
Have you heard of Emby? It seems to be closed-source. However you can still run it yourself [0]. UX appears to be better than Jellyfin, but I haven't tried either and not sure which one to go for.
Training is the process of regressing to the mean with respect to the given data. It's no surprise that it wears away sharp corners and inappropriately fills recesses of collective knowledge in the act of its reproduction.
The logic of American economic policy relies on a large velocity of money driven by consumer habits. It is tautological, and it is obsolete in the face of the elite trying to minimize wage expenses.
How is it obsolete? If everyone is unemployed and a few AI barons are obscenely wealthy, the velocity of money will be low because most people will be broke.
Seems to me like that's still a worthy target if chasing it fights that outcome.
The materialist take is that his plan is to eventually over-value and then trade on his company valuations, and also have another merger lined up for future personal financial bailouts.
Signal protocol prevents replay attacks as every message is encrypted with a new key: either the next hash-ratchet key, or a fresh key with new entropy mixed in via the next DH shared secret.
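A toy sketch of the hash-ratchet half of that, to show why a replayed ciphertext can't be decrypted: each step derives a one-time message key and a new chain key, and the old keys are deleted. The names and the HMAC-SHA256 KDF here are illustrative stand-ins, not the real protocol's constants.

```python
import hmac, hashlib

def kdf(chain_key: bytes, label: bytes) -> bytes:
    # Derive a child key from the chain key; HMAC-SHA256 as a stand-in KDF.
    return hmac.new(chain_key, label, hashlib.sha256).digest()

def ratchet_step(chain_key: bytes):
    # Each step yields a one-time message key and the next chain key.
    message_key = kdf(chain_key, b"msg")
    next_chain_key = kdf(chain_key, b"chain")
    return message_key, next_chain_key

ck = b"\x00" * 32           # shared chain key after the DH handshake (stand-in)
k1, ck = ratchet_step(ck)   # key for message 1
k2, ck = ratchet_step(ck)   # key for message 2

# Every message key is distinct, and each side deletes keys after use,
# so a replayed old ciphertext no longer has a matching key anywhere.
assert k1 != k2
```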
Private keys, probably not. WhatsApp is E2EE, meaning your device generates the private key with the OS's CSPRNG. Like I also said above, exfiltration of signing keys might allow a MITM, but that's still possible to detect, e.g. if you RE the client and spot the code that does it.
WhatsApp didn't implement Signal's protocol verbatim. They took the core cryptographic design and re-implemented the rest on their own servers. That removes any guarantee of secrecy, since they can run arbitrary code on the servers they own.
Wouldn't ratchet keys prevent MITM too? In other words, if the MITM has your keys and decrypts your message, then your keys are out of sync from then on. Or do I misunderstand that?
The ratchets would have different state, yes. The MITM would mix different entropy into the keys' states. It's only detectable if the MITM ever stops. But since the identity-key exfiltration only needs to happen once per lifetime of the installation (longer if the key is backed up), the MITM could just continue forever, since running the protocol on the server costs just a few cycles. You can then choose whether to read the messages or just ignore them.
One interesting way to detect this would be to observe the sender's outgoing and the recipient's incoming ciphertexts inside the client-to-server TLS, which users can MITM themselves. Since the ratchet states differ, so do the keys, and thus, for the same plaintext, so do the ciphertexts. That would be a really easy way to detect the MITM.
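The detection idea above can be sketched like this: after a MITM, the sender's and the server's re-encryption ratchets have diverged, so the same plaintext yields different ciphertexts on the two TLS legs. The XOR "cipher" and all names here are made up for brevity; real clients derive AES/HMAC keys from the double ratchet.

```python
import hmac, hashlib

def message_key(chain_key: bytes) -> bytes:
    # Stand-in message-key derivation from a chain key.
    return hmac.new(chain_key, b"msg", hashlib.sha256).digest()

def xor_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Stand-in stream cipher: XOR plaintext with a hash-derived keystream.
    stream = hashlib.sha256(key).digest()
    return bytes(p ^ s for p, s in zip(plaintext, stream))

honest_ck = b"\x01" * 32   # sender's chain key
mitm_ck   = b"\x02" * 32   # diverged chain key the MITM runs toward the recipient

plaintext = b"hello"
outgoing = xor_encrypt(message_key(honest_ck), plaintext)  # sender -> server leg
incoming = xor_encrypt(message_key(mitm_ck), plaintext)    # server -> recipient leg

# A user inspecting both TLS legs sees different ciphertexts for the
# same message -- evidence the server decrypted and re-encrypted it.
assert outgoing != incoming
```

In practice you'd need both endpoints cooperating to compare the two legs, but that's exactly the "RE the client / inspect your own traffic" approach mentioned above.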
Humanity is awesome because we are naturally constrained in semantic-space, making it relatively straightforward to reverse engineer things that ancient humans made even if we share basically zero overlap in culture.
Wat? Android, Linux, BSD (incl. iOS and macOS) absolutely dominate the market in terms of mind share and deployment. I can't think of another philosophy that is so directly, causally responsible for billions of dollars of realized value.
It sounds like you're parroting the corporate line of the early 80s. "Making money directly selling software artifacts is the only way to win." Which, as we know in retrospect, was a completely failed strategy, steamrolled by companies which... wait for it... adopted more flexible technology based on the Unix philosophy.
I'm going to lump together proprietary and for-profit software a little because I rarely see exceptions:
Proprietary / for-profit software is made that way, yes. There's no other way you can do it, given the incentive landscape. You take the huge base of FOSS code and spend as little effort as possible to build a thin little layer of moss on top. You respect the customers as little as possible to keep them around. You fix bugs as little as possible. You use other proprietary services that spy on your customers, like Sentry and Firebase, because privacy costs money.
Free / libre software has to compete on its merits. If a project isn't useful it doesn't grow. But not growing is okay. Some projects accrete over years, like a pile of stones forming a cairn. They don't need to squeeze money out of people to live because they aren't alive. They don't have to "eat".
I'm mixed on the Unix philosophy. It makes a lot of sense when you're building CLI tools that a hacker is going to plug together, because then your program is really a function in a programming environment that spans one or more entire computers. Tools like ripgrep, jq, curl, they're all great, I love them. A good function does one thing and does it well.
But just as often, I'm okay with huge software that does a lot. Web browser engines are evolving into universal GUI / IO frameworks and I'm trying to make peace with that. Systemd does a ton of stuff, but hell, so does the Linux kernel. I don't see an inherent problem with having an init system that acts like a monolithic kernel for userspace. Microkernels are nifty but in the end we all ship our org charts. Maybe there's no need for microkernels if the real divide is between kernel programming and userspace programming. For the same reason, I haven't found any personal use for wasm, because wasm makes the most sense when you're connecting two pieces of code written by different teams at different times, like a GIMP plugin. I don't need wasm to plug my own code into my own code.
And for GUIs it's just been a fucking nightmare. 57 years since the Mother Of All Demos and it's still 10x easier to write `fn main()` and build a CLI program that runs on _every_ OS, rather than a GUI program that maybe runs on one OS, like it runs on Ubuntu 24.04 but not Ubuntu 22.04. What a fucking mess. GUIs don't compose, so every GUI project I've tried feels like I'm inventing the universe from scratch. It's fun but it's a stupid fucking waste of time.
Honestly browser engine frameworks like Electron and Tauri _are_ the Unix philosophy compromise. Making a GUI framework requires tens of thousands of lines of high-effort code made by experts. If a browser's one thing is "Be a GUI framework" then it allows your GUI app to just serve HTML and now it's a CLI app that runs anywhere without fucking with GTK 3 vs GTK 4 bullshit.
Sorry for the long rant. This stuff is all percolating in my head. I started with Visual Basic 6 and I still haven't seen GUIs improve since then. Phones have been a fucking step backwards, too. Everyone uses a phone but just like a mouse-and-keyboard player dominating gamepad players in a shooter game, I get a lot more done at a real desktop with a real pointing device.