I do not know which is more critical: the risk of censorship, or standing by while hospitals, banks, nuclear power plants and other systems become compromised and go down, with people dying because of it. These decision makers have not only power but also responsibility.
Have you ever seen a hospital, a bank, or a power plant expose telnetd to the public internet in the last 20 years? It should be extremely rare, and it should be addressed by the company's IT department, not by ISPs.
This feels more akin to discovering an alarming weakness in the concrete used to build those hospitals, banks and nuclear power plants – and society responding by grounding all flights to make sure people can't get to, and thus overstress, the floors of those hospitals, banks and nuclear power plants.
In the UK we have in fact discovered an alarming weakness in the concrete used to build schools, hospitals and other public buildings (in one case, the roof of a primary school collapsed without warning). The response was basically "Everybody out, now".
You feel it's similar because having access to port 23 is as life-critical as having access to a hospital? Or is it because, like with ports, when people can't fly to a hospital they have 65,000 other options?
That's my question: why is there infrastructure with open access to port 23 on the internet? That shouldn't be a problem the service provider has to solve, but it should absolutely be illegal for whoever is in charge of managing the service, or of providing equipment to the people managing it. That is like selling a car without seatbelts.
We are beyond the point where failing to put infrastructure equipment behind a firewall should result merely in a fine. We are beyond the point where this is mere negligence.
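For what it's worth, checking whether one of your own boxes exposes telnet is trivial; here's a minimal Python sketch (the host and the `port_open` helper are my own illustrations, not any standard tool):

```python
import socket

def port_open(host: str, port: int = 23, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds,
    i.e. something (e.g. telnetd) is listening and reachable."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or unreachable.
        return False

# Hypothetical example host; anything answering on 23 is a red flag:
# if port_open("203.0.113.10"):
#     print("telnet exposed - put this box behind a firewall")
```

If a scan like this finds port 23 open on your own infrastructure, that's the firewall rule you should have written years ago.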
Fixing the hospital: single place to work on, easier
Blocking all the roads/flights: everywhere, harder
Vs
Fixing all the telnet: everywhere, harder/impossible
Blocking port 23 on an infra provider: single place, easier
It makes sense to me to favor the realistic solution that actually works over the unrealistic one that is guaranteed not to fix the issue, especially when it's much easier to implement.
Nah, that's like seeing an open gate to a nuclear tank - a thing easily fixed within a few minutes - and responding to it by removing every road in existence that can carry cars.
> censorship, the suppression or removal of writing, artistic work, etc. that are considered obscene, politically unacceptable, or a threat to security
It is not the responsibility of the Tier 1 provider or the ISP to configure your server securely; it is their responsibility to deliver the message. Therefore it is an overreach to block it because you might be insecure. What is next? Will they block traffic to your website because you run PHP?
Similar to how the mailman is obligated to deliver your letter to address 13 even though he personally might be very superstitious and believe that by delivering mail to that address, bad things will happen.
Well, probably more people want to be city planners than society actually requires. Also, I think I would draw the line somewhere well before the real world. I want most of the technical details of the real world without having to deal with the politics. I don't want to attend town hall meetings and stakeholder consultations in my game, but then again maybe someone else wants that.
In my area, streets often run from church tower to church tower, dating back to the Middle Ages. You can drive these streets today and the centre-line markings align perfectly with the church tower as it comes into view. I think the church, and church-based government, shared the Romans' understanding of property rights :)
This is why data-driven decision making is a trap. Even if the data is correct, which it usually isn't, it's still incomplete by definition. It's inherently a dumbed-down, distilled, one-dimensional view of the real world, of meatspace, and you've got to treat it like that.
Here's what is scary. I have been looking at many job descriptions for a Developer Experience Engineer or similar positions. About half of them ask for experience with automated tools to measure developer productivity!
Hmmm. I have a different take there: when you are young and wild, you achieve stuff because you think later and produce code instantly. When you get older, you do it the other way around, which leads to your example.
In the early 2000s I was at a startup and we delivered as rapidly in C# as we did in PHP. We just coded the shit.
I think what you said is a healthy progression: write dumb code -> figure out it doesn't scale -> add a bunch of clever abstraction layers -> realize you fucked yourself when you're on call for 12 hours over the weekend trying to fix a mess for a critical issue, times however many repetitions it takes you to get it -> write dumb code and only abstract when necessary.
Problem is, devs these days start from step two because that's what we teach everywhere - they never learned why it's done, by going through step one - it's all theoretical examples and dogma. Or they are solving problems at Google/Microsoft/etc. scale when they are a 5-person startup. But everyone wants to apply "lessons learned" by big tech.
And all this advice usually comes from very questionable sources - tech influencers and book authors. People who spend more time talking about code than reading or writing it, and who earn money by selling you on an idea. I remember opening Uncle Bob's repo once when I was learning Clojure - the most unreadable, scattered codebase I've seen in the language. I would never want to work with that guy - yet Clean Code was a staple for years. DDD preachers, event-driven gurus.
C# is the community where I've noticed this the most.
As a long-term observer: definitely not a goal. But let's be clear here: JavaScript and C# are both OO languages, both have origin stories in Java/C++, both face the same niche (system development), the same challenges (processor counts, ...) and so on. And then you put teams on them who look left and right when they face a problem, and you wonder that they reuse what they like?
The C# language team is also really good. They have not made many mistakes in 25+ years. They are a very valid source of OO and OO-hybrid concepts. It is not only TS/JS but also Java and C++ that often look to C#.
The story was not about transforming C# code to JS but about using C# to write the code in the first place and transpiling it. Not for the sake of having .NET usage, but for the sake of having a good IDE.
> They did not do a lot of mistakes in the 25+ years
If my memory serves, .NET and WinFS were the two major forces that sank Longhorn, and both were given their walking papers after the reset [1].
.NET and C# have grown to be mature and well-engineered projects, but the road there was certainly not without bumps. It's just that a lot of the bad parts haven't spilled outside of Microsoft, thankfully.
Not only that, they went as deep as mixing project issues in with language design. A massive rewrite combined with massive feature changes is always a tricky thing, no matter the language.
.NET was already a going concern before Longhorn even started. What sank Longhorn was the fact that writing an OS from scratch is hard and maintaining compatibility with existing OSes in the process is even harder, especially when you're adopting a completely new architecture. Longhorn would have been a microkernel running 100% on the .NET runtime, mainline Windows is a monolithic kernel written in C++. I don't know how it would have ever worked, whether .NET was "perfect" or not.
Android still runs on a monolithic kernel written in a memory-unsafe language. I'm finding it surprisingly difficult to find information on Meadow, other than that it runs .NET DLLs as user-space applications; there's nothing about the structure of the kernel.
Longhorn was going to be more than that. Microsoft did have the Singularity/Midori projects, started around the middle of Longhorn/Vista and continued long after Vista shipped, to build out the managed-microkernel concept. It's been about a decade since they put any work into it, though.
The article presents some things mixed up. This had nothing to do with the language or the framework. WinFS was a database product: over-engineered and abstract.
.NET and C# were researched a lot for operating-system usage (Midori, Singularity), but that was after Longhorn.
The operating system group's UI toolkits were a further problem, and they pivoted there a dozen times over the years. Particularly for a C++-based OS group.
But the death of Longhorn was ultimately about Bill Gates's security reset.
The Windows team is a C++ kingdom, and those devs will not adopt .NET even at gunpoint.
They redid Longhorn with COM and called it WinRT; irony of ironies, WinRT applications run slower than .NET, with COM reference counting all over the place.
Google showed those folks what happens when everyone plays on the same team, and now it owns the mobile phone market: a managed userspace with 70% world market share.
As someone who has benefited once from this, I have to say: good.
In my humble opinion, the current state is better than no encryption at all. For example: laptop theft, scavengers trying to find pictures, etc. And if you think you are a target of either Microsoft or law enforcement, manage your keys yourself or go straight to Linux.