
Alternatively you can look at actually innovative programming languages to peek at the next 20 years of innovation.

I am not sure that watching the trendy forefront successfully reach the 1990s and discuss how unwrapping an `Option` is potentially dangerous really warms my heart. I can’t wait for the complete meltdown when they discover effect systems in 2040.

To be more serious, this kind of incident is yet another reminder that software development remains miles away from proper engineering, and that even key providers like Cloudflare utterly fail at proper risk management.

Celebrating because there is now one popular language using static analysis for memory safety feels to me like being happy we now teach people to swim before a transatlantic boat crossing while we refuse to actually install lifeboats.

To me the situation has barely changed. The industry has been refusing to put in place strong reliability practices for decades, keeps significantly under-investing in tools that mitigate errors outside of a few fields where safety was already taken seriously before software was a thing, and keeps hiding behind the excuse that we need to move fast and that safety is too complex and costly, while regulation remains extremely lenient.

I mean this Cloudflare outage probably cost millions of dollars of damage in aggregate between lost revenue and lost productivity. How much of that will they actually have to pay?



Let's try to make effect systems happen quicker than that.

> I mean this Cloudflare outage probably cost millions of dollars of damage in aggregate between lost revenue and lost productivity. How much of that will they actually have to pay?

Probably nothing, because most paying customers of Cloudflare are probably signing away their rights to sue Cloudflare for damages by being down for a while when they purchase Cloudflare's services (maybe some customers have SLAs with monetary values attached, I dunno). I honestly have a hard time suggesting that those customers are individually wrong to do so - Cloudflare isn't down that often, and whatever amount it cost any individual customer by being down today might be more than offset by the DDoS protection they're buying.

Anyway if you want Cloudflare regulated to prevent this, name the specific regulations you want to see. Should it be illegal under US law to use `unwrap` in Rust code? Should it be illegal for any single internet services company to have more than X number of customers? A lot of the internet also breaks when AWS goes down because many people like to use AWS, so maybe they should be included in this regulatory framework too.
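For readers unfamiliar with the `unwrap` being discussed: a minimal sketch (names and config format are hypothetical, not Cloudflare's actual code) of why it is risky, and the Option-aware alternative being celebrated upthread:

```rust
// Hypothetical helper: look up a numeric "limit" in a key/value config.
// Returns None when the key is missing or unparsable, instead of panicking.
fn read_limit(config: &[(&str, &str)]) -> Option<u64> {
    config
        .iter()
        .find(|(key, _)| *key == "limit")
        .and_then(|(_, value)| value.parse().ok())
}

fn main() {
    let good = [("limit", "200")];
    let bad: [(&str, &str); 0] = [];

    // This is the failure mode: if the value is unexpectedly absent,
    // `unwrap` panics and takes the whole process down.
    // let limit = read_limit(&bad).unwrap();

    // Handling the None case keeps the process alive with a fallback.
    let limit = read_limit(&bad).unwrap_or(100);
    assert_eq!(limit, 100);
    assert_eq!(read_limit(&good), Some(200));
}
```

The point of the static analysis praised above is only that the compiler forces you to write *something* at the call site; it cannot stop you from writing `.unwrap()` there.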


> I honestly have a hard time suggesting that those customers are individually wrong to do so - Cloudflare isn't down that often, and whatever amount it cost any individual customer by being down today might be more than offset by the DDoS protection they're buying.

We have collectively agreed to a world where software service providers have no incentive to be reliable, as they are shielded from the consequences of their mistakes, and somehow we see it as acceptable that software has a ton of issues and defects. The side effect is that research on actually lowering the cost of safety has little return on investment. It doesn't have to be so.

> Anyway if you want Cloudflare regulated to prevent this, name the specific regulations you want to see.

I want software providers to be liable for the damage they cause, and minimum quality regulation on par with an actual engineering discipline. I have always been astounded that nearly all software licences start with extremely broad limitation-of-liability provisions and people somehow feel fine with it. Try to extend that to any other product you regularly use in your life and see how that makes you feel.

How to do proper testing, formal methods and resilient design has been known for decades. I would personally be more than okay with "let's move less fast and stop breaking things".


> I want software providers to be liable for the damage they cause, and minimum quality regulation on par with an actual engineering discipline. I have always been astounded that nearly all software licences start with extremely broad limitation-of-liability provisions and people somehow feel fine with it. Try to extend that to any other product you regularly use in your life and see how that makes you feel.

So do you want to make it illegal to publish GNU GPL licensed software because that license has a warranty disclaimer? Do you want to make it illegal for a company like Cloudflare to use open source software with similar warranty disclaimers, or for the SLAs they make with their own paying customers, and the penalties for violating them, to be legally unenforceable? What if I just have a personal website and I break the javascript on it because I was careless, how should that be legally treated?

I'm not against research into more reliable software or using better engineering techniques that result in more reliable software. What I'm concerned about is the regulatory regime - in other words, what software it is or is not legal to write or sell for money - and how to properly incentivize software service providers to use techniques that result in more reliable software without causing a bunch of bad second order effects.


I absolutely do not mind, yes.

You can't go out in the middle of your city, build a shoddy bridge, say you waive all responsibility, and then wash your hands of the consequences when it predictably breaks. Why can you do that with pieces of software?

Limiting the scope of liability waivers is not the same thing as censoring what software can be produced. It's just ensuring that everyone actually takes responsibility for the things they distribute.

As I said previously, the current situation doesn't make sense to me. People have been brainwashed into believing that the way software is released currently, half finished and riddled with bugs, is somehow normal and acceptable. It absolutely doesn't have to be this way.

It's beyond shameful that the average developer today is blissfully unaware of anything related to producing actually secure pieces of software. I am pretty sure I could walk into more than 90% of development shops today and no one there would know what formal methods are. With some luck, they might have some static analysers running, probably from a random provider, and be happy with the crappy percentages they output.

It's not about research. It's about a field which entirely refuses to become mature despite being pivotal to the modern economy. And why would it? Software products somehow get a free pass for the shit they push on everyone.

We are in the classic "market for lemons" trap where negative externalities are not priced in and investing in security will just make you lose against companies that don't care. Every major incident reminds us that we need a way out. The market has already shown it won't self-correct. It's a classic case where regulatory intervention is necessary and legitimate.

The shift is already happening, by the way. The EU product liability directive was adopted in 2024 and the transition period ends in December 2026. The US "National Cybersecurity Strategy" signals intent to review the status quo. It's coming faster than people realise.


I find myself in the odd position of agreeing with you both.

That we’re even having this discussion is a major step forward. That we’re still having this discussion is a depressing testament to how slowly the mainstream has adopted better ideas.


I agree with you. But considering nobody learns any real engineering in software, myself solidly included, this is still an improvement.

But yes, I wish I had learned more, and somehow stumbled upon all the good stuff, or been taught at university at least what Rust achieves today.

I think it has to be noted that Rust still allows high performance alongside the safety it provides. So that's something, maybe.


> I can’t wait for the complete meltdown when they discover effect systems in 2040

Zig is undergoing this meltdown. Shame it's not memory safe. You can only get so far in developing programming wisdom before Eternal September kicks in and we're back to re-learning all the lessons of history as punishment for the youthful hubris that plagues this profession.
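For anyone wondering what the "effect systems" being invoked here buy you: the core idea is that a function's side effects become part of its interface, so callers can see and control them. A very loose illustration (this is effects-as-data by hand, not a real effect system, and all the names are made up):

```rust
// Each possible side effect is described as data instead of performed.
enum Effect {
    Log(String),
    ReadCounter,
}

// The business logic is pure: it returns a list of effect descriptions.
// Its "effect signature" is visible in the return type.
fn step(counter: u64) -> Vec<Effect> {
    vec![
        Effect::Log(format!("counter was {counter}")),
        Effect::ReadCounter,
    ]
}

// An interpreter decides how effects actually run. A test interpreter
// like this one performs no real I/O, so the logic is fully checkable.
fn interpret(effects: Vec<Effect>, counter: u64, log: &mut Vec<String>) -> Vec<u64> {
    let mut reads = Vec::new();
    for effect in effects {
        match effect {
            Effect::Log(msg) => log.push(msg),
            Effect::ReadCounter => reads.push(counter),
        }
    }
    reads
}

fn main() {
    let mut log = Vec::new();
    let reads = interpret(step(41), 41, &mut log);
    assert_eq!(reads, vec![41]);
    assert_eq!(log, vec!["counter was 41".to_string()]);
}
```

A real effect system does this in the type system with handlers and without the boilerplate; the point is that "this function logs and reads state" stops being invisible.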



