I don't use IPv6 because it solves a problem that I don't have and it provides functionality that I don't want. And also because I don't understand it very well.
My points:
- I don't have a shortage of IPv4. Maybe my ISP or my VPN host does, I don't know. I have a roomy 10.0.0.0/8 to work with.
- Every host routable from anywhere on the Internet? No thanks. Maybe I've been irreparably corrupted by being behind NAT for too long, but I like the idea of a gateway between my well-kept garden and the jungle, and of my network topology being hidden.
- Stateless autoconfiguration? What? No, no, I want my ducks neatly in a row, not wandering about. Again, maybe my brain is rotten from years of DHCP usage, but yes, I want stateful configuration and I want all devices on my network to automatically use my internal DNS server, thank you very much.
- It's hard to remember IPv6 addresses, and the prospect of reconfiguring all my router and firewall rules looks rather painful.
- My ISP gives me a /64; what am I supposed to do with that anyway?
- What happens if my ISP decides to change my prefix? How do my routing rules need to change? I have no idea.
In short, so far, ignorance is bliss.
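For what it's worth, the /64 and renumbering questions do have fairly mechanical answers. Here is a minimal sketch using Python's stdlib ipaddress module; the prefixes are from the reserved documentation range, not anyone's real allocation:

    import ipaddress

    # A single /64 is one subnet, but it still holds 2^64 interface addresses.
    lan = ipaddress.ip_network("2001:db8:1234:5678::/64")
    print(lan.num_addresses)        # 18446744073709551616

    # With a larger delegation (say a /56), you can carve out 256 distinct /64s,
    # one per VLAN or segment, which is where IPv6 subnetting starts to pay off.
    delegated = ipaddress.ip_network("2001:db8:1234:ab00::/56")
    subnets = list(delegated.subnets(new_prefix=64))
    print(len(subnets))             # 256
    print(subnets[0], subnets[-1])  # 2001:db8:1234:ab00::/64 2001:db8:1234:abff::/64

    # If the ISP renumbers, only the leading prefix bits change; the subnet
    # numbering and host parts can stay put, which is why IPv6 firewall rules
    # are typically written against interfaces or ULA ranges rather than
    # hard-coded global prefixes.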
I’ll plug my series of project ideas that have also been discussed here on HN over the years: Challenging programming projects every programmer should try
My dad devised the "Bayer filter" used in digital cameras, in the 1970s in the Kodak Park Research Labs. It is hard to convey now exactly how remote and speculative the idea of a digital camera was then. The HP-35 calculator was the cutting-edge, very expensive consumer electronics of the day; the idea of an iPhone was science fiction. Simply put, my dad was playing.
This was the decade that the Hunt brothers were cornering the silver market. Kodak's practical interest in digital methods was to use less silver while keeping customers happy. The idea was for Kodak to insert a digital step before printing enlargements, to reduce the inevitable grain that came with using less silver. Black and white digital prints were scattered about our home, often involving the challenging textural details of bathing beauties on rugs.
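For readers who haven't run into it, the Bayer filter is the repeating colour mosaic still used on most camera sensors; a tiny NumPy sketch of just the pattern:

    import numpy as np

    # Each photosite sees a single colour through a repeating 2x2 filter tile
    # (two greens, one red, one blue); full colour is reconstructed later by
    # demosaicing the neighbouring samples.
    tile = np.array([["R", "G"],
                     ["G", "B"]])
    mosaic = np.tile(tile, (2, 3))   # a toy 4x6 sensor
    print(mosaic)

    # Half the sites are green, mirroring the eye's greater sensitivity to
    # green for luminance; red and blue each get a quarter.
    colours, counts = np.unique(mosaic, return_counts=True)
    print(dict(zip(colours, counts / mosaic.size)))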
> Also, I don't want to repeat this everywhere but I paid taxes and I lost a comma, so no need to worry about that anymore! Everyone please pull out your most microscopic violins!
Well, since we're talking about it, maybe you're down to answer a question I've always wondered about: money in the hundreds of millions, let alone billions, is for me an unfathomable amount of capital for one person to wield. I've always thought that if I ever had that kind of power to swing around, I'd spend it all trying to solve every problem I could get my hands on, until there was nothing left but my retirement fund (which could be 10 million and still let me spend hundreds of millions while retiring in permanent wealthy comfort). There are many issues across the world that, at least locally, one individual with that kind of money could, so far as I can tell, independently resolve: hunger in specific areas, housing crises, underfunded education.
Why aren't the ultra rich doing it? You seem to have a more philanthropic mind than most, you're doing this cool project and nobody can deny your FOSS contributions. But even you are still holding onto keeping that count into the hundreds rather than the tens - is there some quality of life aspect hidden to us that's just really difficult to imagine giving up or something? Yacht life? Private flights? Chumming it up with Gabe and Zuck?
Becoming that wealthy won't happen to me but if it did, what would change about me that'd make me not want to spend it all anymore?
> while that shown in blue is the stapled notarisation ticket (optional)
This is correct, but practically speaking, non-notarized apps are terrible enough for users that this isn't really optional: you're going to pay your $99/yr Apple tax.
(This only applies to distributed software; if you are only building and running apps for your own personal use, it's not bad, because macOS lets you do that without the scary warnings.)
For users who aren't aware of notarization, your app looks straight-up broken. See the screenshots on the Apple support site here: https://support.apple.com/en-us/102445
For users who are aware, you used to be able to right-click and "run" apps, and nowadays you need to actually go all the way into System Settings to allow it: https://developer.apple.com/news/?id=saqachfa
I'm generally a fan of what Apple does for security, but I think notarization specifically for apps outside the App Store has been a net negative for all parties involved. I'd love to hear a refutation of that, because I've tried to find concrete evidence that notarization has helped prevent real issues and haven't been able to yet.
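For anyone who hasn't been through the dance, the distribution workflow that the $99/yr buys into looks roughly like this. A sketch driving Apple's command-line tools from Python; the app name, signing identity and keychain profile are placeholders:

    import subprocess

    def run(cmd):
        print("+", " ".join(cmd))
        subprocess.run(cmd, check=True)

    # 1. Sign with a Developer ID certificate and the hardened runtime enabled
    #    (notarization requires the hardened runtime).
    run(["codesign", "--force", "--options", "runtime", "--timestamp",
         "--sign", "Developer ID Application: Example Corp (TEAMID1234)",
         "MyApp.app"])

    # 2. Zip the bundle and submit it to the notary service, waiting for a verdict.
    run(["ditto", "-c", "-k", "--keepParent", "MyApp.app", "MyApp.zip"])
    run(["xcrun", "notarytool", "submit", "MyApp.zip",
         "--keychain-profile", "notary-profile", "--wait"])

    # 3. Staple the ticket so Gatekeeper can verify the app offline.
    run(["xcrun", "stapler", "staple", "MyApp.app"])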
To start with, it can't be universal. Models that you can edit (unless they are really dumb, like DWG) are almost always domain-specific.
You can - and should be able to - export to a universal format, though.
But having a universal format is different than having a universal design space.
The requirements of a mechanical engineer are quite different from those of a structural engineer and/or detailer for houses. And again different from those of a doctor planning a surgery based on a CT model. For example, you need rebar in only one of these. You need delicate fillet control in only one of these. You absolutely need support for import and visualization of volumetric data in only one of these.
What I'm getting at is that while all design software has some common minimum set of features which _can_ be universal, the feature sets of the stereotypical domains are surprisingly disjoint, even if only comparing AEC and mech eng. Hence a "universal" design software would be a union of a very, very large set of totally unrelated features. Which suggests it would be hard to develop, hard to use and hard to maintain.
So it's better to have a collection of applications that aspire to "universal scope" as a collection, rather than "one app to rule them all", which you will never get done in any case.
If we presume a hypothetical FOSS mission to enable computer design for all major fields benefiting from digital design for physical outcomes, it should then focus on this "common minimum" core, on interoperability (strongly linked to the common minimum core but a separate concern - i.e. import and export), as well as on domain-specific projects producing the domain-specific UI and tooling.
> The problem is that the industry doesn't really know what it wants.
I would argue they do. What we call "CAD" today - digital design of surfaces for manufacturing - originated in the aerospace and automotive industries.
What they _want_ is to manufacture.
To manufacture they need designs as input.
The designs ultimately end up as:
a) drawings
b) surface models for toolpath programming
c) 3D models for project coordination, validation, etc.
They don’t actually care how many buttons you need to press as long as the final design is fit for purpose.
I agree CAD software is stereotypically not fun to use.
Also - there is a non-trivial population of CAD users who actually take pride in their skill to use these more or less broken tools. It’s some sort of weird masochistic/macho badge of honor.
My guess is that optimizing the cost of design is not that interesting to anyone as long as a, b and c from above are fulfilled.
The labour of the engineers who need to use the CAD tools is an insignificant percentage of the total cost in any case.
To understand why, you need to dive into the cost structures and value creation mechanisms, as well as the culture, in hardware.
To start with, in many fields the overall culture in hardware is "we sell the same junk as everyone else".
> Just having a solid, open-source framework to build upon
What’s missing from OpenCascade?
> There would be no AWS without GNU/Linux.
Sure there would. Open source more or less copy-pasted existing industrial/academic patterns. But open source probably makes it cheaper and better.
There would be no AWS without the internet, either - both as the protocol and as the billions invested in the fiber-optic cables crisscrossing the world's oceans.
The world is built on hardware. Hardware is not _stupid_. But it's _different_ from software.
To revolutionize CAD you first need a deep understanding of the _hardware_ value chains and industrial methods. I think what makes it hard is _not_ getting a kernel but having this cross-discipline knowledge inside a single org.
Speaking with over a decade of experience as a developer in industrial CAD (but still just one random guy's point of view): the question _isn't_ about the availability of a 3D kernel.
The 3D kernel is not the "moat".
You can cross that with money.
You can purchase ACIS or Parasolid and you are off to the races. Or even use OpenCascade if you know what you are doing.
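To make that concrete, here is a minimal sketch of using OpenCascade through the pythonOCC bindings; the module paths are as I recall them from pythonOCC 7.x and should be treated as an assumption:

    # Model a plate with a hole and export STEP. This is the level of capability
    # an open kernel gives you "for free"; everything domain-specific still has
    # to be built on top.
    from OCC.Core.gp import gp_Ax2, gp_Pnt, gp_Dir
    from OCC.Core.BRepPrimAPI import BRepPrimAPI_MakeBox, BRepPrimAPI_MakeCylinder
    from OCC.Core.BRepAlgoAPI import BRepAlgoAPI_Cut
    from OCC.Core.STEPControl import STEPControl_Writer, STEPControl_AsIs

    plate = BRepPrimAPI_MakeBox(100.0, 60.0, 20.0).Shape()
    hole_axis = gp_Ax2(gp_Pnt(50.0, 30.0, 0.0), gp_Dir(0.0, 0.0, 1.0))
    hole = BRepPrimAPI_MakeCylinder(hole_axis, 10.0, 20.0).Shape()
    part = BRepAlgoAPI_Cut(plate, hole).Shape()

    # STEP is the usual neutral interchange format.
    writer = STEPControl_Writer()
    writer.Transfer(part, STEPControl_AsIs)
    writer.Write("part.step")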
The more interesting question is: Ok hotshot, you have a 3D kernel, 10M of investor money (or equivalent resources).
What's your next move? What industry are you going to conquer? What are the problems you are going to solve better than the current tools do?
What's the value you provide to the users except price?
What are you going to do better than the incumbent software in the relevant specific design industries?
Which industry is your go-to-market?
Etc etc.
The programmer's view is "I will build a CAD". The industrial user on the other hand does _NOT_ want a "general CAD software".
They want a tool with a specific, trainable workflow for their specific industrial use case.
So "build it and they will come" will require speaking to a specific engineering/designer audience.
You can of course build a generic tool (it's all watertight manifolds in the end), but success in the market depends on the usual story about market forces. What's your go-to-market/beachhead? Does it enable you to move to other markets?
And the answer usually is - NO. You need to build the market share in _each_ domain separately.
Swift is an early example of Apple losing its way. Such a stark contrast to Objective-C -- which was a simple, fast-compiling language that punched way above its weight for expressivity and runtime speed. A great language for its day. Swift is "a C++ hacker's first attempt at language design".
Haha people have sent it to me! I'm currently trying to make it on my own in the wide world though.
I never disagreed with the decision to lay people off to become profitable. That's part of the implicit agreement when you take employment in the US. You can quit whenever you want and they can fire you whenever they want! I knew that going in!
I agreed with it, but of course it stung. That's only natural!
Sam and I are all good though! To his immense credit he reached out to me directly a little while back to mend any hurt feelings, of which I had a few. We're friends. He even came on my podcast and we talked for over an hour like old buds.
I have a lot of feelings and sometimes they get hurt. Sam has a fiduciary responsibility to the company. Today PlanetScale is a going concern and I'm happy and doing great! All is well.
This is all moot because greenfield projects rarely exist anymore outside of being an entrepreneur, and if you're selling something, you're probably using a Shopify wrapper of some sort 99% of the time. If you're working on a greenfield project at a Fortune company, then you probably have a bunch of considerations and in-house frameworks you'll use as a jumping-off point instead of running `rails new` at any point.
These discussions are pointless and I'm a little fed up with them. As another commenter has pointed out, this exact same article (with the exact same conversational style) has appeared for at least 10 years, though I'd push it to 15-20. Write something new... BUILD something new... but for god's sake stop reiterating the same point, because it's SO. BORING.
Apple CEO Tim Cook "secretly" signed an agreement worth more than $275 billion with Chinese officials, promising that Apple would help to develop China's economy and technological capabilities - https://www.macrumors.com/2021/12/07/apple-ceo-tim-cook-secr...
Last week, the Chinese government ordered Apple to remove several widely used messaging apps—WhatsApp, Threads, Signal, and Telegram—from its app store. [..] In a statement, Apple said that it was told to remove the apps because of “national security concerns,” adding that it is “obligated to follow the laws in the countries where we operate, even when we disagree.” [but they don't disagree so much that they'd stop locking their devices against their users] - https://www.cjr.org/the_media_today/apple_appstore_china_cen...
Apple happily locks you out of your own devices, then cries "just complying with local governments" when those locks are used against their users. They're the person holding you down while others kick you. Every bit as guilty - especially when they see their users kicked again and again, yet continue holding them down.
Andy Grove, Steve Jobs' friend (if not mentor), agreed:
"Not only did we lose an untold number of jobs, we broke the chain of experience that is so important in technological evolution. ....abandoning today’s 'commodity' manufacturing can lock you out of tomorrow’s emerging industry." https://www.zdnet.com/article/us-high-tech-manufacturing-bas...
To add to this & the Jobs interview - an oil industry proverb: a healthy oil company has a geologist in charge, a mature one has an engineer in charge, a declining one has an accountant in charge, and a dying one has a lawyer in charge.
Buffett had a long-term policy of comparing discounted-cash-flow-type valuations for stocks, bonds and businesses. For example, see his 1992 letter https://www.berkshirehathaway.com/letters/1992.html and ^F for "Burr".
So I guess it's more that he thinks stocks are expensive relative to bonds than a prediction of a crash.
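A toy version of that comparison, with invented numbers, just to show the mechanics the letter describes (value any asset as the present value of its future cash, discounted against the long bond):

    def present_value(cash_flows, discount_rate):
        # Present value of a series of year-end cash flows.
        return sum(cf / (1 + discount_rate) ** t
                   for t, cf in enumerate(cash_flows, start=1))

    bond_rate = 0.045  # hypothetical long government bond yield

    # A bond: $45 coupons on $1000 face for 10 years, principal back at the end.
    bond_cfs = [45] * 9 + [45 + 1000]

    # A "business": $40 of owner earnings growing 6%/yr for 10 years, then sold
    # at a terminal value of 15x the final year's earnings.
    earnings = [40 * 1.06 ** t for t in range(1, 11)]
    business_cfs = earnings[:-1] + [earnings[-1] * (1 + 15)]

    print(round(present_value(bond_cfs, bond_rate)))      # ~1000 by construction
    print(round(present_value(business_cfs, bond_rate)))  # what you could pay today

    # If prevailing stock prices imply returns well below the bond rate, stocks
    # are "expensive relative to bonds", and vice versa.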
I have worked in the card payment industry. We would be getting products from China with added boards to beam out credit card information. This wasn't a state-sponsored attack. Devices were modified while on the production line (most likely by bribed employees), because once they were closed their anti-tampering mechanism was activated, so that later it would not be possible to open the device without setting the tamper flag.
Once this was noticed, we started weighing the terminals, because we could not open the devices (once opened they become useless).
They learned of this, so they started scraping non-essential plastic from inside the device to offset the weight of the added board.
We ended up measuring angular momentum on a special fixture. There are very expensive laboratory tables for this. I created a fixture where the device could be placed in two separate positions. The theory is that if the weight and all possible angular momenta match, then the devices have to be identical. We could not measure all possible angular momenta, but it was possible to measure one or two that would not be known to the attacker.
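What a fixture like that effectively measures is the moment of inertia about the fixture's axis (the sum of mass times distance squared), so a few grams of skimmer board at some offset shift the reading even when the total weight is made to match. A back-of-the-envelope sketch with invented masses and positions:

    def moment_of_inertia(parts, axis):
        # Perpendicular distance from the chosen axis, everything flattened
        # to the XY plane for simplicity: I = sum(m * r^2).
        return sum(m * (y if axis == "x" else x) ** 2 for m, x, y in parts)

    # (mass_g, x_cm, y_cm) of the genuine internals (made-up values).
    genuine = [(120, 0.0, 0.0), (30, 4.0, 1.0), (25, -3.0, 2.5), (15, 1.0, -4.0)]

    # Attacker adds a 5 g skimmer board and shaves 5 g of plastic elsewhere,
    # so the scale reads the same (the removed plastic is modelled as -5 g).
    tampered = genuine + [(5, 2.0, 3.0), (-5, -1.0, -1.0)]

    print(sum(m for m, _, _ in genuine), sum(m for m, _, _ in tampered))  # equal weights
    for axis in ("x", "y"):
        print(axis, moment_of_inertia(genuine, axis), moment_of_inertia(tampered, axis))

    # The weights agree, but the moments about the two positions differ, and the
    # attacker cannot know in advance which axes the fixture checks.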
>The current US administration's weird argument is that the USD values exchanged in inter-national trades must be symmetrical,
In Econ 101 going back to Keynes, if you run a persistent current account surplus, the increased demand for your currency to pay for your exports will strengthen your currency, thus reducing the competitiveness of your exports, thus reducing your trade surplus. Vice versa for a deficit. Hence persistent trade imbalances should not exist, due to self-balancing FX effects.
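A toy numerical version of that textbook feedback loop, with all starting values and elasticities invented:

    # A surplus raises demand for the exporter's currency, the currency
    # appreciates, exports get pricier abroad and imports cheaper at home,
    # and the surplus shrinks.
    exchange_rate = 1.00
    exports, imports = 120.0, 100.0

    for year in range(1, 9):
        surplus = exports - imports
        appreciation = 0.002 * surplus      # surplus -> net currency demand -> FX move
        exchange_rate *= 1 + appreciation
        exports *= 1 - 0.8 * appreciation   # dearer exports sell less
        imports *= 1 + 0.8 * appreciation   # cheaper imports sell more
        print(f"year {year}: rate={exchange_rate:.3f}  surplus={exports - imports:+.1f}")

    # Left alone, the surplus decays toward zero; buying foreign assets to hold
    # the exchange rate down short-circuits exactly this adjustment.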
This isn't happening in reality for a variety of reasons, but it's common knowledge that it's government policies that directly try to prevent it from occurring: capital controls, protectionist policies, etc. The most explicit mechanism is the controversial currency manipulation, whereby many central banks manage capital inflows by buying US assets to keep their currency stable.
But the aggregate result of this is that we have a strange situation today whereby the US dollar is simultaneously strong yet running a massive deficit, while surplus countries have weak currencies. And these imbalances are growing rather than shrinking. Many mainstream economists don't think the situation is sustainable, but their proposals to fix it differ.
I'd agree that Trump's method of "forcing" other countries to buy more American stuff is just a short-term fix that won't solve the underlying issues, but the real solutions of tariffs, currency revaluations or introducing capital controls will be hard to stomach for everyone, albeit necessary. Although the "correct" solutions like the Bancor will all be much more harmful to surplus countries than to the USA.
I work in manufacturing in the US. Incoming quality control, for Chinese vendors, is necessarily set up with zero-trust. This isn't a "trust but verify" sort of thing, it's strictly "do not trust". Assume that, at every step of the chain, there will be a lie: change of process, material change, collected data, and that the product being given to you is even yours (delivering a knockoff at the final step, and reselling yours on the gray market).
This is all common knowledge, proven by example after example that it's necessary to have zero trust. It's truly an adversarial system. All the extra engineering effort for IQC is still cheaper. And, there's rarely an alternative to the amazing manufacturing ecosystem that is China.
After tracking down several of these types of issues, I can say the Chabuduo mindset [1] is a very real thing.
Zuckerberg deserves a mountain of criticism for everything he has done, but one thing I'll give him is that he doesn't hesitate to get out his checkbook and pay absurd amounts of money for engineering talent.
Back when Apple, Google, Adobe, Intel and other big tech companies signed illegal anti-poaching agreements Zuck said no, I'll pay whatever it takes to attract the best talent, no matter where they work.
Facebook/Meta has consistently set the bar for engineering salaries across the industry, and Google and the rest are forced to match it.
Now every AI company is paying millions of dollars for regular engineers just because Zuck is out there throwing out wads of cash from the roof.
When it comes to salaries the man simply does not care about standards and precedents, and I love that.
Kagi founder here. I like simplicity too, but if that is the only thing I cared about I would have never attempted to build a paid search engine.
We started working on the browser before we started working on search. There were many browsers and search engines before us, and all of them failed because you cannot compete with an ecosystem company (Google) with a single product. If your customers use your search engine but then use Gmail and Chrome, your main competitor has the means (friction) to attempt to win them back (and you are paid, they are free!) if you ever become more than a nuisance.
This is why it was clear to me from day one that we need to offer a holistic replacement for big tech for consuming the web - and search, browser and email (yes, we are building that too) account for 99% of consumption of the web.
Yes, it makes our job anything but simple, but without it we will not survive long term - I am 100% sure. (And in the meantime generative AI showed up and made things even more complicated.)
I still love my job and the challenges it brings, every single day. While we live as a company in a complex environment we try to remove the complexity for our customers and make high quality products. I have pretty high standards for browsers and have been using them for almost 30 years - and Orion is by far the most powerful option on the market (speed/energy/privacy/features). Inviting you to give it a try.
Linus is pragmatic. His only hard rule is "don't break userspace"; almost everything else is contingent.
From what I can tell, he views Rust for Linux as a social project as much as a technical one - attract new (and younger) developers into the kernel, which has become pretty "senior", force all of the "unwritten rules" to be written down, and break down some of the knowledge siloing and fiefdom behavior that has taken place over the years (by sharing it specifically with that younger group of developers).
If you think about it, Rust for Linux is in some ways essentially an auditing committee reviewing all of the API interactions between kernel subsystems and forcing them to be properly defined and documented. Even if the Rust part were to fail (exceedingly unlikely at this point), it's a useful endeavour in those other respects.
But of course, if Rust is successful at encoding all of those rules using the type system, that unlocks a lot of value in the kernel too. Most of the kernel code is in drivers, and it's typically written by less experienced kernel developers while being harder to review by maintainers that don't have background on or access to the hardware. So if you can get drivers written in Rust with APIs that are harder to misuse, that takes a big load off the maintainers.
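To make that last point concrete, here is a sketch of encoding an API rule in types rather than in documentation. It's written in Python with type annotations for brevity (a hypothetical driver-ish API, checked by a tool like mypy); in kernel Rust the same pattern is enforced by the compiler:

    from dataclasses import dataclass

    @dataclass
    class UnprobedDevice:
        mmio_base: int
        def probe(self) -> "ProbedDevice":
            # ...map registers, verify the device ID, etc....
            return ProbedDevice(self.mmio_base)

    @dataclass
    class ProbedDevice:
        mmio_base: int
        def read_temperature(self) -> int:
            # ...read a register; placeholder value here...
            return 42

    def report(dev: ProbedDevice) -> None:
        print(dev.read_temperature())

    raw = UnprobedDevice(mmio_base=0xFE000000)
    report(raw.probe())  # fine: the only way to get a ProbedDevice is via probe()
    # report(raw)        # a type checker rejects this: UnprobedDevice has no read_temperature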
This story has been reposted many times, and I think GJS's remarks (as recorded by Andy Wingo) are super-interesting as always, but this is really not a great account of "why MIT switched from Scheme to Python."
Source: I worked with GJS (I also know Alexey and have met Andy Wingo); I took 6.001; my current research still has us referring to SICP on a regular basis; and in 2006 Kaijen Hsiao and I were the TAs for what was basically the first offering of the class that quasi-replaced it (6.01), taught by Leslie Kaelbling, Hal Abelson, and Jacob White.
I would defer to lots of people who know the story better than me, but here's my understanding of the history. When the MIT EECS intro curriculum was redesigned in the 1980s, there was a theory that an EECS education should start with four "deep dives" into the four "languages of engineering." There were four 15-unit courses, each about one of these "languages":
- 6.001: Structure and Interpretation of Computer Programs (the "procedural" language, led by Abelson and Sussman)
- 6.002: Circuits and Electronics ("structural" language)
- 6.003: Signals and Systems ("functional" language)
These were intellectually deep classes, although there was pain in them, and they weren't universally beloved. 6.001 wasn't really about Scheme; I think a lot of the point of using Scheme (as I understood it) is that the language is so minimalist and so beautiful that even this first intro course can be about fundamental concepts of computer science without getting distracted by the language. This intro sequence lasted until the mid-2000s, when enrollment in EECS ("Course 6") declined after the dot-com crash, and (as would be expected, and I think particularly worrisome) the enrollment drop was greater among demographic groups that EECS was eager to retain. My understanding circa 2005 is that there was a view that EECS had broadened in its applications, and that beginning the curriculum with four "deep dives" was offputting to students who might not be as sure that they wanted to pursue EECS and might not be aware of all the cool places they could go with that education (e.g. to robotics, graphics, biomedical applications, genomics, computer vision, NLP, systems, databases, visualization, networking, HCI, ...).
I wasn't in the room where these decisions were made, and I bet there were multiple motivations for these changes, but I understood that was part of the thinking. As a result, the EECS curriculum was redesigned circa 2005-7 to de-emphasize the four 15-unit "deep dives" and replace them with two 12-unit survey courses, each one a survey of a bunch of cool places that EECS could go. The "6.01" course (led by Kaelbling, Abelson, and White) was about robots, control, sensing, statistics, probabilistic inference, etc., and students did projects where the robot drove around a maze (starting from an unknown position) and sensed the walls with little sonar sensors and did Bayesian inference to figure out its structure and where it was. The "6.02" course was about communication, information, compression, networking, etc., and eventually the students were supposed to each get a software radio and build a Wi-Fi-like system (the software radios proved difficult and, much later, I helped make this an acoustic modem project).
The goal of these classes (as I understood) was to expose students to a broad range of all the cool stuff that EECS could do and to let them get there sooner (e.g. two classes instead of four) -- keep in mind this was in the wake of the dot-com crash when a lot of people were telling students that if they majored in computer science, they were going to end up programming for an insurance company at a cubicle farm before their job was inevitably outsourced to a low-cost-of-living country.
6.01 used Python, but in a very different way than 6.001 "used" Scheme -- my recollection is that the programming work in 6.01 (at least circa 2006) was minimal and was only to, e.g., implement short programs that drove the robot and averaged readings from its sonar sensors and made steering decisions or inferred the robot location. It was nothing like the big programming projects in 6.001 (the OOP virtual world, the metacircular evaluator, etc.).
So I don't think it really captures it to say that MIT "switched from Scheme to Python" -- I think the MIT EECS intro sequence switched from four deep-dive classes to two survey ones, and while the first "deep dive" course (6.001) had included a lot of programming, the first of the new survey courses only had students write pretty small programs (e.g. "drive the robot and maintain equal distance between the two walls") where the simplest thing was to use a scripting language where the small amount of necessary information can be taught by example. But it's not like the students learned Python in that class.
My (less current) understanding is that, more than a decade after this 2006-era curricular change, the department has largely deprecated the idea of an EECS core curriculum, and MIT CS undergrads now go through something closer to a conventional CS0/CS1 sequence, similar to other CS departments around the country (https://www.eecs.mit.edu/changes-to-6-100a-b-l/). But all of that is long after the change that Sussman and Wingo are talking about here.
Built-in bounds checking, slices, distinct typing, no undefined behavior, consistent semantics between optimization modes, minimal implicit type conversions, the context system and the standard library's tracking allocator combine to eliminate the majority of the memory bugs I used sanitizers for in C/C++. Now I'm back to logic bugs, which neither Rust nor sanitizers can help you with directly anyway, because they rely on program rather than language semantics. Obviously these features together do not eliminate those classes of bugs the way Rust does, but Odin chooses a different point on the efficient frontier to target, and does so masterfully.
To put the cherry on top, the sane package system, buttery-smooth syntax, sensible preprocessor (the 'constant system'), generics, LLVM backend for debug info / optimization, open-source standard library, simple build system, and engaging, well-intentioned community make day-to-day programming a pleasant experience.
That's the absolute percentage difference. Look at the under-35 category: it's literally down 25%. That means one in four people who would have owned a house in that age group don't now. Under 45 is a relative drop of ~17%, so about one in six. One in four to one in six people is more than enough to see an effect.
I doubt it's the only cause; this anti-social ("Bowling Alone") trend has been going on for generations and probably has multiple causes. But the US housing crunch on young people is adding to it.
And this damn attitude of "the younger generations are just entitled weenies" about housing is about the most infuriating attitude in the world. My parents bought their first house on a single earner's blue-collar salary at the age of 27. That house, with almost no updates, now literally needs a top-1% salary and 30 years of payments to afford. Don't tell the kids to stop whining when they're watching older generations gobble up their future in the name of preserving property values.
I find the 2025 versions to be nicer looking than the pre-2025 variants, so it's overall an improvement. But I also find the 2014 versions to usually be a lot better (clearer and more obvious). So incrementally it's an improvement, but historically still worse.