Hacker News | fsloth's comments

"decide what the software is supposed to do in the first place."

After 20 years of software development, I think that is because most software out there is itself the method of finding out what it's supposed to do.

The incomplete specs are not missing feature requirements due to a lack of discipline; it's because nobody can know what the software should be without trying it out.

I mean, of course there is a subset of all software that can be specified beforehand - but a lot of it cannot.

Knuth could be that forward-thinking with TeX, for example, only because he had 500 years of book-printing tradition to fall back on when backporting the specs to math.


At work I'm implementing new 3D map geometry stuff for my employer (Mapbox), and as a side project I'm building a simple 3D modeling software that gets you from idea to reliable, solid parts fast (https://www.adashape.com/).

I love Nausicaa but I would argue technically it has some weaknesses in the narrative and pacing compared to later works.

So in case someone comes to this without previous exposure, Nausicaa may not be the best place to start. It _is_ nice.


Reminds me of General Atomics.

They wanted to build a spaceship (Project Orion), but first had to come up with something they could sell immediately; hence they designed the commercially very successful TRIGA reactor.


” how can you explain how megalithic 100-ton bricks structures were build by "primitives"”

How can you explain we can today build structures that are 800m tall or reroute rivers?

Honestly, good ol' human craftsmanship, multiplied by available labour and combined with ’basic’ geometry, gets you really, really far.

Industrial processes don’t rely on individual craftsmanship because it does not scale to the speed and volume required by markets and capital. Hence, if you don’t actually care about building stuff, you may think people unassisted by industrial machinery are far less capable than they really are.

Humans are friggin talented.

My opinion is that said structures were made by humans - a function of basic human psychology, times population, times the surrounding available resources. You don’t need to add alien intelligences to the equation.

As for aliens per your description - I’m not that interested, really, because it sounds more like religious conspiracy theories than something actually profound.

I’m pretty sure there’s life out there (I mean, basic chemistry, right?), but I’m not so sure it’s anything that would travel here intentionally or that it would have anything to say to us.

I would be happy to be wrong! That’s the most interesting outcome always.


Yes! It's a fascinating conundrum. The giant megalithic "bricks", perfectly fitted and "moulded". Amazing craftsmanship. Entirely possible a forgotten technology was utilized by humans, or by something "before us". Another technological Earth-based civilization that wasn't "human" but the only traces of which remain are their megaliths -- spooky, almost as if they knew they had to leave a stone legacy!


My God, they're rock rectangles cut with reasonable precision! They aren't millimeter replicas of each other.

They float the stones from the quarry, then prepare a level surface using water, which lies perfectly flat at rest. They measure out a height with a replica-sized reed, then use taut twine to chalk the mark. Finally, they pour hard sand on the chalk line and work ropes back and forth in the groove to grind into the soft stone.

It takes time, but it's a process so simple a child can learn it in a day. And then you apply the scale of having a city of adult laborers just as smart (though not as learned - there's a difference) doing it for years and years. Congrats, the rocks stack into a stable shape.

And it was done multiple times, and the history was even recorded for some!

Then, a long time goes by and all the structures built out of other materials decay. All the structures not stacked in the most stable shape fall down. And all the structures that aren't important, or that are out of date with modern ventilation, security, or other needs, are intentionally replaced. Now you only have the special buildings, which some folk weirdly worship or make conspiracies about.

So many logical fallacies and biases go into this, it's all incredibly frustrating. And to see how this beautiful, connecting history we share is warped. To see simple human cleverness that proves how we are fundamentally the same as those who came before us, completely cast aside. It's just... GAH!

The men and women who lived when those structures were built were just as intelligent as you! Your capacity for knowledge, your curiosity, your ingenuity - all in the same proportions! They were not "primitives" or "cave people". They were smart human beings who built cool shit!


Exactly.

The most astounding thing about the millennia-old megastructures is the instant human connection they create across vast stretches of time.

They are not alien. They are deeply familiar.


The comment is slightly incoherent in its argumentation, but factually correct.

I think the point is that the energy policies in some of the Nordics are as arbitrary and politics-driven as anywhere.

How this is directly related to the sand battery is not clear to me, but it does paint an accurate if partial picture of the milieu.


A slow compiler impedes developers' velocity, not only taking longer but breaking their concentration.

The whole point of a programming language is to be an industrial productivity tool that is faster to use than hand writing assembly.

Performance is a core requirement for industrial tools. It's totally fine to have slow compilers in R&D and academia.

In industry a slow compiler is an inexcusable pathology. Now, it may be that the pathology can't be fixed, but not recognizing it as a pathology - and worse, inventing excuses for it - implies the writer is not really industrially minded. Which makes me very worried about why they are commenting on an industrial language.


I get the feeling the author would just like to use a better language, like F# or OCaml, and completely misses the point of what makes C++ valuable.

C++ is valuable because the existing tooling enables you to optimize the runtime performance of a program (usually you end up figuring out the best memory layout and utilization).

C++ is valuable because its industry support guarantees code bases live for decades _without the need to modify them_ to the latest standards.

C++ is valuable because the industry tooling allows you to verify large areas of the program behaviour at runtime (ASAN etc).

I simply don't understand what type of industrial use this type of theoretical abstraction building serves.

Using the metaprogramming features makes code bases extremely hard to modify, and they don't actually protect you from a whole category of runtime errors. I'm speaking from experience.

I would much rather have a codebase with a bit more boilerplate, a few more unit tests, and a strong integration testing suite.

The longer I use C++ the more I'm convinced something like Orthodox C++ is the best method to approach the language https://bkaradzic.github.io/posts/orthodoxc++/

This keeps the code maintainable and performant (with less effort than metaprogramming-directed C++).

Note: the above is just an opinion, with a very strong YMMV flavour, coming from two decades in CAD, real time graphics and embedded development.
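
For concreteness, a rough sketch of the flavour I mean - my own loose reading of that style, not a quote from the Orthodox C++ document, and Mesh/load_mesh are made-up names: plain aggregates with an obvious memory layout, explicit ownership, error codes instead of exceptions, printf-style diagnostics, and templates only where they clearly pay for themselves.

    #include <cstdio>

    // Plain aggregate: trivially copyable, obvious memory layout.
    struct Mesh {
        float* vertices     = nullptr;   // explicit ownership
        int    vertex_count = 0;
        int    material_id  = 0;
    };

    // Error codes instead of exceptions; printf-style diagnostics.
    static bool load_mesh(const char* path, Mesh& out) {
        FILE* f = std::fopen(path, "rb");
        if (!f) {
            std::fprintf(stderr, "load_mesh: cannot open %s\n", path);
            return false;
        }
        // ... parse the file into out ...
        std::fclose(f);
        return true;
    }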


C++20 inverts the traditional relationship between the core language and metaprogramming, which arguably makes it a new language in some ways. Instead of being a quirky afterthought, metaprogramming has become the preferred way to interact with code. There is a point of friction in that the standard library doesn’t (and can’t) fully reflect this change.

Metaprogramming style in C++20 only has a loose relationship to previous versions. It is now concise and highly maintainable. You can do metaprogramming in the old painful and verbose way and it will work but you can largely dispense with that.

It took me a bit to develop the intuitions for idiomatic C++20 because it is significantly different as a language, but once I did there is no way I could go back. The degree of expressiveness and safety it provides is a large leap forward.

Most C++ programmers should probably approach it like a new language with familiar syntax rather than as an incremental update to the standard. You really do need to hold it differently.
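
To make the "concise and highly maintainable" claim concrete, here is a minimal sketch of the same constraint written the old enable_if way and the C++20 way (twice, scale and Scalable are names made up for illustration):

    #include <concepts>
    #include <type_traits>

    // Pre-C++20: constraining a template with std::enable_if is verbose,
    // and the compiler errors on misuse are notoriously hard to read.
    template <typename T, std::enable_if_t<std::is_integral_v<T>, int> = 0>
    T twice_old(T x) { return x + x; }

    // C++20: the constraint is part of the signature and reads like prose.
    template <std::integral T>
    T twice(T x) { return x + x; }

    // A small custom concept, checked structurally at the call site.
    template <typename T>
    concept Scalable = requires(T v) {
        { v * 2 } -> std::convertible_to<T>;
    };

    template <Scalable T>
    T scale(T v) { return v * 2; }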


As someone that has only dabbled in C++ over the past 10 years or so, it feels like each new release has this messaging of “you have to think of it as a totally new language”. It makes C++ very unapproachable.


It isn’t each release but there are three distinct “generations” of C++ spanning several decades where the style of idiomatic code fundamentally changed to qualitatively improve expressiveness and safety. You have legacy, modern (starting with C++11), and then whatever C++20 is (postmodern?).

This is happening to many older languages because modern software has more intrinsic complexity and requires more rigor than when those languages were first designed. The languages need to evolve to effectively address those needs or they risk being replaced by languages that do.

I’ve been writing roughly the same type of software for decades. What would have been considered state-of-the-art in the 1990s would be a trivial toy implementation today. The languages have to keep pace with the increasing expectations for software to make it easier to deliver reliably.


As someone that has been using C++ extensively for the last 25 years, each release has felt like an incremental improvement. Yes, there are big chunks in each release that are harder to learn, but usually a team can introduce them at their own pace.

The fact that C++ is a very large and complex language, and that this makes it unapproachable, is undeniable, but I don't think the new releases make it significantly worse. If anything, I think some of the new stuff does ease the on-ramp a bit.


C++ can be written as the optimal industrial language it is. Simple core concepts year after year. Minimal adaptation.

The key thing to understand is that you are still using C with sugar on top. So you need to understand how the language concepts map to hardware concepts. It’s much more relevant to understand pointer arithmetic, the difference between stack and heap allocations and so on, than what the most recent language standard changes.

You can write the same type of C++ for decades. It’s not going to stop compiling. As long as it compiles on your language standard (C++17 is fine I think unless you miss something specific) you are off to the races. And you can write C++17 for the next two decades if you want.
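
A minimal sketch of the mapping I mean - nothing project-specific, just the stack/heap distinction:

    #include <memory>
    #include <vector>

    struct Point { float x, y, z; };

    void example() {
        Point a{1.0f, 2.0f, 3.0f};            // stack: no allocation, gone at scope exit
        auto  b = std::make_unique<Point>();  // heap: one allocation, freed by the unique_ptr
        std::vector<Point> pts(1024);         // heap: one contiguous block, cache friendly
        // ... use a, *b and pts ...
    }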


> Metaprogramming style in C++20 only has a loose relationship to previous versions. It is now concise and highly maintainable. You can do metaprogramming in the old painful and verbose way and it will work but you can largely dispense with that.

This was my takeaway as well when I revisited it a few years ago. It's a very different, and IMO vastly improved, language compared to when I first used it decades ago.


If you're going to go through the effort of learning a new language, it makes sense to consider another language altogether, one without 30 years of accumulated cruft.


An advantage is that if you already know the older language then you don’t have to learn the new idioms up front to use it. You can take your time and still be productive. It isn’t why I would use it but it is a valid reason.

I have used many languages other than C++20 in production for the kind of software I write. I don’t have any legacy code to worry about and rarely use the standard library. The main thing that still makes it an excellent default choice, despite the fact that I dislike many things about the language, is that nothing else can match the combination of performance and expressiveness yet. Languages that can match the performance still require much more code, sometimes inelegant, to achieve an identical outcome. The metaprogramming ergonomics of C++20 are really good and allow you to avoid writing a lot of code, which is a major benefit.


I only wish concepts were easier; those of us who don't use C++ daily have to look up the expression syntax all the time. Much better than the old ways, I guess.


Wait until people see how reflection in C++26 further pushes the metaprogramming paradigm. I'm more hopeful for reflection than I have been for any other C++ feature that has landed in the last decade (concepts, modules, coroutines, etc.).


As someone that had the option to choose between C and C++, coming from compiled BASIC and Object Pascal backgrounds, back in the early 1990s:

What makes C++ valuable is being a TypeScript for C, born on the same UNIX and Bell Labs farm (so to speak), allowing me to tap into the same ecosystem while enjoying the high-level abstractions of programming languages like Smalltalk, Lisp, or even Haskell.

Thus I can program on MS-DOS limited to 640 KB, an ESP32, an Arduino, a CUDA card, or a distributed system cluster with TBs of memory, selecting which parts are more convenient for the specific application.

Naturally I would like in 2025 to be able to exercise such workflows with a compiled managed language instead of C++, however I keep being in the minority, thus language XYZ + C++ it is.


Does Go count as managed, in your view? (Honest question - I don't know Go well enough to have much of an opinion.)


I'd call C#, Java, Go, Python, JS, etc. managed - something with a GC, i.e. managed memory.


Yes, managed languages are all those that have some form of automatic resource management, regardless of what shape it takes, or a higher-level language runtime.

Using Go as an example, and regarding the being-in-the-minority remark: you will remember the whole discussion about whether Go is a systems language or not, and how it was backpedaled to mean distributed systems, not low-level OS systems programming.

Now, I remember when programming compilers, linkers, OS daemons/services, IoT devices, firmware was considered actual systems programming.

But since Go isn't bootstrapped, and TinyGo and TamaGo don't exist, that naturally isn't possible. /s


> C++ is valuable, because the existing tooling enables you to optimize the runtime peformance of a program

This is true for MANY other languages too; I don't see how this makes C++ different. With gdb it's quite the opposite: handling C++ types with gdb can be a nightmare, and you either develop your own gdb glue code or write C-like C++.

> C++ is valuable because its industry support guarantees code bases live for decades _without the need to modify them_ to the latest standards.

In times of constant security updates (see the EU's CRA or equivalent standards in the US), you always gotta update your environment, which often also means updating tooling etc., if you don't wanna start maintaining a super custom ecosystem.

I don't see this as a positive in general; there is bit rot, and software that is stuck in the past is generally not a good sign imo.

> C++ is valuable because the industry tooling allows you to verify large areas of the program behaviour at runtime (ASAN etc).

Sanitizers are not exclusive to C++ either, and with Rust or C# you almost never need them, for example. Yes, C++ has extensive debugging tools, but a big part of that is because the language has very few safeguards, which naturally leads to a lot of crashes etc.

I think the idea of using only a small subset of C++ is interesting, but it ignores the problem that many people have: you don't have the time to implement your own STL, so you just use the STL. Of course it gives me more control etc., but I'd argue that most of the time writing Orthodox C++ won't save time even in the long run. It will save you headaches and cursing about C++ being super complicated, but in the end, in modern environments, you will just reinvent the wheel a lot and run into problems already solved by the STL.


> handling C++ types with gdb can be a nightmare, and you either develop your own gdb glue code or write C-like C++.

That's why it's better to use lldb and its scripts.

> I think the idea of using only a small subset of C++ is interesting, but it ignores the problem that many people have: you don't have the time to implement your own STL, so you just use the STL.

Yeah, agree. It's just much easier to take a "framework" (or frameworks) where all the main problems are solved: convenient parallelism mechanisms, scheduler, reactor, memory handling, etc. So it turns out you're kinda writing in your own ecosystem that's not really different from another language, just with C++ syntax.


Orthodox C++ should be a subset of C++; I would really use it, like if there were a compiler flag.

I can imagine it might be insanely faster to compile


Better language? Well, now mix those with the C libraries that you need and make them generate code as efficient as C++ (I would assume people use C++ for a performance advantage of some kind in many scenarios).


Sorry, I can't take something that argues for "printf" over anything else seriously.


> Sorry, I can't take something that argues for "printf" over anything else seriously.

I think you're arguing from a position of willful ignorance. The article is clear that it lauds C++'s std::println, not printf.

http://en.cppreference.com/w/cpp/io/println.html

Here's what the article argues:

> With std::format, C++ has gained a modern, powerful, and safe formatting system that ends the classic, error‑prone printf mechanisms. std::format is not merely convenient but fully type‑safe: the compiler checks that placeholders and data types match.

Solid remark, and in line with the consensus that std::println and std::format are an important improvement over std::cout or C's printf.
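
For readers who haven't used them, a minimal sketch of what that looks like in practice (std::format is C++20; std::println is C++23):

    #include <format>
    #include <print>    // std::println (C++23); std::format alone only needs C++20
    #include <string>

    int main() {
        std::string name = "world";
        int answer = 42;

        // Checked at compile time: a placeholder that does not match its
        // argument, or a missing argument, is a compile error rather than
        // undefined behaviour at runtime.
        std::println("hello {}, the answer is {}", name, answer);

        std::string s = std::format("{:>8.2f}", 3.14159);  // "    3.14"
        std::println("{}", s);
    }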


I was referring to the Orthodox C++ article linked by parent. Of course format is an improvement on both printf and iostream.


I'll bite. printf might be unsafe in terms of typing, in theory, but it's explicit and readable (with some caveats such as "PRIi32"). The actual chance of errors happening is very low in practice, because format strings are static in all practical (sane) uses, so testing a single codepath will usually detect any programmer errors -- which are already very rare with some practice. On top of that, most compilers validate format strings. printf compiles, links, and runs comparatively quickly and has a small memory footprint. It is stateless, so you always get the expected results.

Compare to <iostream>, which is stateful and slow.

There's also std::format, which might be safe and flexible and have some of the advantages of printf. But I can't use it at any of the places I'm working since it's C++20. It probably also uses a lot of template and constexpr madness, so I assume it's going to lead to longer compilation times and hard-to-debug problems.


In my experience you absolutely must have type checking for anything that prints, because eventually some never-previously-triggered log/assertion statement is hit, attempts to print, and has an incorrect format string.

I would not use iostreams, but neither would I use printf.

At the very least if you can't use std::format, wrap your printf in a macro that parses the format string using a constexpr function, and verifies it matches the arguments.
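
A minimal sketch of that idea, assuming C++20 is available (for consteval and __VA_OPT__); CHECKED_PRINTF and count_format_args are made-up names, and this version only verifies the argument count, not the types:

    #include <cstddef>
    #include <cstdio>
    #include <string_view>

    // Count conversion specifiers in a printf-style format string ("%%" is skipped).
    consteval std::size_t count_format_args(std::string_view fmt) {
        std::size_t n = 0;
        for (std::size_t i = 0; i < fmt.size(); ++i) {
            if (fmt[i] == '%') {
                if (i + 1 < fmt.size() && fmt[i + 1] == '%') { ++i; continue; }
                ++n;
            }
        }
        return n;
    }

    template <std::size_t Expected, typename... Args>
    int checked_printf_impl(const char* fmt, Args... args) {
        static_assert(Expected == sizeof...(Args),
                      "format string and argument count do not match");
        return std::printf(fmt, args...);
    }

    // The format string must be a literal so it can be evaluated at compile time.
    #define CHECKED_PRINTF(fmt, ...) \
        checked_printf_impl<count_format_args(fmt)>(fmt __VA_OPT__(,) __VA_ARGS__)

    // CHECKED_PRINTF("x=%d y=%d\n", 1, 2);   // fine
    // CHECKED_PRINTF("x=%d y=%d\n", 1);      // fails to compile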


_Any_ code that was never previously exercised could be wrong. printf() calls are typically type-checked. If you write wrappers you can also have the compiler type-check them, at least with GCC. printf() code is quite low risk. That's not to say I've never passed the wrong arguments - it has happened, but a very low number of times. There is much riskier code.

So such a strong "at the very least" is misapplied. All this template crap, I've done it before. All but the thinnest template abstraction layers typically end up in the garbage can after trying to use them for anything serious.
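
For reference, the wrapper type-checking mentioned above: GCC and Clang let you annotate your own printf-style function with the format attribute so callers get the same -Wformat diagnostics as printf itself (my_log is a made-up name for this sketch):

    #include <cstdarg>
    #include <cstdio>

    // Argument 1 is a printf-style format string; the values start at argument 2.
    __attribute__((format(printf, 1, 2)))
    void my_log(const char* fmt, ...) {
        va_list ap;
        va_start(ap, fmt);
        std::vfprintf(stderr, fmt, ap);
        va_end(ap);
    }

    // my_log("count=%s\n", 42);   // -Wformat warning: %s given an int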


Error log/assertion prints are by far the most likely code to have not been run before. Some compilers type-check printf, but not all.


The biggest issue with printf is that it is not extensible to user types.

I also find it unreadable; beyond the trivial I always need to refer to the manual for the correct format string. In practice I tend to always put a placeholder and let clangd correct me with a fix-it.

Except that often clangd gives up (when inside a template for example), and in a few cases I have even seen GCC fail to correctly check the format string and fail at runtime (don't remember the exact scenario).

Speed is not an issue, any form of formatting and I/O is going to be too slow for the fast path and will be relegated to a background thread anyway.

Debugging and complexity have not been an issue with std::format so far (our migration from printf-based logging has been very smooth). I will concede that I do also worry about the compile-time cost.


I largely avoided iostream in favor of printf-like logging APIs, but std::format changed my mind. The only hazard I've found with it is what happens when you statically link the standard library: it brings in a lot of currency and localization nonsense and bloats the binary. I'm hoping for a compiler switch to fix that in the future. libfmt, which std::format is based on, doesn't have this problem.


The article argues that modern C++ has type-checked string formatting, so it does not argue for (unchecked) `printf()`, right?


"The article" is ambiguous. The one this HN post is about does not argue for it, at all. But the one in the comment above directly says,

> Don’t use stream (<iostream>, <stringstream>, etc.), use printf style functions instead.

and has a code example of what they argue 'Orthodox C++' should be, which uses printf.

I'm all for a more sensible or understandable C++, but not at the expense of losing safety. In fact I would prefer the other way: I still feel incredibly saddened that Sean Baxter's Circle proposal for Safe C++ is not where the language is heading. That, plus some deep rethinking and trimming of some of the worst edge behaviours, and a neater standard library, would be incredible.


I was indeed referring to the 'Orthodox C++ article'.


"but don't share the prompts."

To be honest, I generally don't want to see anyone else's prompts, because what works is so damn context-sensitive - and it seems so random what works and what doesn't. Even if someone else has a brilliant prompt, there's no guarantee it works for me.

If working with something like Claude Code, you tell it what you want. If it's not what you wanted, you delete everything and add more specifications.

"Hey I would like to create a drawing app SPA in html that works like the old MS Paint".

If you have _no clue_ what to prompt, you can start by asking the LLM (or another LLM) for a prompt.

There are no manuals for these tools, and frankly they are irritatingly random in their capabilities. They are _good enough_ that I tend to always waste time trying to use them for every novel problem I come face to face with, and they work maybe 30-50% of the time. And sometimes they reach 100%.


"There are no manuals for these tools" is exactly why I like it when people share the prompts they used to achieve different things.

I try to share not just the prompts but the full conversation. This is easy with Claude and ChatGPT and Gemini - they have share links - but harder with coding agents.

I've recently started copying and pasting my entire Claude Code terminal sessions into a shareable HTML page, like this one: https://gistpreview.github.io/?de6b9a33591860aa73479cf106635... (context here: https://simonwillison.net/2025/Oct/28/github-universe-badge/) - I built this tool for doing that: https://tools.simonwillison.net/terminal-to-html


That’s why I like how OC handles sharing sessions https://opencode.ai/docs/share/

Wish other tools would copy this functionality (and maybe expand it so colleagues can pick up on sessions I share).


Imperial is more familiar to you. You could just have said that.

Everybody hates swapping between units of measurement. You pick one and stick with it. It's natural that having to move between two measurement systems irritates you.

> I measure weather much more frequently than I measure water temps,

In cold climates, water temperature is actually the most important thing to know about the weather, by a long shot. The freezing point tells you whether it's wet or dry, slippery or not.

