
For systems programming, C89 is definitely a "sweet spot". It's relatively easy to write a compiler for and was dominant when system variety was at its highest, so it is best supported across the widest variety of platforms. Later C standards are harder to write compilers for and have questionable features like VLAs and more complicated macro syntax. C++ is a hideous mess. Rust is promising, and would probably be my choice personally, but it's also still fairly new and will limit your deployment platform options.

C89 is still a reasonable choice today. I don't think it's depressing. It's a good language, and hitting the sweet spot of both language design and implementation is really hard, so you'd expect to have very few options.



Re: Rust, Curl is modular enough that you don't need to rewrite it in Rust in order to enjoy some of its benefits; you can just tell Curl to use Hyper (Rust's de-facto HTTP lib) as a backend. For the past few years they've been working on getting the Hyper backend to pass the Curl test suite, and they're down to five remaining tests, so full support looks to be imminent: https://github.com/orgs/hyperium/projects/2/views/1 (seanmonstar occasionally streams on Twitch if you'd like to watch him work on these).
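For anyone curious what the Hyper side looks like, here's a minimal sketch of a plain GET using hyper's 0.14-era client API (hyper and tokio dependencies assumed; the URL is just a placeholder). This is roughly the kind of request machinery curl delegates to when built with the Hyper backend:

```rust
// Minimal sketch of a GET with hyper's 0.14-era high-level client.
// Assumed Cargo dependencies, roughly:
//   hyper = { version = "0.14", features = ["client", "http1", "tcp"] }
//   tokio = { version = "1", features = ["macros", "rt-multi-thread"] }
use hyper::{body::HttpBody as _, Client, Uri};

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let client = Client::new();                    // plain HTTP/1.1 client
    let uri: Uri = "http://example.com/".parse()?; // placeholder URL
    let mut resp = client.get(uri).await?;
    println!("status: {}", resp.status());

    // Stream the body chunk by chunk, much like curl hands data to its
    // write callback.
    while let Some(chunk) = resp.body_mut().data().await {
        println!("read {} bytes", chunk?.len());
    }
    Ok(())
}
```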


Also, and this sort of blows my mind, Rust is almost 10 years old. It is a pretty darn stable language, especially for greenfield projects like a new HTTP library. The better Rust gets at interop, the more it will eat the systems programming world IMO, and we all benefit, even if it is quietly doing it without much fanfare.


I was with you right up until “quietly doing it without much fanfare”.

Personal opinions of Rust aside, it has gained so much fanfare that "rewrite it in Rust" is now practically a meme.


The other side of that is the quieter end of what Rust folks are doing. There is a vocal segment doing a lot of surface-level things. But there are also people quietly building the language and toolchain up into something you can do true low-level embedded work with while maintaining (most of) the guarantees of Rust. That is what I was alluding to. I don't think "rewrite it in Rust" is always smart or even productive.

edit: It is also worth exploring why and how a systems programming language has generated this much excitement in the folks who are "rewriting it in Rust". These people are also, in their own way, making it that much easier for everyone else to transition to Rust, by proving these projects work just fine in Rust. I do agree it is a meme, but it is a good one for us all. As an infosec practitioner, nothing could make me happier than seeing people excited about a language that eradicates one of the worst and most pernicious classes of C/C++ bugs.


The sad part is that the industry inflicted this on itself:

https://www.schneier.com/blog/archives/2007/09/the_multics_o...

> The combination of BASED and REFER leaves the compiler to do the error prone pointer arithmetic while having the same innate efficiency as the clumsy equivalent in C. Add to this that PL/1 (like most contemporary languages) included bounds checking and the result is significantly superior to C.


Thanks for the link!

One more data point backing my theory that we're in the middle(?) of the "computing dark ages", where the biggest crap and nonsense dominates everything (and people don't even know how crappy everything is).

Do you think we will ever leave the dark age?

I mean before our AI overlords take charge and kill off and replace all the nonsense we've built, of course.


Not in our lifetimes.

This is a matter of quality, and like everything in computing, quality only matters when money or law is involved. Being able to return digital goods for a refund is one way to make companies take quality more seriously; another is stricter liability laws for when exploits occur in the wild.


It's not fair to judge an entire ecosystem full of extremely talented people by the vocal (and insufferable) 1%. Every group has them. What has become a meme has zero relationship to the quality of the thing.


I wasn’t judging anyone. I was just saying Rust isn’t exactly flying under the radar.


Rust is over 12 years old at this point as a publicly-available project (its development started internally sometime in the late 00s; it was publicized in summer of 2010).


Periodization is hard, especially with Rust, but if we're talking about reliability, counting time before 1.0 in 2015 doesn't feel right to me. Seven and a half years is still a long time :)


Agree, time since v1.0 is a very reasonable measure.


10 years and still an okish IDE experience… This is going to stay with Rust. The problem is not the tooling, but the language.


I don't get what you mean.

There are languages out there with much more advanced features than Rust, like e.g. Scala. But the IDE experience in Scala is not worse than with Java.

There is no fundamental problem with IDE support. It just takes some work with an advanced language.

(The only issue is languages which you need to write "backwards", like Haskell. But that's another story.)


I am talking about compilation speed.

I recently started working with Rust, contributing to projects like Rome/tools [1] and deno_lint [2]. My first impression of Rust is a bit frustrating: compilation takes tens of seconds or even minutes. I sit in front of my IDE waiting for type hints / go-to-def (and often fall back to a text search). When I launch unit tests, I wait for rust-analyzer to finish indexing, and then I wait again for the tests to compile…

The tools are now mature, and a lot of engineering work has gone into both the Rust compiler and rust-analyzer. I am afraid that the slow compilation of Rust is rooted in its inherent complexity.

[1] https://github.com/rome/tools

[2] https://github.com/denoland/deno_lint


> I am afraid that the slow compilation of Rust is rooted in its inherent complexity.

AFAIK that's not the case.

The problem here is that Rust has "issues" with separate compilation due to some language design decisions (which then obviously affect incremental compilation too). It wasn't built for that, and it is, and will continue to be, quite difficult to make this work.

But that's less a problem with the complexity of the language as such.

My Scala example stands: Scala is also quite complex and not the fastest to compile. But after the build system and the compiler have crunched the sources once (which may take many minutes on a larger code base), the IDE is very responsive. Things like type hints or go-to-def are more or less instant. Code completion is fast enough to be used fluently. Edit-compile-test cycles are fast thanks to the fact that separate-compilation considerations were part of the language design decisions. (That's, for example, why Scala has orphan type-class instances, which are a feature and a wart at the same time.)

As I understand it, Rust's "compilation units" are actually crates. This is not very fine-grained, and I guess that's the source of the issues.

I would guess splitting code into a few (more) crates (which then need to depend on each other) may improve incremental build times. Things like not building optimized code during development of course also apply, but I think cargo does this automatically anyway.
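For concreteness, a rough sketch of what that split might look like at the Cargo level (the crate names are made up, and the profile tweak is optional; dev builds are already unoptimized by default):

```toml
# Hypothetical workspace root Cargo.toml: the project is split into smaller
# crates so that an edit in `app` doesn't force recompiling `core` or `net`.
[workspace]
members = ["core", "net", "app"]

# Optional: reduce debuginfo in dev builds to shave a bit more off rebuilds.
[profile.dev]
debug = 1
```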

But I'm not an expert on this. Would need to look things up myself.

Maybe someone else has some proven tricks to share?

OK, a quick search yielded some useful results, so I'll share them:

https://www.pingcap.com/blog/rust-huge-compilation-units/

https://fasterthanli.me/articles/why-is-my-rust-build-so-slo...

https://news.ycombinator.com/item?id=29742694


Thanks for the detailed answer and the linked resources!

In fact, I was including the lack of compilation locality in "the inherent complexity of Rust". However, I agree that it could be considered separately.

In my experience with TypeScript (quite different, I admit), splitting into distinct compilation units may help. However, it does not solve the issue.

It would be great if Rust could deprecate some features in order to improve its compilation speed. I am not sure whether that is feasible…


I am unsure what you mean. Care to elaborate?


I think I followed. One aspect of a programming language is how easy it is to build a useful IDE for it, with code completion, navigation, refactoring, etc. Java is relatively easy, Lua is very hard. Rust is somewhere in the middle, with macros being a complicating factor. https://rust-analyzer.github.io/blog/2021/11/21/ides-and-mac... discusses the problems specific to Rust much better than I could.
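A trivial sketch of the macro problem (just an illustration, nothing specific to rust-analyzer's internals): the function below is never written out as a plain `fn` item, so go-to-definition and completion have to expand the macro before they can even see that it exists.

```rust
// Illustration only: `answer` only comes into existence when the macro
// expands, which is part of what makes IDE features harder for Rust.
macro_rules! make_getter {
    ($name:ident, $value:expr) => {
        fn $name() -> i64 {
            $value
        }
    };
}

make_getter!(answer, 42);

fn main() {
    println!("{}", answer()); // `answer` exists only via macro expansion
}
```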


Go has one of the best compiled language IDE experiences out there. GoLand makes it feel easier than Python code :)


Can it find all implementations of an interface, or what interfaces a given type implements?


Yes to both. It is great :)


I'm unsure exactly what the above poster is trying to say; I generally find Rust development very pleasant with nothing but VS Code and rust-analyzer.

But... I'll admit there is one major stumbling block so far. Debugging iterator chains can be cumbersome because of the disconnect between the language and the compiled code. I've found myself stepping in and out of assembly more than I'd like. I assume this is the kind of problem that can be overcome with a nicer debugger though.
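For anyone who hasn't hit this, a made-up sketch of the kind of code I mean: single-stepping the chained version bounces through the filter/map/sum adapter internals, while the explicit loop steps the way you'd expect.

```rust
// Chained version: compiles down to nested closure calls inside Iterator
// adapters, so a debugger steps through library code rather than "your" loop.
fn sum_of_even_squares(xs: &[i64]) -> i64 {
    xs.iter()
        .copied()
        .filter(|x| x % 2 == 0)
        .map(|x| x * x)
        .sum()
}

// Equivalent explicit loop: much friendlier to step through.
fn sum_of_even_squares_loop(xs: &[i64]) -> i64 {
    let mut total = 0;
    for &x in xs {
        if x % 2 == 0 {
            total += x * x;
        }
    }
    total
}

fn main() {
    let xs = [1, 2, 3, 4, 5];
    assert_eq!(sum_of_even_squares(&xs), sum_of_even_squares_loop(&xs));
    println!("sum of even squares: {}", sum_of_even_squares(&xs));
}
```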


> I assume this is the kind of problem that can be overcome with a nicer debugger though.

I think this would be something that modern debuggers need to solve somehow in general.

There are more and more languages with a high amount of syntactic sugar, where the output being debugged no longer has much in common with the code as written.

Debuggers need to be aware of desugarings somehow.

But it makes no sense to implement this on a case-by-case basis for every language. We need next-generation debuggers! (But I have no clue how a "sugar-aware debugger" could be implemented; something in the direction of "source maps", maybe?)


See my other answer [1] to get more context :)

[1] https://news.ycombinator.com/item?id=33729406


So you're arguing C89 is better because it's easier to write compilers for it? How is that relevant in this context? We're talking about whether migrating to C99 is better for the end user, not for a compiler writer.


They are arguing that C89 is better because it's supported on more platforms (curl is used on all sorts of oddball embedded systems), and that it's easier to get a new platform going with C89.



