Removed rust to gain speed (prisma.io)
71 points by 2233 4 days ago | 65 comments




An important part of the story here, not mentioned in this post but noted elsewhere (https://www.prisma.io/blog/from-rust-to-typescript-a-new-cha...), is that they gave up on offering client libraries for languages other than JavaScript/TypeScript. Doing this while mostly sharing a single implementation among all languages was much of the original reason to use Rust, because Rust is a good "lowest common denominator" language for FFI and TypeScript is not; it wasn't entirely about performance. If they hadn't tried to do this, they would likely never have used Rust; if they hadn't given up on it, they would likely still be using Rust.

Yeah, the whole point of Prisma 2 was to be multi-language and multi-DB, with a Rust server in between you and the DB. There are a lot of advantages to that approach in enterprise settings: you can do better access control, stats, connection pooling, etc. (Formal is a YC company in that space). Prisma 1 was a Scala implementation of that vision.

Anyway, end of an era. There were a couple of community bindings in Python and Java that are now dead, I assume. I was heavily invested in Prisma around 4-5 years ago; funnily enough, that's what got me started on my Rust journey.


I'm sure you could get even greater speed by removing Prisma. All you need is a migration tool and a database connection. The most recent example in my work where we removed an ORM resulted in all of our engineers, particularly juniors, becoming Postgres wizards.

Congratulations, you have now increased the cognitive load to be productive on your team and increased the SQL injection attack surface for your apps!

I jest, but ORMs exist for a reason. And if I were a new senior or principal on your team I’d be worried that there was now an expectation for a junior to be a wizard at anything, even more so that thing being a rich and complex RDBMS toolchain that has more potential guns pointing at feet than anything else in the stack.

I spent many years cutting Rails apps, and while ActiveRecord was rarely my favourite part of those apps, it gave us so much batteries-included functionality that we realised it was best to embrace it. If AR was slow or we had to jump through hoops, that suggested the data model was wrong, not that we should dump AR - we'd go apply some DDD- and CQRS-style thinking and consider a view model and how to populate it asynchronously.


I think this needs some nuance - this is definitely true in some domains.

In most of the domains I worked in, it was the other way around: using an ORM didn't mean we could skip learning SQL; it added an additional thing to learn and consider.

In recent years of writing SQLAlchemy or the Django ORM, the teams I was on would write queries in SQL and then spend the rest of the day trying to make the ORM reproduce them. At some point it became clear how silly that was, and we stopped using the ORMs.

Maybe it's got to do with aggregate-heavy domains (I particularly remember windowing aggregates being a pain in SQLAlchemy?), or large datasets (again from memory: on a 50-terabyte Postgres machine, the DB would go down if an ORM generated anything that scanned the heap of the big data tables), or highly concurrent workloads that required careful use of SELECT FOR UPDATE.


> In recent years of writing SQLAlchemy or the Django ORM, the teams I was on would write queries in SQL and then spend the rest of the day trying to make the ORM reproduce them.

Ah yes, good times! Not Django for me, but a similar general idea. I'm not a big fan of ORMs: give me a type-safe query and I'm happy!


SQL injection is only a thing for those careless enough to let string concatenation through pull requests.

If it isn't using query parameters, straight rejection, no ifs and buts.

Naturally, if proper code review isn't a thing, then anything goes, and using an ORM won't help much either.
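
For illustration, a minimal sketch of that review rule with node-postgres (assuming a Postgres connection configured through the usual PG* environment variables):

    import { Pool } from "pg";

    const pool = new Pool();  // connection settings come from the PG* env vars
    const name = "o'hara";    // imagine this is hostile user input

    // UNSAFE -- string concatenation; straight rejection, no ifs and buts:
    // await pool.query("SELECT * FROM users WHERE name = '" + name + "'");

    // Safe -- the driver sends `name` as a bound parameter, never as SQL text:
    const { rows } = await pool.query("SELECT * FROM users WHERE name = $1", [name]);

The query text and the values travel separately, so nothing the user types can change the shape of the statement.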


> Congratulations, you have now increased the cognitive load to be productive on your team and increased the SQL injection attack surface for your apps!

Maybe I am speaking from too much experience, but writing SQL is second nature to me, and I would wager my team feels similarly. Perhaps we are an anomaly. Secondly, most, if not all, SQL connector libraries have a query interface with all the usual injection vectors mitigated. I'm not saying it's impossible to break through, but these are the same connector libraries even the ORMs use.

> ORMs exist for a reason. And if I were a new senior or principal on your team I’d be worried that there was now an expectation for a junior to be a wizard at anything

ORMs exist to hide the complexity of the RDBMS. Why would any engineer want to make arguably the most critical aspect of every single IT business opaque? ORMs may imply safety and ease, but in my experience they foster a culture with a tacit fear of SQL. Sounds a bit dramatic, but this has been a surprisingly consistent experience.


Brainfuck also exists for a reason. That doesn't imply that you should use it.

Using an ORM and escape-hatching to raw SQL is pretty much industry-standard practice these days, and definitely better than no ORM imho. I have code that's basically a lot of

    result = orm.query({raw sql}, parameters)

It's as optimal as any other raw SQL query. Now that may make some people scream "why use an ORM at all then!!!", but in the meantime:

I have wonderful and trivially configurable DB connection state management.

I have the ability to do things really simply when I want to; I can still use the ORM magic for quick prototyping or when I know the query is actually trivial object fetching.

And mapping the result into an object that matches the shape of the query is definitely nicer with a good ORM library than with any raw SQL library I've used.
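
To make that concrete in Prisma terms (a sketch; assumes a `User` model in your schema and a `minAge` filter value):

    import { PrismaClient } from "@prisma/client";

    const prisma = new PrismaClient();
    const minAge = 18; // hypothetical filter value

    // Raw SQL escape hatch: the tagged template turns ${minAge} into a bound
    // parameter, so this is as safe and as fast as hand-written SQL
    const adults = await prisma.$queryRaw`
        SELECT id, name FROM "User" WHERE age >= ${minAge}
    `;

    // ...while trivial object fetching keeps the ORM ergonomics
    const user = await prisma.user.findUnique({ where: { id: 1 } });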


Every project I've come across that uses an ORM has terrible database design. All columns nullable, missing foreign key indexes, doing things in application code that could easily be done by triggers (fields like created, modified, ...), wrong datatypes (varchar(n) all over the place, just wwwhhhhyyy; floats for money, ...), using sentinel values (this one time, at band camp, I came across a datetime field that used a sentinel value, and it only worked because of two datetime-handling bugs (so two wrongs did make a right) and the server being in the UTC timezone), and the list goes on and on...

I think this happens because ORMs make you treat the database as a dumb datastore and hence the poor schema.


Honestly, database schema management doesn't scale particularly well under any framework, and I've seen those issues start to crop up in every org once you have enough devs constantly changing the schema. It happens with ORMs and with raw SQL.

When that happens you really, really should look into the much-maligned NoSQL alternatives. Similarly to the hatred ORMs get, NoSQL data stores actually have some huge benefits, especially at the point where DB schema maintenance starts to break down. I.e., who cares if someone adds a new field to the FB Newsfeed object in development when ultimately it's a key-value store fetched with GraphQL queries? The only person it'll affect is the developer who added that field; no one else will even notice the new key-value object unless they fetch it (see the sketch below). There's no way to make SQL work at all at scale (scale in terms of the number of devs messing with the schema), but a key-value store with GraphQL works really well there.

Small orgs where you're the senior eng and can keep the schema in check on review? Use an ORM to a traditional db, escape hatch to raw SQL when needed, keep a close eye on any schema changes.

Big orgs where there are tons of teams wanting to change things at high velocity? I have no idea how to make either SQL or ORMs work in those cases. I do know from experience how to make GraphQL and a key-value store work well, though, and that's where the above issues happen in my experience. It's really not an ORM-specific issue. I suggest going down the NoSQL route in those cases.
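
A minimal sketch of why that works, using graphql-js (hypothetical Post type; the point is that a response contains only the fields the query asked for, so a newly added field is invisible until someone fetches it):

    import { graphql, buildSchema } from "graphql";

    // Adding a new field to Post later won't affect existing clients:
    // a GraphQL response contains only the fields the query asked for
    const schema = buildSchema(`
        type Post { id: ID!, title: String, likes: Int }
        type Query { post(id: ID!): Post }
    `);

    // The "database" is just a key-value store
    const posts: Record<string, object> = {
        "1": { id: "1", title: "hello", likes: 3 },
    };

    const rootValue = { post: ({ id }: { id: string }) => posts[id] };

    const result = await graphql({
        schema,
        source: '{ post(id: "1") { title } }', // asks for title only
        rootValue,
    });
    console.log(result.data); // { post: { title: "hello" } }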


NoSQL is even worse, data gets duplicated and then forgotten, so it doesn't get updated correctly, or somebody names a field "mail" and another person names it "email" and so on...

There is zero guarantee that whatever you ask the database for contains anything valid, so your code gets littered with null and undefined checks. And if you ask for, say, a field "color", what is it going to contain? A hex value? rgb(), rgba(), an integer? So you need to check that too.

In my experience NoSQL is even worse, they are literally data dumps (as in garbage dump).


Same here. I am a big advocate of knowing your SQL, and of stored procedures; no need to waste network traffic on data that is never going to be shown in the application.

This is a decent example of not buying, getting pulled, or being forced into any corporate-pushed hype, or eliminating one's options. They re-evaluated and looked at what programming language was best for their situation, which was removing the Rust language and using something else. It then turned out they actually got gains in greater user contributions, simplicity, efficiency, and even speed.

> what programming language was best for their situation, which was removing the Rust language and using something else

This is correct, but I'd say that the key was removing Rust and not using something else. Fewer moving parts, fewer JS runtime boundaries to cross, no need to make certain that the GC won't interfere, etc.

Also, basically any rewrite is a chance to drop entrenched decisions that proved to be not so great. Rewriting a large enough part of Prisma likely allowed them to address quite a few pieces of tech debt that were not comfortable to tackle in small incremental changes. Consider "Prisma requires ~98% fewer types to evaluate a schema. Prisma requires ~45% fewer types for query evaluation.": this must have required quite a bit of rework of the whole thing. Removing Rust in the process was likely almost a footnote.


> This is a decent example of not buying, getting pulled, or being forced into any corporate-pushed hype

It seems that maybe they did get hyped into Rust, because it's not clear why they believed Rust would make their JavaScript tool easier to develop, simpler, or more efficient in the first place.


At one time, they were targeting a much broader array of languages -- it wasn't specifically a JavaScript tool:

https://www.youtube.com/watch?v=1zSh0zYLTIE


There are examples where that's true, like Biome or oxc.

Biome and oxc are developer tools. I don't know why in the world they would do this, but it sounds like they were using Rust at runtime to interact with the database?

I claim that 99.9999% of software should be written in a GC language. Very, very, very few problems actually require memory management; it is simply not part of the business requirement. That said, how come the language closest to this criterion is Go, except it hasn't learned about clean water (algebraic data types)?
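
For anyone unfamiliar with the term, a sketch of what an algebraic data type (here, a sum type) looks like in TypeScript; Go has no direct equivalent:

    // A sum type ("discriminated union"): a Shape is exactly one of these
    // variants, and the compiler checks that every case is handled
    type Shape =
        | { kind: "circle"; radius: number }
        | { kind: "rect"; width: number; height: number };

    function area(s: Shape): number {
        switch (s.kind) {
            case "circle":
                return Math.PI * s.radius ** 2;
            case "rect":
                return s.width * s.height;
        }
    }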

Meanwhile, earlier this week we had a big conversation about the skyrocketing costs of RAM. While it's technically true that GC doesn't mean a program has to hold allocations longer than the equivalent non-GC code, I've personally never seen a GC'ed program not use multiple times as much RAM.

And then you have non-GC languages like Rust where you're hypothetically managing memory yourself, in the form of keeping the borrow checker happy, yet you never see free()/malloc() in a developer's own code. It might as well be GC from the programmer's POV.


> And then you have non-GC languages like Rust

Rust is a GC language: https://doc.rust-lang.org/std/rc/struct.Rc.html, https://doc.rust-lang.org/std/sync/struct.Arc.html


By that token so is C.

GC is a very fuzzy topic, but the overall trend is that a language is GC if GC is opt-out, not opt-in. And more strictly, it has to be a tracing GC.


By your definition, is Nim a GC language?

So by that definition, yes: as of Nim 2.0, ORC is the default. You need to opt out of it.

I'm not sure this opt-in/out "philosophical razor" is as sharp as one would like it to be. I think "optionality" alone oversimplifies and a person trying to adopt that rule for taxonomy would just have a really hard time and that might be telling us something.

For example, in Nim, at the compiler CLI level, there is opt-in/opt-out via the `--mm=whatever` flag, but, at the syntax level, Nim has both `ref T` and `ptr T` on equal syntactic footing. But then in the stdlib, `ref` types (really, things derived from `seq[T]`) are used much more (since they're so convenient). Meanwhile, runtimes are often deployment properties. If every Linux distro had their libc link against -lgc for Boehm, people might say "C is a GC'd language on Linux". Minimal CRTs vary across userspaces and OS kernel/userspace deployments. "What you can rely on/assume", which I suspect is the thrust behind "optionality", just varies with context.

Similar binding vagueness between properties (good, bad, ugly) of a language's '"main" compiler' and a 'language itself' and 'its "std"lib' and "common" runtimes/usage happens all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented"). That doesn't even bring in "use by common dependencies", which is an almost independent axis/dimension and starts to relate to coordination problems of "What should even be in a 'std'-lib or any lib, anyway?".

I suspect this rule is trying to make the adjective "GC'd" do more work in an absolute sense than it realistically can given the diversity of PLangs (sometimes not so visible considering only workaday corporate PLangs). It's not always easy to define things!


> I think "optionality" alone oversimplifies and a person trying to adopt that rule for taxonomy would just have a really hard time and that might be telling us something.

I think optionality is what gives that definition weight.

Think of it this way. You come to a project like a game engine, but you find it's written in some language, and discover that for your usage you need no/minimal GC. How hard is it to minimize or remove GC? Assume that changing build flags will also cause problems elsewhere due to behavior change.

> Similar binding vagueness between properties (good, bad, ugly) of a language's '"main" compiler' and a 'language itself' and 'its "std"lib' and "common" runtimes/usage happens all the time (e.g. "object-oriented", often diluted by the vagueness of "oriented")

Vagueness is the intrinsic quality of human language. You can't escape it.

The logic is fuzzy, but going around saying stuff like "Rust is a GC language" because it has an optional, rarely used Arc/Rc is just an off-the-charts level of wrong.


Do you consider ORC a tracing GC, even though it only uses tracing for cyclic types?

You can add your own custom GC in C — you can add your own custom anything to any language; it's all just 1s and 0s at the end of the day — but it is not a feature provided by the language out of the box like in Rust. Not the same token at all. This is very different.

Well, that's mostly what Arc<T> and Rc<T> in Rust are. Optional add-ons.

...as part of the language. Hence it being a GC language.

Is this another case of "Rustaceans" randomly renaming things? There was that whole debacle where sum types bizarrely became enums, even though enums already had an established, different meaning, with all the sad talking past everyone else that followed. This is starting to look like that again.


> ...as part of the language.

Which part? It's not available in no-std without the alloc crate. You can write your own Arc.

Most crates don't have to use Arc/Rc.

> Is this another case of "Rustaceans" randomly renaming things?

No. This is a case of someone not having enough experience with Rust. Saying Rust is a GC language is like claiming Pascal is an object-oriented language because they share some surface similarities.


> Which part?

The part that is detailed on rust-lang.org.

> It's not available in no-std without the alloc crate.

no-std disables features. It does not remove them from existence. Rust's worldly presence continues to have GC even if you choose to disable it for your particular project.

> This is a case of someone not having enough experience with Rust.

Nah. Even rust-lang.org still confuses sum types and enums to this very day. How much more experienced with Rust can you get than someone versed enough in the language to write comprehensive, official documentation? This thought of yours doesn't work.

> Saying Rust is a GC language is like claiming Pascal is an object-oriented language because they share some surface similarities.

What surface similarity does Pascal have to OO? It only has static dispatch. You've clearly not thought that one through.

Turbo Pascal has dynamic dispatch. Perhaps you've confused different languages because they happen to share similar names? That is at least starting to gain some surface similarity to OO. But message passing, of course, runs even deeper than just dynamic dispatch.

Your idea is not well conceived. Turbo Pascal having something that starts to show some very surface-level similarity to OO, but still a long way from being the real deal, isn't the same as Rust actually having GC. It is not a case of Rust having something that sort of looks kind of like GC. It literally has GC.


> no-std disables features. It does not remove them from existence.

It's the other way around; the standard library adds features, because Rust features are designed to be additive.

Look into it. The `std` library is nothing more than the Rust-lang-provided `core`, `alloc` and `os` crates.

> Nah.

You don't seem to know how features work, how std is made, or how often Rc is encountered in the wild. It's hard to argue when you don't know the language you are discussing.

> Even rust-lang.org still confuses sum types and enums to this very day.

rust-lang.org is the starting point for new Rust programmers; why in the heck would they start philosophizing about a bikesheddy naming edge case?

That's like opening your car manual to see history and debates on what types of motors preceded your own, while you're trying to get the damn thing running again.

> What surface similarity does Pascal have to OO?

The dot operator (as in `struct.method`). The guy I was arguing with unironically told me that any language using the dot operator is OO, because the dot operator is a sign of accessing an object or a struct.

Much like you, he had very inflexible thoughts on what makes or does not make something OO; it reminds me so much of you saying C++ is a GC language.

> Your idea is not well conceived.

My idea is to capture the colloquial meaning of "GC language". The original connotation is to capture languages like C#, Java, JS, etc., which come with a (more or less) non-removable tracing garbage collector. In practice, what this term means is:

- How hard is it to remove and/or not rely on GC? Defaults matter a lot.

- How heavy is the garbage collection? Is it just RC or ARC?

- How much of the ecosystem depends on GC?

And finally, how many people are likely to agree with it? I don't care if my name for a color is closest to the frequency of red if no one else agrees.


I can't say I've heard of a commonly used definition of "GC language" that includes C++ and excludes C. If anything, my impression is that both C and C++ are usually held up as exemplars of non-GC languages.

C++ didn't add GC until relatively recently, to be fair. When people from 30 years ago get an idea stuck in their head they don't usually ever change their understanding even as life continues to march forward. This isn't limited to software. If you look around you'll regularly find people repeating all kinds of things that were true in the past even though things have changed. And fair enough. There is only so much time in the day. You can't possibly keep up to date on everything.

The thing is that the usual comparisons I'm thinking of generally focused on how much the languages in question rely on GC for practical use. C++11 didn't really move the needle much, if at all, in that respect compared to the typical languages on the other side of said comparisons.

Perhaps I happen to have been around different discussions than you?


> focused on how much the languages in question rely on GC for practical use.

That's quite nebulous. It should be quantified. But, while we wait for that, if we assume by that metric C++ is not a GC language today, but tomorrow C++ developers all collectively decide that all heap allocations are to depend on std::shared_ptr, then it must become a GC language.

But the language hasn't changed in any way. How can an attribute of the language change without any changes?


> That's quite nebulous. It should be quantified.

Perhaps, but I'm reluctant to speak more definitively since I don't consider myself an authority/expert in the field.

> But the language hasn't changed in any way. How can an attribute of the language change without any changes?

The reason I put in "for practical use" is that, pedantically speaking, no language actually requires GC - you "just" need to provision enough hardware (see: HFT firms' (ab)use of Java by disabling the GC and resetting programs/machines at the end of the day). That's not relevant for basically everyone, though, since practically speaking you usually want to bound resource use, and some languages rely on a GC to do that.

I guess "general" or "normal" might have been a better word than "practical" in that case. I didn't intend to claim that how programmers use a language affects whether it should be considered a GC language or not.


If Rust is a GC language because of Rc/Arc, then C++ is a GC language because of std::shared_ptr, right?

Absolutely.

Interesting... To poke at that definition a bit more:

Would that also mean that C++ only became a GC language with the release of C++11? Or would C++98 also be considered a GC language due to auto_ptr?

Are no_std Rust and freestanding C++ GC languages?

Does the existence of Fil-C change anything about whether C is a GC language?


> Or would C++98 also be considered a GC language due to auto_ptr?

auto_ptr does not exhibit qualities of GC.

> Are no_std Rust and freestanding C++ GC languages?

These are not different languages. A developer opting out of certain features as enabled by the language does not change the language. If one was specific and said "no_std Rust", then you could fairly say that GC is not available, but that isn't applicable here. We are talking about "Rust" as written alone.

> Does the existence of Fil-C change anything about whether C is a GC language?

No. While Fil-C is heavily inspired by C, to the point that it is hard to distinguish between them, it is its own language. You can easily tell they are different languages as not all valid C is valid Fil-C.


> auto_ptr does not exhibit qualities of GC.

OK, so by the definition you're using C++ became a GC language with C++11?

> If one was specific and said "no_std Rust", then you could fairly say that GC is not available, but that isn't applicable here.

I'd imagine whether or not GC capabilities are available in the stdlib is pretty uncontroversial. Is that the criterion you're using for a GC language?

> No. While Fil-C is heavily inspired by C, to the point that it is hard to distinguish between them, it is its own language. You can easily tell they are different languages as not all valid C is valid Fil-C.

OK, fair; perhaps I should have been more abstract, since the precise implementation isn't particularly important to the point I'm trying to get to. I should have asked whether the existence of a fully compatible garbage-collecting implementation of C changes anything about whether C is a GC language. Maybe Boehm with -DREDIRECT_MALLOC might have been a better example?


> I should have asked whether the existence of a fully compatible garbage-collecting implementation of C changes anything about whether C is a GC language.

In a similar vein, tinygo allows compilation without GC[1]. That is despite the Go spec explicitly defining it as having GC. Is Go a GC language or not?

As you can see, if it were up to implementation, a GC/non-GC divide could not exist. But it does — we're talking about it. The answer then, of course, is that specification is what is significant. Go is a GC language even if there isn't a GC in implementation. C is not a GC language even if there is one in implementation. If someone creates a new Rust implementation that leaves out Rc and Arc, it would still be a GC language as the specification indicates the presence of it.

[1] It doesn't yet quite have full feature parity so you could argue it is more like the Fil-C situation, but let's imagine that it is fully compatible in the same way you are suggesting here.


You make some good points, and after thinking some more I think I agree that the existence of GC/non-GC implementations is not determinative of whether a language is typically called a GC language.

After putting some more thought into this, I want to say where I diverge from your line of thinking is that I think whether a language spec offers GC capabilities is not sufficient on its own to classify a language as a "GC language"; it's the language's dependence on said GC capabilities (especially for "normal" use) that matters.

For example, while you can compile Go without a GC, the language generally depends on the presence of one for resource management to the point that a GC-less Go is going to be relatively restricted in what it can run. Same for Java, JavaScript, Python, etc. - GC-less implementations are possible, but not really reasonable for most usage.

C/C++/Rust, on the other hand, are quite different; it's quite reasonable, if not downright common, to write programs that don't use GC capabilities at all in those languages. Furthermore, removing std::shared_ptr/Rc/Arc from the corresponding stdlibs wouldn't pose a significant issue, since writing/importing a replacement is something those languages are pretty much designed to be capable of.


Good point. However, similar offshoot languages of Go do have them: Vlang (as sum types) and Odin (as discriminated unions). That said, per your comment, only Vlang has the optional GC (used by default) and the flexible memory-management philosophy (the GC can be turned off without its stdlib depending on it).

Games are 0.00001% of software? Web browsers? Operating systems?

Large parts of web browsers (like Firefox's entire UI) are written in JavaScript already

Operating systems _should_ use GC languages more. Sure, video needs to have absolute max performance... but there is no reason my keyboard or mouse driver should be in a non-GC language.


> Large parts of web browsers (like Firefox's entire UI) are written in JavaScript already

Is UI even a large part of Firefox? I imagine that the rendering engine, JS engine, networking, etc, are many times larger than UI.


Most programs should be written in GCd languages, but not this.

Except in a few cases, GCs introduce small stop-the-world pauses. Even at 15ms, the pauses would still be very noticeable.


Might want to read up on Ponylang: a GC'd FOSS language without the stop-the-world pauses, demonstrating that it's possible. It should also be pointed out that there are a number of proprietary solutions that claim GC with no pauses. Unfortunately, if coming from more common C-family languages, Ponylang may take some getting used to, with its actor model and different syntax.

There are GC algorithms that don’t require stopping the world.

I'm in the "Pro-Rust" camp (not fanboy level "everything must be rewritten in rust", but "the world would be a better place if more stuff used Rust"), and I love this post.

They saw some benefits to Rust, tried it, and continued to measure. They identified that the TypeScript/Rust language boundary was slow, and noticed an effect on their contributions. After further research, they realized there was a faster way that didn't need the Rust dependency.

Good stuff, good explanation!


I'm not sure your characterization is all that accurate.

Originally, they thought they could build a product that worked across many languages. That necessitated a "lowest common denominator" language to provide an integratable core, a niche that has always been strangely lacking in choice. Zig had only been announced a few months earlier, so it wasn't really ready to be a contender. For all intents and purposes, C, C++, and Rust were the only options.

Once the product made it to market, it became clear that the TypeScript ecosystem was the only one buying in. Upon recognizing the business failure, the "multi-language" core didn't make sense anymore. It was a flawed business model that forced them into using Rust (it could have been C or C++ instead, but yeah), and once they gave up on that business model they understood that it would have been better to have written it in TypeScript in the first place — and it no doubt would have been, if not for the lofty pie-in-the-sky dreams of trying to make it more than the market was willing to accept. Now they've got the opportunity to actually do it.


> I'm in the "Pro-Rust" camp (not fanboy level "everything must be rewritten in rust", but "the world would be a better place if more stuff used Rust")

Techno-religiosity is irrational and unprofessional, but that's some weak, eye-rolling both-sidesism.

The world would be a better place™ if more stuff used better and more formal tools and methods to prove that code is correct and bug-free. This is easier in some languages than in others, but still, there are a lot of formal verification tools that aren't used enough, and methodology patterns and attitudes that are missing from crucial projects and regular use. Effective safety and security assurance of software creation takes effort and understanding that a huge fraction of programmers who aren't software engineers don't want to undertake or know anything about. This needs to change, and it is defensible, marketable expertise that needs to be appreciated and cannot be replaced by AI anytime soon. There's no "easy" button, but there are a lot of "lazy" "buttons" that don't function as intended.


Good. Rust is fine, but it makes you pay a complexity tax for manual memory management that you just don't need most of the time. In almost all real world cases, a GC is fine. TypeScript is a memory-safe language, just like Rust, and I can't imagine a database ORM of all things needing manual memory management to get good performance. (Talking to the database, not memory management, is the bottleneck!)

I don’t think the problems they were dealing with had much to do with any of those properties of Rust. Their issue seems to have been that they weren’t using native JavaScript/TypeScript and that their situation was improved by using native TypeScript.

If they had been using something like Java or Go or Haskell, etc, they may well have had even more downsides.


> manual memory management

Rust has automatic memory management.

> Complexity tax

Could you be more specific?


The trait/type system can get pretty complex. Advanced Rust doesn't inherit like typical OOP; you build on generics with trait constraints, and that is a much more complex and unusual thing to model in your mind. Granted, you get used to it.
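
For readers who haven't seen it, a rough TypeScript analogue of constraint-based generics (a sketch, not Rust syntax; the interface plays the role of a trait bound):

    // logAll accepts any T that satisfies Printable, checked at compile time,
    // much like a Rust `T: Display` bound
    interface Printable {
        print(): string;
    }

    function logAll<T extends Printable>(items: T[]): void {
        for (const item of items) {
            console.log(item.print());
        }
    }

    logAll([{ print: () => "hello" }, { print: () => "world" }]);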

OOP inheritance is an anti-pattern and a hype train of the '90s/'00s, especially multiple inheritance, and especially the codebases where they create extremely verbose factories and abstract classes for every damn thing... Java, C++, and Hack (PHP-kind) shops are frequently guilty of this.

Duck typing and selective traits/protocols are the way to go™. Go, Rust, Erlang+Elixir... they're sane.

What I don't like about Rust is the inability to override blanket trait implementations, and the inability to provide multiple, very narrow, semi-blanket implementations.

Finally: people who can't or don't want to learn multiple programming language paradigms probably should turn in their professional software engineer cards. ;)


> Rust has automatic memory management

Sure, if you define "automatic memory management" in a bespoke way.

> Could you be more specific?

The lifetime system, one of the most complex and challenging parts of the Rust programming model, exists only so that Rust can combine manual memory management with memory safety. Relax the requirement to manage memory manually and you can safely delete lifetimes and end up with a much simpler language.

Have you ever written a Java or Python program?


PSA: detached floating panels are pure cancer. Avoid them.

I literally can't scroll through your website.


So tell them ... posting it here is useless; the article was written a couple of weeks ago and they may not even know that it made it to HN.


