Hacker News

Things that can be done millions of times per second per core don't "introduce delays" that a handful of people are going to see.

Oh, but they can't. If you'd tried it, you'd know that both OT and CRDTs must consider the entire change history at certain key points in order to derive the current value. Diff sync doesn't suffer from that issue, but the way it tracks client shadows introduces writes on the read path, making it horribly expensive to run at scale.
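To make the "writes on the read path" point concrete, here is a toy sketch of diff-sync's per-client shadow bookkeeping (after the general differential-synchronization design). The `diff` function is a trivial placeholder and all names are illustrative, not any real library's API:

```javascript
// Stand-in for a real text diff: send the whole doc when it changed.
function diff(oldText, newText) {
  return oldText === newText ? [] : [newText];
}

const serverDoc = { text: "hello" };
const shadows = new Map(); // one shadow copy per connected client

function syncWithClient(clientId) {
  const shadow = shadows.get(clientId) ?? "";
  const edits = diff(shadow, serverDoc.text);
  // The shadow must be updated to match what was just sent -- a write
  // that happens on every sync, even a read-only poll.
  shadows.set(clientId, serverDoc.text);
  return edits;
}
```

Every client that merely polls for changes forces the server to persist an updated shadow, which is why the read path stops being read-only.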

Are you seriously trying to say that the database you created in a scripting language, one that uses linear scanning of arrays, is 'unmatched' compared to high-performance C++?

It's not about the language, but about the underlying algorithm. Yes, JS is slower, and a linear scan is certainly slower than typical DB queries. But what GoatDB does, which is quite unique today, is resume query execution from the point where the query last ran, so you get very efficient incremental updates. That's especially useful on the client side, since clients tend to issue the same queries over and over again.
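The resumable-scan idea can be sketched in a few lines. This is a hedged illustration of the technique being described, not GoatDB's actual API; `ResumableQuery` and its fields are hypothetical:

```javascript
// A linear-scan query that remembers where it stopped, so re-running it
// only scans items appended since the last run.
class ResumableQuery {
  constructor(predicate) {
    this.predicate = predicate;
    this.cursor = 0;    // index of the next unscanned item
    this.results = [];  // matches accumulated across runs
  }

  run(items) {
    for (; this.cursor < items.length; this.cursor++) {
      if (this.predicate(items[this.cursor])) {
        this.results.push(items[this.cursor]);
      }
    }
    return this.results;
  }
}

const items = [{ done: false }, { done: true }];
const q = new ResumableQuery((it) => it.done);
q.run(items);               // first run: scans both items
items.push({ done: true });
q.run(items);               // second run: scans only the new item
```

The second `run` is O(new items) rather than O(all items), which is why repeated identical queries on the client can stay cheap even with a linear scan underneath.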



I'm not sure what the point of all this is. Linearly scanning arrays does not scale; this is basic computer science. JavaScript is going to run at 1/10th the speed of a native language at best. You don't have any benchmarks, and you're bragging about techniques that were typical 30 years ago. You realize that people have done shared document editing for decades, and that every video game keeps a synced state, right?

The most important thing here is benchmarks. If you want to claim you have "unmatched" speed, you need benchmarks.



