I would have used CockroachDB: it has all the requirements listed, and you don't need to know in advance which queries you will run when designing the database schema.
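To illustrate what I mean (a sketch only; CockroachDB speaks the PostgreSQL dialect, and the messages table below is a made-up example, not anyone's real schema): you can keep one reasonably normalized table and bolt on new query patterns later, whereas in Cassandra/Scylla you'd typically design one table per query up front.

    -- Made-up chat-style table, CockroachDB / PostgreSQL dialect.
    CREATE TABLE messages (
        channel_id UUID NOT NULL,
        message_id UUID NOT NULL DEFAULT gen_random_uuid(),
        author_id  UUID NOT NULL,
        sent_at    TIMESTAMPTZ NOT NULL DEFAULT now(),
        body       TEXT,
        PRIMARY KEY (channel_id, message_id)
    );

    -- A query nobody planned for at schema-design time still works;
    -- if it turns out to be hot, add an index after the fact.
    CREATE INDEX messages_by_author ON messages (author_id, sent_at);

    SELECT count(*)
    FROM messages
    WHERE author_id = $1  -- bind the user you're asking about
      AND sent_at > now() - INTERVAL '7 days';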
This would not work at scale for a company like Discord, with its volume of traffic. Cockroach, being consistency-oriented, would quickly become transaction-bound. You want a database like Cassandra or Scylla that is more performance/availability oriented. Otherwise you are going to see a lot of lag and latency in Discord chat.
Cockroach is very, very good for a distributed SQL database. But it's still performance-limited by its very nature.
More here on the difference between NoSQL and NewSQL performance, using Scylla (a Cassandra/CQL workalike) as a point of comparison:
Because he does not want to send Facebook every comment he makes on the internet? Especially his political ideas and opinions.
I don't have an FB account either. To me it is like the Stasi, tracking everything you do, only in far more detail.
When I talk with friends, my exact opinions are forgotten over time, while FB will record my exact opinion from September 20th at 8:12pm for the rest of my life.
Yes. Phone numbers are linked to a physical device (a SIM card or similar) or a contract (with a provider). An email does not need to contain a real name, and I'd say it's much easier for me to get an email with a temporary identity than it is to get a phone number for the same.
At least in PG, a regular view must calculate everything at read time, which might mean a lot of duplicated calculations, while a materialized view has to be refreshed "manually"; it doesn't auto-update piecemeal when the underlying data changes.
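For anyone who hasn't run into them, a minimal sketch of the difference (the orders table and the view names are made up):

    -- Hypothetical base table, just for illustration.
    CREATE TABLE orders (
        id         bigserial PRIMARY KEY,
        order_date date    NOT NULL,
        amount     numeric NOT NULL
    );

    -- Regular view: just a saved query, recomputed on every read.
    CREATE VIEW daily_totals AS
        SELECT order_date, sum(amount) AS total
        FROM orders
        GROUP BY order_date;

    -- Materialized view: the result is stored, so reads are cheap,
    -- but it goes stale until you refresh it yourself.
    CREATE MATERIALIZED VIEW daily_totals_mat AS
        SELECT order_date, sum(amount) AS total
        FROM orders
        GROUP BY order_date;

    REFRESH MATERIALIZED VIEW daily_totals_mat;  -- recomputes the whole thing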
I've had pretty good luck with using table triggers to update materialized views and make everything "automatic". A little more work up front, but pretty easy to forget about once it's in place.
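Something along these lines, continuing the orders / daily_totals_mat sketch from the comment above (a sketch, not battle-tested; if I remember right, REFRESH ... CONCURRENTLY needs a unique index on the matview and the view must already be populated):

    -- Unique index so the view can be refreshed CONCURRENTLY (readers don't block).
    CREATE UNIQUE INDEX ON daily_totals_mat (order_date);

    -- Trigger function: any write to the base table re-runs the refresh.
    -- Note it's still a full recomputation each time, just a non-blocking one.
    CREATE OR REPLACE FUNCTION refresh_daily_totals() RETURNS trigger AS $$
    BEGIN
        REFRESH MATERIALIZED VIEW CONCURRENTLY daily_totals_mat;
        RETURN NULL;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER orders_refresh_totals
    AFTER INSERT OR UPDATE OR DELETE ON orders
    FOR EACH STATEMENT
    EXECUTE FUNCTION refresh_daily_totals();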
A REFRESH of a materialized view still requires full re-computation of all values, no? It's better than regular views if you have more reads than writes, but still quite wasteful.
You are right. I've been dealing with this personally with PostGIS and storing large geometries. Ideally you split the geometries up into many smaller geometries (using a built-in function), and it's much faster to calculate intersections, contained lengths, etc. using those instead.
Many places online recommend storing this collection of divided geometries as a materialized view, but I recently had to move it to a separate real table because inserting a single new record would take 15 minutes to update the view (on an RDS 4xlarge database). It could at least update concurrently, so other reads didn't block, but now that the divided geometries are stored in a separate table I can usually add new records in under 5 seconds.
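In case it helps anyone, here's roughly what the "separate real table" setup looks like. A sketch only: I'm assuming the built-in splitting function meant above is ST_Subdivide, and the regions / region_pieces tables and the trigger are made-up names. The point is that an insert only subdivides the new geometry, instead of re-running the whole view definition the way a REFRESH does.

    -- Hypothetical source table of large geometries.
    CREATE TABLE regions (
        id   bigserial PRIMARY KEY,
        geom geometry(MultiPolygon, 4326) NOT NULL
    );

    -- Plain table holding the pre-split pieces (instead of a materialized view).
    CREATE TABLE region_pieces (
        region_id bigint REFERENCES regions (id),
        geom      geometry(Geometry, 4326) NOT NULL
    );
    CREATE INDEX ON region_pieces USING gist (geom);

    -- Only the newly inserted geometry gets subdivided, so an insert touches
    -- a handful of rows instead of recomputing every region.
    CREATE OR REPLACE FUNCTION split_region() RETURNS trigger AS $$
    BEGIN
        INSERT INTO region_pieces (region_id, geom)
        SELECT NEW.id, g
        FROM ST_Subdivide(NEW.geom, 256) AS g;  -- 256 = max vertices per piece
        RETURN NEW;
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER regions_split
    AFTER INSERT ON regions
    FOR EACH ROW EXECUTE FUNCTION split_region();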