All software has bugs. This is the same as claiming ETH will always get hacked because of the DAO hack.
As far as I can tell, there are no fundamental design flaws in Solana that mean it will always go down. There are bugs in the networking stack that are being fixed.
And no, "engineers" don't bring it back up. I run a validator, and I alone am incentivized to restart it because of my stake.
This is no different from ETH; Lido staking has ~30% market share[0].
Stake centralization != less security. No amount of stake allows validators to steal your funds; it only affects censorship resistance. So the actual validator count matters more than the superminority.
In the case of USDC, I assume Circle runs its own validator, which is the source of truth.
As someone who has used most networks, I find Solana has the best experience for transacting USDC.
I struggle to imagine projects that can get by with a DB latency of several hundred ms. Maybe something fully async, but any human interaction would be incredibly slow.
Same region, different provider. Also, their JS API has to do a round trip every time you set auth credentials on the backend, so it's at least 2 round trips for a single query.
I've tried doing these reads & updates using PostgREST and row-level security on Supabase. When it works, it's an amazing experience, but even for semi-complex stuff you still need to use Postgres RPC, which is yet another "API" layer. I find writing APIs in SQL a nightmare.
Simple queries like this don't work in PostgREST:
`update likes set likes = likes + 1;`
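For context, the usual workaround (a sketch; the table, function, and parameter names here are hypothetical) is to wrap the increment in a SQL function, which PostgREST then exposes under its `/rpc` endpoint:

```sql
-- Hypothetical schema: one likes counter per post.
create table likes (
  post_id bigint primary key,
  likes   bigint not null default 0
);

-- Wrap the computed update in a function; PostgREST exposes it
-- as POST /rpc/increment_likes with a JSON body of named args.
create or replace function increment_likes(p_post_id bigint)
returns void
language sql
as $$
  update likes set likes = likes + 1 where post_id = p_post_id;
$$;
```

You'd then call it with `POST /rpc/increment_likes` and a body like `{"p_post_id": 42}`. Which is exactly the point: the "no backend" story breaks down as soon as an update needs to be computed server-side, and you're back to maintaining an API surface, just written in SQL.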
IMO, you shouldn't have to keep track of what crawler bots exist and maintain a deny list. It should be the opposite: only expressly allowed crawlers should be able to crawl, via allow lists.
You've been able to do exactly that since the inception of robots.txt: `User-agent: *` / `Disallow: /`, then whitelist Googlebot and whatnot (see the sketch below). Most of the web is already configured this way; just check the robots.txt of any major website, e.g. https://twitter.com/robots.txt
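A minimal sketch of that allow-list pattern, with Googlebot as the one whitelisted crawler (an empty `Disallow` means no restriction):

```
# Whitelisted crawler: Googlebot may fetch everything.
User-agent: Googlebot
Disallow:

# Everyone else is denied by default.
User-agent: *
Disallow: /
```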
That was my gut reaction too, but unless it becomes regulated, presumably at least some competitors to OpenAI won't respect robots.txt at all, and thus any open content might end up as training data.
- use CoinJoin with something like Wasabi Wallet (https://wasabiwallet.io/)
- purchase BTC with cash