Hacker News | new | past | comments | ask | show | jobs | submit | shepik's comments | login

oh, what a time to be a mouse!

Lots of folks are pointing this out, but it's not like there isn't evidence to suggest this will also apply to humans.

> Human knee tissue collected during joint replacement surgeries also responded positively to the treatment. These samples, which include both the joint’s supporting extracellular scaffolding, or matrix, and cartilage-producing chondrocyte cells, began forming new cartilage that functioned normally.


Too bad that as a mouse you die after two years tops.

Top comment my whiskered friend.

"brought to you by Carl's Jr"


The interface looks great.

Handling of "to" is a bit odd though.

150 kn in km/h [277.79976 kph] - this is unit conversion

150 kn to km/h [-276.79976 kph] - this is unexpectedly a subtraction. I expected unit conversion here, because that's what Raycast and Google do


Yeah, this was a deliberate decision, but a tough one and I can see why you might find it annoying. I think it makes date and time calculations more readable:

09:11 to 15:46
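For what it's worth, the range reading gives "to" a natural meaning: the result is the elapsed time. A minimal Python sketch (illustrative only, not the app's actual implementation):

```python
from datetime import datetime

# Treating "09:11 to 15:46" as a time range yields the elapsed duration,
# rather than a subtraction of the raw numbers.
start = datetime.strptime("09:11", "%H:%M")
end = datetime.strptime("15:46", "%H:%M")
elapsed = end - start  # timedelta of 6 hours 35 minutes
```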


If anyone ever asks me "I want to burn out as fast as possible, what should I do?", I will send them the link to your comment. Jokes aside, I think I had a very similar experience, and it does lead to burnout


FANGM all did layoffs. And many other companies as well. That they are all incompetent can't be the explanation.


Sure it is. Google for example increased headcount by 24% in a year and cut it 6% in the next year, a clear sign they made a bad decision.

Further, in Google’s case they have plenty of money to pay these people until headcount naturally shrinks, but they don’t have anything useful enough for them to do that would make waiting, and thereby avoiding a large severance package, the cheaper option.


> avoiding a large severance package is cheaper.

It sounds like greed, not incompetence.

I just do not believe in a worldview where almost all CEOs of large tech companies are incompetent. Whatever the explanation for layoffs is, incompetence ain't it.


Greed would still try to avoid the need for a layoff, because layoffs are quite expensive: Google’s 6+ weeks of salary plus 6 months of health insurance isn’t cheap.


It also applies to people: they are inherently insecure; their behavior cannot be audited or validated. We have a long history of prompt-engineering attacks on humans, which in that context are called "social engineering".

Still, people are almost everywhere.


I suspect the 7000-year-old field of accounting would raise objections to the notion that humans can't be audited...


I'm not sure. We audit tax forms and money transfers (and we can audit model inputs and outputs), but I don't think we audit people themselves.


Just wait till artificial neural nets start reproducing themselves and paying their own way…


What do you think it is? Some corporeal being? It's an overgrown python script pulling weights out of a model.


Sure, and humans are just an overgrown chemical script pulling weights out of meat.


We are! Just animals. Nothing more.


That's a difficult question - I don't have experience outside of Russia, so no idea what is unique and what isn't.

It is harder to get funds from foreign clients, primarily because customers are wary of paying money to a Russian company (because of hackers and scammers, I think). But you can solve this by registering a US company.

There is a "local market" trap: the Russian market is obviously smaller than the US one, but still large enough to be considered viable on its own. That's why many Russian companies go solely for the Russian market, and why companies from Belarus or Ukraine (where the local market is not large enough) are often focused on the US/Europe from the start.

As for the recent events, I am not a fan, and hope that the situation resolves peacefully.


Simple answer: use a tree if you need range access or elements ordered by key, and use a hash otherwise.

More nuance:

- a hashmap may be resized when it goes over capacity, and the resize may cause a latency spike

- a hashmap lookup is essentially a single random memory access; a tree lookup is a couple of accesses, but they are not all random

- a tree is a bit like a sorted array with fast inserts/deletes. Some tree-like structures, such as LevelDB's SSTables, are in fact sorted arrays (plus some tricks, of course)

- if you use a B-tree, you are more memory-efficient (but less CPU-efficient), and access to nearby elements is almost free. That's why B-trees are used to store data on persistent storage

- there are many other tree variants, each with different trade-offs
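The rule of thumb above can be seen in a toy Python sketch, with a sorted list standing in for a tree (data and names are made up for illustration):

```python
import bisect

# A hash map (dict) gives O(1) point lookups; a sorted structure
# supports range queries ("all keys between X and Y").
prices = {"2024-01-03": 101, "2024-01-01": 99, "2024-01-02": 100}

point = prices["2024-01-02"]  # point lookup: hash is the right tool

keys = sorted(prices)  # "tree-like": keys kept in order
lo = bisect.bisect_left(keys, "2024-01-01")
hi = bisect.bisect_right(keys, "2024-01-02")
in_range = keys[lo:hi]  # range access: only possible with ordered keys
```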


1) DuckDB - I like the performance and that it's tightly integrated into Python. ClickHouse Local - I saw the announcement when it came out, but I just don't see the use case for it in analytics. DataFusion is new to me. I'd say the ecosystem is moving towards Snowflake/BigQuery/Redshift/...

2) Why would you want to do that? You'd use a bitmap index because it's quite compact and you can process the data at the speed of memory bandwidth; using ASCII defeats that, no?

3) Not really. I just can't imagine the costs of running any large website or app on serverless.

4) I think it's the same in the US - everyone competes for the same talent with FAANG (MAANG?). The salary gap between big companies and startups is even smaller in Russia. Also, we humans want to do meaningful things, and some of us struggle to find meaning in being another bigco employee.

5) Py


Thanks.

> Why would you want to do that?

We store user preferences (200+ yes/no knobs) in a bitmap (well, a bitmap index like the one in hash array mapped tries). We want to capture those prefs in a single subdomain (limited to 63 lower-case alphanumeric chars) or a URL (limited to 200 mixed-case alphanumerics). Today, we simply convert the bitmap to base64url (or base32, to store it in the subdomain), but we will soon run out of room within the 63-char limit if we introduce more knobs.

A demonstration of it is here, in case the above didn't explain it well: https://rethinkdns.com/configure (choose blocklists, and see the selection generate a path appended to the base-url shown in the search-bar).


Oh. No, I don't think it's possible - I'd suggest just using multiple subdomains.

Technically, you can squeeze out some bits: there are 36-37 possible characters of which you are using only 32, so with arithmetic coding you would gain about 1 extra bit per 5 characters, but it's a nightmare to code. And after those extra bits run out, you will hit the same problem anyway.
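Most of that gain is also reachable without arithmetic coding, by writing the bitmap as one big integer in base 36 (a-z0-9 are all DNS-label safe). A sketch, purely illustrative:

```python
# Each base-36 character carries log2(36) ~ 5.17 bits instead of
# base32's 5, so a 200-bit bitmap needs 39 characters instead of 40.
ALPHABET = "abcdefghijklmnopqrstuvwxyz0123456789"

def to_base36(n):
    if n == 0:
        return ALPHABET[0]
    out = []
    while n:
        n, r = divmod(n, 36)
        out.append(ALPHABET[r])
    return "".join(reversed(out))
```

This is simple big-integer base conversion; it captures the extra ~0.17 bits per character, but as noted above, the same limit is hit again once more knobs arrive.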


Does another party need to decode the URL? What about using a dictionary for the top 10k most-seen starting combinations and then encoding the rest?

What about run-length encoding? 1-9 for positive runs, a-i for negative runs (the max symbol means the run continues), and the rest for frequent patterns like alternating sequences, etc.

9967b would be 24 yes, 1 no, 7 yes, 3 nos, 1 yes etc
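A minimal sketch of that run-length idea (digits for "yes" runs, letters for "no" runs, a maximal symbol meaning the run continues; the "frequent pattern" symbols are left out):

```python
# rle_encode/rle_decode are illustrative helpers, not a library API.
def rle_encode(bits):  # bits: string of '1' (yes) and '0' (no)
    out, i = [], 0
    while i < len(bits):
        j = i
        while j < len(bits) and bits[j] == bits[i]:
            j += 1  # find the end of the current maximal run
        n, one = j - i, bits[i] == "1"
        while n > 9:  # max symbol ('9' or 'i') means "9 more, keep going"
            out.append("9" if one else "i")
            n -= 9
        out.append(str(n) if one else chr(ord("a") + n - 1))
        i = j
    return "".join(out)

def rle_decode(s):
    return "".join(
        "1" * int(c) if c.isdigit() else "0" * (ord(c) - ord("a") + 1)
        for c in s
    )

encoded = rle_encode("1" * 24 + "0" + "1" * 7 + "000" + "1")  # "996a7c1"
```

As the parent notes, this wins on long runs but loses badly on alternating patterns like 101010..., which is where the extra pattern symbols would have to earn their keep.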


Thanks. RLE is hit-and-miss (101010101010, etc.); top-k is heuristic-based. Those are viable solutions nevertheless (esp. top-k). Punycode, which DNS uses, I thought was pretty neat for fitting a state machine into printable characters. It takes a bunch of CPU to decode, though.

I was wondering if there are other techniques that I may not be aware of.


IMO most of Yandex/Ozon revenue is in Russia, so you are betting both on the company and on the ruble. Semrush, on the other hand, has most of its revenue outside of Russia (I think).

