
> infinite-time or infinite-memory computer

That doesn't apply to the Bekenstein bound, though.

Literally the first line of the Wikipedia article:

> In physics, the Bekenstein bound (named after Jacob Bekenstein) is an upper limit on the thermodynamic entropy S, or Shannon entropy H, that can be contained within a given *finite* region of space which has a *finite* amount of energy—or equivalently, the maximum amount of information that is required to perfectly describe a given physical system down to the quantum level.
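
(From memory rather than the quoted passage: the bound itself is S ≤ 2πkRE/(ħc), with R the radius of a sphere enclosing the system and E its total energy. Add the requirement that the region not collapse into a black hole, which caps E in terms of R, and the maximum entropy ends up scaling with the enclosing surface area, R^2, rather than with the volume.)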



I'm arguing against the use of Big-O "in the limit," as GP puts it; our tech is far from that limit, and O(N^{1/3}) is a better model.
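
To spell out where the two exponents come from (my arithmetic, not GP's): N memory cells at a fixed volumetric density fill a ball of radius r ∝ N^{1/3}, so a signal-speed round trip to the farthest cell costs time ∝ N^{1/3}. The Bekenstein/holographic limit instead caps the bits a region can hold at ∝ r^2, so once you are up against that, r has to grow like N^{1/2} and so does the access time. Today's hardware is nowhere near the second regime.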


I mean, if you want to talk about our actual tech, it's bound by the lithography of silicon chips, which are essentially planar (n^2), on printed circuit boards, which are planar (n^2), on the surface of the earth, which is likewise n^2.

This has been observed since at least 2014: https://www.ilikebigbits.com/2014_04_21_myth_of_ram_1/3_fit....

Pretty much every physical metric pegs memory access as O(n^1/2).
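
If you want to see that curve on your own machine, here's a minimal pointer-chasing sketch (not the article's own benchmark; the sizes, iteration count, and PRNG are arbitrary choices of mine) that measures the cost of a dependent load as the working set grows:

  /* Minimal pointer-chasing sketch (not the linked article's code): measures
     the average cost of a dependent load as the working set grows. The sizes,
     iteration count, and PRNG below are arbitrary choices. */
  #include <stdio.h>
  #include <stdlib.h>
  #include <stdint.h>
  #include <time.h>

  static uint64_t rng = 88172645463325252ULL;
  static uint64_t xorshift64(void) {        /* tiny PRNG, avoids RAND_MAX issues */
      rng ^= rng << 13; rng ^= rng >> 7; rng ^= rng << 17;
      return rng;
  }

  static volatile size_t sink;              /* keeps the chase loop from being optimized away */

  static double ns_per_access(size_t n, size_t iters) {
      size_t *next = malloc(n * sizeof *next);
      if (!next) { perror("malloc"); exit(1); }
      for (size_t i = 0; i < n; i++) next[i] = i;
      /* Sattolo's algorithm: a random single-cycle permutation, so every load
         depends on the previous one and the chase visits the whole buffer. */
      for (size_t i = n - 1; i > 0; i--) {
          size_t j = xorshift64() % i;
          size_t t = next[i]; next[i] = next[j]; next[j] = t;
      }
      size_t p = 0;
      clock_t t0 = clock();
      for (size_t i = 0; i < iters; i++) p = next[p];
      double ns = (double)(clock() - t0) * 1e9 / CLOCKS_PER_SEC;
      sink = p;
      free(next);
      return ns / (double)iters;
  }

  int main(void) {
      /* Working sets from 32 KiB to 1 GiB (8-byte elements), doubling each step. */
      for (size_t n = (size_t)1 << 12; n <= ((size_t)1 << 27); n <<= 1)
          printf("%10zu KiB: %6.2f ns/access\n",
                 n * sizeof(size_t) / 1024, ns_per_access(n, 10u * 1000 * 1000));
      return 0;
  }

Plot ns/access against working-set size on log-log axes; the slope past the cache sizes is the exponent the article fits, which it argues comes out near 1/2 rather than the flat O(1) of the textbook RAM model.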


Okay, then we just use a bunch of tiny black holes and pack some extra dark energy between them, then we can get back to volumetric scaling. In fact, since dark matter isn't yet fully understood, once we can harness it, we can fit several times as much black hole surface area in the same space as a singular black hole.



