Hacker News | KPLauritzen's comments


An AI horror story disguised as My Little Pony fanfic


The two most difficult things in programming: caching, naming and off-by-one errors.


The three most difficult things are naming, cache invalidation, off-by-one errors.

And latency.


Latency is not difficult.

It is just a lack of knowledge or focus: you have to understand the total execution path and remove unnecessary elements from it.

I have worked in algorithmic trading, where I did research and designed a framework to respond to stock-exchange messages within 5 microseconds, consistently.

5 microseconds is difficult and requires some special techniques, but 100 microseconds is quite easy: it just requires understanding what the execution path is and removing all unnecessary operations from it.

If you look at contemporary developers and web frameworks, it doesn't seem anybody puts particular care into understanding those things; they just twiddle a couple of knobs and see how that transfers to performance.


Latency is (where I am used to it) a hard physical limit imposed on the upper end by the speed of light. When you have a system that literally spans a planet, there's only so much clever algorithms (and hardware) can do to speed up the time difference between "things happen in one end" to "things happen on the other".


Yeah, it is a hard problem when you are hitting physics.

But then it really isn't a problem -- it is a limitation. There is nothing much you can do about it.

There are of course types of applications where latency is a really tough topic. For one, in low-latency trading, companies are in an arms race against other low-latency market players, and there are always gains from improving your product's latency.

But in a typical situation like websites and corporate services, the latency budgets are so large that they are quite easy to meet -- just don't do anything stupid.


Edit: I was wrong. I'm comparing monthly cost and yearly income.

I think your math is wrong. 4,000 CHF / 100,000 CHF = 4%


I'm gonna go out on a limb and say the 2,500-4,000 CHF was per month, which would be 30,000-48,000 per year :)


I'm pretty sure the 4000 CHF is monthly :)


Oops. My bad.


4000 * 12 months = 48000 CHF
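As a quick sanity check of the thread's arithmetic (assuming, as the commenters above do, that the 4,000 CHF is a monthly cost and the 100,000 CHF an annual income):

```python
monthly_cost = 4_000     # CHF per month (assumed monthly figure)
annual_income = 100_000  # CHF per year (assumed annual figure)

annual_cost = monthly_cost * 12
print(annual_cost)                  # 48000 CHF per year
print(annual_cost / annual_income)  # 0.48, i.e. 48% of income, not 4%
```

This is exactly the monthly-vs-yearly mix-up the "Edit: I was wrong" comment above owns up to.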


Without considering salaries, you can look up the costs for their compute: https://cloud.google.com/compute/pricing. They mention 128,000 CPUs and 256 GPUs, and I think they say in the video that training ran for 2 months.


Another commenter near the top (with some relevant experience) estimated ~$2,500/hour. That's 60 grand a day to use hundreds of thousands of cores to learn to play computer games, roughly $1.8M for 30 days of active learning. It's cool, if a little greedy, but that is still expensive as heck: you need a big ol' bank to fund you. Dropping $60k/day on compute doesn't fly for many smaller companies, if you ask me.
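The back-of-the-envelope math above works out, taking the other commenter's ~$2,500/hour estimate at face value (it is an estimate, not an official OpenAI figure):

```python
cost_per_hour = 2_500  # USD/hour, another commenter's estimate (not official)

cost_per_day = cost_per_hour * 24   # 60_000 -> "60 grand a day"
cost_30_days = cost_per_day * 30    # 1_800_000 -> "roughly 1.8 mill"

print(cost_per_day)   # 60000
print(cost_30_days)   # 1800000
```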


As our understanding of "AI" gets better, it'll cost less and less and will start to be affordable for smaller players, but the initial R&D always costs a lot.


They link to a short description of the reward function in the blog: https://gist.github.com/dfarhi/66ec9d760ae0c49a5c492c9fae939...


Wow, very excited about this. I don't know too much about RL, but to me "170,000 possible actions per hero" seems far too large an output space to be feasible. What happens if the bot wants to take an invalid action? Nothing, or some penalty for selecting something invalid?
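One common way RL implementations handle large discrete action spaces with invalid actions (a general technique, not necessarily what OpenAI Five does) is action masking: set the logits of invalid actions to negative infinity before the softmax, so the policy can never sample them. A minimal sketch:

```python
import numpy as np

def masked_softmax(logits, valid_mask):
    """Zero out invalid actions by sending their logits to -inf before softmax."""
    masked = np.where(valid_mask, logits, -np.inf)
    # Subtract the max valid logit for numerical stability; exp(-inf) is 0.
    exp = np.exp(masked - masked[valid_mask].max())
    return exp / exp.sum()

# Toy example: 5 actions, only actions 0 and 3 currently valid.
logits = np.array([1.0, 2.0, 0.5, 1.5, -1.0])
valid = np.array([True, False, False, True, False])
probs = masked_softmax(logits, valid)
# Invalid actions get probability exactly 0; valid ones sum to 1.
```

With masking, the "what if it picks an invalid action" question never arises; the alternative designs are to make invalid choices no-ops or to penalize them in the reward, as the comment above speculates.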


The interview with Hugo Martin, the creative director of DOOM, is great - https://www.youtube.com/watch?v=LVLecokaRv4


I think GP meant the Valve "New Employee" Handbook


They've been in overtime before; in game 2, I think. AlphaGo spent about 30 seconds on each move.

