
> very little money in optimizing deep learning

Oh, there are a lot of people working on optimizing AI: hobbyists, academics, and corporations alike.

The thing is, if you come up with a neat optimization that saves 30% of compute for the same results, you typically don't reduce your compute budget by 30%; instead, you increase your model/data size by 30% and get better results.
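A rough back-of-the-envelope sketch of that trade-off in Python; the budget figure and the 30% saving are illustrative assumptions, not measurements:

    # Fixed training budget and a hypothetical optimization that
    # cuts compute per result by 30%.
    budget_flops = 1e23
    efficiency_gain = 0.30

    # Option A: pocket the savings and train the same model cheaper.
    cost_same_model = budget_flops * (1 - efficiency_gain)

    # Option B (what usually happens): keep spending the full budget,
    # which now buys more effective compute for a bigger model/dataset.
    effective_multiplier = 1 / (1 - efficiency_gain)

    print(f"Option A: {cost_same_model:.2e} FLOPs for the old result")
    print(f"Option B: {effective_multiplier:.2f}x effective compute")  # ~1.43x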



Jevons paradox, applied to data and AI: the more efficiently data is used, the more demand there is for data.


Any state-of-the-art model takes about three weeks to train.


More an indication of human patience than task difficulty.



