
This is a different proposition, really. It’s one thing to move up the layers of abstraction in code. It’s quite another thing to delegate authoring code altogether to a fallible statistical model.

The former puts you in command of more machinery, but the tools are dependable. The latter requires you to stay sharp at your current level, else you won’t be able to spot the problems.

Although… I would argue that in the former case you should learn assembly at least once, so that your computer doesn’t seem like a magic box.



> It’s quite another thing to delegate authoring code altogether to a fallible statistical model.

Isn't this what a compiler is really doing? JIT compilers optimize code based on heuristics, e.g. if a code path is considered hot. Sure, we might be able to annotate it, but by and large you let the tools figure it out so that you can focus on other things.


But the compiler’s heuristic optimization doesn’t change the effects of the code, does it? Admittedly I’m no compiler expert, but I’ve always been able to trust completely that my compiled code will function as written.




