
>> it seems like they were worried that the Mac was increasingly marginalized as a platform for AR/VR development

As well as ML development, which is a pain without a local CUDA-capable GPU. Yes, you can do it in the cloud, but sometimes it is easier to do ad-hoc work locally without having to spin up GPU machines and remember to spin them down.
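
A minimal sketch of that ad-hoc local workflow, assuming PyTorch (the torch.cuda device check is the standard API; the tiny model is just illustrative):

    import torch

    # On a Mac without an NVIDIA GPU this falls back to CPU;
    # the identical script runs unchanged on a CUDA machine.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model = torch.nn.Linear(128, 10).to(device)
    x = torch.randn(32, 128, device=device)
    print(model(x).shape)  # torch.Size([32, 10])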



Ding ding ding.

I use OS X for all software development except CUDA at the moment. This is huge.


Isn't lambda/cloud functions/serverless a suitable alternative for that pain point?



