
…and a depth sensor. Not sure why they cut it last minute


https://twitter.com/SadlyItsBradley/status/15806571095488225...

"RUMOR: Why Meta removed the Depth Sensor at the last minute

It allowed you to see people without clothes. That was not its purpose, but during testing someone noticed that the sensor could be used to develop "creeper apps"

Privacy chaos, so they preferred to skip the sensor"


I think that’s an excuse of convenience. It was likely just penny pinching imho.

They’re adding it to the Quest 3, and iPhones/iPads have had depth sensors for years now. It hasn’t been an issue for them.

They control the OS. They can control the drivers. It’s avoidable.


Lidar and depth sensing are the same


They are not the same. The Apple Vision Pro has both LiDAR and a depth camera.

https://lidarandradar.com/differences-between-the-lidar-syst...


Ah yeah, you’re right. Apologies for the mixup. That link isn’t great at explaining the difference either, though: it never covers Apple’s TrueDepth tech, and “depth sensor” is a generic term for many technologies, whereas TrueDepth is a specific one.

Here’s a better one https://www.eyerys.com/articles/how-apples-lidar-sensor-diff...

Basically, LiDAR measures time of flight and works at longer range. TrueDepth projects a high-density dot grid and analyzes how the pattern deforms (structured light).
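To make the time-of-flight idea concrete, here's a minimal sketch (the function name and the example timing are mine for illustration, not anything from Apple's or Meta's APIs): the sensor times a light pulse's round trip and halves the out-and-back path to get distance.

```python
# Illustrative only: how a time-of-flight sensor like LiDAR converts a
# measured round-trip pulse time into a distance estimate.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to target: the pulse travels out and back, so halve the path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after ~10 nanoseconds implies a target ~1.5 m away.
print(tof_distance(10e-9))
```

Structured light works the other way around: distance is inferred from geometric distortion of a known projected pattern rather than from timing, which is part of why it favors short range.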



