
>And yet, from the App Store’s point of view, you can build a game with guns and cartoon violence and happily ship it to kids, while tracking your own body needs a 16+ “mature themes” label.

This really isn't an Apple problem, but an American culture problem. This is such a common trope in many forms of media:

* You can sell games with gratuitous amounts of gore, but implied clothed intercourse gets you pulled from stores.

* You can get away with a lot of violence and possibly sneak a PG-13 rating, but a single boob gets you rated R.



Well, no, because Apple categorizes all of these things separately. The world is not subject to MPAA notions about "Sex" or "Violence" -- rather, Apple splits those up into "does this app have any sex?" and "does this app have any violence?"

The author has a problem because what he is selling is an app to track sexual activity in explicit detail, which is a huge privacy invasion, and Apple's normal screens are rather good at noticing that.

The author of this post is trying to sell an app that is not explicitly prohibited by Apple's guidelines, but it is offensive to pretty much anyone who looks at it.


> which is a huge privacy invasion, and Apple's normal screens are rather good at noticing that

Do you have any better evidence? Apple's App Store screens are notoriously unreliable; famously, LastPass had to tell people to stop downloading their app from the App Store after Apple's "normal screen" let a trojan-horse impersonation of their app through review: https://blog.lastpass.com/posts/warning-fraudulent-app-imper...



