Hacker News

It sounds like you're advocating for a world where ordinary people need to choose between:

- not knowing the weather this weekend

- knowing the weather and signing away their firstborn child in the T&C

- becoming an amateur lawyer and spending dozens of hours reading and comparing T&Cs between apps to choose which one to use (until they change the T&Cs again of course, which they'll do without notifying you)



>becoming an amateur lawyer and spending dozens of hours reading and comparing T&Cs between apps to choose which one to use (until they change the T&Cs again of course, which they'll do without notifying you)

This right here is the issue. There's a protocol for pushing these things out, but no reciprocal pipeline for feedback other than "clicked" to come back in. Clickwrapping should have been wholly dismissed as a valid medium for contracting. I'm willing to plant my flag and die on this hill. A contract regime wherein one side is the progenitor of all changes is not, in fact, a meeting of anything.

And yes, I'm drinking my own Kool-Aid at this point. It sucks being on the minimalist side of the software world, but I'm doing my damnedest to cut out every EULA possible, replacing each with something wherein I can be assured the world won't be turned over on me at a moment's notice at the behest of a bunch of greed-optimized psychopaths sitting on top of an infrastructure most of them would be powerless to keep running, short of the economic game of Mutually Assured Destruction the West calls its capitalist "free market" system (which is anything but, once you scratch beneath the surface).


How about not granting location permissions and typing in your location manually? Weather forecasts worked fine before phones with geolocation built in.


We're talking T&Cs here. How would typing in your location manually undo your agreement to (what a company would like to believe are legally binding) clickwrap T&Cs? Even if you deny individual permissions, apps will still slurp up your app list/hardware specs/any metadata they can get their grimy hands on, directly and indirectly through side channels. You're saying to give them 999 data points instead of 1,000, and you think that's a solution?


>Even if you deny individual permissions, apps will still slurp up your app list/hardware specs/any metadata they can get their grimy hands on

Is this a purely academic thought experiment or something that's happening in practice? I'm not exactly sure what the "999 data points" consist of. Given that basically nobody assembles their own phone, the most that hardware fingerprinting will reveal is "you have an iPhone 13", impossible to differentiate from all the other iPhone 13s floating around because they're all identical. Both Android and iOS have cracked down on software fingerprinting as well, so you can't, for instance, grab a list of all installed apps.


The crackdown was fairly recent, right? Do you think we should trust that both companies have at long last perfectly solved all privacy problems with this latest crackdown and now everything is perfect and we'll never have any privacy mistakes or side channel leaks ever again?

I don't know about iOS, but here's the situation on Android:

https://support.google.com/googleplay/android-developer/answ...

> The QUERY_ALL_PACKAGES permission only takes effect when your app targets Android API level 30 or later on devices running Android 11 or later.
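For reference, that gating lives in the app's manifest. A minimal sketch, assuming a hypothetical weather app (the package names are illustrative; `QUERY_ALL_PACKAGES` and the `<queries>` element are the real Android mechanisms the quoted policy refers to):

```xml
<!-- AndroidManifest.xml sketch; package names are illustrative. -->
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.example.weather">

    <!-- Broad visibility into every installed app. For apps targeting
         API level 30+, Play requires a declared justification for this. -->
    <uses-permission android:name="android.permission.QUERY_ALL_PACKAGES" />

    <!-- The narrower alternative: declare only the specific packages
         the app genuinely needs to see. -->
    <queries>
        <package android:name="com.example.partnerapp" />
    </queries>
</manifest>
```

As the quote notes, apps targeting API level 29 or lower aren't subject to the filtering at all, which is the gap the rest of this comment is pointing at.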

So I guess end users should just check which SDK level their weather app was compiled for! Simple, right?

And if the parking app was compiled for SDK level 29, people should just go find another parking lot with a more recent app?

You're suggesting technical solutions to social problems, and those rarely work out in the long term, especially with adversarial parties. Better to solve the problem at the source.


> You're suggesting technical solutions to social problems, and those rarely work out in the long term, especially with adversarial parties. Better to solve the problem at the source.

That, to me, was the big takeaway from Attack Surface by Cory Doctorow. The idea that you can't "out tech" the State[1]. Because even if you, as an individual, are in fact (smarter|more talented|more capable) than any individual employed by the State, they still have you out-resourced to a degree that makes your cleverness moot. And as a defender, you only have to make one mistake and it's game over.

If I get Cory's point right, it's to say something like "as technologists, we should use our skills in service of effecting meaningful change through the democratic process", as opposed to creating better tech for evading State surveillance[2].

[1]: I think here you could probably read "the State" as "the State AND/OR BigCorps".

[2]: That said, there's probably still at least some basis for doing both. But "effecting change through the democratic process" is probably the better long-term strategy.


>And if the parking app was compiled for SDK level 29, people should just go find another parking lot with a more recent app?

The Play Store has minimum target SDK level requirements, so you can't compile your app against an ancient SDK level to bypass all the restrictions. Moreover, your linked article suggests that even if you have an existing app that does this, the Play Store will eventually take down your app if you don't provide an explanation. This is consistent with some complaints posted on HN recently, e.g. https://news.ycombinator.com/item?id=41895718
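For context, the "SDK level an app was compiled for" is the `targetSdkVersion` declared in its build config, and it's that value, not the device's Android version alone, that the Play Store's moving minimum applies to. A sketch of the relevant Gradle fragment, with illustrative version numbers:

```groovy
// Module-level build.gradle sketch; version numbers are illustrative.
android {
    compileSdkVersion 30      // SDK the app is compiled against
    defaultConfig {
        minSdkVersion 21      // oldest Android release the app supports
        targetSdkVersion 30   // API level 30+ triggers package-visibility
                              // filtering; Play raises the minimum allowed
                              // target level over time
    }
}
```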


You completely ignored the substantive part of my post, so I'll restate without distractions.

1. Do you believe that with this latest round of updates, our benevolent corporate overlords Google and Apple (both advertising companies to some extent) have at long last fully solved privacy, plugging every possible information leak and fixing every possible software bug, both present and future?

2. If you do, then do you think it's desirable that we expect every participant in modern society to enter into one-sided, legally-binding contracts with companies they've never heard of with every small action they take on a daily basis, and then use complicated technical measures to avoid fulfilling their end of those contracts?


>You completely ignored the substantive part of my post, so I'll restate without distractions.

I ignored those parts because you're moving the goalposts way past my original comment[1], which only objected to the claim that people were somehow coerced into having their location sold because the apps doing the tracking were providing "basic necessities". Is the fact that you're using an iPhone, are visiting from an IP address that suggests you're in Kansas, and are using Verizon an "information leak"? I guess, by some definition. Is that anywhere close to getting your location tracked? Hardly.

[1] https://news.ycombinator.com/item?id=42117527


The reason Apple and Google continually patch and change their rules is because they have been playing a cat and mouse game with bad actors who, for decades, have continued to find ways to siphon personal data off devices despite the technical restrictions in place.

You seem to have an awful lot of confidence that "iPhone" and "Kansas" are the only pieces of data any app can get from a device.

So can we say that you agree with #1: after decades of playing cat and mouse with advertisers and spyware authors, these latest updates from Apple and Google are the magical updates that finally completely solved privacy once and for all, and there will never be any bugs or mistakes or security holes ever again?



