And before you suggest that those users might still be using it on desktop more than on mobile:
"Mobile advertising revenue represented approximately 91% of advertising revenue for the first quarter of 2018, up from approximately 85% of advertising revenue in the first quarter of 2017." - https://investor.fb.com/investor-news/press-release-details/...
That doesn't say whether they are using an app on mobile or just a mobile browser. The app is (was? I haven't used it in ages) a notorious energy-hog that will drain your battery.
Unfortunately I can't find numbers that break down mobile site vs. native app use.
Judging by how poorly maintained and how rarely updated the mobile site is, I think it's safe to assume that the vast majority of people are using the native apps, not the mobile site. Major features like the marketplace and video tabs are nowhere to be seen on the mobile site, for example. If the mobile site were getting the bulk of users, Facebook would surely prioritize it.
I don't think that's safe to assume at all. A lot of phone users (think mostly older people) still don't really know what an app store is. I don't have any numbers, but from seeing how 'ordinary' people use their phones, I wouldn't make any assumptions about preferring apps.
As for Facebook's priorities: they could also be neglecting the mobile site deliberately, with the aim of getting people to install and use their app. Again, I don't know if this is the case, but it's certainly plausible.
I think that might have been true in 2009. It's far from true today.
I just noticed this FB blog post: https://developers.facebook.com/ads/blog/post/2018/05/09/rel.... The chart at the top shows app vs. web use of mobile devices, not just for FB but overall. As I expected, mobile web is significantly smaller than native apps, from a time-spent standpoint.
Agreed. I haven't used the app in years. Instead I use mbasic.facebook.com. When I check the demographics Facebook puts me in for advertisers, I'm categorized as a "mobile user".
Did you uninstall that after using it? Otherwise that extension still has the ability to "read and change your data on all facebook.com sites." For all you know, it may have uploaded a copy of all your Facebook data to its servers already.
I deactivate the extension after use. I also looked at the console and source code, and the only external URLs the extension calls are Google services (Analytics and the Chrome Web Store). I have Analytics blocked in my HOSTS file anyway.
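For anyone unfamiliar with the technique: HOSTS-file blocking works by resolving a tracker's hostname to a null address before any DNS lookup happens, so requests to it simply fail. A minimal sketch (the two hostnames shown are common Google Analytics endpoints; your own blocklist may differ):

```
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
# Null-route Google Analytics so requests never leave the machine.
0.0.0.0 www.google-analytics.com
0.0.0.0 ssl.google-analytics.com
```

Note this blocks by hostname only; it won't catch analytics served from a first-party domain.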
If they didn't scan and detect child porn, there would be articles about how they're letting people get away with sharing child porn on Messenger. It seems there's no way for Facebook to win here, given that people want both complete privacy and also no illicit activity on the platform.
Thanks for that link - I can’t edit my above comment now to note that some scanning does happen.
I wonder how well it works, since the false-positive rate must be huge. The idea of someone looking at my account and playing abuse/not-abuse roulette is disturbing.
Well it's an automated system, so it's highly unlikely that someone is reading all (or any) of your messages. The volume of messages Facebook and Google process every day is astronomical, so no manual oversight process would scale. It's similar to how email spam filters have worked in a completely automated fashion for years. In this case, PhotoDNA works by comparing image hashes, so it probably has fewer false positives than spam filters.
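To make the hash-matching idea concrete, here is a deliberately simplified sketch of how fingerprint-based matching works in general. This is *not* the real PhotoDNA algorithm (which is proprietary and uses a robust perceptual hash, not a plain bit-fingerprint); the fingerprints, threshold, and function names below are all illustrative:

```python
# Simplified illustration of hash-based image matching: each known-bad
# image is reduced to a compact fingerprint, and an incoming image is
# flagged when its fingerprint is within a small distance of one on the
# blocklist. Real systems use perceptual hashes robust to resizing etc.

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

def is_match(candidate: int, blocklist: set, threshold: int = 5) -> bool:
    """Flag the candidate if any blocklisted hash is within the threshold."""
    return any(hamming_distance(candidate, known) <= threshold
               for known in blocklist)

blocklist = {0xDEADBEEF12345678}
print(is_match(0xDEADBEEF12345679, blocklist))  # near-duplicate -> True
print(is_match(0x0000000000000000, blocklist))  # unrelated -> False
```

Because matching is against a curated list of known fingerprints rather than a classifier guessing at content, the false-positive rate can be kept far lower than a spam filter's.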
But with a powerful search tool, one could go hunting for any 'type' of person they wanted based on social-association info, geolocation, or keywords. It's not benign or unwieldy just because it's large.
Well, there's a balance to be struck. Running PhotoDNA [0] on every image sent is a pretty good practice. Raising flags on any content that might break community guidelines is a completely different story. Two users might willingly want to break the code of conduct between them for whatever reason, and Facebook wants to be able to halt that. In contrast, there's no legal grey area around sharing child pornography. Using an automated tool just for that is great; extending it to the entire platform's guidelines is not.
> so the original memo was more of a put up or shut up piece than an RFC.
That couldn't be further from the truth. Facebook's internal communications happen almost exclusively through Facebook itself, meaning this "memo" was most likely a Facebook post, complete with liking, reacting, and commenting from anyone in the company.
Buzzfeed touched on this in their version of the story:
> One former employee who spoke with BuzzFeed News noted that they remembered the post and the blowback it received from some workers at the time. “It was one of [Bosworth’s] least popular and most controversial posts,” the ex-employee said. “There are people that are probably still not in his fan club because of his view.”
I think the issue is that Facebook hasn’t done anything to curb bullying, terrorism, fake news or electioneering. They allowed CA to abscond with user data and they took Russian money to target voters. If they can’t track their ads or comply with federal election laws, that’s a big problem.
So now you've given that extension full access to your Facebook account. And if you didn't uninstall it after using it, it's still able to see everything you do on Facebook. How is this any different from the Cambridge Analytica mess people were upset about in the first place?