They are selling a configuration that costs $810.00 on Framework's website for only $549.00. Zero actual info on the about page or Google. I would treat it with suspicion at best.
They're just selling the motherboard on its own, not a whole laptop. To make it a complete system, you'd have to buy a laptop chassis from Framework's parts site and install the motherboard yourself.
This is a primary source. Journalists can now read this information (low-quality ones will use AI summaries) and dramatize, cut it up, and cherry-pick as necessary to get their readers to care.
It's only been in the conservative press recently, with the terrorist connection tacked on. The same story ran in liberal/local news circles earlier in the year, as most of the convictions were being handed out.
I know 4 people who were laid off this year: 2 in the federal government (1 a contractor) and 2 at large corporations. Entirely anecdotal, but the data I see isn't good.
Cameras with good software work great for that; however, the data should NOT be freely accessible outside the city/jurisdiction they surveil. That's the issue with Flock versus any other AI camera/database product.
This is the first time I've heard about Google tech being insecure or not private. Sure, they siphon all the info THEMSELVES, but never have I heard about them implementing insecure protocols.
> but never have I heard about them implementing insecure protocols.
That's because they don't. Google takes security seriously. There's a reason GrapheneOS is currently supported only on Pixel devices: certain hardware security features.
Nothing you do with Google is private from Google but it's certainly designed to belong only to Google; your data is one of their most important assets. Of course they are going to secure it and prevent anyone besides themselves from getting or using it.
It's the most common misconception about Google: that they "sell your information." They don't, and they never have. They use your info, aggregated with that of all other Google users, to sell ad targeting. They don't sell the actual data.
> Nothing you do with Google is private from Google but it's certainly designed to belong only to Google
The same also goes for Apple, although Apple doesn't monetize your data as much, so they collect less. They'll suck up all kinds of data from your devices but will strictly protect that data from third-party applications any way they can. They're also willing to use that protection to prevent interoperability or integration with third-party devices.
The difference for me is in the business model, and the fact that Apple offers true E2E encryption for photos while Google doesn't. If Google ever made their own version of Advanced Data Protection for Pixel phones, it'd be a wash.
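For illustration, a minimal sketch of what that E2E model means for photo backup, using the Python cryptography library (hypothetical, not Apple's actual Advanced Data Protection implementation): the photo is encrypted on-device with a key only the user holds, so the provider stores ciphertext it cannot read.

    # Minimal E2E-style sketch (illustrative; not Apple's actual implementation).
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    user_key = Fernet.generate_key()  # stays on the user's devices, never uploaded
    cipher = Fernet(user_key)

    with open("photo.jpg", "rb") as f:        # any local photo
        ciphertext = cipher.encrypt(f.read())

    # upload(ciphertext)  -- the server can store the blob but never read the photo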
Apple does have more private defaults, though they will readily steer you towards "make backups encrypted with a key we also know in case you lose your password", which isn't much more private than Google's proposition.
When Google announced their AI hardware features, I was hoping they'd implement the same offline/encrypted photo indexing that iOS does, rather than shoving everything through the cloud. Unfortunately, Google Photos seems as bad as ever.
On the other hand, setting up automatic backups and photo sync to a self-hosted Immich/Photoprism instance is a lot easier on Android than on iOS in my experience, despite Google's reluctance to grant storage permissions to apps.
Google does actually have a kind of extended protection (https://developer.android.com/privacy-and-security/advanced-...), but it feeds more data to Google rather than less: it basically has you trust Google to protect you, by having Google pre-scan your browsing and lock down your account. If you're American, that may be worth it if you trust Google enough. It's a combination of Lockdown Mode and Advanced Protection Mode on iOS.
Yeah. They sell access to you, so an advertiser can tell Google "I want this shown to a mid-thirties tech worker living in SF who likes x, y, z and frequently travels to q" and Google will show you their ad.
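To make the distinction concrete, here's a hypothetical targeting spec (all field names made up): the advertiser describes an audience, Google matches it against data it keeps internally, and the advertiser only ever gets aggregate results back.

    # Hypothetical sketch of an ad-targeting request (made-up field names).
    campaign = {
        "ad_creative": "summer_sale.png",
        "audience": {                      # what the advertiser specifies...
            "age_range": (30, 39),
            "occupation": "tech",
            "location": "San Francisco",
            "interests": ["x", "y", "z"],
            "travels_to": ["q"],
        },
    }
    # Google matches users against this internally and serves the ad;
    # the advertiser sees impression/click counts, never the underlying users.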
On every Apple interoperability thread this argument comes up, and at this point I'm convinced it's part of some coordinated effort; surely no one can be so clueless as to actually believe this, especially on a technical forum?
AirDrop is a peer-to-peer protocol: both the recipient and the initiator need to explicitly take action, and even Apple's own implementation provides no authentication (the recipient device is chosen by name, which anyone can change in their settings app). There is no way the existence of this Android client could reduce AirDrop security on iOS.
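A quick sketch of why name-based selection isn't authentication (hypothetical wire format, not Apple's actual AWDL/AirDrop protocol): any device on the network can advertise any display name it wants.

    # Hypothetical discovery broadcast -- NOT the real AirDrop protocol.
    import socket

    def advertise(name: str, port: int = 50000) -> None:
        """Broadcast a self-chosen device name on the local network."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        # Nothing stops this payload from impersonating someone else's device.
        sock.sendto(f"DEVICE:{name}".encode(), ("255.255.255.255", port))
        sock.close()

    advertise("Alice's iPhone")  # an attacker can claim this name just as easily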
Do you also believe that TLS between an Apple device and a Windows device is insecure, since the Windows device uses a different, non-Apple-sanctioned TLS implementation whose mere existence would somehow weaken Apple's TLS stack?
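You can watch two completely independent TLS stacks negotiate a secure session yourself; the security comes from the protocol spec, not from both ends sharing one vendor's implementation.

    # Python's ssl module (OpenSSL underneath) talking to whatever TLS stack
    # the server happens to run -- interoperability by specification.
    import socket
    import ssl

    ctx = ssl.create_default_context()  # verifies certificates by default
    with socket.create_connection(("example.com", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
            print(tls.version(), tls.cipher())  # e.g. TLSv1.3 + negotiated suite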