However, AI-powered technology used by law enforcement has been proven to exacerbate racial biases.
This is a little misleading. Flock is primarily an ALPR that can identify make/model/color/identifying-feature of vehicles. It's not facial recognition. It doesn't itself have a racial component. The modal "proactive" Flock intervention (as opposed to investigative searches after crimes) is to flag a moving vehicle as stolen.
But in practice, the outcomes of deploying Flock are racialized, because the hot lists states keep of stolen vehicles aren't accurate enough for real-time enforcement, so recovered vehicles stay on the lists and false-positive. You're disproportionately likely to have a vehicle on a hot list if you live in a low-income neighborhood.
Even then: it's not clear how any of this is apposite to a Ring/Flock partnership. You can't use a Ring camera to do realtime ALPR flagging of cars. Presumably, this supports Flock's "single pane of glass" product; they just want police going to Flock for all their video needs. Police already canvass Ring and Nest cameras during investigations.
What's your point? Obviously we wouldn't allow something like this to be deployed in our community. But what does that have to do with Flock providing an interface to police departments for collecting Ring camera footage?
I don't think that's a good thing --- I think PDs should continue to manually canvass for footage and specifically ask residents for footage when they need it, I think that's actually the right public policy and we don't need to innovate past it. But good or bad, it has nothing to do with Flock's "AI".
The point is Flock intends to expand their product offerings far beyond license plate reading. It has to do with Flock's AI because all the features in all the products will converge on selling surveillance of civilians. All their products will feed data to all their other products, because centralized data makes their features more attractive to LEOs. The further embedded they are in an investigation, the more they maximize profits for shareholders. The math is pretty simple.
Also the thing not being mentioned anywhere in this thread that is in other reporting is that the Flock partnership is an interface for police to ask the Ring owner to share their footage with police. A thing that police can already do under the current system. It seems to just be a "single pane of glass" for both systems.
Right; it bugs me that TechCrunch is doing this "AI is racist" thing here. Like: I think a lot of naive AI is in fact racist. But AI has nothing --- that I can tell --- to do with this story. Do they know what they're talking about or not? I have an issue with this "jazz hands" stuff.
I feel the same way about this Flock stuff as I did about all the NSA stuff back in the Snowden days: like, directionally, I get it, I'm on the same page, but I know just enough to know when Glenn Greenwald is just making shit up, and I can't let that slide.