The Verge · Feb 16, 2026
Today, let’s talk about the camera company Ring, lost dogs, and the surveillance state.

You probably saw this ad during the Super Bowl a couple of weekends ago:

Since it aired for a massive audience at the Super Bowl, Ring’s Search Party commercial has become a lightning rod for controversy — it’s easy to see how the same technology that can find lost dogs can be used to find people, and then used to invade our privacy in all kinds of uncomfortable ways, by cops and regular people alike.

Ring in particular has always been proud of its cooperation with law enforcement. That raises big questions about our civil rights, especially since Ring announced a partnership last fall with a company called Flock Safety, whose systems have been accessed by ICE. There’s some complication to that — we’ll come back to it in a bit.

The backlash to Ring’s Super Bowl ad was swift, intense, and effective: the data company PeakMetrics says conversation about the ad on social platforms like X actually peaked two days after the Super Bowl, and the vibes, as they measured them, were strikingly negative. I mean, you know it’s bad when Matt Nelson, who runs the weratedogs account, is posting videos like this:

Sen. Ed Markey called the ad “dystopian” and said it was proof Amazon, which owns Ring, needed to cease all facial recognition technology on Ring doorbells. He said, “This definitely isn’t about dogs — it’s about mass surveillance.”

And then, on Thursday, February 12th, just four days after the Super Bowl, Ring announced it was canceling its partnership with Flock, in a statement first reported by The Verge’s Jen Tuohy. That statement itself is a lot:

Following a comprehensive review, we determined the planned Flock Safety integration would require significantly more time and resources than anticipated. As a result, we have made the joint decision to cancel the planned integration. The integration never launched, so no Ring customer videos were ever sent to Flock Safety.

The company also goes on to say that Ring cameras were used by police in identifying a school shooter at Brown University in December 2025. It’s an odd non sequitur in a press release about canceling a controversial partnership, and it really explains a lot about Ring and how the company sees itself.

As it happens, Ring’s founder Jamie Siminoff was just on Decoder a few months ago, talking about how and why he founded the company, and in detail about why he sees Ring’s mission as eliminating crime. Not selling cameras or doorbells, or floodlights, or anything else Ring makes, but getting rid of crime.

We actually talked about Search Party and how people might feel about that kind of surveillance, and how Ring works with the cops quite a bit. In fact, Jamie briefly left Ring in 2023, and the company slowed down on its work with law enforcement. But ever since he returned, the emphasis on crime and the work with police has only intensified. I asked him about it:

NILAY PATEL: You left, Amazon said we’re going to stop working with police, you came back, boy, Ring is going to work with police again. You have a partnership with Axon, which makes the taser, that allows law enforcement to get access to Ring footage. Did that feel like a two-way door? They made the wrong decision in your absence, and you came back and said, “We’re going to do this again”?

JAMIE SIMINOFF: I don’t know if it’s wrong or right, but I think different leadership does different things. I do believe that I spent a lot of time going on ride-alongs.
I spent a lot of time in areas that I’d say are not safe for those people, and I’ve seen a lot of things where I think we can positively impact them. So, we don’t work with police in the way of ... I just want to be careful, as we’re not ... What we do allow is for agencies to ask for footage when something happens. We allow our neighbors, which I’ll say in this point are our customers, just to be clear, we allow our customers to anonymously decide whether or not they want to partake in that.

So, if they decide they don’t want to be part of this network and don’t want to help this public service agency that asks them, they just say no. If they decide that they do want to, which, by the way, a lot of people want to increase the security of their neighborhoods. A lot of people want their kids to grow up in safer neighborhoods, a lot of people want to have the tools to do that, and are in places that are dangerous. We give them the ability to say yes and make it more efficient for them to communicate with those public service agencies, and also do it in a very auditable digital format.

That’s the other side. Today, without these tools, if a police officer wanted to go and get footage from something, they’d have to go and knock on the door and ask you, and that’s not comfortable for anyone. There’s no digital audit trail of it, and, with this, they can do it efficiently with an audit trail. It is very clear, and it’s anonymous.

Jamie actually talked a lot about searching for dogs in this context, because one of the reasons he was so excited to come back to Ring was to use AI to search through the massive amounts of video generated by Ring cameras. In fact, he told me that Ring could not have built Search Party five years ago, because AI systems to do it weren’t available.

Jamie is nothing if not direct about this, which I appreciate. The man really thinks you can use AI and cameras to reduce or even eliminate crime. But I had a lot of questions about this:

JAMIE SIMINOFF: But when you put AI into it, now, all of a sudden, you have this human element that AI gives you. I think, with our products in neighborhoods and, again, you have to be a little bit specific to it, I do see a path where we can actually start to take down crime in a neighborhood to call it close to zero. And I even said, there are some crimes that you can’t stop, of course.

NILAY PATEL: Mechanically, walk people through what you mean. You put enough Ring products in a neighborhood, and then AI does what to them that helps you get closer to the mission of zeroing out crime?

So, the mental model, or how I look at it, is that AI allows us to have ... If you had a neighborhood where you had unlimited resources, so every house had security guards and those security guards were people that worked the same house for 10 years or 20 years, and I mean that from a knowledge perspective. So, the knowledge they had of that house was extreme; they knew everything about you and that residence and your family, how you lived, the people that came in and out.

And then, if that neighborhood had an HOA with, call it private security, and those private security were also around and knew everything, what would happen? When a dog gets lost, you’d be like, “Oh, my gosh, my dog is lost.” Well, they would call each other, and one of them would find the dog very quickly.
So, how do we change that and bring that into the digital world is—

Can I just ask you a question about that neighborhood specifically?

Sure.

Do you ever stop and consider that that neighborhood might suck? Just the idea that every house on my street would have all-knowing private security guards, and I would have an HOA, and that HOA would have a private security force. You can easily paint that as dystopia. Everyone’s so afraid that we have private cops on every corner, and I’m paying HOA fees, which is just a nightmare of its own.

So, I would assume you live in a safe neighborhood.

I hope so, yeah.

No, today, I’d go to ... If you want, I’ll take you to a place where people live and have to, when they get home from school, lock their doors and stay in their house, and they can’t go out and—

But I’m just saying that that model is “everybody is so afraid that they have private cops.”

I think the model is that doing crime in a neighborhood like that is not profitable, and I think that you want people to move into another job. I don’t think that crime is a good thing and so I think ... But listen, it certainly is an argument to have, I do believe that ... I think safer neighborhoods allow for kids to grow up in a better environment and I think that allows them to be able to focus on the things that matter and so that’s what we’re going for.

I just wanted to challenge the premise.

I think it’s a fair challenge.

The model is that there are cops everywhere. That level of privacy.

Yeah, it’s not cops. I think it’s more that you’ll have the ability to understand what’s happening. It’s not like ... But yeah, I think, listen, it’s a fair statement, I guess. I think I want to live in a safe place.

There’s a lot of intelligence in your neighborhood, and maybe it’s private security, maybe it’s not. What does the AI do? Does it just make the camera smarter? It lets you do a more intelligent assessment of what the cameras are seeing?

Right now, we just say motion detection, motion detection, motion detection. It’s funny, when I started Ring… The book was fun because I got to go back and actually go through this whole story of how this thing came to be, and motion detection was an amazing invention. You’re in the airport, and there’s a motion at your front door, and you look at it like, “Wow, this is crazy.”

Now, with AI, we shouldn’t be telling you about motion detection; we should be telling you what’s there, when you should look at it, when it matters, and we shouldn’t be bothering you all the time. That’s what I mean by this idea of these security guards at your house or in your neighborhood. There should be this intelligence in your neighborhood that can tell you when you should be trying to be part of something, but not always tell you. So, it’s not just like, “Car, car, dog, person, person.” It’s like, “Hey, look at this. You want to pay attention to this right now.”

I really pressed Jamie on this because I still don’t think it is entirely clear how Ring accomplishes the elimination of crime through AI alone. And it’s why people don’t trust the company when it says it won’t use systems that can f