Is AI facial recognition software, like Clearview AI, reliable enough to be used as evidence in a court of law? Glenn, who is against Clearview, has a friendly debate with Ohio Attorney General Dave Yost, who wants to expand Clearview’s use in court cases. So, how do we balance the good that Clearview can do and the bad that it is capable of doing in the wrong hands (for instance, a totalitarian government)? AG Yost gives his thoughts and also previews how he hopes the Supreme Court will rule on this.
Transcript
Below is a rush transcript that may contain errors
GLENN: I am thrilled to have the Ohio Attorney General Dave Yost on with us. He served as the auditor of Ohio for a long time. Like, eight years.
And then he became attorney general. I think he won it with more votes than anybody else in the history of Ohio. And he is defending and fighting for something called Clearview. Now, I like Dave.
But I'm against Clearview. And maybe he knows something that I don't know. So I want to have a conversation with him about what is happening in Ohio, and what's being heard now in the courts.
Dave, welcome to the program.
DAVE: It's good to hear your voice.
GLENN: Thank you, sir. By the way, thanks for everything you've done. You're really making a difference.
DAVE: You're very kind, thank you.
GLENN: Talk to me about the case now of Clearview, which is AI facial recognition. It is a great tool for law enforcement, but it frightens me a great deal. Talk to me about the case.
DAVE: Sure. So let's -- let's start with the facts of the crime.
A fellow's walking down the street. Minding his own business. Mind you, this guy has no criminal background. He's just -- he's a good guy.
Pays his taxes. Goes to work. He's walking down the street on February 14th, Valentine's Day. Day of love. And the bad guy, I'm not going to use his name, comes up behind him, robs him on the street, shoots him twice in the back, and runs off. Now, surveillance cameras that are just on the street see him going into a particular apartment. Well, fast forward a week. Police are doing their investigation, trying to figure out what happened.
And as a -- he goes to a convenience store, and the surveillance camera there, over at the cash register, picks up his face.
And he goes back, same kind of route, to the same apartment.
And so they go, hmm.
Wonder who lives there.
And they run the probation website, or the parole website from the Department of Corrections.
Lo and behold, then they run that guy against the -- excuse me, they grab a facial freeze frame off of the convenience store footage, run it through Clearview AI, and it's a match.
So they say, a-ha! They go in, and get a search warrant from the judge. During the search, they come up with the gun.
The murder weapon.
And so they arrest the guy. They've got a pretty good case at that point.
That guy goes to court and complains. And says, hey, that facial recognition stuff is not reliable. They say right on there that you can't rely on it.
And don't use it.
GLENN: Right.
DAVE: And the judge tosses the results of the search. Which means this guy is going to walk if we don't have the murder weapon as evidence.
GLENN: Right. And he tosses that because the Clearview evidence is what got you the warrant. So everything is fruit of the poisonous tree. Correct?
DAVE: Well, that's what the argument is.
GLENN: Right. Right. Right. Right.
DAVE: But the law says there's a long-standing, decades-long good faith exception. And you're only supposed to toss evidence as fruit of the poisonous tree if there's no other -- if there's bad faith, and there's no other option.
There was no bad faith here.
And, in fact, there's other useful, probative evidence, including seeing the guy go into that apartment, supporting probable cause for the search warrant.
GLENN: It also reminds me a little bit of "if the glove doesn't fit, you must acquit."
The use of DNA evidence during the O.J. Simpson trial. Everybody said the same thing. That's unreliable. We don't even know what that is. Could be one out of every 100 have the same kind of -- they said all kinds of crazy things.
And so that was tossed out, because people didn't understand how accurate that was. So pragmatically, I don't -- I don't disagree with you at all. This is a great thing to get the bad guys.
However, Clearview -- what they have done is they have scraped billions of images off of the internet, without anybody's consent.
And I believe it's very, very accurate. And the argument would be, well, I'm not doing anything bad or wrong. So I don't have to worry about it.
But I don't -- you know, in a time where we're headed for AI the way we are, and with what's happening in China,
this is exactly the kind of technology that governments use to track everybody.
How do you balance that, in the crazy world that we live in, to make sure it doesn't become a tool like it is in China?
DAVE: Well, you know, Glenn, I worry about that too. And I think that the solution is the regulation of the use of the thing.
For example, here in Ohio, we do not permit the use of -- of facial recognition, without anything more, to support an arrest warrant. It can only be used as a lead.
Then you have to go out and do the shoe leather, to prove that the guy you think it is, is the guy you're looking for.
GLENN: Which is what you did.
You used that. And you didn't arrest him because you had the AI. You arrested him because that got you a warrant, you got in, you found the gun. Right?
DAVE: Well, it was a search warrant that got us --
GLENN: Right. Right. What I mean is, the way you're saying you want it to be used is exactly how you used it. You didn't go get the guy because he was on Clearview.
DAVE: Exactly right. And here's the rubric. I know you're a fact guy. But you love -- you love, how do we think about this?
We have public spaces everywhere. So a cop can stand on the corner and observe all day long.
He can sit there for an eight-hour shift and just watch. And anything he sees is fair game. He's allowed to act right there. And that's not improper surveillance, because it's a public place.
When does it stop being a public place, or proper?
When it becomes a private place. If it's your home. If it's, in some circumstances, your business.
You have to have probable cause, get a judge to sign off on that. I think when we're talking about these technological things, the question is: What is the government allowed to do with it? And did it occur in public or in private?
When we're talking about Facebook, you know, I'm sorry. It's electronic. But that's kind of a public place.
That's more like the cop standing on the street corner. On the other hand, take the cop standing outside.
We just had a Supreme Court case about this a couple years ago. A cop standing on the street, but using sensitive thermal imaging to look for marijuana grows. They're looking at what's going on inside your private residence. That means that's a Fourth Amendment violation.
So I think that this principle of public versus private spheres goes a long way to helping us think through this.
GLENN: Yeah, it does.
So, Dave, I want you to know -- I mean, I hope this hasn't felt like a hostile interview.
I want you to know, I'm a fan of yours. But I'm very, very concerned about this slippery, almost straight-down slope to the cage that AI could build for people.
And we could have all of the best intentions, but it could fall into the wrong hands.
You know, we lose several elections in a row.
And, you know, it could be -- it will be a prison. It will be a panopticon, like it is in China.
And so that's why I'm concerned about it.
So this is in front of the Supreme Court. Oral arguments haven't happened yet.
DAVE: Nope.
GLENN: How do you think this is -- how will the court look at this? And what do you think will happen?
DAVE: Well, it's a case of first impression, right?
I mean, we haven't had a lot of cases challenging the intersection of the Fourth Amendment, protecting our privacy in our homes and papers, and this new technology. So we're arguing for a narrow reading of it.
But that it should be -- it should be an available tool.
GLENN: Right.
DAVE: To your point earlier, I couldn't agree with you more. It scares me, what government can do about this.
If you think about -- back to the Biden administration, and social media, and what they were doing.
GLENN: Uh-huh.
DAVE: Multiply that. Make that geometrically larger. That's the potential. We've got to be vigilant.
GLENN: What is the difference between this and, for instance, in Texas, you can't clock me speeding with a camera?
A cop has to be there to stop me. And even though they can take a picture of me driving the car, et cetera, et cetera,
they cannot ticket me for speeding. It has to be a physical police officer.
What is the difference between this, do you think?
DAVE: Well, and that's a great -- that's the same law we have in Ohio. And that's a great example of how the government can restrain technology. To prevent it from going too far.
That's not a constitutional issue. That's a statute that the General Assembly passed. And said, we're not going to let you do this. Yes, you've got the technology. We're not going to let you do this. That's just too far.
GLENN: Okay. Dave, I mean, I appreciate that at least you and others are thinking deeply about this because we're on the verge of a whole Brave New World.
And I honestly don't know what the right answer is. I mean, the law -- you know, the law-abiding citizen in me is like, the guy clearly -- you've got the gun in his house. He clearly did it.
But the person in me that is concerned about this new technology, and things like China --
I just don't know how to balance it yet. But I appreciate the conversation. Thank you so much.
DAVE: Thanks for having me on.
GLENN: You bet. That's Dave Yost. He's the Ohio Attorney General, and that is happening in Ohio right now.