Tartle Best Data Marketplace
June 30, 2021

Facial Recognition and Biometric Technology - Wrongly Accusing People With an Algorithm

Facial Recognition and Biometric Technology
BY: TARTLE

Facial Recognition and Consent

Facial recognition is quickly becoming a common tool in many aspects of life. It’s being used in stores to recognize customers as soon as they walk through the door. This can then feed back into Facebook and other social media in order to send you ads for the store. That of course gets fed into other algorithms so that you will be sent ads for similar stores. 

Another increasingly common use of facial recognition software is in device security. Phones, tablets, and PCs are now often unlocked by scanning the user’s face. If you don’t think your face is being stored by Google, Apple, and others for some undisclosed purpose, then I have some swampland on Tatooine I’d like to sell you. 

Then of course there is the security use of this software. You may have noticed cameras popping up here and there in a city near you. They’ve been in some places like Washington D.C. and London for years. These cameras constantly scan and record activity. Initially, this would have simply been to record any criminal activity so that the perpetrators could be swiftly apprehended. However, with facial recognition, they are constantly scanning faces in the crowd, looking for criminals. 

You might ask why that’s wrong. After all, don’t we want criminals apprehended? Of course we do. However, it should not come at the price of being treated as a criminal without having actually done anything. How many people were asked if they wanted cameras everywhere recording their every movement?

Come to think of it, how many people were asked if they wanted any of these new developments? Okay, when it comes to screen unlocking, it’s fair to say that people agree to it when they buy the device and select their security preferences. But the rest of it? How many of us really want to be fed a bunch of ads just because we walked into the local Gap? Or even be bothered with a pop-up asking us to opt in or out? And why does anyone think we would all like to be scanned to see whether or not we are wanted for any crimes? How hard would it be for that kind of technology to be used to locate not just criminals but people the state does not approve of? Perhaps the most important question is how we find ourselves in a situation in which we even have to worry about the misapplication of this kind of technology.

There are too many reasons to explore here. However, one of the big ones is the simple fact that we have a hard time not doing something once we realize that we can, or even that we might be able to, do a certain thing. Or to paraphrase Dr. Malcolm in Jurassic Park, we are often so concerned with whether or not we can that we never stop to wonder if we should. We develop a new technology and, before we’ve even stopped to consider the implications, we are rushing ahead with new applications. Just think of nuclear technology. It has enormous potential for providing energy to the world but was first turned into a bomb. That tendency to leap before we look also manifests itself in the form of various justifications for whatever we are doing. For example, certain ‘ethicists’ openly wonder whether consent is really necessary if people are being spied on without knowing about it: ‘If they don’t know, does it really matter?’ The fact that this question can even be asked and taken seriously by some should be deeply concerning to all. How many violations of liberties, how many crimes and injustices can be justified with exactly that same ‘reasoning’? 

How do we stop this? How do we fight this tendency of human nature without becoming Luddites? By remembering that we are all individual human beings, full of dignity and worthy of respect as unique creations. If something is going to happen to us, even something innocuous, we had better have a say in it. Only by treating each other in this way, with true respect, can we hope to preserve any kind of society that respects individuals and their choices.

What’s your data worth? Sign up and join the TARTLE Marketplace with this link here.

Feature Image Credit: Envato Elements

For those who are hard of hearing – the episode transcript can be read below:

TRANSCRIPT

Speaker 1 (00:07):

Welcome to TARTLE Cast, with your hosts, Alexander McCaig and Jason Rigby. Where humanity steps into the future, and source data defines the path.

Alexander McCaig (00:26):

Have you...

Jason Rigby (00:26):

Have you ever seen the rain?

Alexander McCaig (00:30):

Have you ever looked in the mirror? Are you good?

Jason Rigby (00:33):

We got this... Yeah. No, the smoke just hit... Right when I sing like that, I inhaled a bunch. And then what is this wood?

Alexander McCaig (00:39):

Palo Santo.

Jason Rigby (00:40):

Yeah. Palo Santo fumes just shot right into my mouth-

Alexander McCaig (00:44):

That's good.

Jason Rigby (00:44):

... and my nose.

Alexander McCaig (00:45):

Cleanse... It's-

Jason Rigby (00:45):

It's cleansing my-

Alexander McCaig (00:46):

... burning the demons out of you.

Jason Rigby (00:49):

... Burn the demons.

Alexander McCaig (00:49):

You ever looked in-

Jason Rigby (00:50):

You end up losing control.

Alexander McCaig (00:52):

... Lose control. You ever looked into a mirror and not recognized yourself?

Jason Rigby (00:56):

I've scared myself a couple of times.

Alexander McCaig (00:57):

Okay. Interesting.

Jason Rigby (00:58):

Yeah. Just because I'm like, "What was that?" And I look and it was the mirror [crosstalk 00:01:02].

Alexander McCaig (01:02):

When I look at you, I get scared sometimes.

Jason Rigby (01:03):

Yeah. Everybody says that, like I have this... Not the bitch resting face, but I have this mean face. But I think I just am serious. Like when I'm trying to pay attention, because I have this, supposedly, ADD or whatever?

Alexander McCaig (01:14):

Mm-hmm (affirmative).

Jason Rigby (01:15):

So I'll get real serious. You'll see me either look to the side over here.

Alexander McCaig (01:19):

The eyebrows come...

Jason Rigby (01:20):

Yeah. And I'm like, I'm really trying to pay attention.

Alexander McCaig (01:22):

Then you get those two lines right above the nose.

Jason Rigby (01:24):

Mm-hmm (affirmative).

Alexander McCaig (01:24):

Yeah.

Jason Rigby (01:24):

So I look mean, I guess?

Alexander McCaig (01:26):

No, you don't scare me.

Jason Rigby (01:28):

I'm a big teddy bear.

Alexander McCaig (01:29):

Yeah. You are a big teddy bear.

Jason Rigby (01:31):

That's it.

Alexander McCaig (01:31):

A big tattooed teddy bear.

Jason Rigby (01:32):

Yeah.

Alexander McCaig (01:33):

Can we talk about-

Jason Rigby (01:34):

I don't want to harm anything.

Alexander McCaig (01:35):

... Can we not lose face here by going off on a rant?

Jason Rigby (01:37):

Not even a gnat, bro. Not even a gnat.

Alexander McCaig (01:41):

We're losing face with our audience; their audio recognition algorithm is dropping away.

Jason Rigby (01:46):

How do we lose control of our faces?

Alexander McCaig (01:49):

That's a great question. You know how we lost control? The moment we stopped asking for consent. That's it. That's what it boils down to. Can you give me a little history here on when facial recognition started in 1970-something with mugshots?

Jason Rigby (02:01):

Yeah. In 1964, mathematician-

Alexander McCaig (02:04):

Oh crap, '64.

Jason Rigby (02:04):

... and computer scientist, Woodrow Bledsoe, first attempted the task of matching suspects' faces to mugshots.

Alexander McCaig (02:08):

Any relation to Drew Bledsoe.

Jason Rigby (02:10):

Ooh, that would be interesting.

Alexander McCaig (02:10):

Isn't that a football player?

Jason Rigby (02:11):

Yeah, he is. Yeah, he's a famous football player.

Alexander McCaig (02:13):

I don't like football, I just hear people say this stuff.

Jason Rigby (02:14):

Yeah, a quarterback. He measured out the distance between different facial features in printed photographs and fed them into a computer program. This is 1964, bro.

Alexander McCaig (02:24):

Mm-hmm (affirmative).

Jason Rigby (02:24):

His rudimentary successes would set off decades of research into teaching machines to recognize human faces.

Alexander McCaig (02:30):

Geometry's geometry, right? He had set rules on how geometry works.

Jason Rigby (02:33):

Right.

Alexander McCaig (02:34):

Okay. So, if I'm looking at the geometry of faces: oh, that's an isosceles. This eye to here, to the tip of the nose.

Jason Rigby (02:40):

Right. Right.

Alexander McCaig (02:40):

I mean like, there you go. It's really not that complicated. The complicated part is what happens when you get a crap image, or it's fuzzy, or there's too much shadowing on a face. What do you do with partial geometry?
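A rough sketch, in Python, of what this kind of geometric matching can look like: turn the pairwise distances between a few landmarks into a size-normalized feature vector, then compare it against a gallery of known faces, using only the landmark pairs visible in both images when the geometry is partial. The landmark names and the threshold are invented for illustration; they are not from Bledsoe's system or anything else discussed in this episode.

```python
from itertools import combinations
from math import dist, sqrt

# Hypothetical landmarks, given as (x, y) pixel coordinates.
# Real systems use dozens of points; three are enough to show the idea.
LANDMARKS = ("left_eye", "right_eye", "nose_tip")

def geometry_vector(points):
    """Map each landmark pair to its distance, scaled by the largest
    distance so the result is roughly invariant to photo size.
    Pairs with a missing landmark are simply absent (partial geometry)."""
    raw = {
        (a, b): dist(points[a], points[b])
        for a, b in combinations(LANDMARKS, 2)
        if a in points and b in points
    }
    scale = max(raw.values(), default=1.0)
    return {pair: d / scale for pair, d in raw.items()}

def match(probe, gallery, threshold=0.05):
    """Return the gallery identity whose geometry is closest to the probe,
    comparing only landmark pairs present in both images."""
    probe_vec = geometry_vector(probe)
    best_name, best_score = None, float("inf")
    for name, points in gallery.items():
        vec = geometry_vector(points)
        common = probe_vec.keys() & vec.keys()
        if not common:
            continue  # no shared geometry to compare
        score = sqrt(sum((probe_vec[p] - vec[p]) ** 2 for p in common) / len(common))
        if score < best_score:
            best_name, best_score = name, score
    return best_name if best_score < threshold else None
```

A fuzzy or shadowed photo shows up here as missing landmarks: fewer shared pairs means less evidence, which is exactly where a fixed threshold starts producing false matches.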

Jason Rigby (02:55):

Well, I think we need to look into the philosophy of surveilliance too.

Alexander McCaig (02:56):

Surveillance?

Jason Rigby (02:57):

Surveillance, yeah.

Alexander McCaig (02:59):

Surveillance. Surveillance.

Jason Rigby (03:00):

Surveillance. Surveillance, there we go.

Alexander McCaig (03:00):

My name is Sir Veillance.

Jason Rigby (03:05):

Yeah. But when you think of this against our will, because we're walking down a street and then a camera follows us because an AI recognizes my face to be something that is of an interest. Now, now we start to get into this polluted cesspool of human rights, which is number three.

Alexander McCaig (03:28):

I know Amnesty International was trying to talk about this with the issues in New York City. But the key driving thing here is, you're not... There's no consent. This is how I look at this. If someone's a criminal, they've broken the law.

Jason Rigby (03:44):

Right.

Alexander McCaig (03:44):

When you break a law, you lose freedoms.

Jason Rigby (03:45):

Right.

Alexander McCaig (03:47):

No consent's happening there at that point, when I look at this. You can scan for people that have, essentially, given you consent by breaking the law.

Jason Rigby (03:58):

Why don't you give a clear explanation on the city council, and then you ask the citizens through-

Alexander McCaig (04:03):

Why can't the citizens ask that this is what they want?

Jason Rigby (04:04):

... That's why I hate some of the states, like this state especially. It drives me nuts. Washington State has this, and that's how they've legalized-

Alexander McCaig (04:10):

You don't hate it, it's just illogical.

Jason Rigby (04:10):

... a lot of drugs. Yeah, yeah, yeah, yeah.

Alexander McCaig (04:12):

Thank you.

Jason Rigby (04:13):

But I'm trying to be overdramatized on this. But this whole initiative process. And then allowing the people to bring in. If they get enough signatures. It's like Gavin Newsom, the Governor of California, they have enough signatures to kick him out because he's done such a horrible job in leadership.

Alexander McCaig (04:31):

I wouldn't know. [crosstalk 00:04:33].

Jason Rigby (04:33):

With California and people are moving out in droves, and taxes are going up, and homelessness. I mean, there's just so many things that are happening in that beautiful... I have to give it to them, California-

Alexander McCaig (04:44):

California is-

Jason Rigby (04:45):

... is probably the most beautiful state in-

Alexander McCaig (04:47):

It's huge too.

Jason Rigby (04:47):

... in the country.

Alexander McCaig (04:48):

The whole coast.

Jason Rigby (04:49):

I mean, it has everything.

Alexander McCaig (04:50):

Yeah.

Jason Rigby (04:51):

So, whenever you look at the right of the people, and then we look at number three in our big seven human rights at TARTLE.

Alexander McCaig (05:00):

Well, you have a right to choice?

Jason Rigby (05:02):

Yes.

Alexander McCaig (05:02):

Okay. If I want to step out into public to show my face, I made that choice, did I not? If someone's going to install some sort of security protocol, can we all decide on it? I don't want a police state.

Jason Rigby (05:19):

Yes.

Alexander McCaig (05:20):

All right? Where people are just putting more stuff in here for that-

Jason Rigby (05:23):

Well, the responsibility does not lie in that government that you elected to decide what they're going to do with your will.

Alexander McCaig (05:30):

... No, their responsibility is beholden to the people who live within that state to decide what they want for their own security. So the second you step into this world of big data and mass analysis of facial geometries without asking for consent, that's a problem.

Jason Rigby (05:51):

Right.

Alexander McCaig (05:53):

You're not respecting a human being at that point. You just think you can do whatever. Just because the technology's there and a camera's up doesn't mean you should be using it because it's there. Do you see what I'm saying?

Jason Rigby (06:05):

Yeah. And I think this Deborah Raji, she was a fellow at nonprofit Mozilla.

Alexander McCaig (06:11):

Yeah, Mozilla. Yeah, Firefox.

Jason Rigby (06:11):

Yeah, which we know. Yeah. This is very, very interesting. She advised members of Congress on algorithmic accountability.

Alexander McCaig (06:18):

Mm-hmm (affirmative).

Jason Rigby (06:19):

She looked at over 130 facial recognition datasets.

Alexander McCaig (06:23):

And?

Jason Rigby (06:23):

Compiled over 43 years. So she's looking at good data.

Alexander McCaig (06:27):

She's looking at a lot of data.

Jason Rigby (06:29):

Yeah. They found that researchers, driven by the exploding data requirements of deep learning, gradually abandoned asking for people's consent.

Alexander McCaig (06:37):

Yeah. Because it becomes-

Jason Rigby (06:37):

You see how they used the word 'gradually'? I like this.

Alexander McCaig (06:40):

You know what? They... What's that, what's that idea?

Jason Rigby (06:45):

Cooking the frog by slowly turning up the heat?

Alexander McCaig (06:47):

Yeah. Or you stepped over the line?

Jason Rigby (06:48):

Yes.

Alexander McCaig (06:49):

But the thing is, you've taken so many steps and you thought they were small steps. And you turn around and you're like, "I'm a mile away from any sort of moral compass I should've been on." You know what I mean?

Jason Rigby (06:58):

Yeah. Be on the ocean and try that, see if your calculations are wrong. See if you hit that island.

Alexander McCaig (07:03):

Yeah.

Jason Rigby (07:03):

"Oh, I thought I was..." Where was Christopher Columbus? Where did he think he was at when he discovered?

Alexander McCaig (07:07):

Oh, I know. Yeah.

Jason Rigby (07:08):

He just thought he was somewhere way out.

Alexander McCaig (07:10):

Like some middle islands, but really he was... It doesn't matter. I don't know my history that well, I guess. But here's the point. You've drifted away so far. Then you say, "Ah, what if we just go a little bit further, we don't have to ask. We're cutting edge. Can we re-tweak this a little bit? You think we can pull a little bit more data in? If the government and the people don't know about it, is it really morally wrong for us to do it or use it?" This is a good question. If people have no concept of it at all.

Jason Rigby (07:37):

Right.

Alexander McCaig (07:37):

Completely unaware. Does it still mean you should be doing it? I've always thought to myself as I've developed in my ego, consciousness, all these things, trying to be a good human being. It's not what you do when people are looking, it's what you do when people aren't looking.

Jason Rigby (07:51):

Yeah.

Alexander McCaig (07:52):

I think that defines the quality of who I am as a human being, at least one aspect of it.

Jason Rigby (07:56):

Yeah. I mean, look at our president and vice president that we have now that we elected. And we see all these criminal heavy-duty, because they were more like police state type. I mean, like [inaudible 00:08:07] we're neither Republican, or Democrat. We're not any of that. But when we see bullshit, we call it out. But they had made all of these where they incarcerated a lot of African-Americans, which 90% of our population-

Alexander McCaig (08:18):

Why? Why? Why are they doing this? Why?

Jason Rigby (08:18):

... Off of bullshit. War on drugs, marijuana and stuff like this.

Alexander McCaig (08:21):

Oh, yeah. Yeah. Yeah.

Jason Rigby (08:23):

And so now you've got a guy that committed rape and he goes in for 18 years, which has taken the free will of a woman. And then you have a guy that's in because he had a kilo of coke, he's in for 50 years. Doesn't make sense. So let me ask you this.

Alexander McCaig (08:36):

Something does not weigh right here.

Jason Rigby (08:37):

Here. I'm going somewhere. Now you take these deep learning machines and you throw in all the pictures of all the bullshit-

Alexander McCaig (08:47):

How many years of research?

Jason Rigby (08:47):

... people that are incarcerated. So now 90% of the people that are incarcerated are African-American males. So now you just turn the machine into a racist, sexist...

Alexander McCaig (08:56):

Yeah, because you're feeding the algorithm with bad data, because it's bad data because it's biased.

Jason Rigby (09:01):

Yes. There we go.

Alexander McCaig (09:02):

You're front-loading these machines. And how many years of data did she go through?

Jason Rigby (09:06):

43 years.

Alexander McCaig (09:06):

She went through 43 years of data that didn't use any consent and it's only built out biases in the system. So they're very efficient at being biased.
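A toy illustration of that front-loading problem, using invented numbers rather than anything from Raji's study: compare each group's share of a mugshot-derived training set with its share of the population the model will actually be pointed at.

```python
from collections import Counter

def representation_skew(training_labels, population_shares):
    """Compare each group's share of the training data with its share of
    the deployment population. A ratio far from 1.0 means the model will
    see far more (or far fewer) examples of that group than it will meet
    in the real world."""
    counts = Counter(training_labels)
    total = sum(counts.values())
    return {
        group: (counts[group] / total) / share
        for group, share in population_shares.items()
    }

# Invented numbers purely for illustration: a training set built from
# arrest records that over-samples one group relative to the population.
training = ["group_a"] * 900 + ["group_b"] * 100
print(representation_skew(training, {"group_a": 0.13, "group_b": 0.87}))
# group_a appears at roughly 7x its population share, group_b at roughly 0.1x.
```

A model trained on that mix gets very good at whatever the over-sampled group looks like and very unreliable on everyone else, which is the "efficient at being biased" point.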

Jason Rigby (09:14):

Right.

Alexander McCaig (09:15):

Okay? And they're very good at not asking people for permission. It's really-

Jason Rigby (09:21):

Well, she said this. She says, "Now we don't care anymore. And all of that has been abandoned." She says, "You just can't keep track of a million faces. After a certain point, you can't even pretend that you have control."

Alexander McCaig (09:34):

... I have trouble keeping track of my two dogs running around.

Jason Rigby (09:37):

Yes.

Alexander McCaig (09:38):

You'd think I'd have trouble remembering 500 faces. I think they've tested limits on how many faces somebody can recognize or remember. I don't think it's that much. What are you going to do with a million faces? At that point, you're completely beholden to the software and the servers processing it.

Jason Rigby (09:57):

What if the government came and said, the federal government came and said, "Let's do a test study on this. And let's find a city that the people will agree that we can do this. And we're going to incentivize the city for doing this case study, and we need the city to be 60, 70% extremely diverse." But I mean, maybe have a 60, 70% population of African-Americans.

Alexander McCaig (10:18):

Why are you describing utopias to me?

Jason Rigby (10:19):

Yeah. And so then the people say, "Yes, we agree to this. This will help the city out. We're onboard with this." So now we're going to be able to learn. So now we know we have issues with lighting, especially with darker faces. And we know that we have issues with the machine learning, being extremely biased towards African males. So we look at that and we say, "Oh okay, well let's get a group of scientists together, make it open source. And let's take our best scientists. Let's look at how we can elevate human rights. That's our number one priority."

Alexander McCaig (10:51):

Correct.

Jason Rigby (10:51):

"How can we elevate human rights? How can we prevent false arrests and all these other things?" And then from that point, coming from that clear and concise focused point of elevating human rights. Now that testbed becomes something that can be used and implemented throughout the world.

Alexander McCaig (11:10):

Right.

Jason Rigby (11:10):

If the citizens choose to do so.

Alexander McCaig (11:12):

So should we have a deep learning system right now? We're going to write our own. We're going to do a deep learning thing that tests against biases.

Jason Rigby (11:20):

Yes.

Alexander McCaig (11:20):

I want you to deep learn biases.

Jason Rigby (11:22):

Yeah, that would be interesting.

Alexander McCaig (11:23):

And I want you to highlight them all and then tell me where we remove that from the algorithm.

Jason Rigby (11:27):

No, it'd be interesting. It'd be interesting to have a machine that understands all the biases, has done the historical context on each bias, and then knows exactly how to call the bullshit out and tell us what bias that is... Like it could read a white paper and then give you the biases in the white paper.

Alexander McCaig (11:42):

Yeah.

Jason Rigby (11:43):

That would be so cool, bro.

Alexander McCaig (11:44):

This is 60% bias. And we're going to show you every single point where it actually has bias in it, where there's actually no objective nature.
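No such machine exists in this conversation, but even a crude sketch shows the shape of the idea: scan a document, flag every sentence that trips a hand-picked list of loaded terms, and report the share flagged. The term list below is invented, and picking it is itself a judgment call, which is exactly the kind of bias a real tool would have to account for.

```python
import re

# A hand-picked list of loaded phrases. Choosing this list is itself
# a biased act; a real system would need far more than keyword matching.
LOADED_TERMS = ("obviously", "everyone knows", "naturally", "of course")

def flag_bias(text):
    """Split text into sentences, flag those containing a loaded phrase,
    and return the flagged sentences plus the percentage flagged."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    flagged = [s for s in sentences
               if any(term in s.lower() for term in LOADED_TERMS)]
    percent = 100 * len(flagged) / len(sentences) if sentences else 0.0
    return flagged, percent

flagged, pct = flag_bias(
    "Obviously the data speaks for itself. The study covered 130 datasets. "
    "Everyone knows consent slows research down."
)
print(f"{pct:.0f}% of sentences flagged")  # 67% with this toy list
```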

Jason Rigby (11:50):

Yes. Yes, exactly.

Alexander McCaig (11:50):

You know what I mean?

Jason Rigby (11:51):

Yeah.

Alexander McCaig (11:51):

That'd be cool.

Jason Rigby (11:52):

Yeah, that would be awesome. I think you just are onto something.

Alexander McCaig (11:54):

Think about how many books people would just not read.

Jason Rigby (11:56):

Oh yeah. They could just shove it through there and then you get the cliff notes of, "Well, exterminate that chapter, that chapter, that chapter, that chapter."

Alexander McCaig (12:03):

Exterminate.

Jason Rigby (12:04):

Yeah.

Alexander McCaig (12:05):

Your words today? You're on fire.

Jason Rigby (12:08):

No, I always like to think of it that way though. Exterminate. Next. The flame thrower.

Alexander McCaig (12:15):

The thing is we have so many books that we go through.

Jason Rigby (12:18):

Oh yeah. It's endless.

Alexander McCaig (12:19):

I mean, if you're here at the studio, I mean we got-

Jason Rigby (12:20):

I just bought one on Edison this morning.

Alexander McCaig (12:22):

... How'd that go?

Jason Rigby (12:23):

Well see, I have this problem. So here's machine learning at its best. There's this website I signed up for and they ask what books you love.

Alexander McCaig (12:31):

Okay.

Jason Rigby (12:33):

This is for Kindle. And then from there... So, I love history. I love autobiographies. I love military history. I love Ancient Greece history. You get really detailed. I love leadership books, nonfiction, spiritual books, all these things. So then every morning it sends me an email and serves me up Kindle books-

Alexander McCaig (12:52):

Wow, yeah.

Jason Rigby (12:52):

... that are $1.99 or less, that normally were $20 or $30, or $10, in my categories that I love.

Alexander McCaig (13:00):

It's great.

Jason Rigby (13:01):

I guarantee you, I'll probably buy... Because I'm like, "It's a dollar ninety-nine." I guarantee I buy three or four books a week off Kindle. And my Kindle has three or four thousand books in it. But I can't read that many in my lifetime.

Alexander McCaig (13:11):

No.

Jason Rigby (13:12):

So, but I still buy them. Like today, I bought an Edison book, because I want to know a little bit more about Thomas Edison.

Alexander McCaig (13:17):

You're a hoarder. You're a hoarder.

Jason Rigby (13:18):

A hoarder of books. I'm fine with that.

Alexander McCaig (13:19):

I like hoarding books.

Jason Rigby (13:20):

I'm fine.

Alexander McCaig (13:22):

My preference books, if we're talking about book preference?

Jason Rigby (13:24):

Yeah.

Alexander McCaig (13:25):

Books that are out of print.

Jason Rigby (13:26):

Oh yeah.

Alexander McCaig (13:27):

Most of the books that I always want to read are ones you can't get in print anymore.

Jason Rigby (13:30):

I listened to that-

Alexander McCaig (13:31):

That's the good stuff.

Jason Rigby (13:31):

... Navy SEAL guy, Jocko Willink, and he's into books too. And he loves the, he calls it the first edish. He wants that first edition. Or he's like, even the ones you can get where the publisher puts them out, like a hundred copies for certain people to read and look at, like the giveaway ones? The first hundred.

Alexander McCaig (13:49):

Yeah. Oh my god.

Jason Rigby (13:49):

He even wants those.

Alexander McCaig (13:51):

Ooh, juicy.

Jason Rigby (13:53):

And then if you get one of those and it's signed by the author?

Alexander McCaig (13:54):

Get out. That's over the moon. That's over the moon.

Jason Rigby (13:58):

People get into this.

Alexander McCaig (13:59):

So can we continue here?

Jason Rigby (14:01):

Faces and books?

Alexander McCaig (14:02):

Yeah, faces and books. How does this? So then...

Jason Rigby (14:04):

What was the Game of Thrones, the no faces place? Wasn't there a temple that she went to? The girl with the sword?

Alexander McCaig (14:09):

Something of the black and white, or...

Jason Rigby (14:11):

Yeah, remember that? When there was the faces?

Alexander McCaig (14:13):

Yeah.

Jason Rigby (14:13):

And then she could change her face?

Alexander McCaig (14:14):

Anytime she wanted.

Jason Rigby (14:15):

Yeah.

Alexander McCaig (14:15):

This would be a problem for this algorithm.

Jason Rigby (14:17):

Yeah. What if we have shape-shifters?

Alexander McCaig (14:19):

Well, we've seen... Yeah.

Jason Rigby (14:28):

Oh that's funny, bro. Or, would we be able to finally catch reptilians? Bro, think about it.

Alexander McCaig (14:35):

It is just a beaver reptilian.

Jason Rigby (14:36):

Oh my god! Yeah. The AI all of a sudden catches them when they come out the corner at that Beverly Hills house. They come out the corner and then it shows like this long...

Alexander McCaig (14:44):

Scales?

Jason Rigby (14:45):

Yeah. We finally caught him. David Icke.

Alexander McCaig (14:48):

Oh my gosh. We're [crosstalk 00:14:50]. All right.

Jason Rigby (14:53):

Oh, we've got to be done with it.

Alexander McCaig (14:56):

Calm. Clear the air.

Jason Rigby (14:57):

How we lost control, bro? Not just of our faces. Face Off. Remember that movie with John Travolta? Okay. We're out.

Speaker 1 (15:10):

Thank you for listening to TARTLE Cast, with your hosts, Alexander McCaig and Jason Rigby. Where humanity steps into the future and source data defines the path. What's your data worth?