#4: Facial Recognition and Bias
Big Brother is watching all of us and facial recognition systems are employed by governments and companies alike all around the world. So what exactly is facial recognition, and is there a great risk for discrimination and bias when using facial recognition? (Spoiler alert: yes.) Ian and Michael discuss.
Transcript
Ian Bowie
Hello and welcome to AI Unfiltered with me Ian Bowie and our resident expert Michael Stormbom, where we will be talking about everything to do with AI in our modern digital society and what the future holds for all of us.
So the subject for today is bias and with specific reference to facial recognition, big brother is watching all of us. So Michael, can you please perhaps explain what exactly is facial recognition?
Michael Stormbom
Facial recognition. Well, it means a number of different things, but in essence, it means taking an image or a video and, if there’s a face in there, recognizing that it is a face and then identifying who the person in the picture might be. It’s part of a broader group of techniques, usually called computer vision, which is basically the analysis of images and videos for various use cases and purposes.
Ian Bowie
So that’s pretty much a nutshell answer. Who’s using it and what are the markets for it?
Michael Stormbom
Well, facial recognition is very widely used today. If you’re on social media, and you upload a picture and you notice that your friends are automatically tagged in the picture, that’s an example of facial recognition.
But there are a number of other use cases. For example, in security: access to restricted areas. You could use facial recognition to allow only specific persons access. That’s used by many employers already in the West, and certainly in China, where facial recognition is very widely used in security. So, analysis of surveillance footage for crime prevention and crime investigation.
Ian Bowie
I quite like your access answer, because I’m guessing that quite a few people listening to this have been watching the Mission Impossible films. And part of that is where they actually make a mold of somebody else’s face, and then put it on and use it to do exactly that: access secure facilities. So if we think about that, obviously we have the technology to create a rubber face. Doesn’t that make it rather easy to get into secure facilities?
Michael Stormbom
I don’t know if it’s all that easy to make rubber faces that have all of the right characteristics, but as far as I am aware there hasn’t really been a wave of break-ins using fake faces. But I could be wrong on that one.
Ian Bowie
I like to think it’s true. All right, I mean…
Michael Stormbom
But more to the point about access: on your mobile phone, for example, you can set it up to use your face to access the phone, using your phone’s camera.
Ian Bowie
I’m a bit worried about that. I mean, what happens if you’re having a bad hair day?
Michael Stormbom
Hopefully the algorithm is smart enough to be able to cope with your…
Ian Bowie
Alright. So security, your mobile phone, crime prevention. But actually, how does facial recognition AI work?
Michael Stormbom
Well, there are a couple of different approaches to facial recognition. One is that you have an existing database of images with names associated with them. Let’s say, for the sake of example, that there is a picture of you and a picture of me, and the picture of you is tagged with Ian and the picture of me is tagged with Michael. You then use that as data to train the facial recognition model. After that, you would have a model that is able to recognize either your face or mine, but that’s pretty much it. So if you feed a new face to the model, it would be unable to correctly identify it; it would try to identify it as either me or as you.
Another approach, which is more practically useful for facial recognition, is to instead train a model that compares pictures. So if you have a photograph of you, which the algorithm knows is of you, and then you feed it a new picture, the algorithm has learned to compare the two pictures so as to determine whether it’s the same person in both.
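The comparison approach Michael describes is typically implemented by mapping each face photo to an embedding vector and measuring how similar two vectors are. A minimal sketch in Python, where the embedding vectors are made-up stand-ins for the output of a real face-embedding model:

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two face embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_person(emb_a: np.ndarray, emb_b: np.ndarray, threshold: float = 0.8) -> bool:
    """Verify whether two embeddings likely depict the same person."""
    return cosine_similarity(emb_a, emb_b) >= threshold

# Toy embeddings standing in for a real model's output (purely illustrative)
enrolled = np.array([0.9, 0.1, 0.3])
new_photo_same = np.array([0.85, 0.15, 0.28])   # close to enrolled -> match
new_photo_other = np.array([-0.2, 0.9, 0.1])    # far from enrolled -> no match
```

The advantage over the first approach is that the model never needs retraining when a new person enrolls; you just store their embedding and compare against it. The threshold value here is an assumption; real systems tune it on evaluation data.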
Ian Bowie
All right, I mean, that’s all well and good, but of course the subject of today’s podcast is bias in facial recognition. You know, we’re talking about, okay, we can train it with my face, your face, all our friends’ faces, but we’re all basically white Western Europeans. I’ve seen documentaries where they’re saying, well, you know, a lot of Asian people are tagged as terrorists, simply because whoever is actually creating these programs is biased towards that way of thinking. Or a lot of black people are picked up as being criminals because, again, the people who are programming these systems are like us, middle-aged white men. How can we get away from that?
Michael Stormbom
Yeah, and I think one thing worth pointing out here is that it’s not so much explicit programming, so no one necessarily sets out to create a racist AI. Well, I’m sure there are people who do that as well, but I think the bigger one is this sort of unconscious bias. These systems learn from the data that is fed to them, so in the case of racial bias in facial recognition software, the underlying cause is typically that the data used is not diverse enough. It’s a lot of pictures of white people, and so the system learns to recognize white faces better. And that’s not necessarily due to a deliberate choice to only have specific…
Ian Bowie
But how do we avoid that? Because we have to, I mean,
Michael Stormbom
Absolutely. And I mean, there are already real-world cases, as you know. These systems are widely used, they’re widely deployed, governments use them all the time, and they incorrectly identify people as criminals, typically with a racial bias: a black person is more likely to be identified as a criminal than a white person, precisely due to this bias in the data.
Well, one question, first of all, is whether we should be using the systems in the first place, but we can get into that later. The main thing is really that there has to be diversity in the teams that design these systems, so as to more easily notice that there’s a problem in the data that has been used, or rather, in what data is not being used. So I think diversity: diversity in thought, diversity in ethnicity, gender…
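One concrete way teams surface the kind of bias discussed here is to break evaluation accuracy down by demographic group rather than reporting a single overall number, then flag any large disparity. A minimal sketch; the group labels and results below are invented purely for illustration:

```python
from collections import defaultdict

def error_rate_by_group(results):
    """results: list of (group, predicted_id, true_id) tuples from a labeled
    evaluation set. Returns the misidentification rate per demographic group."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, predicted, actual in results:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical evaluation results illustrating a skewed system
results = [
    ("group_a", "alice", "alice"), ("group_a", "bob", "bob"),
    ("group_a", "carol", "carol"), ("group_a", "dan", "dan"),
    ("group_b", "erin", "erin"), ("group_b", "frank", "grace"),
    ("group_b", "heidi", "ivan"), ("group_b", "judy", "judy"),
]
rates = error_rate_by_group(results)
# group_a has 0/4 errors, group_b has 2/4 -> a disparity worth investigating
```

An overall accuracy of 75% on this toy set would hide the fact that all the errors fall on one group, which is exactly the failure mode Michael describes.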
Ian Bowie
Is there actually a case for putting together a multi-ethnic, multicultural, and possibly even multilingual team to start developing this kind of technology?
Michael Stormbom
Absolutely. I think that’s very much it. It’s a must, in my view.
Ian Bowie
I think governments probably are aware that there is bias in the data, but they’re still using it.
Michael Stormbom
I think that’s a mixed bag. I think, in general, governments are not that well versed in these types of technologies. I think that’s part of the problem, and regulation is lagging behind for that reason as well. But certainly, there are governments who are very keenly aware of what you can do with AI.
Ian Bowie
Alright, so I mean, how do we find the balance between privacy and security? For example, biometric data protection.
Michael Stormbom
Well, I think that’s a difficult question to answer. I think the individual has to decide for themselves what is the appropriate amount of, shall we say, personal integrity and privacy they are willing to give up in the name of security.
Ian Bowie
Well, we might not have a choice. I mean, governments are just rolling these systems out. Without any control, really.
Michael Stormbom
No, indeed. And of course, here in Finland we have the luxury of, or is it a luxury, rather the right to vote them out. But of course, it’s…
Ian Bowie
We have the same thing in the UK, but the UK is still the most watched society on the planet, I think, after China.
Michael Stormbom
Yeah. But what do the opinion polls say on that one?
Ian Bowie
I haven’t got the data. I think a lot of people probably don’t have an objection to the fact that there are CCTV cameras pretty much on every street corner, because of course it does help prevent crime. And when crimes do happen, of course, it helps to catch the criminals. So from that point of view, I think the main worry is misidentifying people as something that they’re not.
Michael Stormbom
Yes.
Ian Bowie
I actually watched a documentary not too long ago about this, and they showed that these systems are actually profiling people.
Michael Stormbom
Yes.
Ian Bowie
Yeah. And getting it wrong.
Michael Stormbom
And getting it very badly wrong. Yes, indeed.
Ian Bowie
Yeah. And then the authorities believe the data, because you’ve got a police officer, who perhaps is a good police officer, but doesn’t think beyond the evidence that’s in front of them, and will not accept that the person is not the person that has been identified by the system.
Michael Stormbom
Yeah, I mean, that’s what I mean with governments not understanding the limitations of these technologies.
Ian Bowie
But who’s policing the government?
Michael Stormbom
That’s a good question indeed.
Ian Bowie
So can we actually trust governments with this technology, without having an independent body policing how they use it?
Michael Stormbom
I think this is where this podcast comes in as well, making the population aware. I mean, certainly that applies to people in the government too. There needs to be a better understanding of, first of all, what the limitations are, and also the inherent dangers, especially in cases of bias, but also of automating our society and the implications. So I think raising awareness, and also raising the competence level of government in terms of understanding…
Ian Bowie
Yeah, can facial recognition actually recognize emotion?
Michael Stormbom
There are systems that claim to be able to analyze emotions based on facial expressions. However, I think it’s mostly a research topic. Though there are commercial systems on the market already that claim to be able to analyze emotions, studies have shown that they’re not particularly reliable.
Ian Bowie
Right, at the moment…
Michael Stormbom
At the moment. And, well, given how individual facial expressions are, is it even practically possible to create a system that would be able to analyze them with any reasonable level of accuracy? Perhaps in combination with other sorts of cues, like the sound of your voice, body language, and other forms of nonverbal, and verbal, expression, perhaps?
Ian Bowie
Yeah, of course, we’re sitting here talking about this at the end of 2021. I mean, if you go back 20 years, nobody would have believed this technology would have been possible in the first place. So it could be that in 20 more years it really is intelligent enough to recognize emotion and goodness knows what else.
Michael Stormbom
Yeah, well, I mean, I think if you couple it with other data, let’s say with biometric data, for example on blood pressure and heartbeat and perspiration and all that stuff, if you couple all that together, then possibly you could get a reasonable analysis of a person’s emotional state.
Ian Bowie
Or it might just be that they haven’t had enough coffee that morning.
Michael Stormbom
Everything happens in context. Indeed.
Ian Bowie
Yeah. So all right, we’ve talked a little bit about misidentification, and also the security aspects and maybe the negative side of big brother watching us, but what about practical applications for facial recognition just in our daily lives?
Michael Stormbom
Yeah, indeed. So as mentioned, when you access your phone, you can use your face as the way to get into the phone, with the phone’s camera recognizing that it’s actually you accessing the thing. And as noted, many employers use it for access. And of course, it’s convenient if you have photos or videos and it automatically tells you who is in the photo or the video. So those sorts of convenience things, and image searches in Google and so forth.
Ian Bowie
Could it be used as a payment system?
Michael Stormbom
It could. I believe there have even been attempts, I’m not sure how far along they are with those, to do exactly that: use your face as the way to recognize you. Is it Amazon or someone who’s doing…
Ian Bowie
I think Amazon are certainly experimenting with new ideas, like the cashierless supermarket.
Michael Stormbom
Yeah, indeed. So basically, then using your face as the way to identify that you are the person taking these products, and then charging you accordingly for them.
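A face-as-payment system like the one described would be doing 1:N identification: comparing a live capture against a gallery of enrolled customers, and refusing to match when nothing is close enough. A minimal sketch of that lookup, again with made-up embedding vectors standing in for a real model’s output:

```python
import numpy as np

def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.8):
    """1:N identification: return the best-matching enrolled identity,
    or None if no gallery embedding is similar enough to the probe."""
    best_name, best_score = None, threshold
    for name, emb in gallery.items():
        score = float(np.dot(probe, emb) /
                      (np.linalg.norm(probe) * np.linalg.norm(emb)))
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Toy gallery of enrolled customers (embeddings are purely illustrative)
gallery = {
    "ian": np.array([1.0, 0.0, 0.2]),
    "michael": np.array([0.1, 1.0, 0.4]),
}
shopper = np.array([0.95, 0.05, 0.22])  # close to "ian" -> should be charged
stranger = np.array([0.0, 0.0, 1.0])    # close to no one -> no charge
```

The threshold is doing a lot of work here: set it too low and strangers get billed as enrolled customers, which is exactly the reliability and identity-theft concern raised later in the episode.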
Ian Bowie
But why have we gone down the facial route? I mean, if you think about access, what’s wrong with the good old thumbprint?
Michael Stormbom
Well, I guess it comes down to this sort of, well, mimicking human behavior, in a way, right? I mean, I don’t go and identify people by looking at their fingerprints.
Ian Bowie
Yeah, no. But I mean, that’s exactly what I’m thinking. If you want to get access to a building, what’s wrong with just using your thumb or your fingerprint?
Michael Stormbom
I don’t know if there’s anything wrong with it, but… is it cooler to use facial recognition? I think a lot of this is driven by what appears to be cool as well.
Ian Bowie
Right. Okay. So it’s all about the cool factor and not necessarily about the practical factor.
Michael Stormbom
I would say so. And, of course, in a way it’s less invasive, right, if you just take a picture of your face. What I mean by that is that otherwise you have to physically put your thumb on something, and there’s COVID all over the pad and so forth. So in a sense, it requires less physical action on your part to just show your face.
Ian Bowie
That is a good point. Yeah. But I mean, for example, people’s faces change over time. Somebody might decide to grow a beard. Somebody might have a facial disfigurement from an accident. Can facial recognition see beyond that? So for example, if I grew a big bushy beard, would my phone still recognize me?
Michael Stormbom
They’ve done research on that, and apparently facial hair does not affect facial recognition all that much, interestingly. But yeah, I do think it’s a consideration that faces change over time.
Ian Bowie
And what about glasses?
Michael Stormbom
Glasses, if you’re wearing sunglasses, that can affect it. I think many facial recognition systems seem to rely quite a bit on the eyes. By the way, if you’re trying to fool a facial recognition system by wearing a balaclava but your eyes are still on full display, then you might not be able to fool it, necessarily.
Ian Bowie
Because I know when you go through the airports, you’ve got to remove your glasses for the system to work properly.
Michael Stormbom
Yeah, but I would think that’s more of a precaution, or down to the particular characteristics of whatever facial recognition system is behind there. I mean, of course, different facial recognition systems work differently, so what might be true for one might not necessarily be true for another.
Ian Bowie
Right. See, that’s an interesting point. Different facial recognition systems work differently. Shouldn’t there be a standard and standardization within the industry? Now that’s got you thinking, hasn’t it?
Michael Stormbom
I’m weighing the pros and the cons. Well, one thing is the data, so what the system has learned as characteristic of a face, and that may vary from system to system. And the other thing is the layers on top of the AI part of the system itself, so what sort of other information is fed into the system, and that can certainly vary from vendor to vendor. As to whether it should be standardized, I think it’s more a question of whether it should be regulated. I mean, we talked about racial bias, which is part of many commercial facial recognition systems as it is. So yeah, I don’t know about standardization as such, but regulation, I think, is rather crucial.
Ian Bowie
All right, facial recognition is, I mean, it’s obviously clever stuff, but is it possible to fool facial recognition? I mean, I talked about the Mission Impossible rubber face, but maybe that’s a little bit too basic.
Michael Stormbom
Well, I don’t know if it’s too basic; at least in those Mission Impossible movies the facial reconstruction looks quite state of the art. But no, I think that yes, there are certainly ways to fool facial recognition systems, and the way you fool one of course depends a bit on the inherent characteristics of the particular facial recognition system. For example, a face mask can be enough to fool some systems. So facial recognition systems haven’t had a very good time during the COVID era, with everyone wearing face masks.
And then, I think the primary way that people deliberately fool these facial recognition systems is to cause the system to not recognize that it’s a face in the first place. So, like, if you wear makeup, and I don’t mean regular day-to-day makeup, but putting makeup on your face in different patterns and so forth, that can fool the system into thinking it’s not a face in the first place.
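The evasion Michael describes works because most pipelines are two-stage: first detect that there is a face, then identify it. If the first stage fails, the second never runs. A toy sketch of that control flow; everything here is a stand-in, since real detectors are learned models, not flags:

```python
def detect_face(image_features: dict) -> bool:
    """Stub detector: real systems run a detection model here; a boolean flag
    stands in for whether the detector finds face-like structure."""
    return image_features.get("face_like_structure", False)

def recognize(image_features: dict, known_faces: dict):
    """Two-stage pipeline: detection gates recognition, which is exactly
    what adversarial makeup and patterned masks exploit."""
    if not detect_face(image_features):
        return None  # no face found: the person is effectively invisible
    return known_faces.get(image_features.get("identity_cue"))

known = {"cue_ian": "Ian"}
plain = {"face_like_structure": True, "identity_cue": "cue_ian"}
adversarial = {"face_like_structure": False, "identity_cue": "cue_ian"}
```

The point of the sketch is the gating: the adversarial input carries the same identity cue as the plain one, but because detection fails, identification is never even attempted.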
Ian Bowie
But I mean, if the system spots somebody like that, isn’t that an automatic red flag for the authorities that they’re obviously hiding something, so go and pick them up?
Michael Stormbom
Well, no. I mean, the system relies on first of all detecting that it’s a face, and if it’s not a face, then it doesn’t care.
Ian Bowie
Yeah like you’re invisible.
Michael Stormbom
Indeed, indeed. And I mean, there have been other instances where people, for example, wear a mask with a specific pattern, right? So that it forces the system to think the thing is not a face at all, and those sorts of things. So that certainly happens. There are even AI systems that have been designed to fool AI systems, so it has become this sort of industry of its own as well.
Ian Bowie
So I mean, if you can fool the system that easily, doesn’t it just make the whole thing pointless?
Michael Stormbom
I think that’s a good question. Well, first of all, are people aware of it?
Ian Bowie
The bad guys are! You’ve certainly let the cat out of the bag!
Michael Stormbom
Well, at least our audience is now. Yeah, well, yes. But I think if most people are not aware of it, then those systems are…
Ian Bowie
But most people are not a threat to society, and the people who are a threat to society are bound to be looking for ways to beat the system.
Michael Stormbom
Yeah, indeed. And I mean, it doesn’t matter how much time and effort you put into creating a facial recognition system, there just isn’t a system that is going to be 100% foolproof and 100% accurate.
Ian Bowie
So is there any point in developing this at all? If it’s that easy to trick it, to fool it, to get around it, what’s the point in having it?
Michael Stormbom
Well, now that the creators of the facial recognition systems are aware of these workarounds, of course they will move to counter them. It’s going to be a little bit like virus creators and anti-virus systems: it becomes this ever-continuing struggle of developing the workarounds and developing the facial recognition system.
Ian Bowie
Cat and mouse.
Michael Stormbom
Indeed, indeed,
Ian Bowie
Yeah. Who’s in front, the cat or the mouse? Yeah, okay. You know, I don’t want to get too serious about this, but I like to think about some of the fun ways that you can use this technology. Don’t you think it’d be really cool if you could just walk up to your car and it would recognize your face, let you in, put everything to your settings, and start up, as an example?
Michael Stormbom
Yeah. Well, I mean, isn’t that the promise of AI and automation in general, right? That it will make things a hell of a lot more convenient for all of us. Well, I don’t know if that’s the main point, but it’s certainly a major selling point.
Ian Bowie
And then, are we going to get to a stage where suddenly, you know, we don’t need a ticket for anything, we don’t need a credit card anymore? Your face is your credit card.
Michael Stormbom
I think that could be a distinct possibility. Then we come down to the question of the reliability of the facial recognition systems. And we were talking about identity theft in a previous episode, so I don’t know whether…
Ian Bowie
That’s true. Or of course, it means you can ride the bus for free if you’ve got one of those masks and you can trick the system. It doesn’t even recognize you.
Michael Stormbom
Yes, indeed.
Ian Bowie
So free public transport, people. That’s what we’re aiming for.
Michael Stormbom
Yes.
Ian Bowie
Of course, one thing where facial recognition isn’t gonna work, I guess, is with kids. Because, well, think about your first passport: you’ve got your baby passport with your baby picture in there. But you know, if that was given to you when you were 18 months old, when you’re five years old you absolutely don’t look like that anymore, do you?
Michael Stormbom
Yeah, no, indeed. So, I mean, that’s of course a general issue that your face changes over time, but certainly in those early years rather aggressively, indeed.
Ian Bowie
So I mean, it’s basically not going to work for kids at all. I’m just thinking, if you’ve got facial recognition door locks, and the kid comes home from school and then suddenly can’t get in the house. It’s not gonna work, is it?
Michael Stormbom
No, indeed.
Ian Bowie
So there are limitations.
Michael Stormbom
There are definite limitations. And even in the best of circumstances, as noted, facial recognition will never be 100% accurate.
Ian Bowie
Do you think that there might be something after facial recognition? You know, we had fingerprint recognition, now it’s facial recognition. Is there something after that?
Michael Stormbom
So something even more invasive, like a scan of your DNA on the fly or something of the sort? Why not?
Ian Bowie
Yeah.
Michael Stormbom
Yeah. I mean, of course, we are all carrying around these computers in our pockets, right, that can be used to identify us in a way as well. So the question is, how much further beyond facial recognition do we ever need to move?
Ian Bowie
Or what about the..
Michael Stormbom
Especially if it’s sort of like implanted.
Ian Bowie
I was about to say yeah. What about the chip under your skin?
Michael Stormbom
Yeah, actually, at my place of work, we have this office in Stockholm, in like this office hotel. And they very graciously offered the opportunity to have a chip implanted, like a small little thing that goes in your hand, so that you could then open the doors there by just showing your hand. I politely declined the opportunity to do so. But yeah, that’s already a reality, that you can do that.
Ian Bowie
Wow. Yeah, but I mean, alright, it’s for this one office hotel, and it lets you in. Then what happens if you decide never to use that office again? How do you get the chip out?
Michael Stormbom
I guess you have to visit…
Ian Bowie
a surgeon and have it cut out. Gross.
Michael Stormbom
Yeah, no, but let’s call it a proof of concept of the thing. And, of course, I mean, we tag our pets, you know, with those digital…
Ian Bowie
Oh, we do. Yeah, that’s true. Yeah.
Michael Stormbom
So certainly the technology exists. It’s a matter of whether we…
Ian Bowie
do we want to tag ourselves.
Michael Stormbom
Yeah, indeed.
Ian Bowie
Yeah, I think maybe we can leave that for another time.
Michael Stormbom
I think that’s probably for a future episode.
Transcribed by https://otter.ai