Half of all US citizens have their faces in a database that can be searched by facial-tracking software, and the slow creep of surveillance technology means more of us are being watched and analyzed by AI face-detection cameras, often without us knowing, and almost always without our consent. So what are some of the ways companies and governments deploy these technologies today? What are the limits and blind spots of these systems? More importantly, what can and does go wrong when we choose to outsource human recognition to automated systems?

Subscribe now on: iTunes | Google Play | Stitcher | Soundcloud | Spotify | RSS | or search "Ashes Ashes" on your favorite podcast app.

Chapters

  • 03:21 Implementations
  • 11:25 Landlords, Churches, Hotels, and Schools - Oh My!
  • 23:30 Technology Creep
  • 31:29 Potential for Abuse
  • 37:24 Ridiculous Ideas
  • 41:43 Failures and Blind Spots
  • 48:43 Irrevocable Consequences
  • 55:25 Aiding Selective Enforcement
  • 1:05:41 Fighting Back
  • 1:15:01 Daniel moving to Boston?

(This is a machine transcription, updated one coming soon!)

Thank you Alexey for this amazing transcript!


David Torcivia:

[0:04] I'm David Torcivia.

Daniel Forkner:

[0:06] I’m Daniel Forkner.

David Torcivia:

[0:08] And this is Ashes Ashes, a show about systemic issues, cracks in civilization, collapse of the environment, and if we're unlucky the end of the world.

Daniel Forkner:

[0:19] But if we learn from all of this, maybe we can stop that. The world might be broken, but it doesn't have to be.

David Torcivia:

[0:34] You know, Daniel, we're really lucky that this is a podcast/radio show.

Daniel Forkner:

[0:39] Why is that, David?

David Torcivia:

[0:41] Well, while there may be voice prints of us out there, in fact, some that we've made ourselves if you go all the way back to that episode we did on fake news.

Daniel Forkner:

[0:51] Episode 18 – Scripted, David?

David Torcivia:

[0:53] Yep, that's the one, with our clever little robot voices, which we may or may not be using even right now, I guess; our listeners have to figure that one out. But as much as we might be able to be copied from our voiceprints, at least we're not out there sharing our facial data.

Daniel Forkner:

[1:11] Wait, David, don't we have our pictures on the Press page of the website?

David Torcivia:

[1:15] Well, I mean, I guess, now that you mention it. Those are there.

Daniel Forkner:

[1:20] I mean they're not moving though.

David Torcivia:

[1:22] Strategically, I've picked the photo where I'm half covering my face and it's sort of blown out and there are shadows, just to, you know, foil these attempts – that's the theory anyway.

Daniel Forkner:

[1:32] My face is just out there.

David Torcivia:

[1:33] Well, FBI, get on that, I guess, before we realize this and pull those photos.

Daniel Forkner:

[1:39] So what you're saying is we have less facial data out there than if we were doing like a video podcast, like a YouTube podcast that a lot of people are doing?

David Torcivia:

[1:48] A videocast, if you will, yeah. So I mean, there's so much facial data out there these days, and it's being used in so many interesting ways, that this is going to be a growing problem as we go forward and this technology gets better and better and is utilized in more and more places. And this is not the first time that we've talked about facial recognition on this show; in fact, it comes up quite a bit, in a lot of episodes discussing both retail experiences and advertising, as well as the more malicious side of things that the government or other bad actors might be doing with this tech. Facial recognition is getting so out of hand already at this point, even in its infancy, that we wanted to dedicate an episode to it, just to point out all the strange places it is popping up and how it's already being abused.

Daniel Forkner:

[2:35] What do you think someone might think facial recognition is being used for if they hadn't read all these articles that we had, David?

David Torcivia:

[2:43] Well, I mean, when you pair the words "facial recognition," I think, especially with the way that it's shown in movies and in cinema and in the greater cultural consciousness, the idea is that facial recognition is something that comes in at high-tech areas, at important ports of entry, let's say airports, let's say border crossings, and it exists there to sort of catch the bad guys: these are terrorists, these are maybe drug runners, people that we really want to keep out, and so we have to be able to recognize them at the last minute, before they're able to do anything, and get them safely out of the way of these major crimes. And I think that's really the major way that this technology has been sold to us.

Daniel Forkner:

[3:22] Right. And of course, this technology is found in a lot of sites of transportation: we've talked about airports here in the United States that are using facial tracking in some of their terminals; I know the international terminal at Atlanta's Hartsfield-Jackson is using it, and LAX is another airport that's using facial tracking. So it is used in this context, ostensibly I guess, to catch the bad guys like you mentioned.

David Torcivia:

[3:46] Well, it's also sold as something that's supposed to be about convenience. So there's a pretty impressive little video that comes from a Chinese airport that we'll link on the website, I think I saw it first as a tweet, showing a man walking up to this terminal; it's just a big screen, and it scans his face, recognizes his facial data from who-knows-where, and then shows him, you know, not only his flight information, which gate he's supposed to be at, what terminal, but also a map and how to get there. Which, you know, ignoring all the privacy implications that could come from this, is pretty cool; that's a nice way to simplify this thing and offer convenience. But that convenience is really just packaging to get all this additional data on us, which is something we'll explore through this episode.

Daniel Forkner:

[4:27] David, let's talk about some other ways that this technology is being used. And the first thing that comes to my mind, cause we've talked about this a ton, is that it's sold to law enforcement officers around the world to narrow suspect lists and to find wanted criminals, find those with warrants out for their arrest, and so these cameras are being deployed all over the place. Here in the United States, San Diego is going crazy with these, installing them in street lamps in addition to audio recording devices. In New York City...

David Torcivia:

[4:58] Yeah, we have these giant kiosks that are all over the place, that both have a giant advertising screen on both sides and provide nice fast free internet that, of course, tracks you via your Wi-Fi MAC ID as well as the Bluetooth signature of your phone. But these also sport cameras on three sides: one side is a video phone camera so you can make calls, but the other two are up very high and face the street, and are there just to gather facial recognition data and share it with who knows who, but probably the NYPD and their very exclusive and secret facial recognition program, for which they're actively being sued by a couple of organizations right now trying to get the data revealed. This is something we've talked about at length on this show multiple times.

Daniel Forkner:

[5:41] Yeah, I know you're not really happy with the NYPD and their pervasive use of surveillance, David. But you might be thinking, well, I don't like walking down the sidewalk and having my face captured by these kiosks, let me just hop in the car, take a stroll, you know, ride my car down the road, I'll be safe from these cameras. But, in fact, no.

David Torcivia:

[5:59] If I had a car but yeah.

Daniel Forkner:

[6:01] Or maybe getting a taxi, David, but you're not even safe there, because New York has started placing video surveillance cameras along bridges and toll plazas to identify people in their cars. And according to the New York Governor Andrew Koomio.

David Torcivia:

[6:23] Cuomo. [Both laugh at Daniel's pronunciation]

Daniel Forkner:

[6:27] According to the New York Governor, whose name shall not be named, he says, “We are now moving to facial recognition technology which takes it to a whole new level where I can see the face of the person in the car and run that technology against databases. And because many times a person will turn their head when they see a security camera, they are now experimenting with technology that just identifies a person by their ear.”

David Torcivia:

[6:53] Oh, man.

Daniel Forkner:

[6:55] So you can't even jump in your car, David.

David Torcivia:

[6:57] I have to grow like long hair that covers up my ears or just wear earmuffs at all times.

Daniel Forkner:

[7:03] You know, what this really brings to my mind, I think, I'm jumping ahead a little bit, but we had that episode on forensic science, episode 24 - Suspect Science, where we basically break down many of the different methods the FBI and other law enforcement agencies use to...

David Torcivia:

[7:21] To basically make up all this forensic science, I'm sorry, that episode, it's one of my favorites but it makes me so incredibly angry. If you haven't listened to that one, please do, because it's probably one of the most important episodes that we've done.

Daniel Forkner:

[7:34] Right, because if we're talking about fingerprints, we're talking about blood spatter, we're talking about all these things that you see on TV as like the scientific foolproof way to show that someone was involved in a crime. And it turns out that all of it is just purely subjective, totally opinion-based, there's no science whatsoever to it. It is just totally made up. And facial recognition, I think, is one of those things that is dangerous in that we see it as this complex technology where there's algorithms driving the decisions, and it can make it seem like this hard science that "oh, a computer is making a calculation," therefore it's accurate. But then you see people saying: we're going to be using artificial intelligence along with this facial tracking software to identify people by their ear. And then you start to wonder, wait, hold up, are there even real statistically individual criteria for ears that can really distinguish one person from another? Do we even know that for sure?

David Torcivia:

[8:31] Well, we're going to find out the hard way. And the hard way is lots of false arrests and false matches, and we'll get to these failure statistics in a minute, and some of these systems are appallingly bad, even though they're in current use right now. But I think that's jumping maybe slightly ahead of ourselves, just for the moment anyway.

Daniel Forkner:

[8:50] Yeah, sorry. So let's get back to some of the ways that this tracking is employed.

David Torcivia:

[8:54] I mean, there's so many uses that police departments around the country are deploying this technology for; we'll get into some of that in a little bit. But what's really interesting, I think, about this is, yeah, we are assuming at this point that most police departments, the government, the state are using facial recognition, the cat is out of the bag; but I think what most of us don't realize is how much this technology pops up in private places, whether they are homes or stores, and how ubiquitous this technology already is, again, even though it's very much in its infancy and still very much being refined, because a lot of times it is just a disaster. But again, talking about New York, I guess we have a lot of facial recognition examples here, there is a Brooklyn landlord – Boo! Boo! – who has a 700-unit rent-stabilized apartment complex. And for those not familiar with it, this means that the rent basically can't be adjusted more than a certain amount, I don't want to get into it. But let's just say that it is a bargain if you can get one of these apartments; landlords hate it, tenants love it, you're going to be paying way under market rate if you can get this. Consequently, a lot of these apartments, which originally became rent-stabilized several decades ago, are generally lived in by low-income and minority individuals, which is another reason that landlords hate this; they're always trying to evict people and get out of this rent stabilization program, so that they can either demolish the building or update it so that they can charge full market rates. That's another conversation which we'll actually have very soon.

[10:22] But so they've decided they wanted to deploy this facial recognition system for access into the building. It's basically a keyless entry system but updated so it's not a card or an app but your actual face that grants access to the building or to these individual apartments.

Daniel Forkner:

[10:37] Tons of buildings in New York already have a type of keyless entry system for residents to get in, but this facial tracking software takes it to a whole new level. And it's concerning for a number of reasons, number one being: this creates a data trail, right, of everyone that comes in and out of this apartment, this home. This is their home, and every time they enter the door, every time they exit, someone has data on that, whether that's the landlord or anyone else the landlord shares that data with. In fact, according to the landlord themselves, they have not taken any steps to protect this data from being accessed by the NYPD or any other government agency like immigration enforcement or something. And so already there's a privacy concern.

David Torcivia:

[11:25] Let's play a thought experiment for a second here, Daniel. So I'm an evil landlord and I know that’s redundant. So I'm an evil landlord and I'm going to spend a bunch of money installing this expensive technology system in my rent-stabilized building in order to track people's faces. Now, why would I make that investment?

Daniel Forkner:

[11:44] Well, I'm sure the landlord would say because it's a form of modernization that would allow them to charge more money because it would appeal to higher-end tenants, but wait, it's rent-stabilized, so these are...

David Torcivia:

[11:56] Yeah, oh no, so that doesn't work. So what could it be?

Daniel Forkner:

[12:01] Well, the other thing I'm thinking, David, since these are rent-stabilized apartments and you mentioned that the landlord wants to get rid of these tenants, right, so I'm guessing if they can catch the tenant doing something that violates their lease, they can kick them out, and that's what they want. Now, how does this relate to the entry system?

David Torcivia:

[12:18] Well, now you're getting somewhere. So let's say, first off, we're not mentioning who or what organization we're sharing this data with, so that means if we for some reason wanted to share this data, like you mentioned, with ICE or with the NYPD, then maybe some of our tenants could disappear because they have a warrant out for their arrest or because they're here illegally in the first place. So, all of a sudden, those are vacancies which can be filled with non-rent-stabilized units, potentially, depending on the legal loopholes I have to jump through. Or, and this is I think the main motivation here, in the city there's a huge number of people who rent these rent-stabilized apartments and then don't actually live in them and utilize them as Airbnb residences. They rent them out at well above market value…

Daniel Forkner:

[13:06] Definitely a lease violation there...

David Torcivia:

[13:09] It's a 100% lease violation, but it's hard to catch if you just have a key; your neighbors basically have to snitch on you and turn you in for this. But if people are logging in with their faces, or if their face isn't being used and they're entering with the key, then all of a sudden the landlord has data showing that people aren't using their face to enter their home but instead this key or this access code, whatever it is. That's fishy, and so they can investigate this, check the surveillance footage and see: oh, these aren't my tenants living here, it's a string of random people, they're renting out this apartment. I can kick them out, evict them, and now I've got this free apartment to do whatever I want with, raise it to market value and profit off of it. And I think this is the larger motivation that we're seeing here with these facial recognition programs in places like this, where it's not about simplifying the life of my tenants or added convenience or added value or whatever, but it's about finding loopholes and petty ways to kick out people living in my apartments when I don't want them there.

Daniel Forkner:

[14:07] Although, to be fair, I don't have as much sympathy for someone that is taking up an apartment just to use it for Airbnb.

David Torcivia:

[14:15] Sure, I mean, I'm not saying, you know, are we comparing here who is more evil: the landlord or the person renting out an apartment for Airbnb? I mean, maybe it's a wash, one is a landlord, one is a middleman landlord basically. But the fact that this technology is being deployed to catch a small group of people and putting everybody's privacy at risk in the process, I think, is really indicative of this larger deployment of facial recognition that we're seeing, where it's sold for this very small reason, terrorism, whatever, but everybody's getting caught in this dragnet. And the consequences for some people are very dramatic, which we'll get into later on, and the potential for huge consequences for all of us is absolutely there and frankly terrifying. But let's continue on some of these interesting deployments that we're seeing.

Daniel Forkner:

[15:02] Churches are using facial recognition, David.

David Torcivia:

[15:06] What does a church need facial recognition for, Daniel?

Daniel Forkner:

[15:09] I honestly don't know, but there is a company called Churchix, subtitled "Know Your Members."

David Torcivia:

[15:16] That's not ominous at all. "Hi, we're a facial recognition company, we sell products to churches. Know Your Members."

Daniel Forkner:

[15:24] It's sold as a way to take attendance, but the same company also markets their service to law enforcement, classrooms, hotels, things like this, for tracking suspects and criminals, it says on their website. So it's kind of an interesting mix of clients that might use this software. But yeah, the idea is if you're at a church, you know, download their software, put some cameras up in your congregation and then you know who's attending; and if you deploy something alongside that, like behavioral tracking software, which I think is kind of related to this idea, you might be able to see if they're enjoying your sermon or not. And then if you find out they're not, well...

David Torcivia:

[16:05] Your parish is falling asleep, you see them nodding off, and then this software automatically activates a lightning bolt and thunder sound effect just to wake everybody up. I'm thinking too big.

Daniel Forkner:

[16:16] Or maybe you pressure them to increase their tithe to show their devotion or something.

David Torcivia:

[16:21] Oh man, we're really bad, we shouldn't, no one should give us this technology, this is really bad. You mentioned hotels though; this is another place where these facial recognition cameras are being sold as something to defend all of us and protect our safety, particularly around human trafficking. Hotels are a frequent destination for human traffickers, both when they're transporting people and also as a venue, when it's sex trafficking, for brokering the sexual deals that are ultimately the purpose of the trafficking in the first place. And again, this is a great idea, the idea that we deploy these cameras here, we recognize people who are here frequently, they are probably sex workers or I guess frequent business travelers. But business travelers should have checked into the hotel; the sex workers probably didn't, the people being trafficked probably didn't, and they could be compared to databases of known traffickers or of people known to be trafficked. That's cool, great.

[17:16] But what this has also meant is that the vast majority of people who are working in the sex industry at hotels are not being trafficked. These are regular day-to-day sex workers, and hotels are vitally important for them to conduct their business in a safe place. But this technology means that they are getting flagged as sex workers in the process. They're getting caught up in the dragnet, and one of the only safe places for them to conduct their business is now being wiped off the map, because they cannot go to these hotels that have deployed this technology; it means risking arrest by police or being kicked out of the hotel by the management, and then having to either conduct business elsewhere that's less safe, or having to tell their john or whoever that they can't do this and risk payback or vengeance from that. So it's made this very important safe place where sex workers are supposed to be conducting their business essentially eliminated. It's this unforeseen side effect. And so much of this facial recognition is these unforeseen side effects, these chilling effects that are covering all sorts of parts of our society, because we just don't consider them, because we're deploying this technology without any thought or reason or care for the consequences.

Daniel Forkner:

[18:27] I think you bring up a good point. Why is this technology being deployed in the first place and who is doing it? Because it's not necessarily we the public that is encouraging the deployment of these technologies. Another place they're showing up is in public schools and it's because companies are marketing this technology to schools and the parents that ultimately control the boards as a way to prevent school shootings.

David Torcivia:

[18:53] Wait, but, I mean, I don't know why I'm laughing, and I feel crass about it, but aren't most of these school shootings committed by students?

Daniel Forkner:

[19:02] A lot of them are, yeah. So that's problematic. I mean, the idea, I think, from the companies is, they say: well, you know, we can identify suspended students, right? But then another thing is, a lot of these shootings, like Christchurch which we talked about last week, and that wasn't a school shooting, but I mean the man wasn't trying to go about it stealthily, right? He walked up, he was recording himself.

[19:25] Identifying him 30-40 seconds earlier wouldn't have made a difference.

[19:30] So in a lot of ways it's kind of a pointless technology to address this problem. But what these companies have figured out is that it's really easy to profit off of fear if you're a parent with a child in school, and the company is presenting this idea like: look, there's a threat out there and we can stop it, we're going to keep watching all the criminals in the area. I can see how that would be pretty appealing to a parent, but in the end, these companies are just making money off of technology that ultimately is not going to do anything. And coming back to this question of who is really deploying this technology, I really want to hone in on this idea that it's not coming from the public. I want to give another example; this is something that Detroit is calling Project Greenlight, and this is a police example, David. And essentially what it is is the police have designed this project where they get small businesses to pay them to set up facial recognition cameras on private property; they pay an upfront fee and then they pay an ongoing monthly fee to have the service set up. And you can go to detroitmi.gov, find a Project Greenlight web page, and you can actually see a map of all the businesses that have signed up for this, and it's quite a few. And the idea is that because these businesses are paying the police, they get priority 911 treatment, right? So if a crime occurs or they call 911 for whatever reason, the police will prioritize their property above everything else.

David Torcivia:

[20:58] This sounds like an old-school mob protection money style racket going on.

Daniel Forkner:

[21:03] That's exactly what it sounds like. In fact, I just want to read a comment from someone complaining about this online. And this is just a random person, but I think it really highlights the frustration around these types of things. This person says, "Under Project Greenlight participating local businesses must pay X thousands of dollars to purchase a minimum of 4 high-definition surveillance cameras and other recording equipment from the Detroit police. Business owners then have to shell out $150 or so a month to store the recordings from the equipment, as well as additional fees of a couple of hundred dollars to purchase all the signs, decals and green lights. Effectively, the Detroit police have become a private security company. Detroit is building a privately-funded massive surveillance network. But here's the naked capitalist part of this. The police promise that any business joining Project Greenlight gets first priority when they dial 911. Yes, folks, you read that right, first priority. So if your child gets hit by a car or if your neighbor is getting savagely beaten, you better pray that you don't call when a gas station attendant wants to report that some black kids have been in his store 5 minutes too long. The police aren't hiding anything anymore, they're saying we don't care about everybody, give us the money and maybe we'll help you. That is some straight-up Mafia shit, it's full-blown protection money, this was not voted on, this was not determined by the people."

[22:31] That's the end of the quote, but I think that last sentence, that this was not determined by the people, is very important, because so many of these systems that we talk about, David, we the public cannot opt out of, and we never opted in. Police around the world are deploying this mass surveillance tech that the public never had a say in. But who is the stakeholder? Ostensibly it's supposed to benefit the public, and that's what our governments are supposed to exist for, right, to serve the public? But then it's a bit strange, I think, that our governments are deploying these methods for surveilling us every single day and everywhere we go, storing our highly personal biometric data, tracking our locations, drawing conclusions from that data, and we have no idea how those conclusions are being made, because so many of these companies hide the algorithms behind their facial recognition software behind proprietary walls so no one can even look at them. And we never voted on it. So how can this truly be something that benefits the public?

David Torcivia:

[23:30] I think there's also this technology creep that a lot of people are unaware of. So we have these vague senses that maybe facial recognition is a thing, we've seen it in CSI or whatever. We have a very conscious relationship with surveillance cameras. They're everywhere, we see them; in fact, there's a thriving market for fake surveillance cameras, they've become so ubiquitous. I don't think a lot of us have put together the connection between the two, thinking like all these old surveillance cameras can be converted to become facial recognition cameras fairly simply. We've talked about this before; it's just a matter of hooking them into a software backend, and then it's possible to do all this. We also don't realize the scope of these facial recognition systems, how far the technology has come, how much information it is able to read about us. And it's not just a matter of recognizing who you are.

[24:26] But the technology is now being developed to tell a lot about us, all sorts of things. And I guess we'll get into that in a moment, but it's emotion, there are efforts to read into our sexual orientation – all sorts of things that are based on your face, modern-day phrenology as technology. We don't think of this, we are unaware of this, we are unaware that police are actively installing these, that many of the body cameras that we were clamoring for police to adopt just a few years ago, after all these horrible shootings that they committed, which haven't stopped even with the presence of these police body cameras, well, these cameras now themselves are supporting facial recognition, tracking us as officers walk around on their beat and extending this all-seeing eye, the panopticon, into every single part of our life. I mean, up until very recently we've enjoyed this period of anonymity that has been relatively new in human history. I mean, when we were living in small towns and villages and everybody knew everybody, the idea of being anonymous was sort of ridiculous, because you knew your neighbors, you knew your community. But as our cities got larger, as we moved farther from home, we found ourselves being able to be invisible for the first time, and it really changed how our society worked, it changed how we interact with each other, how we move through the world, the way that we interacted in our own private lives versus in public.

[25:43] But with this technology, we’re taking away that anonymity again for the first time in a very long time. But what's interesting about it is it's not being taken away by a strong community which is what it was in the past, but by this all-seeing eye that threatens us with violence if we act out. And that I think is a key distinction and something we need to remember and be very wary of. And we’ll explore this concept more in-depth as we go on.

Daniel Forkner:

[26:08] And to just expand a little bit more, not on Project Greenlight, but this idea of private businesses installing this software, we're seeing this being deployed in retail stores across the US as a way to prevent shoplifting. And what they're doing is tying these cameras to the automatic doors of a store. So if you've ever shoplifted at this place before or you’ve shoplifted somewhere else before, your face could be logged into a database that says: do not let this person into a store. And any business that has connections to this database can now use it so that when you walk up to the Ace Hardware or the Walmart or the Shell gas station, the doors won't open for you.

[26:54] I think that's really profound, because what we're saying is: let's reorganize the space, the infrastructure that everyone lives on, that we all depend on, and let's hand access to that space, the keys to it, over to an algorithm that will automatically decide if you are worthy of entering a space or not. And ultimately, there's a human decision that went into creating that algorithm. But there's something unsettling about removing the human decision in that moment, where maybe someone was down on their luck and they went to a Walmart and they stole, I don't know, usually it's a necessity, right? Maybe baby formula, which, before they started locking it up, was something that was commonly shoplifted because mothers absolutely needed it to feed their babies. Well, let's say you did that at one point, and now, because in an act of desperation you did something that you thought was necessary to sustain yourself, to keep your family going, you're now barred from the same store where you might be willing to pay for something that is a necessity. Now where are you going to go? But it goes deeper than that, because a lot of these companies now are trying to develop behavioral algorithms that can not only tell if you shoplifted before, by matching your face to a database, but predict if you are going to shoplift. And this is where it kind of gets scary.

[28:17] Some of these companies are trying to train these algorithms on what human behavior means. And the way they do it, because again, we have to remember: all of these algorithms, all of these artificial intelligence decisions, this machine learning – at the end of the day these computers are making decisions based on subjective human inputs about what results we want. And when it comes to tracking behaviors, some companies literally have cameras capture the movements of actors acting out different behaviors, and then a person has to tell that computer: hey, that behavior you just saw? That's suspicious, and if you see that, you should raise the probability that a person is going to shoplift. And we've talked about how this AI, which is trained on these large data sets, when that data set comes from some kind of human curation, it can end up reproducing the impacts of human biases and discriminatory prejudices and stereotypes. I mean, just think about that behavioral example: does this take into account individual idiosyncrasies or cultural differences? You know, some people are naturally more fidgety than other people; does that mean they're going to be discriminated against by a computer that thinks they're being suspicious because they're being fidgety in a gas station?
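(A rough illustration of what Daniel is describing: the "suspicious behavior" label a model learns is just a human annotator's opinion of staged clips. This is a hypothetical sketch, not any vendor's actual system; the feature names and numbers are made up.)

```python
# Minimal sketch: a behavior classifier trained on human-labeled clips.
# All features, values, and labels below are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Imagine each clip is reduced to a few hand-picked movement features:
# [avg_speed, fidget_rate, seconds_near_shelf, glances_at_camera]
clips = np.array([
    [0.4, 2.0,  35.0, 1.0],
    [0.3, 9.0,  80.0, 4.0],   # an actor told to "act shifty"
    [0.5, 1.0,  20.0, 0.0],
    [0.2, 7.0, 120.0, 5.0],   # another staged "suspicious" clip
])
# The labels are simply what a human annotator called each staged clip.
labels = np.array([0, 1, 0, 1])  # 1 = annotator judged it "suspicious"

model = LogisticRegression().fit(clips, labels)

# A naturally fidgety shopper produces the same features the annotator
# penalized, so the model scores them as a risk even though nothing happened.
fidgety_shopper = np.array([[0.4, 8.0, 60.0, 3.0]])
print(model.predict_proba(fidgety_shopper)[0, 1])  # high "shoplift risk" score
```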

David Torcivia:

[29:34] Well, and some of these technologies, Daniel, it's not just about open sesame at the door by revealing your face, and that is literally how this is being deployed at some of these places: you look up at the camera, it scans your face, then it opens the door if you're not on one of these blacklists.

[29:49] Because of this shoplifting fear, if you are clocked as somebody who might be a shoplifter, or who has been tagged in the past as a shoplifter, there are some stores, these companies that we will not name but who say that they are doing this, who will alert the police as soon as you enter the store, even if you aren't committing a crime or haven't committed a crime in the past. Because it's not even about somebody who has necessarily shoplifted, but somebody who has been added to this list of potential shoplifters or problem causers, whatever list they decide they want to keep. They're pre-alerting the police just because you entered a store, because they don't trust you and because they want to farm off responsibility for their worthless goods and threaten your life with police contact. Because when the police come in, they can arrest you for no reason, you can be locked up in a cell awaiting trial; if you can't post bail it might be months, you'll lose your job, you'll lose your apartment, you end up homeless even if you didn't commit a crime. This happens all the time here in New York with our cash bail system: people are locked up on Rikers Island for months at a time, oftentimes having not committed a crime at all. They are eventually released, the government says, "I'm sorry," and that person's life is destroyed. And this is the risk that these stores are taking in order to protect goods that are mostly worthless and are all covered by insurance.

[31:10] They have no risk here and they're willing still to threaten people’s entire lives to protect these things that they would have been paid back for anyway. That's the willy-nilly way these technologies are being deployed, it's already hugely over-reaching and it's still just barely, barely hitting the market. And it's going to get so much worse.

Daniel Forkner:

[31:30] Speaking of ways to abuse this technology: I think you mentioned earlier in the show how this technology might be bringing back old-school phrenology, which is, of course, the unscientific notion that you can judge a person's character or intellect or something by the way their skull is shaped, you know, or some other clearly racist notion. Well, there is a Russian behavioral psychologist who researches artificial intelligence and mass persuasion, and he presented to the Russian Prime Minister in 2017 on his research. And in the same year, he published papers on the way that he believes facial recognition software could be used to detect a person's sexual orientation, their emotions, their predispositions to criminal behavior, their IQ, even their political ideologies. And I think this is where we really start to see how this technology could be used for evil.

David Torcivia:

[32:28] Yeah, I mean, it doesn't take a huge leap of the imagination to think of ways that facial recognition that gives you information on sexual preferences could be easily abused by a variety of regimes. But it's not just Russia that is doing this research, we like saying evil Russia or whatever, but there was a paper from China analyzing people's faces to predetermine if they were likely to be criminals or not. They felt that they could detect criminals before people committed crimes, that some people have a criminal face shape and are more likely to offend. Of course, the data that they used was terrible, but it doesn't matter, because they can sell this technology, because police departments don't care, they just want something that works, that can get convictions, and they're not concerned about the science. Here in the United States, there are a variety of major companies, including companies like Microsoft, who are deploying similar facial recognition tools. Microsoft has one particularly aimed at reading people's faces during political rallies for a variety of reasons, one of which is emotions, but also, eventually, I imagine it will include political alignment, either by tying your face directly to some sort of database they have of your recorded political alignment, or by some sort of phrenology based on the way that you're dressed or your skull is shaped, saying that you're more likely to be a Democrat or Republican or an anarchist or a Nazi or whatever.

Daniel Forkner:

[33:47] Yeah, it was in 2016 that Microsoft showcased a facial recognition technology they call Realtime Crowd Insights, where the technology works by scanning a crowd of faces in real time and trying to estimate their age, their sex, but also identifying what the computer believes is their emotional state. And the salespeople at Microsoft actually promoted this as something to be used at political rallies. You know, something that the president or a political candidate could use to gauge whether or not people were engaged in what they were saying. But it doesn't take a great leap of imagination to see how this could be abused. Oh, we noticed these people didn't smile at what I said, therefore off with their heads or whatever.

David Torcivia:

[34:35] Imagine it combined with that behavioral analysis from earlier that is looking for people trying to assassinate your political leader, and it keeps getting false positives, but the Secret Service is out there just murking people in the crowd because they think that they might be a threat. I can easily see that occurring too. I'm laughing, but it's because I'm actually terrified.

Daniel Forkner:

[34:56] We are all training ourselves to get ready for the Realtime Crowd Insights; no matter what we actually feel inside, we just gotta bottle that in.

David Torcivia:

[35:05] Just got to smile, baby.

Daniel Forkner:

[35:06] Just gotta smile, otherwise we might be victims of face crime.

David Torcivia:

[35:10] Daniel, you're so funny and smart. [faking laugh]

Daniel Forkner:

[35:13] And we're talking about nefarious purposes, an evil intent. But I think that really comes back to what you mentioned earlier about technology creep, where it's not necessarily that someone sets out to say: okay, how can I design a system so that I can discriminate against my people by just saying that their political ideology is wrong or their sexual orientation is wrong? You know, you can imagine a violent political regime somewhere that wants to eradicate people of a certain sexual orientation or something. But ultimately, what we're doing with all of this facial tracking, all these cameras everywhere, is creating databases. Databases that have so many data points on not just your face but who you are, where you are, what you do when you're there, what rallies you attend, what concerts you attend, and then when you're at those concerts, what do you feel, you know, where do you go.

[36:10] And what this provides is a sandbox, a playground for the big data analysts to draw all types of correlations that may not mean anything but could be useful for promoting and selling technology for some useless thing. Again, take sexual orientation: I guarantee you, if you take 350 million faces and, you know, 10, 20, 30, a thousand data points underneath each one of those, and tell a computer to find a correlation, it can find a correlation with whatever you ask it to. But are those correlations actually meaningful? Who knows? That requires a much deeper scientific rigor and deeper analysis. But these commercial companies are just looking for a quick product that they can sell to some political regime that doesn't have any scruples with killing its own people, or whatever it is, or shutting down journalists who want to speak out against them, or raiding the apartment buildings of people in New York City rent-stabilized housing so they can get them kicked out and get kickbacks from the landlords. There are all kinds of ways you could use these products, and I guarantee these companies don't care if they're actually accurate. What they want is to make a profit, and we are providing them the opportunity to be competitive and create and innovate in ways that ultimately just come back and harm us.

David Torcivia:

[37:24] Let's name and shame some of these companies, Daniel, cause you didn't mention these companies, so let me just, there's so many: Amazon has a product called Rekognition, this is one of the most popular ones on the market; we've mentioned Microsoft; there is a company, the one that provides these systems that will automatically alert the police as soon as somebody walks in, that one is called Cogniz. There's an Israeli company though that I think has the most insane sales pitch, called Faception, which is, God damn, what a name, but let me just pull up their website real quick, faception.com/ourtechnology, so we go straight to the good stuff here. And there's a bunch of marketing speak talking about their facial recognition tech and all the stuff it can do. But if you scroll down, they have a section called Our Classifiers, and there's a little bit of copy here that says basically that their algorithms can score an individual according to their fit to these classifiers, and then it lists like six or seven classifiers. So why don't we read through some of these, Daniel, and you can tell me if we're getting into insane territory or not.

Daniel Forkner:

[38:30] According to Faception, it can read a person's face and tell you if they have a high IQ, which according to their definition is a self-made person, freethinkers and entrepreneurs, exceptionally gifted, tend to be less socially oriented – okay – they value truth, facts and logic more than emotional relations. Okay. I didn't know that all people with high IQ are less socially oriented. Also, I'm pretty sure IQ is kind of just quasi-pseudoscience to begin with.

David Torcivia:

[39:02] Yeah, I mean, it's like how good are you at a particular type of problem-solving test. There are so many different types of intelligence; it's really limiting. But it should be no surprise that a facial recognition company is trying to limit it to a very specific type. So High IQ is one of these; there's another one that says Academic Researcher. And they say that these are endowed with sequential thinking, high analytical abilities, a multiplicity of ideas, deep thoughts and seriousness, creative with high concentration ability, high mental capacity and interest in data and information. I think they definitely just scanned their own face and then wrote out this horoscope-like description of themselves.

Daniel Forkner:

[39:42] Wait, so the idea is that their software scans the crowd and tells you who fits the profile of an academic researcher?

David Torcivia:

[39:48] Yeah, but it doesn't end there, it gets more insane. So I mean, they're already scoring you on high IQ, they are scoring you on how much like an academic researcher you look. And then they're also scoring your, I guess, similarity to a professional poker player's face as well as a bingo player's face. And this is what they say about bingo players, which, remember, is a game where you just listen for somebody to yell out your number and then you try to yell bingo as fast as you can, that's it. And this is their description: endowed with a high mental ceiling, high concentration, adventurousness, and strong analytical abilities, tends to be creative with high originality and imagination, high conservation and sharp senses.

Daniel Forkner:

[40:32] Here's another one: they can label people pedophiles. Here it is: this personality trait suffers from a high level of anxiety and depression. Introverted, lacks emotion, calculated, tends to pessimism, with low self-esteem, low self-image and mood swings. Again, this is all just based off their face. Now, what is interesting is: this is kind of funny, right? But at the end of the day, this company is selling this product to governments around the world. And in their own words, they use this for Homeland Security and Public Safety contracts. So somewhere out there are police officers who are looking at a screen, watching people go by and seeing classifiers show up saying one person fits the likely personality of a terrorist, another the likely personality of a pedophile, another a professional poker player. And then, I guess what, the police are supposed to just...

David Torcivia:

[41:25] Look out for bingo players!

Daniel Forkner:

[41:27] Not act on that information?

David Torcivia:

[41:29] Yeah, they're like: this guy scores high as a bingo player and not as high as a pedophile, so I guess he can go through. But this guy, he's both a brand promoter and a terrorist, so let's give him the cavity search.

Daniel Forkner:

[41:41] Right, it's just ridiculous. What's a little bit ironic about the topic, and a little bit unsatisfying, is that for all this trouble and all this inconvenience, at the end of the day the technology doesn't really even work.

David Torcivia:

[41:55] It's an expensive mistake for us to make, but...

Daniel Forkner:

[41:58] And to reveal the problem with this software, researchers tested Amazon's Rekognition face recognition software set to an 80% confidence level, and they ran members of Congress against a database of mugshots from around the country. And what do you know? 28 members of the US Congress were found to match with people who had their mugshots taken by local police. Of course, these weren't actual matches, but it's kind of an interesting way to showcase the problems with these technologies: although they do discriminate disproportionately against people of color and low-income people, where they're being deployed most often, everyone is at risk for these false positives.

David Torcivia:

[42:41] Yeah, there's some part of me that says that if this was actually facial recognition detecting the likelihood of committing crimes, the number of Congress members who were flagged should be a lot higher.

[42:56] But the fact that this technology, set at 80%, thinks that almost 30 members of Congress are actually people photographed in mugshots shows that we can't have a lot of confidence in it. What Amazon does in their defense is recommend not using their software for law enforcement with a confidence threshold of less than 99%.
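(To make the threshold point concrete, here is a toy sketch with made-up similarity scores; these are not ACLU data and not actual Rekognition output. The same lookalike scores that count as "matches" at an 80% setting mostly disappear at the 99% bar Amazon recommends, but nothing forces a department to use the higher bar.)

```python
# Hypothetical similarity scores between four faces and their closest
# mugshot lookalikes (all values invented for illustration).
lookalike_scores = {
    "member_A": 0.82,   # resembles someone in the mugshot set, but isn't them
    "member_B": 0.86,
    "member_C": 0.91,
    "member_D": 0.995,  # the only score that would survive a 99% bar
}

def matches(scores, threshold):
    """Return every name whose top similarity clears the chosen threshold."""
    return [name for name, s in scores.items() if s >= threshold]

print(matches(lookalike_scores, 0.80))  # ['member_A', 'member_B', 'member_C', 'member_D']
print(matches(lookalike_scores, 0.99))  # ['member_D']
# At the 80% setting the test was run at, every lookalike becomes a "match";
# at 99%, most of them are thrown out.
```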

Daniel Forkner:

[43:16] To be unfair, I've seen reports that although they claim that that's the recommendation, they don't truly train the law enforcement officers they sell this technology to, to ensure that they actually do that and use it correctly. And why are they even shipping it out with the option to set a confidence score below 95% in the first place? Those are questions I want answered, David.

David Torcivia:

[43:38] I mean, it gets even worse than that, as always, Daniel; this story continuously degrades as we go on through this episode. So last year, the FBI was talking about their facial recognition system, which has probably, and I don't know this for sure so that might be reaching, one of the largest databases of faces in the entire world, certainly more than most private companies can offer. And they said they only have an 85% chance of correctly identifying a person from within a group of just 50 choices.

Daniel Forkner:

[44:10] But David, it gets even worse than that. There was a study done by the Massachusetts Institute of Technology that found that facial recognition software offered by IBM could only correctly identify the sex of women of color 65% of the time. That's not even their identity, that's their gender.

David Torcivia:

[44:31] Yeah, we’re setting the bar pretty low here, it's not even saying: can you match this face; it's just: can you figure out the sex of this person? And the software is failing on these dark-skinned women.

[44:43] But, Daniel, it gets even worse than that. So in the UK, police recently tested their facial recognition system. And when they put it into action, they found they had a pretty high failure rate. Do you want to guess?

Daniel Forkner:

[44:58] 5% failure.

David Torcivia:

[44:59] How high is too high?

Daniel Forkner:

[45:01] 10%.

David Torcivia:

[45:01] So that is to say, of every match they get, 10% of them would be incorrect matches. So if they got 10 matches, that would mean one was wrong. That would be pretty bad, because they're wasting somebody's time, they're wasting police resources, it is a mistake. But if it had been just a 10% failure rate, they probably would have been jumping for joy, because their actual failure rate on their facial recognition software was 98%. That means if they had 10 people the software identified as criminals, then basically all of them would be wrong. That means if they had a hundred people identified as criminals, then only two of them might be right.

Daniel Forkner:

[45:42] David, this number was so crazy, I had to look into it. And what appears to be the case is that the Metropolitan Police is using the software to basically just scan faces all day in public spaces, and if someone happens to match with, like, a wanted criminal on a list or some other suspect, the police get pinged and they get an alert. Of all the alerts that they received in the past couple of years, I think it was 98% of them that were mismatches. And what I found out too was that recently, like in the past five months or so, the Metropolitan Police received, I think it was, five matches, and 100% of them were mismatches. So their current failure rate is at 100%. And, of course, it's important to point out that these systems, for whatever reason, fail a lot more for women and for men of color. And this has big implications, right? So imagine you have a whole population of black students, high school aged let's say. They are at school; they are already, if you're in the US, being viewed by the police, their teachers, society at large as criminals or potential criminals. And now we're deploying these autonomous facial recognition systems designed to lock people out of schools, lock people out of stores, and all these systems are going to do the same.

[46:59] And that's the danger with these false positives, these failure rates of these systems. If it fails 35% of the time for black people, that means that a person who already goes about their life being viewed with suspicion will now have computers locking them out of spaces on the suspicion that they're a wanted criminal, or suspicious, or fitting the profile of some sex offender. And what will that do except deepen the stereotype? Deepen the feelings that we as a society place on them of being out of place and unwelcome. Which I think speaks to a larger problem beyond just people of color, because the more ways that companies find to implement this technology in churches, schools, retail stores, taxi cabs, our street lamps, the harder it is to ensure that these systems are accurate in the aggregate. Because while a particular system might have, let's say best case scenario, 99.9% accuracy under controlled settings, like the lighting and angles that are used to train that software, well, that environment goes out the window once it gets deployed haphazardly by organizations for specialized use.

[48:06] If a camera is accurate at detecting faces two feet away under ideal settings, but then a church puts the camera in a ceiling corner looking down at a congregation in partial shadow or something, well, now the accuracy plummets and you have people being misidentified as sex offenders and then the police get involved. And if you take, let's say, the population of the United States, 350 million people, and a 2% failure rate, and explode that out by that number of people interacting with the system every single day in a variety of contexts, that's a lot of people who are going to have their lives truly inconvenienced or worse.
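(A back-of-the-envelope sketch of the two numbers in this stretch of the conversation: why live scanning produces mostly false alerts even when the per-face error rate sounds small, and what a 2% failure rate looks like at US population scale. All the inputs are illustrative assumptions, not Met Police or census figures.)

```python
# Base-rate arithmetic: very few faces in a scanned crowd are actually wanted,
# so even a small false-positive rate swamps the true alerts.
crowd_scanned   = 100_000   # faces scanned by a live camera over some period (assumed)
wanted_in_crowd = 10        # people in that crowd actually on the watchlist (assumed)
hit_rate        = 0.90      # chance the system flags a genuinely wanted face (assumed)
false_pos_rate  = 0.001     # chance it wrongly flags an innocent face (assumed)

true_alerts  = wanted_in_crowd * hit_rate                           # ~9
false_alerts = (crowd_scanned - wanted_in_crowd) * false_pos_rate   # ~100
share_wrong  = false_alerts / (true_alerts + false_alerts)
print(f"{share_wrong:.0%} of alerts point at the wrong person")     # ~92%

# David's scale point: a 2% failure rate applied across the whole country.
us_population = 350_000_000
print(f"{int(us_population * 0.02):,} bad outcomes per round of interactions")  # 7,000,000
```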

David Torcivia:

[48:44] Well, let’s look at a very real example of somebody's life being destroyed by this technology. Now imagine this, Daniel, you're sitting home alone, it’s late at night, you hear a knock on the door, you go outside, you open it and then you’re tackled by police who kick you, break your teeth, beat you and arrest you. You haven't committed a crime, well outside of happening to look like someone else and triggering one of these false facial recognition alerts. This is what happened to Steve Talley in 2014 where he…

Daniel Forkner:

[49:15] This is the craziest story I've ever read.

David Torcivia:

[49:17] I mean, literally what I just said actually happened. So he hears a knock at the door, he's in his boxers, he went outside, he is tackled by the police, flashbangs went off, all of his neighborhood saw this attack on him, he went to jail.

Daniel Forkner:

[49:31] He was in jail for two months and he was held in a maximum-security cell.

David Torcivia:

[49:36] So I mean, imagine being locked up in a maximum-security cell for two months having not committed any crime, and then being released, and the police are like "oops, sorry, our bad, we had a false facial recognition match." And the way they figured this out eventually was because they just compared a surveillance recording of him at his actual workplace, where he was on the phone selling stuff at the exact time this bank robbery occurred, and I have no idea why it took them two months to come and check this evidence and see it very clearly.

Daniel Forkner:

[50:04] Yeah, I think it was audio evidence that cleared him, cause he was on the phone, yeah, selling like a mutual fund, he's a financial advisor. And what's crazy, so you mentioned that he was beaten by police outside of his home and that's how they got him to the jail; well, after his release two months later, he went to the doctor and they found that he had suffered "a broken sternum, several broken teeth, four ruptured discs, blood clot in his right leg, nerve damage in his right ankle and a possibly fractured penis."

David Torcivia:

[50:35] I honestly.

Daniel Forkner:

[50:37] Yeah, I didn’t know it was possible either.

David Torcivia:

[50:39] I'd heard of it, but I don't know exactly the mechanics of it, but it's terrifying.

Daniel Forkner:

[50:44] So after he experienced all those injuries: while he was in jail, he couldn't pay his rent, so after he was released, because he didn't commit the crime, he had to live in homeless shelters. He tried to get another job as a financial advisor, which was his old job, but no employer would hire him because, apparently, his charges showed up on his background check and, I guess, they didn't want to hire someone who had just robbed a bank two months earlier. So he was working really hard to get his name cleared, he was trying to sue the police department, and then, David, one year later…

David Torcivia:

[51:18] This story keeps going.

Daniel Forkner:

[51:20] Video recordings of a man at a different bank robbery got flagged as a match for Steve and he was again arrested for a robbery that he never did.

David Torcivia:

[51:30] You'd think that the second time around somebody at the police department would have been like, okay, very clearly this guy that we caught using facial recognition the first time didn't do it, so there's obviously some guy out there robbing banks who looks just like him. And then the second bank robbery occurs, and using facial recognition techniques again, they're like: oh, it's the same guy we arrested before, who we cleared as innocent, let's arrest him again. And that is literally what happened. Police have gotten so fucking lazy with their investigations, using this cheap forensic technology like facial recognition, that they don't even bother looking into, you know, things like alibis that would have been helpful during the first arrest, or the fact that there's very clearly a similar-looking person out there robbing these banks, and so they take him into custody cause it's the easy thing to do; they can close the book, it looks good on the records. I'm getting off-topic though, keep going with the story.

Daniel Forkner:

[52:24] No, I think you kind of hit it. I mean the prosecutor apparently had to do some backtracking, because initially they said that Steve had committed both robberies. But once it was clear that he did not commit the first robbery, they kind of had to backtrack and say, well, it must have been a different person doing the second robbery, even though initially, I think, it was pretty clear from the video recording that it was the same person. So it didn't really matter. You know what's so interesting about these cases is that when you look at it from the prosecutor's perspective, so often it seems obvious that they don't care who did it, they just need to put somebody behind bars. And it's like: clearly Steve didn't commit the first crime, and it was the same person who committed the second crime, but you don't know who that second person is and you have Steve, so why not just throw Steve back in jail? But you mentioned forensics, and again, going back to our episode Suspect Science, there's a cognitive neuroscientist from University College London who really sums it up pretty well, I think. She's talking about matching one face with another face: "What is similar enough? Nobody can tell you, it's in the eye of the beholder. You need to know that if this person has a right nostril bigger than the left nostril, are the chances one out of a million or is it every second person?" And that quote really made me think, because again, we are presented with this technology and we just take it for granted, like: oh, it's a computer, it's using an algorithm, of course it's going to be accurate. But what are the criteria, and who decided the criteria? And, I mean, do we even know if Steve has a twin brother? I don't know why I said that.

David Torcivia:

[53:58] [laughs] A long-lost evil twin brother.

Daniel Forkner:

[54:01] But once you think about it, it does seem kind of obvious that these are just correlations. And unless there is a truly rigorous scientific method to determine statistical significance, that's all it's going to be: a correlation. So we don't really know how significant a certain characteristic is. And we talked about fingerprints, again, another technology that's presented to us as this foolproof scientific method, but again, we don't know the statistical significance of one person's fingerprint versus another's. There is really no way to test that, it just comes back down to subjective experience: after the artificial intelligence has presented two possible matches, a human is sitting down looking at a computer screen, looking at two pictures, just guessing whether they're the same person's or not, whether it's a fingerprint or a face – that's what it comes down to. And expanding this with artificial intelligence, well, that's only going to make this worse, because those algorithms are built and directed by humans. But what's interesting in terms of the psychology is that now, when an authority uses these technologies, they have plausible deniability. Now when they bring in that suspect, you said it, David, they're getting lazy because they can, because they don't have to investigate and truly apply rigorous scientific methodology to build evidence in a proper way. They can just say: oh, the computer flagged this person, we looked at it, yes, that's a match, and why would the computer be wrong?
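
To make the "what is similar enough?" question above concrete, here is a minimal sketch of how an automated face match typically reduces to a similarity score compared against a threshold that someone simply chose. The embeddings, the helper functions, and the threshold values below are purely illustrative assumptions, not any vendor's actual pipeline.

```python
# Minimal sketch: a "match" is just a similarity score versus an arbitrary cutoff.
# The embedding vectors below are made-up stand-ins for whatever a real model outputs.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(a: np.ndarray, b: np.ndarray, threshold: float) -> bool:
    """Declare a 'match' if the similarity clears the threshold.

    The threshold is a policy choice, not a law of nature: lower it and more
    innocent look-alikes get flagged; raise it and more true matches are missed.
    """
    return cosine_similarity(a, b) >= threshold

# Two hypothetical embeddings for two different people who happen to look alike:
suspect_photo = np.array([1.0, 0.0, 0.0])
lookalike_photo = np.array([0.8, 0.6, 0.0])  # cosine similarity = 0.8

print(is_match(suspect_photo, lookalike_photo, threshold=0.9))  # False: "different people"
print(is_match(suspect_photo, lookalike_photo, threshold=0.6))  # True:  "same person"
```

The same pair of images flips from "no match" to "match" purely because of where the cutoff sits, which is exactly the "one out of a million or every second person" problem the quote describes.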

David Torcivia:

[55:25] Well, the obvious question then, Daniel, is what do we do about this? How can we fight back against this ubiquitous creep of facial recognition technology? Cut off our heads!

Daniel Forkner:

[55:37] Well, I was going to say plastic surgery.

David Torcivia:

[55:38] Plastic surgery gets expensive if I need it every time I go outside to buy something new from the grocery store. I mean, the obvious thing is to cover your face, right? But unfortunately a lot of places, increasingly in fact in the United States, have anti-mask laws. This is something that is absolutely the case here in New York, I've complained about it several times on this show. If you are a New York lawyer please reach out to me, I would love to gather a group of people to write some Protester's Bill of Rights that includes things like the ability to mask yourself in political situations. There's a long proud tradition of that both in the United States and around the world. You need anonymity when you're trying to act out against the powers-that-be. But unfortunately we've legislated that right away, and the NYPD is happy in protest situations to rip off your mask, to arrest you for wearing a mask, all the while shoving a GoPro in your face, recording you to add you to their secret facial database, to do who knows what with. So that aside.

Daniel Forkner:

[56:40] Did you read about that guy, I think it was in London, so the Metropolitan Police were testing a new facial recognition system and they had unmarked vans parked where police were inside scanning the faces of people who walked by. And this one guy was just walking down the street and another passerby mentions to him: just so you know, this van's a police van and they're scanning people's faces. So this person, he didn't even cover his face, he just kind of popped his collar, kept his head down low and walked by the van. And, of course, because the camera didn't pick up on his face, for some reason the police were offended. So they got out of the van, chased him down and fined him £90 for disorderly conduct or something like that. So even though the law stated that in this case it wouldn't be a crime to refuse to have your face captured, the police still took offense that someone would actually do that.

David Torcivia:

[57:34] Well, sure, I mean, this is actually one of the ways that we're seeing facial recognition actively being deployed by police, because you don't have to identify yourself, depending on the state, depending on the laws, when you are being detained or arrested or questioned by the police. There are some points where you eventually will have to, but a lot of times you don't necessarily need to do that. So what police departments are doing now: they stop you, they'll take a picture of your face and then upload it to their database and identify you that way. So you don't even have this right to anonymity that you used to have before. And this is very common, this is the same technology that police are using to catch people who steal $12 worth of gas or $30 worth of whatever. And maybe we didn't address this, and I think I should take a moment to before I get into the rest of this what-can-we-do, but we alluded to at the beginning of this episode how this technology is deployed to catch big crimes, flashy crime: sex trafficking, pedophilia, murders, terrorism – these things that are the great fears of our society. But in fact this technology is really bad at catching all of that, just like the police are bad at catching all of these. Most police work, and the use of whatever forensic tools they have, isn't about stopping these crimes before they've been committed, but about trying to piece things together and catch people after the fact. They're not a preventative force, they're there to file paperwork, help with insurance claims and maybe eventually put somebody behind bars, usually because that person fucked up and it's so obvious that even the police can figure it out.

[59:02] So, I mean, facial recognition is sold as this magic technology to put a Band-Aid over this and make it suddenly not the police's problem but technology's, AI's, these masters of our world: Microsoft, Google, Amazon.

[59:15] And upload the responsibility to the cloud and have these digital black boxes and neural nets take over the hard part of figuring out who's going to commit crimes, so the police can just react. Unfortunately, it's really bad at that. But what this technology is really good at is recording all the tiny petty crimes all of us do all the time and building a huge database that basically allows police to put anybody in prison whenever they want, because they know we're all committing crimes all the time. And we're all being recorded committing these crimes all the time, because most petty crimes are committed in places where this surveillance exists, and most major crimes are committed in places where video recording does not exist: private spaces, places off the beaten path, precisely because they're not being watched. That means that this technology can only really be applied to the large amount of petty crime that's being committed, things like jaywalking. I mean, here in New York everyone is jaywalking literally all the time, it's a proud tradition. But in China they're actively using facial recognition technology not just to catch jaywalkers, but to put your face on a giant screen to shame you as a jaywalker, and then it automatically fines you through your WeChat account, subtracting money straight from your bank account as soon as they detect you jaywalking.

[1:00:30] So it's a chilling effect, it's a shaming effect and it's automatically making sure that people committing these even tiny little crimes are being discouraged. And of course, I'm sure that dings your social credit, if you do this enough times you probably won't be able to fly, get on trains, whatever. Your life is ruined. And then the social credit system, remember, it's not just your life but your score affects your friends and your family. So this facial recognition is being used to control all of society. So you might lose your friends, people might distance themselves from you because you've been caught crossing the street, not at a crosswalk.

[1:01:04] Or crossing the street against the light even though there are no cars there. We're taking away common sense, the ability to do the things we need to do to get through our day-to-day, and punishing us because it maybe breaks some stupid law that doesn't really apply to the situation. I mean, laws are supposed to serve us, that's why we make them. We're supposed to agree to have them because they're for a greater good. But laws oftentimes need to be bent, because when they're enforced rigorously and across the board, then we're all being punished for things that really shouldn't be punished. Especially when we give the ability to selectively enforce laws to the police. Like you mentioned with that example, Daniel, where somebody didn't want their face recorded, and instead they get fined by the police for disorderly conduct. And this is an arbitrary thing that police can just make up at any point and punish you for, and there's very little we can do. I got a traffic ticket once for erratic driving because I changed lanes in front of a police car and they got angry that I drove in front of them; it was "unsafe driving" or something like that on the ticket.

Daniel Forkner:

[1:02:08] And to bring it back to the beginning of the episode, I think another example would be the landlord situation, where maybe police don't want to enforce jaywalking on everybody, right? But say there is a particular building where, let's say the governor, or maybe he's too high up, maybe some local city official, is really interested in pushing new development. Maybe there's a gentrification project going on, but oh wait, we have these pesky tenants in their rent-controlled or rent-stabilized apartments, and they're never going to leave, and we can't gentrify until we kick them out, so how are we going to do that? Well, now that we have facial tracking everywhere and we know where they live and we know where they're going, let's just follow them around digitally, find a moment where they jaywalk or toss a cup on the ground, littering, or break some arbitrary law that we wouldn't even consider negative from a societal standpoint, let's just watch them do that and then we'll go arrest them. And then we'll just get them bogged down in the jail system, the bureaucracy there, until they miss their rent payment, and then the landlord can kick them out. But while we're on the subject, David, did you call jaywalking a proud New York tradition?

David Torcivia:

[1:03:18] Yeah. We should have never given up the streets to cars, and New York, I find, is one of the only American cities where we really try and push back on that. At times people are literally walking into the street in front of a car, staring the car down, like a challenge: run me over, I don't care, I want to cross the street right now. And the drivers are just like, fuck it, I don't care, this is what you have to do to drive in New York, I guess. Imagine for a second, to carry this example just a little bit further: you know the NYPD has that secret database they've created, facial recognition particularly of protesters. Say there's an action coming up. They have this wide surveillance network through these kiosks they built around the city, they know where you're moving around, those kiosks could conceivably track you jaywalking, you get pegged for a jaywalking ticket, maybe multiple jaywalking tickets, and that becomes a large heavy fine. The cops know where you are, they come stop you, give you your tickets, then you have to either fight this or pay it off in court, and if you can't, it becomes a delinquent thing, you can face jail time, whatever. And you can very easily remove people who are considered a nuisance by the NYPD or by the city or by the governor or by the government or whatever through this constant nitpicking of all the stupid tiny laws that we all break. And remember, there are analyses that take a look at how many laws each of us breaks every day, and while it varies widely depending on which one you're looking at, most of us are breaking 10 to 100 laws every single day. Laws that oftentimes aren't being enforced, laws that we oftentimes don't even realize are on the books. But these are all things that could at some point be enforced in order to target us individually, and that is the power that facial tracking is enabling. We're handing over this huge amount of power to these institutions. And it doesn't even end with the state: if we pass this over to private industry, then this information could be passed from the state to private companies, and now we can't purchase things and we're really shoved out of modern society, the same way that the attempts to push us toward a cashless society could also potentially bring us there at some point. I got really off track here, and I really do want to take a moment to talk about some of the ways that we can fight back. And I mean, obviously, most of these revolve around the idea of: how do you protect your face, disguise your face, hide your face from this ubiquitous facial tracking?

Daniel Forkner:

[1:05:41] There are a couple of interesting examples of ways to beat these facial tracking algorithms, David. One, of course, is wear a mask. There's a Japanese man who is obsessed with making masks, and they're very much like the masks from, if you've ever watched Mission Impossible, one of the earlier ones, where one of Tom Cruise's go-to strategies was to 3D print a mask of whoever it is he wanted to impersonate. Well, there's a Japanese man selling these for about $2,700; they're plastic, and he's working on a softer, more malleable version, but they're very realistic, very realistic, you can try that. But I don't think people are using these masks specifically to beat facial tracking, it's a little bit too expensive. But there are other things that people are doing. Actually one of my favorites is a new type of fashion called HyperFace, that's the name of the prototype, HyperFace. It was introduced at the Sundance Film Festival in 2017, designed for the NeuroSpeculative AfroFeminism project, and what's really cool about it is that the fashion doesn't actually cover or alter your face in any way, which is counterintuitive. But basically what they're doing is designing the opposite of camouflage. Camouflage is something that obscures the thing that you're trying to hide

[1:07:03] You want to disappear among the leaves, so you wear clothing that appears to be leaves. HyperFace does the opposite: rather than hiding your face, the patterns present themselves in such a way that the algorithm thinks the pattern is your face.

[1:07:19] So in order for a computer to recognize your face in the first place, it looks for things that match up with its algorithm. And if multiple things match what the algorithm is looking for, it focuses on what it thinks has the highest probability of being what it wants. So again, instead of covering your face, this new pattern presents something that so perfectly matches what the algorithm wants that it ignores your actual face. The prototype is a bandana but this could probably be a t-shirt design or something, so that's an exciting development going on, David.
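
A rough sketch of the ranking behavior HyperFace exploits: detectors score many candidate regions and keep only the most confident ones, so printed decoys that score like faces can crowd out the real face. The detection list and the keep_best helper below are hypothetical, meant only to illustrate that ranking step, not the actual HyperFace design or any real detector's API.

```python
# Illustrative only: a pipeline that keeps the highest-confidence "face" candidates.
from typing import List, Tuple

Detection = Tuple[str, float]  # (label for illustration, confidence from 0 to 1)

def keep_best(detections: List[Detection], max_faces: int = 1) -> List[Detection]:
    """Keep only the top-scoring candidates, as many detection pipelines do."""
    return sorted(detections, key=lambda d: d[1], reverse=True)[:max_faces]

# Hypothetical detector output for someone wearing a HyperFace-style pattern:
# the printed decoys score as well as, or better than, the real face.
detections = [
    ("real face", 0.81),
    ("decoy pattern #1", 0.92),
    ("decoy pattern #2", 0.88),
    ("decoy pattern #3", 0.84),
]

print(keep_best(detections, max_faces=1))  # [('decoy pattern #1', 0.92)] - the real face never makes the cut
```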

David Torcivia:

[1:07:54] Yeah, HyperFace is really exciting stuff. It builds off an older art/hacking project called CV Dazzle, which was, have you ever seen those World War I ships, Daniel? That are covered in crazy black and white stripes and stuff like that? Well, that's called dazzle camouflage, and the idea wasn't to try and disguise your ship against the sky or the sea, but instead just to make it so crazy that they couldn't tell which parts were the front and which were the back, and sort of throw off the periscopes of the U-boats or whatever submarines were attacking these ships. And in the same sense, CV Dazzle covers your face in crazy shapes that try to throw off this primitive facial recognition technology so it can't figure out those points it's looking for on your face, identify them, record them in a database and then compare them to that database.

[1:08:44] Unfortunately, it's several years old now, and the technology for camera recognition has improved a lot since then. So the need for these more futuristic types of camouflage technology like HyperFace, which is a sort of "let's overwhelm them with a swarm of faces" approach, has been important in taking this further. But there's a very simple technology people are using to fight the same sort of facial recognition at the same time. Some researchers realized that if you just put on a hat that shines infrared light on your face via LEDs, that's enough to throw off quite a number of these facial recognition systems: it selectively highlights or darkens your face, basically, and it throws off those points the system is looking for. The best part is infrared LEDs aren't visible to the naked eye, only these cameras should be seeing them. Of course, they work better in some situations than others; in full sunlight it's going to be less effective than it would be at dusk or in the evening. All these systems have different limitations. Juggalo face paint actually, to shout out to the Juggalos, has been shown to be fairly effective in fighting facial recognition; it's very similar to these CV Dazzle makeup systems. Like I said, as the technology gets better, as they're moving to ear identification or gait identification or behavior identification, a lot of these facial countermeasures are going to start falling apart. But we can fight back in the meantime: covering your face however you can, masks, sunglasses, hats – all these things are helpful. There are some researchers looking at sticking stickers or tattoos of eyes or blips all over your face. What I would love to see is people playing with all this technology, seeing what works, seeing what doesn't, and then coming together to build a massive community database to let us know. Because a lot of us need to know what works to fight facial recognition, to fight these oppressive governments, these state institutions, and just be able to exist in a world with our anonymity preserved, at least in terms of the digital footprints that we leave behind even in the real world.
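
For anyone who wants to experiment along those lines, here is a minimal sketch of how you might check a countermeasure against one common off-the-shelf detector, OpenCV's old Haar-cascade face detector. The image filenames are placeholders, and getting past this detector says nothing about modern commercial systems; treat it as a starting point for that kind of community testing, not proof that anything works.

```python
# Count how many faces OpenCV's bundled Haar-cascade detector finds in a photo.
# Compare a plain selfie against one taken with the countermeasure applied.
import cv2

def count_detected_faces(image_path: str) -> int:
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    image = cv2.imread(image_path)
    if image is None:
        raise FileNotFoundError(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

# Placeholder filenames: one photo without, and one with, the countermeasure.
print(count_detected_faces("selfie_plain.jpg"))
print(count_detected_faces("selfie_dazzle.jpg"))
```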

Daniel Forkner:

[1:10:46] And if you're a fan of legislation, there are two bills that have been introduced in the United States Senate and House, both of which would bar the government from using facial recognition technology in public spaces, including on police body cameras, without a warrant, and could even allow people who are the victims of this type of technology to sue the government. These bills are still in committee, so we will see how they progress. And I suppose the only thing that I would add, David, is just be aware that these systems exist. If you are going to go protest or something like that, maybe wear sunglasses, maybe wear a hat. Obviously, in New York City, David, you have the anti-mask laws, but maybe there are still things you can do to try to obscure yourself from these cameras, like the fashion statements or the face paint that we mentioned.

David Torcivia:

[1:11:36] Just from the perspective that our world is bland and cyberpunk futures are kind of cool aesthetically, if you ignore all the social and political implications of those worlds, I would love to see people walking around with crazy face paint and stickers on their faces and stuff, just as an accepted fashion thing, in order to defeat these algorithms. Maybe that world will come, and maybe it'll come sooner rather than later. I'm not holding my breath, but the change has to start somewhere, so maybe this can be the Ashes Ashes fashion segment.

Daniel Forkner:

[1:12:08] And again, if you work in technology or you know people who work in technology, I think the implications of these types of things really aren't understood even by some of the people who should, above all, understand them. I was talking to a friend who's in technology, telling them a little bit about some of these things, and they were kind of surprised. And I think that's common in a world where any technological progress is seen as an end unto itself. So many people don't think about the broader implications, the, you know, scary ways these things can be abused, simply because technology is seen to be good and progress is seen to exist in only one direction. And we have to remember that technology isn't really this thing that just emerges the more we progress as a society; there's no set path to progress. The technology we implement, the things that we put out into the world, is very much decided by us, by human beings deciding: this is what we want to create. And we have the ability to create so many things, we have the choice to go in so many directions. And I think if we just put a little bit of thought into these types of things, I don't think we would want to go in this direction, although I know I'm preaching to the choir. So I'm just pointing out that not everyone is aware that we have the ability to go in different directions, and that the technologies presented to us as silver bullets for societal problems like shootings, like shoplifting, like crime – they're not necessarily that, and they could be sold to us for nefarious purposes. We should be aware of that so we can head it off as best as possible. Anyway, that's a lot to think about, David.

David Torcivia:

[1:13:47] As always, Daniel, but think about it we hope you will. And this time I also want to add: and do something about it. If you're wondering where to start, feel free to reach out and we'll point you in the right direction. You can find more information on everything we talked about today, a detailed sources list, that video we mentioned, as well as a full transcript of this episode on our website at ashesashes.org.

Daniel Forkner:

[1:14:12] A lot of time and research goes into making these episodes possible, and we will never use ads to support this show. So if you appreciate it and would like us to keep going, you, our listener, can support us by giving us a review, recommending us to a friend, or visiting us at patreon.com/ashesashescast, we appreciate it very much. We also have an email address, which is contact at ashesashes.org. We encourage you to send us your thoughts, we really do appreciate them and we read them.

David Torcivia:

[1:14:44] You can also find us on all your favorite social media networks at ashesashescast. We've also got a great Discord Community, we encourage you to join. If you want to find a link to that, it’s on our website, just scroll to the top, click the Community link and you’ll find a friendly link there for our Discord invite.

Daniel Forkner:

[1:15:01] Also, on a personal note, I am considering moving up to Boston in a few months maybe, August or so, I don't know. But I'll be looking for work to do up there, so if you happen to live in that area or have some connections, I would appreciate any help from you, the listener. Reach out to me and let me know.

David Torcivia:

[1:15:18] Let's get Daniel a job! So everybody, contact us and we can filter that information forward. But until then.

Daniel Forkner:

[1:15:28] This is Ashes Ashes. Bye.

David Torcivia:

[1:15:31] Bye-Bye.