Nothing Left to Hide Transcript


  • 01:55 A History of Spying
  • 07:09 Surveillance as Business
  • 09:03 Ethiopia
  • 12:35 Mexico
  • 15:14 Name and Shame
  • 16:02 Where Does this Money Come From?
  • 18:06 China
  • 23:19 Surveillance at Home
  • 28:25 Affecting our Right to Protest
  • 32:22 Terrorist or Enemy of the State?
  • 43:49 Predictive Policing
  • 49:24 Chicago
  • 51:37 How Data Could Be Used to Reduce Crime
  • 56:57 Nothing to Hide




David Torcivia:

I'm David Torcivia.

Daniel Forkner:

I'm Daniel Forkner.

David Torcivia:

And this is Ashes Ashes, a podcast about systemic issues, cracks in civilization, collapse of the environment, and if we're unlucky the end of the world.

Daniel Forkner:

[0:14] But if we learn from all of this maybe we can stop that. The world might be broken, but it doesn't have to be.

David Torcivia:

[0:21] Today we're continuing our series on surveillance. This time we're going to look towards the governments of the world, and of course those companies that help them, and how things have evolved since the revelations brought to light by Edward Snowden back in 2013.

Now at the time we learned how wide and pervasive the surveillance apparatus of the NSA and the US government had become, confirming all those hushed whispers of the cybersecurity community (and of course the shouting of the conspiracy world). And yes, that's right, everything was being recorded, and yes, the US government was spying on its citizens as well as everyone else, and yes, people got mad, people protested, they wrote a bunch of angry news articles, and our politicians promised that we would do better.

But has anything actually changed?

Daniel Forkner:

[1:06] David, that's a good question, but it certainly doesn't seem like much has changed, and we are seeing a lot of reports coming out these days about human rights abuses involving governments using surveillance equipment and products both abroad and at home.

Local law authorities are using this on American citizens and we're seeing a lot of reports of some really egregious examples of surveillance abuse in other countries around the world.

David Torcivia:

[1:30] Yeah I mean surveillance has really become an industry, which is sort of a crazy thing to think about. This world that had formerly been, you know, just governments is now companies selling products to these governments and building this huge worldwide industry that's designed to spy on people and to let these governments act on this information, usually in a very violent way. It's an interesting time and definitely a scary one.

A History of Spying

Daniel Forkner:

[1:56] Governments have always tried in some way to surveil their citizens, but the integration of government surveillance with the information technologies that allow the digitization of mass surveillance really started to develop in the 90s in the United States, when the Pentagon and the broader intelligence community started developing financial ties with groups that served as a bridge between the government and this network of private contractors, and as an investor in information technology projects throughout the country.

These groups served as shields behind which money could be funneled into technology startups and research, and private contractors could meet with Pentagon and intelligence officials to influence military operational policy and the use of these information technologies.

And all this was behind closed doors and away from public scrutiny with the ultimate goal of all this being to steer the future of information technology towards the benefit of the war and surveillance needs of our military and intelligence community.

David Torcivia:

[3:01] I mean it really shouldn't be a surprise that the government is working with these private corporations to do this stuff.

Edward Snowden himself wasn't an NSA employee directly, he was a contractor, right? And contractors and outside companies and capital investment have been a big part of this apparatus since it started to be digitized, like you said, in the 90s.

But what is really surprising is some of these companies, these capital groups, that the government has gotten involved with for this intelligence apparatus.

These are major companies you would recognize and tend not to think of in this way; companies like Google, like Facebook, but maybe that's more of a conversation for another time.

Daniel Forkner:

[3:37] Yeah some of the early involvement of the intelligence community with Sergey Brin and Larry Page is kind of revealing and didn't come out until just a few years ago, but like you said it's not really surprising that the government would be involved in information technology that it could use, but it's important I think to reveal the rationale behind the government's intent to get behind this technology.

[3:58] And so one of the groups that formed financial ties to the military intelligence community was founded by this U.S. Navy Captain who stressed the importance of manipulating perceptions when it comes to information warfare (IW).

He believed for example that it was important to manipulate the perceptions of civilian populations and political leaders so that their cost of War could be justified.

David Torcivia:

[4:21] That’s taking a page right out of Edward Bernays' Propaganda (1928) right there.

Daniel Forkner:

[4:25] Something we might want to discuss in the future right?

David Torcivia:

[4:28] The very near future I hope.

Daniel Forkner:

[4:29] So manipulating perceptions of civilian populations is something this Captain stressed, and the military intelligence community agreed with him.

DARPA which is -

David Torcivia:

[4:38] DARPA is that like Defense Against the Dark Arts?

Daniel Forkner:

[4:45] Well, they do have those YouTube videos you might have seen; those four legged robots that run around, they look like dogs but -

David Torcivia:

[4:53] That's straight out of Harry Potter. Harry Potter meets minority report.

Daniel Forkner:

[4:56] Except Harry Potter didn't have machine guns mounted to -

David Torcivia:

[4:58] Yeah well it's actually DARPA, the Defense Advanced Research Projects Agency; the same group that gave us those killer robots but also the internet.

Daniel Forkner:

[5:07] Yeah exactly, and they adjusted their priorities in the 90s towards this information technology saying that “we're only going to fund information technology that can be used in war,” so that gives you an idea of what they wanted to use this type of technology for.

And in 2003 - so this is the same year that the Pentagon's Total Information Awareness (TIA) program was exposed as having spied on American citizens, right?

In the same year the Bush Administration came out with their Information Operations Roadmap, and what it did is it directed the Pentagon to consider the internet as a weapon system, and that going forward the Department of Defense (DOD) should operate on the premise that it will fight the internet as if it were an enemy.

The administration urged the Pentagon to achieve “maximum control over the full spectrum of the world's emerging communication system,” and a review ordered by Donald Rumsfeld - he was the Secretary of Defense in the early 2000s - repeatedly highlighted mass surveillance as critical to this Department of Defense transformation.

David Torcivia:

[6:12] And that development of the military's relationship with the internet is actually continuing still today.

Just this past month the Pentagon approved nuclear retaliation for cyber attacks, which is something that before now had never been considered or even remotely possible. I mean, cyber attacks were considered maybe an act of war, even though they're constantly going on from all sides, but to be able to respond to one of these with nuclear weapons really shows how important the internet is and how great a risk these military leaders consider it to be.

Daniel Forkner:

[6:42] So that's pretty alarming but all this is to point out that the government's obsession; this desire to build massive surveillance technologies; control global communication systems; fight the internet as if it were an enemy; manipulate perceptions of people using information…

All this has played an enormous role in encouraging a huge global private industry for surveillance technology.

And frankly this industry has gotten way out of hand.

Surveillance as Business

David Torcivia:

[7:10] And this really is an enormous industry. We're talking billions and billions of dollars of revenue every year, with companies in every major tech nation.

The US, Germany, Canada, England, and Israel - especially the US and Israel, those are the two big dealers so to speak in this world - and dealers really is the right word, because like we discussed, these technologies, these surveillance tools (and some of them are more aggressive than that) really are weapons of war, right? If the government considers a cyber-attack to be something worthy of nuclear retaliation, then what we're talking about right now are arms dealers.

This software can be used as weapons; these are tools used by governments in order to target people, in order to bring real life violence down on them either through chilling, or through direct arrest, or assassination.

So what's interesting about this, though, is that this technology isn't governed in the same way that arms-dealing is. There is an agreement which allows for countries to regularly exchange information on conventional weapons and dual-use goods and technologies, which is what this software falls under. But it's non-binding, only a handful of countries have signed it, and the United States and Israel - again, the two major players in this field - did not sign it, so they're not held to it.

Nations do tend to have some sort of internal restrictions on selling to what they consider humanitarian-risk nations. The US for example has banned the sale of this software to Iran and Syria, more for political reasons than anything, because at the same time they're, you know, selling arms to Saudi Arabia despite the terrible things that country is doing to both its own population and places like Yemen - so being good of heart was never what stopped them.


And again these are political things; Israel too, the other big dealer here, does regulate the export, though they have a very bad history of selling to countries that use and abuse this stuff.

One really great example is Ethiopia.

Daniel Forkner:

[9:04] Ethiopia is a good example. This is a country that has a history of some human rights abuses; it's a country where only 5% of the population has access to the internet; it's a poor country with poor social infrastructure. And Ethiopia, we found out, was purchasing surveillance products from an Israeli intelligence company, and then it was using that technology to target journalists, academics, lawyers, and other people in countries around the world including the United States, Canada, and the UK.

What’s really striking about that is you have these developed countries building up these surveillance products, selling them to authoritarian governments around the world who can then not only use that equipment on their own citizens but also the citizens of other countries including the countries where that surveillance equipment was developed and sold from.

Even though there are supposedly restrictions and regulations to prevent that type of abuse, a lot of these companies can get away with it because they just say "look, we're selling to an authority, and they tell us they're not going to use it for nefarious purposes, so that's enough accountability for us."

[10:11] And it allows these countries, in the case of Ethiopia, to also get around other international regulations. For example, usually a country that wants to obtain information about someone from another country - to build an investigation or evidence for something - has to get consent from the other country. You can't just start spying on another nation’s citizens, right?

But this technology is so sophisticated that it allows these governments to get around those type of safeguards.

David Torcivia:

[10:38] Yeah, and the technology that does this is interesting. A lot of it does rely just on human error, so a lot of this is sort of phishing stuff where they’ll send you an email or text message or something, you click on the link because it's some desperate plea for you to do something, and as soon as you click that, your software, your phone, your computer is hacked.

You are susceptible to this software because of holes in your operating system and in the software that's running; they exploit those, they get inside your device, and then they're using that access to find your contact list, read your messages, look at your photos, go through your emails. All these things provide valuable data and build up networks, and the people in your phone at that point are also at risk, because they're now linked to you in a way that a despotic government could use to say "these are also enemies of the state" - even if it's just, you know, your dry cleaner, a friend, an acquaintance, someone you don't know, someone you don't agree with politically. All of a sudden, because you got hacked - oftentimes for no valid reason - they are also now on this list of enemies of the state.

And this is actually happening in Turkey right now; there was an app that the government decided is an app used by enemies of the state, and they're currently rounding up hundreds of thousands of people who downloaded this app, or who were even just tracked by a tracking pixel within this app -

So some of these aren’t even people who actually downloaded the app; they just had a friend who did, or looked at somebody's phone, or their friend used it on their WiFi - a million different things - and people are now languishing in jail under a regime that is rapidly declining into a dictatorship, and their very lives are at stake because of this kind of technology.

[12:10] Meanwhile, these companies in Israel and the United States and the UK throw their hands up in the air and say "oops, we didn't know that they were going to do this, but it's not our fault," you know, like "yeah sure we sold them missiles and nuclear bombs, but we didn't know they were going to use them to kill people, so sorry!" And that sort of thing is ridiculous.

They say that most of these things are used against enemies of the state, but the question of what is an enemy of the state is very interesting.


[12:35] So let’s look at Mexico for a second.

There’s a company called NSO Group that sold a piece of spyware to the Mexican government, and they assume the government's going to use this for catching "bad guys," whatever that means.

But what actually happened is - and this is the most benign group of targets I could think of - the Mexican government started using this spyware to try and catch nutrition activists, okay? People concerned that Mexicans were eating unhealthy diets. They wanted a soda tax implemented, which the Mexican government did, and these were the people who pushed for that; these are people who work for NGOs, people who worked for the government implementing this stuff. They were targeted by the government because the corporations that have been providing these unhealthy foods were complaining that this was cutting into their sales, and these people started getting weird messages on their phones and their computers.

One great example was this guy kept getting text messages from unknown numbers and they would say something like “Mr. Simon your daughter was just in an accident and it's really bad I hope you come. I'm sending you info about where she's been admitted” and there's a file attachment, and if he clicks that file attachment Boom he has spyware on his phone. Or the other one he got was “Simon buddy my dad just died we're devastated. I'm sending you info about wake please come” and again it's the same sort of attachment.

Or there was this one: "Simon, today there's a piece in the newspaper making a series of accusations of corruption against you, you have to check it out so that you can deny them," and if he clicks that, again the government would have spying software on his phone. And these are not terrorists, these are not people trying to blow up the government or assassinate someone; these are people saying "we should eat healthier," and they're being targeted by this software, by this government.

[14:14] Meanwhile this Israeli company, NSO Group, that sold them the software said "you know, how they use it is how they use it; we're just selling neutral software, we have nothing to do with this."

Daniel Forkner:

[14:24] That spyware once it's on their phone, it's important to point out what it can do right?

It can extract their text messages, it can find out everything that's on their calendar events so that you can figure out where they're going to be; it can look at their emails, instant messages, and where they are.

David Torcivia:

[14:39] It can also do more active surveillance too; so it can turn on microphones; it can turn on cameras and record.

Those scenes you see on TV where drug dealers are like "we have to take the phones, take the batteries out, and put them in the other room."

There's a good reason for that because your phone can spy on you in that way, and again these nutrition activists were being attacked by the government - and attacked is the right word I think in this situation - with this technology.

Daniel Forkner:

[15:04] The motto for NSO group is “Making the World a Safer Place.”

Name and Shame

David Torcivia:

[15:08] Yeah and stuff like that, this like self-righteous “we're doing good with our tech; we’re the good guys” thing is ridiculous.

I think we really need to name and shame some of these companies and the people that work there. These are companies like Endace, which is a New Zealand company; Lynch IT, that’s American; NSO Group Technologies, they’re US-Israeli; Verint, they supplied Colombia with a bunch of stuff; NICE Systems, which has one of the more ironic names for a spyware company - they’re Israeli and they sold to regimes in Kazakhstan and Uzbekistan that actively use these tools to silence political dissidents and control journalists to make sure political disagreements weren't getting into their newspapers. Another company is Netronome; and Cyberbit - they're the ones that sold the stuff to Ethiopia, a company that does billions and billions of dollars in revenue every year from the sale of this technology to many despotic nations.

Where Does This Money Come From?

Daniel Forkner:

[16:02] And where does this money come from that they're making these billions of dollars?

In Ethiopia for example this could be a dictatorship, an authoritarian government that has billions of dollars right, but it doesn't have the engineers, it doesn't have the social infrastructure of really educated people to develop these technologies itself.

But it can then use this money - maybe even it takes on debt - to purchase these products and enrich these companies, and then turns around and uses these products on its own citizens to surveil them, to investigate them, to frame them, to put them behind bars…

and then at the end of the day who's really paying for it? It's the citizens that are stuck with a debt that the government took on to purchase this - really what is military equipment right?

So it's really insidious when you consider that the people that this is really harming are the ones that are paying for it; they're caught on the hook for it; it’s preventing them from progressing and developing; and the companies that are making the products are the ones that are getting rich off of it in addition to the authoritarian governments that can use it to maintain their power.

David Torcivia:

[17:01] Yeah, it's really heinous from just about every possible direction you can look at this, and I'm willing to bet and put down my money that there are far more innocent people who simply disagree with the state - again, these are journalists, these are academics - being caught up and surveilled with these technologies than there are "bad guys": people that are looking to physically injure, to bomb, to blow up. The number of those people being surveilled by these technologies is just a tiny fraction compared to the regular good people that are trying to help, trying to make the world a better place, but because their viewpoints don't align with whatever government, they are being attacked.

And so if this is what can happen in nations that can't afford to have their own systems built, that have to sort of lease or purchase these plug-in spyware software systems - and again it’s software predominantly though there are hardware components every now and then - what happens in countries that have the funding, that have the infrastructure, that have the technological wherewithal to build this at home?

So these are countries like Israel itself, the United States, and of course China.


Daniel Forkner:

[18:06] China is a country that’s taking this surveillance technology to its logical conclusion every chance it gets.

China as a whole already has a big surveillance infrastructure in place. If you are a foreigner and you try to go to China, if you've ever filled out a visa application, they're going to ask you what company you work for, how much money you make, what country you're coming from, what your job is. And I did get a Chinese visa, so I did have to fill out all this information, in addition to getting someone from China - a citizen - to write a letter that says... actually I don't know what they said about me, but I had to have some kind of personal connection in China to get the visa that I got.

And it's a way to rank foreigners. So you're going to get a score - an A, B, or C - and it's all going to be made up of, you know... if you're a teacher from Egypt, maybe your score will be very low, but if you’re a doctor from the United States your score can be much higher, right?

But if you are a citizen of China, then the scoring is actually a lot more complex. Just as a regular Chinese citizen, the government will know things about your employment, how much money you make, loans that you have, what your family is like, things you do, traffic violations - all this to build scores about you, which then translate into privileges, or into things you're prohibited from participating in.

But there's a particular province in China, Xinjiang, that might very well be the most heavily surveilled area on planet Earth at this moment. This is even above and beyond what goes on for most citizens in China.

Geographically Xinjiang is very important to China for their “one belt one road” initiative to establish these trade routes going Westward from China's Mainland through Russia, Europe, and the Middle East. So there's a big emphasis on surveillance in this area, and it also has to do with a certain subset of the population there: Uyghurs who are linked in the eyes of the Chinese government to Islamic terrorist attacks.

You'll see that in certain cities you're going to have a police post every hundred yards, on every corner of the street in some places. And wherever you go you're going to have to go through a checkpoint like you do at the airport, right? They're going to check your bags, they're going to x-ray your bags, and they're going to check your eyes with an iris scan to make sure you are who you really say you are. In a lot of cases, if you're just walking down the street, police officers will ask you to stop, plug your phone into a machine, and go through your photos and all your text messages, to see if you've been engaged in any activity that the administration doesn't agree with.

Every car, every commercial vehicle in Xinjiang has a tracker that the government can use to find out its location.

If you go into a kitchen shop because you're a cook and you're a Uyghur and you want to buy a knife so that you can cut up your vegetables... this is the really crazy one: the shop owner is required by law to take your personal information - every citizen obviously has their own ID, with their name, their ethnicity, and everything about them on it - and what the shop owner will do is take that information, translate it into a QR code, and then laser-engrave that onto the blade, so that if that blade shows up anywhere, or if the police just want to inspect it, they know exactly when it was bought, who bought it, and where it should be.

And if all that's not enough, if you remember our episode last week on genetic engineering and the implications of giving up your genome information, well China is also right now actively collecting DNA data on many of these citizens to build a database of their genetic information.

You know, so that there's nothing they can do to escape the surveillance state.

David Torcivia:

[21:32] Damn that's crazy. I mean there's like surveillance, and then there's s.u.r.v.e.i.l.l.a.n.c.e, and that's definitely into the crazy realm.

But I mean that’s what total surveillance can look like, and it's also in the cameras there, right? So even if you're not part of this Muslim minority, everyone is being monitored as part of this. There are cameras everywhere, and these cameras have facial recognition like we talked about - and it's not just facial recognition; they recognize your skin color, what clothes you're wearing. They even have gait recognition - so if they can't see your face because you're wearing glasses or you have a mask or a hood on because you're doing something bad, they can still figure out who you are by how you walk.

They follow cars, they follow makes of cars, they follow license plate numbers. They've done show-offs of this technology where they invite reporters from the BBC or wherever, and they say "okay, go walk around, let's see how long it takes our surveillance system to figure out exactly where you are. We're going to give you a 15 minute head start or whatever, then go somewhere, and we'll tell you when we've found you."

It didn't take them more than 7 minutes to do this. This is a city with 100% camera coverage, they know exactly who you are, where you're going, and what you're doing.

And this technology exists right now solely in China, but we have the infrastructure for it in places like London - which has some of the most surveillance cameras per capita in the world - New York, San Francisco, Washington DC, LA.

All these cities are currently at the level where they have the camera infrastructure to enable this technology, or are rapidly beginning to get there.

Surveillance at Home

The question then at that point, just like we discussed in earlier episodes, once you have the hardware installed it's just a matter of connecting the software to it and then you can have the same exact system with the same pervasive panopticon that China has here in the United States… and at that point we have to ask ourselves “well what are they going to do with this technology?”

Daniel Forkner:

[23:19] Alright, hold up David. So I mean we're talking about China, we're talking about Ethiopia and Mexico... I mean I guess a lot of these technologies are being developed by companies in developed countries, right, places like Israel, places like the UK and the United States. Are you saying there's a chance that this technology is going to come back and affect us as citizens of these countries that are, you know, bastions of democracy and privacy?

David Torcivia:

[23:42] Privacy… that ship has sailed my friend. Maybe we can get back to the dock but right now it's not looking so good, because right now I mean this isn't a question of “could this happen?” but it actively is in a lot of places. Let’s talk about Fresno.

Daniel Forkner:

[23:57] Yeah okay so Fresno in California has been highly criticized for deploying a whole host of technologies to surveil people and come up with risk scores associated with their activity to alert police if there are individuals that maybe we should be more concerned about. Some of the technologies that they employed were things like software to track social media. So software that would look at every individual's Twitter, Instagram, Facebook posts, and try to come up with a risk associated with it like you know “this person said this and based on our algorithms we think they have a threat score of XYZ,” and then if a police officer ever had to visit their home or stop them on the street or something, their computer would tell them “hey this person has this risk score you might want to watch out.”

And they combined that with other things like their police cameras. In the Fresno police office they had a whole bunch of monitors that could at any time plug into over 200 police cameras throughout the city, more than 800 cameras they could tap into in schools and traffic cameras, and 400 more cameras that could be viewed at any time from their officers' body cameras.

If that's not enough they even have microphones strung up all throughout the city to work with this technology called ShotSpotter to where if a gun ever fired...

David Torcivia:

[25:13] Or something that sounds like a gun like a backfire from a car.

Daniel Forkner:

[25:17] Well that's true; they could use these microphones that are strung up everywhere to kind of triangulate it and find out the location.

David Torcivia:

[25:23] Wait wait wait, I want to talk about body cams for a second just as like a “I told you so.”

A couple years ago when all this horrible police murdering was happening - not that it has stopped or changed since then - everyone was clamoring "oh we need body cams, body cams on the police, it's going to make them magically perfect, whatever." I was telling everybody "noooo, we don't need more cameras on the police, because they're just going to use this for their own benefit and not for the citizens - for the public's benefit," and lo and behold, that's what's happened.

A lot of places, yeah, they have these body cams now; we paid a lot of money for them, and companies like TASER/Axon are very happy that we did, and a lot of this footage never makes it out of the police office unless it's something that very clearly demonstrates that the police officer did no harm.

Daniel Forkner:

[26:06] And even in some of these court cases where civilians have died wrongly at the hands of police officers, the courts have limited the jury's ability to view this body camera footage to the very last seconds of the interaction between the police officer and the citizen who was killed, so the full context of these videos is not even displayed in court cases where it could be very important to a jury's decision.

David Torcivia:

[26:29] Right, right, and even more than that, it's been shown time and time again that even though these are body cameras attached to someone, the angles can be very deceiving. You can think you see something, but when you see it from a camera on the side that actually saw the whole thing, it looks nothing like what we thought it did.

They’re a colossal mistake in terms of trying to demonstrate what really happened in a situation - this idea that video is a neutral observer... which is something we'll talk about more in the future, and we'll talk about the police more in the future as well. But what I want to focus on is the combination of this body camera technology in places like Fresno - where these cameras are constantly piping data back to either the police station or some central server at TASER somewhere - with facial recognition technology.

This is not something that is coming in the future; this technology exists right now. It used to be that these videos would be scanned later to see if anybody was recognized, but we are now at a point where software is being sold to police departments where - I'm an officer walking down the street, I've got my body cam on - the body cam is actively scanning the face of everyone I pass and feeding information about those people to me through an earpiece.

This will be like, "okay, this male up here has an outstanding warrant for some sort of stupid non-violent crime; you can arrest him if you want," and maybe even integrating threat scores: "look out, this person has a high threat score" - which is a conversation about predictive policing that we'll get to in just a little bit.

But this is a huge privacy violation, and a huge tactical mistake primarily coming from the left - people pushed to regulate these police forces, but instead it just gave them even more power over themselves, even more power to control the image they want, even more power to shift the narrative in whatever way serves the police, as well as a new tool to further expand the surveillance panopticon in ways that people frankly never even considered were possible.

Affecting Our Right To Protest

Daniel Forkner:

[28:25] A pretty alarming example of how this video surveillance can be abused by local authorities is a case that happened in Memphis recently, where local authorities used footage they had taken of Black Lives Matter protesters to create a watch list of individuals who participated in the movement, and then prohibited those individuals from entering Memphis City Hall without an escort.

Frankly I don't even know how that's legal but it goes on.

David Torcivia:

[28:52] Yeah, and I've been in protests in New York. The NYPD has a unit called TARU, the Technical Assistance Response Unit.

If you've ever watched Law & Order, you know the guys who come in and do the "oh yeah, let me hack the phone" beep beep beep, or "let me track down where this person is" - those are people from the TARU unit, coming in and assisting with whatever technical need the police department has. And I see these people at protests; they have big vests and jackets that say NYPD TARU on them, and they walk around like they're kings. What they've started doing in the past couple of years is walking around with a GoPro. Now, New York has a no-mask law, which says that if you're at a protest - or technically anywhere - you're not supposed to cover your face, and the reasoning was that people at protests might do something bad, so police need to be able to catch them.

But this law was made before facial recognition and pervasive surveillance existed. So now what the NYPD does is exploit the fact that we're not allowed to cover our faces and hide our identity: they walk through these protest crowds, pushing people out of the way, and if somebody is wearing something on their face they'll rip it off. It's one guy holding this camera, another officer behind him with a hand on his shoulder, and behind them several more officers from the regular NYPD with one hand on their guns making sure everybody stays away, and they shove this camera in your face and slowly walk through the crowd, filming everybody around them.

And typically they'll go first to the sections they think are going to be the most trouble - the anarchists, the black bloc - then they'll move to the protesters the state disagrees with most, groups like BDS, people shouting about Palestine, and film those.

Then they'll move on to the rest of the crowd, and all of this is recorded and dumped into the NYPD's secret facial recognition database. Right now we have no idea what that database is, what technology it uses, what they do with it, or why the NYPD is recording people peacefully protesting, following the law, doing exactly what the NYPD tells us to do, in protests that have permits and are allowed to be there.

They're capturing all of us, recording this, doing something with this data, and they won't tell. They're actually being sued by several organizations right now, including the ACLU, to give up any information at all about this program. Those court cases are ongoing, but we're seeing this again and again all around the country. I know my face is in this database, and I know lots of my friends' faces are, so we need to have this conversation: either stop this facial recognition program aimed at people who are following the law but are seen as dissidents by the city or the state, or let us cover our faces and protect our anonymity. Because going to a protest now means you might end up on some list, you might have some threat score associated with you - so that when a cop passes you, they get that little notice in their ear from their body cam saying "look out, this person is a dissident," and that makes the cop tense, and maybe you end up in some sort of situation because of that.

That's alarming; it discourages people from protesting, from participating in the political process, and it forces us to choose: am I going to do something I believe in, or am I going to be safe and protect myself?

The fact that we can't cover our faces, knowing that somebody is going to do this, is a huge problem; it forces people to make that decision, and that's something that shouldn't happen - certainly not in a healthy democracy.

I would love to see some sort of protester's Bill of Rights, and if any of you listening have political involvement or experience working on legislation, this is something you need to get interested in right now, because as we move through these next two years I think this is going to be very important, especially as this technology becomes more accurate and more pervasive.

Terrorist Or Enemy Of The State?

Daniel Forkner:

[32:22] These technologies are presented, both by the companies that make them and the governments that use them, as a way to fight terrorism and protect national security.

But as you're pointing out, David - and as the Memphis Black Lives Matter watch-list case, among many others, demonstrates - this surveillance is used less as a tool for fighting terrorism than as a way to fight potential enemies of the state.

These two labels might be equal in the eyes of our governments, but they are absolutely not the same thing, and we should not tolerate their conflation.

An enemy of the state is any individual or organization that threatens the status quo - the current power structure. In the United States, the simple act of trying to organize a political party, rallying people behind issues that would shake up the current political structure, or going to one of these protests like you mentioned, David… any of these things can get you labeled an enemy of the state, in which case the full power of all our surveillance technologies will be used against you.

And if that seems far-fetched, consider that the Federal Bureau of Investigation was founded in the United States in response to the perceived need to surveil and discourage political dissidents - basically, citizens whose political views differ from the administration's.

In the 1970s, the American public was alarmed to discover that the FBI had a program that for a couple of decades had illegally surveilled, infiltrated, framed, and in some cases even murdered innocent citizens who were members of domestic political organizations - organizations in the feminist and civil rights movements, for example.

David Torcivia:

[34:04] Right and this includes people like MLK, Malcolm X and everyone else. This is COINTELPRO.

Daniel Forkner:

[34:10] Exactly and the tactics of COINTELPRO and other government programs are usually defended in the name of National Security, but then the question becomes:

Whose security is really being protected when citizens can be murdered by our own government simply for their political views?

David Torcivia:

[34:26] Yeah, and what the government and law enforcement are doing is really wrong, but maybe what's even worse is these companies that have looked at this and said not "this is a terrible thing, we shouldn't do this," but "you know, I can make a product, sell it, and make money off this practice."

I mean, as cartoonishly supervillain-evil as that sounds, this is what lots of companies are doing - like we saw with those US and Israeli spyware companies, and like we've seen with a lot of companies here in the United States selling products to local law enforcement and the government in order to surveil and actively arrest Americans.

Daniel Forkner:

[35:03] Okay, this is an example that really blew my mind; it's not the craziest surveillance thing out there, but it goes to show the connection between money, surveillance, and quote-unquote "justice," right?

This company Vigilant Solutions has a product: ALPR (automatic license plate reader) cameras that attach to police vehicles, read license plates, and add them to a huge private database of vehicle sightings.

Initially, when this company tried to get local police authorities to adopt this technology, the local police said “we actually don't need it; it's too expensive and the only use we would have for license plate readers is to catch car thieves, but car theft has actually declined a lot over the past several decades.”

But this company didn't want to give up on an opportunity, right? So they came up with this brilliant idea: "hey, let's give these police officers the technology for free, and in exchange we'll require them to give us all their information on anyone who has an outstanding warrant or outstanding fines. Anytime one of those people's vehicles pops up on our cameras, if the officer makes an arrest and it results in a fine, we get 25% of that fine."

David Torcivia:

[36:15] For a second I thought you were just going to say like we drone strike them but that's not that much less evil.

Daniel Forkner:

[36:20] I mean it's legal bribery right?

David Torcivia:

[36:26] Well, it's just perverse incentives. It's a lot like the private prison system, where they lobby the government saying "okay, we'll build these prisons, we'll run these prisons, but you have to guarantee us a minimum number of inmates any way you can" - and the way they can is by introducing crazy laws like three strikes and aggressively pursuing the silly things this ALPR technology enables: "oh, you've got a warrant out for your arrest? You're off to jail. Oh, you haven't paid this fine? Hand it over, and here's a bigger fine." A lot of these technologies are focused on enforcing trivial, non-violent offenses where nobody gets hurt but which are often symptoms of poverty; they're easy to pursue, they're safe for officers, and more than anything they're a giant cash cow.

Daniel Forkner:

[37:06] So what are some other technologies that might be used today to surveil American citizens?

David Torcivia:

[37:12] Okay, so we've got a bunch; we've talked about cameras, we've talked about facial recognition - that's the big push on a lot of this. Other types of surveillance: there's software that spies on social media accounts, and there are Stingrays, which have the coolest name of all this spy tech. These are little boxes that act as mobile cell phone towers - but malicious ones. A Stingray gets in between your phone and the tower it's supposed to connect to, intercepts the connection, and acts as a man-in-the-middle device, capturing the data from your text messages, your phone calls, and everything else in between.

As an aside to everyone listening, if you want to get away from this - if you're at a protest or somewhere you know Stingrays are going to be deployed - don't text, don't make regular phone calls; use an encrypted messaging application like Signal, which can do both of those things without exposing the content to whatever sits in between.

But that's a conversation for another show - we will do a show on this everyone, on how to protect yourself from these technologies and to make mass surveillance as ineffective as possible, but like I said that’s a conversation for another time.

Daniel Forkner:

[38:18] The contracts police departments have to agree to just to acquire some of these technologies, like the Stingray, are kind of alarming. If a department wants to purchase Stingray equipment, it will often have to fork over $500,000 up front before it can even see the technology - the company doesn't allow departments to view a demonstration until after they've bought it. And in the process of buying it, departments have to sign non-disclosure agreements: they won't tell anybody about the technology, they won't tell anybody how it works, they won't disclose anything about it.

David Torcivia:

[38:53] And then it comes back to bite them in the ass sometimes. There have been court cases where, because this technology is so secretive - say they capture a drug dealer using this Stingray tech - when the lawyer asks "where did you get this evidence?" and the police department says "from a Stingray device," and the judge asks "what's that? Tell us about this Stingray device," the department goes "oh wait, oops, never mind," and they've literally dropped cases because they don't want to reveal the technology.

That leads to something else - they actually have a whole technique for this now, common both in police departments and in cooperation with groups like the NSA and DEA, called "parallel construction."

Daniel Forkner:

[39:33] This is a technique that’s used to get around laws that are in place to protect people, to prevent police officers and other investigators from collecting evidence illegally.

David Torcivia:

[39:43] Yeah, laws that keep our justice system honest; this is a system designed to get around that.

Daniel Forkner:

[39:48] And the way it works is that an intelligence agency like the DEA will use one of these surveillance technologies to target individuals and acquire information about them that could point to illegal activity.

David Torcivia:

[40:00] Usually an illegal surveillance technology; so no warrants, nothing.

Daniel Forkner:

[40:04] And then once they have this evidence, this data on certain individuals, they’ll go to a police department, or maybe vice versa maybe the police department calls the DEA, and the DEA will say “hey we have this information on this person; they're going to have drugs or they're doing some other illegal activity, but you have to go get the evidence yourself because it's technically illegal that we're sharing this information with you and if you were to bring it up in a court it would obviously be thrown out.”

So then the police department has to go out and fabricate a whole investigation, to make it appear as though they collected this evidence by chance - maybe through a routine traffic stop or whatever.

David Torcivia:

[40:42] Okay, sometimes this parallel construction ends up in the craziest, most harebrained, Three Stooges kind of scheme.

There was one case where the police department cooperated with the DEA; they knew this guy was going to be carrying cocaine or some sort of drug in his car. So they had another police officer in a truck follow the suspect, and when they pulled up at a light, this truck ran into the back of the suspect's car. The officer popped out of his truck pretending to be drunk, so naturally the suspect said "okay, we've got to call the cops," and they called the cops - who were in on it and knew it was going to happen.

And the cops said "oh, this is a drunk driver, we have to take you in for processing," so they had the suspect sit in the police car and wait while they were quote-unquote "doing paperwork" to interrogate the supposedly drunk police officer, who was in on it and not really drunk. Then they had yet another police officer come and steal the suspect's car and drive off with it - the suspect going "oh no, my car is being stolen," not knowing it's a police officer stealing it - and they chased him down, eventually "caught" him, and said "okay, well, we caught the suspect who stole your car, and we're going to search the car to make sure there's nothing in there." They searched the car and of course found the drugs. On some level, I don't know how this is not some sort of crazy entrapment; I don't know how you can still have probable cause when you know it's a police officer stealing the car, and I'm not sure how this would have held up in court, but these are the kinds of things these people are coming up with.

Less ridiculously, sometimes it's just a fake tip the officer calls in - "we got a tip on this" - and because it's an anonymous tip, they never have to account for where it actually came from. This is the kind of thing this surveillance, this access to these tools, enables, because we're supposed to trust that officers, law enforcement, these three-letter agencies are going to use these tools quote-unquote responsibly and legally.

But a lot of times the allure of using this tool, of this technology is much greater than the Judicial Systems that are supposed to limit how it’s used, and so they will use it anyway, find something and then have to make up a story about how they caught you anyway.

Daniel Forkner:

[42:44] There's so much crazy tech out there being developed to increase surveillance capabilities. I mean, we've got planes that people have found flying over protest sites, right?

David Torcivia:

[42:55] I actually run plane-tracking equipment in my apartment, like a crazy guy wearing tin foil, because I follow these planes.

Daniel Forkner:

[43:03] And at one point, wasn't it the US military that tried to develop a blimp?

David Torcivia:

[43:07] They're still actively developing a spy blimp.

It's a blimp totally covered in antennas and sensors; they fly it really high above the ground, and it has cameras tracking and following everything beneath it, picking up literally every single signal from the cell phones and everything below.

They tested this over Americans, of course - I think in Baltimore, conveniently where all those protests and things were happening; I don't know if that was a coincidence or not.

They have drones that are flying around spying on everything, there are high altitude drones that record literally every single car in a city; if you have some sort of crazy sci-fi idea of what surveillance could be there's somebody out there developing it, or it's already out there in the hands of local law enforcement all around this country.

Predictive Policing

But of course the biggest one here, and the one that poses the most danger, is that Minority Report technology: predictive policing.

Daniel Forkner:

[43:59] Predictive policing is an emerging field that comes out of this emphasis on big data, machine learning, and AI. It seeks to take all this data we're collecting and build algorithms to predict what's going to happen in the future - in this case, to make police officers more efficient so they can go to the locations where there's going to be crime. It's marketed like this: "if we take all this crime data and all this social media data and plug it into these algorithms, we'll get an objective and fair prediction of where we should send police officers to find more crime, and it will make us all safer."

This idea that you can build an algorithm around big data to solve problems is obviously getting a lot more attention these days, just because of our computing power and our ability to collect so much more data. And there's this idea that these algorithms are objective because they're math-based, because they use data, and data is based on facts - how could that be biased?

David Torcivia:

[44:59] And as we’ve talked about technology can't be used for evil right?

Daniel Forkner:

[45:03] Right well and that's the problem is that this technology is ultimately a tool and it depends on how you use it, but these algorithms are particularly insidious because what they end up doing is really just leveraging human biases in data and making them that much worse in practice.

The way this works in predictive policing: in order to build these algorithms that are supposed to be objective, you have to start with data - data on arrests and the locations of those arrests. And what that really is - it's not data on crime itself, it's data on police behavior - so if you have an area that is already over-policed, you're going to feed the machine data that is already biased toward a certain subset of the population and a certain location.

[45:50] I'll give you an example of this: white people and black people smoke pot at about the same rates; in fact for certain demographics white people actually smoke more pot than black people. But if you look at the arrest records, blacks are arrested much, much more than white people; in some municipalities that figure can be as much as 10 times more than white people.

So that's discriminatory right? That data is not fair it's discriminatory, and if you try to build an algorithm based on that data you're going to get an algorithm that tells you “hey go to these areas that are black and search them for pot because you're likely to get arrests.”

This attempt to predict where police should go to arrest people is really just based on where police have gone in the past, and areas that are over-policed now will become over-policed in the future, even more so with this predictive policing.
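That feedback loop can be sketched in a few lines of code. This is a toy illustration, not any real vendor's system: the neighborhoods, patrol counts, and offense rates are all invented, and the true offense rates are identical by construction.

```python
# Toy model of the predictive-policing feedback loop.
# Two neighborhoods with the SAME true offense rate; the only
# difference is that "A" starts out more heavily patrolled.

true_offense_rate = {"A": 0.05, "B": 0.05}  # identical by construction
patrols = {"A": 70, "B": 30}                # initial bias: A is over-policed

for year in range(5):
    # Arrests track where the officers are, not where crime is:
    # each patrol observes offenses at the (equal) true rate.
    arrests = {n: patrols[n] * true_offense_rate[n] for n in patrols}

    # The "predictive" model allocates next year's 100 patrols in
    # proportion to past arrests -- which just mirrors past patrols.
    total = sum(arrests.values())
    patrols = {n: round(100 * arrests[n] / total) for n in arrests}

print(patrols)  # the 70/30 split never corrects itself
```

Even with identical underlying behavior, the allocation stays 70/30 forever: the arrest data encodes where police went, so the "objective" model simply launders the original bias back out as a prediction.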

David Torcivia:

[46:41] So that's a lot of stuff to think about and consider, but the very short and simple version of it is: bad arrest data makes bad predictive policing, and as it stands almost all arrests are bad right now.

We don't evenly enforce the laws okay? There are areas where different police departments have different policies of what laws they enforce. There are areas where different police departments are explicitly racist in who they enforce laws against. There are areas where if you're white and you do something you're fine, and if you're black you're not.

[47:18] I grew up in the South and I have many black friends who say “oh yeah we don't drive through that town because we know if we do anything wrong, whether we drive too slow, too fast, we don't stop smooth enough, the cops will pull us over.” Meanwhile in that town I can do whatever I want because the color of my skin is different.

But when you load this arrest data - and again, we tend to think of crime data as covering all crime, but it's really arrest data; a data point is only created when an officer decides to act, to record an event - when that data is fed into these algorithms, these supposedly neutral bastions of pure code and technology, we get distorted enforcement, distorted predictions.

And it really is just prediction, right? Policing as a field is reactionary. Police don't show up before a murder; they don't show up while a murder happens, at least not on purpose. There are times when they arrive in time to do something, but those are rare, and for the vast majority of crimes - and violent crimes are just a small portion of all crimes - police react. They show up after the fact and record something for insurance or for statistics, and maybe, if you're lucky, they catch whoever did it, but for the most part it's a reactionary force. So the hope is that predictive policing will enable them to predict crimes - that's the ultimate goal - but does it really work like that?

The only application now is it says "oh, we should put more cops here on these days." This is the same technology retail stores have been using for years: "this is the day the store is busy, so let's bring in extra shift workers."

That's all this is at the moment. If it predicts a murder somewhere, what are you going to do - wait outside someone's house for them to kill someone? That's crazy; that's not going to happen. But maybe in this panopticon, this perfect surveillance world where all our actions are tracked and recorded and they know where we are at all times, maybe then they'll want to apply this predictive policing by saying "okay, we know this person has a very high threat score right now; let's watch what they're doing."

[49:20] But for now that technology isn't in place - and again, it is coming into place, so maybe this is something we need to start thinking and worrying about.


Chicago

[49:28] But until that's the case, this doesn't work, it doesn't do anything. Chicago actually ran a large predictive policing test, and the hope was that it would dramatically cut down on violent crime, in particular the homicide rate.

The end result of all this - all the money spent, the time invested, the officers moved around according to what the code spit out ("this is where the crime is going to be; make sure there are people here") - was nothing. Homicides didn't change at all; the predictive policing was totally useless, and nothing happened.

Daniel Forkner:

[49:56] I like your example of trying to predict whether someone might commit a murder. When people think about crime - the type of crime that worries them - it's probably violent crime. But predictive policing using this data cannot predict violent crime; it's impossible. At best it would come in the form of a risk factor, like you mentioned: "oh, this person has a 30% chance of committing a rape in the next year."

Well, what do you do with that? Are you going to follow that person around every time they go on a date? It's silly; it doesn't make any sense. So what predictive policing really does is focus on non-violent crime that is predictable and translates into arrest records. But that's another piece of this "algorithms are objective, it's just math" argument - what is the goal of the algorithm?

If you're trying to fight crime and the goal of a police department is more arrest records, does that necessarily reflect what everyone in that community thinks is the right approach to decreasing violent crime? Probably not.

David Torcivia:

[50:56] Arrest records, like you mentioned, are primarily for non-violent crime, and non-violent crime is primarily a crime of poverty. So what we end up making is an algorithm that predicts poverty - which we don't need an algorithm for - and we achieve nothing.

Daniel Forkner:

[51:11] And then it sends police officers to enforce laws against poverty - you know, going to the bathroom in a public park because you don't have a place to live, or loitering, or-

David Torcivia:

[51:20] -having a turn signal out on your car, all these silly things that keep people in the poverty cycle. This algorithm just reinforces that, setting aside manpower to enforce these poverty-enforcing laws and ideas, and we get this vicious cycle.

It's a shame because this data actually is valuable.

How Data Could Be Used To Reduce Crime

[51:37] We hate on data a lot on this show; we talk about all the terrible ways it can be used, but ultimately you can use data for good.

We could be using this very same predictive technology, these algorithms, to say "these are areas of poverty - how can we help them? How can we make people less likely to commit crimes?" The criminal justice system's answer is "put cops everywhere."

But if we looked at this from a better perspective - "how can we help communities, how can we get people out of these poverty cycles, out of these crime cycles?" - we could say "maybe a park here, maybe investment in a community center here, maybe better schooling in this area": things that help people break out of these systems and give them alternatives to committing crimes. Setting aside funding for programs that benefit people, instead of paying for more surveillance and more officers' overtime spent standing around threatening people, would be a much better use of our time and resources.

Daniel Forkner:

[52:31] That’s such a good point, and I just realized that even the word “crime,” if I hear that word or if I read it I immediately think of police, and arrests, and bad people, but we need to maybe think about crime differently as a symptom of an underlying societal flaw.

David Torcivia:

[52:46] It's not all crime right? I mean there is some violent crime that isn't part of this, but that's such a small portion of this - most crime like you said yes is a symptom of poverty.

Daniel Forkner:

[52:54] Just to clarify, just to summarize:

The way predictive policing works, and the way it'll work going forward, is that data on arrests and their locations is examined, and then an algorithm tells a police officer where to go based on that data.

Cathy O'Neil, a Harvard-trained mathematician who wrote the book Weapons of Math Destruction about the problems with using big data, gives this example:

After the financial crisis, if you had gone to Wall Street and arrested every banker and found out that a lot of them had cocaine in their pocket you'd make a lot of arrests, and then that data would be translated into this predictive policing and it would say “hey, go to Wall Street because a lot of people there are committing crimes with cocaine.”

But we didn't do that, so there is no data on Wall Street bankers' crimes and cocaine use; that data does not exist. These algorithms and these police tactics will never target that group of people; they will only be used to target people who are already discriminated against and already over-policed - where the data exists.

David Torcivia:

[53:55] I know a bunch of financial people, and they do all have cocaine in their pockets.

Daniel Forkner:

[53:58] Maybe not all of them but…

David Torcivia:

[54:00] Well, where this is most dangerously applied is in sentencing, after the arrest is made. These same algorithms that predict crime are used to predict whether you're going to commit a crime again, and this is a very common practice; it's in many courtrooms around the country - far more than the number of municipalities using predictive policing. It's a common part of sentencing.

So you commit a crime, you go to court, you lose your case, you're found guilty, and now the judge has to decide how to sentence you. They'll run your data through a piece of software that spits out a score - more or less how likely you are to commit crimes again and what those crimes are likely to be - and it integrates your arrest record, who you are, who you're related to, where you live, how many jobs you've had, all of that in addition to the information you provide via questionnaires.

“Do you have mental illness; do you live in a high-crime area; are you related to someone who's committed crimes before?”

All these things that, if you ever said them out loud in court, someone would yell "objection," they would be struck from the record, and you couldn't bring them up. But because it comes after the case, because it's encoded in a way that makes technologically non-inclined people say "oh yeah, it's code, it's just part of this questionnaire, it's neutral, whatever," a score gets spit out on you, and the judge will see it and sentence you based on the likelihood that you're going to commit a crime.

The higher your likelihood of committing a crime, the higher your threat score, and the longer and harsher the judge is going to sentence you. And of course, the longer you spend in jail, when you look at the statistics, the more likely you are to end up back in jail. So this algorithm predicts its own success.

[55:38] It's saying this person is likely to commit a crime and end up back in jail, so let's put them in jail longer, which of course makes them more likely to end up back in jail again. This recidivism cycle is reinforced by the algorithm, creating a self-fulfilling loop that keeps our jails and our prisons full. And of course this ties in with the prison-industrial complex, with these private prisons.

And private prisons, at the federal level, aren't a huge share of prisons, but in many states they are; and this is a component of that system that gets hardly any mention despite being a major part of how people end up trapped in it.

[56:12] It also takes agency away from the judges, because it makes it difficult to consider why people end up like this. It says "you know what, that doesn't matter, we're just going to trust the software, we're going to trust this code, this number, and that's who you are."

It's trying to distill our essence down to just a couple of data points and saying "this is who this person is, and because we know who they are, we know what they're going to do."

They have no idea what we actually want to do or who we actually are, despite what advertisers, governments, and software designers think. We are individuals; we have agency beyond this; and trying to fit us into these molds, based on simple questionnaires, based on data that's often unrelated to who we are, is crazy. Especially when you add the fact that a lot of the data they feed in is suspect in the first place, like we talked about.
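To make the sentencing-score problem concrete, here's a toy sketch of the kind of questionnaire-driven score described above. This is not any real courtroom tool; the inputs, weights, and function name are all hypothetical. The point it illustrates is that every input is a proxy for where someone lives and who they know, not for what they actually did:

```python
# Hypothetical risk score in the spirit of the questionnaire described
# above (not any real courtroom software). Note that none of these
# inputs measure the defendant's actual conduct in the case at hand.

def risk_score(prior_arrests, high_crime_area, relative_with_record, employed):
    score = 2 * prior_arrests                  # arrests, not convictions
    score += 3 if high_crime_area else 0       # proxy for neighborhood
    score += 2 if relative_with_record else 0  # punished for family ties
    score += 2 if not employed else 0          # proxy for poverty
    return score

# Two defendants with identical conduct (one prior arrest each) get very
# different scores based only on where they live and who they know:
print(risk_score(1, True, True, False))   # 9 -> "high risk", harsher sentence
print(risk_score(1, False, False, True))  # 2 -> "low risk", lighter sentence
```

The harsher sentence then makes a return to jail more likely, which feeds back into the data as confirmation, which is exactly the recidivism loop described above.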

Nothing To Hide

Daniel Forkner:

[56:57] David I mean that sounds bad; obviously all this predictive policing sounds bad but I'm not a criminal, I don't break laws, I really don't have much to worry about.

David Torcivia:

[57:07] Is that really the angle you want to go for, Daniel? Because I'm willing to bet you are, in fact. I'm going to bet that everyone listening to this show, in a panopticon society where there is pervasive, complete, and total surveillance, is a criminal.

Without a doubt; I would bet every single thing I have on that fact.

Daniel Forkner:

[57:26] Not me, David, I told you I don't break the law.

David Torcivia:

[57:29] Everyone except our perfect saint of a co-host Daniel Forkner is a criminal; that is a fact.

So, every day: I live in New York, where jaywalking is a favorite pastime of all the city dwellers, and each time I jaywalk I'm technically breaking the law. And if we had perfect surveillance, with facial recognition and records that go back for as long as you want…

[57:50] When I go and do these things, when I protest, when I do whatever, eventually I piss off somebody enough, and they say "okay, what do we have on this face?" They reference it, they find all my jaywalks, and now I get a bill in the mail, a fine for thousands of dollars for all the times I've jaywalked. "Oh no, I'm a poor podcast producer who puts all his money into a podcast for some stupid reason. I have no more money, I can't pay this fine, looks like I'm going off to Rikers Island for the next couple of years because I can't afford bond." Eventually they'll get to my court case and throw it out, but it doesn't matter, because my life has been destroyed in that time.

And that's the kind of thing that can be introduced with very simple, stupid stuff: you change lanes without using your blinker. There are so many laws, some of which we can't even read because they're locked behind a paywall. Some states, I think it's Georgia, my native home, have a massive collection of laws you can't access without paying LexisNexis a small fee.

Which is fine, whatever, they have to get theirs, this is capitalism. But the fact that I can be breaking laws that I'm not even allowed to read seems sort of crazy to me.

[58:52] We're all constantly breaking these laws. I've seen estimates that most people break ten to a hundred laws every day, many of them felonies, because there are so many ridiculous laws on the books going back hundreds of years, and a lot of them are just stupid. They're not enforced because they're stupid. But if you're recorded breaking these laws all the time, and you piss off somebody enough, or the political winds change and suddenly you go from being a favorite of the government to being a dissident, or something happens and they choose to get you; and this has happened a lot, this happens all around the world, it's happening like we said in Turkey right now, and it could happen here. Remember, a lot of people are terrified of Donald Trump and say "oh no, fascism is coming," and maybe it is, maybe it isn't; but the Democrats are approving extensions to spying bills that enable this apparatus for Trump… so even though they say they're pissed off and scared, something tells me they're cowards or thinking otherwise. I'm looking at you, Nancy Pelosi.

[59:47] The political winds change, someone decides they want to get you, and you can go to jail basically forever because of these crimes we're all committing all the time. You do have something to hide. You are a criminal, and you need to start thinking like that. And the fact that we're okay with these body cams, this facial recognition, this pervasive surveillance, these sentencing algorithms, with the way our society is heading toward "I want to feel safe no matter what the cost," when that cost is our freedom… that's a problem we need to do something about right now.

Daniel Forkner:

[1:00:17] If you look at people who live under authoritarian governments, where the surveillance state has really been taken to its logical conclusion, and at what could eventually be a reality for us as well: if you say "I don't have anything to hide," you just have to ask yourself, "well, do I have any friends? Do I have a mother, a father, a child? Do I have a sister or brother, an employer, a boss, colleagues, associates?"

Because at the end of the day, when the surveillance state rears its ugly head, those relationships will not only be used against you, they will be taken away from you.

[1:00:55] You go to a protest, and all of a sudden you receive an e-mail with personal information that you don't want leaked to the public, and you're told "give up the names of all your friends who also participated in this protest and we won't do this." Or maybe your sister goes to one, and now you're contacted by someone who has information on you and says "give us personal information about your sister or else we're going to throw you in jail for not cooperating with the authorities."

This is happening right now in other countries, and it's happening because of this technology that we've allowed to become an unregulated industry, developed in a lot of cases in the United States, or in Israel, or in the UK. And it's breaking down relationships: people don't trust each other; they don't share personal information with anybody; they don't talk when they go outdoors; they have curfews. It's something we should all be concerned about.

David Torcivia:

[1:01:45] It's actively changing how we think and how we act with each other, right? The term for this is "Social Cooling," which is the most benign term for saying "we're all terrified of being caught, of being trapped saying the wrong thing, so we're going to change our behavior."

I don't know the sociologist who named it Social Cooling, but they should be ashamed of themselves, because this is a terrible thing, a horrible way to live our lives, to experience the world: afraid of ourselves, afraid of each other, afraid of our government, and because of that changing how we act, the things we say, or rather the things we don't say, the things we don't post, the things we don't go to, the things we don't watch, the people we don't talk to anymore, the topics that are taboo.

The ideas that we're not supposed to think about or talk about because they're problematic. These things that push our society forward, that move us in new directions, that take us new places, that make us better, stronger people and a better world, are being stolen from us right now by this Social Cooling, because of this pervasive surveillance; because of the threat that if we do the wrong thing, say the wrong thing, or are in the wrong place, we're going to get trapped up in it.

Daniel Forkner:

[1:02:53] And the people who use this technology know this. Go back to that blimp example: some of the experts on board with the experiment to put that giant blimp in the air on American soil admit that, in addition to the surveillance capabilities of these blimps, the fact that they're so big and visible has the benefit that "if you put a camera in the sky over an area that you want to surveil, or that has a lot of turmoil, a lot of protests, or a lot of violence, that area will calm down," because people don't want to act a certain way when they know they're being watched.

The people that are using this technology know the psychological effect it has on people. I guess that goes back to "perception management": part of this technology is used to manipulate our perceptions so that we act a certain way.

David Torcivia:

[1:03:41] So we conform.

How different is this from when you're a child and you have that crazy parent who takes your door off its hinges so you can't have privacy anymore?

That's what we're doing as a society. We would point to that parent and say this is child abuse; but when we do this to each other because we feel unsafe, or because we want to feel safer, suddenly it's okay? No, it's still crazy, and this is a huge problem, and we're making it worse, and we're making society sick because we're letting this happen.

So what can we do?

Daniel Forkner:

[1:04:11] One thing that comes to mind again is that all this technology is marketed in a way that says "we're fighting terrorism, we're deploying this technology in the interest of national security." We should not accept that.

If you’re surveilling people, not just in the United States, not just in the UK, but around the world, that's not National Security, that's not fighting terrorism - that's controlling people against their will.

We shouldn't be on board with that; we should challenge our political leaders when they try to say that, and we should challenge these companies when they say this stuff keeps the world safer, when in fact it doesn't.

David Torcivia:

[1:04:46] And not just challenge, but shame, right?

I mean, if you know somebody who works for one of these organizations, for these companies actively making this spyware, this malware, these spying devices: shame them.

Because what they're doing is wrong and is making our world a worse place. They might write it off saying "I just write this one bit, or I just design these boards," whatever; it doesn't matter, you're contributing to something actively harmful, making life worse for all the humans on this planet, and in many cases getting innocent, good people killed. That blood is on their hands, and we should shame them for it. In the same sense, shame our lawmakers who approve this, who use words like "oh, it's for terrorism" or "it's to protect the children," when really it's about consolidating power; about making sure that we act a certain way, a way that benefits them.

Even people who claim they're progressive, people like Nancy Pelosi, who just pushed this bill through for Trump. She should be ashamed, and we should shame her.

In the same sense, with local law enforcement: this isn't the time to give them free donuts, to give them coffees, to let them get away with this sort of abuse of power. Shame them, don't thank them.

[1:05:49] Regulation doesn't work for this. We've already seen with Edward Snowden that these organizations are happy to break these laws, because they have enough power to do so. The same with the parallel construction that's going on: these organizations are happy to skirt these lines. The same with these companies selling this software, selling these products to despots, to dictatorships, to regimes using them for violence, for murder, for assassinations, for locking people up and destroying their lives, and saying "oh, we're just selling something, it's okay."

[1:06:18] No, it's ridiculous, and we shouldn't stand for that. We shouldn't stand for any of this, and we need to break down the systems that enable it: the systems that enable the consolidation of power; the systems that enable ignoring these things for economic gain. Whether that means dismantling our power systems, our government systems, our economic systems is a question for maybe another time, but it's definitely something we need to consider and think about, because that might ultimately be the only way out of this Pandora's box that we've opened.

Create systems that don't reinforce, that don't benefit, that don't allow people to consolidate power, economic power, government power, whatever it is; and look at alternatives and say "how can we get around this; how can we offer systems that instead allow us to build together," to build community, to help each other, instead of saying "I don't trust you, I'm going to watch you."

Because that’s the world we have right now and it's not a world I want to live in.

Daniel Forkner:

[1:07:08] I'm definitely on a list now just through association.

David Torcivia:

[1:07:12] Yeah well I mean at the same time we need to stop thinking in terms of lists right? Because the days of lists are long gone.

Everyone is on one list right now, or one list for each government, one list for each company, and that list again is the threat score. And by listening to this, by being associated with me, by being part of this show, by being associated with people who care and want a better world, like those activists in Mexico, like Black Lives Matter, like people interested in a better world, your threat score is higher; you're a threat to the status quo. So we need to maybe look beyond the status quo for solutions, because we're definitely not going to find them there.

So tell your friends and family that. Tell them the days of lists are gone, that they're a threat score on a single list now, and that because they know you, the listener listening to me right now, their threat score has gone up; so maybe they should do something about creating a world where threat scores don't matter, where they're gone and don't exist anymore.

A world where we can finally trust.

Daniel Forkner:

[1:08:12] To me that's the most important part of this: how can we focus on creating the world we actually want to live in, as opposed to simply participating in the world that's handed to us and reinforced through things like surveillance? If we want to live in a world where we can actually trust people, where we can trust our community, where we can build social relationships and not have to worry about people spying on us, then that's how we should act, right? We shouldn't look at our next-door neighbor and say "oh, they look suspicious, I'm going to keep an eye on them and maybe call the police on them." Maybe talk to them and get to know them instead, and start being part of a better world that we can all live in.

David Torcivia:

I think that's the best advice, Daniel, when it comes down to this. I think I've heard it before, maybe: "Love Thy Neighbor."

So that brings us to the end of what turned into a very animated episode. I hope you enjoyed it. We have another great show coming up next week, where we talk about the very exciting topic of pensions, but I promise it's going to be a nail-biter; you're definitely going to want to tune in for that.

Daniel Forkner:

A lot of time and research goes into making these episodes possible, and we will never use ads to support this podcast, so if you enjoy it and would like to keep us going, you can support us by giving us a review and recommending us to a friend.

David Torcivia:

You can also visit our website for sources, links, and full transcripts of these episodes and more, or you can follow us on your favorite social media network @ashesashescast.

Daniel Forkner:

That wraps it up for this week. This is Ashes Ashes.

David Torcivia:

Bye bye.

Daniel Forkner: