Permanent Record

Discussing the broken state of the world - whether that's climate change, our tech dystopia, or our rapidly changing society - and what we can do about it.

David Torcivia:

[0:00] Hi everyone I'm David Torcivia.

Daniel Forkner:

[0:02] I'm Daniel Forkner.

David Torcivia:

[0:03] And this is Ashes Ashes.

Daniel Forkner:

[0:06] So we have been talking about climate change recently, but today we're going to be talking about the surveillance and tracking that goes on in the real world. A lot of us have some idea of how our smartphones and web browsing reveal things about us, and of the data that generates. We want to focus this episode specifically on some of the tracking that goes on outside of our digital apps and products. Of course all that data is collected and digitized, and we will talk about how that impacts our daily life, but we want to keep it specific to that area.

David Torcivia:

[0:40] We also want to focus on the fact that this isn't just, you know - when we talk about surveillance and tracking, the story always goes initially to government surveillance and government tracking, and that is a big part of the equation, but it's not what we're going to talk about today. It's not the focus, because most of the tracking that we encounter day to day, throughout our lives, the tracking that really affects us individually - it's not government tracking but corporate tracking, for consumer research, advertising, and other purposes.

So that's what we're going to focus on. This is part of a long series we're doing on tracking so we will eventually talk about that government stuff, we'll eventually talk about digital stuff, we'll talk about the places like Facebook, but today you know, like Daniel said, we are really focusing on how tracking follows you away from your computer, off your phone and encounters you in the real world even without any electronic devices.

Daniel Forkner:

[1:26] And I'm glad we're going to be breaking this up over several episodes because when you start looking into it you realize how expansive it is and just how much is going on, it's really hard to even know where to begin and make sense of all this.

David Torcivia:

[1:40] Yeah, I mean this is really an industry worth hundreds of billions of dollars, because it's interconnected with advertising and data analytics and everything else. And there's so much money and so much information, it's just everywhere. It's in everything you look at, every product you use; every single moment of your life is probably touched by this tracking and by the analytics and behavior modification that come from it. So it's really an all-encompassing thing, and I think it's the defining aspect of our modern, digital, tech-related world.

Daniel Forkner:

[2:09] So David, you say this is a characteristic of our digital world, but companies have always tracked us, right?

David Torcivia:

[2:15] Yeah, I mean we've had some kind of tracking for as long as advertising and the idea of consumers as a whole have existed, because for businesses to most efficiently target us and sell us things that we may or may not need, they've tried to break us down into demographics, and historically those demographics have been very broad. It's been about sex: male, female; it's been about age: 18 to 35, 35 to 40, whatever. Just very general characteristics about us. Maybe ethnicity, maybe where you live, zip code and so on - very general things that you could look up almost publicly. Some of this was collected at the point of sale: when you would check out they'd ask for your zip code or something. There were often buttons on the tills, and they'd press a button for, you know, this is a male or this is a female checking out. And if you go farther back, in some countries and in some places they would actually have a little pad of paper and write some details down every time somebody came and checked out, and somebody would manually collect all this. They used this data for direct mailings, sending out coupons, and helping the company figure out where people were coming from - very general things. It was a slow, expensive process; you didn't get into too much detail. So this is not something new. The fact that people have been tracking us in order to sell us things is not a new thing at all.

Daniel Forkner:

[3:26] So basically we've had this kind of demographic tracking, right - the attempt has been to categorize people into groups and to figure out how to effectively market to different groups of people. I guess the digital component has allowed that to become more individualized and more personal.

David Torcivia:

[3:44] Yeah, and it's not just the individualization; the scale of it has also dramatically increased. It's not just about when you go in and buy something - even if you go in and don't buy something, if you just look at something, if you walk past the store, all of that is now being tracked and categorized. There's so much more data on us, so much more information, and so many more categories they're sorting it all into, which is something we're going to explore later in this episode.

Daniel Forkner:

[4:07] Okay you just said we are being tracked a lot more on a larger scale, and you mentioned I guess walking into the store and looking at something and that can be tracked. What exactly do you mean when you say if we look at something that's being tracked?

David Torcivia:

[4:20] This is one of my favorite topics to get onto. I'm a huge privacy advocate; I've spent a lot of time reading papers, doing research, and looking at advertising products in order to get a better idea of just how much this landscape is changing. The change has been dramatic, especially in the past 5 years or so. And one of the ways this landscape has evolved so dramatically is facial tracking, and I don't think people realize how ubiquitous and how advanced it is. We all have a little bit of limited experience with facial tracking on our cell phones or on modern point-and-shoot cameras - you take a picture, it has smile detection and face detection, and it automatically focuses on those. That's most people's day-to-day experience of facial recognition and how far it's gone. Or maybe if you upload a photo to Facebook it'll suggest, "is this person David Torcivia here?" and you tag it yes or no. So people are aware that this technology exists, but I don't think they realize how much it has entered their everyday life.

Daniel Forkner:

[5:17] It is very common to see security cameras outside office buildings, and you walk into a bank and there are cameras on the ceiling, but I imagine a facial tracking camera would be pretty expensive and kind of hard to implement in lots of different places.

David Torcivia:

[5:31] Well, you'd be surprised. A lot of this camera technology is good enough already, and in a lot of places they don't even have to upgrade their cameras in order to start using this technology. We've had HD cameras for a while, especially with loss prevention programs - a lot of big stores were pioneers in this; Target is always a big one pushing this sort of technology. And so the way these systems work is, yes, you can buy them with new special cameras that are optimized for this, but they've also devised ways to hook all the existing cameras up to a computer on the back end and integrate this facial recognition technology all at once. So you can walk into a bank, or a restaurant, or a bar, or even a church - I've seen these being sold and marketed to churches - and the pitch is that you take your already existing camera technology and, instead of routing it into recording for security purposes, you route it into this system, and next thing you know you're analyzing everyone's face. People walk in and it recognizes that you are a woman 18 to 35, and you look happy right now, and you spent this many seconds looking at this section and this many seconds glancing at that, and you can see readouts of all this data analyzing your emotions and your micro-expressions second to second, automatically throwing you into demographic classes and analyzing exactly how you interact with the store, with the displays, and with everything.

Daniel Forkner:

[6:43] You're saying it can tell what my emotions are?

David Torcivia:

[6:46] Yeah, so a lot of these technologies use machine learning, which is one of the big buzzwords these days, and it means more or less that they write this sort of general algorithm that teaches a piece of software to learn to recognize things, and then they run it through lots of labeled images called training data. They keep showing it pictures and saying "this is a happy person," "this is a sad person," "this person is angry" - very simplistic emotions over and over, but across hundreds and thousands and tens of thousands of different images - letting the computer learn to recognize these things from all the different points it tracks and follows on our faces. And over time the software learns to recognize new pictures and new videos that come in and automatically classifies them. And some of these companies advertising facial recognition services offer not just emotion detection but data on IQ, on impulsiveness, on likely jobs and so on, and it gets into these almost pseudoscientific, phrenology-esque ideas; it's almost like we're back in the 1920s and there's someone out there with a caliper measuring the size of your skull, deciding if you're an intelligent person or of the "Mongol race" or something.
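
The train-on-labeled-examples loop David describes can be sketched minimally. This toy uses a nearest-centroid classifier over two invented facial-landmark features as a stand-in for the deep networks real emotion-recognition systems use; every number and label below is made up for illustration:

```python
import math

def train(samples):
    """Average the feature vectors for each label (a nearest-centroid
    classifier -- a minimal stand-in for a trained neural network)."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(model, features):
    """Return the label whose centroid is closest to this face's features."""
    return min(model, key=lambda label: math.dist(model[label], features))

# Toy "training data": [mouth_curve, brow_height] measured from face landmarks.
training_data = [
    ([0.9, 0.6], "happy"), ([0.8, 0.5], "happy"),
    ([0.1, 0.2], "sad"),   ([0.2, 0.1], "sad"),
]
model = train(training_data)
print(classify(model, [0.85, 0.55]))  # -> happy
```

The more examples you feed `train`, the better the centroids approximate each emotion - the same show-it-thousands-of-labeled-images principle, just without the millions of parameters.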

Daniel Forkner:

[7:55] That's pretty scary. I don't like the idea of being judged on my emotions and IQ - I try not to, you know, tell people about my IQ; I'm a little embarrassed about it. But what does all this mean in terms of my experience? I've been walking into retail stores ever since I can remember, and I don't feel like the experience has changed that much. I mean, I walk in, there are shelves, I buy what I want and I leave. What use is analyzing my facial expressions and what I do within the store to the retailer? What can they do with that data?

David Torcivia:

[8:24] Well, it depends on how far they've gotten integrating these programs. Some stores - and this is the ultimate goal for a lot of them - would love to get rid of the hard price tags you see on everything. When you walk up to a shelf there's a sticker that says this is $9.99; they would love to switch to digital pricing where they can update these tags dynamically, all at once. And the idea, ultimately, is that they would love to customize prices for who you are: if they analyze your face and decide you look wealthy, or like you have a good job, or you're in a good mood or something, they're going to charge you more money. And if you look unhappy, or like you might not be able to pay as much, they might charge you less. Or if they don't want you in their store, they'll jack everything up really high to get you to leave and go somewhere else, so you don't come back and hurt the shopping experience of other consumers. We're not quite there yet - some stores are demoing very early examples of this technology, and we're not anywhere close to it on a wide scale - but it would not be surprising to see this within 5 years, and certainly within 10 it's going to be in place.
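
The dynamic-pricing logic described here boils down to a few adjustment rules applied to a base price. The profile fields and multipliers in this sketch are invented for illustration, not any store's actual scheme:

```python
def personalized_price(base_price, profile):
    """Adjust a shelf price from camera-inferred traits.
    All multipliers here are hypothetical."""
    price = base_price
    if profile.get("looks_wealthy"):
        price *= 1.15   # charge more when the system infers wealth
    if profile.get("mood") == "unhappy":
        price *= 0.90   # discount to win over a reluctant shopper
    if profile.get("unwanted"):
        price *= 3.00   # price an unwanted customer out of the store
    return round(price, 2)

print(personalized_price(9.99, {"looks_wealthy": True}))  # -> 11.49
print(personalized_price(9.99, {"mood": "unhappy"}))      # -> 8.99
```

With digital shelf tags, rules like these could be re-evaluated every time a new face walks down the aisle - which is exactly what makes the prospect unsettling.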

It's not just that. This data is also used for digital advertising. Say you go to the store and you spend a long time looking at some product, and you ultimately decide you don't want to purchase it for whatever reason. Maybe you don't really need it, maybe you decided this isn't the right one, maybe the price isn't right. And you leave and forget about it, and that's it - you go on your merry way. But they've tracked this, and they know all sorts of things about you based on how you looked at it and how you interacted, from this facial analysis. And they've collated that with other data they have on you, based on information your phone carries, or - say you purchased something else - they can link it to your credit card and then your phone number and everything else about you.

[10:01] And the fact that you looked at this product now follows you. So you go online, you log on to Facebook, you search on Google; your results and what you see change based on what the store is targeting, knowing that if they can just show you this specific type of ad, or shift what kind of news stories you see, or what you're seeing on your friends' profiles, then they can change your behavior and get you to purchase something that you'd already decided you didn't want to buy at the moment. And by modifying what you see and what your world is around you, ultimately they're able to modify your behavior, capture that sale, and extract that money from you. It sounds insidious, and I mean, it sort of is. Ultimately all this advertising and all this tracking is about behavior modification - we have all sorts of politically correct words for it, but advertising ultimately is behavior modification on both a personalized and a mass scale. And when you go read the ad copy on the websites pitching this facial recognition software, they're open about it, saying "we can help you modify the behavior of your customer in order to extract more money from them" - basically that, point blank. Which is almost kind of refreshing; at least they're honest about what they're doing.

Daniel Forkner:

[11:11] I still find it pretty unsettling. I've never liked billboards; I drive down the street and see all these billboards, but at least I feel like, okay, I know they're trying to, you know, sell me on the fast food joint up the road, and I feel like I have a choice where I can say "no, I don't want to do that," even though there are subliminal attempts to get my attention - I've heard that the color red makes people think of food. But all of it is out in the open, and I feel like I have a choice to reject it. This, though, sounds like something I don't really know is going on. Like, I can look at something in the store and then that somehow... how does that get to my phone and what I see on Facebook?

David Torcivia:

[11:47] We'll talk about the process of how that information propagates, spreads out, is bought and sold and traded, then combined with other things and ultimately used to change your behavior. But you mentioned billboards, and I want to bring this up for a second. Maybe you've started seeing these digital billboards pop up - some are screens in store windows advertising to the street; if you live in a larger city you've definitely seen small digital billboards on the sides of bus stops or on street corners. Here in New York we have these free WiFi kiosks that the city has built. The city didn't actually pay for any of them; they were built entirely by advertising companies, which put up all the money in exchange for covering them with billboards so they can advertise to people - but they also built cameras into these billboards.

[12:30] And they analyze you as you walk by and look at them. If you register to use their free gigabit WiFi, it records your email address and your phone's MAC address, and it follows you as you go around the city to see where you spend time and what stores you go into. And then it tracks your face, it knows what you're looking at, it can tell when you're approaching, and it changes the ads shown on this digital billboard in the real world. It specifically targets you, or it says "oh, this person is probably going to the shopping district right now, they're going to stop into H&M, let me show them an ad for that," or "let me show them an ad for Zara and try to get them to go there instead." And this sort of advertising and tracking that we've gotten used to in the digital space, on Facebook and so on, where we've come to expect custom, personalized ads, is now spreading out into the real world, shifting our experience of cities and public spaces into something personalized for us - in order to get us to spend money, to change our behavior, to get us to do things that don't necessarily benefit us but benefit the companies advertising to us and trying to manipulate our behavior in order to extract that money from us. Or, in the case of political ads, to get us to perform a specific behavior, whether that's voting, or not voting, or anything in between.
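
The follows-you-around-the-city part amounts to joining sightings of the same MAC address across kiosk locations. A minimal sketch, with invented sightings, of how a kiosk network could reconstruct a phone's path:

```python
from collections import defaultdict

# Hypothetical sightings a kiosk network might log: (device MAC, kiosk location, hour).
sightings = [
    ("aa:bb:cc:01", "5th Ave & 23rd St", 9),
    ("aa:bb:cc:01", "5th Ave & 18th St", 9),
    ("aa:bb:cc:01", "Herald Square", 10),
    ("dd:ee:ff:02", "Union Square", 9),
]

def movement_profiles(sightings):
    """Group sightings by device, ordered by time, to reconstruct
    each phone's path through the city."""
    profiles = defaultdict(list)
    for mac, location, hour in sorted(sightings, key=lambda s: s[2]):
        profiles[mac].append(location)
    return dict(profiles)

print(movement_profiles(sightings)["aa:bb:cc:01"])
# -> ['5th Ave & 23rd St', '5th Ave & 18th St', 'Herald Square']
```

Once a WiFi registration ties that MAC address to an email, the whole path stops being anonymous - which is why modern phones randomize their MAC when scanning.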

Daniel Forkner:

[13:41] You mentioned manipulation at some point, and maybe that's a strong word, I don't know, but I definitely get the sense that this has a way of distorting my image of the world. When I think of marketing and advertising I think of it as kind of a blanket thing that applies to everyone, and part of the appeal of certain products is the social aspect - knowing that a lot of other people are engaging in this activity or product as well. If you're saying I can just walk down the sidewalk and see advertisements that are tailored to me personally and that no one else is seeing - I don't know, that opens a whole other dimension for me. It's like I'm seeing a world that no one else can see, and that's influencing my behavior and what I think other people are doing, when in reality it could just be directed at me based on some of the things I've done or things I've revealed about myself without even knowing it.

David Torcivia:

[14:27] Well, I mean, we're not talking about a fully augmented reality where the people walking next to you see something different, but it might be something like this: you're walking with friends down the road, and this camera analyzes you and your friends. It picks out all your faces, checks them against its database, and says, oh, you know, "this person is depressed, based on other data that we found," and that triggers a very quick ad auction. This all happens within milliseconds. Somebody says, "we're trying to target depressed people to sell them this product, because we know they're more likely to purchase it if they're depressed" - whether or not the product is good for them, whether or not it has anything to do with depression; they just know that depressed people are more likely to buy it. And then it analyzes all your friends as well - "this guy looks wealthy, he has more money to spend," blah blah blah - and it looks at all of you and says, okay, this person is most likely to buy something, we're going to target them, because that gets us the most expected return, and then it shows you the ad. This is crazy, right?
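
The millisecond auction described here reduces to scoring every person-campaign pair by expected return - the advertiser's bid times the predicted purchase probability - and serving whichever pairing pays the most. A toy sketch with invented people, probabilities, and bids:

```python
def run_auction(people, campaigns):
    """Pick the (person, campaign) pair with the highest expected
    return: bid x predicted chance that this person buys."""
    return max(
        ((p, c, c["bid"] * p["purchase_probability"].get(c["product"], 0.0))
         for p in people for c in campaigns),
        key=lambda t: t[2],
    )

people = [
    {"name": "A", "purchase_probability": {"supplement": 0.30}},               # profiled "depressed"
    {"name": "B", "purchase_probability": {"supplement": 0.05, "watch": 0.20}},  # profiled "wealthy"
]
campaigns = [
    {"product": "supplement", "bid": 1.00},
    {"product": "watch", "bid": 2.00},
]
person, campaign, value = run_auction(people, campaigns)
print(person["name"], campaign["product"], value)  # -> B watch 0.4
```

Real-time bidding systems run essentially this computation, at enormous scale, between the moment you're detected and the moment the screen refreshes.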

[15:26] It's a crazy process, and it's crazy to think that it's happening. But for some reason - maybe because we don't realize how much of this is occurring, or maybe because we just accept technology, like, "oh, it's new and improved, therefore it must be better than the old way of doing things," or it must be good just because it's higher tech - we just accept it, and that's crazy to me, I don't know.

Daniel Forkner:

[15:47] All this is done with the intent of selling me something, it sounds like.

David Torcivia:

[15:52] That sells it short. I don't want to say this is just about selling, because advertising is about more than getting you to buy a product. All retailers, all corporations, identify you as having a lifetime customer value - there's jargon for this - which says that over the course of your lifetime you are worth X number of dollars to this company if they can capture your business. And that might not mean selling you something right now; it might mean just priming you, modifying your behavior so that somewhere down the road you will become a customer and they can extract the most value possible from you. It's not always about selling things, and political advertising especially is easy to grasp here: they're not trying to sell you a product or get money out of you, they're trying to get you to vote, or not to vote, or to care about a topic so that you'll support - or won't support - politicians who are in favor of it or against it. I think that's really the purest way to understand this type of advertising, because it is truly about modifying your behavior in the real world. Whether it comes to you digitally or in a physical space like these billboards, it's about modifying your behavior to best benefit the advertiser - whether that's a company or a person, it's about getting the desired behavior out of you by showing you different things, by shaping what you see, and ultimately affecting how you think.

Daniel Forkner:

[17:15] And even though I feel like I'm a pretty frugal person - I'm careful about what I buy and I try to pay attention to the things going on around me - I do feel a little overwhelmed, almost powerless to notice some of these things, because I go through a lot of my day zoned out, thinking about something else. You know, a lot of people have seen that video from the psychology study where a man in a gorilla suit dances across a basketball court while a game is going on, and no one notices the gorilla because they're focused on something else. I do that a lot in life. I feel like there's a huge opportunity to influence the things I'm doing, and I'd just have no idea, because there's too much to focus on.

David Torcivia:

[17:55] A lot of this is hidden too, so maybe this is a good time to explore how this whole mechanism works, because the actual companies involved are ones you've never heard of, operating in ways you can't easily see unless you're digging into the details. So it might be best to spend a few minutes examining what this chain of tracking does: how it goes from watching your face in a CVS, or recording that you walked past this place or spent this amount of time here based on signals coming from your phone - or, with some of the more insidious trackers, "oh, you went to the doctor's office, then you went to the back specialist, so we know you're hurting, you're in pain, so how can we take advantage of that." Every single thing you do ultimately plays into this, and it's not even necessarily about "we have cameras here watching what you're doing" - every time you have an interaction that at some point touches a digital network, odds are somebody's scooping up the information and entering it into a silo. Then that company turns around and sells it to another company, which sells it to another company, which combines it with more information and sells it to yet another company, and ultimately it works its way up to these 5 or 6 mega data brokers, who then sell this information to advertisers to use, again, to modify behavior - to get you to buy stuff, to get you not to buy stuff, whatever it is they're trying to do. So let's look at that process for a second.

Daniel Forkner:

[19:12] So you say there's a process where all this data gets aggregated up to these big data brokers or whatever; how is the data pulled in in the first place?

David Torcivia:

[19:21] Maybe it's best to start with what types of data they're collecting, because it's not just retail transactions - you know, yeah, you went to the store and bought a Snickers at 1 a.m., and that says something about you, but that's just how we buy stuff, and it doesn't feel like such a huge invasion of privacy. But these data brokers, and ultimately advertisers, have access to a lot more information than we would guess - things that can be very personal. For example, there are a lot of companies that track your health: they know you have asthma, they know you have arthritis, they know you're an alcoholic, they know you're addicted to gambling, they know you have some sort of sexual problem, whether that's impotence, no sex drive, or an overactive sex drive. They track your doctor's visits, they track your prescriptions, they track every time you go to any sort of doctor's office; some of them can even track diagnoses. Basically everything about your medical history that they can get without violating HIPAA, if they're in the United States. And again, this conversation is mostly about the United States, but this is happening around the world - we'll especially get to what's going on in Asia later, because there are some really interesting implications of this data already playing out there, and we might find ourselves experiencing the same in the United States in the future.

[20:34] All this data is collected about all sorts of things we would find very personal. Financial data: we're familiar with the credit reporting agencies, especially after that large hack Equifax just had. But, you know, data on how much money you have, how much you earn, how large your loans are, how much you owe, how late you are on those loans - things well past your credit score, very detailed things about your life, the value of your vehicles, all sorts of stuff. And then the last thing is risk assessment. This is the kind of thing we're familiar with from insurance. How you drive, for example, is a big one insurance companies are after. For modern insurance tracking, they would love to put these boxes in your car that analyze how quickly you brake and whether you ever speed - even a mile or two over the speed limit, which is something we all do. They would love to analyze that, see that you do speed, put you in a higher risk bracket, and charge you more money. And this is something we're going to see integrated directly into cars in the future - it's occurring right now, actually, with the integration of Android Auto, of Apple's CarPlay, and other smart cars that connect to the internet. This data can be automatically collected and given to insurance agencies, who analyze it. And it doesn't end there: they pass it on to other data brokers, who pass it on again, and ultimately it finds its way to advertisers. So all these things we find super personal are actually being tracked and recorded.

Daniel Forkner:

[21:57] That surprises me though about the medical information, isn't there some kind of doctor-patient confidentiality? And I'm assuming “they” who you're talking about are data companies and corporations. How do they get access to my medical information?

David Torcivia:

[22:10] Well, yeah, I mean, the conversations between you and your doctor are protected, and there are a lot of details about your health that are protected. There are very strong medical privacy laws covering all this - in fact HIPAA, our health data protection act, is the most powerful piece of privacy legislation we have in the United States. It's a really great starting point for how we should approach all these other privacy conversations when we talk about legislative answers. But there's a lot of information that bleeds out around it - this concept of "data around the edges." Maybe you've heard the conversation about metadata before, from when we were talking about Edward Snowden and some of the NSA leaks; well, the same concept carries over to this sort of commercial data. All these tiny bits and pieces of your life end up as data, and that's what bleeds out. Taken as an individual piece, it's nothing to worry about. But then somebody starts collecting all these little things - from your phone, or from doctors' offices, which will sell some of this data to data brokers or insurance agencies. You go to the doctor, you pay with insurance, your insurance agency records some of this data, and then somebody buys that data from the insurance agency. Some of it is HIPAA protected, but a lot of it isn't, and they take all this stuff, analyze it, and it finds its way onto these lists. You can actually go online and buy lists of people with all sorts of medical problems in order to send them advertising, and it's cheap. You want a direct mailing list of half a million alcoholics? Or, if I'm an advertiser, half a million gamblers - a very vulnerable demographic? If I want to take advantage of them, it costs me a couple hundred dollars. This data is not hard to find, it's not expensive; it's within the grasp of anybody.

Daniel Forkner:

[23:50] Yeah, I actually looked at one of these companies that sells some of these lists: dmdatabases.com. They have all these categories that they put people in, and you can choose certain categories of people and send mailings to them, or just get the information. I looked at the medical one and I was kind of shocked. They have people categorized by addictions: gambling, alcohol, drugs, sex. They have ailments - hundreds of ailments: multiple sclerosis, asthma, anything you can think of. And, what also surprised me, they have a whole list of drugs - these are people who have taken certain drugs, and you can just select that list and then break it down further by demographic: you could select a zip code, or an income bracket, or an age, and then say "I want to target people who have taken this drug and who are also clinically depressed and antisocial." The layers and the customization you can do are really shocking.
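
The layered list-building Daniel describes - ailment, addiction, then zip code and so on - is essentially chained filters over broker records. A minimal sketch with invented records and a reduced set of criteria (real order forms expose thousands of selectable fields):

```python
# Hypothetical broker records: each person carries labels the broker has inferred.
records = [
    {"zip": "30301", "income": 45000, "ailments": {"asthma"}, "addictions": set()},
    {"zip": "30301", "income": 82000, "ailments": {"depression"}, "addictions": {"gambling"}},
    {"zip": "10001", "income": 51000, "ailments": {"depression"}, "addictions": set()},
]

def select_list(records, zip_code=None, ailment=None, addiction=None):
    """Layer filters the way a broker's order form does: each
    criterion narrows the mailing list further."""
    out = records
    if zip_code is not None:
        out = [r for r in out if r["zip"] == zip_code]
    if ailment is not None:
        out = [r for r in out if ailment in r["ailments"]]
    if addiction is not None:
        out = [r for r in out if addiction in r["addictions"]]
    return out

print(len(select_list(records, zip_code="30301", ailment="depression")))  # -> 1
```

Each added criterion shrinks the list but sharpens the targeting - which is exactly why "depressed AND gambler AND this zip code" lists sell.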

David Torcivia:

[24:44] Let me jump in there real quick, because I want to emphasize just how many data points these companies record about each and every one of us. In the past, when you tracked somebody, you had very basic data - maybe 10 or 20 points: sex, age, maybe income, location, basic stuff like that, as well as information on how to contact them. Now it's routine for one of these larger data broker firms - maybe you've heard of Oracle, one of the big players in this area - to boast about having 30 to 50 thousand categories that they place each and every one of us in. So this extends way past basic demographic data, into every tiny detail about you - things you don't even realize, things your friends don't realize, things you couldn't list if you asked yourself. I mean, if you sat down and said, "I'm going to write 50,000 different things about myself," you couldn't do it. But this machine learning picks up all the details and data and tiny little signals in this noise of information to reveal a stunning amount about us, and then this information is turned around and used to take advantage of us.

Daniel Forkner:

[25:43] When I think about data, I tend to think of a hard fact about something I did. Like, I went to the store and bought something at such-and-such time for such-and-such price. And even with online services and retailers that may tell you they don't sell or share your data with other people, what I found out is that, even if they say they don't share that hard data, they're still free to interpret it, categorize it, put labels on it, and then sell that - which oftentimes is even more valuable to the companies buying it. For example, let's say I go on Amazon - which is something I did recently - and buy an espresso maker at 1 in the morning. Amazon can then make an interpretation of that and say, "oh, Daniel has a mild addiction to caffeine, and he has impulsive tendencies, especially at 1 in the morning." And I'm wondering how companies can use that interpretation and that data to target me, not just in general, but at moments of my life where I might be more vulnerable than at other times.

David Torcivia:

[26:44] That's a really great example, and I think the impulsive nature especially is one of these marketers' favorite categories to target. But that one purchase bleeds out a lot of information about you. It tells them - like you said - that you're impulsive and that you buy things at weird times, so they can advertise to you at weird times. It shows that you're up late, which means you're probably a late riser. All these little details in this one purchase - where the facts are: I bought an espresso maker from Amazon at 1 a.m., shipped it to this address, and paid with this credit card - all of a sudden become all these vague ideas about who you are. They call this inferred data, and it's predicting both who we are and what we will do. And that information about what we will do is used to show us ads or not show us stuff, or in the case of Facebook or Google to customize our newsfeeds - to show us more friends or more hard news or whatever it is - in order to get that expected behavior, to change our behavior from what they thought we would have done before to something that aligns with the values and the needs of these companies and corporations.
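To make the idea of inferred data concrete, here is a minimal sketch of how a broker-style pipeline might fan one hard fact - a purchase record - out into soft labels. Every rule, label name, and threshold below is invented for illustration; real brokers use far larger, machine-learned rule sets.

```python
from datetime import datetime

def infer_attributes(purchase):
    """Turn a single purchase record into a set of inferred labels.

    All rules here are hypothetical - the point is just that one hard
    fact fans out into many soft, sellable categorizations.
    """
    labels = set()
    when = datetime.fromisoformat(purchase["timestamp"])

    if when.hour < 5:                      # bought in the middle of the night
        labels.add("night_owl")
        labels.add("impulsive_buyer")      # guessed purely from the odd hour
    if purchase["category"] == "coffee_equipment":
        labels.add("caffeine_consumer")
    if purchase["price"] >= 100:           # rough spending-tier signal
        labels.add("discretionary_spender")
    return labels

record = {
    "item": "espresso maker",
    "category": "coffee_equipment",
    "price": 120.0,
    "timestamp": "2018-02-10T01:00:00",    # the 1 a.m. purchase
}
print(sorted(infer_attributes(record)))
```

One purchase, four inferred traits - multiply that across years of records and the 30-to-50-thousand-category profiles mentioned earlier stop sounding implausible.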

Daniel Forkner:

[27:43] We've been focusing a lot on purchases and advertisements, and I think we all like to believe we're immune to advertising - at least I feel that way a little bit. But more importantly, isn't a lot of this to our benefit? I mean, wouldn't I like to receive personalized advertisements based on my interests and what I like, rather than having to wade through images of products and activities I have no interest in? It seems like it could make my life a little more efficient.

David Torcivia:

[28:09] So there are a couple problems with that. The first is this assumption that you have to be looking at ads. You don't. They're there to try and get you to do something; they're not a necessary part of the function of anything. And though we always hear the explanation "oh, ads pay for this news story" or "this pays for this website," it doesn't have to be that way - that's just the model we've chosen at this point, not something that's necessary. We like to think of the ads as the product on these websites, but it's that old tired cliche: when you're not being charged for anything, you are the product. So websites like Facebook - and even the news sites - aren't selling ads as a way to make money, they're selling you to advertisers. That's the way we need to think about this.

But in terms of personalized ads, it's interesting that they can also manipulate the way we think about ourselves. There was this great study where they showed people what they were told were personalized ads. And they found that if they showed "personalized" ads - quote-unquote, because they weren't personalized, they were just ads the researchers picked out - that made people feel better about themselves, people were more likely to purchase the product. So for example, they showed some people ads for an expensive watch and told them the ads had been picked out and personalized for them because they were seen as sophisticated consumers. And these quote-unquote sophisticated consumers, who were just shown a random picture, decided "you know what, I am a sophisticated consumer, and I deserve to buy this watch because of that." It totally changed their behavior - tricked them into thinking this was true when it wasn't at all.

In reality you're not shown ads because you're a certain way or because of certain strengths, but because these ads analyze you and find a weakness they're able to exploit - whether that's "this person has a propensity to buy green products, so we can take advantage of that," or "oh, this person is depressed right now, even though they may not know it themselves, and we can use that to target some stuff." And this has big consequences in life. There are cases where ads have outed people with private information they didn't want out and changed their lives. In some countries, being outed as gay by accident by one of these ads can have disastrous consequences for your professional career and even for your life. It becomes almost a life-and-death situation.

Daniel Forkner:

[30:17] So in one instance it was found that Stanford was sending out advertisements for its SAT prep course, and the price of the course would fluctuate based on the ZIP code it was being advertised to. It was discovered that Asian demographics were particularly vulnerable - Asians were being charged more for this SAT course, and the lower their income, the higher the price would be, presumably because these lower-income families were more willing to pay to send their children to a good test prep class.

David Torcivia:

[30:49] So this personalized stuff can be very aggressive, and it's not necessarily about finding the best deal for you - it's about extracting as much value as possible out of you. Again, to go back to situations where some people have been really torn up about this, the go-to case for every privacy person is Target. A few years ago, before a lot of this facial tracking and stuff - this was in the early days of consumer data analytics - a young girl came into the store and happened to buy some products that Target had decided were associated with pregnancy. This girl was pregnant; she lived at home with her family, and she didn't want her family to know. Target picked up on this because she bought some vitamins or coconut body cream - things they had decided were associated with expectant mothers.

And Target had found that if they were able to capture the business of these new mothers immediately, they would often be able to hold onto that business as the child grew up, and create a customer that lasts 15-20 years. These customers are tremendously valuable, so Target does everything it can to capture them for life. And so they targeted her and sent her tons and tons of customized coupons - because even the direct mailers we get in the physical world are now customized for us; they know what we buy, they know what we like, and they send us very specific coupons for it. She started getting coupons for baby stuff - for diapers, for all sorts of things - and it kept happening and happening, and her mother and father were like "why are we getting all these baby ads? Nobody here is pregnant, nobody is having any kids." Eventually it came out that this girl was the reason why, and Target had outed the fact that she was pregnant to her family and caused a huge problem at home.

With this personalized advertising, it's important to remember there are no consequences for getting things wrong - no consequences for outing people and destroying their lives, for hurting them in ways that the people who designed these algorithms and came up with these ideas never anticipated. There are no consequences for accuracy, there's no oversight, there's no ethics board. This is a wild west industry, and there's so much data, and they know so much about us, and there's absolutely no oversight. That's crazy to me. I can't imagine how this happened.

We freak out about government surveillance, and yet whatever it is the government knows, it's a fraction of what these private companies know and sell about us every day, every second of the day - and there's very little outrage. There's very little awareness that this is even happening, because these companies are so big and this process is so byzantine that we don't know what's going on.

Daniel Forkner:

[33:16] This topic is so interconnected with so many areas of life, as evidenced by the fact that we've kind of drifted away a little from our real-world experience… but speaking of the home…

David Torcivia:

[33:28] Don't get me started on the internet-of-things let's talk about that for a second.

Daniel Forkner:

[33:33] What is the Internet-of-things?

David Torcivia:

[33:34] Yeah, okay, this is one of my other favorite rants, and it ties in really well with this real-world tracking. So the internet of things: all these wonderful new gadgets that we put in our homes that make our lives so convenient and technologically magical, that get us closer to that Jetsons future, are all part of this internet of things. This is light bulbs, this is refrigerators, this is our thermostats, these are our vacuum cleaners - all these cool little things. I walk home, I press a button on my phone, my light bulbs turn on, and then I can change the color. My Nest thermostat knows I'm home and starts heating the house to the temperature I like. My Roomba is going around vacuuming everything and it knows where everything is, and blah blah blah - a million little things that are so cool and oh my God the…

Daniel Forkner:

[34:19] Amazon can you raise the volume of this podcast episode right now?

David Torcivia:

Yea, Alexa please subscribe to Ashes Ashes.

Those little Amazon Alexas, Apple Siri home things, Google Homes - these are literally always-on listening microphones. It's crazy to me that we're paying somebody money to put these in our houses. And then they're used for advertising, they're used for selling, and they're used for tracking and building data about us. Some of the advertising is built in and hidden as not-advertising. There's this great story I love about one of the Google Home products. Somebody asked it "OK Google, what's my day look like?" and it told him his schedule, and the weather, and whatever - but then it ended with an ad for Beauty and the Beast, telling him "oh, and by the way, Disney has this wonderful Beauty and the Beast experience going on right now that you should really check out."

Daniel Forkner:

[35:10] Google said it wasn't an ad - but you know, they had noticed that this guy didn't have much going on in his life, so it wanted to suggest something fun for him to do, haha.

David Torcivia:

[35:18] Yeah, they call it something like "an experience shared with one of our Google Home partners." You can call it whatever you want, but it is an ad - and you can call an ad whatever you want, but it is behavior modification. That's fine if it's a guy who can just say "whoa, that's weird, this is a weird future, I don't know why I paid money for this." But if you have kids at home, they're going to be hearing these things and interacting with these things, and they're just going to assume this is how the world works. It does affect their behavior, and it changes how they grow up and the way they interact with the world.

Daniel Forkner:

[35:47] Speaking of children, toys now fall under this category of internet of things. I've seen a lot of toys that have WiFi connectivity, and I'm not really sure why a toy would need WiFi.

David Torcivia:

[35:59] This is another one that I really like. There's this robot or doll or something; they gave it to kids and it asks these kids questions about themselves, like "what's your name?" and the kid says "oh, I'm Billy," and it asks Billy "what are your parents' names?" and Billy tells his new little friend his parents' names, and it keeps going on - asking about his school, asking about his family, all sorts of details. And then it turns out this isn't a toy made by Hasbro or some toy company, but a robot made by a company that works with intelligence agencies, selling toys to people in order to collect data. And oftentimes the internet of things has terrible, terrible security, because the focus is on selling these products - getting them out cheap and getting them out quick.

Daniel Forkner:

[36:40] And one of the dolls that falls under this internet-of-things category - you could actually connect to it with a Bluetooth device from up to 50 feet away, and there was absolutely no security around it. So I could be on a street somewhere outside an apartment building, pull up my phone, look for Bluetooth devices in the area, and the doll would just show up as one of those devices - with the name of the doll, clearly a children's toy - and I could connect to it. And because of the nature of this doll - talking to the child and recording the child's responses, which is creepy enough as it is - once I'm connected to the device I can interact with it the same way. So not only can I record what the child says to the doll, I can actually play my own voice and talk to the child from 50 feet away through my Bluetooth device.

David Torcivia:

[37:30] Yeah, I mean, who comes up with these ideas? This is literally insane. Again, I feel like I'm taking crazy pills all the time when I'm reading these stories. How is this allowed to happen? Who okayed the idea?

Daniel Forkner:

[37:41] The biggest thing my mom had to worry about when I was a kid was whether or not I played a violent video game, and now we have to worry about whether a six-year-old is interacting with a toy that's secretly recording everything about him or her.

David Torcivia:

[37:54] And then a lot of this data ends up online anyway. We think it's only a conversation between the kid and the toy, and then we learn it's being uploaded to the internet - so okay, now it's a conversation between the kid, the toy, and whatever company owns the toy. But then oftentimes these get hacked. There was a story recently where five million of these messages between kids and their toys were just released. This data is out there, and a lot of it is resold to advertisers anyway, but it's phenomenal just how terrible the security is and how much we don't realize how much tracking is going on with these devices. I mean, Roomba - it's a vacuum cleaner, it's supposed to vacuum and that's it. But Roomba decided they weren't making enough money just selling vacuum cleaners, so now our Roombas are going to build maps of our houses.

Daniel Forkner:

[38:37] These are the automated vacuum robots that go around.

David Torcivia:

[38:40] The little automated vacuum robots that everybody films cat videos with on YouTube - but they're mapping. They're little mappers now. They map out your house, upload that to Roomba, and then Roomba sells it to advertisers who want to know: how big is your house, when are you home, how often do you vacuum, and so on - all sorts of stuff about you. Even benign things like the Nest - it's a thermostat, it turns the temperature up or down, and it knows to do that at certain times. Very simple, right? But that information is again part of this metadata: it knows when you're home, and sometimes it knows where you are in the house. All this extra data from what you'd think would be a benign thing. Your Philips Hue light bulbs, for example: you can assign locations to these bulbs, like "this is a bedroom light bulb, and this is a living room light bulb," and by doing that, now Philips knows somebody's in the living room right now, somebody's in the bedroom right now. And this data becomes very valuable for Philips to turn around and sell to advertisers, who can use it to customize the way they interact with you.
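As a toy illustration of how "benign" smart-home metadata doubles as an occupancy log, here's a sketch that turns room-labeled bulb events into an hour-by-hour picture of who's where. The event format and room names are invented; real telemetry is richer, but the inference is the same.

```python
from collections import defaultdict

def occupancy_timeline(events):
    """Map each hour to the rooms where a light was switched on.

    `events` are (hour, room, state) tuples - a made-up but plausible
    shape for the on/off telemetry a bulb vendor could collect.
    """
    rooms_by_hour = defaultdict(set)
    for hour, room, state in events:
        if state == "on":
            rooms_by_hour[hour].add(room)
    return dict(rooms_by_hour)

events = [
    (7, "bedroom", "on"),        # someone wakes up
    (8, "bedroom", "off"),       # and leaves for the day
    (19, "living_room", "on"),   # home in the evening
    (23, "bedroom", "on"),       # bedtime
]
print(occupancy_timeline(events))
```

From nothing but light-switch events, the vendor learns wake-up time, work hours, and bedtime - exactly the kind of schedule profile an advertiser pays for.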

Daniel Forkner:

[39:43] I've been seeing a lot of advertisements for TVs that are being marketed as smart TVs - they can connect to the internet and your favorite streaming service. Are you saying I have to worry about those as well?

David Torcivia:

[39:54] Yeah, in a couple of different ways actually. One: these TVs are tracking what you watch. When we think about tracking we assume "oh, it's tracking what we watch based on information coming in off the cable, so if I play off my Roku or my Apple TV or whatever, it's not going to have that info." But they're actually analyzing the video that's on the screen, so even if you're playing pirated content - movies you downloaded - it still knows what you're watching, and it uploads that data and sends it off to Samsung or LG or Vizio, who turn around and sell it to advertisers. They also work with companies like Nielsen, the TV ratings people, so they know how long you watch, what parts you watch, when you turn it off, whether you skip commercials, whether you mute commercials - all sorts of details and data that again become very valuable to advertisers. But even worse than that, they've started integrating a lot of voice recognition into TVs, for some reason. Most people don't even realize their TV is capable of this, but almost all modern smart TVs have a microphone, either on the TV itself or on the remote. It's supposed to be a cool convenience feature - "change the channel, TV" and the TV does it - but they're also using it to listen, and they use that for advertising. It's right there in the terms of service you click through when you register and set up the TV. Vizio was actually just sued about this and lost, because they weren't upfront about the data they were collecting - they were recording more information than they said they were.
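The screen-analysis technique described here is usually called automatic content recognition (ACR). Real ACR uses perceptual fingerprints that tolerate compression and cropping; the sketch below fakes that with exact hashes just to show the matching flow, and every frame string and title is made up.

```python
import hashlib

def fingerprint(frames):
    """Collapse a short run of frames (stand-in strings here) into one ID.

    Real systems use perceptual hashing; an exact hash keeps the sketch simple.
    """
    return hashlib.sha256("|".join(frames).encode()).hexdigest()[:16]

# Reference index a TV vendor might hold: fingerprint -> content title.
reference = {
    fingerprint(["frame1", "frame2", "frame3"]): "Some Movie (2017)",
}

def identify(frames):
    """Look up what's on screen - independent of where the video came from."""
    return reference.get(fingerprint(frames), "unknown")

print(identify(["frame1", "frame2", "frame3"]))  # matches even a pirated copy
print(identify(["home", "video", "clip"]))       # unrecognized content stays unknown
```

Because the match happens on the pixels themselves, it works the same whether the source is cable, a Roku, or a downloaded file - which is exactly the point being made above.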

But this is the life we have to expect now: anything that connects to the internet is connecting not because it's trying to do something convenient for us. The convenience is a side effect of collecting data on us, uploading it, and selling it to data brokers to ultimately turn around and manipulate us in some way.

That is our modern world; that is the internet of things. And even if we don't want to interact with it - even if we're not buying these gadgets and purposely putting microphones in our houses to listen to everything we say - we don't have a choice, because out on the streets we are now tracked. We're watched, our faces are recorded, and if you have a smartphone it gives off beacon data for everything you do. Even if you disable Bluetooth - on the latest iOS, Bluetooth is not actually all the way off, because Apple and Google have these beacon programs that use Bluetooth to analyze your phone. You can turn Bluetooth fully off, but the default option doesn't: it shows Bluetooth as off, and you can't connect to anything, but your phone is still giving off that data, and that tracking data gets sold to advertisers. That causes both the ads that pop up on your phone when you're close to certain places and ads later on. This beacon data gets very detailed: where you are in stores, how long you spend in certain parts of stores, what you walk past - it's even integrated into a lot of city street furniture. You have no idea how much you're being tracked, how much every single action is recorded. Everything you do is turned into data that is sold, sold, sold - again, to collect and create this profile of who you are, which is turned around and used to sell ads or to show you certain things to get you to act a certain way.
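For a sense of how beacon proximity works mechanically: a phone hears a beacon's advertised signal strength (RSSI) and converts it to a rough distance with the standard log-distance path-loss model. The calibration constants and zone cutoffs below are typical illustrative values, not Apple's or Google's actual parameters.

```python
def estimate_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Rough distance in meters from a received signal strength reading.

    tx_power is the RSSI expected at 1 meter; both constants are
    illustrative defaults, not any vendor's real calibration.
    """
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

def zone(rssi):
    """Bucket a reading the way retail analytics might."""
    d = estimate_distance(rssi)
    if d < 1.0:
        return "immediate"   # standing right at the display
    if d < 5.0:
        return "near"        # same aisle
    return "far"             # elsewhere in the store

print(round(estimate_distance(-59), 2))  # 1.0 m at the calibration power
print(zone(-75))                         # weaker signal reads as farther away
```

Each reading is a single, crude data point; the value comes from logging thousands of them per shopper, which is how "how long you spent in which aisle" profiles get built.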

Daniel Forkner:

[42:54] And it's not just selling ads; one of the surprising facts about all this is how it's used against you in the financial world - in financial services like credit and your ability to get insurance at a reasonable rate.

David Torcivia:

[43:07] Let's talk about that Chinese program right now, because I think it's a good segue into this - this is the future we might see. Patents for this already exist in the United States; Facebook is one of the major patent holders for this kind of technology.

Daniel Forkner:

[43:19] So in China right now, the government has a plan it wants to roll out in 2020 to enroll every citizen in a kind of social scoring system. Every citizen will get a number - a single number - determined by everything about their life: whether they get speeding tickets, whether they pay their bills on time, what kind of work they do. All of this is going to be tied into this one single number that basically - I guess we don't really know what the government's intent is, but I guess it's to show who's a good citizen and who's not.

David Torcivia:

[43:51] I think the plan is also that it opens up access to money - if you want loans and stuff, this sort of replaces a credit score. But also, say you're a student applying to schools: you have to have a certain social score to get in. Some jobs won't be open to you. The question becomes: what if they open this data to companies, and companies can use it to analyze you? You can't rent an apartment without your number being high enough, or whatever.

Daniel Forkner:

[44:16] It's funny you mention companies, because the government is actually taking its cue from about eight large companies that are doing something just like this in China right now. One of the big ones is the financial arm of Alibaba, the huge shopping network: analyzing everything about a consumer, they use that to build a credit score, and of course these companies encourage consumers to display and compare that score. A lot of dating sites require you to input your credit score, you know, to determine who you should be matched up with.

David Torcivia:

[44:51] Yeah, and I wonder how much of this is already being integrated into things we don't realize. You bring up dating - okay, let's talk about Tinder for a second, the go-to dating app, or hook-up app, or whatever it is.

Daniel Forkner:

[45:02] I don't know anything about it.

David Torcivia:

[45:03] Well, we don't know what algorithms it's using - how it decides whether and to whom it's going to show you. Some of it's based on how many people swipe on you or don't. But it would be very simple for them to also integrate this other advertising and tracking data, to make sure people are only shown matches of similar financial ability, similar education levels, all these details. OkCupid is famous for working with this sort of data. And it becomes almost a weird sort of… algorithms deciding who gets to have kids, somewhere along the way, because they're influencing who we get to meet - and who we get to meet is who we eventually fall in love with, marry, and have children with. So we're going to start seeing the first algorithm babies in the future. This is one of those side effects we don't think about: how these analytics, this tracking, these black-box algorithms start influencing our world.

Daniel Forkner:

[45:56] So that Chinese company we mentioned - this financial arm - one of its spokesmen, bragging about the capabilities of the service, said it can take two people - say, a mother who's purchasing diapers, which it knows because it's part of this shopping network, versus another person who maybe plays video games 10 hours a day - and, in his words, "our goal is to say which one of these people is more responsible and show that in their social score." And this is kind of alarming to me, because do we want a company to decide for us as a society what is responsible and what is irresponsible behavior? And if we assume these companies are profit-driven and driven by short-term self-interest, what is motivating these decisions about what counts as responsible? And you mentioned algorithms - a lot of these decisions about what the company wants to see as responsible behavior are driven by algorithms. We think of algorithms as this very neutral thing - it's math-driven, how can math have biases? But these algorithms are initially programmed by humans, and all the human biases we have go into them. And as these algorithms go forward with their machine learning, we don't really understand - not even the people who program them - how they derive their end results.

So a lot of these algorithms, based on this data, are being used to determine people's credit scores. And as more and more data is taken from activities outside of financial matters - like whether or not we play video games - these algorithms now have free rein to decide whether we deserve a loan, and the loan officer who ultimately oversees the decision has no idea how the algorithm came to it. Neither do we as consumers. That's pretty alarming, considering we may change our behavior based on what these companies are saying about us.
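A tiny sketch of the kind of scoring being described here: a weighted sum over whatever signals the company chose to collect, financial or not. Every feature name and weight below is invented; the point is that "plays video games" can quietly move a number that a loan officer later treats as objective.

```python
# Invented weights: nothing here reflects any real scoring model.
WEIGHTS = {
    "pays_bills_on_time": 40.0,
    "buys_diapers": 5.0,           # read by the model as "responsible parent"
    "gaming_hours_per_day": -3.0,  # read as "irresponsible", per hour
    "uses_correct_grammar": 2.0,   # a signal some lenders have reportedly tested
}

def social_score(features, base=500.0):
    """Weighted sum over collected signals, starting from a base score."""
    return base + sum(WEIGHTS.get(name, 0.0) * value
                      for name, value in features.items())

parent = {"pays_bills_on_time": 1, "buys_diapers": 1}
gamer = {"pays_bills_on_time": 1, "gaming_hours_per_day": 10}
print(social_score(parent))  # 545.0
print(social_score(gamer))   # 510.0
```

Both people pay their bills on time, yet the gamer's number is 35 points lower - and nothing in the final score reveals why.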

David Torcivia:

[47:54] Yeah, and another point I want to emphasize about this technology, and a lot of these other algorithms too, is that it's not just your behavior that is taken into account, but also the behavior of your friends, your family, your friends of friends. Everyone's got that weird uncle that you hate, who's doing horrible things: he's influencing your score. That girl from high school who was so cool back then but turned into a burnout: she's influencing your score. Every single person you interact with throughout your life, and the people they interact with, are all part of your score. So while you might be personally responsible, maybe your family isn't, and you're being dragged down by the actions of others who may or may not actually have any bearing on your ability to pay back a loan or whatever it is they're trying to decide about you. And so we really are losing our agency as individuals, because they're boiling us down into scores and sets of values - whether it's long-term customer value, what you're worth over your lifetime; or the risk that you won't repay a loan; or how much you'll cost an insurance company; or, on a short-term basis, the chance that you'll buy something. All these algorithms, all this tracking, all this data and technology is about transforming you from an individual - from a human being - into a chance of doing something.
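The friends-drag-down-your-score effect can be sketched as one smoothing step over a social graph: blend each person's own score with the average score of their connections. The 0.75/0.25 blend and the scores below are invented parameters, not any real system's.

```python
def adjusted_scores(scores, friends, own_weight=0.75):
    """Blend each person's score with their friends' average score.

    `friends` maps a person to the people linked to them; anyone with
    no listed connections keeps their own score unchanged.
    """
    adjusted = {}
    for person, own in scores.items():
        circle = friends.get(person, [])
        if circle:
            friend_avg = sum(scores[f] for f in circle) / len(circle)
            adjusted[person] = own_weight * own + (1 - own_weight) * friend_avg
        else:
            adjusted[person] = own
    return adjusted

scores = {"you": 700.0, "weird_uncle": 300.0, "old_classmate": 400.0}
friends = {"you": ["weird_uncle", "old_classmate"]}
print(adjusted_scores(scores, friends)["you"])  # 612.5 - dragged down 87.5 points
```

Your own behavior set the 700; the uncle and the classmate, who have nothing to do with whether you repay a loan, took 87.5 points off it.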

Daniel Forkner:

[49:19] Your financial future is being decided by things outside your financial activities. I mean, there are financial service companies operating in some Southeast Asian countries right now that are experimenting with determining credit scores based on whether you use correct punctuation and grammar in your text messages. Your call logs, your GPS information, your friends and associates and...

David Torcivia:

[49:41] I want to stop for a second and interject to point out that while this is mostly going on right now in Asia and Africa and South America, a lot of the firms doing it are American. This is American technology, and it's eventually going to come back home to us - it's already being integrated into a lot of stuff. So this is not some "Chinese boogeyman over there" thing, like we're red-baiting or something. This is an American thing - American technology and American ideas that have spread out and are being used in all these places, and they're going to come back and bite us in the ass just like they're biting everyone else right now.

Daniel Forkner:

[50:14] A lot of people will say "look, this is a lot of data, but I'm really not worried about it because I have nothing to hide." But it does seem like we all have something to hide. We all have a private life that we'd like to think is hidden from view - not just from the public, not just from strangers, but even within our most intimate relationships. You may have a partner, a wife or a husband, and maybe when your partner leaves the house you like to jam out to your favorite rock song in your underwear - not something that I do, but some people do that. That privacy is very much a part of our lives. And even the things we disclose to some people we don't disclose to others, so there are contexts for our information. We tell our banker things we would never tell our doctor, and we tell our doctor things we would never tell our best friend. So we have that expectation of privacy and contextualization, but what these companies are doing in tracking every single moment of our lives is breaking down those contextual barriers and aggregating the data into a single source. A single database.

David Torcivia:

[51:23] That's where this privacy violation really comes in and starts actually feeling like I'm being violated: they're taking information from my doctor, combining it with information from my banker, and taking things that maybe only my friends know about me, and rolling it all together to form a more complete picture of me than anyone else has. This soulless, faceless company - these soulless algorithms - pretends to know who I am by collecting all this data from the different faces I show different people at different times. And that may or may not be true; that may not be who I actually am. It may think that's who I am, but that isn't necessarily the case. And then it uses that information to manipulate me. It's crazy.

Daniel Forkner:

[52:03] What really concerns me is the effect this will have on society over a long period of time. When that expectation of privacy is lost, how does it affect our ability to be creative? If you're a musician, you may like to practice your music alone before you show it to anyone, and when that privacy is gone it might affect your willingness to practice at all.

David Torcivia:

[52:24] Yeah, this is what's called a chilling effect - maybe you've heard that term before - where the fact that the things you do and say aren't private anymore affects your behavior: you do different things, you don't do something you would have done before. And I'm already seeing this with kids now. There's this website, okay - I was doing research for this episode and came across a service called Social Filter. It analyzes your social media for things it thinks might be a problem for you, whether you're applying to schools or for jobs or whatever. A friend of mine was testing it out, and he tweeted something vaguely political - a very mild tweet, nothing that would rock the boat; he wasn't calling for a socialist revolution or some weird Nazi thing, it was a very mild comment. But this software pops up and says "oh, you know, it's important to care about politics and government, but social media is really not the place for that, so you should delete this." What!?

Daniel Forkner:

[53:18] I've heard of that kind of social filtering, where it'll go prune your Facebook for employers, but I didn't know it did that in real time.

David Torcivia:

[53:27] Yeah it does - it sends you emails warning you about it. Social media is supposed to be a social place where you talk about these things. Where are we supposed to talk about them if everything is tracked and fed into these algorithms? Like, where is this supposed to happen?

Daniel Forkner:

[53:41] Many years ago, when Facebook was first introduced, it was common to hear "look, be careful what you put on there, because when you go to a job interview they're going to look you up - do you want them to see your drinking party from last night? No, so just be careful and keep it professional." And I think we all just kind of accepted that. But as we see it taken to its logical conclusion, we're realizing more and more that it's we the consumers, we the individuals, we as a society who have to conform to what our employers expect from us, or what our government expects from us. And I think we should start asking the question: are we the ones who need to conform and change our lives to meet the expectations of a company, or should it be the other way around?

David Torcivia:

[54:28] To me the answer to that is unbelievably obvious. The fact that I would have to mold who I am as a person to be an ideal worker for somebody else; not to rock the boat, not to say something political, not to show myself drinking, or anything that might possibly offend somebody - this is the crazy end game of making sure that you are a perfectly conformed worker bee, and that's… who wants that? That's a boring world, that's not anything interesting. I don't want that as a person; I don't want to live in a world where that's expected and where that's happening around me. Because all the individualism and creativity that emerges out of our experiences as individuals and human beings, from whatever country we're from - as Americans, as Chinese - that cultural identity, that difference in society, who we are and what it means to be living in the world and experiencing this culture, is about that difference. About trying new things, about being weird, about doing all sorts of things that might not be seen as socially acceptable by the masses. This pressure to get rid of all that, because now people will always see it because it's always recorded… the implications that has are terrifying.

Daniel Forkner:

[55:38] And look, we all make mistakes. Maybe you shouldn't have gone out last night and done that thing you did. But you took that social risk, you made a mistake, and then you slept it off and went about your life. What's unsettling about this data is that it has no shelf life; it's there forever, it's your permanent record, and that thing you did last night is available to advertisers and insurance companies and financial services companies forever. We shouldn't have to live with that reality. We should be allowed to make mistakes, we should be allowed to experiment, we should be allowed to be creative and to be individuals, but still exist in a society that accepts us and doesn't attempt to exploit those things about us.

David Torcivia:

[56:23] Well, exploitation - I mean, what are the implications of this for the future, for government leaders, for politicians, and for business leaders? They have to be profound. The people growing up in this right now, who experience having every moment of their life recorded… what's stopping these companies from taking this information and using it against them? Or what's stopping a rogue employee from doing the same thing? We've created the most powerful surveillance engine imaginable - way more than George Orwell or Aldous Huxley, or any of these dystopian writers, could have imagined - and we did it outside of the government. The government has its own powerful surveillance, but we created an even better one in the name of advertising. And we've unleashed this power, with no oversight, no ethics, and no control over this data, into the hands of just a few companies and a few people, and the fact that this power is just sitting out there is terrifying.

Daniel Forkner:

[57:11] Yeah and like you said this has incredible implications for our sense of democracy when a politician that grew up in this information age can be approached by a company that has data on him or her and be forced to act or vote in a certain way. That's incredibly dangerous.

David Torcivia:

[57:26] Once again we've brought ourselves to a very negative, “everything is awful” end to this episode. But we're not going to stop here, we're going to talk a little bit more. What can we do in this situation? Unfortunately, like always, the answer is not a lot. In terms of legislation there are some answers; Europe has some good privacy laws, the right to be forgotten for example, which is a step in the right direction. But they still have this sort of consumer tracking there, and maybe it's not at the scale or as personalized as what we have here, but that doesn't really matter because, like we pointed out, metadata reveals a lot about us. Again, it only takes about four pieces of random information to identify 90% of users - just four pieces of data, and we have 30,000, 50,000 pieces of data. The cat's out of the bag, we've opened Pandora's box; it's not going to be shut again through legislative action without shuttering these companies, so what are our options?

Daniel Forkner:

[58:19] Maybe this is why so many of us have this fantasy of living off the grid, you know, in the woods, living off the land, away from prying eyes. I mean, this is something that I actually plan to do for real, but it's not really practical for most people - so many of us live in cities. There's not a lot of woods you can just go off to, and besides, even if there were, why should we be forced to do that? It's not really practical to just live off the grid, and I mean, can we even live off the grid in this society, and is that really the only solution?

David Torcivia:

[58:50] I also identify with that fantasy and I hope maybe someday it won't be just a fantasy for me. But you're right, it is impractical - it's not practical for all of us at large - and so maybe this is a moment for me to become a bit more political than I have been. I have to say, I am left. Very, very far left. You keep going left and eventually you'll get to the end, and I'm a little further past that. And the reason I got all the way out there was struggling and grappling with this surveillance problem, both government and corporate. I looked at all these legislative solutions that we've seen; I've seen the companies and the government itself breaking these laws, ignoring these laws, and I came to the conclusion that the only way we can end this, that we can get around this - again, because this has already been let out there's no putting it back in, no getting rid of the technology - is by eliminating the incentives for the tracking. There are two big incentives to have this data, to record all this stuff about us and keep it forever.

One is economic: to sell stuff to us and change our behavior. And two is power: the political aspect of it, to influence elections and things, but also to influence the behavior of people.

The only way to get rid of this is to kill those two incentives, and the only way to kill those two incentives is 1) to restructure the economy so that there is no economic incentive for it - and of course, yes, that means some sort of communism or other alternative to capitalism, sorry all you entrepreneurs out there - and 2) to create a power system that doesn't reward the consolidation of power. Yes, that means that our current political system and the way our world is structured at large are not compatible with this. We have to look to other alternatives, and they are out there - maybe that's a conversation for a different time - but our world has to be dramatically restructured in order for us to get away from this panopticon that we've built.

Daniel Forkner:

[1:00:40] That's interesting David. I'd be curious to know if there are other solutions besides a total restructuring of our society, economics, and political structure. Maybe that's something we can talk about in the future; it's something I'm also questioning and trying to figure out myself.

David Torcivia:

[1:00:55] I would love to know those solutions too Daniel, but maybe we can figure them out as we go forward. Again, this is not the only time we're going to be talking about tracking; there's a lot more to discuss here. Believe me, we've only barely scratched the surface of any of this, and we're going to dig very, very far down into this whole world and talk about the actual mechanisms of it - and oh, are we going to spend a lot of time discussing Big Brother Facebook.

Daniel Forkner:

I'm certainly excited and I hope everyone is too; we've definitely got a lot more topics coming up that we're excited about. Again, we hope you join us - tune in next week.

David Torcivia:

And if you want to read more about any of this - if you want to see some of these creepy facial tracking videos from these advertising beacons, if you want to look at the research studies on metadata, or any of the other links that we've gathered, collected, and read through in researching this episode and the series - we have those on our website at AshesAshes.Org, under the name of this show, “Permanent Record.” You can also find us on your favorite social network, where they track all of us, at Ashes Ashes Cast.

Daniel Forkner:

I'm eyeing my vacuum cleaner right now - it's sitting in the corner of my room, and I see it in a whole new light now.

David Torcivia:

As you should and as you should with every part of our new digital world. Until next week.

Daniel Forkner:

Bye.