Subscribe now on: iTunes | Google Play | Stitcher | Soundcloud | Spotify | RSS | or search "Ashes Ashes" on your favorite podcast app.

Chapters

  • 05:50 DNA and Me
  • 11:56 Corporate Wellness Programs
  • 19:31 Why might your employer want your health information?
  • 27:09 Anonymous data isn't?
  • 34:42 Mental Health Surveillance
  • 41:01 Building a nefarious infrastructure
  • 48:47 In whose hands are the keys to women's health data?
  • 57:53 Privacy concerns in Australia
  • 1:00:27 Benefits from central health databases?
  • 1:04:46 What can we do?

Thank you Jandun for completing this transcript!


David Torcivia:

[0:06] I'm David Torcivia.

Daniel Forkner:

[0:09] I'm Daniel Forkner.

David Torcivia:

[0:10] And this is Ashes Ashes, a show about systemic issues, cracks in civilization, collapse of the environment, and if we're unlucky the end of the world.

Daniel Forkner:

[0:20] But if we learn from all of this, maybe we can stop that. The world might be broken, but it doesn't have to be.

[0:40] In late 2016, a man named Ross escaped his burning home in Ohio after gathering his heart pump and a few other belongings. Ross was 59 years old and had a pacemaker, and the police did not believe he could have escaped his house in time unless he had started the fire himself. So, a judge allowed the authorities to retrieve the data in his pacemaker. They had a cardiologist look at the data, and they concluded that based on his heart rhythm at the time of the fire he could not possibly have run out of the house. So, Ross was charged with arson and insurance fraud. And since then, the Middletown, Ohio police have been granted warrants for pacemaker data at least a couple of times, to build investigations against suspects. And this fact should give us pause.

David Torcivia:

[1:31] Millions of people are walking around every day with medical devices in or around their bodies. Whether it's an insulin pump, a pacemaker like Ross's, a small computer chip in their brain to regulate something like Parkinson's disease, or something more obvious like a prosthetic limb, most of these devices now store electronic data. And many even transmit this data wirelessly to healthcare providers and insurance companies. And when it comes to data that is so intimate to our personal health and the very way our bodies work and function, most of us probably find the idea of people peering into that information incredibly intrusive and unethical, even if it's in the service of an official search warrant for a police investigation. But, in fact, we live in a world today where our very personal medical and health-related data is open for the world to see. And the veneer of privacy starts to fade as we begin to see just the tip of the iceberg: marketers, retailers, analytics companies, intelligence institutions, and more, all getting their hands on the very fabric of who we are.

Daniel Forkner:

[2:37] But, David, it's not just medical devices that are acquiring data on our personal health. I mean, consider all the devices we have around us today that are collecting this data: our fitness trackers, our diet trackers, our smartwatches, our cell phones, and countless more apps and services that we use every day.

David Torcivia:

[2:58] I was actually just working on some advertisements, Daniel.

Daniel Forkner:

[3:01] Not again, David, come on.

David Torcivia:

[3:02] I know, I feel super guilty, but here I am anyway. I was working on these ads, they were for Philips, and Philips has a whole bunch of new products that are medical technology, ways-of-living-your-life products. They're, like, things that'll keep you from snoring, sleep apnea devices, things that will help you dream in specific ways. And all of these things are associated with apps that will send data back to Philips, or presumably also your doctor, and that data is, of course, very valuable. And Philips is, of course, not the only company that's doing this. Nokia, Withings, all these health tech companies have products that are just sitting there, watching you all the time, taking this data, throwing it up into the cloud somewhere, and then who knows what happens to it once that data is gone and out of your control. I mean, we're talking things that lie underneath your mattress and detect how you're sleeping, things that are looking at the air in your room, the environment around you, even the way that you toss and turn in your sleep. We're talking blood pressure, we're talking heart rhythm; there's a whole bunch of new smartwatches that will not only read your pulse, but if you put your hand on them will actually give you an electrocardiogram of your heart, which can be used to diagnose arrhythmia and other things. All these amazing technologies should make this an exciting time for taking our personal health into our own hands, instead of having to constantly offload it onto a medical doctor. Except for this cloud that hangs over our heads, which is quite literally the computer cloud where all this data is sent and uploaded, and at that point it enters a black box. And our privacy, well, it goes away with it.

Daniel Forkner:

[4:38] That's a scary thought. And in some places this is taken to an extreme, David, where people are not simply wearing devices or using them temporarily. In Sweden, for example, over 4,000 people at this point have gotten microchips installed directly into their fingers - so, inside their skin. And the purpose of this is supposed to be making life more convenient: you can connect your credit card information with it, so you can just purchase a good at a store by swiping your finger. Some people have linked it with their smartphone apps so that, hey, instead of giving you my phone number, let me just tap your phone, and now you have all my information. People are using it to get into their apartments without a key, and train stations are taking it instead of requiring a ticket. So it's another thing that we're introducing into our lives to make our day-to-day more convenient. But associated with all of this, like you mentioned, is electronic data. And when it comes to our personal health, the things that really make up who we are, that's the type of data we would want the most security around, right, David? I mean, that's the thing that we want to be most private. But we're finding that that's not always the case.

David Torcivia:

[5:51] Well, Daniel, what is more personally identifiable than the very DNA that makes us up and defines who we are as individuals and biological entities? And I can't ever talk about DNA without having that little song from Jurassic Park, "Dino DNA," playing in my head. You know what I'm talking about?

Daniel Forkner:

[6:22] Yeah, I know what you're talking about, David.

David Torcivia:

[6:24] With that aside, well, right now, thanks to modern DNA sequencing technology, we are sitting at a very exciting time in our personal health, and defining and learning who we are. And that's made possible by this very affordable DNA sequencing technology.

Daniel Forkner:

[6:41] That's right. I actually have some friends who, for Christmas, received a product from 23andMe; it came in a little box. And inside that box is a tube that you can spit into and then ship off to the company, which will sequence your DNA and tell you where you're from, or tell you your genealogy, broadly in terms of race and ancestry, and help you fill out that family tree if you have more data.

David Torcivia:

[7:07] Yes, we've all seen the 23andMe or AncestryDNA test kits that have been a popular gift for the past few years. And while some of those are limited to whatever random genealogy thing you're looking for - if you want to find out exactly what tiny percent Native American you actually are, these tests, I guess, can enable that - some of the higher-end offerings, including products from 23andMe, also provide health information with this data. And that can give you valuable insights into your health, things like whether you carry the gene that makes you more susceptible to breast cancer, in which case you should get more frequent mammograms, and get them earlier. These types of information can be used to empower your health, and there is something valuable there. But this also means that in order to give you these detailed insights into your health, companies like 23andMe have to store your DNA on their servers. That means somewhere there's a file: if you spit in this tube, mail it off, and pay your $99 or $150 or whatever it is, there's a file floating in this digital cloud that is your genetic code, who you are. And that information can be extremely valuable.

Daniel Forkner:

[8:15] Well, and I think that reality is unsettling for some people. And I was actually proud of my friends. You know, I didn't really say much when they told me they received this 23andMe product, I was kind of like, okay. But a couple weeks later I asked them, hey, did you ever get that done, and they actually said, you know what, we decided not to. We thought about it, and the idea of putting our genetic information in the hands of some company that might use it against us was unsettling, so we decided not to do it. And the reality is that many people are waking up to the fact that companies like 23andMe and Ancestry.com have a history of selling that genetic information to pharmaceutical companies and others. And if there was any doubt about that, last year one of the largest pharmaceutical companies, GlaxoSmithKline, invested 300 million dollars in 23andMe as part of a four-year partnership that will give the pharmaceutical company more direct access to the genetic profiles of millions of people. Part of that is to develop new drugs, which maybe we'll benefit from, David, but who knows what other nefarious purposes they might have for that data, that could come back to harm us. But the astute listener out there says: but Daniel and David, that data is anonymized. Oh really? Well, we'll get to that.

David Torcivia:

[9:31] This is my big hypocritical admission, Daniel: I actually have done the 23andMe. Years ago, I did it after my parents decided they wanted to do it, and at that point my DNA was donezo, they already would have known exactly who I was. I had no more genetic information left to protect. So, I spit in the tube and now I know a bunch of health traits about myself.

Daniel Forkner:

[9:56] Was it worth it David? Was it interesting?

David Torcivia:

[9:58] No. I mean, the ancestry stuff is, like, totally ridiculous and questionable at best. But I saw some interesting health things, and I know I don't have to worry so much about some stuff, so. And that's something we need to remember in this episode in particular: a lot of these technologies do have value for their users. It's just the way that this data is ultimately used that betrays the users' trust and the privacy that we feel we should have over our data. Even when we're giving somebody permission to take it, oftentimes we don't realize what's being done with it, or how it's being taken, or that it's being taken at all in the first place. And I will say, in 23andMe's defense, this research data sharing is opt-in, which is nice. If someone is going to be violating your privacy, you should have to opt in to that, so that is a good step forward. Of course, though, everyone clicks that box, you know, it's basically a terms and conditions box, just like the box you check at the DMV when you want to donate organs. I mean, who's not gonna donate organs, unless you have some religious reason? The same thing is taking place here: like, oh, of course I'll give my DNA for science, I'm gonna do some good.

Daniel Forkner:

[11:04] Well, and of course for so many companies, their terms of service are not really optional. I mean, I guess you could call it opt-in, in the sense that you have to agree to them to use the product, but for many of these services it's not like you could use the service and not agree to the terms. It's a take-it-or-leave-it proposition.

David Torcivia:

[11:19] Right, well, I mean, this opt-in is specifically for sharing your data for third-party research; they do different things depending on where your data's warehoused, and you can ask them to delete it. But not all these services are created equal. Ancestry and their kits are much more privacy-invasive than 23andMe's. There are other ones you can pay much more for, where, because of their business model, the profit is just in the sequencing and not in selling your data. Those options are out there as well. So you can take this data and get it into your hands without having to sacrifice privacy, but you're going to be paying for it. Because a lot of times this is the trade-off that we come to: convenience, or cost in this case, in exchange for your privacy and data.

Daniel Forkner:

[11:56] Right. And since we're talking about genetic information, the companies behind it, and convenience, David, how about we look for a minute at companies and their direct relationship with their employees, and at how many companies create pressure for employees to partake in programs that ultimately harvest their personal medical and health-related data. It's no surprise that more and more companies today are asking their employees to participate in these types of programs, including health screenings and biometric risk assessments to determine what kind of risk factors you might have for a certain disease, as well as implementing broad wellness programs, which we'll get to in a second. Companies also want to know how often their employees smoke tobacco, and many are beginning to financially penalize those who do, along with those who do not participate in many of these programs. But sticking with genetic information for a minute, David, there was a bill introduced into the US Congress in 2017. It was approved by the House Committee on Education and the Workforce. It's H.R. 1313, and in December it was discharged by the other two committees, which means that it can go straight to the floor of Congress to be considered. And, as far as I can tell, no further action has been taken yet, but it remains open for consideration. And what this bill would do is allow employers to collect the DNA of their employees directly, as part of their voluntary wellness programs. And so, like I mentioned, many companies offer these wellness programs, which are aimed at improving the health of their employees so that they can lower health costs. And, although they are voluntary, there are significant financial incentives and other benefits tied to participation. So in companies that offer employer-sponsored insurance plans, for example, not participating in a wellness program could mean paying substantially more on your health insurance premiums, as much as $1,500 or more. And if this bill were to pass, it means that employees who refused to give up their genetic information would be faced with this financial trade-off.

David Torcivia:

[14:04] But wait, you're screaming into the void right now: David and Daniel, you're wrong! In the United States I know that there's a law that protects our genetic health information.

Daniel Forkner:

[14:13] The hippopotamus law.

David Torcivia:

[14:14] The hippopotamus law, no, it's unfortunately not, Daniel. It is actually a 2008 genetic privacy law, also known as GINA, or the Genetic Information Nondiscrimination Act. That's a mouthful; I can see why they call it GINA. And what this law does, basically, is prevent companies from discriminating against you based on your genetic data. Simple enough. But because the wellness programs that we talked about in H.R. 1313 are voluntary, this GINA law, in fact, doesn't apply. But again, when there are financial consequences in the thousands of dollars for not participating, can you really call something voluntary? And that raises the question: why do employers want this information in the first place?

Daniel Forkner:

[14:57] Well, that's a good question. And, again, the idea of these wellness programs is that if you can modify the lifestyles of your employees away from unhealthy ones, by encouraging less smoking, more exercising, better diets, more frequent health screenings, then from a business perspective you might lower the occurrence of lifestyle-related diseases, which means you can negotiate lower-cost insurance plans and you'll have fewer payouts to assist these employees when they have health issues. Eighty-five percent of large companies that offer employer-sponsored health insurance also offer these wellness programs. And what participation in them looks like depends on the company, and in some cases it depends on the health of the individual. There's one company that has a different program for employees with diabetes than for those without. And for those who do have diabetes, the company offers a 30% discount on insurance premiums if you lose weight, keep your blood sugar down, or visit the gym at least 10 times each month. And that 30% discount, by the way, amounts to $1,400 a year for individuals. And as proponents of the collection of genetic information point out, employers with access to their employees' DNA could get even lower rates from insurance companies, because, they argue, it allows companies to identify particular health risks and offer preventative measures early. The opponents argue that it is just another insight into the lives of individuals that could be used to discriminate against them. And, you know, I think it also raises a question, David, of why we are putting our personal health into the hands of those that sign our paychecks. It seems really unrelated.

David Torcivia:

[16:40] Well, I've got a lot of thoughts on that, and I'm going to get to that in a minute. But first, I really want to address these wellness programs because, you know what, on paper they actually don't sound so bad, right. Okay, so I'm a company, I want my employees to be healthy so that they can be productive, so they can make me as much money as possible, and I don't have to constantly pay a bunch of insurance for them, have them out of work, you know, whatever.

Daniel Forkner:

[17:02] Especially if they work in something like a coal mine, you want them to be strong and fit.

David Torcivia:

[17:07] And not filing class-action lawsuits against me. So, I have a huge motivation to keep my employees healthy. And if I can have a program that encourages them to be healthy, well, you know, that sounds like a great deal. And that's how these wellness programs have really come into vogue, and they're really growing in a lot of companies. There are billions of dollars poured into these programs every year. But the big problem with these wellness programs is the fact that they actually don't work at all.

Daniel Forkner:

[17:32] You mean they don't lower insurance premiums for companies?

David Torcivia:

[17:35] In fact, they end up costing companies more money, because they're paying for these wellness programs in addition to their regular insurance premiums. They don't keep their employees healthier; the employees who actually enroll in the wellness programs often cost more to insure than those who do not. In fact, every single measure of success for these wellness programs has fallen flat, and in many cases studies found the exact opposite occurring. And this is not just one study but many, and there are some very long-term studies going on right now that have so far found that, yes, these programs are a huge failure, but they are still continuing. And I don't know if the reason is that businesses are just in denial about this, because every CEO wants to be the one who has this great program to take care of their employees, or because there's something more insidious happening here, Daniel, like you've alluded to.

Daniel Forkner:

[18:23] Absolutely, David, I think that's kind of the theme of this episode. I mean, what we're talking about is medical surveillance, and while the idea behind medical surveillance is that you can lower insurance premiums and help people out by identifying health risks, at the end of the day it's mostly incentivized by the fact that that information is extremely valuable from the standpoint of a marketer or a data analytics company. The more you know about a person's lifestyle, the more ammunition you have to target particular vulnerabilities, target certain risks, or exclude them if you don't like certain lifestyles they exhibit. And we see with these company wellness programs that these initiatives are often managed by third-party companies, so these are third parties that are collecting your personal health information on behalf of the company that hired them. But these third-party companies are unregulated, and they often sell this data to other companies, to the highest bidder. And it's all part of a massive wellness industry right now that is growing at a rapid clip, and is trying to consolidate the market for technologies that can collect data on every part of your life and activity.

David Torcivia:

[19:31] Before we move on to all of this data, what it is, where it's going, Daniel, I really want to focus just one moment on something we said a moment ago: this idea of control. Because a lot of this data, this health data, is used for control. And as you know, we've spent a lot of time on this show discussing health care, medical care, all these things. We did a very long, three-part series on the American healthcare system, episodes...

Daniel Forkner:

[19:53] Episodes 45 through 47.

David Torcivia:

[19:56] Which we highly encourage you to check out. We love them. But the idea is that our healthcare, and this is in particular in the United States, where we do not have a form of universal healthcare, where everything is provided by individual companies, either the company you work for or one you purchase coverage from, is used to control us as individuals.

Daniel Forkner:

[20:17] Well, can you give us an example, David, of what companies might be trying to control through these types of programs?

David Torcivia:

[20:23] Let's quickly look at this. So think about universal healthcare for a second, those two dreaded political words in the United States. Universal healthcare or Medicare for all, whatever you want to call it. A program where everyone has insurance and is automatically covered, and it is covered by the government instead of private healthcare companies. Those dirty, commie words, because we're not participating in the free market or whatever.

Daniel Forkner:

[20:47] Okay, I'm imagining it.

David Torcivia:

[20:48] Okay, you're imagining it right now. And this is something we should state that a vast majority of Americans want. It depends on what poll you're looking at, but over 60% of Americans want this. Among Democrats it's very, very high, and even among Republicans, surprisingly, it's high and getting higher. This is a very popular idea; we've seen it work all over the world. So many other places in the world have much lower healthcare costs and better healthcare outcomes than the United States. It works. We have the most inefficient healthcare system in the world by a long shot, so, go America. So why is this the case?

Daniel Forkner:

[21:22] Yeah, so tell me why companies, then, wouldn't want this, when they're paying more to ostensibly lower health costs of their employees, but it's not working.

David Torcivia:

[21:31] Right.

Daniel Forkner:

[21:32] Why wouldn't companies want this, then?

David Torcivia:

[21:33] That's a good question, and it's an important question to remember here. Because companies are paying a huge amount out of pocket to cover our healthcare, if we're lucky enough to be working for a company that gives us health insurance. Your cost to a company is probably 20 to 40% higher than just your salary, partly because of taxes, and in large part because of that healthcare cost. It's very expensive for these companies to be providing us all this healthcare. So why aren't they clamoring, lobbying, talking to all of the politicians they have in their pockets, saying please pass this universal healthcare, save us trillions of dollars a year in healthcare costs? That's a good question stockholders should be asking right now. And the reason why, Daniel, is that health insurance has become a huge bargaining chip to control labor, in the same way that pensions were decades ago, as we've talked about in the past.

Daniel Forkner:

[22:24] Episode 10 "Broken Promises."

David Torcivia:

[22:26] So then imagine you are working at a company; you hate it, but they have great health insurance. But you also have a family: you have a child, and your child is sick. They have some illness that's not getting better, you have to take care of them, and you desperately want to quit this company. But your partner doesn't work, and your child is sick. Are you going to risk leaving this company to find somewhere else that you would like much better, that could maybe pay you more money, but would have worse health insurance? Or worse, you fail to find that new job, or even after you get accepted it falls through at the last minute, and you find yourself with no health insurance and a sick family. Are you going to look for something else in that case?

Daniel Forkner:

[23:05] That's a great point. I think when faced with that type of pressure, like, hey, I have a steady job, it's paying for my health insurance and for the health insurance for my family, you know, maybe I don't like it, but why go through the hassle of moving somewhere if there's a chance that I'm gonna be months without health insurance? And, I mean, especially considering how expensive it is. We've already mentioned that medical bills are the number one cause of bankruptcy in the United States across the board, and that's for people who even have insurance. It's extremely expensive, and having an employer help pay for that takes a true burden off so many people's shoulders. Yeah, you might be onto something.

David Torcivia:

[23:42] Exactly. This private insurance scheme is something that keeps the job market as illiquid as possible, and employers benefit hugely when it's a huge burden for an employee to move to a new company. The harder they can make that process, the longer they can hang onto you, at lower wages, and take advantage of you and exploit you in other ways. This is why we're not seeing universal healthcare despite a majority of Americans desperately asking for it - and if I were a politician running for president or anything else, I'd say, well yeah, of course I'm gonna support this, most people want it. But the fact is that so few do; even on the Democratic side, where something like 70 to 80% of Democrats want a Medicare for all or universal healthcare program implemented, these party leaders are still saying no, this is never gonna happen. And I wonder why. But David, you're saying, you're getting way off topic. Well, this theme of control in our health is something that is very important, and it is very closely related to the surveillance that we're seeing of our medical data, and of the medical devices that we see around our lives.

Daniel Forkner:

[24:43] There's no doubt that this wellness program industry, despite losing money for companies, is growing. It's an eight billion dollar industry in the US. It's expanding, and particularly of note is that many companies are expanding their wellness programs beyond just the medical. So they want to monitor not just your physical health but your financial security and your social activity, all so that they can track and modify the behavior of their employees into favorable patterns, in some cases by using software that automatically delivers incentives to employees based on their behavior, something out of our episode "Plugged In." According to Fast Company, the founder of Peerfit, a startup that connects employers with fitness centers, "predicts that in five years, data-driven incentives will be the norm, based on employees' locations and personal preferences. Wearables will track certain aspects, while others will be driven by employee need. Some employee engagement platforms, such as YouEarnedIt, integrate wellness incentives with other employee engagement activities, so workers can earn points that can be redeemed for prizes. This type of gamification also increases the amount of data that employers and third parties have to work with." And there's another startup founder, from the same article, who highlights the ways companies can track employee financial behavior. He says, "For example, if you have a lot of people taking 401k loans and hardship withdrawals for non-housing reasons, that tells you that they're having financial issues." I mean, that's pretty obvious, but the idea is that the company is the one monitoring that, I guess ostensibly so that the company can help them with their financial issues. But it's not hard to see, despite how companies spin this, that these employers want to see into every aspect of their employees' private lives, because that gives them tremendous power to discriminate, to penalize, to fire, to ultimately control their work force, because they hold the fundamental carrot that everyone needs - a paycheck.

David Torcivia:

[26:49] But we also mentioned how this data is managed by third parties, who can ultimately sell off all of it, and that's another big piece of this story. While, yes, there is the insidious infrastructure being built to enable companies to directly control our private lives, this data is also valuable to the old-school industry of marketing.

Daniel Forkner:

[27:10] But hold up, David, let's take a step back here for a minute because we're talking about all this data that companies collect, they sell off, and all this, but as many people will point out, and like we alluded to with the 23andMe genetic information, much of this data that is harvested on us is anonymized. And companies tell us this, they say yes, okay, we collect a lot of information on you, we do sell that to marketers, but we strip that information of your personal name, no one can trace it back to you, it's just for research purposes, it's just to help us improve our services. You hear that one a lot. Hey, do you mind sharing your information with us? It will help us provide better services to you. But is this something we can trust, David? Should this give us peace of mind, or should we give it a piece of our mind?

David Torcivia:

[27:56] Daniel, it's true that most personal health information is supposed to be stripped of personal identifiers before it gets sold to companies on the open market. The fact that our health data is sold at all is something we should all care about, but at least these companies anonymize the data so it can't be traced back to you or me. At least that's the idea. But, as always on this show, that idea turns out to be completely wrong. There was a study published just last month by UC Berkeley, for instance, that used machine learning to identify individuals from anonymized data on their physical activity. Specifically, researchers got just one week's worth of walking data from 15,000 individuals (about 10,000 adults and about 5,000 children), data that came from wearables like Fitbits. And then they used machine learning to match the data to specific individuals, with 80% accuracy for children and 95% accuracy for adults.

Daniel Forkner:

[28:52] I just threw my Fitbit on the ground, David.

David Torcivia:

[28:54] Smashed it into dust, Daniel, that's what you should be doing. I hope everyone does that by the end of this episode. But the lead author of the study wrote about their findings, "The results point out a major problem. If you strip all the identifying information, it doesn't protect you as much as you think. Someone else can come back and put it all back together if they have the right kind of information. In principle, you could imagine Facebook gathering step data from the app on your smartphone, then buying healthcare data from another company and matching the two. Now they would have healthcare data that's matched to names, and they could either start selling advertising based on that, or they could sell the data to others." Ultimately, the authors argue that artificial intelligence is making it easier for companies to connect us with our private data. And that our privacy laws are much too weak to protect our health records from marketers, and those with even more nefarious purposes.
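
To make the quote's scenario concrete, here is a minimal sketch in Python of how this kind of re-identification can work. Everything in it is a hypothetical stand-in - simulated step counts, simple mean-steps-per-hour features, an off-the-shelf random forest - not the Berkeley team's actual data or method.

```python
# Minimal re-identification sketch (hypothetical data, not the study's method).
# Requires numpy and scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_users = 200
minutes_per_week = 7 * 24 * 60

# Each simulated user gets a persistent hour-by-hour activity "signature":
# when, and how much, they tend to walk.
signatures = rng.gamma(shape=2.0, scale=5.0, size=(n_users, 24))
hour_of_minute = np.tile(np.arange(24).repeat(60), 7)  # hour label per minute

def simulate_week(sig):
    """One week of per-minute step counts: the signature plus daily noise."""
    noise = rng.normal(1.0, 0.3, size=(len(sig), minutes_per_week))
    return np.clip(sig[:, hour_of_minute] * noise, 0.0, None)

def hourly_profile(week):
    """Collapse a week into each user's 24-value mean steps-per-hour profile."""
    return week.reshape(len(week), 7, 24, 60).mean(axis=(1, 3))

week_with_names = simulate_week(signatures)  # week where identities are known
week_anonymous = simulate_week(signatures)   # later week, names stripped

# Train on the identified week, then try to put names on the "anonymous" week.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(hourly_profile(week_with_names), np.arange(n_users))
guesses = clf.predict(hourly_profile(week_anonymous))
print(f"re-identified {(guesses == np.arange(n_users)).mean():.0%} of users")
```

The point is only that if each person's routine is distinctive and persistent, a generic classifier can match a nameless week of data back to the person who produced it; stripping the name field removes nothing that matters.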

Daniel Forkner:

[29:48] David, I was thinking about this issue, how companies anonymize our data, and I came up with a hot take. And ...

David Torcivia:

[29:54] Lay it on me.

Daniel Forkner:

[29:55] Well, the more I thought about it, the more I realized there is no such thing as anonymized data.

David Torcivia:

[29:59] Now you're onto something here, Daniel.

Daniel Forkner:

[30:02] The concept is a contradiction, it doesn't make sense. Yeah, I mean, it's something that companies say they do, and I guess it's supposed to make us feel good, like we imagine, you know, all the data about our life just goes through some giant scrambling machine and comes out the other side clean. But, if you think about it, if data was truly anonymous it wouldn't be useful, you couldn't use it for anything. A truly anonymous piece of data would be something like, "there was a woman." Okay, that's anonymous. But the moment I start trying to connect that data point to something else, like, "there was a woman who walked into a store," well, now I have a store name, I have a store location, I probably have a date and time that this woman walked into this store, right. I might have a purchase history. And while these companies strip certain information - okay, we know there was a woman who walked into a store, who purchased a certain thing, but we took her name off of it - the whole reason why this is a problem is that these data points don't exist in isolation. They are connected to other bits of information. That's exactly what these data brokers exist to do - they harvest data from your doctor visits, then they harvest information from your Fitbit, then they harvest information from your phone app where you inputted your diet, then they harvest information that was shared with your insurance company. And even though each bit of this data may have your name removed, by connecting the dots you have a location, you have zip codes, you might have a phone number, you might have the last four digits of a credit card. It's very easy to start connecting these dots to say, okay, we have the person. In fact, David, there is a bizarre, mask-off illustration of how ridiculous it truly is to trust anonymous data, so listen to this, okay?

David Torcivia:

[31:42] You can't see me, Daniel, but I'm rubbing my hands together in eager anticipation. Let's hear it.

Daniel Forkner:

[31:48] Last year, Facebook ...

David Torcivia:

[31:49] Oh, it's gonna be good, I know.

Daniel Forkner:

[31:51] The social media company was talking to hospitals in an effort to start up a project that would match anonymous Facebook profiles with anonymous medical information, in order to identify people. Facebook assured people that in this process the company would not "deanonymize" the data.

David Torcivia:

[32:11] Wait, wait a second, I'm hearing anonymize and anonymous a lot here.

Daniel Forkner:

[32:15] So let me clarify for you, David. Facebook is telling its users, hey, we have your profile but it's anonymized. And then Facebook goes to hospitals, who have patient data which is anonymized, and the two groups come together and say, hey, if I take my anonymized data and connect it with your anonymized data, now we can identify people and provide services to them. I mean, it's a complete farce, but what kind of crazy world do we live in where companies can actually advertise that type of thing? But it does point out the false reality of so-called anonymous data.
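
To see how little the word "anonymized" protects here, consider a minimal sketch with made-up records, using Python's pandas: neither table contains a name, yet joining them on the fields they share singles people out anyway.

```python
# Linkage-attack sketch with invented records; no real data or real schema.
import pandas as pd

# "Anonymized" social profiles: no names, just demographics and location.
profiles = pd.DataFrame({
    "zip": ["30301", "30301", "10001"],
    "birth_year": [1984, 1991, 1984],
    "sex": ["F", "M", "F"],
    "last_checkin": ["2019-02-03", "2019-02-03", "2019-02-04"],
})

# "Anonymized" patient records: no names here either.
patients = pd.DataFrame({
    "zip": ["30301", "10001"],
    "birth_year": [1984, 1984],
    "sex": ["F", "F"],
    "diagnosis": ["depression", "diabetes"],
})

# The join is the deanonymization: zip + birth year + sex is unique enough in
# this toy data, and Latanya Sweeney famously showed that ZIP code, full birth
# date, and sex alone are enough to uniquely identify most Americans.
linked = profiles.merge(patients, on=["zip", "birth_year", "sex"])
print(linked)
```

Each table on its own looks harmless; the identification happens entirely in the merge, which is exactly the step the "we won't deanonymize" promise glosses over.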

David Torcivia:

[32:53] What I really love about this is that it's almost word-for-word exactly what the researcher in that study warned about companies like Facebook doing. And here it is, an actual plan that Facebook came up with. But this kind of stuff sort of happens all the time. According to ProPublica and NPR, there's a company called Optum that has compiled data on over 150 million Americans, going back more than 25 years. This includes their medical diagnoses, medical tests, the drugs they take, and even things like their socio-economic data. The company then takes all of this and connects it with other data on individuals' education, their race, and their family. In 2016 the company filed for a patent on a technique for linking people's social media posts with details from clinical visits and payments. The company claims they anonymize this data - there's that word again. But as we've come to learn, you have to be joking. You can't tell me that with that many data points on an individual you can't identify them. It's like if you and I were sitting in a room, Daniel, and some other person said, "I'm thinking about a person, and I'm not naming any names, but they're in this room, they're wearing brown shoes, they have brown eyes, they're six feet tall, they have a red shirt on, but I'm not naming any names."

Daniel Forkner:

[34:05] Yeah, at least they wouldn't be naming any names.

David Torcivia:

[34:06] It reminds me of the scene from The Simpsons, Daniel. They're talking about a snitch and they don't want to let the classmates know who it was, so they say, well, let's just say it's L. Simpson. Wait, no, that's too easy, maybe Lisa S.

Daniel Forkner:

[34:19] Yeah, exactly. And to be clear, this information that ProPublica and NPR reported on this company Optum, they got that from its marketing material. It's not, like, secret. There was a convention for medical health professionals where they could meet all these data analytics companies and peruse their services. And this was just one of the many companies that says, hey, this is all the data I've got, we can do something with it, pay us. But we've been talking about a lot of markers of physical health, David, things that companies want to know about, our lifestyle habits and all this. And this starts to get disturbing when we realize that one of the main goals of all this prying into our private lives is to figure out what our individual mental and emotional states are. That is, these companies, these social media conglomerates, Facebook, Google, what they want to do is diagnose us, themselves, with mental health issues and other mental conditions. And this starts to get a little disturbing, David.

David Torcivia:

[35:23] A number of large companies are currently pursuing technologies that will enable them to track the mental health of, you guessed it, all of us. Google, IBM, Apple, all these companies have departments dedicated to finding connections between individual behaviors and the expression of our mental health disorders. Companies are looking at our browser search history, our shopping habits, our mobile phone use, the results from our wearable Fitbits and other devices, all to identify, as early as possible, what our mental and emotional states might be. This research is driven by the assumption that when it comes to things like depression, the best way to monitor its occurrence is through the tracking of real-time behavior. Because subtle changes in our habits can yield profound insights into our emotions. For example, companies are trying to find ways to turn our mobile phone microphones into tools for detecting changes in our very own voice patterns. Maybe yesterday you spoke with enthusiasm to those around you, but today your voice betrays an inner insecurity and uncertainty. And that's the type of intel companies want to know.

Daniel Forkner:

[36:30] It didn't take long for Watson to go from lovable game show contestant to evilbot. And one of these departments that I want to highlight, David, is called Verily Life Sciences; it's owned by Google. But there's one individual, his name is Tom Insel, who used to be an executive at Verily, but then he left for a startup called Mindstrong. Mindstrong, of course, is a hot new startup that just recently surpassed 29 million dollars in funding, and it aims to use smartphones to identify mental health markers. This company offers health providers, and by extension health insurers, software that reports on an individual's 24/7 activity on their smartphone. And the analytics behind all that activity are provided by machine learning, which infers what that activity may signal about a person's mental health. And we talked about company wellness programs and how voluntary doesn't necessarily mean voluntary when there are financial consequences for not participating. And software like Mindstrong's is the perfect type of tool that could be easily forced upon people, either directly by an employer, or indirectly by health insurance plans that can adjust premiums based on participation.

David Torcivia:

[37:46] Well, Daniel, to really illustrate these ideas let's look at specifically what Tom Insel, the founder of this company Mindstrong, is bragging about in terms of his technology. So this is an article from Wired. Tom is pursuing the "idea that a combination of your medical records and how you use your gadgets, tracking of activity correlating with depression or future self-harm, let's say, could be a big data bonanza for predicting and treating health issues. For a bipolar patient whose mania is manifested in rapid, uninterruptible speech, or hypergraphia, their disease could be characterized by the frequency, length and content of participation in social media, write the researchers who defined the term in Nature Biotechnology." Tom Insel's company Mindstrong is trying to figure out how the way we type on our cell phones correlates with depression, psychosis, and mania. According to Insel himself, "The complication is developing the behavioral features that are actionable and informative. Looking at speed, looking at latency or keystrokes, looking at error, all of those things could prove to be interesting."

Daniel Forkner:

[38:52] David, that's terrifying. To be clear, he is saying that they want to look at not just what you type, but how fast you type it, errors in spelling and grammar that you might make, how quickly your thumbs move today vs yesterday, and on and on. And at this point we have to take a step back and ask, like, what the fuck? I remember in an earlier privacy episode, I think it was "Permanent Record," we discussed how technology is being deployed in Southeast Asia to exclude individuals from financial services based on their spelling abilities when they send text messages. And how that type of technology, often developed by Western companies, would one day come home to haunt us as well. And that's exactly what this is. I'm gonna be honest with you, David, I really do not understand how we live in a society today where companies can openly brag about their efforts to track our intimate cell phone use, our intimate lifestyle behaviors, all so that they can label us with a mental disorder, so that companies can profit by excluding us from certain services. Or otherwise, even worse, manipulating us when we are in a vulnerable state, because they know that we are susceptible in those moments to their advances. It's really sick, if you ask me.
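
To ground what "looking at speed, looking at latency or keystrokes, looking at error" might mean in practice, here is a minimal sketch of pulling those kinds of features out of a typing log. The event format, feature choices, and thresholds are all hypothetical illustrations, not Mindstrong's actual pipeline.

```python
# Keystroke-dynamics sketch with an invented event log; standard library only.
from statistics import mean, stdev

# Hypothetical log: (seconds since session start, key) for one typing burst.
events = [(0.00, "h"), (0.14, "e"), (0.31, "l"), (0.45, "l"), (0.62, "o"),
          (1.90, "BACKSPACE"), (2.10, "o"), (2.33, "SPACE"), (2.51, "w")]

timestamps = [t for t, _ in events]
gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]  # inter-key latencies

features = {
    "keys_per_minute": 60 * len(events) / (timestamps[-1] - timestamps[0]),
    "mean_latency_s": mean(gaps),
    "latency_jitter_s": stdev(gaps),            # irregularity of typing rhythm
    "long_pauses": sum(g > 1.0 for g in gaps),  # hesitations longer than 1s
    "error_rate": sum(k == "BACKSPACE" for _, k in events) / len(events),
}
print(features)
```

A system like the ones described would compute features like these continuously and watch for day-over-day drift, treating a slower, choppier, more error-prone rhythm as a possible signal about mood or cognition.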

David Torcivia:

[40:07] But it shouldn't be surprising, Daniel. I mean, look at all the funding that's going into these companies, all the investment that's happening in this area. I mean, there are millions, billions of dollars being poured into this technology, and that money, in the investors' eyes, is something that's going to be made back. And how are they gonna make that money back? Not by making sure that they're taking care of all of our health! That's something that costs money, and that's a service these companies aren't providing. But instead, by either selling data on how we're feeling, physically, emotionally, whatever, which is something advertisers can take advantage of, or selling our health data in order to deny us things, whether that's better terms on things like loans, or maybe access to plane rides, like we're seeing right now in China with the social credit system used to actively deny people transportation around their own country, or ultimately something more sinister, something like genocide.

Daniel Forkner:

[41:02] We really have to pause, take a step back and ask what are we building here, what are we allowing our tech entrepreneurs, our companies to construct? When an entire company, or an institution, or a government can pull back the veil on its entire population and identify, oh these are the people with depression, these are the people that are at risk for this disease, these are the individuals who have this genetic profile. Think about the power inherent in that type of knowledge.

[41:34] Are we really gonna believe this PR take that all this is for our benefit, that all this is in our interest, that these companies are breaking their backs to learn every detail about our lives so that they can make them better? You know, we have historical examples that we learn about in school of atrocities involving genocide, like you mentioned, David. Discrimination, Jim Crow, slavery in the American South; we learn about internment camps, concentration camps, all these terrible, atrocious things that occur when an institution identifies a group of people based on some physical trait, or some social class, and then physically forces them into a way of life.

[42:16] But we are building an infrastructure, that word is important, infrastructure. Think of the roads that we walk down, or drive down today, or better yet, the train tracks that we traverse when we get on the light rail system.

[42:30] The infrastructure we're building is giving over the power to these institutions to achieve the exact same effects as those violent physical exclusions of whole peoples that we learn about. Except missing will be the visceral and direct use of physical force, and missing will be the direct visualization of it taking place. When every part of an individual's life can be digitized and tracked, you don't need to round people up with guns; you simply adjust, behind the scenes, which roads of society are open to them. You've been identified as a person at risk for depression? Well, maybe when you go to search for job openings, or apartment vacancies, your computer merely hides certain opportunities without your knowledge, and so directs you to a certain part of town. You've been identified as having diabetes? Well, then maybe when you go to apply for a mortgage loan online, and online is probably the most likely way that you'll get a mortgage as everything becomes automated, maybe the mortgages presented to you will be too expensive for certain zip codes. I mean, after all, diabetes is a pretty good proxy for race. And if you wanted to exclude a certain race from a part of town, just increase the cost of a loan when they go to apply, and they will never know.

David Torcivia:

[43:47] Or we could just listen to Executive Editor of Politics for The Telegraph, James Kirkup, writing in 2015. "Yes, we now live in a world where your phone might observe you to help assess your mental health, but that feeling of unease should not determine our response to technology and mental health. In fact, we should embrace and encourage the tech giants, they seek to chart the mind and its frailties, albeit on the condition that we can overcome the enormous challenge of devising rules and regulations protecting privacy and consent. If you think the idea of Google assessing your state of mind and your phone monitoring you for depression is worrying, you're right. But what's more worrying is that allowing these things is the least bad option on mental health."

Daniel Forkner:

[44:29] If it doesn't come across clearly, David, what he's telling us is: yes, these tech giants are invading our lives, but we should welcome that, we should embrace it. Because they're providing insights, and without that insight life might be a lot worse. Now, he's writing from the UK, and in his article he does mention Britain's National Health Service, the NHS, as one of the parties responsible for implementing the types of rules and regulations that he assumes will just emerge out of thin air. And this is particularly ironic considering Google has already proven itself to be outside the control of the NHS, or any local governmental body. And that example comes from Google's previous subsidiary company DeepMind Health.

David Torcivia:

[45:15] With a name like that, it's definitely got to be all good, right? Well, DeepMind offers a service called Streams to healthcare providers. And the idea is that by centralizing a patient's personal medical data in one place, artificial intelligence can be used to identify when a patient needs immediate attention from a particular nurse, in the hopes that the warning signs for fatal incidents of sepsis, or kidney failure, can be recognized early enough to ultimately prevent death.

Daniel Forkner:

[45:41] I'm on board so far.

David Torcivia:

[45:43] It sounds good, right, there's nothing, nothing bad here yet. The concern from the NHS, however, was that Google could use this highly sensitive personal health data to violate people's privacy and profit off their conditions. And that's why the NHS made DeepMind and Google both agree to keep the data separate from Google, only in the hands of DeepMind itself, and to appoint an independent board to oversee the handling of this information. In 2016, a DeepMind co-founder wrote, "We've been clear from the outset that at no stage will patient data ever be linked or associated with Google accounts, products, or services." Well, last November Google announced that it was acquiring DeepMind Health in full and making it a part of Google proper. In the process, Google dismantled the independent review board. So what we're seeing here is that Google used the promise of privacy to persuade the UK government to allow a subsidiary company to acquire that patient data, and then, once the service was established, broke that promise by integrating the company anyway and taking full control of all that patient information. This is a perfect case study of the dangers of giving our data away to companies on the assumption that they will be good stewards, and that they won't abuse the power that we give them.

Daniel Forkner:

[46:57] Now, while Google still pretends that it protects that data, many other companies are just deliberate about sucking up as much data on us as possible. Health insurers are notorious for this. And so far we've been discussing data on our health specifically, but this is kind of the flip side: health insurers don't want to know just your health data but everything about you - how fast you drive your car, how much water you drink in a day, how long you spend watching TV, what you order at restaurants, how often you go out to the bar, how much money you have, if you're single or in a relationship, what you post to Facebook. All this data they want so they can calculate risk profiles on you and then charge you more on your premiums. Once again, here's an example from a great ProPublica report, where they highlight how this information is used, practically, by health insurers. Are you a woman who recently changed your name? You could be newly married and have a pricey pregnancy pending. Or maybe you're stressed and anxious from a recent divorce. That too, the computer model predicts, may run up your medical bills.

[47:59] Or are you a woman who's purchased plus-size clothing? You're considered at risk of depression - mental health care can be expensive. Or are you low income and a minority? Well, that means, the data brokers say, you are more likely to live in a dilapidated and dangerous neighborhood, increasing your health risks. And this goes on. Because, as we've discussed at length regarding the insurance market in the American health system, specifically in Episode 45 "Bill of Health," insurance companies compete with one another by identifying the healthiest members of the population and excluding everyone else from their services. They want more and more data, because that's what's going to give them the competitive edge over other companies. And it's what's going to enable them to figure out that they don't want to cover you anymore, at least not for a price you can afford.

David Torcivia:

[48:47] But, Daniel, we're not just talking about negative health effects here. Some of this data that these companies are sucking in and turning into valuable information should be something that we see as positive. And this is especially the case when it comes to pregnancy. And now, there's so much stuff here with pregnancy that we want to touch on; in fact, we're planning a future episode that's all about the politics and privacy of pregnancy. It's a really interesting topic which we can't wait to explore in depth, but we want to touch on some of it here because it's really important. The fact of the matter is, as soon as you get pregnant you are worth a lot of money. Or even before that: if you're trying to get pregnant, you're worth a lot of money. Because a pregnant woman is soon going to become a regular woman plus one child. And that child and that woman are worth tens of thousands, if not hundreds of thousands of dollars over the next 20 or so years of their lives. And this is an important market - and, yes, this is how they talk about it, as a market - for advertisers to capture and sell to the retailers that employ them. So it's natural that app makers, in particular, are desperate to get what information they can about you as a pregnant woman, to turn around and sell to these advertising companies. Literally everything that can be tracked, if you're trying to get pregnant or are pregnant, is being tracked.

Daniel Forkner:

[50:04] Well, let me give you some examples, 'cause there's a journalist, Kashmir Hill, who wrote an excellent report on this. She actually took one for the team: yes, she got pregnant and then decided to track how these apps were, you know, using her data. And it began even before she got pregnant, when she was just trying. She writes, "They asked me about my mood, when and how I was having sex, whether it was painful, my weight, whether I exercised, whether I got wasted or smoked, and of course, whether I was having a period and how heavy it was." And these questions are, of course, coming from apps that she downloaded that are supposed to help her get pregnant. And according to a representative of one really popular app called Glow, used both by women trying to get pregnant and by pregnant women, "Rather than look just at period start and end dates and cycle length in order to figure out the right day of ovulation, we examine the result of ovulation predictor kits, the consistency of your cervical mucus, your basal body temperature, as well as a multitude of other symptoms like cramping, bloating, anxiety, stress. In the Glow app your predicted ovulation day and fertility window changes as you enter this data, and the app learns from each cycle, becoming smarter and more accurate for an individual woman's next cycle."

David Torcivia:

[51:19] There's that machine learning coming in, trying to pull our privacy away for the sake of convenience.

Daniel Forkner:

[51:25] And when Kashmir Hill was finally pregnant, she writes that when interacting with these apps "I reported what I was feeling each day in a log, tracking my nausea, my weight, my appetite, feelings of movement, bloating, and on and on. I found the apps oddly addictive. When you're conducting a human science experiment inside of yourself, it's comforting to get updates on what's happening."

David Torcivia:

[51:48] Well, all of this is a lot of data, as you can very quickly see. But how that data is collected and used is where the story gets really interesting. So one popular app, called What to Expect, is actually owned not by a tech company but by a media company. And as soon as a woman creates a profile within this app, the company sells her email and her information to marketers and companies like Pottery Barn and Huggies. One user of the app had a miscarriage and deleted her profile, but nonetheless received a congratulatory package of baby formula in the mail, sent by a third-party retailer, the week she would have given birth. And what's worse, much of this information is not stored anywhere close to securely. Many apps have discussion boards, forums, or even just a page for personal notes, which all invite you to write down the details of your experiences, like a diary. But then much of this can be read by anyone willing to pay for it. "Many of the apps weren't using encryption to send my information along to their servers. That means the women writing in the apps' message forums about the weird things happening to their bodies, or how many times they've been raped, which unfortunately is a strikingly common conversation on the forums, could have had their messages intercepted by someone sharing their WiFi network, providing their internet service, and thanks to Congress recently overturning privacy rules for ISPs, that's information that Comcast, Verizon, or Time Warner, for example, could hypothetically collect, and use to target those women with ads."

[53:11] In addition, every app that tracks a woman's period is complicit in selling the data she enters to social media companies, data analytics companies, marketers, and more. Even more worrying, Kashmir Hill also found that Glow, the app we mentioned earlier that she was using, even sent her cell phone's unique serial number, something called the IMEI, to an ad company. Meaning, anyone with that serial number can track the phone, even in the real world, and even if the phone is factory reset.
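Why an IMEI is so valuable to trackers: it is burned into the phone's hardware, so unlike a resettable advertising ID it survives a factory reset. A toy sketch of an ad network's point of view, with made-up events and a well-known sample IMEI, shows how one stable key stitches everything into a single profile.

```python
# Hypothetical ad-network data: events reported by different apps on the
# same phone, keyed on the hardware IMEI. The IMEI is fixed in the device,
# so wiping the phone rotates the advertising ID but not this key.
events = [
    {"imei": "356938035643809", "app": "fertility_tracker", "event": "cycle_logged"},
    {"imei": "356938035643809", "app": "shopping_app",      "event": "viewed_cribs"},
    # ... factory reset happens here: ad IDs change, the IMEI does not ...
    {"imei": "356938035643809", "app": "news_app",          "event": "read_parenting"},
]

profiles: dict[str, list[str]] = {}
for e in events:
    # One stable key joins every app, and every reset, into one dossier.
    profiles.setdefault(e["imei"], []).append(f'{e["app"]}:{e["event"]}')

print(profiles)  # a single cross-app, reset-proof profile per device
```

This is why mobile platforms now restrict app access to the IMEI and push developers toward resettable identifiers; at the time of Hill's report, that door was still open.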

[53:39] As Kashmir observes, the most unsettling part of all this data being sold on her personal and intimate health is the fact that it's impossible to know how it will ultimately be used. "A huge problem with the data trading business is just how Kafkaesque it is. You don't know who knows what about you, or how it's influencing what you see, or how you're treated. I can't know the ultimate fate of the data I share with them. Such is the murky nature of privacy in a world where a seemingly endless network of companies you've never heard of are collecting information about you and trying to monetize it."

Daniel Forkner:

[54:14] I want to point out, David, I feel like there are a number of people who could hear this and say, well, this is why you don't give companies information that you don't want them to share. And I feel like this is especially likely to come from the men in the crowd on this particular subject: oh, you didn't want companies to know when you were having your period, so why did you share that? But that's a gross simplification of the problem, where once again the message fed to us is that it's always on the individual. Oh, why did you share that information with a company, that's your fault.

[54:48] But we have to recognize the manipulation that is going on behind the scenes here, which goes so much deeper than just the individual apps themselves. We have to peel back the veil even further to realize that we as individuals, living in modern society, have been atomized. Our society has broken down our communities, our solidarity. And so when we as individuals are at our most vulnerable, when we find ourselves in the transition of great change, when we face uncertainty, something like a pregnancy, so many of us genuinely want and need guidance. But because our society has eradicated so much of the community we used to turn to - or in the case of women, the communities that were turned to for generations for help with this important part of life - many people feel they have no choice but to turn to whatever is being offered on app stores, online forums, or any other similar place that might provide some guidance into what is going on. Kashmir Hill writes, "It pains me to admit that the apps were ultimately helpful in steering me through my first pregnancy. Proving that for me, at least, convenience trumped privacy. But now that I know the ropes, I would spare any future fetuses the pregnancy panopticon. The only privacy invasion they'll be subject to in utero will be the ultrasound."

[56:13] And so this is a topic for its own show, like you mentioned, David. We don't want to misrepresent this pregnancy issue, I mean, we're two men sitting here who haven't experienced it. But I think we did discuss at one point on this show how women's health, in general, used to be something that women themselves took control of in their communities, and how the midwife used to be an important pillar of generational knowledge for women. But that role was largely eradicated from society because of the male monopoly over scientific knowledge, and the propaganda that only an educated expert, locked behind paywalls of books and institutions, had the ability to facilitate women's health. The consequences of this, of course, include the feeling of isolation that many pregnant women feel, and may well extend into such disturbing trends as rising deaths among women while giving birth. And it doesn't stop there. Black women in the US, for instance, die giving birth at three to four times the rate of white women. I mean, think about that for a second. I wouldn't be surprised if the reason has something to do with the fact that the books on women's health have largely been written by white men for the past few centuries. But this isn't the focus of this show and we're getting off topic, so let's leave that for another time.

David Torcivia:

[57:29] We've talked a lot so far about the specific private companies that are abusing and exploiting all of our personal healthcare data and turning it into valuable, monetizable information. But the incompetence with our data, and the exploitation of that information about who we are, does not end in the private sector. In fact, it most assuredly continues anytime governments touch that same important information.

Daniel Forkner:

[57:54] As our listeners from Australia have pointed out to us, Australia has been receiving a lot of flak for its implementation of My Health Record, an effort to automatically enroll Australian citizens in having their medical information stored in an online digital database.

David Torcivia:

[58:10] Well on paper, like many of these ideas, that sounds great. Having a centralized medical information database that all your doctors can access, no matter what type of specialist they are or where you move to, is something that has a lot of value.

Daniel Forkner:

[58:24] I have my doubts, David.

David Torcivia:

[58:26] Well, bear with me on this one. Positive benefits aside, leading up to the implementation there had been multiple complaints that privacy would be violated, that the system was not secure, that teenagers would lose their medical privacy. And then in November of 2018, just a couple of weeks before the system was supposed to go live, the Australian Digital Health Agency's Director of Privacy resigned amid accusations that the Health Minister and the My Health Record agency were ignoring the advice of privacy experts. Well, a couple weeks later the program did go live, automatically uploading the health records of 17 million Australians, and then, just six weeks later, the database had been breached and several people had their health records viewed illegally.

Daniel Forkner:

[59:08] Exactly, David. See, this is why we talk about how valuable medical information is to companies, so they can sell it and so they can direct our behavior. And I feel like these pushes for digitization really just follow this model of: how can we get everyone's information into a place where it's that much easier to pry, that much easier to chop it up, organize it, sell it, whatever. And in that process all these systems are created that open the door for hackers and for nefarious people to get into that data. In 2015, for example, hackers figured out how to infiltrate hospital records through their weakest links. It turns out that hospitals have a ton of medical devices and equipment, like x-ray machines, MRI machines, and much more, that are easy to get into, and that's what these hackers did: they installed malware that combed hospital networks for patient medical records. Another example: on July 14th, 2018, hackers gained access to the network of the largest American diagnostics company, LabCorp. You might know this company if you've ever had to take a drug test as part of a job interview; I've done it. Well, hackers got into this network, and the attack is estimated to have exposed the records of millions of people who have gotten blood tests, urine tests, all kinds of tests from LabCorp.

[1:00:27] And so, I mean, I know you're trying to say, David, that there are a lot of benefits from having these types of databases, but I'm not convinced that the security risks don't outweigh the potential benefits.

David Torcivia:

[1:00:38] What if you could have your cake and eat it too here, Daniel?

Daniel Forkner:

[1:00:42] I'm sitting at a table outside, I have a black coffee mug, and I have a sign that says, "Change my mind."

David Torcivia:

[1:00:48] Okay. Well, there is absolutely undeniable value in having your health records follow you around. That information is very important, as we've talked about in this episode; everyone wants to get their hands on it, for good reason. But it should be most important to each of us, because that health record is something that can quite literally save our lives. Making sure that our doctors, healthcare professionals, and caretakers have access to that information is important to maintaining our health, and when we need serious intervention it lets them act appropriately and accurately. In that case, a single place where all this information can be accessed is hugely valuable, and this is what motivated Australia's My Health Record system. It's a noble goal. Of course, they very incompetently, and unsurprisingly, put it into practice, and it was doomed to failure because there is no such thing as a secure centralized system, unfortunately - we're all still looking for that golden goose. But really, when you think about it, the benefits here, Daniel, are very obvious. In the United States, where we don't have these centralized systems, if you go to a doctor out of state they have to call up your old doctor and try to get them to fax the paperwork over. I was trying to go to the doctor the other day, and I had to tell the receptionist to call up another doctor's office just to get a form faxed over. All these things take time away from the healthcare process, make it more complicated, and create work that really shouldn't have to be done, because this information should be able to be looked up instantly: type it in, there it is, ready to go.

Daniel Forkner:

[1:02:14] Okay, so that's a problem, but how do you get around that, how do you create a central system that is still secure, that you can protect?

David Torcivia:

[1:02:22] Well, we're all already individually carrying around a lot of information about ourselves. When you go to the doctor, if you have insurance, you're often bringing that insurance information with you. But what if, instead of just an insurance card, we also had some sort of information card, or drive, a thumb drive, whatever it is, that we bring in and hand to our doctor and say, "Here is my file, add your stuff to it." It plugs into their system, they have access to everything, they can see what my last doctor saw, they insert their new records, and that's it. It's updated, I take it back with me when I go home, and I'm on my way. Only the information they need is copied over, I control those things, and I have passcodes on the parts I don't want them to see. I don't want to get into the nitty-gritty details, but it is very possible to have a system that accomplishes what Australia's centralized approach was aiming for, but decentralized, in that we as individuals retain control over our healthcare information. That's really the crux of what we're trying to get across here, because, like I said, this information is important and valuable to each one of us, more so than to any advertiser or healthcare company. It is most valuable to us personally, and we shouldn't have to give up this valuable information just because someone else is trying to abuse it. If we can keep control of it and retain our privacy, giving it away only when we need to, now we're onto something. And that's really what we need to be moving towards with these systems - not handing over control of this value, but maintaining it personally.
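One way such a patient-held file could work is sketched minimally below in Python, using the widely available cryptography library: each section of the record is sealed under its own passphrase, so the patient decides which sections a given clinic can unlock. The section names and passphrases are hypothetical, and a real system would need much more (signing, auditing, revocation, backups); this only illustrates the "passcodes on the parts I don't want them to see" idea.

```python
import base64
import json
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    # Derive a Fernet key from a human passphrase with PBKDF2.
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=480_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))


def seal_section(data: dict, passphrase: str) -> dict:
    # Each section of the record is encrypted under its own passphrase,
    # so the patient chooses which sections a given clinic can open.
    salt = os.urandom(16)
    token = Fernet(key_from_passphrase(passphrase, salt)).encrypt(
        json.dumps(data).encode())
    return {"salt": base64.b64encode(salt).decode(), "token": token.decode()}


def open_section(sealed: dict, passphrase: str) -> dict:
    salt = base64.b64decode(sealed["salt"])
    raw = Fernet(key_from_passphrase(passphrase, salt)).decrypt(
        sealed["token"].encode())
    return json.loads(raw)


# Hypothetical record carried on the patient's own drive.
record = {
    "allergies":  seal_section({"penicillin": True}, "share-with-any-doctor"),
    "psychiatry": seal_section({"notes": "..."},     "only-my-psychiatrist"),
}

# A clinic given only the first passphrase can read the allergies section,
# while the psychiatric section remains opaque bytes on the same drive.
print(open_section(record["allergies"], "share-with-any-doctor"))
```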

Daniel Forkner:

[1:03:46] Well, I'll admit that's super interesting, and I think it makes perfect sense if it's technically feasible to put individual health information into the control of the individual themself. I think I've just been conditioned to think that if I want to access my health information I need to call a doctor, or visit a hospital, or something. I mean, I don't even know, exactly, how to do that.

David Torcivia:

[1:04:09] It's a lot of work. You have to call dozens of doctors and get them to send the information. They'll ask why you want to see it, you'll say you just want to see it, and they'll fight you on it, but eventually they have to hand it over. So you can do it, but ...

Daniel Forkner:

[1:04:21] And that's such a crazy idea. It's the information that is literally who you are. Why would that be in anyone's hands but your own?

David Torcivia:

[1:04:28] Well, maybe by mashing up some of these horrible privacy invasions with this centralized information, Daniel, we'll all have RFID chips inserted in our necks at some point, with our full medical record right there, available to whoever comes by.

Daniel Forkner:

[1:04:42] Well, the Swedes are already beating us to that, David.

David Torcivia:

[1:04:45] Maybe that's what the future looks like, Daniel.

Daniel Forkner:

[1:04:47] Alright, David, so what can we do?

David Torcivia:

[1:04:49] Well, I just gave my piece on what we might be able to do. But individually we can be very cognizant of the ways our interactions with technology can be used to monetize us and the way we live our lives. Yes, there's a lot of convenience in using devices like Fitbit, or applications like Glow, both of which we talked about in this episode. But at what cost? And at what potential cost, one that hasn't been realized yet, as Daniel alluded to? Yes, we may only be suffering additional advertising right now, but in a future where everything is defined by algorithms that nobody really understands or controls, having this information out there may be dangerous, something we should try to limit as much as possible in our day-to-day lives. So: making sure that when we install an app, we actually read the permissions and look at the privacy policy. And I know that's a lot of work, a lot of burden on each of us, but if that is too much for you, ask yourself, "Do I really need this device?" At the same time, look for apps and hardware devices that do not use our data as part of their business plan - devices that are sold to us specifically so that we maintain control of our data ourselves.

[1:06:00] And these do exist. They usually cost more, and they're harder to find. That's because the market has decided we'd rather sacrifice our privacy for convenience, or to save a few dollars in the process. But if we continue to support the people operating outside this standard "data is dollars" marketplace, then we might start seeing a shift towards that way of handling our information. This is something Apple has been trying to do with their phones, versus the way Google exploits every possible piece of information from our Android devices. Not to say that Apple isn't abusing data, but they're doing it on a much more limited scale than companies like Google or Facebook are. Being informed about all this is hugely valuable, because only as informed consumers can we understand what is really happening every time we install that app or put on that device.

Daniel Forkner:

[1:06:46] Well, David, I don't have anything to add to that. But I do want to leave the listeners with just a couple of things to think about. And that's the fact that all the surveillance we've talked about, this medical surveillance and this data harvesting, these efforts are often pitched to us as a minor inconvenience that we simply have to endure for a more secure world. It's framed as a trade-off between privacy and security, but the security we get in return is false.

[1:07:16] Bruce Schneier writes in his book Data and Goliath, "Security and surveillance are conflicting design requirements. A system built for security is harder to surveil. Conversely, a system built for easy surveillance is harder to secure. A built-in surveillance capability in a system is insecure because we don't know how to build a system that only permits surveillance by the right sort of people. We need to recognize that to society as a whole, security is more critical than surveillance. That is, we need to choose a secure information infrastructure that inhibits surveillance instead of an insecure infrastructure that allows for easy surveillance. By prioritizing security we would be protecting the world's information flows, including our own, from eavesdropping as well as more damaging attacks like theft and destruction. We would protect our information flows from governments, non-state actors, and criminals. We would be making the world safer overall." And so, that's just a concept to think about - that we don't have to trade off privacy and security. In fact, when we give away privacy we're also inadvertently increasing our insecurity, so it's a lose-lose situation.

[1:08:40] And one other thing, because this is health-related, and we have talked about insurance companies before: ultimately, insurance companies can only profit off of all this data and information about us because they exist in a for-profit, competitive marketplace where that's how you get a competitive advantage. There are representatives in the United States who are starting to wake up to what you said earlier, David, about how many Americans truly want a streamlined, single-payer, Medicare for All system in this country. And there are so many benefits that would be derived from that. One of them is disincentivizing all this collection of health and medical data that is being used to manipulate us. If we can disincentivize that from an economic standpoint, well, we'd be making our own medical information a little bit more secure. In addition to all the pricing benefits, and the fact that, like you said, our employers couldn't hold that insurance premium over our heads to keep us at our jobs, we should all really be holding our politicians accountable to supporting true single-payer, Medicare for All, universal healthcare coverage - and not one of these compromise proposals that would ostensibly provide universal coverage but somehow still allow these for-profit insurance companies to exist. A true system that covers all of us, a true single-payer system, precludes the very existence of any for-profit health insurer. And that's the future we truly need. But that's a lot to think about.

David Torcivia:

[1:10:22] As always, Daniel. But think about it, we hope you will. You can read more about all these topics, see all the sources we used, and read a full transcript of this episode on our website at ashesashes.org.

Daniel Forkner:

[1:10:38] A lot of time and research goes into making these episodes possible, and we will never use ads to support this show. So if you like it and would like us to keep going, you, our listener, can support us by giving us a review, recommending us to a friend, discussing these issues within your community, or supporting us directly at patreon.com/ashesashescast. Thank you so much to everyone who has been supporting us and jumping on board with that. We're very excited and we cannot wait to keep creating these episodes for your enjoyment.

David Torcivia:

[1:11:14] You can also find us on your favorite social media network @ashesashescast. Next week we've got another great episode, so we hope you'll tune in for that. Until then this is Ashes Ashes.

Daniel Forkner:

[1:11:26] Bye.

David Torcivia:

[1:11:27] Buh-bye.