Terms of Service

Chapters

  • 01:00 A Purpose?
  • 05:00 Evolution
  • 08:15 VPN
  • 10:09 Facebook in Business
  • 12:45 Unfairly Crushing Competition
  • 16:42 Providing Internet
  • 21:45 The Internet As A Right?
  • 23:29 News Feed Algorithms
  • 34:13 Psychological Experiments
  • 39:15 Think of the Children
  • 40:47 Ethics
  • 42:15 Slaves To Offensive Pictures
  • 44:34 Revenge Porn
  • 47:25 Trust And Security
  • 48:28 Facebook And Intelligence
  • 52:11 Censorship
  • 55:20 Elections
  • 59:19 ‘Don’t Use It’
  • 1:00:53 Ghost Profiles
  • 1:02:00 Real Life
  • 1:03:46 A Challenge Resolved
  • 1:04:27 Question Everything


David Torcivia:

[0:00] I'm David Torcivia.

Daniel Forkner:

[0:02] I'm Daniel Forkner.

David Torcivia:

[0:04] And this is Ashes Ashes, a podcast about systemic issues, cracks in civilization, collapse of the environment, and if we're unlucky the end of the world.

Daniel Forkner:

[0:14] But if we learn from all this maybe we can stop that. The world might be broken but it doesn't have to be.

David Torcivia:

[0:29] Today we’re going to be exploring the big blue social network that dominates every part of our life. Yes that’s right it’s Facebook time.

This is a continuation of our surveillance series, and we're going to take a quick aside just to look at this one particular company, because Facebook is so pervasive, so huge, and so important in this field that we really feel it needs its own dedicated show, and maybe not even one show but several.

Daniel Forkner:

[0:44] And to be clear David we're not going to be talking about social media itself right because there is a lot to say about how social media is affecting society and how it affects our relationships right? But this is literally just about what the company does and why they do it.

A Purpose?

David Torcivia:

[1:00] So before we start I mean at a very basic level let's talk about what Facebook is; what their purpose as a company is.

And that's to create a product to sell to someone. That's how they support themselves, that's how they make their billions and billions of dollars, and of course that product isn't a social networking platform as we like to think; we are the product.

[1:19] As the users, our data and our attention are what they're selling, and who they're selling to are advertisers.

Daniel Forkner:

[1:26] And when you say advertisers we don't want to limit that to just companies like Nike, like Whole Foods, like that new startup that's telling you why you need to order cats online right and ship them to your house because when it comes -

David Torcivia:

[1:40] Wait there’s a cats online startup because I sort of need that info.

Daniel Forkner:

[1:43] I'm sure there will be now that that brilliant idea that I just broadcast to the world is going to be stolen and - send me a check.

But we're not just talking about these types of advertisers that are trying to sell you a product because when it comes to using data to manipulate and influence people into doing certain things, there are many interested parties.

So these are governments interested in election outcomes; intelligence communities that want to control populations; militaries; police.

David Torcivia:

[2:12] Charities; groups that want you to support or not support a certain issue; the number of interested parties, the angles they want to take, and the different ways they want to manipulate you are enormous, vast, and all-encompassing.

Why Facebook?

Daniel Forkner:

[2:25] Well David if that's the goal of Facebook why are we using it in the first place?

David Torcivia:

[2:29] That's a very good question, and I mean initially the honest answer is we've sort of been tricked into it, and taken advantage of, and one of the ways Facebook does that is by designing itself to be so addictive.

Every single component of Facebook; the way the app is installed on your phone; the way they make you install multiple apps; the noises that these apps make; the colors of the notifications; the way the news feed lays out; the different things that you see that pop up; the fact that sometimes you get notifications about things that aren't even related to you, just to have that little red circle pop up that makes you want to click. All these things are built to take advantage of you. To make you automatically want to do these things. To build habits to open this app and check this, to click this button, to spend more time on Facebook's platform so they get as much of your attention as possible.

Daniel Forkner:

[3:14] And some of the people that have been involved intimately with the creation of Facebook, original founders, executives within the company, these very people are speaking out now and saying “things have gone too far.”

I mean Facebook's founding president Sean Parker, for example, says you know "we designed Facebook to consume as much of your time and conscious attention as possible, to give you that little dopamine hit every once in a while," because making you want to use it is going to create more content, creating this social feedback loop that's "exploiting a vulnerability" he says "in human psychology" to get you to use it more so they can collect more data on you.

David Torcivia:

[3:52] Former Facebook executive Chamath Palihapitiya, who led the company's user growth team, said that he thinks Facebook and his team and others like them created tools that are "literally ripping apart the social fabric of how society works" by exploiting these dopamine releases, by taking advantage of these attention feedback loops.

Daniel Forkner:

[4:11] And this has been so successful, Facebook has spread itself into so many areas of life - life in this country and life abroad – that a former privacy manager at Facebook says that the “company has reached so far into our world that it has no incentive to prevent abuse, and that nothing less than our democracy is at stake.”

David Torcivia:

[4:32] Those are pretty heavy, damning words from people who built this thing in the first place. So maybe we should start by examining exactly how Facebook got to be like this, and the many ways they spread out across the web and our very lives.

The first component of that of course is tracking.

Evolution

We all know Facebook tracks us. That's the name of the game when it comes to Facebook. They are a company designed to track, to suck in data, and then sell that data - and access to you - to advertisers. Now how do they collect all this data?

[5:03] The tracking is many-tiered. So first off is the data that we willingly give to Facebook. You make an account, you like pages, you add photos -

Daniel Forkner:

[5:11] I tell them my birthday, whether or not I'm single or seeing somebody, who's a member of my family, where I work, where I went to school,

David Torcivia:

[5:19] For people who have been on Facebook longer, in the past they even had your political leanings right there on your profile page. I remember that.

All this data is collected, stored, and this is stuff we give over willingly.

But this extends even to the things that we enter into Facebook in places that we think are private. So, our messages. When we open Messenger and type something to somebody, Facebook scans every line of that conversation looking for pieces of data they can store to update their information about you; looking for keywords that can trigger ads which then pop up directly in that conversation, or if not in the conversation, on your timeline when you click out.

And that starts getting into the more subtle aspects of all this data that Facebook is collecting, and into this question of "yes, we're typing messages into Facebook's system, but are these things that we are willingly giving up? Should we have an expectation that the conversations we have with people are private?" Well Facebook says "no, you're having this conversation on our platform; we are entitled to whatever it is that you say on this platform."

Every time you open up Facebook, that’s logged. How long you stay on a page, that’s logged. How long you spend looking at a post, that’s logged. How many times you click on other people’s profiles, that’s logged. How many times people look at your profile, that's logged. Places that you hover your mouse over, that's logged. Places that you hover your thumb on your phone, that's logged.
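To make the scale of this concrete, here is a minimal Python sketch of how such an event stream can feed a behavioral profile. The event names and storage here are hypothetical and the real pipeline is vastly larger, but the shape of the data is the same: every interaction becomes a timestamped row keyed to a user.

    import time
    from collections import Counter, defaultdict

    event_log = []                   # append-only stream of raw events
    profiles = defaultdict(Counter)  # user id -> counts per event type

    def log_event(user_id, event_type, detail):
        """Record one interaction and fold it into the user's profile."""
        event_log.append({"user": user_id, "type": event_type,
                          "detail": detail, "ts": time.time()})
        profiles[user_id][event_type] += 1

    log_event("u42", "open_app", "ios")
    log_event("u42", "dwell_on_post", "post:981 for 11.2s")
    log_event("u42", "hover", "profile:alice")

    print(profiles["u42"])
    # Counter({'open_app': 1, 'dwell_on_post': 1, 'hover': 1})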

Literally every single thing that could be tracked on this website is, and all of this is added up and used to build profiles about you. And that type of thinking extends to the point where they said "well you know what, it's not just this; we deserve access to information not just within our own platform, but in all these areas that depend on or surround our platform."

So what does that mean you might be asking? Let's look at this.

So you install the Facebook app on your phone. It asks for permission to view your contacts so that you can use Messenger. Well, now you've given your entire contact list to Facebook. They look at who you know - this network of people that you might not even be friends with on Facebook, but obviously know in real life - and they use that to update their social network web of who knows whom, who's related to whom, and more importantly, "how can we influence these people to do certain behaviors?"
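A minimal sketch, with hypothetical identifiers, of how uploaded address books can be merged into one social graph - note that an edge gets recorded even for contacts who never signed up:

    from collections import defaultdict

    graph = defaultdict(set)  # identifier -> set of known associates

    def ingest_contacts(uploader, contacts):
        """Merge one uploaded address book into the shared graph."""
        for contact in contacts:
            graph[uploader].add(contact)
            # The edge is stored even if `contact` has no account.
            graph[contact].add(uploader)

    ingest_contacts("alice@example.com", ["bob@example.com", "+1-555-0100"])
    ingest_contacts("carol@example.com", ["+1-555-0100"])

    # "+1-555-0100" now has two known associates despite never joining.
    print(graph["+1-555-0100"])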

On your web browser you're logged in to Facebook. On your desktop, on your laptop, you go to other websites, and these websites might have a Facebook login or a link that says "share with Facebook." You don't click on that, you don't worry about that, you say "it's not tracking me, I don't use this login," but the cookie from your Facebook login follows you around. It pings these little links and notifies Facebook: "oh, this person's on this website outside of Facebook, they must like these things. Oh, they searched for this, they must like that; they want more information on this, let me sell them these things."

It keeps going; it follows you everywhere; you can't escape this. Increasingly, Facebook is the web: as they extend these social tools, their login tools, the tracking widgets they give to other websites to make those sites' lives easier, well, it's all a ruse to get a closer idea of exactly what it is you're doing online.

VPN

In fact, Facebook is so desperate for this tracking data that they've introduced a VPN. That stands for Virtual Private Network, and what it means is that instead of connecting directly to a website - to Google, to whatever - your computer takes an extra step and first connects to this VPN, and then the VPN connects to the website, acting as a middleman that relays data back to you.

Daniel Forkner:

[8:36] And the purpose of that David is to protect your anonymity right? That's typically why people subscribe to VPN services so that they can access the web and different networks without giving up their IP address, this individual identifier that can give away who they are.

David Torcivia:

[8:53] Right, and along with that their browsing history. So this is something used oftentimes by privacy-conscious people, and it also protects you from things on the web; there are bad actors, there are phishers, there are scams, there are viruses, and VPNs help insulate you from that. There are lots of great reasons to use one, but Facebook's VPN, under the guise of protecting you from the web, is actually about tracking literally everything you do online - which is why you should always look for no-log VPN providers.

But Facebook's isn't about no logs; it's about all the logs. They record literally everything you do as you use this service, and what that means is that even websites that don't participate in this Facebook tracking - in this bastardization of the web - get swept up anyway: if you use this VPN service, Facebook has access to all of that traffic regardless. And of course, beyond all this, there's the data Facebook simply buys about us. Like we talked about in Episode 3 - Permanent Record, the data you generate just by living your life is collected, scooped up, concentrated, and sold by data brokers, and Facebook is one of the largest buyers of this data. Even if you are privacy-conscious and trying to avoid all this stuff, Facebook is still watching, still buying, still learning things about you, in the hope of ultimately manipulating you into doing something.
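To see the middleman pattern in code: a real VPN tunnels all traffic at the network layer, but routing an HTTP request through a proxy shows the same structure. This is only an illustration - the proxy address is hypothetical - and the point is that the operator of that box sees every request, so the only difference between a privacy tool and a surveillance tool is whether it keeps logs.

    import requests

    # Hypothetical middleman; a VPN plays the same structural role.
    proxies = {
        "http": "http://vpn.example.com:8080",
        "https": "http://vpn.example.com:8080",
    }

    # Whoever operates vpn.example.com sees the destination of every
    # request (and, without TLS, its contents). A no-log provider
    # promises to discard that information; a logging one keeps it all.
    response = requests.get("https://example.org", proxies=proxies)
    print(response.status_code)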

Facebook In Business

Daniel Forkner:

[10:09] This ability to collect information on us in order to manipulate our behavior is what has made Facebook so valuable, and Facebook has really spread its tentacles into so many places in our world. In particular, Facebook has dug itself pretty deep into the business community - so much so that very few companies don't interact with Facebook or pay for ads on the platform, and in many startup environments the entire business model sits on a foundation of Facebook and Instagram interaction.

I think what is most surprising to me is the fact that this isn't surprising anymore. I think we all took it for granted that business would naturally show up to a space where people interact with each other, but Facebook was at one point just a social media platform. I mean it's not anymore, and maybe it never was, but that's kind of how we saw it: a place where people can meet and share within this digital space without having to interact with marketers and promoters of business.

To give us a sense of the scale and influence here: Facebook and Google together control over 70% of the market for digital advertising, and many small businesses and startups rely on these platforms to grow. It's cheap and relatively easy to put ads on Facebook and then let the algorithms behind the scenes pick and choose which individuals to show those ads to.

There are different tools that can be used in this process. One is what Facebook calls "lookalikes." A company can pay to have this lookalike service incorporated into its digital advertising campaigns: it sends its customer email list to Facebook, Facebook identifies those individuals, and its algorithms target other people who are similar in behavior to those customers.
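Facebook hasn't published how lookalikes work internally, but the general idea - rank candidate users by how similar their behavior is to the seed customers - can be sketched in a few lines. Everything here, features and numbers alike, is hypothetical:

    import numpy as np

    def lookalikes(seed, candidates, top_k=2):
        """Rank candidates by cosine similarity to the seed centroid."""
        centroid = seed.mean(axis=0)
        sims = candidates @ centroid / (
            np.linalg.norm(candidates, axis=1) * np.linalg.norm(centroid)
        )
        return np.argsort(sims)[::-1][:top_k]

    # Hypothetical features: [minutes on site/day, ad clicks, pages liked]
    seed = np.array([[120.0, 9, 300], [90, 7, 250]])  # known customers
    candidates = np.array([[100.0, 8, 280], [5, 0, 12], [110, 6, 200]])

    print(lookalikes(seed, candidates))  # indices of the closest matches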

We know this process can create commercial bubbles around people. There's been a lot of discussion around intellectual filter bubbles and things like that, and as we discussed in Episode 9 - Nothing Left to Hide, we know how these algorithms work to exacerbate discrimination and inequality in things like predictive policing.

What’s actually interesting about how this digital advertising works with Facebook is that sometimes Facebook itself allows advertisers to take advantage of discrimination intentionally. In 2017 for example, advertisers had the option to target people who were labeled as interested in the following topics:

Quote: “Jew hater.” Quote: “How to burn Jews.” Quote: “History of ‘why Jews ruin the world.’”

Unfairly Crushing Competition

[12:45] Obviously we could say more about that, but what is perhaps more relevant to this show is how Facebook takes advantage of its surveillance and machine learning capabilities to absolutely crush competition and maintain market dominance, particularly in the startup community.

So it may be easier today technically to create a new business right? It's much easier to design an app and just promote that immediately than it was in the past, but at the same time it's become almost impossible for many start-ups to compete with Facebook and other tech titans that now overshadow Silicon Valley.

[13:21] To highlight how difficult it is to compete with Facebook: many executives at the company encourage their engineers to copy rival features that threaten Facebook's market share. To assist them, Facebook falls back on the massive trove of data it has on hand, using its surveillance and tracking tools to identify and monitor apps and companies that are growing quickly - and this monitoring isn't limited to rival companies, but extends to their customers as well. If Facebook can't acquire these new apps that threaten its market share - if it can't acquire them cheaply and early in the process - it will just copy the business and make it its own, putting that competitor out of business.

In one case Facebook copied the functions of a video conferencing app, and was able to develop its copy so quickly by directly interacting with and studying that app's users - because they were also Facebook users.

David Torcivia:

[14:14] Yeah I've actually got a friend who has this amazing new app, it's only available on iOS right now, it's like an augmented reality drawing, sharing, graffiti app; you stand there and you look around and you can place things in 3D space and draw shapes, and add text, and share that with friends and tag locations and stuff - it's really cool it’s really slick it’s called Mirage.

Well they picked up a lot of interest from these high-tech corporations. They've had interviews with Google, with Snapchat, with Facebook, under the guise of "we're interested in your product, we would maybe like to talk about acquisition or funding or something. Why don't you fly out, we'll have a meeting and talk about all this," and this is all arranged by these corporate research teams within these giant companies.

My friend, he goes out there, along with his business partner, they have great conversations, they talk about all their users, and the amazing application of this technology, and discuss all these details, and then they leave and they never hear back from these companies. And it turns out that these companies are reaching out specifically just to gather intelligence data and understanding of what this potential competitor app might be and what the team behind it is capable of.

So under the guise of “let's talk acquisition and funding,” it's really “I'm interested in what you are doing; I'm threatened, come in here so we can get a better idea and maybe let us steal your ideas.”

Daniel Forkner:

[15:31] And it's really easy for Facebook to do that right, because it's so easy for them to monitor every interaction that happens on and around their platform. They can see if an app grows from 5,000 users to 90,000 users in the span of a month, and by seeing this they can identify “hey this company is going to be successful; it looks like they're going to grow very quickly, but since they're still small - they don't have a lot of resources, they don't have a lot of funding - we can come in here with a larger engineering staff and just copy exactly what they're doing and just put them out of business before they even have the chance to get off the ground.”

But of course all this digital advertising and startup surveillance isn't limited to tech companies; it extends to retailers with physical stores too. We already touched on this David in an earlier episode, but because mobile tracking and facial recognition are becoming more pervasive, businesses can also pay Facebook to send ads to individuals who simply walk into their stores.

David Torcivia:

[16:26] And that's one of the ways Facebook has entered not just this digital realm of tracking that we know is happening, but also the physical world, like we discussed in those earlier episodes.

Providing Internet

And speaking of the physical world, Facebook is trying to move beyond just operating on the internet, going so far as to provide the internet itself.

[16:48] So in 2015 Facebook launched a new program aimed at, in its own words, quote "bringing internet access and the benefits of connectivity to the portion of the world that does not have them."

Daniel Forkner:

[16:58] Sounds pretty reasonable.

David Torcivia:

[16:59] Of course it does; it's a very admirable goal. We should want to extend this connectivity, all this information, this boundless knowledge that is on the internet, to as many people as possible. I agree with that completely. And Facebook here in the United States plays up this same sort of idea as a big proponent of net neutrality, saying "we should have equal access to the web no matter what."

But here's where the story gets weird, so let's look at this.

The name of this app - and remember, that's what it is, an app - is Free Basics, and Facebook is pushing very, very hard to get every individual in developing countries to download and use it, regardless of whether they have internet access or not.

When you download this app you don't actually get access to the entire internet. You can't enter URLs or search whatever website you want. What you do have access to though is a collection of third-party apps that Facebook has approved.

Now that doesn't sound very net neutrality like to me.

Daniel Forkner:

[17:55] You're not the only one who suspects that David, and a study was done in 2017 that looked at how the app functions in different countries - specifically Colombia, Ghana, Mexico, Kenya, Pakistan, and the Philippines.

And what they found is honestly funny in a way; it doesn't matter what country you're in. When you open the app you see the same list of 10 services at the top of the front page.

David Torcivia:

[18:20] Let me guess number one is it… let's see what app would Facebook put first? Oh I'm sure it's the boundless purveyor of knowledge Google, is that right?

Daniel Forkner:

[18:28] Close David, but you're wrong. It's Facebook.

David Torcivia:

[18:31] Should have guessed.

Daniel Forkner:

[18:33] The next service on the list is called Baby Center, which offers helpful advice on caring for infants -

David Torcivia:

[18:39] That doesn't sound so bad that's a good thing.

Daniel Forkner:

[18:41] - from your trusted Source Johnson & Johnson.

David Torcivia:

Oh. I see.

Daniel Forkner:

Following that is BBC News, then you have ESPN,

David Torcivia:

[18:48] Getting the important things out of the way.

Daniel Forkner:

[18:52] Then you have three services called Smart Business, Money Matters, and Smart Woman and they're all from the same company that offers you advice on financial decisions.

David Torcivia:

[19:01] Just what we need.

Daniel Forkner:

[19:03] Okay, so 8 out of the 10 services are made by for-profit companies based in the US - the BBC is obviously based in the UK - and Wikipedia is on that list as the only non-profit available. And despite the cultural diversity of the people who may be using this program, for most of these services English is the only language option. I think it's pretty obvious that this whole setup is an advertiser's wet dream, right?

So you mean I, Johnson & Johnson, get to present my brand exclusively to the untapped markets in developing countries? And on top of that I get full access to all the data associated with these individuals?

Sign me up.

David Torcivia:

[19:43] Yeah this is getting pretty far from that idealistic equal-connectivity-for-everyone thing that we talked about a minute ago that Facebook claims to be doing. And of course what's interesting about this is that letting people go to whatever websites they want doesn't cost any more than providing this curated list; the hard part is the connectivity itself. Once you have access to the internet, you're free to do whatever you want, and so this is an artificial garden Facebook has set up to control what people see - how they think - and of course to track all that data. There's no noble reason to restrict which websites people can see, much less restrict them to these for-profit, questionably motivated companies.

In fact, we're not the only people to complain about this. India was so concerned about the deployment of this program, and the conflicts of interest when it comes to Net Neutrality, that they banned Free Basics from the country.

Daniel Forkner:

[20:38] Facebook pushes back on this and they say “hey look this is not a violation of net neutrality. Isn't some internet access better than none?”

But the reality is, like you pointed out David, this is a closed system of limited services: Facebook divides apps into different tiers; disincentivizes competitive services - other social media platforms, like the Google you mentioned - from joining the program; controls what users see and don't see; and forces each service to route user traffic through Facebook servers so it can collect profitable data. And we don't need to go into great detail on this, but it's no small point that this also serves as a propaganda tool for Western and individualistic values.

The Internet As A Right?

I mean, imagine I'm a citizen of the Philippines: my only source of news comes from the UK, and my only source for sports is ESPN. It has become a common thing to say "well, free access to the internet should be a basic human right," and David, I used to fully support that, but lately my opinion has turned around in the face of these corporate practices, especially after learning about the things Facebook is doing. So now I posit the following: the ability to live without being forced to interact with and be exploited by these insidious companies should be a basic human right, and as we continue to dig into this topic David, we'll see if you agree with me.

David Torcivia:

[22:02] Challenge accepted Daniel let's take a closer look at all this and see exactly where we end up how about that?

Daniel Forkner:

[22:09] One of the big things that Facebook does David that I think is pretty insidious is the way they manipulate each user’s news feed so that different posts get different priorities.

[22:21] I’ve never been a huge user of social media platforms, I mean I post stuff every now and then when I think it's interesting and maybe two people - including my aunt - will give it a like. Shout out to my aunt, but I don't interact very much so I haven't really paid attention to how Facebook works.

I mean I have noticed that I tend to see posts from the same people on my news feed, that same 1% of all my friends, and major life events like engagements, new babies, things like that always seem to find their way to the top of the list even if I don't normally ever see posts from that person.

A little while ago I logged into my account, and while it was pulled up I hit the refresh button and I noticed that my newsfeed had changed. I was curious about this so I hit the refresh button again… and again…

And again.

And each time I got a different news feed. These weren't new posts that were just popping up; I might see a post at the top of this list - this new list that I had just refreshed – that was 10 hours old. So it became pretty clear that Facebook was deciding for me which posts I would and would not see.

I've been thinking about this and the more I think about it, the more it really bothers me.

News Feed Algorithms

David Torcivia:

[23:29] Well before we get into just why that bothers you Daniel, let's talk a little bit about why you see those different news feeds. And to do that we really need to explore what the news feed is, where it came from, and how it evolved over time.

So in the beginning - as they say - the newsfeed was very simple, straightforward, and easy to understand. The news feed was chronological. That just means as you posted something it would appear, and when someone else posted something after that, it would appear above it. It was very easy to use. Your friend would post something, it would show up, and you could scroll down your newsfeed and see what everybody you followed - everybody you agreed to be friends with - was doing and what they were up to, in the order it happened. And naturally as time passed and the news feed moved on, conversations would disappear and wane out, and you would have new content to look at. It made sense, everyone was happy; this was the beginning of the newsfeed, and the logical way to construct one.

But there's a problem: if I'm looking at this not from the perspective of a user but of somebody who wants to sell something, well, on a news feed like this, if I put a promoted post in it, it's going to very quickly get covered by new content. Think about a news feed like this in real estate terms: every single slot gets the same value, because they all can be pushed down by somebody posting later than you did, and that's not a good value proposition for Facebook.

So enter the timeline algorithm. In this first evolution of the timeline, we started to see a shuffled timeline, and Facebook pitched this to us in a way that said "well you know, there's a lot going on on Facebook all the time; the important posts that you might want to see might get lost in the shuffle. So we think we know better than you do what you want to see, and we're going to make sure we show you those things from the people, brands, and pages that you care about most."

But what it was really about was saying “if we can prioritize posts, put them higher up in the news feed, and make sure they don't get covered, we can charge more for those” right?
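The difference is easy to see in code. Here is a minimal sketch, with hypothetical weights: once posts are ordered by a score rather than by time, placement becomes something that can be sold.

    from datetime import datetime, timedelta

    now = datetime(2018, 3, 1, 12, 0)
    posts = [
        {"text": "friend's photo", "time": now - timedelta(hours=1), "promoted": False},
        {"text": "sponsored video", "time": now - timedelta(hours=10), "promoted": True},
        {"text": "friend's status", "time": now - timedelta(minutes=5), "promoted": False},
    ]

    # Chronological feed: newest first, every slot worth the same.
    chronological = sorted(posts, key=lambda p: p["time"], reverse=True)

    # Ranked feed: recency is just one signal, and promotion outweighs it.
    def score(post):
        hours_old = (now - post["time"]).total_seconds() / 3600
        return -hours_old + (100 if post["promoted"] else 0)

    ranked = sorted(posts, key=score, reverse=True)

    print([p["text"] for p in chronological])  # friend's status first
    print([p["text"] for p in ranked])         # sponsored video first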

Daniel Forkner:

[25:24] Yeah that makes sense.

David Torcivia:

[25:26] So the real estate market of this timeline was dramatically shifted, and this began the process of turning Facebook into the advertising juggernaut that we see today.

And this happened multiple times.

So first it was just a little bit at the top, and then more and more promoted content was pushed higher up, and your friends' content was pushed farther down. Then Facebook decided "well you know, posts aren't enough; we're going to become the platform of video," so videos were pushed higher up in this news feed algorithm to promote them, because of course Facebook could charge advertisers more money for video content, for video ads. Then they began auto-playing videos, because Facebook wouldn't charge for a video placement unless the video was playing. Well you might ask, "what counts as a play?"

At first Facebook wouldn't release that information; they would just tell advertisers "10 million people played your video, isn't that awesome?! It's so effective, give us more money!"

Well later on it turned out that the "play" Facebook claimed was three seconds of video playing, which isn't enough to sell anything.

Daniel Forkner:

Yeah if I was an advertiser I wouldn’t pay very much for that. Especially if my video is like, 30 minutes long.

David Torcivia:

[26:29] I don't know what you're advertising with 30 minute long videos, but

Daniel Forkner:

Well it’s very creative. It’s very engaging.

David Torcivia:

With lots of visual effects I’m sure.

Daniel Forkner:

Oh yeah.

Sorry, keep going.

David Torcivia:

[26:40] With that strange aside - of course, 3 seconds is about how long it takes for a video to scroll from the bottom of the timeline to the top while auto-playing. Hmmm, convenient.

Advertisers didn't like this; they reacted badly when they found out, and Facebook adjusted their playback statistics, but the damage had been done to video. But then somebody else came across video and said "you know what, we can promote our content better by turning it into video," and that, ironically, was the meme and picture content pages. They found that instead of just posting images and letting them float around wherever they land in the news feed algorithm, it works a lot better to turn these images into videos. So now you had all these static videos, with no actual motion, scrolling up but being ranked higher, because the algorithm prioritizes this content, thinking it's more valuable.

Facebook saw this, reacted, didn't like it, and so introduced machine learning to detect whether a video actually has motion or not; if it's just a static image, they prioritize it the same way they would a still image. Well, it's an arms race at this point, and the pages creating this static video content started adding digital effects in front of it - snow, transparent designs that add motion - and have effectively gamed the Facebook algorithm.
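How might such a motion check work? Frame differencing is one plausible approach - an assumption on our part, not Facebook's published method - and it also shows why the overlay trick succeeds: a transparent snow effect makes consecutive frames differ, defeating exactly this kind of test.

    import numpy as np

    def is_static(frames, threshold=1.0):
        """frames: a list of grayscale frames as 2D numpy arrays."""
        diffs = [np.abs(b.astype(float) - a.astype(float)).mean()
                 for a, b in zip(frames, frames[1:])]
        return max(diffs) < threshold

    still = np.full((64, 64), 128.0)
    static_video = [still, still.copy(), still.copy()]
    real_video = [still, still + 30, still + 60]

    print(is_static(static_video))  # True: an image posing as a video
    print(is_static(real_video))    # False: actual frame-to-frame change
    # A transparent "snow" overlay changes a few pixels every frame,
    # pushing the diff above the threshold and gaming the check.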

And this has continued for years between advertisers, Facebook, and ultimately the users - the products - that are the rest of us, and the Facebook news feed has gotten worse and worse over this time. Facebook is finally starting to realize that. In fact, last quarter, for the first time ever, Facebook saw a decline in US users.

Much of this is motivated by the declining quality of the timeline, so much so that Facebook has now started a new program to redesign the Facebook timeline in order to reprioritize content from friends and family.

Daniel Forkner:

[28:20] So David that sounds like I guess the evolution of this Facebook Newsfeed has kind of been going through a pretty complex like tug-of-war basically between Facebook, and advertisers, trying to figure out the right way to prioritize these posts to generate the most value, and I guess in this struggle the ultimate loser is the user.

David Torcivia:

[28:40] Yeah balancing with the user: “how much bullshit are they willing to tolerate, and look at while still using our website?” and I guess with this latest update Facebook realizes they had gone too far with the spam angle.

Daniel Forkner:

[28:53] And that's when they published their new update that is going to prioritize friends and family in the posts… I guess that sounds okay, and maybe we can talk about whether it really is or not, but I want to go back to that point I made earlier David, that the more I thought about this news feed manipulation the more it bothered me.

But because Facebook has been so normalized within our society, I actually struggled at first to understand why I was so bothered by this, but once I thought about it in terms of a real-world interaction, it kind of helped me visualize it better.

So imagine David that I throw a party okay? I invite all my friends -

David Torcivia:

[29:30] Awesome.

Wait does that include me?

Daniel Forkner:

[29:32] Yes David that includes you…

David Torcivia:

Oh good.

Daniel Forkner:

…and I host this party in some conference hall because I have a lot of people coming. We all arrive in this conference hall and start mingling about.

But then something happens. As I'm talking to you, an individual walks up between us, and says "Hi! This conversation isn't really optimal, so I'm going to have you come over here David. Jim is telling a great story that you're going to find really engaging and might spark a good conversation." And I say "well hold on, who are you to decide who we should interact with?" This individual says "Oh I'm the owner of this conference hall; I've had many parties in here, and I know what people like, and what will be meaningful to them."

So off you go David, and I look around and notice that this individual has been walking around, moving people and sectioning them off. All the charming people like Jim are in the center –

David Torcivia:

Or me.

Daniel Forkner:

We're reaching a little bit with this example now, David, but sure, you're in the center…

and that's when I look down and realize "oh no! I've been sectioned off into the corner with all the boring people!"

David Torcivia:

[30:41] It sounds like it's working just as intended.

Daniel Forkner:

[30:43] Can I uninvite you to my party?

David Torcivia:

[30:48] Like you need a conference hall for all your friends anyway.

Daniel Forkner:

[30:50] Okay well that's a silly example, but it does highlight the reality that in the real world, we would take offense at someone trying to control our interactions, but this is exactly what’s happening online, on Facebook’s social media platform.

David Torcivia:

[31:07] And as much of a fantasy as this story of yours might be Daniel, you did mention one part that I think really establishes a sort of truth in terms of what Facebook is trying to do right now, and that's the attempt to introduce people who are going to spark conversations. This is one of the goals Facebook laid out in its 2018 update.

Quote:

With this update we will also prioritize posts that spark conversations and meaningful interactions between people. To do this we will predict which posts you might want to interact with your friends about, and show these posts higher in the feed. These are posts that inspire back and forth discussion in the comments, and posts that you might want to share and react to.

Daniel Forkner:

[31:46] When I heard that initially David I thought it sounded reasonable. Like “oh that's nice they want to reprioritize what we see from promoters and advertisers to family and friends,” but that quote that you just read: “we will predict which posts you might want to interact with your friends about,” how is this different from that conference hall example? I don't need an advertising company to decide for me which of my friends deserve my attention.

David Torcivia:

[32:14] Well let's look at this from an even more insidious, perverted angle. If I'm Facebook, if I'm Mark Zuckerberg - presidential hopeful for 2020, I think at this point we can all agree this is going to happen - what if I want to encourage only certain conversations? Things about certain topics that I care about, that we at Facebook think are important, and make sure that only certain takes on these conversations are the ones that get promoted?

And it starts sounding conspiratorial, weird, and maybe a little out there, but this isn't unprecedented, is it? Now to be fair and play a moment of Devil's Advocate, we do have more info than ever before to scroll through, and Facebook already tries to prioritize some of it, like you mentioned - things like "congratulations": if that word appears in a post it'll automatically go higher up. Or I want to announce this news about my baby, or I'm engaged, or graduating, or I got a new job; Facebook says "people want to talk about this so I'm going to push it to the top of the timeline," and they're trying to help us out - quote - in those moments. There's too much stuff; a little bit of timeline curation might not be the worst thing in the world, right?

But let's be honest; navigating social relationships is just a part of being human. Deciding who to talk to, when to leave a conversation, what to talk about - these are part of what makes our social relationships so powerful. To quote Facebook again: "we will also prioritize posts that spark conversations and meaningful interactions." Well, deciding who should be our friends, who we want to engage with, and what counts as meaningful conversation; are these the types of decisions we really want to outsource to an intelligence agency-backed company trying to sell us products?

That’s who we should be looking to for meaning?

Why should I trust Mark Zuckerberg to decide what constitutes something worth having a conversation about? What topics, and what takes on those topics, are the things they want us to talk about and be influenced by? There's huge room for manipulation there, and do we trust Facebook to decide what we're exposed to?

Psychological Experiments

Daniel Forkner:

[34:13] David you're asking great questions. I love these questions; is Facebook who we want to look to to determine what is meaningful in our lives, in terms of our social relationships? To understand what is at stake when we ask that question, we need to be aware of what the company is doing behind the scenes and the actual studies they have been doing; the psychological experiments they’ve been carrying out to understand us on a most basic emotional level, and use those emotions against us.

David Torcivia:

[34:46] Wait. Actual studies, you said that. They run large-scale psychological experiments on their users?

Daniel Forkner:

[34:53] Let me just read you the abstract of this scientific study by Facebook researchers, which was published in 2014 based on an experiment run in 2012. Okay, are you ready for this?

In an experiment with people who use Facebook, we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed. When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.

David Torcivia:

[35:55] Okay that's a lot of words, but “emotional contagion” definitely stood out there as an alarming choice of words.

Daniel Forkner:

[36:02] What actually happened in this study is that for one week in January of 2012, these researchers manipulated the posts that people saw - a total of about 700,000 Facebook users were part of this experiment without their knowledge. They would log on to the site, and unbeknownst to them they were shown a larger portion of either positively or negatively charged posts. At the end of the week these data scientists analyzed how these people responded, and what they found once the study was over was a measurable change in the way these manipulated users posted to the social network - and it wasn't just what they were posting; it was literally their emotional state. What they measured was a change in the emotional content of these posts.

So what the study found is that they could actually change the emotional state of users by showing them different emotionally-charged content. And although other studies have been carried out to analyze Facebook data, this one was a little different from a typical study: while others just analyzed data that was already available, this one set out to manipulate the data, and that's not something that had been done before - at least not that we're aware of. I mean, the only reason we know about this study is because it was published in the Proceedings of the National Academy of Sciences.
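For context, the researchers measured emotional content with the LIWC word-counting software: a post counts as positive or negative based on how many words from fixed lexicons it contains. A toy version with a deliberately tiny, hypothetical lexicon looks like this:

    POSITIVE = {"happy", "love", "great", "wonderful"}
    NEGATIVE = {"sad", "angry", "terrible", "awful"}

    def emotion_rates(post):
        """Fraction of a post's words that are positive or negative."""
        words = post.lower().split()
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return pos / len(words), neg / len(words)

    # Averaging these rates over a user's posts before and after the
    # feed manipulation is how a shift in emotional state was measured.
    print(emotion_rates("what a wonderful great day"))   # mostly positive
    print(emotion_rates("feeling sad and angry today"))  # mostly negative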

David Torcivia:

[37:19] Okay let me stop here for a second. I'm not even going to touch the ethical issues of intentionally manipulating the emotional states of your users, but I want to examine for a second what you might be able to do with this knowledge.

So say I'm Facebook: I control the keys to this timeline algorithm - because we've moved past the chronological timeline; that day is over. The chronological timeline is neutral. It doesn't affect anything; there are no changes to what we see; it's based solely on the impartial arbiter that is "time posted." But now that we're directly manipulating what people see, well, in theory we should also be able to manipulate what people think and feel, and this study confirmed exactly that. So what do we do with this information?

Well say I'm Facebook. I realize people are more likely to buy certain products when they feel a certain way. Now I can start manipulating people individually to be happy, to be sad, to be angry, impassioned, or depressed, based on the content that I show them, and I can show that this works, with data. The potential for abuse of this knowledge, of this technology, is enormous, and something we'll explore later on in this episode.

Daniel Forkner:

[38:27] Facebook was actually investigated in connection with this study, because typically when you do a psychological experiment on people they have to consent to that, and at the time Facebook didn't have “research” as one of the things that they could use our data for in their privacy policy, although a couple months after the study Facebook did add the word “research” to their privacy policy -

David Torcivia:

[38:51] So ironically, if they had kept this data secret and used it only for Facebook's internal purposes, it would have been okay? But the fact that they published it, to let people know they could do this, made it a violation of that terms of use or privacy policy thing, is that right?

Daniel Forkner:

[39:05] I guess you'd have to look at the specific legality surrounding this, which I certainly don't understand, but that's a possibility, yeah. But let me point out one other concern that was brought up as part of this investigation, and that's that this experiment might have included minors below the age of 18, who normally would be held to stricter standards for psychological experiments.

Think of the Children

Speaking of minors and children, other experiments have been carried out, and the only reason why we know about them is because they were leaked. Like another experiment that was done on children in Australia and New Zealand.

Two Australian Facebook executives put together a document showing how Facebook had been targeting kids as young as 14 who were in a vulnerable state, using data to determine when young people feel such emotions as "defeated", "overwhelmed", "stressed", "anxious", "nervous", "stupid", and all these other emotional states typically associated with low self-esteem.

We already know that Facebook sells personal data to advertisers, but these types of insights show that it's not just data like you know age, location, and sex that Facebook is selling but information on how to exploit emotional vulnerabilities. Like when a person has low self-esteem, low body-confidence, or in general is just feeling depressed. Information that advertisers can then use to take advantage of people.

David Torcivia:

[40:27] And not just people, but children as well. In fact Facebook has a video messaging app specifically designed to target children, called Messenger Kids - I've never used it, I'm not entirely familiar with it - but you can see how they're thinking about how to take advantage of these different sectors of the population in very specific, targeted ways.

Daniel Forkner:

[40:47] David can we take a step back for a second?

David Torcivia:

[40:49] I will always take a step back with you Daniel what is it?

Daniel Forkner:

Let’s take two steps back then.

David Torcivia:

Don't go crazy.

Ethics

Daniel Forkner:

I want to raise a question about these studies, about these practices, about this children’s app, whatever. About how it relates to what we consider ethical in our society.

Because when the researchers, and Facebook, and outside observers discuss this study, ethics is an obvious point that gets discussed. But the discussion seems to revolve around things like legality, governing board approvals, and other bureaucratic processes and codes. Maybe it's time we re-thought our relation to "ethics," because ultimately what is right and what is wrong in a society should be determined by us, the very people who comprise that society.

If a bureaucrat in an office somewhere writes down on a piece of paper that "an experiment is ethical so long as X, Y, and Z conditions are met, and the privacy policy of the company clearly uses the word 'research' in connection with people's data..." well, at the end of the day, if we as people feel that this is unethical, and we don't like it, then it's unethical, and we should hold these companies to that standard. Why are we outsourcing our morality to companies and attorneys whose only relation to us is one of profit?

Slaves To Offensive Pictures

David Torcivia:

[42:15] Oh man Daniel I forgot something huge that I was supposed to send you on Facebook.

Real quick, to catch you up on this: Facebook is constantly scanning everything posted to the platform for offensive content. There's no nudity - you can't have nipples, because nipples are sinful or something - you can't have violence, you can't have whatever else is against their terms of use. And sometimes you hit that report button to let them know.

Well what happens when you press that report button? Let's look at the ethics of this for a second. Not whatever Facebook has decided is good or not, but who or what scans these things that we report, and decides what happens with them.

We might assume that this is actually some sort of machine learning - some ultra smart AI built entirely to detect penises, or whatever other types of nudity or offensive content are posted online - but we would be wrong. What Facebook actually uses is outsourced labor farms in places like Indonesia, like Thailand, where workers sit at computers 24 hours a day looking at pictures of offensive content, labeling them "yes this is offensive," or "no this is not."

Not only is Facebook arbitrating what is considered offensive, what is ethical, and what is okay to post, but they're pushing the emotional labor of scanning through these images - images that may be very hard to look at - onto people who do it 8 hours a day, 12 hours a day, every single day of their working lives.

You want to talk about something that’s unethical? Well there you go.

Daniel Forkner:

[43:47] I'm glad you brought me up to speed on that David.

That is a good thing to bring up, because it does raise that question: when we think about a company like Facebook, and we say okay, Facebook is making this decision about what's ethical and what should be on their site or not, it's not a company, it's not this faceless entity… it's human beings at the other end of that computer, right? Human beings who have to see this stuff and make those decisions. And again, as terrible as that is for them, it also comes back to the point that these are decisions being made by people who are not connected to this content in any personal way, and I think when we're talking about ethics, and making ethical decisions, those decisions should be made by the people who are directly affected, and not someone far removed from that reality.

Revenge Porn

David Torcivia:

[44:34] Well, to pivot the conversation for just a moment, and speaking of all this disturbing and nude content, let's look at an initiative Facebook is trying to pioneer all across the world, including a special program in Australia. That of course is their revenge nude program.

What is revenge porn, first off? If you have ever sent a nude photo to someone and then broken up with that person, or they're angry at you, or whatever it is - you are no longer in a good relationship with this individual - well, sometimes your ex, former lover, whoever it is, will share this nude photo of you with friends, or with the internet at large, in order to blackmail you, expose you, make you feel bad, or just as a way to get some sort of closure.

This might sound like something that rarely happens, but in fact 4% of US internet users have been victims of revenge porn, and that number climbs to almost 10% when looking at women under 30. Facebook is one of the major trading places for these revenge porn photos; in Messenger, in private messages, these photos are traded. So Facebook developed an initiative to try and do something about it.

This almost sounds so comical that I have no idea how it got out of a boardroom, but the basic idea was this: you take the nude photos that you don't want to get out, and you upload them to Facebook. That's right, let me say that again: you upload your nude photos to Facebook. They scan each photo using computer software which turns it into a hash - a hash is a big string of numbers and letters that identifies this unique picture; think of it like a barcode - and there are different ways of doing this so you can detect slight changes in the photos, so people can't adjust things slightly to get around the technology. The basic idea then is that Facebook scans every single image uploaded and traded on Facebook against this collection of blacklisted hashes, and if your revenge photo pops up, it automatically deletes that upload and maybe penalizes the user who traded it.
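To make the hashing idea concrete: a cryptographic hash like SHA-256 matches only exact copies of a file, while a perceptual hash survives small edits. Facebook hasn't published its exact algorithm, so here is a minimal sketch using a toy "average hash"; the file names are hypothetical.

    import hashlib
    from PIL import Image

    def exact_hash(path):
        """Cryptographic hash: changes completely if one byte changes."""
        with open(path, "rb") as f:
            return hashlib.sha256(f.read()).hexdigest()

    def average_hash(path, size=8):
        """Toy perceptual hash: one bit per pixel of a shrunken image."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        return "".join("1" if p > avg else "0" for p in pixels)

    blacklist = {average_hash("reported_photo.jpg")}  # hypothetical file

    def is_blocked(path, max_differing_bits=5):
        """Match an upload against the blacklist, tolerating small edits."""
        h = average_hash(path)
        return any(sum(a != b for a, b in zip(h, banned)) <= max_differing_bits
                   for banned in blacklist)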

Maybe that sounds fine if you understand the technology, but who is going to upload nude photos to Facebook? That sounds like a terrible idea, and it was; it was mocked in the media - well, in most places at least, but not in Australia.

Daniel Forkner:

[46:35] In late 2017 it was announced that Facebook would be partnering with the Australian government to pilot this program aimed at combating this revenge porn you’re talking about on Facebook.

Well this program that Facebook rolled out works like this: an individual concerned that someone might share a private photo or video on Facebook - which includes Messenger and Instagram - contacts the Australian government and fills out a form, and then sends the photo to themselves via the Messenger app. Then a human being working at Facebook - this is a quote, a "specially-trained representative" -

David Torcivia:

[47:10] Probably one of these people in these offshore data centers. In Indonesia, in Thailand, wherever.

Daniel Forkner:

[47:15] This individual will review that photo or watch the video, and then convert that image to a numerical code – like you said this is called hashing – and that can be used to identify the photo later and prevent it from being shared. Then Facebook deletes the original photo and notifies the individual to delete it from their Messenger app.

Trust And Security

I think we should look at the big picture here. Because of the far-reaching influence of tech giants like Facebook, we are forced more and more to rely on them for our security. And entrusting Facebook with our private data for the purpose of protection is very ironic, because not only are we addressing symptoms that in many cases are caused by Facebook in the first place, but we are entrusting our data to a company that - as we've talked about - has a long and storied history of abusing user data, exploiting its users, manipulating its users, and in general just treating its users like dumb commodities to be bought and sold.

Maybe in a perfect world an algorithm that can detect child pornography, and prevent revenge porn, is a nice idea. But given its nature, do we really want Facebook to be the one pioneering and implementing that?

And similarly, companies like Microsoft and Google?

Facebook And Intelligence

David Torcivia:

[48:28] Those are excellent points, and to really illustrate how far the idea of security has passed into the hands of business, well, consider that the White House, the Justice Department, and intelligence community officials have been meeting seriously with advertisers and technology executives since late 2015 to work out partnerships to combat the spread of extremist ideology.

And now Facebook, along with Microsoft, Google, and Twitter, are all under pressure to deploy technology that works similarly to that proactive revenge porn software, to identify and suppress content considered terrorist-related. This all sounds similar to the predictive policing stuff we discussed in earlier episodes, and it should, for good reason.

One of the main proposals under consideration would create a central hub, called the National Office for Reporting Extremism - that's a name right out of 1984 - which would house a shared database of terrorist-related content and prevent that content from being shared across tech platforms like Facebook.

Now the emphasis has always been placed on organizations like ISIS, but that's just the easy example to use when you're trying to sell a program like this. This goes back to earlier discussions we've had about what the definition of a terrorist is.

We already covered this story in Episode 11, but it's worth briefly mentioning again.

Daniel Forkner:

In 1954, the US government overthrew a democracy to install a dictator, to deal with peasants who threatened the profits of an American business. And they justified this by calling the peasants terrorists - at the time they used the word communists, but it's the same effect. But we need to remember, this story isn't particularly remarkable from a historical perspective, because we have done the same thing over and over again, and continue to do it. Just two decades later we were doing the same things in South America: overthrowing democratically elected governments by backing military juntas to come in, take over, and implement the economic policies we wanted. And always accompanying these economic policies were the kidnappings, tortures, and killings by the governments and US-supported death squads.

Who were the victims of these killings? The quote "disappeared" as they were referred to by other citizens? They were teachers. Social workers. Union leaders. Anyone who still supported the policies of the old democracy. In one Argentine case - this was in 1976 - a group of high schoolers got together and tried to organize for a cheaper bus fare so they could get to school. The junta responded by kidnapping, torturing, and murdering them.

Look, the point is, we went into Guatemala and destroyed that country, calling everyone else terrorists in the process, to save the profits of a company that sold bananas. Well Facebook is an American business that has 2 billion active monthly users worldwide. When you add Google and Microsoft to that, you are talking about serious institutions, and serious money and control at stake. If those Latin American peasants were terrorists for threatening a company that sells bananas, how much more will people be labeled terrorists in other countries, and in our own, for resisting the influence of these tech giants?

And you know when these conversations come up, ISIS is the example used by these companies, but in practice what does this mean? We are already living in a world of commercial bubbles, information stratification, news filters.

We may be on the precipice of an even greater censorship.

Censorship

David Torcivia:

[52:11] Now maybe we're a little bit hypocritical after making this episode, but we do have a Facebook page. Of course I in fact reactivated my Facebook account just to create this Facebook page and manage it for this podcast – we’re ashesashescast, look us up, subscribe.

But what's going to happen to our page, to our exposure, to our users after we post this story and these links about how terrible Facebook is? Well if their algorithm is halfway decent it’s going to discourage the sharing of this content, and make sure fewer people see it, fewer people realize all the terrible things Facebook does to us.

But maybe it isn’t just this case; think bigger than that. Imagine a group of concerned citizens, of mothers, who are starting a campaign to get their children to use mobile phones less.

[52:56] Because more and more evidence is showing just how bad social media is for our minds, and especially young developing minds - something we’ll be exploring in future episodes - and so they want to discourage this harmful thing from affecting their children. And a great place to reach out to children, especially ones whose self-esteem might be suffering because of this social media, is on that social media itself, using the advertising tools that Facebook provides specifically to target these children.

But if this is content that might hurt Facebook and the consumption of Facebook, well, they might use the same tools that censor terrorism and other unwanted content to censor things that hurt Facebook's bottom line.

Daniel Forkner:

[53:34] David that's a good example, but we don’t need hypotheticals. This is exactly what is happening right now. Facebook is already doing things like suppressing individuals that repressive governments disagree with, and interfering with elections all over the world.

David Torcivia:

[53:49] Facebook has always had a very long and tight relationship with the governments of the world, especially their law enforcement and intelligence departments. So in places where these branches of government extend their reach into the very small and detailed parts of people's lives, well Facebook is there with them. Maybe most notably in Israel and how they deal with Palestinians living there.

The Israeli government frequently comes to Facebook and asks them to remove accounts, posts, information from users, from Palestinians; posts that Israeli law enforcement thinks might incite violence.

Meanwhile, in a double standard, there are many Israeli posts calling for direct violence against Palestinians that remain on Facebook: not pulled, not punished, nothing.

Daniel Forkner:

[54:34] So that censorship of one group, for something that another group is directly doing, represents kind of a hypocrisy by Facebook, saying “who has the power in this situation? Which government, which institution is the strong party in this relationship? Because that's who we're going to listen to.” And it's pretty obvious but it's worth pointing out that it's not just unfair punishment of individuals by blocking them; it's also a censorship that deprives all of those individuals’ followers of hearing their message. In some cases these individuals who are blocked from Facebook in this hypocritical landscape have millions of followers who rely on them for news and information about what's going on in the world, because they have no other source.

David Torcivia:

[55:17] They have no other source because Free Basics internet limits them only to Facebook as a news source! They can either get the latest ESPN scores or just look up this blah blah blah.

Elections

What's maybe even more concerning than this direct involvement between governments and Facebook is that Facebook brags about being able to create these governments in the first place. So in 2016, Facebook’s marketing department came out with a press release bragging about their ability to swing elections. They referenced the reelection campaign of Senator Pat Toomey as a perfect example of the ability of their platform to swing votes.

Daniel Forkner:

[55:54] Well David a lot of that was driven by digital advertising, and in the current legal framework it's not illegal to purchase these political ads and try to influence people's decision about who to vote for, but there's a much larger picture that's going on which is that these algorithms behind the scenes might be pushing these campaigns in ways that aren’t immediately obvious but have enormous implications for democracy.

[56:17] In 2010 there was an experiment that was run on Facebook to determine if it was possible to get people who would otherwise stay at home to get out and vote. To evaluate this idea, the test users were separated into three groups.

Group A was a control that saw nothing. Group B saw a standard ad that just said “go out and vote” or something like that. Group C saw an interactive graphic showing polling locations, the pictures of friends who had voted, and a query to answer whether you had voted or not, so that your picture could be included alongside your friends.

[56:52] And this experiment found that those in group C who saw the graphic including their friends were far more likely to vote, and in this particular experiment it is estimated that 400,000 people were influenced to vote who otherwise wouldn't have.
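
As a rough sketch of the mechanics of an experiment like that - with hypothetical names and numbers throughout, not Facebook's actual code - deterministic assignment to the three arms and a back-of-the-envelope estimate of the turnout lift might look like this in Python:

```python
import hashlib

GROUPS = ["A_control", "B_plain_ad", "C_social_graphic"]

def assign_group(user_id: int) -> str:
    """Deterministically assign each user to one of the three arms."""
    digest = hashlib.sha256(str(user_id).encode()).digest()
    return GROUPS[digest[0] % len(GROUPS)]

def extra_voters(turnout: dict[str, tuple[int, int]]) -> float:
    """Estimate extra voters produced by the social graphic vs. control.

    turnout maps group name -> (number who voted, group size).
    """
    voted_c, n_c = turnout["C_social_graphic"]
    voted_a, n_a = turnout["A_control"]
    lift = voted_c / n_c - voted_a / n_a  # difference in turnout rates
    return lift * n_c                     # scaled to the treated group

# Purely illustrative numbers, not the study's actual data:
example = {
    "A_control":        (590_000, 1_000_000),
    "B_plain_ad":       (592_000, 1_000_000),
    "C_social_graphic": (598_000, 1_000_000),
}
print(f"Estimated extra voters: {extra_voters(example):,.0f}")  # ~8,000
```

Hashing the user ID, rather than calling a random generator on every page load, keeps each user in the same arm for the life of the experiment.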

The implications for this are huge, because Facebook already has a ton of data on everyone, and it's not hard to figure out someone's political leanings right? So the scary application is that Facebook could digitally gerrymander its users during a particular election and mobilize those with a political ideology Facebook favors to go out and vote while ignoring the people Facebook disagrees with.

No one would even know, because the people Facebook is trying to suppress aren’t even aware that this prompt is something they’re missing out on.
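
And to underline how little machinery that selective mobilization would take - again a purely hypothetical sketch with invented names, not anything Facebook has acknowledged doing - gating the proven turnout prompt by inferred ideology could be as simple as:

```python
# Hypothetical sketch of "digital gerrymandering": show the turnout
# prompt only to users whose inferred politics the platform favors.

users = [
    {"id": 1, "inferred_leaning": "party_x"},
    {"id": 2, "inferred_leaning": "party_y"},
    {"id": 3, "inferred_leaning": "party_x"},
]

FAVORED = "party_x"  # whoever the platform (or an ad buyer) wants to win

def should_see_vote_prompt(user: dict) -> bool:
    """Silently gate the mobilizing graphic by inferred ideology."""
    return user["inferred_leaning"] == FAVORED

for user in users:
    if should_see_vote_prompt(user):
        print(f"user {user['id']}: show 'your friends voted' graphic")
    else:
        print(f"user {user['id']}: show nothing - and leave no trace")
```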

[57:44] Of course it’s not just Facebook itself that poses this digital gerrymandering threat, but anyone who can purchase the kind of data that Facebook or another data broker has regarding people's political leanings - which again is not hard to figure out; sometimes people even publicly express their political leanings - and then pay for the type of ads known to sway elections.

David Torcivia:

[58:06] We're not going to address all the stuff currently happening in the media about Russia, Facebook ads, troll Farms, or all these things that may or may not have swung the 2016 presidential election.

But looking forward to 2020, if Mark Zuckerberg decides to run for president, are we going to trust the person who owns Facebook not to use these tools that can swing elections by 2 percentage points whichever way Facebook wants? And if Zuckerberg doesn't run, or isn’t chosen as a candidate, who's to say he can't go to whoever is running and say “I can guarantee you a 2% swing” - something that would have made the difference for Trump or Clinton in 2016 - and guarantee they get that swing in their favor and become president, in exchange for whatever favors, contracts, information, or money Facebook wants?

Not just in the United States but in every nation around the world. Every place an election occurs, this could be happening, and I think we would be naïve to assume that it isn’t. We have entrusted the democracy of this world; of our countries; of control of our very governments to Facebook and these black box algorithms that control what we see and attempt to manipulate our behavior into the real world.

If that's not alarming; if that's not the beginnings of a cyber dystopia; well I don't know what is.

‘Don’t Use It’

Daniel Forkner:

[59:19] And it's not so easy to just say, “well David if you don't like it just don't use it; don't put something on Facebook unless you want to share it with the world,” but the reality is that so much of our world is now integrated with this social platform that it's not so easy to just say “I will never use Facebook,” when many apps require Facebook accounts just to use them; when all your friends are making plans and announcing events via Facebook;

David Torcivia:

[59:42] Well like I said, I didn't have Facebook for several years; I just got sick of it and deleted it, and during that time I missed out on so many events, because they would only be posted to Facebook. The only way to organize, to see things going on, was on the Facebook event page. And I was cut out from a lot of this. Unless somebody told me about these things in person I would never discover they were happening, because people have given over control of their social lives, their social activity, even the events they plan, to Zuckerberg; to Facebook.

Daniel Forkner:

[1:00:13] At least that's how Germany's Federal Cartel Office sees it. In an investigation it brought against Facebook, it described the company as extorting information from people, using the fear of social isolation to get them to agree to give up data they would otherwise not agree to - basically saying that Facebook holds a monopoly on social interaction for many people. So we can't just say it's the individual's responsibility not to let Facebook take advantage of them, when so much psychological manipulation is at play to get people to depend on these platforms.

Ghost Profiles

David Torcivia:

[1:00:53] And even beyond all that, even if you are one of those people who never had a Facebook account or were able to delete it recently, there are still ghost profiles out there.

These are profiles that Facebook builds up about people it knows aren’t Facebook users but exist in real life. So those contacts your phone uploads when you give that permission to the Facebook app - all of these are cross-referenced, and if somebody doesn't have a Facebook account, if a phone number or name isn't associated with any existing account, Facebook creates a profile for that person anyway. It's not visible, it doesn't live on the Facebook website that we can see, but in the back end, in their vast datastores, they know this person exists, and as your friends, family, and co-workers give more information to this Facebook machine, this ghost profile gets more developed and elaborate.
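
As a purely hypothetical illustration of that cross-referencing - none of these structures or names are real Facebook internals - contact-upload matching, with ghost profiles created for the misses, might look something like this:

```python
# Hypothetical illustration only: uploaded address books are checked
# against known accounts, and ghost profiles are built for the misses.

known_accounts = {"+15551234567": "existing_user_42"}  # phone -> account
shadow_profiles: dict[str, dict] = {}                  # phone -> ghost data

def ingest_contacts(uploader: str, contacts: list[dict]) -> None:
    """Cross-reference each uploaded contact against known accounts."""
    for contact in contacts:
        phone = contact["phone"]
        if phone in known_accounts:
            continue  # a real account exists; nothing new to create
        # No account found: build or enrich a ghost profile anyway.
        ghost = shadow_profiles.setdefault(
            phone, {"names": set(), "seen_in_contacts_of": set()}
        )
        ghost["names"].add(contact["name"])
        ghost["seen_in_contacts_of"].add(uploader)

# Each friend who uploads their address book fleshes the ghost out more:
ingest_contacts("friend_1", [{"name": "Dave T.", "phone": "+15550000001"}])
ingest_contacts("friend_2", [{"name": "David Torcivia", "phone": "+15550000001"}])
print(shadow_profiles["+15550000001"]["names"])
# -> {'Dave T.', 'David Torcivia'}
```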

Facebook's advertising and tracking have infiltrated the entire web, and can spread that advertising, and those attempts at manipulation, to places that aren't even in the Facebook world.

[1:01:44] So is it enough just to run away, to hide, to delete your account, to cut access to Facebook? Not entirely, though if we all did that it’d definitely be a step in the right direction. We need to start questioning our relationship with Facebook, with social media, and with networks like Instagram and WhatsApp as well.

Real Life

Daniel Forkner:

[1:02:00] One thing we can think about is trying to move, as best we can, our social interaction back to the real world, or at least putting more value on those real-world interactions. Because maybe we can't get completely away from these digital spaces, but if we put more value on our real-world interactions - so that when we are seeing someone in person we're not also checking Facebook at the same time, right? - maybe some of this digital manipulation will lose some of its potency and its bite.

David Torcivia:

[1:02:29] And a few practical things that are easy to do: first off, take the Facebook app off your phone. Don’t use Messenger. Switch back to text, or an alternative like Signal, which I highly recommend as a messaging app.

Try switching your phone to a monochrome screen. This is actually something cool that I've been playing with, and you find you reach for your phone less often when the screen is black and white.

Daniel Forkner:

[1:02:50] Yeah David this is something you told me about and I've been trying this. Basically it just turns your screen black and white so everything on your screen is black and white and the idea is -

David Torcivia:

[1:02:58] Say black and white one more time.

Daniel Forkner:

Black and white.

Of course the idea is that, as part of the effort to make these apps and social media platforms addicting, colors and icons play a big role, so reducing the color makes everything a little less interesting and maybe a little bit less addicting.

David Torcivia:

[1:03:17] There are lots of little steps you can take in this process of weaning yourself off social media, and as you make it harder and less convenient to always just open your phone, tap a button, and be scrolling Facebook before you even realize what you're doing, well, with a little time away you realize very quickly that “I don't get anything out of this; it's not helping me - maybe it's making me feel worse in fact - and there are better ways to communicate with these people, with my friends, with my family, with my communities, than on these social media websites.”

A Challenge Resolved

Daniel Forkner:

[1:03:46] So David, I challenged you earlier in this episode to see if you would agree with me that living without being forced to interact with these types of companies should be a basic human right. Now after everything we've discussed do you agree with me?

David Torcivia:

[1:04:01] Daniel after careful consideration and non-biased analysis of all the data that we've looked at here, I think I have to agree with you.

Listeners, I have to admit something to you: this challenge was a ruse; I was on Daniel’s side the whole time.

Daniel Forkner:

What would we be without a little bit of humor in these dark times?

David Torcivia:

[1:04:24] Dark Times; dark humor.

Question Everything

[1:04:27] Like we talk about at the end of every episode, question all these things. Question these websites; question the tools; question what we’re shown, why we're shown it. If you continue to do that you will be a more aware consumer, individual, and citizen of this world, and start to work towards moving your social relationships off the internet and back into the real world and developing these communities with those around you; your family, your friends, your neighbors, people in the real world that support and depend on you.

Daniel Forkner:

So that wraps up this week; we hope you've taken something away from this episode. We've enjoyed it, and we've got a great one coming up for you next week, turning back to a climate-related issue. We might have a special guest, and we really hope that you join us.

David Torcivia:

If you want to learn more about any of the things we talked about today, well you can find lots of links, sources, and more information as well as a full transcript of this episode on our website at ashesashes.org.

Daniel Forkner:

[1:05:23] A lot of time and research goes into making these episodes possible, and we will never use ads to support this podcast. We will also never purchase digital ads like the ones you can buy on Facebook to help promote this podcast - despite how effective that might be.

So if you enjoy this podcast and would like us to keep going, you can support us by giving us a review and recommending us to a friend. Also, we have an email address: contact@ashesashes.org. We encourage you to send us your thoughts - positive or negative, we’ll read it - and if you have a story or something related to this episode, send it to us and maybe we can include it in a future episode or share it on our website.

David Torcivia:

If this episode hasn’t made you delete all your social media accounts, well we're on all your favorite social networks @ashesashescast.

That wraps it up for this week. Until next week, this is Ashes Ashes.

Daniel Forkner:

Bye.