July 7, 2021

Alternative Social Media with Robert Gehl


Facebook, Instagram, Twitter, LinkedIn, Snapchat … these and other social media platforms allow us to keep up with old friends and make new ones, let us expand our knowledge and stay informed about current events, and even give rise to important social movements. But social media are addictive, manipulative and the linchpin of the surveillance economy. Their algorithms define what it means to be social. Social media also, to paraphrase our guest, valorize the sensational over the sober. But if we want to live digital life to its fullest, what choice do we have? Well, our guest, Dr. Robert Gehl, is here to tell us that we do have choices, in the form of alternative social media. 

Transcript

Craig: 0:06 

Facebook, Instagram, Twitter, LinkedIn, Snapchat. These and other social media platforms allow us to keep up with old friends and make new ones, let us expand our knowledge and stay informed about current events, and even give rise to important social movements. But social media are addictive, manipulative and the linchpin of the surveillance economy. Their algorithms define what it means to be social. Social media also, to paraphrase our guest, “valorize the sensational over the sober”. But if we want to live digital life to its fullest, what choice do we have? Well, our guest, Dr. Robert Gehl, is here to tell us that we do have choices in the form of alternative social media.

 

Andrea: 0:47

Dr. Robert Gehl is the F. Jay Taylor Endowed Research Chair of Communication at Louisiana Tech University and an alumnus of the Fulbright Canada Research Chair program. Dr. Gehl’s research involves network cultures and technologies, the dark web and alternative social media. He is the author of several books, the most recent of which is “Weaving the Dark Web”, published in 2018 by MIT Press. His 2014 book, “Reverse Engineering Social Media”, won the 2015 Association of Internet Researchers Nancy Baym Book Award. In addition to his books, Dr. Gehl has published numerous academic and practitioner articles. He has appeared on CBC, Deutschlandradio, MSNBC, NPR and now the Rational Ignorance Podcast. Dr. Gehl, welcome to our podcast.

 

Rob: 1:38

Thank you, thank you for having me.

 

Craig: 1:40

It’s OK to call you Rob? We’re all doctors here, so… OK.

 

Rob: 1:45

Yes, yeah, yeah.

 

Craig: 1:47

Doctor, doctor. Before we get started, I just want to say that I love your website, especially the ability to change the style. You know, when I looked at the 1990s style it made me think of Homer Simpson’s website, which consisted of nothing but animated “jifs” or “gifs” or however you wanna say it. But my favorite is the 1980s style, because that’s when I got started in computers and it gave me a warm fuzzy about my younger days.

 

Rob: 2:14

Same here, actually. Using IBM clones back in the ’80s, yeah.

 

Craig: 2:20

Yeah, yeah.

 

Andrea: 2:23

Ok, well, speaking of changing things up: that is what is happening with social media right now. It’s clear that social media has had a huge impact on much of the world, and social media platforms such as Facebook, Twitter and Instagram have grown to enormous size. For example, in December 2020 Facebook had an estimated 1.8 billion daily active users. Twitter sees over 500 million tweets per day, although it’s hard to know how many tweets are from people and how many are generated by bots. But traditional social media has its problems. Rob, what do you see as the main problems with traditional social media?

 

Rob: 3:00

Yeah, the problems of corporate social media, where do we start? These companies like Facebook, which owns Instagram and WhatsApp, Twitter, all the Google properties, they’re built to monitor our activities. Everything we do, where we go, who we connect with, what we say, our interests, our desires. And all these data are analyzed and sold. And to keep us engaged, they not only use dark patterns meant to keep us online as much as possible, and a lot of people have written about these dark patterns. They also, and this is the thing that really gets me, they rely on our friends and family to keep us online, right? Our friends and family post things and we pay attention to them and we respond to them. The companies themselves don’t produce any content, we’re producing the content for free. So our friends and family, and we ourselves, are pouring emotional labor into these systems. Sharing our lives, liking things, connecting, we’re drawn in further and further. And as a result of more than a decade of this, we have these massive globe-spanning companies. Just yesterday as we record, the US Congress was updating antitrust laws in direct response to the size of these companies: Facebook, Google, Apple and Amazon. And because these companies gather so much data, they’re ripe targets for governments who want information on citizens. So in the US alone, not to speak of any other jurisdictions, data subpoenas from law enforcement to companies like Facebook are so routine, they’re handled by paralegals or even handled automatically.

 

Andrea: 4:36

Rob, can I ask you just a really basic question. Can you explain to us what a dark pattern is?

 

Rob: 4:44

Yeah, dark patterns are design patterns that are meant to essentially be addictive. So, for example, the use of a larger, brighter button for something like subscribe versus something like opt out. Do you accept these cookies or not? The accept button might be larger and brighter, and your eye is drawn to that. There are other practices too, emphasizing constant novelty, or using randomness, kind of like gambling, to keep people engaged. I don’t do a lot of research on dark patterns per se, I’m more interested in the cultural practices of people using these systems. But lately, a lot of people have grown really concerned about the dark patterns being used. As the Silicon Valley thought leader Jeff Hammerbacher once said, it’s kind of a quip, “The best minds of my generation are trying to get us to click on ads and that sucks.”
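The “randomness kind of like gambling” mechanic Rob mentions is, at heart, an intermittent-reinforcement schedule. A minimal sketch of the idea follows; the payout rate, function names and post format are all invented for illustration and do not describe any real platform’s code:

```python
import random

def refresh_feed(rng: random.Random) -> list[str]:
    """Simulate a pull-to-refresh feed with a variable-reward schedule.

    Like a slot machine, each refresh only *sometimes* yields new posts.
    That unpredictability (intermittent reinforcement) is what keeps
    people pulling again. Toy model only; the 30% payout rate is made up.
    """
    if rng.random() < 0.3:  # ~30% of refreshes "pay out"
        return [f"new post #{rng.randint(1, 1000)}"]
    return []               # nothing new this time -- maybe next pull?

rng = random.Random(42)
results = [refresh_feed(rng) for _ in range(10)]
hits = sum(1 for r in results if r)
print(f"{hits} of 10 refreshes showed new content")
```

The design point is that a fixed schedule (new content on every refresh, or never) would be far less compelling than an unpredictable one.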

 

Craig: 5:55

I mean, I think at the core of that is that we’re being manipulated. And we’re being manipulated often in ways that are not only clever and effective, but where we remain clueless, most of the time, about the fact that we’re being manipulated. You know, when you think about how certain ads pop up, well, my wife and I were talking about this today, on the “Facebooks”. You know, she was saying she wasn’t seeing things from her friends. And I said, well, stop liking other stuff, you know? Only like stuff from your friends, because Facebook wants to keep you on Facebook and it’s gonna show you what it thinks you wanna see. I have no idea if that’s really what’s going on behind the scenes, but I think it is. I think that’s how their algorithms work.

 

Rob: 6:46

I actually think we have a lot of, I wouldn’t say empirical knowledge, each of us, about how these systems work, but we know that it’s not showing us everything, or that it’s showing us what it believes we wanna see, so I think a lot of people know that. They’re just kind of resigned to using them because that’s where their friends are, that’s where their family is. So sometimes you hear people complaining about Facebook as if it’s no better than your internet service provider or your electrical provider. It’s just kind of seen as a monopoly, a necessary evil in our daily lives.

 

Craig: 7:27

Yeah, but it isn’t. Sorry, Andrea, go ahead.

 

Andrea: 7:32

I was just gonna say we really can resist it. And with Facebook in particular, it’s been a little over a year, maybe longer, since I quit using it, and I just noticed that I actually felt physically better when I would not look at Facebook. I went back and forth a couple of times, you know, just to test it out. And you know, people say, well, look at this on Facebook, or come on Facebook. And I say, well, I don’t use Facebook, or I’m not gonna go on that platform. And there is some resistance, right? People consider that an inconvenience, because they’ve decided this is the way they want to bring groups together, especially community groups or people that aren’t affiliated through professional organizations. But I insist on not using it, for the things that you were talking about, Rob, these dark patterns. I’m glad you explained what those are. I mean, they’re not immediately obvious to us, but they are addictive. It’s just, I think it’s been good for me to intentionally avoid being on that platform.

 

Rob: 8:44

I congratulate you for leaving Facebook. I haven’t been on Facebook since about 2010. I can assure you, you can survive in this world. And my partner has never been on Facebook, she never signed up in the first place. I actually wrote an essay back in 2013 where I just collected stories of people who left Facebook, because particularly back then it was really a big deal, because it was growing so explosively and so many people saw it as an unadulterated good. You might remember back in those days when people were talking with a straight face about the Arab Spring being caused by Facebook, right? Or Twitter. So they were seen as spreading democracy and allowing all of us to kind of participate. So it was kind of a big deal for people to leave. And then when you leave, what do you do? You can’t go on Facebook and say, “I left”. So people would take to other platforms or they’d blog, and I would collect those stories. And I think I titled the article “Stubbornly refusing to not exist even though you’re not on Facebook”. So yeah, kudos to you.

 

Craig: 9:58

So I’m the one who’s morally inferior here, since I’m still on Facebook. 

 

Rob: 10:04

Again, it feels so necessary, though. I understand it.

Craig: 10:10

And it does. I’ll give you an example: my almost-97-year-old mother fell the other day and kind of gashed up her head. So I get a phone call from my brother, he’s on his way to the ER where they’ve taken her, and she’s fine. You know how head wounds are, they bleed a lot, but there wasn’t really any significant damage, which is pretty amazing at her age. But I went on Facebook and said, “Hey, my mother just fell, etcetera, etcetera, etcetera.” I’m not that active on Facebook, but there were 103 responses. You know, “I hope she’s ok”, and a lot of them were friends of mine or friends of one of my brothers who knew her, you know, when we were kids, that kind of thing. So it’s effective for certain kinds of communications, but I think the real question is: is it worth the price? And that’s where I’m not so sure.

 

Rob: 11:07

Right, well, this conversation right here is why I got interested in alternatives, because at first glance it’s kind of a stark choice: you’re either on it or you’re off it. And there’s a middle ground. There are a lot of people who recognize that these things give us pleasure, they help us connect to each other, they’re very valuable for families, communities, activists, or whomever. But do we have to use Facebook? Do we have to have everything flowing to Facebook’s kind of logical center, or Twitter’s, and so on? So people have been developing alternatives to Facebook and Twitter and Google for as long as those things have existed, and that’s really what I’m interested in. It’s like, how do we take the positive aspects and stave off some of these negative things and develop our own systems?

 

Craig: 12:01

So that brings up… I’m sorry, Andrea, go ahead.

 

Andrea: 12:05

No, go ahead.

 

Craig: 12:06

We do this a lot, so, sorry. No, after you, after you, please.

 

Andrea: 12:11

No, I was just gonna say, can you tell us what alternative social media is? And maybe how alternative social media addresses some of these problems? 

 

Craig: 12:19

That’s what I was gonna say. Look at that!

 

Andrea: 12:23

Yep, we can’t wait to learn about it. We both wanna know. 

 

Craig: 12:27

School us.

 

Rob: 12:32

Well, alternative social media is kind of a contested term. I tend to think of it in the long tradition of alternative media in general. So for every mainstream or corporate-dominated or, in some cases, highly government-regulated medium, alternatives pop up, and they take on a variety of forms. Most times we define it in terms of what it’s not, so we say alternative media/alternative social media is not controlled by a big corporation, or it’s not driven by monetizing data. It’s not centralized, or it can be decentralized. And those negative definitions don’t tell us too much about what alternative social media is, and that’s where things get fuzzy. I tend to subscribe to a definition of alternative media and alternative social media that I derive from a scholar by the name of Clemencia Rodriguez, who I believe is at Temple University, and she argues that these media should empower us to make media, to be at the center of media production. And so, at first glance you might say, “Ok, I can do that with Facebook or YouTube or whatever, I can post a video or I can post stuff.” But in this case I think it’s more than just the ability to participate, it’s also controlling our own platforms, operating our own social media software, moderating our own communities, not relying on some governance board in Silicon Valley but governing ourselves, making decisions about what we see, making decisions about which algorithms are used. We tend to vilify the word algorithm, but algorithms, you know, can be very beneficial for sorting data. These days too, I mentioned it’s a contested term, and these days when people say alternative they sometimes say alt, and they very often refer to alt-right platforms, so they’ll point to Gab or Parler. But I’ve been writing about alternative social media for a long time. It’s much bigger than these kind of one-off alt-right platforms that have popped up in the last couple of years.

 

Craig: 14:50

And it seems like those wouldn’t fit your description of alternative social media because they are corporatized. I don’t know what Parler and MeWe and those kinds of places do with the data, but that’s part of the problem: we don’t know what they do with the data. And it is still, I mean, we saw the importance of centralized servers when Parler got shut down, effectively, so it is still subject to all those things, and you know, if the government wants to issue a subpoena, they can, because it’s all under that corporate control. And it sounds like you’re saying your view of what alternative social media is kind of gets rid of all of that, because there isn’t a governing body or anything that’s actually in charge of Mastodon or whatever. Am I getting that right?

 

Rob: 15:45

Yeah, well, let’s take the example of Mastodon. We’re talking in pretty abstract terms and it might be useful to talk about a concrete project. So Mastodon is a Twitter alternative. It was developed starting in 2016 by a German software engineer, a young man by the name of Eugen Rochko. And he was concerned about harassment on Twitter. So women and people of color, queer and trans people on Twitter were being harassed, and he wanted to build a system that would have anti-harassment tools baked in. In addition, he wanted to build a system that was decentralized, specifically federated, and I can talk about what that means, but basically there’s no one Mastodon, whereas when you go to Twitter.com and sign up, you’re on Twitter. I actually run my own Mastodon server, and any of us can too, with a little bit of technical knowledge. Or you can sign up on somebody else’s server. So let’s say that you have a friend who’s running a server and you trust that friend, you join that server, and then it can connect to other Mastodon servers. The analogy, well, it’s not really an analogy, but the parallel project that Mastodon points to is email. We can all run our own email servers, we can sign up with other email servers, and email still has a shared protocol so that email gets through no matter where we’re sending it, and Mastodon functions in a very similar way.
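Rob’s email parallel can be sketched as a toy model. Everything below, the class, the method names, the domain names, is invented purely for illustration and is not Mastodon’s actual federation protocol (which is ActivityPub); it just shows the idea of independent servers speaking one shared delivery convention:

```python
class Server:
    """A toy federated server: each instance holds its own users and
    inboxes, but all instances speak the same 'protocol' (the send()
    convention), so a message addressed to user@domain gets through
    no matter which server it starts on -- much like email."""

    def __init__(self, domain: str):
        self.domain = domain
        self.inboxes: dict[str, list[str]] = {}   # local user -> received posts
        self.peers: dict[str, "Server"] = {}      # known remote servers by domain

    def register(self, user: str) -> None:
        self.inboxes[user] = []

    def federate_with(self, other: "Server") -> None:
        # Two admins agree to connect their instances.
        self.peers[other.domain] = other
        other.peers[self.domain] = self

    def send(self, sender: str, address: str, text: str) -> None:
        """Deliver to user@domain: locally if it's ours, else hand off."""
        user, domain = address.split("@")
        if domain == self.domain:
            self.inboxes[user].append(f"{sender}: {text}")
        else:
            self.peers[domain].send(sender, address, text)

# Two independently run servers, one shared convention:
a = Server("mastodon.example")
b = Server("friendserver.example")
a.federate_with(b)
a.register("alice")
b.register("bob")
a.send("alice@mastodon.example", "bob@friendserver.example", "hello!")
print(b.inboxes["bob"])  # the post crossed server boundaries
```

There is no central server in this picture: `a` and `b` are equals, and either admin could refuse to federate with the other.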

 

Craig: 17:19

So can you communicate across Mastodon servers? You can? Ok. I’ve only played around with it a little bit, that wasn’t clear to me.

 

Rob: 17:29

Yeah, it’s dictated by two classes of people. One being the administrator of the server you’re on. So that administrator might decide, “I don’t want to connect to that server because I don’t like their politics.” So if you’re on one that’s very much against harassment and you see another server that’s, well, the code word for a server that’s full of trolls is “free speech extreme”, right? “We’re extremely dedicated to free speech”, that usually means “we’re harboring a lot of trolls.” If your instance administrator doesn’t want to connect to that server, then it’s blocked. And you as a user can also block access from particular servers, so you have a lot of control over that.
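The two levels of control Rob describes, admin-level defederation plus per-user blocks, can be sketched as a simple timeline filter. This is a conceptual toy only; the function, field layout and domain names are invented for illustration, not Mastodon’s real moderation code:

```python
def visible_posts(posts, instance_blocks, user_blocks):
    """Filter a federated timeline at two levels.

    instance_blocks: domains the *admin* has defederated from --
        these apply to every user on the instance.
    user_blocks: domains or authors *this user* has blocked --
        applied on top of the admin's choices.
    """
    out = []
    for author, domain, text in posts:
        if domain in instance_blocks:                      # admin-level block
            continue
        if domain in user_blocks or author in user_blocks: # user-level block
            continue
        out.append((author, text))
    return out

timeline = [
    ("alice", "mastodon.example", "hi all"),
    ("troll", "freespeech.example", "bait"),
    ("bob",   "friendserver.example", "photos"),
]
# The admin has defederated from the troll-heavy server; the user blocks nothing extra.
print(visible_posts(timeline, {"freespeech.example"}, set()))
```

The point of the design is that neither level requires a central authority: the admin’s blocklist is the server’s policy, and each user’s blocklist is their own.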

 

Craig: 18:22

So it sounds like it’s not... On the surface, it might sound like the idea is to remove control. And that is what I was thinking originally, you know, that it’s a free-for-all, we’re gonna respect free speech and that sort of thing. But it really sounds like it’s democratizing control. So it puts you, or it puts some smaller entity than a big corporation, in control of what’s allowed and what’s not allowed. I think that’s an important point to make because, you know, I hesitate to use the word “safe”, but you can be free from harassment. You know, regardless of which side of the aisle you’re on, if you don’t want to be harassed by the people who are trying to cancel whatever, you can avoid that, you can live in your own little world. So that makes me wonder if there’s a danger that we’re creating multiple echo chambers, where you hear what you wanna hear, but that’s it.

 

Rob: 19:31

Yeah, that’s a possibility. Of course, we can note that the echo chamber label has been leveled at Facebook and Twitter, right? You like particular things on Facebook and they’re kind of fed back to you, as you mentioned in that earlier example, or you follow particular people on Twitter and you block others. So that problem doesn’t necessarily go away in this system. You know, I come across people who are very explicitly dedicated to trying to follow people they disagree with. I find other people who want to, you know, escape this kind of daily grind of politics online and just be with friends. And that, I think, is far more under your control in these systems.

 

Craig: 20:28

Yeah, that was my… The last two presidential elections I did a lot of snoozing, you know, of people, because I wanted to see pictures of dogs and cats and you know, grandkids, at my age it’s not kids anymore it’s grandkids. But you know, that sort of thing. That’s interesting. Andrea, go ahead.

 

Andrea: 20:49

As we’re talking about different users, I have a couple of questions about that. One is, I’m wondering who a typical user of alternative social media might be. And I’m also wondering, Rob, since I know you’re teaching classes and I imagine you teach about this in some of them, what happens when your students learn about alternative social media, when they learn that there is an alternative to, you know, corporatized communication, like all of their data being collected and analyzed by Facebook and Twitter. Do any of your students become interested in this and become alternative social media users? So two questions: who’s the typical user, and do students become interested in this when they learn about it?

 

Rob: 21:34

Yeah. I would say there’s not a really typical user now. There used to be kind of a tech bro type of user. Guys who look a bit like me, playing around with technology. So free and open source aficionados, people who are interested in open source solutions to problems. It was a little bit more insular. I think thanks to Mastodon, it’s really opened up. Mastodon is not huge, but it’s got… it’s hard to get exact numbers, but it’s got at least 3 million users. And there’s a range of users: there are queer people, trans people, people of color, libertarians, socialists, a range of people using these technologies. And there have always been, although it’s harder to put your finger on figures in this regard, activists around the planet, so people in repressive regimes using systems like this. And of course, that’s harder to count, but that’s always been kind of an undercurrent. As for my students, I think about 5 years ago when I talked about this with students, they would typically say, “Uh, this doesn’t really matter. You know, when I use something like Facebook or Twitter or Instagram or whatever and they show me an ad, the only thing that bothers me about it is when the ad isn’t what I wanna see.” There’s almost always this guy who stands up and says, “I’m so tired of seeing ads for feminine hygiene products.” And that always kind of amused me, because basically what these students are saying is that they want these systems to know them better, they want them to predict their likes and tastes in a better manner. And I would point that out to the students and hopefully give them a little bit to think about. But in the past couple of years, you know, especially with this techlash that’s happened, students are more receptive to exploring this world.

 

Craig: 23:42

You know, the whole rise of awareness about surveillance, the surveillance economy that Shoshana Zuboff writes about, I think I’m saying her name right. I’m still trying to get into her book, it’s like 750 pages long, but it looks good on my shelf, makes me look really brainy to have it up there. But I do wonder if there’s this weird, I don’t know, it’s not quite denial but it’s almost a paradox, because I’ve had conversations with pretty smart people who will say, “Yeah, I care about my privacy, but I like seeing these ads and these suggestions, and it helps my life to do that.” And really, I guess, because what they’re giving up is so behind the scenes, it’s so hard to nail down how it might be hurting you and what the risks are. You know, it’s like, ok, I can see the benefit, I can’t really see the harm, so why not use it? What would you say to somebody like that?

 

Rob: 24:43

I would think about, well, we have a long history just in the US context of differential access to resources. So think of the long history of redlining in real estate: differential access to loans in various areas of the country, typically falling along the lines of race. We also see the increasing movement towards health metrics, and Google and all these other companies buying health metrics companies. Think about the implications in terms of your insurance premiums. I think those are the things that I would point to. But you’re absolutely right, because a lot of this stuff is abstracted away from the actual mechanics of it. And privacy researchers have done this again and again: you go to these companies and say, “Ok, can I get the information you have on me?”, and they’ll say, “No, it’s proprietary.” I think one of the things that has helped a bit is that jurisdictions around the world are starting to regulate data collection more, in Europe, in Canada, in California. And if we do that in the US, if we have more regulations about data collection practices and more requirements for these companies to share the data that they have gathered, I think that would increase the awareness of the problems. 

 

Craig: 26:12

We’re in a bizarre world, so you know, we are our data. In a lot of ways our data identifies us, and yet we don’t own our data, so we don’t own ourselves. I mean, it’s enough to give me a headache and want a scotch when I think about it. It’s just a weird, weird time. So I think a lot of us can understand the motivations behind Facebook and Twitter gathering the data and serving up ads, and Google and the rest of them, but what’s the motivation for the people who create tools like Mastodon and go through the effort and the expense of setting up a Mastodon server? There’s gotta be some motivation for that, what is it?

 

Rob: 27:01

Yeah, there’s a variety of them. Some of them overlap, some of them are contradictory. A big one is privacy consciousness, so wanting to socialize outside the view of the Silicon Valley corporations. An example I would give that is kind of an extreme version of this is social media hosted on the dark web. So there are people who have set up Facebook-like systems hosted as a Tor hidden service, so you can only access it with the Tor Browser and by knowing the exact URL, which is a long string of gibberish. It’s not like typing in www dot facebook dot com. And you visit these systems and can do basically all the same stuff as on Facebook: you set up a profile, you can follow other people, friend other people, like things, share and things like that. But obviously this is on the dark web, which anonymizes our connections, so you don’t have to give away your personal information. If you contrast that with Facebook, which is kind of notorious for demanding to know who you really are, that’s a big difference. As I mentioned earlier, the motivation for Mastodon was concerns about harassment. So Twitter was criticized for being unsafe for queer or trans users, users of color, so people are seeking out systems with stronger moderation, as you mentioned earlier, systems where moderation and governance are really under, for lack of a better word, local control. And I think the motivation with the alt-right ones, if we want to include these in the conversation, is concerns about censorship. There’s, you know, a lot of talk about censorship on Facebook. Alt-right groups are not the first ones to flag this; activist groups have pointed out that, you know, Facebook will censor criticisms of corporations that might be advertisers on Facebook. But if you’re concerned about that, if you’re concerned about Facebook or Twitter having a lot of power over what we see, you’d set up your own system in that regard. 
And goals correspond accordingly, like, if you’re really privacy conscious, you tend to construct a system that protects privacy. If you’re really into moderation, you’re gonna put a lot of design work into moderation tools and so on.

 

Andrea: 29:37

I wondered if we could talk just a little bit more about privacy for a moment, because I feel like privacy is something that we don’t really have a full appreciation for, because sometimes it is abstracted away, because this constant surveillance that most of us are under is invisible. I’ll tell you a crazy story: I went through this phase where I took all of the curtains off the windows on one side of my house, just to make a point, and I told my partner, “There is no privacy, it’s an illusion, we have no privacy. That ship has sailed. It’s gone, the world where we have privacy is over.” But it doesn’t feel like I’m being surveilled when I have my laptop and I’m sitting on my couch. I feel very much like I’m alone and I’m the only person that’s observing this, but it’s really not the case. And I think that it’s important to think about privacy as a fundamental condition of human dignity. Often when we talk about privacy we think, well, you know, if you weren’t doing anything wrong, then why would you mind being watched? But that’s not really the way we feel as human beings. I mean, privacy is a fundamental condition for our own dignity, for creativity and really, for a sense of autonomy and self-control. And if, in fact, every time we were on the computer there was a little representative human being from Facebook standing there, and from Twitter standing there, and you could see all these people around you, it would be very uncomfortable. In fact, most of us are under the illusion, especially during the time of Covid, maybe with too much privacy, right, that we’re very much alone and that we are the only people that are seeing what we’re actually doing. I wonder if there are any examples that you use to illustrate to people how much we really don’t have privacy when we’re online. 

 

Rob: 31:50

Yeah. One demonstration that I really liked, I had the chance to participate in the Hackers on Planet Earth conference last summer. It was online because of Covid, and there was one demonstration by Ailen Porath, and I can send you the information if you want to follow up on this. He developed this app for, I think, the iPhone, called Game Time, and it was ostensibly a game. He would ask people to check it out and they’d have to sign up for it, and they’d go through all these steps to sign up, like give their name and give their phone number and all the permissions that we see on phones, right? Like give Game Time access to your contacts, give Game Time access to your microphone, you know? It needs all these things in order for you to play the game, and they go through all these steps, and it really wasn’t a game. And what he would do is show how, in those moments of signing up for this game, the app would gather just an amazing amount of data on people, and then he would show all the uses to which that data could be put, like doing really basic machine learning analysis of their photographs, right, if it has access to their photographs, and using that to classify them politically. Getting access to their travel history, their GPS data, and showing where they stop, and saying, “Ok, we think you work here, we think this is your child’s school”, just in minutes. He’s an ethical hacker, he’s doing this to show people just the glut of data that a given app is gathering. Those sorts of demonstrations I think are quite powerful. We do stuff like that in classes too, we try to get students to look at all the data that’s gathered on them, but you kind of have to constantly remind people of this because, as you mentioned, the data gathering practices are abstracted away. We just kind of interface with these friendly devices, we’re very intimate with these machines. We have the phone in our pocket, the laptop on our lap. 
These days we have microphones everywhere in our homes and we don’t really give that a second thought because they’re designed to be kind of unobtrusive. 

 

Craig: 34:17

Yeah, that’s a big part of the design, you know, blend into the environment. It’s good to show students that kind of thing, and I’d love to use that for my Principles of Information Systems class that all the business students have to take, just to show them in a more tangible way what some of the potential harms are. But you know, at the same time, we spent the last year and a half training them to be ok with being surveilled through these stupid proctoring systems. This might get me in trouble, but I flat will not use them.

 

Rob: 34:54

Same here.

 

Craig: 34:54

I just, I won’t do it. You know, “Well, you know, it’s safe, they don’t store any information.” I’ll bet they do. I just don’t buy that. And how do you audit it? You know, it’s closed source. And what data can be captured en route? “Well, it’s encrypted.” How well is it encrypted? I mean, really, heavily encrypted systems are getting hacked constantly. So yeah, I worry that we’re conditioning the students, all the way down into primary school, that, you know, it’s ok if trusted institutions want to surveil you. That bothers me a lot. A lot.

 

Andrea: 35:40

Is it really possible to opt out?

 

Craig: 35:46

It’s tough. I mean, a lot of classes require that if you’re taking the test, you use an online proctoring system, Proctorio and some of those kinds of places. And I don’t… you know, at least when somebody chose to take classes online, they could opt out at a price, but here we were forcing them to. I mean, their choices were either drop out of school or give up your privacy. At least that was my take on it. 

 

Rob: 36:21

So the question of opting out is really interesting, and our conversation is going in this direction, which is totally understandable. We tend to individualize the solutions, right? So I will opt out, I will choose not to use these things. But we really need collective solutions, because otherwise it's an individual against very powerful institutional systems. So this is why, and I am not one to say there ought to be a law, but in this case, I think we should look at state-level regulation of the gathering of data and privacy, as other jurisdictions have done. As clumsy as it is, the EU's efforts are leading the world in this regard. There are a lot of problems with the GDPR, but it's a really big step in the right direction. So what's happening in Congress now, there's a variety of motivations behind the various representatives and senators seeking to regulate "big tech," as they call it, but at least they're agreeing and trying to rein in the data gathering practices of these companies.

 

Craig: 37:38

I mean, they're turning into data robber barons, in a lot of respects. It's gonna be interesting to see how that plays out; I guess we had a big salvo fired across the bow. I think there's some hope for something actually happening because of what you said, Rob. It's not a left or right thing, wanting to do something about big tech. It seems to be one of the few truly bipartisan concerns, which is bizarre in and of itself. So I wonder if you could talk a little bit about, you mentioned this earlier, the connections between the free and open source software movement and the alternative social media movement, because they seem to be pretty tightly intertwined.

 

Rob: 38:30

Yeah. So just a basic primer on free and open source software. This is software where you can audit the code. So when you get, you know, Microsoft Windows, you get binaries that aren't human-readable. You can't audit how Windows works, what it's doing with your data and so on. In contrast, with something like Linux, which is what I run, in theory (I don't do this) I can download all the source code and look at exactly how the machine is operating, and audit it. And if I have the skill, I can modify it and share those modifications with the Linux community, and they can take those modifications and integrate them if they find them beneficial as well. And the power of that, I think, is pretty obvious in this space, in that we don't know how Facebook chooses what to show us in the various feeds, or how Twitter generates trending topics and so on, because that's abstracted away on a server somewhere else, and those are protected through intellectual property regulations and whatnot. In contrast, alternative social media developers tend to focus on free and open source versions, because not only do they want to allow people to do all the social media things we were talking about, share information with our family and friends and like and comment and all that stuff, but they also want to allow us to look at how the system works, look underneath the hood, modify it, change it to our own purposes. So there's a lot of overlap between free and open source software development and alternative social media, so much so that it's hard to think of any really notable examples of alternative social media that weren't at their core free and open source software projects. 

 

Craig: 40:35

Well, that might be a marker for listeners to consider, if they find a platform that is alternative social media and it’s not free and open source, it seems like they oughta think again. Cause I think we should point out that even if I lack the technical ability to audit the code, somebody out there can audit the code and keep them honest. So I think that’s important, even if you don’t know the first thing about how to look at code, you can still benefit from the mere fact that it is open. But I think I’d be pretty skeptical about a platform that wasn’t open source.

 

Andrea: 41:15

… just for a minute, when we were talking about technical ability, because I wonder, when we think about education generally, whether some basic level of scripting should be considered part of our basic education now. That wasn't part of my early education, and recently I just thought, you know, I wanna learn more about this, and I started learning how to script in Python, which is a really friendly language, and just doing some basic things. And I thought, this is really a fundamental condition of literacy. Even if you don't know how everything works, it seems to me that if you're in the world today, having some, you know, fundamental understanding of how these machines and algorithms are working really matters. And as you pointed out earlier, Rob, algorithms can be very useful. I mean, now with things like, maybe this isn't a good example because it's Google, but with Google Colab, without any specialized software you can get on there and run some basic scripts and start to understand how this is working. And I'm just curious what your views are on whether or not that should be part of a basic and required education.
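The kind of "basic things" Andrea describes really can be this small. The sketch below (all data invented) runs in Google Colab or any stock Python install, and it already contains the seed of a feed algorithm: counting and ranking.

```python
# Three made-up posts standing in for any little dataset.
posts = [
    "loved the new cafe downtown",
    "the cafe was too crowded",
    "new library branch opened downtown",
]

# Count how often each word appears across the posts.
counts = {}
for post in posts:
    for word in post.split():
        counts[word] = counts.get(word, 0) + 1

# Sort words by frequency, most common first, and print them.
for word, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(word, n)
```

Swap "words" for "topics" and "posts" for "what millions of users typed today," and this is a first intuition for how a trending list gets made.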

 

Rob: 42:35

I do agree with that. You mentioned literacy. I like to think about media literacy in very broad terms. Media literacy in the past meant, and this was very valuable, let's dissect, you know, a TV commercial and see what messages are encoded in it and how the TV commercial resonates with us, and that's perfectly fine. And then a more expansive vision of media literacy comes about in the 1990s: it's not just about the commercial, but how are these TV programs produced, how does commercial funding affect the shape of the programs, how does audience research work, and so on. And if we just follow that through logically to today, now media literacy should mean: how do these platforms work through recommendation engines? You know, why is Netflix recommending this or Amazon recommending that? How does Facebook's algorithm work? How do search engine algorithms work? And the beauty of free and open source software and alternative social media is that you don't have to reverse engineer from the interface you have, or from the occasional times that a journalist might get a little bit of access, or, you know, a whistleblower might leak access to these normally closed systems. With free and open source software and alternative social media, you can run the things yourself and see how they work; you can see how the connections are made. So I agree with that. I regularly taught a web design class, and the big lessons from that are not just the coding of the website but also how the internet works. And so just playing with things like a Python script, or making your own website, or, you know, more complex things like running your own social networking site, you get a little bit more insight into how these things work, and you can make critical decisions about which ones you use. And when regulators do step up and say, "We wanna regulate these things," we might have a more informed voice about what that should look like. 
I hate to be cynical but obviously the most informed people in the room when the regulators are thinking about regulating are obviously the tech companies themselves. They’re gonna be in there lobbying and they’re gonna be seen as the experts, so we need more citizen expertise about how these systems work. 

 

Andrea: 45:14

Yeah, let's go back. You know, we're sort of talking about these moneyed interests and corporate interests, and one thing about alternative social media is that most of it refuses to accept advertising. And Rob, you talked about this in your 2015 Social Media + Society paper, where you claimed that refusing advertising is refusing to privilege moneyed speech. And I'm wondering if you could expand on that point and explain why it's important to do that.

 

Rob: 45:45

Right, so think about the stream of content that's produced in a generic social media system, you know. You post something, you post something, I post something, I respond, and so on. That can be presented to the end user in a variety of ways; it could be presented just chronologically, or there are all sorts of approaches to presenting it. And then somebody pops up and says, "If I throw a few bucks at you, will you make this message more prominent, or maybe put it off to the side and make it appear larger?" And so all of a sudden, one class of speech in that system that seemed to be very equal gets elevated, and that's one indication of how moneyed speech might change our social interactions. But there's much more to it than that, because how do they decide who gets to see which advertisement and who doesn't get to see it? That requires a large infrastructure of data gathering on us, and, you know, all of a sudden our liking and sharing and our connections (we are connected to this person and that person, and we like this product) fuel that system. So now it's not just the ad, it's the underlying data gathering infrastructure, and as that aspect of social media becomes more valuable to the company producing it, they bend more and more of what they do towards that goal. We need to gather more data; we need to have, you know, geolocation data, or audio data, or what have you. And all of a sudden we go from this kind of flat relationship, you know, of each of us producing content and interacting with each other, to a warped system where the moneyed speech really becomes centralized. 
And so fundamentally, most alternative social media systems say, "We want no part of that. We don't wanna have advertising on our systems, because that goes down the path of warping social interactions around the interests of whoever is paying for access to our eyeballs." And along those lines, you know, there's a secondary question I often get about this: well, how do we pay for it then? I often point to a website that's a top-20 website in the world in terms of traffic, that's been around for more than 20 years, and that's not advertising-funded, and that's Wikipedia. It can be done. Right? It can be done through donations or other funding models; it doesn't have to be this advertising model. Literally every time I bring that up, though, people are like, "No, how do we pay for it?" Look at Wikipedia. Which, you know, this is a little bit of a tangent, but Wikipedia is interesting in that they were also going down the road of posting advertisements around the community-generated encyclopedia articles back in the early 2000s. But a significant proportion of Wikipedia users, particularly in Spain, actually had a labor strike against that. They said, "We're not gonna contribute our labor to Wikipedia in order to have you sell ads around what we write," and because of that, they basically forced Wikipedia to become the nonprofit that it is to this day. So imagine if we'd had a labor strike at Facebook. You know, I think that ship has sailed, right? But what if we had done something very similar in the early history of Facebook and made it a more nonprofit model, or a community-controlled model?
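Rob's point about money elevating one class of speech can be sketched in a few lines. The posts, fields, and weights below are all invented for illustration; real feed-ranking systems are far more complex and closed, which is exactly the problem he raises.

```python
# The same stream of posts, ordered two ways.
posts = [
    {"text": "family photo",    "time": 1, "paid_boost": 0},
    {"text": "friend's joke",   "time": 2, "paid_boost": 0},
    {"text": "sponsored pitch", "time": 0, "paid_boost": 10},
]

# A "flat" presentation: newest first, every post treated equally.
chronological = sorted(posts, key=lambda p: -p["time"])

# The warped version: a paid boost is simply added to the score.
monetized = sorted(posts, key=lambda p: -(p["time"] + p["paid_boost"]))

print([p["text"] for p in chronological])  # the sponsor sits at the bottom
print([p["text"] for p in monetized])      # a few bucks moves it to the top
```

One added term in one sort key is all it takes to turn an equal stream into moneyed speech; everything Rob describes about data gathering exists to set that `paid_boost` term per viewer.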

 

Andrea: 49:39

I wanna ask a little bit more about paid advertising, because I wonder if the issue is paying for messaging in general, or if it's just the power of a few large players that makes it problematic. In other words, if I'm in business, right, and I have a service, and I'd like to sell that service to people, then I wanna get my message out, and so I might be willing, let's say unsurreptitiously, to, you know, pay money to distribute information about the service I'm offering, and this just seems like part of a market economy and not necessarily something that's deceptive or manipulative. So I just wanna say that, you know, you have these large gatherings of people. It's natural for people to want to distribute information through these channels, and you might be willing to absorb some cost for the benefit of doing that. So if the scale were different, if the transparency were greater, do you think it would be less objectionable to pay for advertising?

 

Rob: 50:51

The distinction I would draw is between contextual and behavioral advertising. If you're showing ads to this group of people, and you're offering a service, and you're not really trying to, you know, gather data, and you're not purchasing the data that somebody else gathered on those people, their likes and desires and friendships and all these very intimate personal private details, you're just showing an ad to them, that's one thing. But the majority of the online advertising world is behavioral; it's based on gathering as much data as possible, which, again, warps everything towards "How do I gather more and more information on people?" And so, you know, I'm not a huge fan of advertising in any capacity, but where I do encounter it, I'm happier to encounter contextual advertising. So, podcasts being an example, right? Podcasts tend to have contextual ads, you know. We think our audience would be interested in these pillows or these mattresses, as the kind of cliché ad goes, so we'll accept sponsorship from the advertiser. But when the advertiser says, "Ok, now I want you to install this tracking software (just in the abstract) into everyone's podcast app and give us all the data that you gather on the people that are listening to your show," that crosses the line, in my view.
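Rob's contextual-versus-behavioral distinction comes down to what each approach needs as input. Both functions below are invented to illustrate the contrast, not any real ad system: the first looks only at the content being shown, the second requires a dossier on the viewer.

```python
# Two hypothetical ads tagged with topics.
ads = [
    {"text": "mattress ad", "topics": {"sleep", "podcasts"}},
    {"text": "diaper ad",   "topics": {"parenting"}},
]

def contextual_pick(ads, page_topics):
    """Match ads to the topics of the content; no data about the viewer."""
    return [a["text"] for a in ads if a["topics"] & page_topics]

def behavioral_pick(ads, user_profile):
    """Match ads to a profile built by tracking the viewer across services."""
    return [a["text"] for a in ads if a["topics"] & user_profile["inferred_interests"]]

# Contextual: the podcast is about podcasts, so the podcast-y ad runs.
print(contextual_pick(ads, {"podcasts"}))

# Behavioral: the pick is keyed to the person, which is only possible
# because something assembled this profile in the first place.
profile = {"inferred_interests": {"parenting"}, "gps_trail": "...", "contacts": "..."}
print(behavioral_pick(ads, profile))
```

The signatures tell the story: `contextual_pick` could run with zero surveillance, while `behavioral_pick` is inert without the data-gathering infrastructure Rob objects to.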

 

Craig: 52:22

That makes sense. We saw what surreptitious advertising could be when Google first got started with it. You know, all of a sudden certain things started showing up at the top of the results, and now at least they have a tiny little thing that says "Ad." So I wonder if we could kind of shift gears into what people can do around social media. Andrea, you wanna take the first question, or you want me to take it?

 

Andrea: 52:57

You go ahead,

 

Craig: 52:58

Ok. Alright, so now our listeners know what social media is. They know a lot more about some of the dangers in corporate social media, so how should they go about deciding whether or not to explore alternative social media further?

 

Rob: 53:19

That's a big problem. So, one of the advantages of the corporate model, really I should say the centralized model, is that it's by design easy to sign up for. It's easy to sign up for a Twitter account; it's easy to sign up for Facebook and Instagram. It becomes much trickier when we're dealing with more distributed systems. So to go back to one of my favorite examples, which is Mastodon: I mentioned earlier it's federated, which is a good thing in the sense that it decentralizes power. You know, you can have servers in Germany, in Iceland, in Sri Lanka, in the US, in Canada and so on. But then the problem is, which one do you sign up for? You don't really sign up for Mastodon; you sign up for a particular instance of Mastodon. So there are services out there to help people decide which one to sign up for. They ask basic questions, like which language do you speak, or are you interested in an instance that has thousands of people or only a few people? And a lot of Mastodon instances solve this problem by being interest-based or geographically based. So you might set up an instance based on Louisiana, right? This is an instance where we talk about Louisiana. You know, people from outside the state can join, but really it's focused on and interested in Louisiana. Or we set up a Mastodon instance for academics, so academics can join. You don't have to be an academic, but most of the conversations there are gonna be about academia. Or a philosophy instance, for example. So that's a window in, but again, that's not as easy as signing up for Pinterest and just kind of clicking through a few interests and getting involved in that system right away. So this is a big problem.
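The instance-picker services Rob mentions boil down to filtering a directory by a few answers. This is a hypothetical sketch of that logic; the hostnames and directory data are invented, and real pickers draw on live server lists.

```python
# A made-up directory of instances, shaped like the questions a
# signup wizard asks: language, size, and focus.
instances = [
    {"host": "mastodon.example",    "language": "en", "users": 12000, "topic": "general"},
    {"host": "philosophie.example", "language": "de", "users": 800,   "topic": "philosophy"},
    {"host": "scholar.example",     "language": "en", "users": 3000,  "topic": "academia"},
]

def pick_instances(instances, language=None, topic=None, max_users=None):
    """Narrow the directory the way a signup wizard's questions would."""
    hits = instances
    if language:
        hits = [i for i in hits if i["language"] == language]
    if topic:
        hits = [i for i in hits if i["topic"] == topic]
    if max_users:
        hits = [i for i in hits if i["users"] <= max_users]
    return [i["host"] for i in hits]

# "I speak English and I'm an academic" narrows thousands of servers to a shortlist.
print(pick_instances(instances, language="en", topic="academia"))
```

The point is that the "which instance?" problem is a matchmaking problem, not a technical-skill problem, which is why simple questionnaire sites can solve most of it.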

 

Craig: 55:19

But I will attest that signing up for some of the Mastodon instances is pretty painless.

 

Rob: 55:28

Once you find one, yeah.

 

Craig: 55:29
Yeah. And is it mastodon dot social?

 

Rob: 55:34

That’s the main one. Or I wouldn’t call it the main one, that’s the one that Eugen Rochko kind of set up as the test instance, and that became kind of de facto the main one, but it’s more distributed than that.

 

Craig: 55:47

And so it isn't that hard. You know, you don't have to know how to do anything really technical. Once you find it, you either request access, or some of them give automatic access, and you put in a password and make up a pseudonym and, you know, put in your email address (you don't have to give your regular email address), and you're there. So it really isn't very difficult to do. But I think one of the key points that I heard in your answer was that if we want to avoid some of the negative consequences of centralized social media, we need to become our own curators to an extent, you know? We need to take on some of that work of finding social media platforms or instances that align with our interests, rather than relying on Facebook's or Instagram's or somebody else's algorithm to do it for us. It's almost like a weird personal responsibility thing. You know, if you give up the responsibility for doing this yourself, well, you live with the consequences of offloading that responsibility to somebody else, and you open yourself up to the negative consequences of it.

 

Rob: 57:10

Yeah, I'd go down that road a little ways, but I also think that absolves the companies of too much responsibility for what they've done.

 

Craig: 57:21

Yeah, I’d go along with that. So, yeah, it’s gonna take a little bit of work, but it’s not that bad. Like I said, I tried it out and it was really pretty painless. It’s kind of interesting, really, to see what Mastodon servers are out there and kind of what they coalesced around. You know, it gives you an insight into that world just by looking at what different services are kind of targeted towards.

 

Rob: 57:50

Well, I need to commend you. I've commended Andrea for quitting Facebook, which I applaud, and I commend you for signing up for Mastodon, because I've had these conversations with people many, many times, and when the conversation is over they probably never think about Mastodon again. And I've had these conversations with, you know, I won't name names, but organizations or groups who are interested in democratizing media, or media literacy. And I say to them, "You've got a website, and you say, 'Like us on Facebook and share things on Twitter.' Why not just add another thing for Mastodon?" And they're like, "Oh, yeah. We should," and they never do. So actually go and try it out; it's not that hard.

 

Craig: 58:34

Alright, so we know something we’ve gotta do with rationalignorancepodcast.com, we’ll have to add a Mastodon button to it somehow, yeah.

Rob: 58:46

Yeah, yeah. And, you know, promote it on Mastodon. Can it hurt you? No, I don’t think so. You reach another audience.

 

Craig: 58:56

Right, right. And one that I think aligns well with us. 

 

Andrea: 58:58

That would be great. 

 

Craig: 59:00

I wonder if you have any… I’m sorry, Andrea, go ahead. Looked like you were gonna ask something. 

 

Andrea: 59:06

No, I think that would be great, and I just wanted to say I really appreciate Rob's point that, you know, there's a natural tendency to feel like we have personal agency and full responsibility for our interactions online, but it's not neutral. These large tech companies are really manipulative; they've really studied human psychology and have designed these platforms and interfaces to be intentionally addictive. I mean, there's like a dopamine hit every time you get that little like or response on Facebook. And so, yes, we should take responsibility, but we should also realize the extent to which we're being intentionally manipulated, based on a very good understanding of human psychology and behavior.

 

Rob: 59:57

The tricky bit here is to not slide too far into technological determinism and say, “Facebook is causing all these problems.” and not to slide too far into purely this rational choice of you know, “I need to maximize my utility so I’ll select from among these options and gather information.” It’s really a mixture of the two, you know. We have choice but our choices are constrained by the social structures we find ourselves in.

 

Craig: 1:00:25

Right. So that brings up another question. For our listeners who want to remain on centralized social media, the Facebooks and Instagrams of the world, do you have any advice on what they can do to kind of mitigate some of the potential negative effects of those platforms? Is there kind of a middle ground here that they can take?

 

Rob: 1:00:59

I am not the best person to ask, cause I don't use them. I haven't used them for a very long time. I know that one of my colleagues in the department of Communication and Media Studies at Louisiana Tech, Dr. Johnette Magner, is really interested in digital wellness. And so there's kind of a movement towards mindfulness and wellness in relation to all these technologies. A lot of that focuses on countering the addictive aspects of these systems, and I think there's a big literature there. To my mind it goes in the direction of self-help, which, again, individualizes it. But there are resources for people who are interested in, kind of, "How do I reduce my use of these devices and these systems?" I tend to pound the drum of "We need to leave them." Whatever benefits we gain from them, connecting to our friends and family and so on, community organizing or whatever, can be done elsewhere. In fact, should be done elsewhere.

 

Craig: 1:02:22

Ok. Alright, so one last question. You mentioned the EU’s privacy regulations, and now California’s got privacy regulations. Do you think it’s worthwhile for people to start to push their own state legislatures to adopt similar legislation? I’m a libertarian to a large extent, but you know, there are times when you need governmental regulation, and I think this is one of them. And so what do you think listeners oughta do in that regard to maybe bring about some of the change at state level?

 

Rob: 1:02:58

Yeah. Now is the time to get involved in that conversation, because, like we mentioned, there is interest at the federal level, and you know, one of the benefits of our federal system is that if the federal government doesn't do something, the states can step up. And so, you know, the pressure points might be at the state legislative level or at the federal level. I mentioned earlier, I find myself in a weird spot: I'm not a libertarian, but I don't tend to look towards state-based regulation as a solution to a lot of problems. I look to community, kind of radical democratic governance, which is why I got attracted to these systems in the first place. Like, I can do social media stuff, but also be much closer to the governance structure of the social media I participate in. But it's gotten to the point where I am very much on board with looking at broad regulation of these systems. Yeah, I would encourage people to pay attention to what's happening there, because in the absence of us paying attention, the most powerful lobbying organizations on the planet, some of them, I should say, are there in Washington right now. Google, Facebook, Amazon, Microsoft, they're all there. They'll get the regulation they wish for if we don't step up and say what it should look like. 

 

Craig: 1:04:35

Yeah. Absolutely.

 

Andrea: 1:04:37

Are there any organizations that you recommend people connect to, or any particular platforms that you recommend? Say someone says, “Oh, I’m very concerned about my data, I’m very concerned about privacy, but I just don’t know where to begin.”?

 

Rob: 1:04:53

There are two that I have consistently seen over the years. The Electronic Frontier Foundation, the EFF, is a big one, and the Center for Digital Democracy. Those are two that I think are leading the charge in terms of privacy. To a lesser extent, we can look to the Free Software Foundation, because of their interest in promoting free software, but they have a host of other problems, so I would say the EFF and the Center for Digital Democracy.

 

Andrea: 1:05:35

So from a practical point of view, folks could connect with these organizations and if they supported these platforms, then contact their representatives or their senators to let them know that they’re really interested in this and this is an issue that matters to them.

 

Rob: 1:05:52

Yes.

 

Craig: 1:05:53

And call, or even send a fax. You know, emails are pretty easy to ignore, write a letter. Yeah, so go old school with some of this.

 

Rob: 1:06:02

Yes, yes. Old school, yeah. And make sure, you know, make sure it's in your own voice, cause one of the problems we face now is so many robo-generated comments to the FCC or wherever, right? Put it in your own words; I think it makes it much more powerful.

 

Craig: 1:06:23

That’s a great point. So, Rob, thanks for helping us understand more about alternative social media. So kind of where are you going next with your research in alternative social media and where can we learn more about what you’re doing?

 

Rob: 1:06:40

My website, www.robertwgehl.org, which you mentioned earlier. I tend to put a lot of stuff there. I'm currently wrapping up a book on disinformation and social engineering, and that should be out from MIT Press in a few months, so you can look at the MIT Press website. And my next project: I worked on alternative social media for a while, then got distracted by disinformation, and now I'm circling back, so I'm starting to do more and more projects about Mastodon with collaborators.

 

Craig: 1:07:15

Great. Well, thanks for joining us. We’ve learned a lot today and maybe we can have you back a little bit later on to talk more about the disinformation, cause that’s a pretty interesting and important topic as well. So really, thanks for joining us.

 

Rob: 1:07:32

Thank you.

 

Andrea: 1:07:33

Thank you so much, it’s been fantastic.

 

Rob: 1:07:35

Excellent.