TRANSCRIPT: Fighting Disinformation to Make the Internet a Better Place
[Innovation Heroes theme music] This episode of Innovation Heroes is brought to you by Women in SHI. Visit shi.com/wish to learn more. Welcome to SHI's Innovation Heroes, a podcast exploring the people and businesses making a difference in our constantly disrupted world. I'm your host, Ed McNamara.
Social media platforms have been making a lot of headlines this year. [cellphone vibrates and chimes] There's been a lot of talk about how we access and assess the information we're being served. "Fake news" is a term that's only grown in popularity since a certain former president made it one of his catchphrases. [gentle electronic music]
It's people who want to spread this false information, who find it interesting and want to keep sharing it. It's us. So as they come onto these platforms, if the design for safety isn't in place, then it's just gonna persist. Polluted information will spread across platforms.
But really dis- and misinformation is practically as old as humanity itself. With the rise of modern technology, we've become even more inundated with a constant flow of endless data, which has amazing uses, of course. After all, it gives us the power to accomplish great things and improve the world around us.
The terrible abuse that's online towards women politicians persists. We need to understand it better and develop ways for women to feel safer on social media so that this doesn't happen anymore, and women aren't afraid to step up and speak out against it, or even just run in the first place.
[suspenseful music] Some advocates suggest that social platforms need to be doing more to identify and curb the spread of disinformation, especially in light of some recent accusations of companies putting profit over user safety. To me, at least, it seems like a pretty big issue, and one that won't be going away any time soon. Luckily, there's experts out there who can help us understand the roots of the problem and how we can make it better. Alexandra Pavliuc is one of those people. She's a doctoral researcher at the Oxford Internet Institute, and she uses machine learning and neural computation, computer vision, and visual analytics to paint a bigger picture of the data we generate in online communities. Welcome to Innovation Heroes, Alexa.
Thanks so much for having me.
Can you tell us about your field of research, and what exactly you study and how you're using data visualization in your projects?
I think the name of the PhD program I'm in is a good way to introduce it, which is Social Data Science, so I have two fields that I'm exploring. I'm exploring social science, and for myself, that's trying to understand gendered abuse and disinformation towards women and women politicians on social media. And then, the data science side is the methods I use to understand that, and that's network analysis and network visualization. So what I'm doing is trying to understand how disinformation about women spreads across social media platforms, and how it sort of festers on individual platforms, as well. I've done research on this in the past, and we found that the false narratives that people are putting forward about women online are very sexualized, racist, and transphobic towards different women, and sometimes there's an overlap between these things, as well. So I've studied this through network analysis, and that's where I was able to see that people who engage with one false narrative that's very sexualized also sometimes engage with a racist narrative about a woman if she is a woman of color, for example. So network analysis really helps me study the relationships between things, put most simply, the relationships between people, kind of mapping out who's talking to who, and who's interacting with who, or what. Those are the opportunities that come forward with things like network analysis and network visualization.
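[Editor's note: for readers who want to try the "who's talking to who" idea Alexa describes, here is a minimal sketch of building an interaction network. The accounts and interaction records are invented for illustration, not from her research.]

```python
from collections import defaultdict

# Hypothetical interaction records: (who, whom) pairs, e.g. replies or retweets.
interactions = [
    ("alice", "bella"), ("carol", "bella"), ("carol", "alice"),
    ("dan", "erin"), ("erin", "dan"), ("alice", "bella"),
]

# Build a directed interaction network as an adjacency map with edge weights:
# graph[source][target] counts how often source engaged with target.
graph = defaultdict(lambda: defaultdict(int))
for source, target in interactions:
    graph[source][target] += 1

# "Who's talking to who": each account's outgoing interactions.
for source, targets in graph.items():
    for target, count in targets.items():
        print(f"{source} -> {target} ({count}x)")
```

With a structure like this, overlapping engagement (the same accounts touching two different narratives) shows up as shared neighbors between the two clusters.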
So people who engage in these types of false narratives, you know, whether it's sexualized, racist, or transphobic, you know, content, you know, what's the end game? I know that the data might not necessarily point to that, but can you use data to figure out the goal?
So the hunch that I've kind of looked into with other researchers is that we think it's really to dissuade women from being in the public light. So in the case of women politicians, this could be coming from people who are trying to convince the women themselves, and convince, kind of, the world that women don't belong in the public light, don't belong in politics. And, of course, if women fall prey to this and really kind of internalize the terrible things that are said about them online, it can push them out. The more prominent you are as a politician, the more of this kind of stuff you're gonna get, whether you're a man or a woman, but it tends to be more violent and quite nasty if you're a woman receiving it. It's gonna change the demographics of politics if certain people are targeted more with this type of disinformation and abuse and they choose to, you know, decide it's not worth continuing with. There's been examples of that where members of parliament decide, "You know what? I've had enough of all of this abuse I get offline and online," and they step away. [gentle electronic music]
Disinformation is certainly a hot topic these days, but it also shows, like, how little we really know about a clearly pervasive issue. So I'm extremely lucky to have you as a guest today because you and I are actually talking on the morning after Facebook announced it would be rebranded as Meta. What exactly happened with Facebook, and what's the behind-the-scenes take you can offer us as a researcher who specializes in these issues around big data and ethical algorithm use?
On the side of, like, kind of thinking about what happened there from the user perspective, and what people have been seeing in the things that have been the issue here, there's been lots of news, of course, coming out the last week, but a main problem is that Facebook puts profits over safety, according to these internal reports that have come out, and what that ends up looking like is that sticky content or sticky advertisements-- by sticky I mean things people like you or I are more likely to look at, and choose to look at for longer, say videos, for example-- those things get promoted by Facebook's algorithm. So for example, I, personally, anytime I go on Facebook, I scroll down. The first few things I see end up being maybe some news articles from CBC, from NBC, from kind of more mainstream sources, woven in with things that my friends and family are saying, but when that runs out, I'm just scrolling down looking at baking time lapses and, honestly, folk dancing videos 'cause my family's from Ukraine, and I love folk dancing. [Ed chuckles] So those are the things that are sticky to me. I'll watch the whole baking time lapse if I have nothing else to do, or don't want to get back to work. [Ed chuckles] I will watch the whole folk dancing video. But the thing is, if you start getting shown things that are maybe a little bit more [chuckling] salacious than the things that are really sticky to me, such as white supremacist content or content from QAnon conspiracies, if you watch that the slightest bit longer than you've watched another video, Facebook will take that indicator, and it will show you more and more of that content. That's why I get so many baking time lapses [chuckling] because of how much I honestly just enjoy sitting back and watching someone make the most intricate of cakes.
So there's my very benign example, but the issue is that, when you're exposed to, and choose to end up watching, these things that are not so cheery and positive, these things get sticky, and they get shown to you more and more, and that's, kind of, the rabbit hole that Frances Haugen and other whistleblowers have been talking about that really sucks you in, because that's all you see, and then you see more and more of it. [gentle electronic music]
Part of what I found compelling about your research is that it's able to take just massive amounts of incoherent data points, and turn it into something, you know, dynamic, understandable, and visually compelling. Can you tell us a bit about that?
Yeah, of course. So network analysis and network visualization, the thing I really love about it is it lets you see the bird's-eye view of an entire data set or situation that you're trying to understand. I, personally, as much as I need to be detail-oriented as a PhD student, and of course, don't wanna make any mistakes, I am a very high-level person. I like thinking about the bigger picture and trying to see the forest for the trees, trying to get a higher-level understanding of whatever I'm looking at. I don't like being stuck in the weeds, per se, so, for me, it's really compelling to be able to take a CSV of up to millions of tweets, for example, or social media posts, or even from several platforms, which I did, looking at gender disinformation in the 2020 US election, and see the bird's-eye view of who's interacting with who. You can even play these network visualizations over time and see how interactions between different accounts evolve, which is something I think is really, really incredible.
Yeah, and how you represent the data is really fascinating. I thought some of the visualizations you used to show the relationship between inauthentic social media accounts and hashtags they use could easily find their way into a gallery showing or the High Line Park in New York City. So, when you were looking at this data, though, and really finding a way to group it and display it and just analyze it, was there something surprising that you learned in this process? And what was the biggest takeaway you got from this research?
Yeah, so I'll tell you a little story about a different piece that I did before I started looking at gender. So when Twitter started releasing massive, treasure-trove data sets of foreign, state-backed information operations-- to break that down, they were essentially releasing big Excel spreadsheets, CSVs of tweets that they have attributed to certain states as part of an effort by the state to put out these tweets. It's called an information operation. And what I did first was take the Russian dataset and essentially visualize these fake users, these fake accounts-- there's about 3,600 of them-- and I visualized their interactions with each other and with other users, so the nodes in my network were users. And, by playing this over time, I was able to see certain groups of users in this dataset essentially testing out different strategies to see what was the best way to seem like a normal person online, and this was in the run-up to the 2016 US election. So, for example, I was able to, through network visualization, see a cluster of users who were just retweeting trending hashtags, because I'm sure somebody in the Russian Internet Research Agency office thought, "Oh, maybe that's a way that we can look American before the election, and then people will follow us, and then [Ed chuckles] we'll have lots of people following us around the election," so it's different strategies you can see. And I published a Medium article where I expanded this to six different countries' information operations, and the surprising thing, from putting this research out there, was how much traction I actually got on this article, because people were able to, for the first time, see the information operation at a higher, bird's-eye view, not just one screenshot of a tweet or image that was published, or the avatar image of one account from one of these countries' operations.
You could see the whole thing unfolding in front of your eyes, a decade's worth of tweets in about a minute.
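[Editor's note: the "playing it over time" technique Alexa mentions amounts to bucketing interactions into time windows and rendering each window as one frame of the network. A toy sketch, with invented accounts and timestamps:]

```python
from collections import defaultdict

# Hypothetical timestamped interactions: (month, who, whom).
events = [
    ("2015-06", "troll1", "troll2"),
    ("2015-06", "troll2", "troll1"),
    ("2016-09", "troll1", "#trending"),
    ("2016-10", "troll3", "#trending"),
]

# Bucket edges by month so each window becomes one network "frame".
frames = defaultdict(set)
for month, source, target in events:
    frames[month].add((source, target))

# Replaying the frames in chronological order shows how strategies shift,
# e.g. accounts pivoting from talking to each other to riding trending hashtags.
for month in sorted(frames):
    print(month, sorted(frames[month]))
```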
Wow. And I know you were talking about, you know, some of this state-backed activity. I mean, some of your work has been published by NATO Defense Strategic Communications. So what could your research mean for other industries? Like, what lessons can, say, an IT professional, or even just an old social media manager, such as myself, learn from, you know, what happened at Facebook and other places?
I really think the lesson is being able to see a bird's-eye view of the data you have available to you, whether that's interactions between users around a certain brand, or a certain hashtag, or a certain account that you own. Seeing that kind of high-level view means understanding who's talking to who, and maybe, therefore, finding influencers who are engaging with your content that you could reach out to in the future, if they're kind of central nodes, if you think about it that way, really important people in your network. Being able to see those things and potentially, yeah, reach out to people, that's the value add there.
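[Editor's note: finding the "central nodes" Alexa mentions can be as simple as ranking accounts by how many distinct others engage with them, a basic form of degree centrality. A minimal sketch with made-up accounts and mentions:]

```python
from collections import Counter

# Hypothetical mentions around a brand's hashtag: (author, mentioned_account).
mentions = [
    ("u1", "brand"), ("u2", "brand"), ("u3", "influencer"),
    ("u4", "influencer"), ("u5", "influencer"), ("u1", "influencer"),
]

# In-degree: how many distinct accounts engage with each node.
# De-duplicating pairs first means each author counts once per target.
in_degree = Counter()
for author, mentioned in set(mentions):
    in_degree[mentioned] += 1

# Rank the most central accounts, i.e. candidate influencers to reach out to.
for account, degree in in_degree.most_common():
    print(f"{account}: engaged by {degree} distinct accounts")
```

Richer centrality measures (betweenness, eigenvector) follow the same idea but weigh an account's position in the whole network, not just its neighbor count.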
This episode of Innovation Heroes is brought to you by Women in SHI. Visit shi.com/wish to learn more. [Innovation Heroes theme music] Many people, including every SHI employee, know that SHI has not only been women-led since 1989, but it's currently the largest minority- and woman-owned business enterprise in the United States. But no one knows what that really means more than me. Almost 25 years ago, I interviewed with, and was hired by, a woman who is now VP of our Public Sales Division. On my first day at SHI, I reported to a woman. After being promoted, my first true sales mentor was the woman who is now VP of Global Sales. And for the past 12 years, I've had the privilege of reporting to Thai Lee, SHI's President and CEO. But even after all these years of working for a company featuring amazing female leadership, my co-workers and I know we must be as vigilant as ever in supporting women and diversity in both our company and the IT industry. That's why the WiSH Organization, or Women in SHI, was started. Today, WiSH strives to make the lives of all SHI employees easier by prioritizing satisfaction and retention, and you don't just have to take my word for it. You can check out real, honest SHI employee testimony at shi.com/wish. With a focus on recognizing and supporting confident leaders, WiSH works to increase awareness of the importance of diversity and inclusion in leadership positions, and continues to grow the Women in IT network within SHI, with our partners, and with our customers. If SHI sounds like a workplace you'd like to find support and success in, visit shi.com/careers. If your organization shares similar values, we have several opportunities for you to get involved with WiSH. For anyone who's curious to learn more about WiSH, please visit shi.com/wish. [Innovation Heroes theme music]
[gentle electronic music] These days, social media is more than just a communication tool. It's fast becoming an integral part of our daily lives. With Alexa's help, I wanted to dig a little deeper and get a little philosophical because some of the questions being raised around this topic seem to cut right into the psyche of not only our present society, but the one we're shaping for the future, too. It feels like a big part of the conversation around disinformation comes from a place of anxiety because, sometimes, it feels like the truth is dead, or it's lost in a sea of opinions, you know? What has this research, you know, made you think about truth itself? Is truth dead?
[chuckling] Well, what I'll say is, I think there's always been a sea of opinion, so I don't think that truth is dead. I think it's still out there. We just-- we really have better opportunities to find people who are like-minded to us, which in a lot of ways is very positive, of course. We're able to find communities for things that we're interested in, that we never knew existed, and we can be, of course, connected with people across the world, so I don't think truth is dead. I think that there's people, or groups, who benefit from people disagreeing with each other and being pushed a little bit further apart, but I personally try to follow a wide range of politicians and news outlets that kind of lean left and right in the Canadian and the British space, 'cause I'm living in the UK right now, to try to have a bit more of a holistic picture when, inevitably, I'm scrolling down Facebook and Twitter trying to distract myself from doing a little bit more work. [laughs]
[laughs] Right, right. So it's kind of a curveball for you. You know, as George Costanza from Seinfeld once said, "It's not a lie if you truly believe it," right? So... [laughs] how do you classify people who think that their facts are simply those that are detached from reality, right? 'Cause you're looking for people who are purposely getting disinformation out there. Is there a difference between purposely doing it and accidentally doing it? And do you have to account for that?
Of course. My friend, Ed, you have just touched on the difference between mis- and disinformation. So disinformation, of course, is misleading or false information that is knowingly shared that way, and then misinformation is the unknowing spread of it. So there's kind of a center: there'll be people who seed this information. Whether or not they think it's even true from the start, I can't quite say, but then, there'll be groups of people, when you see the narrative spreading through a network, who believe what they're reading, because maybe the person they heard it from, or saw it on social media from, is somebody that they trust in some way, so it's really that human trust that helps these false narratives propagate through our social networks.
So it's really cause and effect. The disinformation, hopefully, for those bad actors causes the misinformation that then gets shared, and that's how it explodes exponentially.
Yeah, that's their hope.
Right, right. [gentle, upbeat music] So we've been focusing a lot on, [chuckling] kind of, the gloomy aspects of things. Through the course of your research, have you found some positives out there while researching these topics?
In terms of looking at gender and disinformation, women politicians get targeted with these kinds of really rude, vile, nasty false narratives, and also just terribly abusive ones. Like, there's lots I really don't feel like repeating that I've seen on the internet. At the same time, these people who are public figures also have really great support groups of people who will respond to those false narratives or really abusive things being sent to the women directly. On Twitter, for example, there's been situations where I noticed there's rude things and false things being said about a woman, and then other people coming in and responding and saying, "How could you say that? She's done X, Y, and Z for the country, for the city, for whoever," so I'm lucky to have also observed people standing up for the women politicians when they receive this kind of terrible abuse and disinformation online.
That is great to hear, but it's exhausting, right? I mean, isn't part of the goal just to try to wear those supporters out?
That could be one of them, yeah, of course, because people like that just get too worn down, and so do the teams of the women politicians who stay in politics. I've read this in reports as well: the teams of women politicians will wake up every morning, kind of, during a campaign in the really hot, hot, hot period, and have to scrub the social media profiles of their members of parliament to make sure that all of this vile abuse is deleted every morning, and that's exhausting. It's exhausting work. It's a massive burden for women and their teams and their supporters to have to be putting up with. In the report that I wrote with some colleagues, including Nina Jankowicz, who is a well-known scholar of state-sponsored disinformation and gendered disinformation, we made a lot of recommendations to platforms, and one of them was as simple as allowing for batch reporting of this abuse on social media. So, say you, as a politician, or just as a person on Twitter, put out a tweet, and you start getting lots and lots of responses. At present, you'd have to report every single one of them. One thing that could be fixed here is that you could send the tweet and all of its replies over-- again, in the context of Twitter-- over to them and say, "This whole tweet has gone bad. I'm getting hundreds of responses of people calling for my death in various, absolutely terrible ways," things like that, that we've seen online. Changing the way that we even report these things doesn't get to the core of the problem, but it at least helps the women and their teams and their supporters who are dealing with this, sometimes every day, especially during the campaign period, or if it's somebody quite prominent in another field.
Over the course of our conversation, you mentioned 2020, where Kamala Harris was a large target here in the US, and then 2016, where, obviously, it was Hillary Clinton. Was there an evolution between 2016 and 2020, in terms of how people were trying to counter it, you know, on the support side?
You know, the terrible abuse that's online towards women politicians who are aiming for the top seat, or almost the top seat, persists, and we need to understand it better and develop ways for women to feel safer on social media, so that this doesn't happen anymore, and women aren't afraid to step up and speak out against it, or even just run in the first place. We don't want this chilling effect to kind of run down the line where younger women are deciding, "You know what? This isn't worth it. I just saw the replies to Kamala Harris or Hillary Clinton, and I don't want anyone sending anything like that to me," so those are the things that we need to really watch out for to protect our democracies, frankly.
So this is a big question, but what do you think companies and/or users need to be doing better?
To put it short and sweet, stop putting profit over safety. The focus needs to remain on making sure that these places are safe for everybody, not just the types of people who are sitting at these top tables. Think of somebody who speaks a language that a social media platform supports, but the company doesn't have the experts to help people from that country if they're facing abuse or disinformation on the platform. It has to be a safe place for everybody. And I think that it's not the platform itself. It's people who want to spread this vile, false information, who find it interesting and want to keep sharing it. It's us. So as they come onto these platforms, if the platforms aren't designed in a way that puts in friction, say, if you're about to post something that contains abusive terms, for example, if that design for safety isn't in place, then it's just gonna persist there. Polluted information will spread across platforms.
We've talked a lot about Facebook thus far in this conversation. I know that your previous research, you know, focused on Twitter and other platforms, as well. Do they have different characteristics in terms of how they go about their business?
Research has shown that people on different platforms have, kind of, just different things that they want to talk about, to some degree, so on Reddit and some alternative platforms, there's something called the "manosphere" that people have researched, which is people who engage in really misogynistic discourse and conversation, and that tends to occur more on these kind of alternative platforms. And also, as Twitter and Facebook have been de-platforming, so deleting or shutting down the accounts of large numbers of people who've been engaging with disinformation, those people will shift to other platforms. But, of course, people can make new accounts, and it can always start cropping up again. My colleague, Nina, calls it "whack-a-troll". You hit one down, they're just gonna pop up again somewhere else. So trying our best to introduce friction when people are about to post these things, having better reporting mechanisms, and having more experts who can assess the content that's been reported at speed so that it doesn't stay up for terribly long, those are some of the things that are really important to think about.
Yeah, indeed. What does your research mean, or what do you hope it could change in terms of the future of disinformation and how it intersects with social media?
I really hope that my research sheds light on how women, in particular, are impacted by disinformation, and also, what I expect to find is a lack of accountability on the side of platforms that aren't deleting this quickly enough and aren't dealing with the problem. So hopefully, in the end, I can possibly even give women the tools to feel safer online and step forward into the public light when they feel that they have something to say, because that's what they have the right to do, no matter where they live, or no matter which platforms they need to be on for their careers, whether that's journalism, politics, or anything else. [gentle, upbeat music]
Absolutely, absolutely. No, we really appreciate having you. So, Alexa Pavliuc, doctoral researcher at the Oxford Internet Institute, thank you so much for being on Innovation Heroes. Really fascinating stuff, and thanks for your time. [gentle music]
Of course. Thank you so much for having me. [gentle, upbeat music]
We know that IT leaders, and the rest of us, too, have a ton of data coming in at all times and from all angles, but even just by listening to the advice and positivity that Alexa has to offer, I hope that you can take what you've learned, be inspired, and do your part as an individual and a leader to help raise your own game. Make the most of what the digital world has to offer, but make sure you're thinking critically about the tools you use and keep working to ensure that everyone sharing the digital space feels equally welcome and safe. [gentle music] Thanks for listening to this episode of Innovation Heroes. [Innovation Heroes theme music] Next time on the podcast, I'll be speaking with Andy Lapsa, Co-Founder at Stoke Space Technologies Inc. Andy spent the last decade helping to design the modern rocket, and he's got some big ideas about what the future of commercial spaceflight could bring us, but first, he says we need to focus on sustainable engineering to make it feasible, and soon, because the fate of the planet might just depend on it. So tune in again in two weeks. You won't wanna miss it.
[Innovation Heroes theme music] If you enjoyed this episode, then consider being our hero. Smash that like and subscribe button to Innovation Heroes, wherever you get your podcasts. Innovation Heroes is a Pilgrim Content production in collaboration with SHI. Our producers are Tobin Dalrymple and Jessica Schmidt. Our associate producer is Olivia Trono, with production assistance from Carmi Levy, Ronny Latimore, Jane Norman, and Amanda Scheffer-Cavanagh. I'm your host Ed McNamara, and I'll be back with another amazing story in two weeks.
If you're an SHI partner, prospective customer, or SHI employee looking to learn more about WiSH, visit shi.com/wish today.