Google | Apple | Spotify

Rhys Lindmark: Hello, listeners. Today, I'm excited to chat with Isabella Garcia-Camargo. Isabella works at Stanford Internet Observatory as a research analyst and is the project manager for two amazing projects that I love, the Election Integrity Partnership and the Virality Project. Isabella, thanks for being on the show, and welcome.

Isabella Garcia-Camargo: Thanks so much for having me.

Rhys: Yeah, excited to dive in and chat. It's been cool for me to see, as the election and various misinformation and disinformation things were happening, the EIP, this Election Integrity Partnership, pop up, and to see you herding all the cats there was really interesting. So let's start with: could you tell us and the listeners a little bit more about what the EIP was and what your role within it was?

Isabella: So I think the biggest thing about the EIP, and the cornerstone to how we're thinking about the model, the improvements, and what it meant, is that it was very serendipitous. It kind of came up out of nowhere. There was a group of five Stanford students who went to work at CISA, which is the Cybersecurity and Infrastructure Security Agency, the agency that really oversees the security of US elections, and I was one of those students.

I had gone there between my undergrad and master's program just for an internship, thinking about democracy. Generally, I like elections. So I showed up and was really ready to do whatever. Initially, I was working on some security products, and then, talking to election officials, I started thinking about, okay, I've been working on disinformation throughout my college time. I really like the problem. What are the election officials thinking about as they go into the 2020 election? How do they think they're actually going to deal with this?

Because I'd heard about all the disinformation in the 2016 election. People were talking about it for 2020, but the bottom line was, after a couple of weeks of discussing with the election officials, I realized that there wasn't a huge plan, per se, especially in comparison to what I knew we were capable of. Being at the observatory, I had worked on a lot of takedowns. I knew the tooling that was out there, but this tooling wasn't in the hands of the election officials, and frankly, they just didn't have the time to think about this. Running an election is extremely difficult. There are a thousand and one things you have to be thinking about all the time, and they're extremely resource-strapped as well.

So the idea came up of: what if we had a war room which would be monitoring for disinformation in real time and then providing these insights to election officials, so that they don't just know when a machine breaks down, they know when people are talking about a machine breaking down, for example, or when people are talking about this specific polling station not allowing XYZ voters to come in and vote. It seemed like a pretty core part of this whole mission of protecting 2020 and protecting the legitimacy of the election. So that's the origin story, and the long story short is I brought it up to the people I work with, both at CISA and at the observatory, and we spun up the EIP about a hundred days out from the election. I think that's to the credit of the leads of the organizations that joined the partnership.

Well, I think the most incredible thing was how we were able to make it happen in that time period, and how much people put aside the incentives that they had in their research, in their company, in their think tank, to come together for this goal of: let's help these election officials. Let's help this organization that runs our elections think about this massive problem that's coming at us.

Rhys: Yeah, that's cool. I think, like many good things in the world, it was an organic thing. It's like, "Oh, y'all aren't thinking about this too much, but it's possible to put a ton of effort in here to pre-bunk and get ready for the whole issue that's starting to emerge." And for me, as election time was spinning up, I found myself checking in with the EIP and being like, well, the current narrative is, okay, stop the steal, blah blah blah, so it was good for me to check in with you all. But how did you all determine impact? Were you able to help the election officials pre-bunk things? Would X more people have died in the insurrection, or whatever, than otherwise would have? How do you think about the impact?

Isabella: This is probably one of the most frustrating things about working in the disinformation space. I come from a computer science background and was on my way to do more PM-type work in industry before the disinfo bug got to me. So for me, metrics and determining impact are extremely important. It's how I work. It's how I determine what I'm doing day-to-day, and in the disinformation field, this is basically impossible. There have been research papers and whole doctorates given out to people still trying to understand the impact of disinformation in the 2016 election. Not to mention, how are we determining impact in real time as we're trying to find narratives in real time?

So this was extremely frustrating while we were running the EIP. The researchers probably weren't thinking about this every day, but it was something I thought about every single day: is what we're doing important? Does it matter? What's the bottom line here? I remember talking about this to Alex Stamos, who was the lead for the Stanford side, and I just said, "Hey, I'm super worried. We're writing all these tweets, we're putting out these blog posts, there are 100 people working together towards this effort, but I feel like people still believe that the election is being stolen."

Rhys: Right, like some random Twitter person who's super into this crap. Is it reaching the folks it needs to reach, or whatever?

Isabella: Also, because we're putting it out on Twitter, who could possibly be listening on our Twitter page? Is someone who genuinely believes that their vote was stolen going to go follow the EIP Twitter? I didn't think so. I think Alex brought up a couple of points which were really, really key here.

One of them was that the work the EIP did was not to go try to get to the deepest rungs of QAnon followers and try to pull them up. That's a completely different effort, a completely different ball game. So I started thinking about the EIP more as an offensive tactic. Disinformation moves very quickly, and oftentimes the institutions that we have are reactive towards it.

What we were trying to do with the EIP is start to make that asymmetrical playing field a bit more level. So just bringing the tools to the people who could potentially actually reach those voters and are in the position to actually put those counter-narratives out. Just give them the information, get them that narrative. Not 10 minutes after it starts, but at least within the day, within the week, not two months later.

So yes, we put out a lot of blog posts, we put out our Twitter, and all of that was really, really important, but it was our relationships with the key stakeholders, especially the election officials and the key civil society groups who already have trust in their communities, that I think were the most valuable part of the EIP. And obviously, when you put something out on Twitter, the followers we have were able to communicate it to their families, and that interpersonal connection is also very important, but the tweets and the likes weren't everything. It was really about the coalition, the partners, the stakeholder relationships that we were able to manage throughout the election.

Rhys: I'm just personally curious, because as 2020 started to happen, it was, okay, this could be a little sketchy; 2016 had some issues. Okay, 2020. Oh, not too much from Russia yet. Okay, that's pretty good. But oh, God, Trump's being a little crazy here. Oh, boy, stop the steal is becoming a thing. Oh, boy, now there are riots at the Capitol. So what was it like for you, being on call there?

Isabella: The period between August and November, until after the election was called, was just one of the craziest periods of my overall work life thus far. It was all the time. It was constant. You couldn't think about anything else because you were seeing the slow burn. It felt like an avalanche in slow motion. In August, we could have told you that January 6 was going to happen. It was just seeing the accumulation of evidence and seeing how these masses of online users were behaving.

Well, it was organic. People were finding the evidence themselves, framing it and networking with each other to understand and build up these conspiracies, and it felt pretty helpless at a lot of points. It was just, what am I going to say to these people? There was a scaffolding already built up there. No matter what, the underlying assumption was: we're looking for evidence of fraud, and we're looking to show how this is going to be a stolen election. So yeah, the way that we worked at the EIP is that we had 12-hour shifts.

People would be online all day, so it was almost like business hours for disinformation monitoring. There were on-call managers, and I was one of them. There were tiers of different analysts, and we were really just trying to operationalize to keep up with this huge flow of disinformation that was coming through. We started that in August and ran it through until the election was called. So yes, a lot of hours looking at disinformation, a lot of hours just thinking, we're overwhelmed with the sources we already have.

But where else do we need to be looking? How do we preempt this? How do we make our turnaround a little bit faster? And yeah, an incredible experience, but I would characterize it as a slow burn, which was extremely frustrating at times.

Rhys: Well, there's something interesting about, as you start to become aware of what the information ecosystem is doing, what the main memes are, the main ideas that are forming from it. As you said, you were able to see in August, and everyone was able to see this, that fraud was this underlying meta-narrative that was just going to come up, and it was coming up in all these different ways. You could see it, you could predict it. Maybe not all the way to January 6, but you could see that something was going to happen.

Actually, I guess my question is, let's take a step back here for a second. We could chat about doing the counter-narrative work and the proactive work, the pre-bunk work, whatever. But how do you view these information ecosystems more generally, and the ways to make them more fair, or more good, or less full of misinformation? How do you view the system as a whole?

Isabella: How do I view the information system as a whole? Maybe I can break it down a little bit. So I'll start with that infrastructure point. I was barely starting my career in 2016, but my sense of that election was that it was a lot more focused on this foreign interference piece.

And we didn't see then what we saw in 2020, which was a whole machine at work. It was kind of the evolution of a small child in 2016 coming into maturity in 2020. It's like a well-oiled machine that could take a small narrative, an idea that a Sharpie could potentially bleed through a piece of paper, and produce out of that a claim that people would be chanting outside a polling station. That production line is new, and it was shocking to see it happen during the election: how large-scale influencers were listening to what the masses were saying, what they were concerned about, how they were feeling.

They would pull up those ideas and then echo them back down. I think it's Kate Starbird at the University of Washington who's really driven this point home, and she has a lot of really good images on Twitter that she's been sharing about that dynamic. We call it the participatory nature of disinformation. It's not just that Trump was tweeting that the election was going to be stolen, and it's not just that one person in Maricopa County thought that the Sharpie was going to bleed through the piece of paper. The story I'm referring to is Sharpie-gate, for anyone that doesn't know it yet.

Sharpie-gate was the theory that specifically Republican voters at polling stations were being given Sharpies, which would disenfranchise them because the voting machines would not count their votes. But the point was, it's not just someone at the top shouting things down to the users at the bottom, and it's not just users talking among themselves; it's the synthesis of these two. It's the masses and the populations looking for evidence and creating these stories among themselves, some mid-scale influencers pushing them up to the top, and then large-scale verified influencers taking these narratives and echoing them down to the bottom, in this vicious cycle that was going on over and over again and just churning out these narratives.

So that was, I think, one of the most surprising facets of the information space. It was certainly created far before the 2020 election, but it was activated in this incredible way during the election, and it's not a bug, it's a feature of the information ecosystem that we're in, because it's rooted in the network. One thing I think that...

Rhys: Let me pause you for a second there. The 2016 version, as you said, that was when you were still in college, before you were deep into the work you're doing now. In 2016 I was doing music education, so I wasn't really that into it either, but it sounds like, as far as both you and I understand, it was nation-state stuff. It's old school, top-down, centralized: Russia is like, oh, let's get into the US information ecosystem through these new social media things. That was one of the main stories, and it was powerful in a variety of ways. And this one, as you said, is more native. There's this machine, and the interesting thing, I like you calling it a well-oiled machine, is that it's taken a good amount of time for these networks to come into existence. Now you can imagine them as networked beings that have trust between them, and lower transaction costs between them, and that allows them to have this well-oiled machine that, as you said, moves things from top to middle to bottom through these cycles and can take anything. It has lots and lots of small experiments running at the bottom level, and then one of them, Sharpie-gate, is powerful, and it gets the #sharpiegate hashtag and propagates through the system ridiculously quickly, because the system is kind of optimized to take things at the bottom and push the most powerful narratives through. Does that sound...

Isabella: That last sentence you said, that's virality. That's the point of a lot of these social media algorithms, and honestly less so on Twitter. But think about a social media platform like TikTok, for example, where it's not that you go around and follow verified people all the time. A lot of the time you're seeing some random person with 10 followers who bubbles up on your feed, and it's pushing information at you.

So I think it's important to note that the fact that this machine exists has to do with how the product is built and the veneer of authority that is created within this ecosystem. It's not broken English coming out of some Russian bot. It is a verified influencer with over a million followers, who's also appearing on cable news. It's not just Twitter. It's their presence across all these different social media platforms, on cable news, sometimes as elected officials. So first of all, as I just said, it's part of the product.

But it's even beyond the product. It's people who in our society have a lot of trust, who have been given this trust and authority, and now they've found their slot within the machine. So I think that's the problem I'm most concerned about, largely because of its continued existence.

It's not something that I see a clear through-line for how we solve, because people are free to choose who they put their trust in, and that ecosystem has proven to be very, very effective at spreading pretty malicious claims.

Rhys: Thinking about the solving side: we have these well-oiled machines that are really good at raising up viral content and then spreading it within the system, no matter its healthiness for everybody, or its truth, or whatever. I'm tempted by the, oh, let's add friction back into the system, or let's do some mediating consent stuff or whatever. How do you think about ways to positively shape these viral beings, this weird information ecosystem?

Isabella: I think there are two things here. One is along the lines of what you just pointed out: friction. Recognizing that the fact that these organizations, institutions, machines, I've said a lot of words at this point, the fact that this entity exists is due to some of the platform affordances, that being that the algorithm works in their favor because they are these large celebrities. In our work we called them repeat spreaders, because no matter what they said, they could get something trending on Twitter within 30 or 40 minutes, a narrative that started within this system.

Okay, so what are the platform affordances that continue to allow this system to perpetuate narratives, and is there some friction that we can add in? How do we think about what that looks like on-platform? I think there's also a lot of work to be done on the offensive side here.

So how do we think about it? The problem that we're facing is networked. It is vicious, it moves very quickly. It's always been the case that disinformation is created far more quickly than the rate at which we can fact-check it. So we need to be thinking about this problem offensively. What are the tools that we can give the trusted sources of information in order to start offensively working against the bubbling up of these narratives? In the 2020 election, the most promising tool was a rumor control page.

I think we've chatted about that a lot since the election, and we are now promoting it for vaccine misinformation response. But the problem of disinformation is a network problem. It is organized. It moves very quickly, and you have to have a networked, quick-responding solution to start to try to even that playing field.


Rhys: So that rumor control page, I'm just looking at it now, it's part of the CISA work. If I understand CISA, for listeners, it's a part of the government that works on election integrity and misinformation. Is that roughly correct?

Isabella: So CISA is part of DHS; it's an agency under DHS that is in charge of all of our critical infrastructure. That's everything from highways to electric grids, and elections happen to be critical infrastructure. Since 2017, they've had the mandate of: okay, elections, you're there to protect them. They have focused a lot and improved a lot on the hard cyber part, so they secure our voting machines and how the election is actually run, but they have also added to their mandate the Protect 2020 mission, a specific focus on the integrity of the election itself. I think that's a really interesting addition. But anyway, just to clarify, CISA does a lot more than just elections, but it has come under a lot of scrutiny in the past couple of months, obviously because of the election work.

Rhys: So I'm looking at the rumor control page, and I'm looking at one of these rumors that says, "Votes are being cast on behalf of dead people and these votes are being counted." And the reality is that, well, it's not really true. And I see some of the power of this page: you could look at it and see, oh, okay, here are some negative rumors, and here's the reality. But I'm thinking about stop the steal, the #stopthesteal. I'm not sure if that is stopped or hampered at all by this rumor control page.

Isabella: Absolutely, and I think that's the core point here. You can start to think about these things as having levels. Nobody is born out of the womb believing stop the steal. Stop the steal is a scaffolding. It is the overall belief that the election itself is not going to be legitimate. And for these scaffolding problems, there are solutions specific to those scaffolding problems. That is the most difficult problem.

You're talking about people who do not believe in the authority of elected officials and the institution of government itself. So you're not going to solve that with a rumor control page. But there are ways that people get pulled into that underlying belief, into the scaffolding. You could have gone to vote in Arizona, and you could have received a Sharpie, and you could have thought everything was okay.

Then you go online and learn a little bit more about the Sharpie-gate narrative, and you talk to your friends, maybe some friends who have already completely lost all faith in that centralized authority, and they pull you into the scaffolding. So a rumor control page is helpful, I think, for those narratives that bubble up and then pull that questioning population into that underlying belief of: everything is a lie, I'm skeptical of anything that's coming out of these institutions.

The election officials are corrupt, the whole system is broken, et cetera. So I guess what I'm trying to get at is, there's a spectrum here, and you can't solve all of it with a rumor control page. You have to break the problem down a little bit. There are people who are already radicalized, and there are people who completely have faith in every elected official, and on that spectrum, targeting the questioning population is a good place to start. But those solutions are very different from what you'd need for people who would look at the rumor control page and say this is a bunch of BS.

Rhys: I think it's interesting to hear you use the term scaffolding a good amount. The book I'm writing right now is all about memes, and so I would call a scaffolding a memeplex. It's a lot of different things that could either be scaffolding you up or pulling you down, depending on whether you want to go up or down, and it's a reinforcing set of ideas that keeps pulling you deeper into it. How do you think about scaffolding? How do these scaffolds work?

Isabella: When I say scaffolding, I think what I'm trying to get at is that it's like a framework. It's the lens through which you view new information that's coming to you, and we all have these frameworks. It's not that just half the population, or a tenth of the population, has one framework and everyone else is in agreement. No, everyone has a framework through which they understand facts. It's what biases whether or not you'll believe A or B or C.

So what do I mean by scaffolding? Some of these frameworks get to the core of some really key institutions, such as whether or not you believe that government is a functioning part of society and that it can run an election in which your vote will count. So that was a specific scaffolding that was primed, a specific framework coming into the election, where more and more people were drawn into this idea that everything is broken, the entire system is rigged, and no matter what happens on November 3rd, it will have been cheated.

So when you have that specific a framework surrounding one event, there's just this network of people pulling you in and bringing you more evidence. It's like, oh, you see this specific barcode on the outside of your ballot? You should interpret it in this way. It also stops people from looking for more evidence. It's like a crutch. You don't have to look for more evidence, because you have this framework through which you understand the world, when it could be that the barcode is just part of how the mail-in ballot works and how your vote gets counted. The framework allows you to take shortcuts and align things into your view of the world much more quickly. Again, we all have these frameworks, we all have these lenses, but it was really, really interesting to see a really specific one created around the election, one that tied in so closely with online narratives, and to see how that framework was created and reinforced in online spaces.

Rhys: Yeah, got it. That narrative of, hey, you can't really trust the government to do anything, so anything that you see from the government is, oh God, I'm not trusting this random barcode, or, oh, here's a kitten, well, that kitten means the government did it, Epstein did it, you know? It's interesting. It makes me think that a crucial meta thing all people need is a scout mindset instead of a soldier mindset, to actively be trying to understand their biases and actively looking for more evidence. And I think the difficult thing with our reality now is that a lot of folks have this thing that says, oh, I'll just take it as true.

Instead of actively trying to change their own beliefs or whatever. When I had Renee on the podcast a year or two ago, I noticed that both you and she, and the information ecosystem folks generally, talk a lot about how platforms work, what the friction of platforms is, and how to counter disinformation and misinformation. But there's also this other perspective, media literacy, which, instead of focusing on the information or the platforms, focuses on the people. What do you think about that? How much impact can we have by kind of, quote-unquote, upgrading the people themselves instead of trying to change the platforms?

Isabella: That's a very packed question, so let me try to break it down. To your first point about people having a scout mindset: super, super interesting, and I would agree. Every single time I'm going to news sources or trying to understand a new international problem, I really try to bring myself into that mindset of, okay, I need to read XYZ sources, think about the problem this way; if I were trying to prove myself wrong, what would I say? Et cetera. Sure, that's great.

But the funny thing about this scout mindset, and telling people you should question everything that you see: that's exactly the conspiratorial way of thinking about things. It's question everything, nothing that you see is authoritative. So this is a really, really difficult problem, because that framing is actually where a lot of these more conspiratorial communities already live.

So it's a difficult question: what are the upgraded authorities in this new information ecosystem? I think Martin Gurri's book The Revolt of the Public was the first time I really started thinking about it in this way. One of his main points in the book is that the old authorities are not coming back, and now we have this public that is moving very quickly, coming to conclusions very quickly, nihilistic about everything, breaking everything down, questioning everything.

So in this new landscape, how do you think about authoritative information, and what does that look like? We're not going back to nightly news, where one person tells you, this is how to think about this international problem, sounds good. So we have to be really creative there in terms of, okay, so then what? Because what we're doing right now is not working out super well.

I don't feel great about the idea that demigod celebrities who are verified on Twitter are the new authorities. So what's a more creative solution here? I think the second part of your question was about upgrading the people. With that I totally agree, and media literacy is very, very core to this solution, but we've got to think about what priors we're coming in with now.

We can't just X out the population that currently exists and start over with a new population. But we can integrate this into public school education and think about how we teach people how the FDA works. People don't need to know how vaccines work to trust the vaccine. A lot of these conspiracy theories are about, oh, Fauci just woke up one day and allowed me to inject myself with Pfizer. No. If you understood how the vaccine development and approval process works, a lot of these conspiracies lose their legs.

So teaching how institutions work: how does an election get run? How do these rules come about? How can I change these rules by engaging in civic processes? That is extremely important. Understanding how the institutions that govern our lives work is something that nobody teaches, and it would solve a lot of these problems. But that has to be done in concert with: okay, there are people who are already very, very far gone. What are we going to do about that as well?

Rhys: Yeah, that makes sense. It's the classic: how do we help the folks of today, and also how can we make sure that everybody is upgraded for tomorrow? I like what you said about the demigod celebrities; that's a hilarious way to put it. It's like, okay, whatever Elon Musk says, that's what I'm going to do. Hopefully, the new influencers are kind of network influencers, where you look at the ecosystem of people in the climate tech space or the epidemiology space, and you have a feed that incorporates a lot of their thoughts together, and then that network of ideas is kind of the influencer.

I have a question for you about the Virality Project, which is the other big thing, and you were just starting to talk about some vaccine stuff and Dr. Fauci. Tell us a little bit more about the Virality Project and what you all are trying to do with vaccines and misinformation.


Isabella: So the Virality Project, essentially: we finished the Election Integrity Partnership, we had a report to write, we realized that another huge disinformation crisis was on its way, and we got the gang together again and set up a new partnership. That's essentially how it came about. A lot of the people who worked on the EIP are now working on the Virality Project, and we're trying to bring what we learned from the election work into this space. It's everything from how the team works together.

We have academics working with for-profit companies working with think tanks. What are the different incentives, and how do you align everyone in this long-term marathon of a research project? Because at least with the election, we had an end date, but with the vaccine, we have no idea when we're going to be done. We've been going since January, it's May right now, and we'll be going through at least August. So it's understanding, especially, this model, what we internally call a center of excellence model.

A clearinghouse for this kind of information. It shouldn't live in an academic project forever. I spend a lot of time thinking, oh my God, we're going to run out of budget, oh my gosh, XYZ. Ideally, this work of real-time analysis of disinformation, working with the social media platforms, the civil society people, the government people, the academics, that whole framework, will eventually be able to be put in a more long-standing home. But this is just our V2, and we're trying to understand, okay, what didn't go well in the EIP?

How do we make it better? Who else do we involve? What other partners do we need? And all the time making sure that we're doing the research and gathering the facts on, okay, what do people believe about the vaccine? What are the communities that are most engaged? Are the tactics the same? Are we seeing new tactics to spread vaccine disinformation versus election disinformation? So it's been very exciting, really great team, a smaller effort than the EIP. We had 120 people at the peak, and for the Virality Project we've kept it slimmer because it's a longer project, so it's about 40 to 50 people. We're just trying to keep learning, trying to advance this model, and understand how we can have an impact right now and also think through this model for the future.


Rhys: Yeah, that's interesting. What I would pull out of that is the model. You call it the center of excellence model, or you could call it a network-of-networks model or whatever; it's kind of a new institutional form. It's the partnership and the project, or this happened at the beginning of the pandemic where folks were creating, there's like the hashtag #maskforall, and it was groups that were aligned around a shared goal and kind of coming from a bunch of different spots.

So it's like, what can this new kind of institution look like, and how do you run those kinds of institutions? So it's cool that you all are thinking about that. On the other side, object-level, on the ground: what kinds of patterns are you seeing in the Virality Project? Are they similar to an election or not?

Isabella: Yeah. I think this is one of my favorite parts of my job, just getting to think about how do you track this information in the most efficient way? It's different from long-term academic research. It's also different from being in industry and trying to get a product out really, really quickly. So it's kind of between the two, and one specific example I can give is: in the EIP, one of the problems we had is that it's a whack-a-mole problem. Every day you can see a new post about someone who is dead who supposedly voted.

So we spent a lot of human power analyzing all of these things, putting a case study together, sending it to the platforms. One of the big upgrades I wanted to figure out for the Virality Project is how do you track narratives, not just incidents? How do you track the formation of these large-scale ideas, which we talked a lot about at the beginning of the podcast, and how do we operationalize that collective sense-making process? So to that end, we developed some tech tooling for it. We modified our workspace management system to track narratives, not just incidents.

We had to retrain the analysts. So it's really interesting work, because you have to understand a lot about how disinformation works, you have to understand a lot about people and how they understand the ideas themselves, and the tech behind it is really, really cool. How do you understand when one post about a pregnant woman getting the vaccine and having a side effect matters? One week, that's a really interesting finding because it's signaling the surge of that narrative across different online communities. Another week, it's just another post. So all of this context is important, especially for our stakeholders: telling them what's important, what do you need to counter right now. And that's a hard decision to make, because you don't want to spread misinformation by going out and debunking something that's not that important.

So the stakes are high, but the problem is really interesting, and it's really great to work with people in this space because I've found that everyone is super interested in the problem. It's a very novel, new, growing area, so anyone can get into it right now, because there aren't a whole lot of people gatekeeping it.

Rhys: Yeah, so whoever you are, get out there and start researching misinformation and disinformation. What tech platforms do you use? I know you all work with Replica. If I want to do this and I want to track the ongoing narratives within the anti-vax world or whatever, how do you all do it?

Isabella: So we use a lot of open-source investigation, and we do have some custom tools or API access just from being an academic institution. So we use a lot of CrowdTangle, which is a platform you can use to monitor Facebook, Instagram, and Reddit. We have the Twitter Enterprise API, just so we can get a whole lot of Twitter data. But there are non-private versions of all of these tools. There's Torque, which you can use to get the last seven days of Twitter data, which is really the only thing that's important in many of these cases.

Bellingcat has a really good OSINT handbook that covers a lot of the tools that we use. So I would say it's maybe 60/40 open-source tools versus specific licenses that we have. But I would recommend everyone get the CrowdTangle plug-in. At least that's something you can get on your Chrome, and for any link you have, it allows you to see what engagement it's gotten across Facebook over the past forever. That's really, really interesting information. You can see, oh, wow, this link that I'm on, actually a whole bunch of anti-vaccine pages shared it yesterday. So maybe, when I'm reading it, I should think about who's been promoting it. Anyway, there are a lot of things out there.

Rhys: That's essentially what I was looking for: CrowdTangle. I don't have CrowdTangle on my Google Chrome right now, but I will after this call. So thank you for that. I want to get into wrap mode here and ask a couple of final questions, but first some overrated/underrated questions: I'll just say a thing, and then you can say whether you think it's overrated or underrated.

The first is how hard it is to moderate video. Is that overrated or under... 'cause it's a big theme; listeners, I was like, oh, man, TikTok and YouTube, it's hard to moderate them because it's not text. Is it overrated or underrated, how hard it is?

Isabella: Well, I don't feel super qualified to give this rating because I don't have the toolset and thousands of engineers that these platforms have. How poorly these platforms are moderated, that's underrated. The amount of work that the platforms claim it would take, I feel like that's overrated. But surely it must be a very difficult problem; they haven't figured it out yet. I mean, especially live streams. We talked a lot about live streams in the election, and that was a huge problem. So I guess it's a mixed bag here.

Rhys: So it sounds like the amount of impact it has is underrated, but the amount that the platforms say, oh, but it's so hard to do, is overrated. They're overrating...

Isabella: They have transcripts of all of the audio, at least I think; I don't know. But yeah, surely, it just seems like something more could be done.

Rhys: The other one is kind of what I would call an elephant in the room here. I mean, this isn't totally true, but it's moderately true: the left-versus-right side of this. A good amount of misinformation and disinformation is coming through the right, and at the same time there are various things on the left. I was just reading a nice article in The Atlantic today about how all the folks on the left are, like, not following the science and making sure that kids can't be in schools or whatever. How do you think about the amount that flows through primarily right-leaning ecosystems? Is that overrated or underrated?

Isabella: Okay, in the election: not overrated. In the election, we did see, and you can go check our report for how we conducted the methodology, that the vast majority of the election fraud narratives were right-leaning and spread in right-leaning communities. However, one thing I'll say here to caveat that is this idea of scaffolding: this idea of how much you are believing random Instagram stories to get your factual evidence of what's happening in the world. And I think that's vastly underrated.

Especially people in my generation, the Gen Zers, and I'm culpable of this as well: how much we are asked to believe some random person on Instagram posting a story about XYZ issue and how they can explain it in 10 Instagram slides. You cannot explain fundamental issues and tenets that way, and it's priming the people in this new generation of very online people, students, et cetera, priming ourselves, to be so quick to jump to conclusions. It does not bode well. So this whole idea that the right will always be a disinformation machine and the left will never get there, I would caveat that with that observation.

Rhys: Yeah, that makes sense. And as a final question here: for you as a Gen Zer, and for me, I'm 29, so I'm just at the snake person thing. What do you think that you, as a Gen Zer, bring to the misinformation and disinformation world that perhaps older folks, 30 or 40 or whatever, don't quite understand? What is the unique perspective that Gen Zers have that older folks might not have?

Isabella: I hadn't thought about this one. I mean, it's growing up with the technology. Surely there has to be some difference there, and just the empathy for understanding why online communities matter so much. Sometimes, talking to my parents, it's just, oh, well, it's just people saying things online. But if you grow up in a way that the majority of how you interact with your friends is online, it's different. I think it will become important in the coming years to have that empathy of why people believe what they believe and to understand online communities really well. Trying to explain Discord or Twitch to someone who doesn't understand that subculture is going to be very difficult, and a lot of the time I think that older populations see some of these things as a joke, like the wallstreetbets stuff. It's just a joke, and it's just silly teens doing XYZ. But no, this is actually an incredible phenomenon, and I think that you need that kind of dual perspective to understand both the subculture as well as the impact that it could have on institutions and on the way that we conduct our lives day to day right now. So I think that's maybe where I would land on that.

Rhys: It's like they've never even been to a subreddit before, so they don't really get it. So there's a lot of catch-up work. Okay, well, thank you again for your time today, Isabella. And thank you for the work that you're doing with all of these partnerships and projects. Is there a place that folks can go? viralityproject.org, maybe?

Isabella: Yeah, viralityproject.org. Check us out.

Rhys: And where is it possible to find you?

Isabella: Oh, I have a Twitter. I don't remember my handle. Yeah, it's @igarciacamargo; check it out on Twitter, but really, follow the Virality Project on Twitter. The handle is @viralityproject. The team's going to continue doing a lot of great work through August, and then we'll have a big final report.

And yeah, always open to chatting and especially chatting about finding these super interesting and exciting spaces that are growing really quickly. I think a lot of people are interested in disinformation and don't really know how they can get into the space and work on these problems, and there's a lot of opportunities right now, so pop into my DMs.

Rhys: Nice, exactly. Slide into her DMs if you're looking to do misinformation and disinformation work. With that, Isabella, thank you again for your time, and goodbye, listeners.