Welcome back to the Zero Hour brought to you by SafeGuard Cyber. I'm George Kamide.
Hi, I'm Ashley Stone.
And today's guest is Alicia Wanless, who has returned. We talked to her late last year on one of our more listened-to episodes.
It was around disinformation, misinformation, and really the difficulty of pinning down the distinctions for influence operations. That was mostly about geopolitical election interference, things like that. But she has returned to talk about her new paper and her new position at the Carnegie Endowment for International Peace.
She's incredible. She does a great job of distilling a truly complex conversation into really digestible bits, especially as it relates to public health information.
Yes. And if you're looking for easy binary answers, this is not the conversation for you. It's complicated. It's nuanced. We love it. And without further ado, let's get into it with Alicia Wanless.
Welcome back to the Zero Hour, Alicia Wanless, so good to have you again.
Thank you for having me back. I'm happy to be here.
Yes. So many things have changed since last we spoke, including some definitions of disinformation, misinformation, influence operations. We'll get into that. You have a new position, which is co-director of the Partnership for Countering Influence Operations at the Carnegie Endowment for International Peace, which sounds like a big deal. It's a great title. So why don't we start there?
Why don't you tell us about your new role at Carnegie?
Sure. So the Partnership for Countering Influence Operations aims to foster a cross-sector, multidisciplinary community that's researching and countering influence operations. And essentially we have three core pillars to our work there. First, we aim to provide a baseline of what is currently known about influence operations and how to counter them.
Second, we foster a multidisciplinary community working to understand and counter influence operations. And in doing that, we try to promote some standards for their work. And third, this work feeds into our wider research agenda aimed at developing collaborative pilot projects for measuring the spread of influence operations across online platforms, but also the effects of influence operations and the effectiveness of countermeasures.
That's going to be a big focus of our work for 2021.
That's good. It sounds like we're now bridging academia into policy and the ability to measure the countering, which is productive and very necessary.
Yeah, we're certainly trying to do that. It is a very complex network of stakeholders that need to be engaged.
We've done a little bit better so far bringing in industry, academia, and other think tanks, for sure. In 2021, we'll be looking to bring in other areas like media, civil society groups, et cetera, and government.
That's great. Speaking of the research that you're doing, your latest paper is "Unmasking the Truth: Public Health Experts, the Coronavirus, and the Marketplace of Ideas." It looks at the challenges facing public health experts in asserting their views in the mass media landscape. What are the challenges that public health experts face against these influence operations?
Yeah, public health officials face many challenges, unfortunately. So just a few of them off the top.
First, they're communicating in a really crowded information environment where it can be difficult to be heard, even if you have a good pulpit. The second problem is that, at least in the context of COVID-19, they're also trying to communicate in a period of great uncertainty, and this adds challenges to what they're trying to do. The first is that certain situations create information voids, whereby little is known about what is happening, and there's not a lot of accurate information out there to explain the issue. So the challenge here is that humans by nature don't really handle ambiguity very well.
The same part of the brain that processes fear is activated in uncertain situations, and so people become compelled to seek out answers - any answers - to reduce the uncertainty. The second issue is that, mostly being doctors and scientists, they're committed to communicating truthful or accurate information.
And that can take months when you have a new disease emerging. So that's really difficult. And then the third area is the role of politics. Both having political support and not having it pose challenges for public health officials, for sure.
Yeah. That paper was interesting. One, because I thought it was useful to read something that wasn't about COVID-19 mis- and disinformation, but rather looking at the people who are trying to swim upstream against it. And it's very powerful to understand how difficult it is to be nuanced in a media landscape that craves, thrives on, and amplifies just black-and-white, binary notions of truth and anxiety and whatever.
"It's a pandemic." "It's a hoax." The truth is, it's a new disease. We don't know anything about it. The paper made me very uncomfortable, but in a really good way.
Welcome to my world. It feels like it's just generally a highly ambiguous situation where we don't really know a lot.
And there isn't a lot of great mapping of the information environment in total, as a system. And I think we do have a tendency, number one, as a society and culture, to try to make things binary when they really aren't. And then the second thing is that we also home in on really specific problems.
So we only look at a small subset of what's happening as opposed to how this works in a bigger kind of ecosystem.
Yeah. I also really liked that you laid out: things change, scientists are learning about what's new, what's the next thing. So the truth that was stated two weeks ago is different now, and people have a hard time adapting to that.
Yeah. I think that was really quite apparent in the debate around masks. So especially in Canada, our chief medical officer had been telling people, and it's short timeframes here, but for a couple of weeks, that they shouldn't be wearing masks because it won't protect them and it won't help them.
And partly that was because they didn't want to have a big rush on the PPE for the rest of the healthcare professionals who are on the front line. But at the same time, there was pushback, I think more from the public, and studies were starting to emerge that masks may reduce transmission somewhat.
She had to make an about-face, and that can hurt credibility over time, because if they were wrong on this, maybe they're wrong on the next thing.
Yeah. And it's interesting, right?
'Cause it's a matter of scale. In the scientific world, you wouldn't lose credibility, because your audience accepts that information changes and new studies emerge. In fact, it's quite a miracle how quickly some of these studies can be done with the data modeling. But in the public sphere, where they want very clear guidance, it feels like the audience has shifted; it's not quite ready for that level of adjustment. And I think we had very much the same issue here, which was: don't buy PPE because you're hurting doctors. And then the next week, it just felt like every media outlet was trying to write a different take because they wanted to attract people's eyeballs.
That was where people's attention was. And that was top of mind. But man, talk about a lot of conflicting information and then everyone you're talking to is an expert on some facet of the pandemic.
"Yeah, but I just read a study at 2:00 PM. That said masks are the thing." And you're like, but mine at 2:30 said that it's not a thing, right? It's yeah it's a shifting sands issue.
Exactly. And when you have everybody and their brother having the ability to weigh in and communicate on it, it also gets a lot more complex, depending on where people are getting their information from. And if it's increasingly social media, that means the networks that they have, their social networks.
And that also complicates things, because people do have a tendency to trust people that they deem or understand to be influential, or like them. So they may accept more information coming from their peer groups than they might from an unknown expert who doesn't have a relationship with them to start.
Yeah. And I think you also make the good point that while scientists and policymakers may be trying to hew to the evidence, anyone who wants to seize upon it opportunistically for propaganda or whatever, the bots, just needs to pick one thing. And they just need to push that button in the amygdala many times over. They can distill it into this one thing, so they become very single-issue ideas. So that actually brings up something from our previous conversation last year.
We had talked about the tricky nature of the nomenclature. At the time there was a lot of conversation about disinformation, but I think you had brought to the fore that whether it's that term, propaganda, or influence operations, it just wasn't really agreed upon in the academic setting. We couldn't even agree on those terms. So I wanted to check in and see, how is that shaping up?
Because it seems like with this partnership with Carnegie, there's some growing consensus around influence operations as an umbrella term for more discrete elements. I don't want to put you on the spot. You don't have to answer for everyone.
The problem is that, in our baseline research, we really look at different things, like the policy proposals that have been put out or interventions that have been made, and there's really no consistency in terms of how these things are being described.
There may be agreement around a term like disinformation, but there isn't necessarily agreement on how these all fit together and work as part of an operation. Of course, we try to use influence operations because we think that it is perhaps the best umbrella term to talk about the collective activities. But that really hasn't, I think, solidified or been completely accepted by everyone.
And again, the more I dig into definitions, the worse it gets. So last time we had talked about the idea of influence being like the positioning of medicine in the Middle Ages. And so I started looking into the definition of influence, and it comes into the English language to describe the influence of the stars on fate and human character and behavior, which I think is really entertaining given how we generally talk about influence with very little measurement to prove that there's been an effect based on an activity.
Yes, exactly. Exactly. Which is where the term comes from.
Yeah. That's amazing. That's a good parallel. I certainly like influence operations more than coordinated inauthentic behavior, which is really hard to say; it doesn't really roll off the tongue. But I feel like that's capturing some element of it.
The issue is that that's a really fine level of activity, right?
If people are pretending to be something other than what they are, and they're going around and organizing communications to try to influence an audience, okay, that's one tactic. But when it comes to influence operations, I would argue that it is neither inherently good nor bad. There are different things that happen to try to influence an audience or an outcome for a strategic reason, for an aim.
And they don't always have to be nefarious. We, again, as a society, haven't done very much, or been very good at, articulating where those lines are. So I'm hoping that this partnership will start to push that envelope forward a little bit more.
Yeah, I really like that distinction of it's not always nefarious.
So how are the influence operations observed targeting public health information during the pandemic different from election related operations?
I think during an election, the infrastructure, as in the networks and communities are built up long before the vote, right?
So influence operators have fostered their channels and their audiences.
And these are well honed, working continually over time. These operations are ongoing, and the tactics are pretty predictable. Some astroturfing to make online groups appear like they're driven by a community. Social engineering is used to access information that is leaked. Politicians pick that up if it benefits them and amplify it; media covers the politicians doing that, even if they try to avoid covering the substance of the leak. It's actually fairly predictable. Both sides will try to get their followers to support their candidates while trying to drown out their opposition, to dissuade anyone in the middle from voting for the other side. But in a pandemic it's a lot more chaotic, and different actors engage differently for a variety of reasons.
And they may not all be influence operators, but there are a lot of people starting to weigh in.
So for example, given that the pandemic started in Wuhan, there was a concerted effort by Chinese officials to shift the narrative and blame away from them. It didn't happen immediately, but definitely a few weeks in, and their means for doing that was fairly coordinated, particularly using diplomats to push messages out.
But you also have scammers, right? Trying to turn a profit on unsuspecting people. Maybe they're pushing conspiracy theories about 5G technology because they get more views on their YouTube channel or sales for their snake oil.
It's a bit of a free for all. And anyone who wants to influence audiences has an opportunity there. So people are pretty vulnerable. They're scared. And the uncertainty drives them to believe that information.
The 5G thing, man, I can't even with that, but I won't. Yeah. I think one of the identified memes that came out of a Russian troll farm had intentional misspellings and everything. It was something about the aluminum content in vaccines, and then, do you know what happens when you put metal in the microwave?
And 5G is a microwave. Like, a leap in logic. And I was like, you do know that 1G... it's all microwaves. It's not like suddenly the transmission, the wave, the actual electromagnetic spectrum, changed coming out of the towers.
Anyway, I was just like, I'm not even entertaining it, but...
This is also, I think, part of a wider problem, in that we've definitely entered an age... Luciano Floridi described it as hyperhistory. We live in this period where we are completely dependent on information communication technologies for all aspects of our social wellbeing. And yet the majority of people really have very little understanding of what that means, in terms of the technological infrastructure and how that may or may not change our environment, but also how information is processed and put in front of people on a continual, regular basis.
And we have done little as societies to educate adults especially, but I think even children, for what that shift actually means and the threats associated with it. I think I probably told you this last time, but the last time I looked at the stats for demographics in the US, Canada, and the UK, nearly half the population graduated high school before the web was invented. Forget the underlying technology; I don't think that there's a lot of understanding of what the internet and web mean.
And then it was just adopted. In the early days of the 20th century, with the internal combustion engine, as cars were being adopted, the share of owners who could repair cars was actually pretty high, because there weren't mechanics around; if you owned a car, you had to know how to operate it. And then that divide between the knowledge and the use grew exponentially. How many people on the street are going to be able to explain an internal combustion engine?
But then you do that with information technology, which is rapidly accelerating, and then you get into not just the way cell phones work, but the actual transmission of that information. Yeah, that's a good point. There's a huge knowledge gap.
So I wanted to return to the paper, which looked at public health officials. So you looked at the US, Canada, and the UK, and at how these officials attempted to disseminate their information. And I really liked that you point to different techniques. So Fauci was speaking to influential people; I remember him doing the live conversation with Stephen Curry. Theresa Tam in Canada was tagging celebrities like Ryan Reynolds, these clearly Canadian celebrities.
I think it illuminated an interesting tactic, which is, for public officials who may either lack the digital literacy or frankly don't have the time, it was somewhat effective to counter misinformation by communicating through people who are trusted. And you had brought this up just a few moments ago: the average person knows Stephen Curry much better than they know Anthony Fauci personally. So I just thought, is that something to consider going forward, aligning policy messages? I don't want to get into influencer marketing for public policy, but is that a legitimate tool to counter influence?
Yeah, the legitimacy question is problematic because again, I'm not sure that these questions have really been teased out by the stakeholders that should be weighing in on that. And that would include academics and civil society, for sure. But as lone voices, public health officials really don't stand a chance of cutting through the noise.
I don't think they have much choice. They have to be amplified by others, including media, influencers, and politicians. And I think it's a pity that in the countries we looked at, their main channels were press briefings and Twitter. None of these countries had really found a good or coordinated way beyond that to reach the public.
It just strikes me that there isn't enough, in the clutter of today's information environment, to help support public health messaging getting out. I think it's also a bit difficult to increase government communications to the public in a period of waning trust. There's a fine balance between ensuring the public are properly informed and persuading them through propaganda, but in the current climate, especially in the US, I expect there'll be a high degree of suspicion from whatever political side isn't in office around any attempts to increase communications with citizens, especially if it veers more towards influence operations. Yeah.
Yeah. I hesitate to bring up any Cold War parallels, but I'm going to do it anyway. In a unified media landscape where there were only a few outlets, TV, radio, right?
You had a mass public effort to teach children how to duck under their desks for atomic bombs. And no one can see this interview, but I'm rolling my eyes, 'cause that was just to assuage the public. But you could get your message out because there were really only two ways, and you could jam the signals.
And I think you've pointed to the fact that, where's the government going to go?
Are they going to go on TikTok to reach this group of people?
Or are they going to go on Twitter to reach that group of people?
And I think they might still be trying to jam their signals through some dated means in order to get to people.
Yeah, they are. And again, the problem is, I think, that in many democracies there was such an emphasis put on technology as it was developing, social media coming out, that it was going to have this democratizing effect. And what they meant by that, I think, is that it would have a democratizing effect in countries where they wanted a regime change.
But at home there hasn't been a lot of thinking through how better to engage the public. Social media was looked at as this megaphone: we'll pump something out on this Twitter channel and we'll reach them. But it just doesn't work that way. People aren't necessarily following that specific account and agreeing with it.
They're following a thousand other people who are saying a thousand other things.
It seems like it's time for a change in how the government is communicating with citizens. Have you thought through what a better way to disseminate that information would be?
Yeah. I don't know if I have an easy answer on that.
I do think that when it comes to public health officials, there's a big gap because people don't know who they are. You don't have the established relationship. And here we have a moment of crisis and we're surfacing an expert and hoping that everyone's going to listen to them because they're an expert.
But maybe had they had an ongoing relationship, communicating with and preparing society throughout for the eventuality of a pandemic, because it does happen and it's happened more than once in human history, then when the time came and things had to be done, they would have already established trust and a relationship with the public. I suspect for governments there is an aspect of ongoing public communication that has to happen over time, that builds up those channels and those personalities and shows, for example, Canadians who the chief medical officer is, such that when shit hits the fan, they have the relationship with the public.
So we've been talking about how there's this disconnect between how governments are communicating with their citizens, but there's also this aspect of the technology platforms that are being controlled by private enterprise and businesses as opposed to government. So do you think there's a need for more unified protocols among platforms?
And I'm thinking specifically about Twitter and Facebook. If you compare side by side false information on the platform, how that information is flagged or not flagged.
So I probably have a really long rant of an answer on this one.
My sense is that the companies try, in an imperfect environment. They are often working with very little detailed guidance from civil society, academia, or governments on exactly how to deal with these problems. And that's frightening.
'Cause we're leaving everything to corporate organizations to determine how we govern information. So, for example, of the 85 policy proposals that our team analyzed from about 51 organizations, only about a fifth offer details for implementing those proposals. More than half of the proposals also recommended more collaboration between sectors, but how we collaborate is generally left out.
At the same time, the pressure to do something about this problem is extremely high, and it's mostly driven by media coverage of the topic. It's not really an ideal operating environment for something so important as how we govern information in a digital age. And to date, my team has reviewed more than 125 platform policies published by 13 internet platforms. These tend to be pretty disparate policies.
They address individual aspects of influence operations, even within a single organization, like a patchwork of policies. And if you're dealing with a complex problem, that's not exactly the best approach. There tend to be stricter guidelines, for example, on advertisers than regular users, which makes sense, because there's a transaction where they're profiting from them. But also, content and behavior tend to be treated as separate things by separate teams.
This is another reason why we advocate for trying to describe the entire problem as influence operations. Perhaps if you take the umbrella term, you can start to look at all of these activities together and start to find proactive indicators of such campaigns happening. But right now, what it also feels like is that the only point of intervention we have to deal with influence operations is the platforms. It's the companies.
To that, I'd also ask like, where are the commitments from politicians and their supporters to not engage in unacceptable influence operations?
Where are the laws governing the tools that drive influence operations, or around behavioral advertising, or astroturfing and spreading disinformation, for example?
Where's the recognition of the role of media as an amplifier of influence operations? It doesn't just mean uncovering them, but also the covering of politicians' engagement with them. It all amounts to the same. Where are the education campaigns to inform citizens about what it means to live in an information age and foster skills for coping?
What I'd say is that the information environment is complex. A lot of focus goes to the platforms, and rightly so. But they're also not isolated from everything else in that information environment, including different actors, media, and audiences. And there isn't a lot of systemic analysis across platforms, media, and the myriad actors who engage in it. So yeah, we could use a lot more coordination, both inside organizations and between them, and with other bodies. But it's just lacking across the board.
That's a great answer. Unfortunately, it won't fit in the tweet that's used to promote this podcast.
Yeah. That's a good point. We've raised that issue before, the need to educate the citizenry. Even from, like, a cybersecurity perspective, it's the same thing. You do these public awareness campaigns about how to spot bad links or malicious content.
It's much harder, I think, when it's just information, for lack of a better term, just media being broadcast into your brain; that's a little bit more difficult. And I think we're hoping to see a little bit more collaboration from the platform side on the intel, like the threat intel and the signatures that each uses, because they all have different algorithms that they're using to identify this stuff.
But I don't know that they're sharing that signal data about what they're using, because there's some crossover there. But okay, so let's turn now to the beginning, which is also the end of your paper: the premise of your paper, that truth should naturally prevail.
And the guiding question is: what do public health officials do when the scientific evidence is unclear or what might seem true today is no longer so tomorrow, right?
So that's the shifting-sands argument again. So I guess, how do you get public buy-in without concrete evidence?
How do we get to a point where this nuanced information - let's restrict the conversation to public health - how do we get that into the public sphere in a reliable manner?
Yeah. So I'd actually argue that the premise of the paper wasn't that truth should prevail or will prevail. More, it was playing on the many times that I've heard it claimed that more truth would somehow trump disinformation. I think we need to recognize that there's a tendency to think in opposites, which we've already talked about: truth versus fiction, as if all information, ideas, and opinions can be so neatly divided.
Moreover, while truth is very important, a lot of people simply don't care about it, or they need answers to satisfy what is a neurological response to uncertainty. So I think we have to accept that we have a complex relationship with truth. For there to be public buy-in without evidence, there needs to be trust.
And again, we come back to this: how many average people could name the chief medical officer in their country before this happened?
How much did they know or understand what would happen during a pandemic?
My guess is very few. Our governments don't do a great job of preparing us for things, to be frank. For there to be trust there needs to be a relationship, and that takes time to establish, and it needs to be ongoing. I wouldn't say that our team is actively tracking influence operations; more so, our work has been focused on providing a baseline of what is currently known about influence operations and the field tackling it.
So we are working through quite a few baseline studies that we hope to publish in the coming months.
That's exciting. I look forward to it. Again, I thought that this paper took a more interesting tack than we have seen: instead of just analyzing the way misinformation or influence operations are propagated, it's how do you communicate in that media sphere, which is very interesting. The paper came out in July and it's only September now, but have you already seen or observed any other trends as it relates to those baseline studies?
Yeah. Again, because we're looking at the baseline stuff, our trends are a bigger picture. I would rather hold back on saying what we've found to date and share those with you when they come out, and we expect several of them to be coming out in the coming months.
Some of the ones we're looking at are what types of initiatives exist researching and countering influence operations. I mentioned the policy proposals that have been made. We've also been looking at interventions tried. And we will be looking at the known effects of influence operations, but we also have data sets around the platform policies. We'll most likely come out with something on legislation.
So I'll probably hold those trends close until they are out the door.
Entirely fair, and we very much look forward to reading them, because the work that you're doing is very important.
Yeah, I think we'll wrap it up there. Thank you once again for lending your time. I know you're very busy, especially with, I'm sure, a number of virtual panels. But yeah, thank you for coming on again, and we look forward to reading the next bit of research.
Thanks for having me. Always glad to be back here.
Thanks for joining us.
So what would you recommend for public health officials as we expect to head into the fall, where we may be facing a flu season and the progression of potential vaccines?
So I think it's actually worse than that in the US, isn't it? Not only are you still facing the pandemic and a flu season, but also the election. And I'm not sure that public health officials have the bandwidth to change course on how they communicate at this point.
But if I had a magic wand, I'd give them solid partnerships with social media and internet platforms and the media, some sort of dedicated commitment to help ensure that their message was actually reaching the public.