Information and disinformation campaigns are centuries old, but our social media era has given new and rapid thrust to the sharing of ideas, both for good and ill intent. Meg Steenburgh and Peter W. Singer discuss his book, LikeWar: The Weaponization of Social Media, which analyzes the poisonous effects of disinformation on politics, war, and social issues worldwide. They look at the role of governments, laws, and individuals; and our collective responsibility to support digital literacy and engage in positive digital citizenship.
Peter Warren Singer is a strategist at New America, a Professor of Practice at Arizona State University, and founder and managing partner at Useful Fiction LLC.
Thank you to our sponsor NBI.
Intro: Welcome to the official ABA Law Student Podcast, where we talk about issues that affect law students and recent grads from finals and graduation to the bar exam and finding a job. This show is your trusted resource for the next big step. You’re listening to the Legal Talk Network.
Meghan Steenburgh: Hello and welcome to another edition of the ABA Law Student Podcast. I’m Meg Steenburgh, 3L at Syracuse University College of Law JDi program. Today, we are honored to have with us Dr. P.W. Singer, strategist at New America, consultant for the U.S. Military, the Intelligence Community, and Hollywood, and author of several best-selling books, including ‘Wired for War’ and ‘Ghost Fleet.’ Today, we are speaking to him about a book called ‘LikeWar: The Weaponization of Social Media’ that he co-wrote with Emerson T. Brooking. I’m nearing the end of your book right now; I’ve not finished it, but it’s fascinating, beyond fascinating. Thank you so much for joining us today, Dr. Singer.
Peter Warren Singer: Oh, thank you both for those kind words and for having me on.
Meghan Steenburgh: One thing you remind readers of early on is that information and disinformation campaigns are not new, they’re centuries old, but the explosion of social media creates a whole new level in this realm, and it’s really informational warfare. What in all of it was perhaps the most disturbing part for you?
Peter Warren Singer: Oh, interesting. So the concept of the book is that, as you noted, we’ve seen information spread in all sorts of different ways, and it’s always had an influence on politics, on war, pop culture, you name it. But what social media has done is bring together two different types of communication revolution, so to speak. If you think about the technologies that changed the game for communications, and the broader world, they either allowed us to have a new kind of one-to-one communication, the telegraph, the telephone, vast improvements. Or they allowed you to have a vastly better one-to-many conversation, the printing press or radio or television broadcasting out. And what social media did is bring those two together, so that you could simultaneously have a one-on-one conversation but also broadcast it out, and for the first time to the entire world in real time. And that’s been incredibly powerful, and it’s been used for both good and ill. In the book we looked at everything from awful examples like ISIS propaganda to good examples like the Ice Bucket Challenge.
But I think what disturbed me the most is the incredibly poisonous effect, and I mean that in all meanings of the term, that it’s had on the American body politic and literally the American body. How easy it was to poison our system. Think about the impact of mis- and disinformation on everything from our democracy to the array of conspiracy theories that otherwise very thoughtful people were taken in by and then spread within their own networks. We’ve all got that relative or that friend. The comedian Jim Gaffigan talked about the way it’s played out during the pandemic, where, you know, everyone had sort of that kooky friend and then all of a sudden they’re now saying really scary stuff. But it’s not just that they’re saying it, they’re engaging in behavior. Then it doesn’t just endanger them, it endangers you and it endangers our kids.
Think about America. We’ve had coming on 250 years of experience at democracy, freedom of speech, healthy media, et cetera, and yet it poisoned our system. What about all those other states that didn’t have that experience? And we’re obviously not out of it. So kind of a long-winded way of getting at your question of what disturbed me the most is how pernicious and poisonous it’s been, and we’re not out of it yet.
Meghan Steenburgh: Full disclosure, I avoid social media at all costs. What are your social media habits? And how have this book and your research influenced how you look at everything that you see and disseminate?
Peter Warren Singer: Let’s be clear about this. First, you said, you avoid social media at all cost and yet you and I are recording a podcast right now.
Meghan Steenburgh: Very true.
Peter Warren Singer: Which will allow you to broadcast out to many. That podcast has everything from a comment section at the bottom of it, where people will rate it, to how it will be advertised and spread out to the world.
It will be advertised and spread out through the ABA’s Twitter accounts and Instagram and whatever else. I’m guessing that when you decide to go to a restaurant, you might look at Yelp for ratings, not just what the restaurant critic at the newspaper said, but what other people said. Or maybe you utilize some kind of navigation aid in driving to work or on a trip, or when you’re planning what hotel to reserve. My guess is you might also go on LinkedIn to, you know, let people know about your career and/or find jobs. My point is we’re actually all on social media, unless you really, really are saying, “You know what? I don’t use the internet.”
But there’s a second point that’s really powerful, and it shows the effectiveness. There was a survey done of professional journalists, people working the old-school way, whether it was a newspaper reporter or a producer for, not a podcast, but a radio talk show, or a cable news host. And over 90% of them said that they use social media to decide what stories to cover or not cover, what angle to take on those stories, who to interview for those stories, and if a story was still trending, whether to double down with another story the next day in the newspaper or another segment on that radio talk show. So the point is, even if you’re not on social media, which as we know a lot of people are, even if you’re going old school and saying, “I only read the newspaper, the physical version that comes to me” or “I only listen to the radio,” you’re actually still shaped by it. In fact, you might be shaped by it more by not having that awareness.
And so, the point is, for the attacker, if they can hack social, so to speak, if they can drive their ideas viral, which we call LikeWar, they can hack not just social media, they can hack radio, newspaper, television. So incredibly powerful. Okay, so then you asked about my habits. I’m like the doctor that says, “Don’t smoke,” and then you might find me hiding in the alleyway smoking myself. So definitely, as anyone can see online, I use social media, and I use it for everything, from, as I was pointing out, consuming, figuring out what restaurants to go to, to putting out my own articles, to popping off about everything from my views on sports to politics to pop culture. I just shared an awful dad joke on Twitter earlier that combined politics and dad jokes. It was about the truckers’ protest going around D.C. annoying people, related to people talking about, “You know, how can we get Putin to stop?” And I said, “You know, can we not just give the truckers an off-ramp?” And so, I apologize for that bad joke. But the point is, I use social media like everyone else, and my kids will be embarrassed by it, like I will eventually be embarrassed by what they put on social media.
Meghan Steenburgh: Well, you hinted at so much of it and, yes, to be fair, you nailed it. Most of those I am a part of, and so, yes, I am a part of social media. But in this web of information, misinformation, and disinformation, you point out there’s a kid in a Russian factory, paid to create likes and then push Russian messaging, or messaging to sow discontent. It doesn’t necessarily have to be pro-Russian propaganda or anything; it’s all seemingly very innocuous and liked, pushed by others, 300 accounts amplified to the opinions of 300 million-plus voices. How do we distinguish that source of information?
Peter Warren Singer: So a couple of things here. The first is, you put your finger on what we’ve seen out of Russian information warfare, and they were the masters of it, in large part because of first-mover advantage. They were the ones early on in this. And second, they were largely pushing on an open door. And so, if you look at the activity during the 2016 U.S. election, for example, the numbers are astounding when we pull back and think about it. There were over 3,000 documented Russian sock puppet accounts; that’s when there’s a real person behind a false-front account. There were more than 60,000 Russian bot accounts; that’s an algorithm driving overall trends, incredibly effective in shaping conversation online. For example, on Election Day 2016, the sixth most-read account was a false-front account posing as the Tennessee Republican Party, when it was not. It’s not about the raw numbers, but the followers and who they’re elevating.
So the reason that account was as popular as it was is that it was being elevated by other people, from accounts with hundreds of followers to retired Lieutenant General Michael Flynn, turned convicted felon, turned pardoned, turned QAnon conspiracy theorist. He has hundreds of thousands of followers, and during this period of time, he elevated at least 25 of these different Russian information warfare accounts. He also elevated over 16 different bizarre conspiracy theories, everything from Pizzagate nonsense to claims that the UN was getting ready to take over the country.
The point is, it’s not about the individual numbers, but how it elevates out. Same thing on ad buys. Facebook reported after the election that 143 million Americans were unknowingly exposed to Russian propaganda on their Facebook accounts. That’s half the U.S. population. Now, the point of this is that a lot of that activity was possible back in 2016 because there was sort of a laissez-faire attitude from the operators of these platforms. It’s a very different situation right now. The low-hanging fruit of the bot accounts and some of the sock puppet accounts and the ad buys, things that were possible in 2016, are not so possible now. And so, the platform companies, and this gets to the legal side, belatedly responded out of a combination of fear of regulation, fear of lawsuits, and their own customers saying, “You know, I don’t like what’s playing out in this space.” They didn’t do it, like any other company, out of the goodness of their heart. They responded to market and political and legal pressures. So the landscape has changed, and in turn the Russians changed.
In 2016, it was a strategy of injecting falsehood into the American body politic. By 2020, that was not the strategy. It was more about elevating our own extremes. So QAnon, Russia can’t claim that. That’s American, baby. We created it. Now, Russia did pour a little kerosene on the fire to help it elevate, but it was already there on its own. Same thing with some of the things on the Far Right and Far-Right extremism. Russia also pivoted to hiring people not within Russia, but in other nations as workarounds, Latin America, Africa, basically people doing very cheap contract work as these sock puppets, including, as always, injecting into the U.S. We’re not the only target of this. A study out of Australia found that over 30 different democracies have been targeted in this way. One of them was Ukraine, back in 2014.
Now, what’s been interesting, and this points to your second question, is, “Okay, how do we react to that? How do we get ahead of that?” And we’ve seen learning, we’ve seen response to it. We’ve seen changes not just in what the platform companies allow or not, but in how governments can respond, how corporations, how individuals can respond. And like everything, whether it’s cybersecurity or public health, there’s no one silver-bullet solution. You need a response from all of these different actors, and it’s not about eliminating the problem, it’s about driving risk down.
And so, we can see on the government strategy side a vast difference in how the presidential administrations handled it in 2016 and throughout the Trump years. In 2016, the Obama White House was sort of taken aback by this, not even ready for it. During the Trump years, there was an administration that had a conflicted relationship with mis- and disinformation and fake news, but also had the challenge of running a proper interagency process. It was much more reactive to what the president was tweeting rather than being very strategic, and I think even if you’re the most ardent Republican, you would agree with that, down to the way it engaged with allies like NATO, very different than what we’ve seen from the Biden team: confrontational versus coordinated.
And so, you have that kind of change, but we’re still nowhere near where we need to be. If you want to see what really being top in class looks like in terms of democracies reacting to this threat, you want to look at what nations like Estonia, Finland, et cetera have put into place. On the corporate side, again, as I mentioned, very different reactions in 2016 and 2018 compared to 2022. A lot of what the corporations have put into place in the last couple of years has been reacting more to the pandemic than to the democracy side, but the key takeaway is that there is a whole array of actions that the platform companies said were legally or technically impossible for them to do. I mean, I literally had a vice president of one of these companies kind of wag his hand in my face, like, “How could you think we could ever do something like this?” And then suddenly they did it.
It wasn’t technically or legally impossible. It’s just that they relooked at it for a variety of reasons, from worrying about, as I noted, the legal and regulatory environment changing, to the pandemic, when they said, “Wow. This is starting to affect us.” But then you have the final part, which is the individual, and we need to better equip the individual with what you might think of as cyber citizenship skills. Cyber citizenship is recognizing that we’re all online, as you and I were talking about, in some way, shape, or form. Particularly, our kids are online; we’re shaped by it. If you’re a lawyer, from your own legal practice to the clients that you represent, what plays out online matters. And so, because of that, we need better training for it. And cyber citizenship brings together three types of skill sets. One is digital literacy, understanding how the space works, from the ability to distinguish between fact and opinion to understanding the role of algorithms and how those shape what you see online. When you go on YouTube or when you’re on Facebook or Instagram, you’re seeing things because of the algorithm.
A second skill set, though, is what’s known as sort of the citizenship side of it. It’s not what you know, it’s how you behave. Don’t be an internet troll. Think about what you post and how that relates to your friends and family, coworkers, and network. And then there’s a third category, which is threat awareness. It’s understanding how people are trying to manipulate you online, whether it’s Russian information warriors, to anti-vaxxers, to just a company trying to market to you. And you need all of those skills. You could have great knowledge of fact versus opinion, but if you don’t know how someone’s trying to manipulate you, that’s not going to be effective. Or you could be very concerned about, you know, “I want to be a good actor online,” but if you don’t know how algorithms work, then you’re not going to be effective.
We need to teach that, and unfortunately, we don’t do a good job of it. And teaching that goes back to the three different actors: how is government supporting the teaching of it in our schools? How are the corporations enabling that? Interestingly, the platform companies provide bumps to you on classic cybersecurity: “Hey, how’s your password? Hey, what’s your cellphone number so that we can help secure your account?” They don’t say, “Hey, these are things you need to know to operate effectively in our space.” They don’t do those kinds of bumps of training. And then there’s the final part, the you and I: how do we teach ourselves and our kids? And again, I think of that parallel of public health. There’s a role for government, and there’s also a role for me, you know, washing your hands, wearing a mask, getting a vaccine. That’s not just about protecting yourself. That’s about you protecting others. It’s a sense of responsibility woven into awareness and education.
Meghan Steenburgh: Secretary of State Blinken recently announced that the Biden Administration has determined Myanmar’s military committed genocide against the Rohingya minority. In the book, you actually use that as an example of social-media-fueled genocide. How did you come to that conclusion? And then, with that, do you see this ultimately becoming someday some sort of defense, like, “Oh, I didn’t know what I was looking at; these were ‘the facts,’ and I reacted to what I thought were ‘the facts’”?
Peter Warren Singer: A quick answer to the second question is that we’re literally seeing that play out in U.S. courts right now. A number of the January 6 defendants have made that kind of excuse, either excusing it as “I didn’t know” or “Someone else made me do it,” and they’ve specifically said, “The president told me to do it, and so I did it,” and we’ll see what the courts decide. I am not a lawyer, and I do not play a lawyer on TV, but I’ll use the parallel of a parent: when my kid does something that’s self-evidently wrong and then says, “Someone else told me,” I’m like, “Well, that’s fine. We’ll figure out accountability for them if we can, but that’s not an excuse for you.” And particularly when it’s very clear: okay, you beat up a policeman, or you smashed a window, or you went through a broken window into a government facility. I mean, these are known rules, right? So don’t give me that BS. But it’s playing out in the courts right now in the U.S., let alone in other states. To the example of Myanmar, yeah, it was a mass killing that went after over 600,000 people. That was both motivated and coordinated mostly online, in terms of everything from urging the attacks to coordinating them.
And that actually is important to understanding the different ways that social media platforms are used in different countries. Facebook used to be very proud and talk up how it essentially replaced much of the traditional media approaches in Myanmar. The government used it for everything from communication to reporting results, in a way that you previously would have used TV or newspaper. Unfortunately, it was also used for ill. Somewhat related to that, at that time Facebook, by its own admission now, was not doing enough monitoring in other nations of how its platforms were being used. So simultaneous to pushing and extolling the platform, it didn’t have enough local-language speakers to be able to go, “Oh, wow. Okay, we’re seeing calls for genocide here.”
And that’s been a major issue for the company, which, you could say understandably, has dedicated most of its content moderation efforts inside the U.S. But that has meant there have been large areas of the world where it’s arguably more influential in shaping attitudes, politics, and physical, real-world actions, yet where it hasn’t had that kind of investment in local-language speakers and the like. So hopefully we’re seeing that change, but yeah, that points to the example, and we’ve seen similar instances in India. Again, it’s the way that you use technology to coordinate and motivate; previously you would have used radio. During the Rwandan genocide, it was primarily radio that was the platform. Now you utilize social media.
Meghan Steenburgh: We are speaking with Dr. P.W. Singer, strategist, consultant and author. We’ll be right back.
Male: If you’re a law student getting ready for the next big step, why not check out the Young Lawyer Rising Podcast from the ABA Young Lawyers Division. Young Lawyer Rising covers issues pertinent to newly minted attorneys as well as lawyers with 10-plus years of experience and beyond, from dealing with the daily grind and career management to focusing on social issues, financial worries, and more. Don’t be an outsider. Tune in and check us out. Young Lawyer Rising, now on Apple Podcasts, Google Podcasts, Spotify, Amazon Music or, best yet, your favorite podcasting app.
Meghan Steenburgh: And we are back now with Dr. P.W. Singer, strategist at New America, consultant for the U.S. Military, the Intelligence Community, and Hollywood, and author of several best-selling books. You begin chapter six of your book, ‘LikeWar,’ with a quote: “media weapons can actually be more potent than atomic bombs.” It is from the propaganda handbook of the Islamic State. Did it surprise you how the world responded to the media weapons in use? Including, at one point you mentioned, votes were taken on whether to kill someone or hold them hostage, and then they showed the results. And there was one line that really struck me: “You could go into a bathroom in a bar and come out with blood on your hands.” It just hit home, because all of these things are happening halfway around the world and yet you have influenced the end of someone’s life.
Peter Warren Singer: That’s an illustrative quote, because it both shows the attitude of that individual, but also, like so much online, it has large amounts of exaggeration and attention-grabbing woven into it, right? So do I believe that a tweet is more powerful than an atomic bomb? No, but this is the discourse of the propagandist for ISIS. And why he’s saying that is not that he would turn down an atomic bomb, but rather that he knows the central role that social media played in everything from the literal buildout, growth, and notoriety of that group to its tactical battlefield operations. Through social media, ISIS was able to persuade some 30,000 people from over 90 different countries to travel to Iraq and Syria to join it, to join a group made up of people they had never met before. It was the direct opposite of how Al-Qaeda operated. Al-Qaeda’s name is literally the translation of the term “the base,” which was a reference to the mountain training camp in Afghanistan, where you had to be known by someone to get into it, and only if you went through it were you sort of bumped up to the level of, for example, the 9/11 attackers. Social media flipped that. Besides inducing all these people to come to Iraq and Syria, ISIS was influencing people to conduct their own acts of terrorism everywhere from Paris to Texas. It was also creating a greater fear of terrorism.
There’s a really interesting factoid that the fear of terrorism among Americans was greater after the emergence and virality of ISIS online than in the literal weeks after 9/11. Think about that as a point of comparison. In the weeks after 9/11, we’d had 3,000 Americans killed and we didn’t know what was coming next, and yet polling showed we were more fearful of terrorism after ISIS emerged, at which point it had actually killed only one American civilian and, at that point, no American soldiers. And yet we were more afraid of terrorism, and that, of course, was a win for that group, and then you see the effect of it on our own politics and the like. So it illustrates the power of this space. But what was also notable is that what ISIS was doing was not that original. It was just copycatting.
And so, you know, for LikeWar, we’ve had a very kind of dismal discussion, but we looked at examples that ranged from pop culture to politics, examples that ranged from Iraq and ISIS to Chicago street gangs. One of the interesting ones that brings these two together is how ISIS’s top recruiter was copycatting Taylor Swift and the strategies that she utilized to reach the top of her game. And so the lesson to take away from all of this is that there’s a set of rules that apply across these different actors, across these different locales, and of course these rules can be utilized for both good and bad purposes. It’s just up to us to understand them and either deploy them for our own good and/or figure out, when someone’s up to no good, how they’re trying to apply them to us. And so that’s what I think is sort of stunning: when you look at what an ISIS recruiter is doing, you’re like, “Ah, but that’s what a regular teenager is doing.” Well, part of it is explained by the fact that they come from this world. The ISIS recruiter was actually a failed rapper.
Meghan Steenburgh: And in that mimicking of Taylor Swift, I think you were speaking at one point in the book about authenticity, the role of authenticity and being relatable, and even pulling cats into the equation to show, “Look how relatable I am. Come join ISIS.”
Peter Warren Singer: Yeah, and when we looked at whether something went viral, again, whether it was something good or bad, whether it was the Ice Bucket Challenge or ISIS propaganda, jokes (remember Pizza Rat?) or pernicious disinformation about a war or a disease, they consistently had certain attributes. There are several of them, but one was authenticity. And it doesn’t actually have to be real authenticity, because everything online is often self-consciously presented out there. It’s the combination of “Yes, it really is me,” but it might be the third or fourth take of that selfie, or you had to get the lighting right for that live YouTube hit. So it’s live, it really is you, you’re not in a studio, but it’s sort of curated. So there’s that authenticity, there’s that relatability, there’s the humanization effect; there are a lot of different things going on there.
Again, one of the funny observations is the way that Wendy’s was more authentic online than Hillary Clinton. Hillary Clinton is a real person, but she did not come across as authentic online. Whereas Wendy’s is not a real person, it’s a hamburger chain, and yet the way they operated online came across as more authentic and did a better job of winning hearts and minds. Cats, I joke, are the most effective weapon in all of information warfare, and the idea of soldiers posing with cats, terrorists posing with cats, is absurd. And yet it’s a tactic that’s been used for good and ill, by ISIS members and, as we’ve seen, deployed in the Ukraine war. It’s both that cats are cute and cuddly, and also, like every other meme, it’s building on something else, and internet cats have a long-running history. And so, yeah, it’s one of these just strange, weird things of the online world.
Meghan Steenburgh: Since you brought up memes, I want to pivot to the weaponization of memes. I believe, and you can correct me here, it originated with DARPA: the utility of memes for influence campaigns, transforming memes into weapons. “He who controls memes controls the world” came from somebody else. And so, embracing this memetic warfare, what does that mean? Is that what this means with the cats and...
Peter Warren Singer: That one was really interesting. What you’re pulling from is a report...
You know, this is a challenge of writing a book: “Should I try to go back and remember that footnote reference?” But if I recall it correctly, yes, that was a report related to how we can weaponize memes and the idea of memetic warfare, the idea being: how do you drive ideas viral? Memes are one of the most effective ways to do so, and memes work by building on past understandings, past cultural touchstones, and then taking them in a new direction. And it was a report saying, “Look, this is really effective,” and in fact it’s one of the ways to shape both hearts and minds online, of course with real-world effects. And then what was notable about it is that the same person who did that report later became one of the key people supporting the Trump campaign and applying the lessons of memetic warfare on its behalf.
Memes are everywhere. They’re everything from Star Wars to cats to Pepe the Frog; religion has a certain aspect of memes to it. A meme takes past understandings, builds upon them, and adds new layers. It’s an effective tool. It’s an effective approach. We need to be very, very careful here, though, just like if we’re thinking about a lawsuit or battlefield behavior. Let’s not exaggerate it. Let’s not say, “Oh, it’s only the meme that will win out.” No, you still have to have your day in court. You still have to operate on the battlefield. But if you ignore this space, you will not do well, because the information space is what shapes all of these other actions. So if we use Ukraine as an example, the information side is what has kept the Ukrainian public and body politic together, when Putin’s goal from the very start was to fracture it and rapidly collapse it.
It was also important to keeping the wider world on the side of Ukraine and providing everything from the literally tens of thousands of anti-tank and anti-aircraft weapons that are keeping Ukraine in the fight and causing such physical pain to the Russian army. It’s also why we’re seeing, on the geopolitical and geo-economic side, the massive sanctions from governments, with corporations joining in as well, which of course is causing the economic and financial pain within Russia, which is what is most likely to shape Putin’s decision making, not merely “I lost soldiers.” I mean, I don’t know Vladimir Putin, but my guess is he’s probably not too heartbroken by a Russian soldier dying, but he is worried about the collapse of the Russian economy and what that means for his own political rule. And so, the information fight matters. And we’ve seen it for Russia: if you’ve lost Switzerland, with Switzerland putting sanctions on you, or oil and gas companies are joining in or pulling out, when Halliburton is saying, “You know what, this is just too much, Russia. We’re out,” that means you’ve lost the information fight, and that has real consequences, right?
On the military side, we saw Germany go from saying, “You know what? We are not going to give military aid. At most, we’re going to give 500 helmets to Ukraine.” I’m not making that number up. That was their original plan. And then the online world, including their own body politic, said, “No, no, no, this is not enough.” And Germany, within a matter of days, pivots and says, “Remember when we said we were going to give 500 helmets to Ukraine? That’s not going to matter that much. We’re going to give hundreds of Panzerfausts,” which are anti-tank rockets. And so, the point of what I’m getting at is that this matters. In the same way, if you’re a lawyer saying none of this matters, “I live in the realm of law,” well, it matters in shaping everything from what juries and judges believe, and I’ve spoken to gatherings of state attorneys general and state judges who are deeply concerned about the effect of this on their rulings and whether they are respected or not, to concerns about what it means for law, to how you are thinking about your own law firm and whether people come to you, whether they trust you. That’s all shaped by the online world. It doesn’t mean it’s the only thing; that’s how it all wraps together.
Meghan Steenburgh: And you’re moving into my next question, about Russia’s war with Ukraine and how LikeWar has hit the battlefield and influenced this war.
We’ve also seen AI come to this, in terms of emulating speech and stitching together images and videos. We saw Russia try to do it with President Zelenskyy in a fake announcement. You say this line is only going to get more and more blurred. Where is this all heading?
Peter Warren Singer: That’s a great question. Where’s it all heading? I think, like so much else, it takes the ideas and concepts that we first saw in science fiction, and they become real. They become our normal. And look, that’s played out with everything from flying machines to you and me communicating across a computer network. So with what we have seen with artificial intelligence, let’s pull back on this. We’re living through one of the most momentous technology developments in all of human history. It’s very bold to say that, but we’ve been talking about the advent of AI for literally thousands of years. You can find discussions of it in ancient Greek mythology or Judaic texts.
If you’re a sci-fi person, we’ve been talking about it for over 100 years, and it’s now coming true. And I’m not saying, “Oh, it’s alive,” you know. I just mean we’re using the various forms of whatever you want to call it, machine intelligence, neural nets, et cetera, to shape our world, to take on roles that only humans used to be able to do. As an aside, a more recent book I did, called Burn-In, looks at this: what are the different uses of AI, what are the ways it’s going to play out in the world, everything from how Amazon is going to use it to the police. One of the main characters in it is a contract lawyer who sees their role automated, and that sounds like crazy sci-fi, except contract law is an area that pays very well. However, right now, not 30 years in the future, an AI can actually look at a contract and on average find more errors to fix than a trained human contract lawyer can. So, warning.
But the point Burn-In explores, taking it to the social media side: think about all the challenges that we’ve had with this weaponization, everything from politics, to marketing, to your kids, to coronavirus. And that’s without AI, that’s without what’s popularly known as deep fakes: using an AI to create something, a piece of audio, a video, that’s incredibly difficult for a human to figure out whether it’s real or not. And we’re starting to see that introduced. Like everything else on the internet, it will be both marketized and weaponized. It will be used for good, and it will be used for ill.
We have already seen companies offer up fake people for use in your internet ads. One entity even explored using it to make its workforce look more diverse than it actually was in its online presentation. We’ve also seen the politics side. The first use of deep fakes in politics was one targeting the Belgian Premier: it was the Belgian Premier giving a speech on coronavirus that they never actually gave. And then, as you noted, we saw it pop up during the Ukraine conflict. There was a deep fake of Zelenskyy surrendering when, of course, he did not actually give that speech. Now, these early versions are relatively easy to distinguish. They have certain tells within them, but of course each new generation gets better and better, and so it becomes more and more difficult. And then we get to the questions of what do we do about this. The answers range from, as we talked about, how do we get people to understand and recognize that they exist and be aware of them, to what is the role of regulation? Should it be something that government intervenes in and says this is not allowed, or is it something that is self-regulatory, where the platform companies decide what is allowed or not vis-à-vis deep fakes?
My own take on it, and again, I’m not claiming to be a lawyer, is that you can’t ban them overall; that would run into First Amendment issues, because deep fakes, again, have been used for politics and for marketing, and there are aspects of the methodology where you could argue freedom of speech. But what I would like is for the companies, or some kind of regulatory environment, not to ban them but essentially to put a watermark on them. So, you can create a deep fake, you can utilize it, but having that little watermark allows the positive uses of it for entertainment or marketing while taking away the pernicious manipulation.
If it’s got the little watermark in the corner, you can say, “I enjoy it. I still get the fruits of it, but I know this is a fake.” That watermark, to me, is the equivalent of a blue check on Twitter: is that actually the real Donald Trump, or are you just posing as him? The blue check allows us to see whether it’s real or not. So I’d want the watermark approach, and it would also move the companies away from trying to figure out the content, “Am I okay with this content or that content,” and toward just asking, “Did you follow the rules? Did you put on the watermark, or are you trying to take advantage of people? Oh, you didn’t. So, I don’t care whether you’re using it for marketing or your deep fake is going after this politician; you violated the rule on how everyone is supposed to act related to deep fakes.” That’s my own sense of how we approach it.
Meghan Steenburgh: I appreciate your perspective. This will be my last question, because you’ve been so gracious with your time, but I want to conclude by asking: what is your advice for law students as they’re heading out? I mean, anyone can listen to this, but from your perspective, speaking to law students, what would you say in this context?
Peter Warren Singer: I think it goes back to those layers of activity and our own role within them. So there’s the role of government. Many of the folks listening to this are going to go into government, or they will be shaping the laws that government follows and implements, or, in their role as citizens, they’ll be voting on it. And so, being sensitive to what government can do in this space to better serve and protect, so to speak, our population, our democracy, our public health system, when it comes to these new kinds of information threats. We can no longer ignore them as not mattering. We know they matter now, so let’s catch up and bring in best practices, including from other democracies. So, constantly being willing to learn and implement on the governmental side.
On the corporate side, many people will be working for corporations, or representing them and/or their customers. And much of the ills of this space came out of an attitude that said, “Well, our product can only be for the good. We are good people, and, oh, by the way, us making profit and expanding the product to the world can only be for the good.” Facebook, for example, used to have a commercial back in 2012 whose tagline was “the more you connect, the better it gets.” Now that sounds really creepy, right? If you think about it: “The more you connect, the better it gets.” Did it really? But it’s not just about that attitude; it’s about getting ahead of problems. Much of the trouble for the companies has come when they waited for the bad thing to happen and only responded after the fact. There’s an approach known as red teaming, or we use an approach called useful fiction: basically, you explore, in a realistic way, scenarios of what might happen.
So take the example of live video broadcast via social media. Facebook was taken aback when teenagers used it to broadcast their suicides, when terrorists used it to broadcast a mass killing. Anyone who knew anything about teenagers or terrorists could have said, “Yeah, this might happen. What are you going to do? Have a policy for it,” rather than waiting for the bad thing to happen and then developing a policy. And it’s the same when we think about ills like vaccine disinformation. It was obvious it was going to come, but we waited for the bad thing. So as a company, or as someone who advises companies, how do you help them understand the difference between doing well and doing good? Making a profit, which is doing well, is what businesses are designed to do. Doing good is the moral side. They don’t always perfectly align. That’s just the nature of the beast.
And then, second, how can you aid them in getting ahead of problems through these visualizations rather than waiting? And then the final aspect is us as individuals: what is our role and responsibility in who we are? We are citizens. We are consumers. We are that account, Peter W. Singer on Twitter, or whatever your handle is. That’s who you are. You’re also potentially a parent, a family member. So, how are you individually seeking to educate yourself and to be a responsible actor online? Because we end the book on the idea that you are now what you share, and what you share actually shows who you truly are.
By that I mean the online world, and the real world, look at you by how you portray yourself online. And how you portray yourself online actually reveals a lot about you. I don’t just mean where you live and your politics, but who you truly are. If you’re an internet troll, it shows me something about you. If you’re sharing anti-vax or conspiracy theory content, it shows me something about who you truly are. And so having that sense of awareness and that ethic is something we all need to take into our lives, because we’re responsible not just for ourselves, but for everyone else in our network.
Meghan Steenburgh: Dr. P.W. Singer, strategist at New America, consultant for the U.S. Military, the Intelligence Community, and Hollywood, and author of several best-selling books, thank you for joining us.
Thank you for listening. I hope you enjoyed this episode of the ABA Law Student Podcast. I’d like to invite you to subscribe to the ABA Law Student Podcast on Apple Podcasts. You can also reach us on Facebook at ABA for Law Students and on Twitter at ABA LSD. That’s it for now. I’m Meg Steenburgh. Thank you for listening.
Outro: If you’d like more information about what you’ve heard today, please visit legaltalknetwork.com. Subscribe via iTunes and RSS. Find us on Twitter and Facebook or download our free Legal Talk Network app in Google Play and iTunes. Remember, U.S. law students at ABA accredited schools can join the ABA for free. Join now at americanbar.org/lawstudent.
The views expressed by the participants of this program are their own and do not represent the views of nor are they endorsed by Legal Talk Network, its officers, directors, employees, agents, representatives, shareholders and subsidiaries. None of the content should be considered legal advice. As always, consult a lawyer.