EthicalVoices

The Rising Threat of Disinformation for Hire: Special Interview with Buzzfeed News’ Craig Silverman

Joining me on this week’s episode is Craig Silverman, a reporter at Buzzfeed who, along with two of his colleagues, wrote in January one of the most chilling articles I have read in a while, “Disinformation for Hire: How a New Breed of PR Firms is Selling Lies Online”. I wanted to have him as a guest to discuss the article and its implications for communication professionals and society.

I encourage everyone to go to Buzzfeed and read the article, but can you give us a synopsis of what your research found?

This is an article that was roughly a year in the making. My colleague Jane Lytvynenko and I at Buzzfeed News focus on disinformation and the broader umbrella of digital media manipulation. That includes search engine manipulation, digital advertising fraud, bots, trolls, all that kind of stuff.

We started noticing that platforms like Facebook, Twitter, and others began doing more takedowns of what they described as disinformation operations and information operations: campaigns that use fake accounts, that use false information, that manipulate their platforms to try to influence people. A lot of times when they would announce, “Okay, we’ve taken down 10 pages and 50 accounts,” they would cite the names of firms. When you looked at their websites, they described themselves as a public relations firm, a digital marketing firm, or a digital PR firm.

We started paying attention to this and saying, “Wait a second. It’s not just state actors. It’s not just individuals. There seem to be consultants who are involved.” And so, we spent time collecting all of the examples that have been publicly disclosed by platforms, security companies, and research companies. We found that since 2011, there have been at least 27 of these kinds of digital information operations that have been partially or wholly attributed to PR or marketing firms, and 19 of those happened in 2019 alone. It seems to be an accelerating trend.

I’ve been discussing this stuff for years. I say we’re entering the disinformation age. But many of the examples you gave in the article had to do with politics. Is that where you’re seeing the focus, or do you have examples that deal with disinformation in business?

That’s a really interesting distinction to make, because there’s definitely a bias in this data set towards more political operations, and there are a couple of reasons for that. You will see in the data we’ve got and in the article that a lot of it is about politics, because that’s where Facebook, Twitter, and Google have focused. They see a lot of urgency in investigating attempts to manipulate elections and public debate. They see that as a very high priority. They didn’t a few years ago.

I think it’s fair to ask whether they are investing enough in identifying this stuff, but they are definitely worried about regulation. They are worried about more bad press. And so, they have really put an emphasis on attempts to manipulate the electoral process and public opinion. You do see states who will hire a PR firm. And I know people in PR don’t like these firms being referred to as PR firms, but that’s how they describe themselves. They will say, “We want to make sure that our voice and our perspective is heard on social media.”

And these firms will often hire paid trolls to go out and spread information, not necessarily false information, but people using fake accounts, making it look like there’s a groundswell of support when there isn’t. It also happens during elections, where specific candidates and parties will use this. It’s very established in the Philippines, especially during election time. And so, politics is a huge place where this happens, but it is not the only place. There is the rise of influencers on Instagram and other places. I think that whole world is rife with people with fake followers and fake engagement, and there are agencies helping them get this, for sure.

Most of my career, I have focused on financial technology and anti-fraud, and I worry about what happens when the fraud factories doing identity fraud realize they can start creating fake news, and day traders who are shorting stocks capitalize on it. There’s going to be a whole other industry that comes up as these organizations realize how much money they can make manipulating stock prices.

Yes. I think the stock price one is an example where it can play out, for sure. Where it’s happening with a less immediate financial repercussion is with people who are helping bury negative search results about executives, individuals, and companies, so that the person is able to, in effect, reinvent themselves and get back into business. There are companies and others trying to scrub out negative results.

I think we have a digital media environment that is rife with manipulation and that’s because we have this wonderful democratized thing. It’s easier for anyone to get out there, potentially find an audience and get your message out directly. This is something communication professionals love. They don’t always want to deal with someone in the media like me, right?

The downside is that because it’s more democratized, and because these platforms are so massive that the companies are basically unable to completely manage and oversee them, we have this wild west, and it is manifesting itself in lots of different forms of manipulation.

I noticed most of the examples you gave came from Asia, Europe, and Africa. You’ve mentioned the Philippines quite a bit. Did you find anything going on in the Americas, in Canada or the US?

There was certainly a lot less. And it’s one of these scenarios where you can’t say for sure it’s not going on. It just means maybe it hasn’t been revealed yet. Maybe they’ve been so good at it that they haven’t been exposed.

There was a case we learned after we did the story in Canada where a PR firm seemed to be using fake accounts on behalf of a client. I think we see some of that happening where there are firms who are using borderline practices to get Twitter followers for a client’s personal account or a corporate account. I think there are a wide number of examples of businesses and others who probably have paid for engagement. There was a big New York Times investigation into a company that was selling Twitter followers and there were professionals and businesses who had bought followers.

I think that’s the level we’ll see it at. We are not necessarily seeing this kind of thing where you have firms walking around in Canada and the US rather openly pulling out a rate card saying, “And here’s our troll farm and here’s false information about your opponent.” It’s a little more hidden. And I would say obviously we have an environment in the US and Canada where there is a lot of disinformation and manipulation. It just seems like there’s not people hanging out a shingle saying, “Hey, this is my business. Come and see me,” as overtly as in some other places.

The two areas where I’m seeing the most unethical behavior are helping you bury bad news on Google, and editing Wikipedia to make your entry look perfect.

Do you see the situation becoming worse in the coming years?

Unfortunately, I don’t see why it wouldn’t at this point. It’s definitely true that Facebook, Twitter, and others have bigger teams investigating this stuff than ever before. There are more journalists, think tanks, and security firms paying attention to it. That’s all very encouraging.

But the reality is that the world is very big. There is a lot of money to be made providing these services, and I think one of the reasons these professionalized firms are so appealing is that there is often this buffer, this plausible deniability. Okay, maybe the firm gets caught, but in most of these takedowns, they’re not able to definitively say, “And here’s who this firm’s client was.”

For that reason alone, for people who are ready to cheat and have the money to do it, there is a market. And if there is a market for these kinds of services, you know people are going to try and fill it.

What do you think reporters and PR professionals should do to fight this?

Well, one thing I would say for PR professionals, and I’ll also note this: very early in my career, I spent a few years doing communications at a startup. I was on the other side for about two or three years and got to see that. I think it actually informs my work as a journalist a lot, because I certainly saw a lot of bad journalists writing about our organization. So, I have a sense of humility about our work.

But on the PR side of it, one of the things you’ve seen in the last few years is really big, global PR associations reinforcing existing ethical rules or introducing new ones that have been updated for the social media age. And I think that is a very important thing.

Right now, it’s very much a self-regulated industry. And I wonder what other kinds of accountability and enforcement could actually happen there, because if you want to be able to say, “Listen, this stuff that these firms are doing, this is not public relations, this is not my world, any ethical PR person would never do this,” I think it’s important to think about how you can create those distinctions as much as possible. And that’s something we in the news business are struggling with too, because, and this has been a very strong part of my reporting over the last few years, it is very easy to masquerade as something that looks like a real news organization.

Absolutely.

Right? In this environment, we have to figure out what the signals are, what the cues are, what enforcement practices we as PR or journalism professionals can actually get adopted into platforms. What can we start to enforce within our own ranks, so that the average person has some pathways and, as I say, signals they can follow that will help them tell the difference? But those signals only work if we take care of our own business and our own houses. We have to make sure that there is accountability, and that self-regulation, which is the reality in journalism for the most part and the reality in PR, is effective and has teeth.

That’s why it was great to see the PRCA take action against Bell Pottinger. PRSA now has a grievance policy. These are small steps, and there’s a lot more to be done, especially as there’s no licensing: anybody can say they do PR even if they’re not. That is one of the challenges we’re going to face.

I think the big threat in PR is probably really unethical reputation consultants, really unethical firms saying, “We’ll scrub this, we’ll scrub that,” firms that, as a rule, are happy to buy followers or use gray-hat practices to get followers and engagement. That’s where I think the most enforcement would probably need to take place. I did a story about these shady reputation consultants about six months ago, and when I was interviewing some of the more ethical search engine optimization people, I asked, “Is there an association? Do you have a code of ethics?” And they said, “No, no, there’s nothing like that.” I think figuring out how that stuff, which is probably the most prevalent, can actually be exposed and called out by people in the industry is a good first place to look.

That’s a great piece of advice. And thinking of advice, what is the best piece of ethics advice you were ever given?

Oh, that’s a really good question. I think the Golden Rule is probably the best piece of ethics advice I’ve ever heard: “Do unto others as you would have them do unto you.” And I think that’s really, really important. I internalize that a lot in journalism because you have to think about how you’re representing people, and you have to think about, on the most basic level, that someone is giving you their time, they are giving you their expertise, they are maybe letting you into their home, they are talking about sensitive things. You are a keeper of that, and you have to balance the newsworthiness and all of the elements you’re weighing with the basic humanity of it. And I think that’s a really, really important thing.

And one other one has been really central to me in journalism. Before I was focused on media manipulation, I spent a decade running a blog and I wrote a book about corrections, accuracy, plagiarism, and fabrication in journalism. I spent a year as almost a journalism cop, pulling out the bad stuff in our industry. And for me, one of the things that is fundamental to that, and fundamental to building trust, is this paradox: the more we are willing to admit our mistakes and our errors, to show that we are human and fallible, to genuinely own up to them, the more we can actually earn people’s trust. That for me was a really big lesson.

This is a practice that has been in place, starting with newspapers, for literally hundreds of years. I actually found the very first newspaper published in Massachusetts, from the 1600s. The guy who wrote it made a promise. He said, “If there’s anything incorrect in the previous edition, we will correct it in the next.” And he said that in the 1600s. I think there’s something fundamentally human about that: realizing that you actually earn trust by being willing to admit your mistakes, because people do not expect perfection and are in fact very suspicious of the aura of perfection.

That’s great advice, not just for journalists but also for communications professionals and brands. When you make a mistake, own it.

Yeah, absolutely.

That’s the really big thing where so many people fall down, whether it’s journalists, communicators, brands, or what have you. You feel really bad when this happens, or you’re terrified your stock price is falling, or whatever. You’re just trying to get past it, and it’s uncomfortable. The natural reaction is to not really lean into that mistake, not take it all on, and not make a full-hearted apology. It becomes: how do we minimize, how do we say just enough to placate people and then move on? And that’s often a recipe for additional blowback.

I’ve seen that happen in journalism so many times, where it’s a really bad mistake and the publication will run a correction, but it’s written in inhuman, stilted language. Maybe they falsely accused somebody of a crime or made a really terrible mistake, and without acknowledging, “Listen, this was really bad and we feel really, really bad about it,” they instead adopt some kind of pseudo-institutional voice and never acknowledge what the mistake really was. That’s just a recipe for it to keep going.

Is there anything else communicators should keep in mind?

The bad actors are often at the bleeding edge of technology. In our story, we have this guy in Taiwan. He learned from people in China how to use a mixture of artificial intelligence, scraping, and a few other technologies, putting them all together into a sort of one-stop shop for online opinion manipulation. I think people need to be prepared to meet that technological match. The bad actors are always looking for new ways to exploit these systems. They’re learning from how they got caught last time and adapting. So keep in mind that if you’re someone in PR who really wants to stamp these players out, whatever we have in our article, they’re going to read it too, they’re going to adapt, and they’re going to try to make sure they don’t end up in the next article.

You’ve got to think about that. As you think about trying to enforce ethical practices in your industry, and as we think about it in ours, we have to respect the adversary and not view this as something so marginal, or so far overseas, that it’s not going to affect us. The lesson of the internet has been that the stuff happening overseas or in other places is only a matter of time away from reaching your shores.

Listen to the full interview, with bonus content, here:

Note: I first saw this story on the PRSA Ethics in Communication community. If you care about ethics and are a PRSA member, you should join it.

Mark McClennan, APR, Fellow PRSA
Mark W. McClennan, APR, Fellow PRSA, is the general manager of C+C's Boston office. C+C is a communications agency all about the good and purpose-driven brands. He has more than 20 years of tech and fintech agency experience, served as the 2016 National Chair of PRSA, drove the creation of the PRSA Ethics App and is the host of EthicalVoices.com
