EthicalVoices

The Most Ethical PR Professionals Don’t Just Fight Fires, They Prevent Them – Melanie Ensign

Joining me on this week’s episode is Melanie Ensign, the head of security, privacy, and engineering communications for Uber. Prior to that, she worked as the security communications manager for Facebook.

Why don’t you tell our listeners a little bit more about yourself and your career?

I knew at a very young age that I wanted to be a shark scientist, so I went to college the first time for marine biology. A couple of existential identity crises later, I ended up completing my degree in communication, and I was really fortunate that the undergraduate school I went to had a heavy focus on research, behavioral science, and the psychology behind human communication and interpersonal relationships.

I had a really good, solid foundation, but I felt like I needed a stronger understanding of, “How do you make money doing this from a business perspective?” and, “What is the value for organizations of a communications professional?” So I ended up getting my graduate degree in corporate PR from a program that had a strong emphasis on management, investor relations, and ethics, so I could bring a well-rounded business perspective to the communications profession.

My first PR job was actually doing PR for Burson-Marsteller in New York, so I was doing PR for PR, which was an interesting way to start. But it gave me the opportunity to get to know all the big players, at least from an agency perspective, and get a sense of everybody’s capabilities, leadership styles, client portfolios, and things like that. I ended up moving to a different agency, where I started working on some environmental and tech clients. I took what I had learned from marine biology and various conservation work with shark scientists and helped organizations translate things like their environmental footprint and different scientific concepts into something a little more relatable and understandable for their stakeholders.

And by complete chance and a very fortunate accident, I had the opportunity to start working with the chief security office at a very large telecommunications company here in the US. I worked with them for about six years, and that’s really where I cut my teeth in cybersecurity and privacy. That experience included security incidents and the Snowden revelations, and it helped my firm at the time build out a bigger cybersecurity practice, helping companies that sometimes weren’t even tech companies but needed a security story or needed to be able to explain certain data protection issues to their constituents.

I did spend some time at Facebook, working with their security team as well, and I have now been at Uber for just over three years. It turns out there is a lot in common between sharks and hackers, and so I actually get to use a lot of what I learned in school.

What is the most difficult ethical challenge you ever confronted at work?

I have to say it has really been not any one particular incident, but the day-to-day decisions. This is where most of the grinding happens. Decisions are either going to recharge you or wear on you. I really think it’s keeping up the day-to-day commitment and being mindful that most of the decisions we’re making as communications advisors have some component of ethics to them.

People tend to glorify firefighting in our profession, and I think unfortunately that leads a lot of communications professionals to think that they’re most valuable when there’s a crisis.

And the reality is, we are incredibly valuable prior to that moment, in helping our organization build muscle memory and resiliency so that they’re better prepared to do the right thing when one of these big moments happens. Quite honestly, there’s a lot we can do to steer our organizations in a better direction so we can actually avoid some of these minefields. For me, the most challenging part has been making sure that I’m identifying decisions that may seem innocuous right now but are in fact just another step in boiling a frog. It is working to make sure that we don’t end up in a position we never expected to be in, just because we weren’t careful about our choices early on.

One of the reasons I feel this is the most challenging and difficult is that this is where you get a lot of the internal politics in an organization. You must be willing to stand up every single day for what the right thing is. I think that is a lot harder for most communications professionals than the moments when a crisis is happening and everybody looks at the PR person for an answer. Making sure that you’re able to insert your voice and represent the conscience of the organization on a daily basis, with all of the teams that you work with, I think that’s what makes it the most challenging.

When the spotlight is not on you, how do you make your voice heard so that others are really listening to you?

One of the things that I learned when I started working in-house is to not wait until somebody drops something on your desk and expects you to be a mouthpiece for it. The teams that I work with know that I want to be involved from the very beginning. At Uber, I am part of our product development and review process. I am part of our policy discussions. I work really closely with our legal team and our engineers to make sure that from the very beginning, when we build something, we’re thinking about not just the external perception of it, but whether we’re setting a precedent for ourselves as an organization that we’re not going to want to live with down the road.

We need to get involved in more steps of the process and not just allow ourselves to be used as a mouthpiece for somebody else’s story. To really understand the thought process that led to a decision, and the technical way that something was built, the onus is on us to be part of those conversations and to ask our teams to include us. I think a lot of times people just assume that we don’t want to be involved until there’s something to talk about externally. That’s not the case if what you’re trying to do is guide your organization away from potential issues.

You’ve spent most of your career looking at privacy and security. Are there any ethics issues that you think are really important for people to keep in mind when it comes to privacy and security?

I think one of the biggest things, and fortunately I’m not the only one in the industry who has started talking about this, is that in security and privacy there is a huge tendency to rely on tactics that we refer to as FUD: fear, uncertainty, and doubt. A lot of this comes from the vendor sector, the security companies who are trying to sell solutions.

And to be fair, I’ve been in their shoes. I have done that. I understand the angst and why that feels like an obvious path for them to go down. But we can do a better job of being more sophisticated and educating our internal and external clients on why that is such a dangerous path if our goal is in fact to protect people.

If you understand how the brain responds to that type of stimuli, then you understand that it’s actually counterproductive to education or information retention. I think that is why we see the same issues popping up over and over and over again in the data protection space. We are simultaneously trying to scare people while shoving new information at them. And psychologically, that’s just not how the cognitive mind works.

We have actually created a really dangerous situation within the security space, where the topic in general demobilizes a lot of people. They tune out. We have warning fatigue, because there are too many things coming at you at once. When people are scared of something, they’re less likely to pay attention to it, and it’s hard for them to make good decisions. It’s important to understand how the human brain actually responds to stimuli like that.

When I see a vendor or any company, or even a reporter, going down this path of fear-mongering, I know right away that your number one priority is not actually protecting people. Because if it were, you would be thinking differently about, “How do we engage with you on a topic that does have some risk?” It’s not as if this is all sunshine and rainbows; there is risk involved. But I need to talk to you about risk in a way that’s actually going to help you grasp the concept and take action, rather than making it something you want to run away from or stick your head in the sand about.

How do you approach it, rather than using FUD?

An analogy that I use all the time when I’m explaining this to people is that it is like the process of learning how to scuba dive. This is obviously something that is near and dear to my heart anyway, as an avid scuba diver and a shark advocate. But it is not natural for humans to breathe underwater. When you try to breathe underwater, every part of your body is actively against the process. So when you’re starting out, there can be a lot of anxiety for new divers. This is another area where there is risk. You need to follow certain rules and protocols. The equipment needs to be checked against certain standards.

It is not as if there is no risk to scuba diving. There is very high risk in certain cases. Yet if we didn’t teach and train people how to manage that risk and how to protect themselves, imagine all of the things that we would never know about the ocean, right? All of the experiences that we as humans could have missed, the scientific knowledge that we wouldn’t have.

It’s just a matter of understanding, “Why is it that people are worried about this?” and then recognizing that information and curiosity are actually the antidote.

If somebody is scared of something, it’s usually because they don’t understand it or because they’re lacking information. I teach my teams to look at this as if we’re teaching somebody to scuba dive. There are risks. We need to be really honest about those risks, otherwise you might make a dangerous mistake. But if we fixate on that, we’re never going to get them to the next step, which is helping them have a productive and enjoyable experience on the internet or using another type of technology that is actually going to help them in their life or add some entertainment or enjoyment value.

There’s a security conference every year that we host out in Hawaii, and every year we switch islands. I started developing a separate track for this conference that we refer to as the dive track, and I’ve started helping more security engineers and security professionals get certified in scuba diving, taking them through that process with a dive master so that they have a very recent experience of working through that fear on a different topic. They now personally understand what it was like to overcome that anxiety, and the experience they had once they got to the other side of it. I tell them to pay really close attention to the things the dive master does to guide them through that learning experience.

Because in the world of cybersecurity, we have to be the dive masters. We are responsible for protecting these people, and we are responsible for teaching them what they can do to protect themselves. Even if a dive master can recognize the warning signs, the diver still has to keep their regulator in their mouth. I’m always looking for those opportunities: understanding the basis of the fear, the information people are missing, and the experience they’re trying to get to that I can leverage to help motivate them through that initial anxiety.

Beyond security and privacy, are there any other issues you’re seeing as some of the key ethics challenges for communicators today and tomorrow?

I think a big issue that we’re all facing is the degradation of integrity and trust in information in general. It’s a big challenge for society. It is certainly a challenge for communications professionals, not only because we may find ourselves having to confront misinformation or correct information that has been misconstrued, but because we also increasingly face pressure from internal and external clients to take advantage of some of that degradation, and that really concerns me.

When we look at the state of journalism, there are more and more outlets now that I really just consider to be content farms, and it is really disheartening to see PR people actually manipulating that to their advantage. Quite frankly, it’s usually because they’re trying to appease some outdated metrics like coverage volume or impressions.

We very quickly become part of the problem if we look at those types of things as opportunities rather than problems that we need to help fix. I’m starting to see more and more people actually consider those types of stories or articles to be wins, as if all media coverage were created equal and all media coverage had value. The external environment around us has unfortunately created an opportunity for some organizations to weasel their way in and manipulate it in their favor, and I think that only exacerbates the problem.

I’m concerned about the rise of deepfakes, not-so-deep fakes, AI, and all these other elements. It used to be that you believed what you saw, and now we’re not going to believe what we see. How do brands such as Uber and Facebook, and the brands we represent, establish ourselves as authorities and fight back against the fake content that’s being pushed out there?

I can’t speak for Facebook. I keep my leg out of that bear trap.

But I think it’s important for organizations to understand that you have to start now in developing that trust. You have to be known as a trustworthy organization, and this is why the day-to-day decisions make such a big difference, because every retread and every mistake makes it so much harder. If you get to the point where there is something like a deepfake and you come out and say, “This is a deepfake,” people need to be able to believe you. That kind of trust can’t be built in the moment. It has to be built up in advance.

If you don’t have the trust before that situation happens, I’m not sure there’s much you can do in the moment to get it back. Part of it also has to do with knowing whose trust you need. If we go back to some of these outdated PR metrics, you may not need every single person on the planet to align with you on a certain issue in order to move forward. Part of our job is helping the organization assess the actual risk of certain issues and situations, whether it’s a deepfake or something else. For example, if my family are the only people who see a deepfake, you’re probably going to be fine.

I think we’re moving into more of a risk management role, helping companies understand, “Where are influential conversations actually taking place?” and, “Where are the low-level conversations that may include misinformation but aren’t particularly influential, where you probably don’t need to waste your energy trying to rewrite every corner of the internet?” Understanding the real impact of these types of things is going to force our profession very quickly to get much better at measurement and accuracy.

What do you consider to be good, effective, ethical measurements of PR efforts?

I don’t know that I have a universal answer, because I do think the context and the objective dictate a lot of that. But I focus more on, “What are the behavioral outcomes?” versus just the output.

For example, there are tons and tons of articles about how to secure your online accounts and how to use two-factor authentication. The internet is just littered with these, whether it’s a journalistic article or on a company blog somewhere, and yet we know within the security community we just have abysmal single-digit adoption of two-factor authentication in most consumer communities. I think a lot of PR teams would consider those wins, because they got coverage, they created content. Yet I have a hard time seeing that as a win, because it hasn’t made a difference in securing people and getting them to use the tool.

Part of that has to do with the limitations and the weaknesses of the tools themselves, and that is where I see the role of a communications professional being able to go back to their product team and say, “We’ve done X, Y, and Z on the comm side. If we’re not moving the metric of people using these tools and finding them useful and valuable, we need to figure out how to fix the product.” That’s not a message problem. That message has saturated the market. That’s where, again, being involved in those discussions in product development and business strategy early on can help influence, “Let’s build something that protects people even if they never have to push a button.”

That’s where Uber started. Our first two-factor authentication was on by default, because we knew that a lot of people weren’t going to turn it on. The first iteration was to build something that would be triggered if we detected suspicious activity, and chapter two was, “Let’s give them an option of how they want to use that mechanism, whether it’s a text message or a security app like Duo.” But we started out knowing that there was no amount of marketing, PR, or messaging that was going to convince everybody in the world to turn on this feature, so we built it to be on by default.
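To make that pattern concrete, here is a minimal sketch of risk-based, on-by-default two-factor authentication: a step-up challenge fires only when a login looks suspicious, and the user chooses the channel once challenged. This is illustrative only, not Uber’s actual implementation; the risk signals, thresholds, and names such as `looks_suspicious` are hypothetical.

```python
# Minimal sketch of risk-based, on-by-default 2FA. Illustrative only --
# not Uber's actual system. All signals and thresholds are hypothetical.
from dataclasses import dataclass


@dataclass
class LoginAttempt:
    user_id: str
    device_is_known: bool
    ip_country: str
    home_country: str
    failed_attempts_last_hour: int


def looks_suspicious(attempt: LoginAttempt) -> bool:
    """Hypothetical risk signals that would trigger a step-up challenge."""
    return (
        not attempt.device_is_known
        or attempt.ip_country != attempt.home_country
        or attempt.failed_attempts_last_hour >= 3
    )


def authenticate(attempt: LoginAttempt, preferred_channel: str = "sms") -> str:
    # Chapter one: the challenge is on by default and fires only on risk,
    # so users are protected even if they never opt in to anything.
    if not looks_suspicious(attempt):
        return "allow"
    # Chapter two: let the user choose how to complete the challenge,
    # e.g. a text message or an authenticator app such as Duo.
    if preferred_channel == "app":
        return "challenge:push_to_authenticator_app"
    return "challenge:sms_code"


if __name__ == "__main__":
    attempt = LoginAttempt(
        user_id="rider-123",
        device_is_known=False,
        ip_country="DE",
        home_country="US",
        failed_attempts_last_hour=0,
    )
    print(authenticate(attempt, preferred_channel="app"))  # challenge fires
```

The design choice mirrors what she describes: protection does not depend on users opting in, and the choice users do get is about how to complete the challenge, not whether it exists.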

You brought up Uber. Is there anything you want consumers to know about Uber and ethics, and what you’re doing to be as transparent and secure as possible?

There are a number of things, but I think the most important thing that people probably don’t realize is that we don’t have the same people running the company as we did in the past. I’ve been at the company for about three years. Six months after I joined, our former CEO was asked to leave, and so we have a new CEO, a new CSO, a new chief legal officer, our first chief privacy officer, and a new chief of compliance. It’s very different at the most senior levels of the company. But I think more importantly, there are a lot of people like me at Uber who advocate on a daily basis for getting things like security and privacy into the DNA of the company.

Because we are the largest player in the space right now, at least in the US, we’re also the only company within our market that has this kind of robustness on our team. It’s one of the reasons why we’re in a pretty hot dispute in Los Angeles right now with local municipalities who want to collect real-time location data of riders. That is essentially government surveillance, and we are the only company that is standing up to that demand. I don’t think that’s because other companies don’t want to. I think it’s because they can’t, because they don’t have the resources or the expertise to push back on these types of things while keeping the lights on.

I sincerely hope that the security and privacy teams that we’ve built, and the data governance and ethics governance that we’ve built here at Uber, will actually continue to be a model for the rest of the industry. I don’t think it is any secret that we learned the hard way, but people like me don’t run away from fires. We run towards them to help put them out.

We have actually put in a great deal of effort to teach the company how to operate differently, and I don’t experience tons of pushback from product teams or engineering teams when I insert a perspective on privacy or security. We even have a lot of conversations about how just because something is legal doesn’t mean it’s okay. We’re very honest and open about that within our team, and being able to have those conversations without people aggressively pushing back on me has been one of the most fulfilling and rewarding experiences of my career.

What is the best piece of ethics advice you were ever given?

Always tell the truth. I will take that one step further to say, prepare your organization to tell the truth.

It should be an expectation of everybody who works with you that that is going to be your guidance before they even ask you. In helping them tell the truth, that means not just the good, the bad, and the ugly, but it also means helping them anticipate a moment where they are going to have to tell the truth and to ask them, “Is this the truth that you want to have to tell? If not, maybe we should make different decisions.”

My engineers can tell you that I will often hold them to account for some of these decisions when I say, “If we have to discuss this publicly, do you want your name on the blog post? Do you want your name in the media statement? If not, then we need to rethink this decision, because if you’re not willing to be publicly accountable for it, I’m not either.”

Listen to the full interview, with bonus content, here:

Mark McClennan, APR, Fellow PRSA
Mark W. McClennan, APR, Fellow PRSA, is the general manager of C+C's Boston office. C+C is a communications agency all about the good and purpose-driven brands. He has more than 20 years of tech and fintech agency experience, served as the 2016 National Chair of PRSA, drove the creation of the PRSA Ethics App and is the host of EthicalVoices.com
