
Responding to Crisis: An Ethics Lesson from Facebook

From lawmakers to everyday citizens, there is surging concern over how social media companies moderate speech. At the center of this scrutiny is tech giant Facebook.

Although Facebook provided consistent service and complied with the same regulations for years, the company is now being questioned over its users' speech on its platform. Most troublesome are the alleged foreign efforts to interfere with the 2016 US presidential election.

The growing unease has contributed to a crisis at the company, marked by waning public trust and rising executive turnover. A crisis of this kind would test any in-house counsel facing such an ethical dilemma.

It’s worth weighing your options when determining how to prevent and respond to crises. This is especially crucial when the issues concern morality or a tarnished brand — and not a clear violation of law.

How would you encourage your company to act when such an issue is first identified? What part do you play in the solution? Under what circumstances should you leave?

Facebook’s challenge

A simple but practical rule of thumb for the internet is that intermediary platforms are not responsible for content; users are responsible for what they post. This is grounded in early US internet regulatory laws, notably Section 230 of the Communications Decency Act and the safe harbor provisions of the Digital Millennium Copyright Act, which effectively protect internet service providers that host published speech and copyrighted content but do not otherwise control it.

Many tech firms such as Facebook can steer clear of content-related liability, provided they act as a conduit and take down offending content when notified. This protection is critical for the tech industry because it shields firms from litigation and from nonstop content moderation, which is difficult and costly. Accordingly, tech firms have traditionally designed their compliance policies and practices to remain as hands-off as possible with respect to content.

As reported by the New York Times this November, Facebook's security team learned in the spring of 2016 of foreign efforts to interfere with the election using the social media platform. The team first focused on traditional types of attacks, such as hackers probing the accounts of individuals involved in the campaigns.

[Related: 5 Considerations for an In-house Litigator When Guiding a Social Media Campaign]

Facebook Chief Security Officer Alex Stamos reportedly informed General Counsel Colin Stretch of the problem early in the investigation. Stretch took charge of the investigation, and its complexity grew throughout 2016. Facebook learned of foreign efforts to manipulate the platform to propagate disinformation intended to inflame and confuse voters, thereby distorting the election process.

To address the issue, Facebook applied its existing policies prohibiting fake accounts and the posting of shocking content, direct threats, or the promotion of violence. These measures helped the platform block millions of counterfeit accounts daily at registration.
 
However, at the time Facebook did not have policies and procedures that squarely addressed the publication of disinformation targeting the election. Done right, such a policy would require a sophisticated system for moderating content and disclosing its source. But Facebook hesitated, apparently due to political and liability concerns.

According to the aforementioned New York Times report, Facebook's senior management wanted to avoid appearing to obstruct legitimate free speech and feared a backlash from the winning party. There was also a reported concern that the investigation itself would create liability. Facebook nevertheless continued to investigate.

[Related: Social Media Policy — Still the Wild West]

Toward the end of 2017, Stretch and Stamos reported in detail to the audit committee, indicating that the problem had not yet been fully contained. When the issue came before the Facebook board, the discussion reportedly included why the issue had not been escalated sooner and what could have been done to resolve it.
 
As part of its own investigation into the election interference, the US Congress called on Facebook to testify on October 31, 2017. Stretch appeared, stating that Facebook had not identified the full scope of the Russian problem.

He described how Facebook's first line of defense is deleting counterfeit accounts and blocking phony registrations, as well as the additional steps Facebook intended to take going forward.

Then in July 2018, Stretch announced his departure from Facebook:

“As Facebook embraces the broader responsibility Mark [Zuckerberg] has discussed in recent months, I've concluded that the company and the Legal team need sustained leadership in Menlo Park” – a difficult task for him to manage from his home in Washington, DC.

However, soon after the New York Times report in November, he reversed his decision and announced that he would stay with Facebook until 2019.

Responding to new challenges

Taking media reports at face value, Facebook's reaction to foreign election interference presents a case study in in-house counsel decision-making, combining aspects of customer care, free speech, compliance, political influence, governance, and technology.

Encouraging action

What course would you have taken when the issue first surfaced? How would you have balanced the competing interests and advocated for this course of action?

In-house counsel should advocate for responsible corporate behavior, even when it’s voluntary and not clearly required by law. In Facebook’s case, close adherence to minimum legal requirements resulted in a missed opportunity to protect its business and brand soon after it learned of the foreign interference.

Solving the problem

What steps would you take to solve the problem? When called before Congress in 2017, Stretch thoroughly outlined Facebook's plans to address the election interference problem. His roadmap illustrates common options available to in-house counsel for solving new challenges as they arise:

Leverage current policies and procedures

Facebook used its longstanding practices for taking down counterfeit accounts while it investigated foreign election interference. It planned to bring on extra staff to bolster existing review efforts and strengthen the manual and automated review process.


Create new policies and procedures

When you learn of holes or lags in existing policy, create new requirements to accelerate response time and contain the current problem. Facebook created new channels to escalate issues quickly and required identity verification for political advertisers. Additionally, Facebook announced plans to expand its restrictions on advertising content to exclude "subtle expressions of violence."


Increase transparency

Inform your customers and the public of targeted changes to your product to ensure transparency and reliability. To promote authenticity, Facebook is developing initiatives to educate voters on how to recognize falsified news and to safeguard candidates against hacking. Facebook also intends to introduce greater advertising transparency, such as verifying the user, account, or page identity behind issue-based or political accounts.


Coordinate with your industry

Reach out to peers to collaborate on eliminating threats and reducing future crises. Facebook plans to share information on bad actors with Google and Twitter to establish an industry-wide standard for recognizing and removing these users.


Coordinate with government

Pursue legislation that will clarify your obligations and reduce the likelihood of a future crisis. Facebook is pursuing legislation to define clearer limits on political advertising, sharing information on bad actors with government agencies, and working with election officials to identify potential threats during elections.

 

Deciding when to leave

There are certain situations where it's relatively easy for in-house counsel to know when to leave a company. Typically in those situations, there is a clear violation of law, and counsel has neither direct influence nor a strong relationship that can be leveraged to rectify the situation.

But what's your threshold for staying at a company embroiled in morality problems? And if you decide to go but are asked to stay longer, what conditions would you require for staying? Establishing your ethical standards from the start may help if you ever have to make this type of difficult decision when a crisis hits.

About the Author

Noah Webster is general counsel and secretary for Zix. Previously ACCDocket.com's Litigation columnist, he has since become the Law Department Management columnist after moving to a more generalist role in his career.


The information in any resource collected in this virtual library should not be construed as legal advice or legal opinion on specific facts and should not be considered representative of the views of its authors, its sponsors, and/or ACC. These resources are not intended as a definitive statement on the subject addressed. Rather, they are intended to serve as a tool providing practical advice and references for the busy in-house practitioner and other readers.