Facebook employees must see the platform’s dark side too

[Image: Facebook new employee orientation handbook]

It’s time for Facebook to erase the buffer between its techno-utopian employees and the outsourced content moderators who understand the platform’s dark reality.

By Yaël Eisenstat, former global head of elections integrity operations for political advertising at Facebook.

Should Facebook’s moderators, who are on the front lines of protecting the company’s 2.7 billion active users from the most egregious content, be brought in-house?

That’s one of the demands made by workers in a November “open letter from content moderators”, addressed to Mark Zuckerberg, Sheryl Sandberg, and the heads of two third-party content moderation companies.

Executives often couch outsourcing decisions in financial terms, but I know from experience that, at Facebook, there’s a darker side to keeping content moderators separated from full-time employees.

In June 2018, I was hired as Facebook’s “global head of elections integrity operations” for political advertising. During my short time working at the company, I liked my colleagues’ passion and positivity, but I was uncomfortable with the pervasive bubble mentality there. Outsourcing the dirty work helps maintain an internal techno-utopian culture that believes “connecting people” is the world’s most important mission.

My first day visiting One Hacker Way for orientation felt like an indoctrination into the most fun, important club in the world. The day was filled with excitement, energy, speeches about how we were the smartest people, reminders of how critical our mission was, and a constant drilling home of how Facebook was “our company” now.

And while I certainly fell for the excitement at times, the entire day left me feeling uneasy. I was hired to help the company fix one of its biggest problems, one that, as someone who had spent 18 years in the national security and global affairs arena, I saw as key to the future of democracy: elections integrity. I didn’t join Facebook for its fuzzy mission, or for the free food and inspirational posters. And while I understand the need for company cheerleaders to rile up the crowd of new employees, the complete absence of any discussion of the real challenges, the darker side, the responsibility to fix past mistakes and protect users from harm, was stark. It was all rainbows and unicorns.

Most outside observers know that, at the very least, there are legitimate questions about Facebook’s role in sowing and spreading hatred and division. But until recently, most employees seem to have clung to the shiny, utopian veneer.

The content moderators don’t have that luxury. Every day they are expected to wade into the most disturbing and toxic parts of the platform — to see things that, by definition, nobody should have to see. Meanwhile, they are kept apart from Facebook offices, outsourced to remote buildings, without the benefits, perks, or daily interaction with Facebook employees.

Even when I visited a content moderator facility, I was kept in a conference room and given well-orchestrated briefings. I was not free to roam the floor or speak with moderators. There are some Facebook employees who choose to spend time experiencing content moderation, and I recall one person who volunteered — a military veteran — telling me what a grueling experience it was for him. No wonder that, for the most part, Facebook employees keep a safe distance from the nastiest side of their product.

I have written about why, at a company like Facebook, these types of employees are “second tier in companies’ power structures,” and about the effect that treating them this way has on company culture. While there are many business reasons to outsource certain roles, I believe that in this case Facebook is also outsourcing responsibility for the real-world harms of its product.

But one effect that is rarely discussed is whether that separation helps leadership perpetuate a false narrative to its workforce, keeping employees happy, motivated, and dismissive of outside criticism.

Whether or not this was the intent when Facebook first started contracting out content moderation, it has real ramifications. Employees are shielded from seeing the worst of humanity, and from seeing how the platform they helped build gives those worst humans a tool to encourage, spread, or even facilitate some of the most dangerous and depraved behavior.

During my years working on counter-extremism issues, I made sure to spend time on the ground with communities affected by our policies and practices. Developing that empathy is fundamental if you truly want to serve a community. And while I understand that most Facebook employees cannot do so directly, they can at least push themselves to better understand users outside of their like-minded circles. Whether by interacting more directly with content moderators, or through mandatory rotations serving as content moderators themselves, that exposure could help build a deeper understanding of the victims of the worst kinds of online behavior.

Imagine if the person next to you in the lunch line, drinking with you at a company happy hour, or playing ping pong with you on a break was telling you stories of the brutally violent child abuse or terrorist beheading they had just watched on the platform. That brush with reality is not just a downer. It forces employees to confront the realities of the world they are so set on “connecting,” and perhaps to question whether they are truly doing all they can to ensure their business choices and practices serve humanity. And it would, hopefully, push employees to demand more of their leadership.

If the company is truly intent on ensuring it does not have a negative impact on society, that will require a strong dose of reality for a workforce that has long been fed the idea that Facebook is a fun and exciting place to work for people who want to “change the world”. It will require removing the blinders and truly grappling with all sides of the business.

Having employees see both the good and the bad on a daily basis, or at least work directly with those who do, can only help. Pushing employees to ask tough questions, to struggle with the full spectrum of reality — not just the warm and fuzzy — would make them better stewards of the public square.

If you are not directly confronted by the negative, then it is easier to not only bury yourself in your work and feel proud about it, but to dismiss outside critics as naysayers, self-promoters, or too “clueless” to understand how the company works. If you don’t have to personally experience the pain, it is easier to remain focused on fun, growth, scale, and profit.

There will always exist a tension between those who prefer to see the optimistic and positive side of things, and those who ground their work ethic in a deep understanding of the world’s ills and risks. At a company as big as Facebook, there should be room for both. The idealist and the skeptic should collaborate, push one another, and build a more well-rounded, robust product.

I have long argued that it is problematic to value those who protect the company, usually viewed as cost centers, less than those considered revenue generators. In separating the content moderators, the true protectors of the platform, from the builders and designers, Facebook is doing its employees and the public a disservice. It is creating a walled-off garden on campus instead of pushing its workforce to take off the blinders, confront its demons, and take responsibility not just for the positive impact of its platform, but also for the most negative.

Only then can Facebook employees be expected to truly push for the changes that so many outside activists, academics, journalists, and civil rights leaders have called for.

This article was originally sent to founding subscribers of Techworker, a 100% independent news site for (and about) tech workers, launching in January 2021.

To become a founding subscriber, visit techworker.com

Yaël Eisenstat is a Visiting Fellow at Cornell Tech’s Digital Life Initiative, where she works on technology’s effects on civil discourse and democracy. In 2018, she was Facebook’s Global Head of Elections Integrity Operations for political ads. Previously, she spent 18 years working around the globe as a CIA officer, White House advisor, diplomat, corporate social responsibility strategist at ExxonMobil, and the head of a global risk firm. Yaël works with governments, tech companies, and investors focused on the intersection of ethics, tech, and policy, including as a Researcher-in-Residence at Betalab, an early-stage cohort-based investment program with the singular goal of catalyzing startup activity around “Fixing The Internet”.

