Sophie Zhang worked as a Facebook data scientist for nearly three years before she was fired in the fall of 2020. On her final day, she posted a 7,800-word memo to the company’s internal forum — such farewell notes, if not their length, are a common practice for departing employees. In the memo, first published by BuzzFeed News, she outlined evidence that governments in countries like Azerbaijan and Honduras were using fake accounts to influence the public. Elsewhere, such as in India and Ecuador, Zhang found coordinated activity intended to manipulate public opinion, although it wasn’t clear who was behind it. Facebook, she said, didn’t take her findings seriously.
Zhang’s experience led her to a stark conclusion: “I have blood on my hands.”
Facebook has not disputed the facts of Zhang’s story but has sought to diminish the importance of her findings.
“We fundamentally disagree with Ms. Zhang’s characterization of our priorities and efforts to root out abuse on our platform,” Facebook said in a statement. “As part of our crackdown against this kind of abuse, we have specialized teams focused on this work and have already taken down more than 150 networks of coordinated inauthentic behavior. Around half of them were domestic networks that operated in Latin America, the Middle East, North Africa, and in the Asia Pacific region.”
This interview has been edited for length and clarity.
QUESTION: Why were you fired from Facebook?
ANSWER: I’ve made the news for much of the work I have done protecting elections. This might sound very important to the average person, but at Facebook I was a very low-level employee. In addition, this work was not my official job. I was conducting it entirely in my spare time, with the knowledge and acquiescence of leadership, of course. At first, the company was supportive of this. But gradually they lost patience with me. I was underperforming.
Q: In your memo, you wrote that you have blood on your hands. Why did you say that?
A: Whether something was acted on was, as far as I could tell, entirely a function of how much I yelled, how much I made noise.
I know that many of the decisions I made have had an impact in the countries I worked on. The U.S. is still deeply affected by what happened in 2016 with Russian manipulation on Facebook. For many countries like Honduras or Azerbaijan, this is their own Russia. But it’s done not by a foreign power, but by their own government — and without even bothering to hide.
I tried my best to make decisions based on the information I had at the time. But of course I am just one person. Sometimes I waited on something longer than I should have. At this level of responsibility, your best is often not enough.
Q: How did you get into the work you did?
A: When I joined the company I was, like many people, deeply affected by Russia 2016. And I decided to start looking for overlap between inauthentic activity and political targets. And I started finding many results in many places, particularly what we call the global South, in Honduras, Brazil, India.
Honduras got my attention because it had a very large amount (of inauthentic behavior) compared to the others. This was very unsophisticated activity we are talking about. Literal bots. And then I realized that this was essentially a troll farm being run quite openly by an employee of the president of Honduras. And that seemed extraordinarily awful.
Q: Then what did you do?
A: I talked about it internally. Essentially everyone agreed that it was bad. No one wants to be defending this sort of activity, but people couldn’t agree on whose job it was to deal with it.
I was trying desperately to find anyone who cared. I talked with my manager and their manager. I talked to the threat intelligence team. I talked with many integrity teams. It took almost a year for anything to happen.
Q: You’ve said there is a priority list of countries. What happens to countries that aren’t on that list?
A: It’s not a hard and fast rule. Facebook does takedowns in small countries, too. But most of these takedowns are reactive, by which I mean they come from outside groups — tips from opposition groups, tips from NGOs, reporter investigations, reports from the CIA, etc. What happened in this case was that no one outside the company was complaining.
Q: Given the resources Facebook has, why can’t it prioritize every country?
A: The answer I saw at Facebook when I was there, when these questions were asked, was that even though Facebook has a ton of money, human resources are different. Even if you have infinite money, you can’t expand a team by a factor of 10 overnight. It takes time to train people. It takes time to grow.
And I was willing to believe that for a while when I was there. But in retrospect, I think if they genuinely believed it was important, they would be taking steps that they aren’t. They would be focusing very heavily on retaining talent in the integrity teams. And they would certainly never have fired me.
Q: How do people still at Facebook try to change this?
A: Like most employees, they’re just average people who want to do the 9-to-6, want to go home at the end of the day and sleep.
There’s also a self-selection bias. If you think that Facebook is evil, you aren’t likely to join Facebook.
But there are many people also who joined Facebook because they wanted to make it better. I was very upfront with them when I joined. I don’t think Facebook is making the world a better place. And I told them I wanted to fix it.
Q: Is there a concern among employees about the company’s image?
A: I think employees have gotten more pessimistic over time. But there’s also a very strong insularity and perhaps paranoia toward the mainstream press. People are skeptical of what the press says about the company.
I don’t want to diminish the fact that Facebook has been very open historically. We had regular access to the CEO. I was able to, as a very low-level employee, be involved in our discussions with a company vice president. But that’s also been changing over time because of fear and worry about employee leaks.
Q: Who is doing the work you did now?
A: I don’t know. I was the only person who was going out on my own to look for this behavior rather than waiting for people to tell us that something was going on. The reason I found so many things so easily was because there was so much low-hanging fruit.
Q: Facebook says it’s taking down many inauthentic accounts and has sought to dismiss your story.
A: So this is a very typical Facebook response, by which I mean that they are not actually answering the question. Suppose your spouse asks you, “Did you clean up the dishes yesterday?” And you respond by saying, “I always prioritize cleaning the dishes. I make sure to clean the dishes. I do not want there to be dirty dishes.” It’s an answer that may make sense, but it does not actually answer that question.
How to prepare your Facebook account for your digital afterlife
Today, much of our private and personal lives plays out online, especially on social media platforms, where we share our thoughts and post photos and videos over the years. Among these platforms, Facebook is the most widely used social media service today. Many of us, along with our friends and family members, have a Facebook account, and we post and share everything from private photos to personal messages on it.
But have you wondered what happens to your Facebook account and the information (like posts, comments, photos, videos, etc.) that you have created and accumulated on the service after your time?
■ What will happen to my account?
■ Who can access your profiles?
■ Who will own your account and data?
■ How to manage it when such a time comes?
Facebook has added features so that you can decide what happens to your account when such a time arises. Follow the steps below to set it up and ensure that the information in your Facebook account is handed over to someone else safely or managed according to your wishes.
Setting up Facebook’s legacy contact:
In the case of Facebook, you can choose to memorialise your account and hand over the control to a ‘Legacy contact’ of your choice or altogether delete your profile after your time.
Step 1: To set up your legacy contact, you can visit the ‘Settings & privacy’ option under your profile and select the ‘Memorialisation settings’ under ‘General Account settings’. You can also sign in to your account and visit https://www.facebook.com/settings to access this setting.
Step 2: Now, you can choose a legacy contact in this setting by searching for and adding a friend from your account as your legacy contact. Do note that, once memorialised, the legacy contact can only moderate the posts on your page and not post on your behalf.
Step 3: The next setting lets you choose whether to allow your legacy contact to download all the data you have created or shared on your Facebook account, such as posts, photos and videos.
Step 4: The final setting on this page could be considered an alternative to choosing a legacy contact. This setting is to delete your complete Facebook account once you pass away. Facebook needs to be informed about your death and requires verifying it with valid documentation to activate this feature. The company will delete all your information on Facebook on completion of this process.
To know more about these settings, you can visit the FAQ page on legacy contact.
Big EU lawsuit against Facebook morphs into 3-year ‘partnership’ with complainants
Three years ago, a group of EU consumer agencies launched a multi-country lawsuit against Facebook, accusing the social media giant of having illegally harvested the data of millions of users.
More than 300,000 angry Facebook users positioned themselves behind the collective action suit, which promised to award them individual monetary damages if the company was found guilty of wrongdoing.
On Friday, those lawsuits quietly morphed into a brand new partnership with Facebook.
Euroconsumers, the umbrella organization behind the Spanish, Italian, Belgian and Portuguese lawsuits, announced they were entering a partnership with the company focused on the “safety and privacy” of Facebook users.
The move comes after POLITICO reported that Euroconsumers had settled its lawsuit with Facebook at the end of April — and highlights the fact that collective action lawsuits rarely make it over the finish line in Europe, sheltering companies from the type of action that can produce crippling damages in U.S. courts while leaving consumers with little recourse.
Originally, Euroconsumers had told people who joined the case it would seek compensation of €200 for every Facebook user whose data was mishandled.
In the end, though, there will be no court decision, no admission of wrongdoing by Facebook and no direct payment from the company to consumers as a result of the settlement, according to Euroconsumers.
Instead, the consumer groups and Facebook said they were forming a joint committee focused on three priorities: sustainability, digital empowerment and fighting scams. The issue of privacy — which was the explicit focus of the lawsuit — is the “umbrella” under which the three priorities fall.
As for the consumers, they are being promised a vague consolation prize.
The four consumer groups said they would commit to “reward” consumers who joined the original lawsuit with “a package to help consumers be safe online” — but no hard cash.
Asked whether Facebook had paid money to Euroconsumers in the settlement, the group declined to comment. POLITICO reached out to Facebook, but the company didn’t give an immediate response apart from the press release.
Meanwhile, the committee isn’t committed to producing any specific results.
“There are specific initiatives in the making, but there will also be a consumer reporting channel. We will be able to report problems that emerge, like feedback from our members,” said Els Bruggeman, head of policy at Euroconsumers.
A spokesperson for the group said: “It’s the moment to try to influence the reasoning from companies who are managed far away.”
Legally speaking, though, the heat is off Facebook.
The consumer groups will evaluate their collaboration in three years.
“An agreement for one year would be too short. Three years is long enough to be able to evaluate. There will be a lot of changes in the digital world in that period,” added the spokesperson.
In the meantime, a change in legislation may give future collective action lawsuits in Europe more teeth: A directive finalized late last year could lead to bigger pan-European collective redress cases.
Russian watchdog demands that Facebook delete post insulting WWII veterans
MOSCOW, May 29. /TASS/. Russia’s Federal Service for Supervision of Communications, Information Technology, and Mass Media (Roskomnadzor) demanded that US company Facebook delete an Instagram post that insults the memory of World War II veterans, the watchdog said on its website on Friday.
“Roskomnadzor has sent a letter to Facebook Inc top management, demanding that content insulting the memory of World War II veterans be deleted,” the watchdog said. “The governmental agency found the unlawful post on the Instagram social network, owned by Facebook.”
According to Roskomnadzor, publication of clearly offensive information that insults Russia’s military glory and memorable dates, or desecrates military glory symbols, or offends WWII veterans constitutes a criminal offense in Russia and is subject to criminal prosecution.