As lawyers for both sides offered their closing statements in the trial of Derek Chauvin on Monday, a thousand miles away, executives at Facebook were preparing for the verdict to drop.
Seeking to avoid incidents like the one last summer in which 17-year-old Kyle Rittenhouse shot and killed two protesters in Kenosha, Wis., the social media company said it would take actions aimed at “preventing online content from being linked to offline harm.”
(Chauvin is the former Minneapolis police officer found guilty Tuesday of the second-degree murder of George Floyd last May; the Kenosha shootings took place in August 2020 after a local militia group called on armed civilians to defend the city amid protests against the police shooting of another Black man, Jacob Blake.)
As precautions, Facebook said it would “remove Pages, groups, Events and Instagram accounts that violate our violence and incitement policy,” and would also “remove events organized in temporary, high-risk locations that contain calls to bring arms.” It also promised to take down content violating prohibitions on “hate speech, bullying and harassment, graphic violence, and violence and incitement,” as well as “limit the spread” of posts its system predicts are likely to later be removed for violations.
“Our teams are working around the clock to look for potential threats both on and off of Facebook and Instagram so we can protect peaceful protests and limit content that could lead to civil unrest or violence,” Monika Bickert, Facebook’s vice president of content policy, wrote in a blog post.
But in demonstrating the power it has to police problematic content when it feels a sense of urgency, Facebook invited its many critics to ask: Why not take such precautions all the time?
“Hate is an ongoing problem on Facebook, and the fact that Facebook, in response to this incident, is saying that it can apply specific controls to emergency situations means that there is more that they can do to address hate, and that … for the most part, Facebook is choosing not to do so,” said Daniel Kelley, associate director of the Anti-Defamation League’s Center for Technology and Society.
“It’s really disheartening to imagine that there are controls that they can put in place around so-called ‘emergency situations’ that would increase the sensitivity of their tools, their products, around hate and harassment [generally].”
This isn’t the only time Facebook has “turned up the dials” in anticipation of political violence. Just this year, it has taken similar steps around President Biden’s inauguration, the coup in Myanmar and India’s elections.
Facebook declined to discuss why these measures aren’t the platform’s default, or what downside always having them in place would pose. In a 2018 essay, Chief Executive Mark Zuckerberg said content that flirts with violating site policies received more engagement in the form of clicks, likes, comments and shares. Zuckerberg called it a “basic incentive problem” and said Facebook would reduce distribution of such “borderline content.”
Central to Facebook’s response seems to be its designation of Minneapolis as a temporary “high-risk location” — a status the company said may be applied to additional locations as the situation in Minneapolis develops. Facebook has previously described comparable moderation efforts as responses specifically geared toward “countries at risk of conflict.”
“They’re trying to get ahead of … any kind of outbreak of violence that may occur if the trial verdict goes one way or another,” Kelley said. “It’s a mitigation effort on their part, because they know that this is going to be … a really momentous decision.”
He said Facebook needs to make sure it doesn’t interfere with legitimate discussion of the Chauvin trial — a balance the company has more than enough resources to be able to strike, he added.
Another incentive for Facebook to handle the Chauvin verdict with extreme caution is to avoid feeding into the inevitable criticism of its impending decision about whether former President Trump will remain banned from the platform. Trump was kicked off earlier this year for his role in the Jan. 6 Capitol riots; the case is now being decided by Facebook’s third-party oversight committee.
Shireen Mitchell — founder of Stop Online Violence Against Women and a member of “The Real Facebook Oversight Board,” a Facebook-focused watchdog group — sees the steps being taken this week as an attempt to preemptively “soften the blow” of that decision.
Trump, “who has incited violence, including an insurrection; has targeted Black people and Black voters; is going to get back on their platform,” Mitchell predicted. “And they’re going to in this moment pretend like they care about Black people by caring about this case. That’s what we’re dealing with, and it’s such a false flag over decades of … the things that they’ve done in the past, that it’s clearly a strategic action.”
As public pressure mounts for web platforms to strengthen their moderation of user content, Facebook isn’t the only company that has developed powerful moderation tools and then faced questions as to why it only selectively deploys them.
Earlier this month, Intel faced criticism and mockery over “Bleep,” an artificially intelligent moderation tool aimed at giving gamers more granular control over what sorts of language they encounter via voice chat — including sliding scales for how much misogyny and white nationalism they want to hear, and a button to toggle the N-word on and off.
And this week, Nextdoor launched an alert system that notifies users if they try to post something racist, but then doesn’t actually stop them from publishing it.
Facebook-Meta Earns the ‘Worst Company of 2021’ Title in This Survey
Facebook parent Meta has been named the Worst Company of the Year for 2021 by Yahoo Finance respondents. According to the publication, an open-ended survey ran on Yahoo Finance on December 4 and 5 and drew 1,541 respondents. Facebook received 8 percent of the write-in vote, though respondents were also angry with the Robinhood trading app. Electric truck startup Nikola, which was named last year’s worst company by the same publication, also drew respondents’ ire.
Yahoo Finance also highlights, “At the same time, some critics, including conservatives, say Facebook over-policed the platform’s speech and stifled their voices.” Other critics blame Facebook and other social media platforms for failing to curb the hate speech that led to the Capitol riot.
However, around 30 percent of Yahoo Finance readers said that Facebook or Meta could redeem itself. One respondent suggested that the company could issue a formal apology for negligence and donate a sizable amount of its profits to a foundation to help reverse its harm.
On the other hand, respondents chose Microsoft as the Company of the Year for 2021. The Satya Nadella-led company crossed the $2-trillion market capitalization mark this year and shipped significant upgrades, most notably Windows 11, the OS update that succeeds Windows 10.
Facebook pays 1.7 Cr fine to Russia after failing to delete content Moscow deems illegal
In the latest legal tussle with Russia over controversial social media regulation laws, Facebook paid 17 million roubles (Rs 1.7 crore) for failing to remove content deemed illegal by Moscow. With the threat of potentially larger fines looming, Facebook parent company Meta, led by Mark Zuckerberg, is scheduled to face court next week over repeated violations of Russian content legislation, the Interfax news agency reported. As per the latest updates, the social media giant could be fined a percentage of its annual revenue.
In October, Moscow sent state bailiffs to enforce the collection of the 17 million roubles. Meanwhile, as per an Interfax report citing a federal bailiffs’ database, further enforcement proceedings were opened against the company on Sunday. Apart from Facebook, the messaging app Telegram has also paid 15 million roubles in fines for failing to comply with the Russian social media legislation that came into force in 2016.
Facebook pays $53k to Russia for refusing controversial social media laws
It is pertinent to mention that Facebook locked horns with Moscow earlier, in November, paying 4 million roubles ($53,000) over its refusal to adhere to Russian data localisation laws, the Moscow Times reported. A Moscow court said on November 25 that Facebook had paid the fine, which was levied in February, following which all proceedings against the US-based social media giant were closed. The payment stems from litigation filed against the company, alongside Twitter, in 2018. The tech companies were also forced to pay an additional 3,000 roubles ($40) for failing to comply with user data sharing rules under the law. Russian authorities have also previously blocked Microsoft-owned LinkedIn for failing to abide by the laws.
Russian social media laws
As per the Moscow Times, under the Russian social media regulation laws, all foreign technology companies are required to store data on Russian customers and users on servers located inside Russia. Companies must also share encryption data with federal authorities and record user calls, messages and the conversations of civil society groups. Critics say the apparatus amounts to a severe breach of privacy rights and gives the state unfettered back-door access to personal data that could be used to harass Kremlin critics.
Facebook Messenger Is Launching a Split Payments Feature for Users to Quickly Share Expenses
Meta has announced the arrival of a new Split Payments feature in Facebook Messenger. This feature, as the name suggests, will let you calculate and split expenses with others right from Facebook Messenger. It offers an easier way to share the cost of bills and expenses — for example, splitting a dinner bill with friends. Using Split Payments, Facebook Messenger users will be able to split bills evenly or modify the contribution for each individual, including their own.
The company announced the new Split Payments feature for Facebook Messenger in a blog post. 9to5Mac reports that the bill-splitting feature is still in beta and will be exclusive to US users at first, with the rollout beginning early next week. As mentioned, it will help users share the cost of bills, expenses, and payments. The feature is especially useful for people who share an apartment and need to split the monthly rent and other expenses with their flatmates, and it could also come in handy at a group dinner with many people.
With Split Payments, users add the number of people the expense needs to be divided among and, by default, the amount entered is split into equal parts. A user can also modify each person’s contribution, including their own. To use Split Payments, click the Get Started button in a group chat or in the Payments Hub in Messenger. Users can adjust contributions in the Split Payments option and send a notification to everyone who needs to pay. After entering a personalised message and confirming your Facebook Pay details, the request is sent and viewable in the group chat thread.
Once someone has made their payment, you can mark the transaction as ‘completed’. Split Payments will automatically take your own share into account and calculate the amount owed accordingly.
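The splitting logic described above — divide evenly by default, let individuals override their share, and keep the shares summing exactly to the bill — can be sketched as follows. This is a hypothetical illustration; the function and variable names are assumptions, not Messenger’s actual implementation.

```python
# Hypothetical sketch of the bill-splitting arithmetic described above.
# Names are illustrative assumptions, not Messenger's actual code.
from decimal import Decimal, ROUND_HALF_UP

def split_expense(total, people, overrides=None):
    """Split `total` among `people`.

    `overrides` pins specific shares; the remainder is divided evenly
    among everyone else, and any leftover cent is assigned to the first
    evenly-split person so the shares always sum exactly to the total.
    """
    overrides = overrides or {}
    shares = dict(overrides)
    evenly = [p for p in people if p not in overrides]
    remaining = total - sum(overrides.values(), Decimal("0"))
    base = (remaining / len(evenly)).quantize(
        Decimal("0.01"), rounding=ROUND_HALF_UP)
    for p in evenly:
        shares[p] = base
    # Rounding can leave the sum a cent off; correct it on one share.
    shares[evenly[0]] += remaining - base * len(evenly)
    return shares

# Example: Rs 100 among three people -> 33.34 / 33.33 / 33.33.
print(split_expense(Decimal("100.00"), ["Ana", "Ben", "Cal"]))
```

Splitting evenly is the default path (`overrides` empty); pinning one person’s contribution, e.g. `{"Cal": Decimal("50.00")}`, divides only the remainder among the others — mirroring the “modify each person’s contribution” option.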
Tasneem Akolawala is a Senior Reporter for Gadgets 360. Her reporting expertise encompasses smartphones, wearables, apps, social media, and the overall tech industry. She reports out of Mumbai, and also writes about the ups and downs in the Indian telecom sector. Tasneem can be reached on Twitter at @MuteRiot, and leads, tips, and releases can be sent to firstname.lastname@example.org.