Facebook Faulted by Staff Over Jan. 6 Insurrection: ‘Abdication’ – Detroit News

As rioters breached barricades and bludgeoned police with flagpoles before storming the U.S. Capitol on Jan. 6, some employees at Facebook Inc. took to an internal discussion board to express shock and outrage.

Many of the posts were imbued with a dawning sense that they and their employer—whose platforms for weeks had spread content questioning the legitimacy of the election—bore part of the blame. 

“I’m struggling to match my value to my employment here,” said an employee as the violence of the afternoon continued. “I came here hoping to affect change and improve society, but all I’ve seen is atrophy and abdication of responsibility.”

The remarks, directed at Chief Technology Officer Mike Schroepfer, were included in a package of disclosures provided to Congress in redacted form by lawyers representing Frances Haugen, a former Facebook product manager. The documents, obtained by a consortium of news organizations including Bloomberg, provide a unique window into a Facebook few outsiders ever see: a stunningly profitable technology firm showing signs of sagging morale and internal strife. 

In this combination of photos, on June 3, 2020, demonstrators, left, protest the death of George Floyd at the U.S. Capitol in Washington and Trump supporters try to break through a police barrier Jan. 6, 2021, at the same location.

“All due respect, but haven’t we had enough time to figure out how to manage discourse without enabling violence?” said one employee, according to a copy of posts by staff from Jan. 6. “We’ve been fueling this fire for a long time, and we shouldn’t be surprised it’s now out of control.” The remark was previously quoted in an article in the Wall Street Journal.


In a statement, Facebook said it ran the largest voter information campaign in U.S. history and took numerous steps to limit content that sought to delegitimize the election, including suspending Donald Trump’s account and removing content that violated company policies. 

“In phasing in and then adjusting additional measures before, during and after the election, we took into account specific on-platform signals and information from our ongoing, regular engagement with law enforcement. When those signals changed, so did the measures,” according to a Facebook representative.

Facebook spent more than two years preparing for the 2020 election with more than 40 teams across the company and removed more than 5 billion fake accounts that year, according to a company statement. In addition, from March to Election Day, the company removed more than 265,000 pieces of Facebook and Instagram content in the U.S. for violating voter interference policies. It also deployed measures before and after Election Day to keep potentially harmful content from spreading before content reviewers could assess it, which the company likened to shutting down an entire town’s roads and highways to respond to a temporary threat, according to the statement.



Over the 24 hours that followed the insurrection, employees — whose names are redacted in the documents — used the internal version of Facebook to debate the company’s performance in frank terms. Among the criticisms: that Facebook failed to act aggressively against “Stop the Steal” groups, which coalesced around the false notion that Trump had won the election, and that the company’s leaders repeatedly let down rank-and-file employees fighting to curtail misinformation and other harms more aggressively.


In this Jan. 6, 2021 file photo, violent insurrectionists loyal to President Donald Trump scale the west wall of the U.S. Capitol in Washington.

Some of those observations were later backed by research carried out by Facebook earlier this year, according to the documents. Enforcement policies that focused on individual posts weren’t applied quickly enough to a coordinated movement of users and groups spreading quickly across the platform, a 2021 analysis of company failures around Stop the Steal found.

“It wasn’t until later that it became clear just how much of a focus point the catchphrase would be, and that they would serve as a rallying point around which a movement of violent election delegitimization could coalesce,” the analysis concluded. Stop the Steal and other election misinformation also spread on social media sites besides Facebook.

The company’s look back at Jan. 6 emphasized that few people, including crisis managers at Facebook, had expected the day’s events to explode in violence. (In fact, the level of violence surprised nearly everyone including government officials and law enforcement.) But it also listed a number of failures of policy and technology, as well as ways that Stop the Steal organizers gamed Facebook’s enforcement mechanisms.


On Nov. 5, the company shut down the very first Stop the Steal Facebook group after multiple violations for posts inciting violence, and began tracking its replacements. The pages that emerged in its wake had some of the most rapid growth of any Facebook groups in history. Organizers appeared to be consciously avoiding known enforcement triggers, internal documents say, by choosing their language carefully and, in at least one case, deploying the Facebook Stories feature, whose posts disappear after 24 hours.

Removing Guardrails


The disclosures also highlight other decisions made by Facebook prior to Jan. 6 that, in the view of some employees, fueled the rancor that spilled over in the insurrection.

Facebook had set up safeguards that were aimed at combating misinformation and other forms of platform abuse in the run-up to the 2020 election, but it dismantled many of them by mid-December, the documents indicate. Some measures, like a war room stood up for the November election, remained in place until after the inauguration.


And in early December, Facebook disbanded a 300-person squad known as Civic Integrity, which had the job of monitoring misuse of the platform around elections. Those experts were dispersed elsewhere even as efforts to delegitimize the election intensified.

Meanwhile, Stop the Steal groups were “amplifying and normalizing misinformation and violent hate in a way that delegitimized a free and fair election,” Facebook’s internal analysis concluded. “Early focus on individual violations made us miss the harm in the broader network.”


In this March 29, 2018, file photo, the logo for Facebook appears on screens at the Nasdaq MarketSite in New York's Times Square.


Too Little, Too Late

Haugen’s cache of documents also suggests that Facebook failed to apply the most powerful levers it uses to slow the spread of harmful content — what it calls “break-the-glass” protocols — until after the violence began.

Rioters breached the final police barricades and began entering the Capitol just after 2 p.m. Eastern time on Jan. 6. At emergency meetings later that day, Facebook managers approved several additional measures, including restricting certain videos from rapidly going viral and “demoting” posts that incite violence, a tweak to the algorithm that keeps them from spreading quickly between users, according to one document, called “Capitol Protest BTG Response.” 


By that evening, Facebook announced that it would ban President Trump from the platform for 24 hours, an unprecedented step for the company, and one which it extended the following day to Jan. 20.


Several employees said that the response was too little, too late. “I do acknowledge that a 24-hour ban is a pretty big deal, but that’s only because up until now, our response has been completely tepid,” one person wrote.

A few hours later, another employee pointed the group to a fateful policy exception the company made several years earlier.

“Never forget the day Trump rode down the escalator in 2015, called for a ban on Muslims entering the U.S. We determined that it violated our policies, and yet we explicitly overrode the policy and didn’t take the video down,” the employee said on the chat. “There is a straight line that can be drawn from that day to today, one of the darkest days in the history of democracy and self-governance.

“Would it have made a difference in the end?,” the person asked. “We can never know, but history will not judge us kindly.”


Facebook Messenger Is Launching a Split Payments Feature for Users to Quickly Share Expenses


Meta has announced a new Split Payments feature in Facebook Messenger. As the name suggests, it will let you calculate and split expenses with others right from Messenger, offering an easier way to share the cost of bills and expenses — for example, splitting a dinner bill with friends. Using Split Payments, Facebook Messenger users will be able to split bills evenly or modify the contribution for each individual, including their own.

The company announced the Split Payments feature in a blog post. 9to5Mac reports that the bill-splitting feature is still in beta and will be exclusive to US users at first, with the rollout beginning early next week. As mentioned, it will help users share the cost of bills, expenses, and payments. The feature is especially useful for those who share an apartment and need to split the monthly rent and other expenses with their flatmates. It could also come in handy at a group dinner with many people.

With Split Payments, users can add the number of people the expense needs to be divided with and, by default, the amount entered will be divided in equal parts. A user can also modify each person’s contribution including their own. To use Split Payments, click the Get Started button in a group chat or the Payments Hub in Messenger. Users can modify the contribution in the Split Payments option and send a notification to all the users who need to make payments. After entering a personalised message and confirming your Facebook Pay details, the request will be sent and viewable in the group chat thread.


Once someone has made the payment, you can mark their transaction as ‘completed’. The Split Payment feature will automatically take into account your share as well and calculate the amount owed accordingly.
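The arithmetic described above is straightforward to sketch. The following Python snippet is purely illustrative — the function name and behavior are assumptions, not Messenger’s actual implementation: the total is divided evenly by default, and fixed per-person overrides reduce the remainder shared equally by everyone else.

```python
# Hypothetical sketch of even bill-splitting with per-person overrides.
# Not Messenger's actual code; names and rounding choices are assumptions.
from decimal import Decimal, ROUND_HALF_UP

def split_bill(total, people, overrides=None):
    """Return a {person: amount} mapping that splits `total` among `people`.

    `overrides` maps a person to a fixed contribution; everyone else
    shares the remainder equally, rounded to the nearest cent.
    """
    overrides = overrides or {}
    total = Decimal(str(total))
    fixed = sum(Decimal(str(v)) for v in overrides.values())
    rest = [p for p in people if p not in overrides]
    shares = {p: Decimal(str(overrides[p])) for p in overrides}
    if rest:
        each = ((total - fixed) / len(rest)).quantize(
            Decimal("0.01"), rounding=ROUND_HALF_UP)
        for p in rest:
            shares[p] = each
    return shares
```

For a 100-unit dinner bill where one friend agreed to pay 40, `split_bill(100, ["a", "b", "c"], {"a": 40})` leaves the other two owing 30.00 each — the "modify the contribution for each individual" behavior the article describes.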



Tasneem Akolawala is a Senior Reporter for Gadgets 360. Her reporting expertise encompasses smartphones, wearables, apps, social media, and the overall tech industry. She reports out of Mumbai, and also writes about the ups and downs in the Indian telecom sector. Tasneem can be reached on Twitter at @MuteRiot, and leads, tips, and releases can be sent to tasneema@ndtv.com.


Facebook Owner Meta Launches New Platform, Safety Hub to Protect Women in India


Meta (formerly Facebook) on Thursday announced a slew of steps to protect women on its platforms, including the launch of StopNCII.org in India, which aims to combat the spread of non-consensual intimate images (NCII).

Meta has also launched the Women’s Safety Hub, available in Hindi and 11 other Indian languages, which will enable more women in India to access information about tools and resources that can help them make the most of their social media experience while staying safe online.

This initiative by Meta will ensure women do not face a language barrier in accessing information, Karuna Nain, director of global safety policy at Meta Platforms, told reporters.

“Safety is an integral part of Meta’s commitment to building and offering a safe online experience across the platforms and over the years the company has introduced several industry leading initiatives to protect users online.

“Furthering our effort to bolster the safety of users, we are bringing in a number of initiatives to ensure online safety of women on our platforms,” she added.



“It gives victims control. People can come to this platform proactively, hash their intimate videos and images, share their hashes back with the platform and participating companies,” Nain said.

She explained that the platform doesn’t receive any photos or videos; instead, it gets the hash — a unique digital fingerprint that tells the company a piece of content is known to be violating. “We can proactively keep a lookout for that content on our platforms and once it’s uploaded, our review team check what’s really going on and take appropriate action if it violates our policies,” she added.
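The hash-matching flow Nain describes can be sketched in a few lines. This is an illustrative Python sketch, not StopNCII’s actual code: the real system uses perceptual hashing, so visually similar images still match, whereas the cryptographic hash used as a stand-in below only matches identical bytes.

```python
# Conceptual sketch of hash-based matching: the platform never receives
# the image itself, only a digest computed on the victim's device.
# SHA-256 is a simplifying stand-in for the perceptual hash a real
# system would use.
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a digest locally; only this string is ever shared."""
    return hashlib.sha256(image_bytes).hexdigest()

# Participating platforms keep a set of hashes reported by victims.
reported_hashes: set[str] = set()

def report(image_bytes: bytes) -> None:
    """Victim-side step: hash the image and share only the hash."""
    reported_hashes.add(fingerprint(image_bytes))

def is_known_violation(upload_bytes: bytes) -> bool:
    """Platform-side step: at upload time, check the new content's
    hash against the reported set and flag matches for review."""
    return fingerprint(upload_bytes) in reported_hashes
```

The key design property — "it gives victims control" — is that `report` runs on the victim’s device, so the intimate image never leaves it; only the irreversible fingerprint travels to the participating companies.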


In partnership with UK Revenge Porn Helpline, StopNCII.org builds on Meta’s NCII Pilot, an emergency programme that allows potential victims to proactively hash their intimate images so they can’t be proliferated on its platforms.

The first-of-its-kind platform has partnered with global organisations to support the victims of NCII. In India, the platform has partnered with organisations such as Social Media Matters, Centre for Social Research, and Red Dot Foundation.


Nain added that the company hopes this becomes an industry-wide initiative, so that victims can come to one central place for help and support rather than approaching each tech platform one by one.

Also, Bishakha Datta (executive editor of Point of View) and Jyoti Vadehra from Centre for Social Research are the first Indian members of Meta’s Global Women’s Safety Expert Advisors. The group comprises 12 other non-profit leaders, activists, and academic experts from different parts of the world, and consults Meta in the development of new policies, products and programmes to better support women on its apps.

“We are confident that with our ever-growing safety measures, women will be able to enjoy a social experience which will enable them to learn, engage and grow without any challenges.

“India is an important market for us and bringing Bishakha and Jyoti onboard to our Women’s Safety Expert Advisory Group will go a long way in further enhancing our efforts to make our platforms safer for women in India,” Nain said.




Facebook Adds New Trend Insights in Creator Studio, Which Could Help Shape Your Posting Strategy


Facebook’s looking to provide more content insight within Creator Studio with the rollout of a new ‘Inspiration Hub’ element, which highlights trending content and hashtags within categories related to your business Page.

Facebook Inspiration Hub

As you can see in these screenshots, posted by social media expert Matt Navarra, when it becomes available to you, you’ll be able to access the new Inspiration Hub from the Home tab in Creator Studio.

On the right side of the screen, you can see the first of the new insights, with trending hashtags and videos from the last 24 hours, posted by Pages similar to yours, displayed above a ‘See more’ prompt.

When you tap through to the new hub, you’ll have a range of additional filters to check out trending content from across Facebook, including Page category, content type, region, and more.


That could be hugely valuable in learning what Facebook users are responding to, and what people within your target market are engaging with in the app.


The Hub also includes insights into trending hashtags, within your chosen timeframe, which may further assist in tapping into trending discussions.


How valuable hashtags are on Facebook is still up for debate, but note that you can also filter the displayed results by platform to show Instagram hashtag trends, which could be very valuable in maximizing your reach.

Much of this type of info has been available within CrowdTangle, Facebook’s analytics platform for journalists, for some time, but not everyone can access CrowdTangle data, which could make this an even more valuable proposition for many marketers.


Of course, overall performance really relates to your own creative, and thinking through the action that you want your audience to take when reading your posts. But in terms of detecting new content trends, including hashtag usage, caption length, videos versus image posts, and more, there’s a lot that could be gleaned from these tools and filters.

It’s a significant analytics addition – we’ve asked Facebook for more info on the rollout of the new option, and whether it’s already beyond test mode, etc. We’ll update this post if/when we hear back.
