FACEBOOK

Facebook harms children and is damaging democracy, claims whistleblower – The Guardian


Facebook puts “astronomical profits before people”, harms children and is destabilising democracies, a whistleblower has claimed in testimony to the US Congress.

Frances Haugen said Facebook knew it steered young users towards damaging content and that its Instagram app was “like cigarettes” for under-18s. In a wide-ranging testimony, the former Facebook employee said the company did not have enough staff to keep the platform safe and was “literally fanning” ethnic violence in developing countries.

She also told US senators:

  • The “buck stops” with the founder and chief executive, Mark Zuckerberg.

  • Facebook knows its systems lead teenagers to anorexia-related content.

  • The company had to “break the glass” and turn back on safety settings after the 6 January Washington riots.

  • Facebook intentionally targets teenagers and children under 13.

  • Monday’s outage that brought down Facebook, Instagram and WhatsApp meant that for more than five hours Facebook could not “destabilise democracies”.

Haugen appeared in Washington on Tuesday after coming forward as the source of a series of revelations in the Wall Street Journal last month based on internal Facebook documents. They revealed the company knew Instagram was damaging teenagers’ mental health and that changes to Facebook’s News Feed feature – a central plank of users’ interaction with the service – had made the platform more polarising and divisive.

She told senators on Tuesday that Facebook knew Instagram users were being led to anorexia-related content. She said an algorithm “led children from very innocuous topics like healthy recipes … all the way to anorexia-promoting content over a very short period of time”.

In her opening testimony, Haugen, 37, said: “I’m here today because I believe Facebook’s products harm children, stoke division and weaken our democracy. The company’s leadership knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people.” She added that Facebook was “buying its profits with our safety”. In 2020, Facebook reported a net income – a US measure of profit – of more than $29bn (£21bn).

Referring to Monday’s near six-hour outage in which Facebook’s platforms including Instagram and WhatsApp were disabled for billions of users, Haugen’s testimony added: “For more than five hours Facebook wasn’t used to deepen divides, destabilise democracies and make young girls and women feel bad about their bodies.” Facebook has 3.5 billion monthly active users across its platforms including Instagram and WhatsApp.


Warning that Facebook makes choices that “go against the common good”, Haugen said the company should be treated like the tobacco industry, which was subject to government action once it was discovered it was hiding the harms its products caused, or like car companies that were forced to adopt seatbelts or opioid firms that have been sued by government agencies.

Senator Richard Blumenthal speaks during the hearing. Photograph: Lenin Nolly/SOPA Images/REX/Shutterstock

Urging lawmakers to force more transparency on Facebook, she said there should be more scrutiny of its algorithms, which shape the content delivered to users. “The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood,” she said. With greater transparency, she added, “we can build sensible rules and standards to address consumer harms, illegal content, data protection, anticompetitive practices, algorithmic systems and more”.

The hearing focused on the impact of Facebook’s platforms on children, with Haugen likening the appeal of Instagram to tobacco. “It’s just like cigarettes … teenagers don’t have good self-regulation.” Haugen added that women would be walking around with brittle bones in 60 years’ time because of the anorexia-related content they found on Facebook platforms.

Haugen told lawmakers that Facebook intentionally targets teens and “definitely” targets children as young as eight for the Messenger Kids app.

Haugen said that, according to internal documents, Zuckerberg had been given “soft options” to make the Facebook platform less “twitchy” and viral in countries prone to violence but declined to take them because it might affect “meaningful social interactions”, or MSI.

She added: “We have a few choice documents that contain notes from briefings with Mark Zuckerberg where he chose metrics defined by Facebook like ‘meaningful social interactions’ over changes that would have significantly decreased misinformation, hate speech and other inciting content.”


Haugen said Zuckerberg had built a company that was “very metrics driven”, because the more time people spent on Facebook platforms the more appealing the business was to advertisers. Asked about Zuckerberg’s ultimate responsibility for decisions made at Facebook, she said: “The buck stops with him.”

Haugen also warned that Facebook was “literally fanning ethnic violence” in places such as Ethiopia because it was not policing its service adequately outside of the US.

Referring to the aftermath of the 6 January storming of the Capitol, as protesters sought to overturn the US presidential election result, Haugen said she was disturbed that Facebook had to “break the glass” and reinstate safety settings that it had put in place for the November poll. Haugen, who worked for the Facebook team that monitored election interference globally, said those precautions had been dropped after Joe Biden’s victory in order to spur growth on the platform.

Facebook’s headquarters in Menlo Park, California. Photograph: Josh Edelson/AFP/Getty Images

Among the reforms recommended by Haugen were ensuring that Facebook shares internal information and research with “appropriate” oversight bodies such as Congress and removing the influence of algorithms on Facebook’s News Feed by allowing it to be ranked chronologically.

“Frances Haugen’s testimony appears to mark a rare moment of bipartisan consensus that the status quo is no longer acceptable,” said Imran Ahmed, chief executive officer of the Center for Countering Digital Hate, a nonprofit that fights hate speech and misinformation. “This is increasingly becoming a non-political issue and one that has cut through definitively to the mainstream.”

Haugen’s testimony is likely to add pressure on US lawmakers to attempt legislative measures against the tech company. Senator Ed Markey suggested during the hearing that Congress would take action. “Here’s my message for Mark Zuckerberg: your time of invading our privacy, promoting toxic content and preying on children and teens is over,” Markey said. “Congress will be taking action. We will not allow your company to harm our children and our families and our democracy, any longer.”


“Today’s testimony from Frances Haugen is a catalyst for change,” said Senator Amy Klobuchar. “The time for action is now.”

Haugen’s lawyers have filed at least eight complaints with the US financial watchdog accusing the social media company of serially misleading investors about its approach to safety and the size of its audience.

Facebook has pushed back forcefully against Haugen’s accusations. Though her testimony was built on thousands of internal documents gathered from the company, a company spokesperson, Andy Stone, said in a tweet during the hearing: “Just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook.”

Responding after the hearing, Lena Pietsch, Facebook’s director of policy communications, said: “Today, a Senate commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives and testified more than six times to not working on the subject matter in question. We don’t agree with her characterization of the many issues she testified about.

“Despite all this, we agree on one thing: it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”

Earlier, Facebook had issued a series of statements downplaying Haugen’s document leaks, saying: its Instagram research showed that many teenagers found the app helpful; it was investing heavily in security at the expense of its bottom line; polarisation had been growing in the US for decades before Facebook appeared; and the company had “made fighting misinformation and providing authoritative information a priority”.

Responding to accusations that Facebook had misled the public and regulators, the company said: “We stand by our public statements and are ready to answer any questions regulators may have about our work.”

Facebook Adds New Trend Insights in Creator Studio, Which Could Help Shape Your Posting Strategy

Facebook’s looking to provide more content insight within Creator Studio with the rollout of a new ‘Inspiration Hub’ element, which highlights trending content and hashtags within categories related to your business Page.

Facebook Inspiration Hub

As you can see in these screenshots, posted by social media expert Matt Navarra, when it becomes available to you, you’ll be able to access the new Inspiration Hub from the Home tab in Creator Studio.

At the right side of the screen, you can see the first of the new insights, with trending hashtags and videos from the last 24 hours, posted by Pages similar to yours, displayed above a ‘See more’ prompt.

When you tap through to the new hub, you’ll have a range of additional filters to check out trending content from across Facebook, including Page category, content type, region, and more.

Facebook Inspiration Hub

That could be hugely valuable in learning what Facebook users are responding to, and what people within your target market are engaging with in the app.

The Hub also includes insights into trending hashtags, within your chosen timeframe, which may further assist in tapping into trending discussions.

Facebook Inspiration Hub

How valuable hashtags are on Facebook is still up for debate, but you’ll also note that you can filter the displayed results by platform, so you can view Instagram hashtag trends as well, which could help in maximizing your reach.

Much of this type of info has been available within CrowdTangle, Facebook’s analytics platform for journalists, for some time, but not everyone can access CrowdTangle data, which could make this an even more valuable proposition for many marketers.


Of course, overall performance really relates to your own creative, and thinking through the action that you want your audience to take when reading your posts. But in terms of detecting new content trends, including hashtag usage, caption length, videos versus image posts, and more, there’s a lot that could be gleaned from these tools and filters.

It’s a significant analytics addition – we’ve asked Facebook for more info on the rollout of the new option, and whether it’s already beyond test mode, etc. We’ll update this post if/when we hear back.

Meta Updates Policy on Cryptocurrency Ads, Opening the Door to More Crypto Promotions in its Apps

With cryptocurrencies gaining momentum, in line with the broader Web 3.0 push, Meta has today announced an update to its ad policies around cryptocurrencies, which will open the door to more crypto advertisers on its platforms.

As per Meta:

“Starting today, we’re updating our eligibility criteria for running ads about cryptocurrency on our platform by expanding the number of regulatory licenses we accept from three to 27. We are also making the list of eligible licenses publicly available on our policy page.”

Essentially, in order to run crypto ads in Meta’s apps, an advertiser needs to adhere to regional licensing provisions, which vary by nation. With crypto becoming more accepted, Meta’s now looking to enable more crypto companies to publish ads on its platform, which will provide expanded opportunity for recognized crypto providers to promote their products, while also enabling Meta to make more money from crypto ads.

“Previously, advertisers could submit an application and include information such as any licenses they obtained, whether they are traded on a public stock exchange, and other relevant public background on their business. However, over the years the cryptocurrency landscape has matured and stabilized and experienced an increase in government regulation, which has helped to set clearer responsibilities and expectations for the industry. Going forward, we will be moving away from using a variety of signals to confirm eligibility and instead requiring one of these 27 licenses.”

Is that a good move? Well, as Meta notes, the crypto marketplace is maturing, and there’s now much wider recognition of cryptocurrencies as a legitimate form of payment. But cryptocurrencies are also not backed by most local financial regulators, which reduces transaction protection and oversight, and brings a level of risk to any such transaction.


But then again, all crypto providers are required to clearly outline any such risks, and most also highlight the ongoing market volatility in the space. This expanded level of overall transparency means that most people who are investing in crypto have at least some awareness of these elements, which likely does diminish the risk factor in such promotions within Meta’s apps.

But as crypto adoption continues to expand, more of these risks will become apparent, and while much of the crypto community is built on good faith, and a sense of community around building something new, there are questions as to how much that can hold at scale, and what that will then mean for evolving scams and criminal activity, especially as more vulnerable investors are brought into the mix.

Broader promotional capacity through Meta’s apps will certainly help to boost exposure in this respect – though again, the relative risk factors are lessened by expanded regulatory oversight outside of the company.

You can read more about Meta’s expanded crypto ad regulations here.

Meta Outlines Evolving Safety Measures in Messaging as it Seeks to Allay Fears Around the Expansion of E2E Encryption

Amid rising concern about Meta’s move to roll out end-to-end encryption by default to all of its messaging apps, Meta’s Global Head of Safety Antigone Davis has today sought to provide a level of reassurance that Meta is indeed aware of the risks and dangers that such protection can pose, and that it is building safeguards into its processes to protect against potential misuse.

Though the measures outlined don’t exactly address all the issues raised by analysts and safety groups around the world.

As a quick recap, back in 2019, Facebook announced its plan to merge the messaging functionalities of Messenger, Instagram and WhatsApp, which would then provide users with a universal inbox, with all of their message threads from each app accessible on either platform.

The idea is that this will simplify cross-connection, while also opening the door to more opportunities for brands to connect with users in the messaging tool of their choice – but it also, inherently, means that the data protection method for its messaging tools must rise to the level of WhatsApp, its most secure messaging platform, which already includes E2E encryption as the default.

Various child safety experts raised the alarm, and several months after Facebook’s initial announcement, representatives from the UK, US and Australian Governments sent an open letter to Facebook CEO Mark Zuckerberg requesting that the company abandon its integration plan.

Meta has pushed ahead, despite specific concerns that the expansion of encryption will see its messaging tools used by child trafficking and exploitation groups, and now, as it closes in on the next stage, Meta’s working to counter such claims, with Davis outlining six key elements which she believes will ensure safety within this push.


Davis has explained the various measures that Meta has added on this front, including:

  • Detection tools to stop adults from repeatedly setting up new profiles in an attempt to connect with minors they don’t know
  • Safety notices in Messenger, which provide tips on spotting suspicious behavior
  • The capacity to filter messages with selected keywords on Instagram
  • More filtering options in chat requests to help avoid unwanted contact
  • Improved education prompts to help detect spammers and scammers in messages
  • New processes to make it easier to report potential harm, including an option to select “involves a child”, which will then prioritize the report for review and action

Meta messaging security options

These are all good, important steps in detection, and Davis also notes that Meta’s reporting process “decrypts portions of the conversation that were previously encrypted and unavailable to us so that we can take immediate action if violations are detected”.

That’ll no doubt raise an eyebrow or two among WhatsApp users – but the broader concern is that such protections will facilitate usage by criminal groups, and that reliance on self-reporting will do little to stop these networks operating, at scale, under a more protected messaging framework within Meta’s app ecosystem.

Governments have called for ‘backdoor access’ to break Meta’s encryption for investigations into such activity, which Meta says is not possible and will not be built into its future framework. The elements outlined by Davis do little to address this specific need, and without the capacity to better detect such activity, it’s hard to see any of the groups opposed to Meta’s expanded encryption changing their stance, and accepting that the merging of all of the platform’s DM options will not also see a rise in criminal activity organized via the same apps.


Of course, the counterargument could be that encryption is already available on WhatsApp, and that criminal activity of this type can already be undertaken within WhatsApp alone. But with a combined user count of 3.58 billion people per month across its family of apps, that’s a significantly broader interconnection of people than WhatsApp’s 2 billion active users, which, arguably, could open the door to far more potential harm and danger in this respect.

Really, there’s no right answer here. Privacy advocates will argue that encryption should be the standard, and that more people are actually more protected, on balance, by enhanced security measures. But there is also an undeniable risk in shielding even more criminal groups from detection.

Either way, right now, Meta seems determined to push ahead with the plan, which will weld all of its messaging tools together, and also make it more difficult to break up its network if any antitrust decisions don’t go Meta’s way and it’s pressed to sell off Instagram or WhatsApp as a result.

But expect more debate to be had, in more countries, as Meta continues to justify its decision, and regulatory and law enforcement groups seek more options to help maintain a level of accessibility for criminal investigations and detection.
