Facebook whistleblower updates: Frances Haugen testifies at Parliament – USA Today

Misinformation and extremism spreading unchecked. Hate speech sparking conflict and violence in the U.S. and abroad. Human traffickers sharing a platform with baby pictures and engagement announcements. 

Despite Facebook’s mission to bring people closer together, internal documents obtained by USA TODAY show the company knew users were being driven apart by a wide range of dangerous and divisive content on its platforms. 

The documents were part of the disclosures made to the Securities and Exchange Commission by Facebook whistleblower Frances Haugen. A consortium of news organizations, including USA TODAY, reviewed the redacted versions received by Congress. 

The documents provide a rare glimpse into the internal decisions made at Facebook that affect nearly 3 billion users around the globe.

Concerned that Facebook was prioritizing profits over the well-being of its users, Haugen reviewed thousands of documents over several weeks before leaving the company in May. On Monday, she testified before a committee at the British Parliament. 

The documents, some of which have been the subject of extensive reporting by The Wall Street Journal and The New York Times, detail company research showing that toxic and divisive content is prevalent in posts boosted by Facebook and shared widely by users.

Concerns about how Facebook operates and its impact on teens have united congressional leaders. 

The company has responded to the leaked documents and recent media attention, calling their premise “false.”

“At the heart of these stories is a premise which is false. Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie. The truth is we’ve invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook,” the company said.

Frances Haugen testifies before British Parliament

During her testimony, Haugen said she is concerned about several aspects of Facebook’s design, including engagement-based ranking of posts, a lack of safety support for languages other than English, and the “false choices” Facebook presents by framing decisions as a battle between transparency and privacy.

“Now is the most critical time to act,” said Haugen, comparing Facebook’s situation to an oil spill. “Right now the failures of Facebook are making it harder for us to regulate Facebook.”

Haugen also discussed the influence of “Groups” in the spread of misinformation and polarizing content.

“Unquestionably, it’s making hate worse,” she said. 

Haugen suggested solutions that would help curb the spread of misinformation and shift away from engagement-based ranking, such as returning users’ news feeds to a chronological order. 

However, Facebook has pushed back against changes that could impact its bottom line, she said.

“They don’t want to lose that growth,” said Haugen. “They don’t want 1% shorter sessions because that’s 1% less revenue.”

Haugen also addressed Facebook’s Oversight Board, the body that makes decisions on content moderation for the platform. Haugen implored the board to seek more transparency in its relationship with Facebook.

Haugen said if Facebook can actively mislead its board, “I don’t know what the purpose of the Oversight Board is.” 

Haugen ‘deeply worried’ about making Instagram safe for kids

Haugen, who spoke for more than an hour, said she is “deeply worried” about Facebook’s ability to make its social app Instagram safe for kids.

Facebook had planned to release a version of the app for kids under 13, but postponed launch in September to work more with parents and lawmakers to address their concerns.

During her testimony, Haugen said that, unlike other platforms, Instagram is built for “social comparison,” which can be worse for kids.

Haugen disputed Facebook’s claim that it needs to launch a kids’ version of Instagram because many users under 13 lie about their age. She suggested that Facebook publish how it detects users under 13.

When asked why Facebook hasn’t done anything to make Instagram safer for kids, she said the company knows “young users are the future of the platform and the earlier they get them the more likely they’ll get them hooked.”

Haugen: ‘Mandatory regulation’ needed

Haugen said Facebook needs more incentives for its employees to raise issues about the flaws of its platform. She told British lawmakers there are countless employees with ideas for making Facebook safer, but those ideas aren’t amplified internally because they would slow the company’s growth.

“This is a company that lionizes growth,” she said.

Haugen called for “mandatory regulation” to help guide Facebook toward a safer platform.

Facebook’s response

Haugen’s comments before lawmakers in the U.S. and Britain as well as numerous media investigations have created the most intense scrutiny that Facebook has encountered since it launched in 2004.

CEO Mark Zuckerberg has repeatedly defended the company and its practices, sharing in an internal staff memo that “it’s very important to me that everything we build is safe and good for kids.”

Nick Clegg, Facebook’s vice president of global affairs, echoed a similar sentiment in an extensive memo to staff on Saturday that was obtained by USA TODAY.  

Clegg told staff that they “shouldn’t be surprised to find ourselves under this sort of intense scrutiny.”

“I think most reasonable people would acknowledge social media is being held responsible for many issues that run much deeper in society – from climate change to polarization, from adolescent mental health to organized crime,” Clegg said. “That is why we need lawmakers to help. It shouldn’t be left to private companies alone to decide where the line is drawn on societal issues.”

On Sunday, Sen. Richard Blumenthal, D-Conn., chair of the Consumer Protection Subcommittee that held Haugen’s testimony, told CNN that Facebook “ought to come clean and reveal everything.”

The spread of misinformation 

The documents reveal the internal discussions and scientific experimentation surrounding misinformation and harmful content being spread on Facebook.

A change to the algorithm that prioritizes what users see in their News Feed rolled out in 2018 and was supposed to encourage “meaningful social interactions” and strengthen bonds with friends and family.

Facebook researchers discovered the algorithm change was exacerbating the spread of misinformation and harmful content, and they actively experimented with ways to demote and contain that content, documents show. 

News Feeds with violence and nudity 

Facebook’s research found that users with low digital literacy skills were significantly more likely to see graphic violence and borderline nudity in their News Feed.

The people most harmed by the influx of disturbing posts were Black, elderly and low-income, among other vulnerable groups, the research found. Facebook also conducted numerous in-depth interviews and in-home visits with 18 of these users over several months. The researchers found that exposure to disturbing content in their feeds made them less likely to use Facebook and exacerbated the trauma and hardships they were already experiencing.

Researchers found:

  • A 44-year-old in a precarious financial situation followed Facebook pages that posted coupons and savings and was bombarded with unknown users’ posts of financial scams.
  • A person who’d used a Facebook group for Narcotics Anonymous and totaled his car was shown alcoholic beverage ads and posts about cars for sale.
  • Black people were consistently shown images of physical violence and police brutality.

By contrast, borderline hate posts appeared much more frequently in the feeds of high-digital-literacy users. Whereas low-digital-literacy users were unable to avoid nudity and graphic violence in their feeds, the research suggested people with better digital skills used those skills to seek out hate-filled content more effectively.

Curbing harmful content 

The documents show the company’s researchers tested various ways to reduce the amount of misinformation and harmful content served to Facebook users.

Tests included straightforward engineering fixes that would demote viral content that was negative, sensational, or meant to provoke outrage.  

In April 2019, company officials debated dampening the virality of misinformation by demoting “deep reshares” of content where the poster is not a friend or follower of the original poster.

Facebook found that users encountering posts more than two reshares away from the original post are four times as likely to see misinformation.

By demoting that content, Facebook would be “easily scalable and could catch loads of misinfo,” wrote one employee. “While we don’t think it is a substitute for other approaches to tackle misinfo, it is comparatively simple to scale across languages and countries.” 

Other documents show Facebook deployed this change in several countries – including India, Ethiopia and Myanmar – in 2019, but it’s not clear whether Facebook stuck with this approach in these instances.

How to moderate at-risk countries

Facebook knows of potential harms from content on its platform in at-risk countries but does not have effective moderation – either from its own artificial intelligence screening or from employees who review reports of potentially violating content, the documents show.

Another document, based on data from 2020, offered proposals to change the moderation of content in Arabic to “improve our ability to get ahead of dangerous events, PR fires, and integrity issues in high-priority At-Risk Countries, rather than playing catch up.”

A Facebook employee made several proposals, the records show, including hiring individuals from less-represented countries. Because dialects can vary by country or even region, the employee wrote, reviewers might not be equipped to handle reports from other dialects. While Moroccan and Syrian dialects were well represented among Facebook’s reviewers, Libyan, Saudi Arabian and Yemeni were not.

“With the size of the Arabic user base and potential severity of offline harm in almost every Arabic country – as every Arabic nation save Western Sahara is on the At-Risk Countries list and deals with such severe issues as terrorism and sex trafficking – it is surely of the highest importance to put more resources to the task of improving Arabic systems,” the employee wrote.

One document from late 2020 sampled more than 1,000 hate speech reports to Facebook in Afghanistan, finding deficiencies in everything from the accuracy of translation in local languages in its community standards to its reporting process. (Afghanistan was not listed among Facebook’s three tiers of at-risk countries in a document Haugen collected before her departure in May, which was before the United States’ withdrawal.)

The report found that in one 30-day set of data, 98% of hate speech was removed reactively in response to reports, while just 2% was removed proactively by Facebook.

The document recommended Facebook allow employees in its Afghanistan market to review its classifiers to refine them and add new ones.

“This is particularly important given the significantly lower detection of Hate Speech contents by automation,” it said.

Platform enables human trafficking

Facebook found that its platform “enables all three stages of the human exploitation lifecycle” – recruitment, facilitation and exploitation – via complex real-world networks, according to internal documents.

Though Facebook’s public-facing community standards claim the company removes content that facilitates human exploitation, internal documents show it has failed to do so.

Facebook has investigated the issue for years, proposing policy and technical changes to help combat exploitation on its platforms, records show. But it’s unclear whether those changes were adopted. In at least one case, Facebook deactivated a tool that was proactively detecting exploitation, according to internal documents.

In October 2019, prompted by a BBC investigation, Apple threatened to remove Facebook and Instagram from its app store after finding content promoting domestic servitude, a crime in which a domestic worker is trapped in his or her employment, abused and either underpaid or not paid at all. An internal document shows Facebook had been aware of the issue prior to Apple’s warning.

In response to Apple’s threat, Facebook conducted a review and identified more than 300,000 pieces of potentially violating content on Facebook and Instagram, records show. It took action on 133,566 items and blocked violating hashtags.

Contributing: Mike Snider

Resources for Completing App Store Data Practice Questionnaires for Apps That Include the Facebook or Audience Network SDK

Updated July 18: Developers and advertising partners may be required to share information on their app’s privacy practices in third party app stores, such as Google Play and the Apple App Store, including the functionality of SDKs provided by Meta. To help make it easier for you to complete these requirements, we have consolidated information that explains our data collection practices for the Facebook and Audience Network SDKs.

Facebook SDK

To provide functionality within the Facebook SDK, we may receive and process certain contact, location, identifier, and device information associated with Facebook users and their use of your application. The information we receive depends on which SDK features third-party applications use, and we have structured the document below according to these features.

App Ads, Facebook Analytics, & App Events

Facebook App Events allow you to measure the performance of your app using Facebook Analytics, measure conversions associated with Facebook ads, and build audiences to acquire new users as well as re-engage existing users. There are a number of different ways your app can use app events to keep track of when people take specific actions such as installing your app or completing a purchase.

With the Facebook SDK, some app events are automatically logged (app installs, app launches, and in-app purchases) and collected for Facebook Analytics unless you disable automatic event logging. Developers determine which events to send to Facebook from a list of standard events, or via a custom event.

When developers send Facebook custom events, these events could include data types outside of standard events. Developers control sending these events to Facebook either directly via application code or in Events Manager for codeless app events. Developers can review their code and Events Manager to determine which data types they are sending to Facebook. It’s the developer’s responsibility to ensure this is reflected in their application’s privacy policy.

Advanced Matching

Developers may also send us additional user contact information in code, or via the Events Manager. Advanced matching functionality may use the following data, if sent:

  • email address, name, phone number, physical address (city, state or province, zip or postal code and country), gender, and date of birth.

Facebook Login

There are two scenarios for applications that use Facebook Login via the Facebook SDK: Authenticated Sign Up or Sign In, and User Data Access via Permissions. For authentication, a unique, app-specific identifier tied to a user’s Facebook Account enables the user to sign in to your app. For Data Access, a user must explicitly grant your app permission to access data.

Note: Since Facebook Login is part of the Facebook SDK, we may collect other information referenced here when you use Facebook Login, depending on your settings.

Device Information

We may also receive and process the following information if your app is integrated with the Facebook SDK:

  • Device identifiers;
  • Device attributes, such as device model and screen dimensions, CPU core, storage size, SDK version, OS and app versions, and app package name; and
  • Networking information, such as the name of the mobile operator or ISP, language, time zone, and IP address.

Audience Network SDK

We may receive and process the following information when you use the Audience Network SDK to integrate Audience Network ads in your app:

  • Device identifiers;
  • Device attributes, such as device model and screen dimensions, operating system, mediation platform and SDK versions; and
  • Ad performance information, such as impressions, clicks, placement, and viewability.

First seen at developers.facebook.com

Enabling Faster Python Authoring With Wasabi

This article was written by Omer Dunay, Kun Jiang, Nachi Nagappan, Matt Bridges and Karim Nakad.


Motivation

At Meta, Python is one of the most used programming languages in terms of both lines of code and number of users. Every day, thousands of developers work with Python to launch new features, fix bugs and develop sophisticated machine learning models. As such, it is important to keep our Python developers productive and efficient by giving them state-of-the-art tools.

Introducing Wasabi

Today we introduce Wasabi, a Python language service that implements the Language Server Protocol (LSP) and is designed to help our developers write Python more easily and quickly. Wasabi assists our developers with a series of advanced features, including:

  • Lints and diagnostics: These are available as the user types.
  • Auto import quick fix: This is available for undefined-variable lint.
  • Global symbols autocomplete: When a user types a prefix, all symbols (e.g. function names, class names) that are defined in other files and start with that prefix will appear in the autocomplete suggestion automatically.
  • Organize Imports + Remove unused: A quick fix that removes all unused imports and reformats the import section according to pep8 rules. This feature is powered by other tools that are built inside Meta such as libCST that helps with safe code refactoring.
  • Python snippets: Snippet suggestions are available as the user types for common code patterns.
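To give a flavor of the global symbols autocomplete feature, here is a minimal Python sketch of answering a prefix query over a symbol index. The class and symbol names are hypothetical illustrations, not Wasabi's actual implementation:

```python
import bisect

class SymbolIndex:
    """Toy prefix index over globally defined symbols (illustrative only)."""

    def __init__(self, symbols):
        # Keep the symbols sorted so a prefix query becomes a binary search.
        self.symbols = sorted(symbols)

    def complete(self, prefix):
        # Find the first symbol >= prefix, then collect while the prefix matches.
        start = bisect.bisect_left(self.symbols, prefix)
        matches = []
        for name in self.symbols[start:]:
            if not name.startswith(prefix):
                break
            matches.append(name)
        return matches

# Symbols that would be defined across many files in the repository.
index = SymbolIndex(["parse_config", "parse_args", "ParserError", "print_report"])
print(index.complete("parse_"))  # ['parse_args', 'parse_config']
```

A real service would, of course, back this with a persistent index rather than an in-memory list, but the query shape is the same.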

Additionally, Wasabi is a surface-agnostic service that can be deployed into multiple code repositories and various development environments (e.g., VSCode, Bento Notebook). Since its debut, Wasabi has been adopted by tens of thousands of Python users at Meta across Facebook, Instagram, Infrastructure teams and many more.

Figure 1: Example for global symbols autocomplete, one of Wasabi’s features

Language Services at Meta Scale

A major design requirement for language services is low latency and user responsiveness. Autocomplete suggestions, lints and quick fixes should appear immediately as the developer types.

At Meta, code is organized in a monorepo, meaning that developers have access to all Python files as they develop. This approach has major advantages for the developer workflow, including better discoverability, transparency, easier sharing of libraries and increased collaboration between teams. It also introduces unique challenges for building developer tools such as language services, which need to handle hundreds of thousands of files.

The scaling problem is one of the reasons we avoided off-the-shelf language services available in the industry (e.g., pyright, jedi) for these operations. Most of those tools were designed with relatively small-to-medium workspaces in mind, perhaps assuming at most thousands of files in large projects, for operations that require O(repo) information.

For example, consider the “auto import” quick fix for undefined variables. In order to suggest all available symbols, the language server needs to read all source files, parse them, and keep an in-memory cache of all parsed symbols so it can respond to requests.

While this may scale to be performed in a single process on the development machine for small-medium repositories, this approach doesn’t scale in the monorepo use case. Reading and parsing hundreds of thousands of files can take many minutes, which means slow startup times and frustrated developers. Moving to an in-memory cache might help latency, but also may not fit in a single machine’s memory.

For example, assume an average Python file takes roughly 10ms to parse and extract symbols with a standard error-recoverable parser. On 1,000 files, initialization takes about 10 seconds, a reasonable startup time. On 1 million files, it would take roughly 166 minutes, which is obviously far too long a startup time.
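The per-file work described above — parsing a source file and extracting its defined symbols — can be sketched with Python's standard `ast` module. This is a simplified illustration of what an indexer might record per file, not Wasabi's actual code:

```python
import ast

def extract_symbols(source):
    """Collect top-level function, class, and variable names from Python
    source, roughly what a symbol indexer would record per file."""
    tree = ast.parse(source)
    symbols = []
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef, ast.ClassDef)):
            symbols.append(node.name)
        elif isinstance(node, ast.Assign):
            for target in node.targets:
                if isinstance(target, ast.Name):
                    symbols.append(target.id)
    return symbols

src = "def foo():\n    pass\n\nclass Bar:\n    pass\n\nVERSION = '1.0'\n"
print(extract_symbols(src))  # ['foo', 'Bar', 'VERSION']
```

Multiplying this per-file cost by the file count is what makes the naive single-process approach untenable on a monorepo.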

How Wasabi Works

Offline + Online Processing:

In order to support low latency in Meta-scale repositories, Wasabi is powered by two phases of parsing: background processing (offline) done by an external indexer, and local processing of locally changed “dirty” files (online):

  1. A background process indexes all committed source files and maintains the parsed symbols in a special database (glean) that is designed for storing code symbol information.
  2. Wasabi, a local process running on the user’s machine, calculates the delta between the base revision, the stack of diffs, and any uncommitted changes the user currently has, and extracts symbols only from those “dirty” files. Since this set of “dirty” files is relatively small, the operation is very fast.
  3. Upon an LSP request such as auto import, Wasabi parses the abstract syntax tree (AST) of the file, then, based on the cursor’s context, queries both glean and the locally changed symbols, merges the results and returns them to the user.
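The offline/online split in the steps above can be sketched as merging two symbol sources: the pre-built index for committed code, and an overlay of symbols reparsed from locally modified files. The function and variable names here are hypothetical, not Wasabi's API:

```python
def query_symbols(prefix, index_lookup, dirty_file_symbols):
    """Merge symbols from the offline index with symbols from locally
    changed files; local symbols take precedence (illustrative sketch)."""
    # Offline phase: committed symbols served from the pre-built index.
    results = {s: "indexed" for s in index_lookup(prefix)}
    # Online phase: overlay symbols parsed on the fly from "dirty" files.
    for sym in dirty_file_symbols:
        if sym.startswith(prefix):
            results[sym] = "local"
    return results

# A stand-in for a query against the background index (e.g., glean).
fake_index = lambda prefix: [s for s in ["load_user", "load_feed"] if s.startswith(prefix)]
print(query_symbols("load_", fake_index, ["load_story", "save_draft"]))
```

Because only the small "dirty" set is parsed locally, the online phase stays fast regardless of repository size.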

As a result, all Wasabi features are low latency and available to the user seamlessly as they type.

Note: Wasabi currently doesn’t handle the potential delta between the revision that glean indexed (indexing happens once every few hours) and the local base revision that the user currently has. We plan to add that in the future.

Figure 2: Wasabi’s high level architecture

Ranking the Results

In some cases, due to the scale of the repository, there may be many valid suggestions in the result set. For example, consider “auto import” suggestions for the symbol “utils.” Many modules across the repository may define a class named “utils,” so we invest in ranking the results to ensure that users see the most relevant suggestions first.

For example, auto import ranking is done by taking into account:

  • Locality:
    • The distance of the suggested module directory path from the directory paths of modules that are already imported in this file.
    • The distance of the suggested module directory path from the current directory path of the local file.
    • Whether the file has been locally changed (“dirty” files are ranked higher).
  • Usage: The number of occurrences the import statement was used by other files in the repository.
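The ranking signals above — locality of the candidate's path, whether its file is dirty, and repository-wide usage — can be combined in a simple sort key. The weights and data shapes here are illustrative assumptions, not Meta's actual formula:

```python
def path_distance(a, b):
    """Number of directory hops between two paths (a simple locality proxy)."""
    pa, pb = a.split("/"), b.split("/")
    common = 0
    for x, y in zip(pa, pb):
        if x != y:
            break
        common += 1
    return (len(pa) - common) + (len(pb) - common)

def rank_suggestions(current_dir, candidates):
    """Rank auto-import candidates: locally changed files first, then
    nearer paths, then higher repository-wide usage (illustrative only)."""
    def score(c):
        # Tuples sort lexicographically: dirty flag, locality, then usage.
        return (not c["dirty"], path_distance(current_dir, c["path"]), -c["usage"])
    return sorted(candidates, key=score)

candidates = [
    {"module": "a.utils", "path": "a/utils", "usage": 5, "dirty": False},
    {"module": "b.c.utils", "path": "b/c/utils", "usage": 900, "dirty": False},
    {"module": "a.x.utils", "path": "a/x/utils", "usage": 2, "dirty": True},
]
for c in rank_suggestions("a/x", candidates):
    print(c["module"])  # a.x.utils, then a.utils, then b.c.utils
```

Even a crude locality measure like this pushes the "right" `utils` to the top when many modules share the name.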

To measure our success, we measured the position of each accepted suggestion in the suggestion list and found that in almost all cases the accepted suggestion was ranked among the top three.

Positive feedback from developers

After launching Wasabi in several pilot runs inside Meta, we have received much positive feedback from our developers. Here is one quote from a software engineer at Instagram:

“I’ve been using Wasabi for a couple months now, it’s been a boon to my productivity! Working in Instagram Server, especially on larger files, warnings from pyre are fairly slow. With Wasabi, they’re lightning fast 😃!”

“I use features like spelling errors and auto import several times an hour. This probably makes my development workflow 10% faster on average (rough guess, might be more, definitely not less), a pretty huge improvement!”

As noted above, Wasabi has made a meaningful difference in keeping our developers productive and their experience delightful.

The metric to measure authoring velocity

In order to quantitatively understand how much value Wasabi has delivered to our Python developers, we have considered a number of metrics to measure its impact. Ultimately, we landed on a metric that we call ‘Authoring Velocity’ to measure how fast developers write code. In essence, Authoring Velocity is the inverse function of the time taken on a specific diff (a collection of code changes) during the authoring stage. The authoring stage starts from the timestamp when a developer checks out from the source control repo to the timestamp when the diff is created. We have also normalized it against the number of lines of code changed in the diff, as a proxy for diff size, to offset any possible variance. The greater the value for ‘Authoring Velocity,’ the faster we think developers write their code.
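As described, Authoring Velocity is the inverse of authoring time, normalized by the number of lines changed — in other words, lines changed per unit of authoring time. A minimal sketch, paraphrasing the prose rather than reproducing Meta's exact formula:

```python
def authoring_velocity(lines_changed, checkout_ts, diff_created_ts):
    """Lines of code changed per hour of authoring time, where authoring
    time runs from source-control checkout to diff creation (timestamps
    in seconds). Illustrative paraphrase of the metric in the text."""
    authoring_hours = (diff_created_ts - checkout_ts) / 3600
    return lines_changed / authoring_hours

# A 120-line diff authored over 2 hours -> 60 lines/hour.
print(authoring_velocity(120, 0, 7200))  # 60.0
```

Normalizing by diff size is what lets velocities be compared across diffs of very different scope.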

Figure 3: Authoring Velocity Metric Formula

The result

With the metric defined, we ran an experiment to measure the difference Wasabi makes for our developers. Specifically, we selected ~700 developers who had never used Wasabi before and randomly put them into two independent groups at a 50:50 split. Developers in the test group were enabled with Wasabi when they wrote Python, while nothing changed for those in the control group. For both groups, we compared relative metric values before and after the Wasabi enablement. We found that for developers in the test group, the median authoring velocity increased by 20% after they started using Wasabi, while the control group showed no significant change, as expected.

Figure 4: Authoring Velocity measurements for control and test groups, before and after Wasabi was rolled out to the test group.

Summary

With Python’s unprecedented growth, it is an exciting time to be working on making the language better and more convenient to use. Together with its advanced features, Wasabi has meaningfully improved developers’ productivity at Meta, allowing them to write Python faster and more easily, with a positive developer experience. We hope our prototype and findings can benefit more people in the broader Python community.

To learn more about Meta Open Source, visit our open source site, subscribe to our YouTube channel, or follow us on Twitter, Facebook and LinkedIn.

First seen at developers.facebook.com
