“Mark Changed The Rules”: How Facebook Went Easy On Alex Jones And Other Right-Wing Figures

In April 2019, Facebook was preparing to ban one of the internet’s most notorious spreaders of misinformation and hate, Infowars founder Alex Jones. Then CEO Mark Zuckerberg personally intervened.

Jones had gained infamy for claiming that the 2012 Sandy Hook elementary school massacre was a “giant hoax,” and that the teenage survivors of the 2018 Parkland shooting were “crisis actors.” But Facebook had found that he was also relentlessly spreading hate against various groups, including Muslims and trans people. That behavior qualified him for expulsion from the social network under the company’s policies for “dangerous individuals and organizations,” which required Facebook to also remove any content that expressed “praise or support” for them.

But Zuckerberg didn’t consider the Infowars founder to be a hate figure, according to a person familiar with the decision, so he overruled his own internal experts and opened a gaping loophole: Facebook would permanently ban Jones and his company — but would not touch posts of praise and support for them from other Facebook users. This meant that Jones’ legions of followers could continue to share his lies across the world’s largest social network.

“Mark personally didn’t like the punishment, so he changed the rules,” a former policy employee told BuzzFeed News, noting that the original rule had already been in use and represented the product of untold hours of work between multiple teams and experts.

“Mark personally didn’t like the punishment, so he changed the rules.”

“That was the first time I experienced having to create a new category of policy to fit what Zuckerberg wanted. It’s somewhat demoralizing when we have established a policy and it’s gone through rigorous cycles. Like, what the fuck is that for?” said a second former policy employee who, like the first, asked not to be named so they could speak about internal matters.

“Mark called for a more nuanced policy and enforcement strategy,” Facebook spokesperson Andy Stone said of the Alex Jones decision, which also affected the bans of other extremist figures.

Zuckerberg’s “more nuanced policy” set off a cascading effect, the two former employees said, which delayed the company’s efforts to remove right-wing militant organizations such as the Oath Keepers, which were involved in the Jan. 6 insurrection at the US Capitol. It is also a case study in Facebook’s willingness to change its rules to placate America’s right wing and avoid political backlash.

Internal documents obtained by BuzzFeed News and interviews with 14 current and former employees show how the company’s policy team — guided by Joel Kaplan, the vice president of global public policy, and Zuckerberg’s whims — has exerted outsize influence while obstructing content moderation decisions, stymieing product rollouts, and intervening on behalf of popular conservative figures who have violated Facebook’s rules.

In December, a former core data scientist wrote a memo titled, “Political Influences on Content Policy.” Seen by BuzzFeed News, the memo stated that Kaplan’s policy team “regularly protects powerful constituencies” and listed several examples, including: removing penalties for misinformation from right-wing pages, blunting attempts to improve content quality in News Feed, and briefly blocking a proposal to stop recommending political groups ahead of the US election.

Since the November vote, at least six Facebook employees have resigned with farewell posts that have called out leadership’s failures to heed its own experts on misinformation and hate speech. Four departing employees explicitly cited the policy organization as an impediment to their work and called for a reorganization so that the public policy team, which oversees lobbying and government relations, and the content policy team, which sets and enforces the platform’s rules, would not both report to Kaplan.

Facebook declined to make Kaplan or other executives available for an interview. Stone, the company spokesperson, dismissed concerns about the vice president’s influence.

“Recycling the same warmed over conspiracy theories about the influence of one person at Facebook doesn’t make them true,” he said. “The reality is big decisions at Facebook are made with input from people across different teams who have different perspectives and expertise in different areas. To suggest otherwise is absurd.”

An integrity researcher who worked on Facebook’s efforts to protect the democratic process and rein in radicalization said the company caused direct harm to users by rejecting product changes due to concerns of political backlash.

“At some point Zuckerberg has to be held responsible for his role in allowing his platform to be weaponized.”

“Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity,” they wrote in an internal note seen by BuzzFeed News. They quit in August.

Those most affected by Jones’ rhetoric have taken notice, too. Lenny Pozner, whose 6-year-old son Noah was the youngest victim of the Sandy Hook shooting, called the revelation that Zuckerberg weakened penalties facing the Infowars founder “disheartening, but not surprising.” He said the company had made a promise to do better in dealing with hate and hoaxes following a 2018 letter from HONR Network, his organization for survivors of mass casualty events. Yet Facebook continues to fail to remove harmful content.

“At some point,” Pozner told BuzzFeed News, “Zuckerberg has to be held responsible for his role in allowing his platform to be weaponized and for ensuring that the ludicrous and the dangerous are given equal importance as the factual.”

“Different Views On Different Things”


Samuel Corum / Getty Images

Mark Zuckerberg and Joel Kaplan chat after leaving a meeting with Sen. John Cornyn (R-TX) in his office on Capitol Hill on September 19, 2019 in Washington, DC.

Kaplan’s close relationship with Zuckerberg has led the CEO to weigh politics more heavily when making high-profile content policy enforcement decisions, current and former employees said. Kaplan’s efforts to court the Trump White House over the past four years — from his widely publicized support for Supreme Court nominee Brett Kavanaugh to his interventions on behalf of right-wing influencers in Facebook policy decisions — have also made him a target for civil rights groups and Democratic lawmakers.

In June 2020, three Democratic senators asked in a letter what role Kaplan played “in Facebook’s decision to shut down and de-prioritize internal efforts to contain extremist and hyperpolarizing activity.” Sen. Elizabeth Warren called him out for overseeing a lobbying effort that spends millions of dollars to influence politicians. With a new presidential administration in place and a spate of ongoing antitrust lawsuits, Zuckerberg must now grapple with the fact that his top political adviser may no longer be a Washington, DC asset but a potential liability.

“I think that everybody in DC hates Facebook. They have burned every bridge,” said Sarah Miller, executive director of the American Economic Liberties Project and a former member of Joe Biden’s presidential transition team. Democrats are incensed with the platform’s tolerance of hate speech and misinformation, while “pulling Trump off the platform” has brought new life to Republican gripes with the company, she said.

“Facebook has fires to put out all across the political spectrum,” Miller added.

“I think that everybody in DC hates Facebook. They have burned every bridge.”

When Kaplan joined Facebook to lead its DC operation in 2011, he had the connections and pedigree the company needed to court the American right. A former clerk for conservative Supreme Court Justice Antonin Scalia, he served as a White House deputy chief of staff under President George W. Bush after participating in the Brooks Brothers riot during the 2000 Florida presidential election dispute. During a Senate confirmation hearing in 2003 for a post with the Office of Management and Budget, Kaplan was questioned about his role in the event, which sought to stop the tallying of votes during the Florida recount.

Though he initially maintained a low public profile at Facebook, Kaplan — COO Sheryl Sandberg’s Harvard classmate and former boyfriend — was valued by Zuckerberg for his understanding of GOP policymakers and conservative Americans, who the CEO believed were underrepresented by a liberal-leaning leadership team and employee base.


BuzzFeed News; Getty Images

By 2014, he’d been promoted to vice president of global public policy. In that role, Kaplan oversaw the company’s government relations around the world as well as its content policy team. That arrangement raised eyebrows, as other companies, including Google and Twitter, typically keep public policy and lobbying efforts separate from teams that create and enforce content rules.

The candidacy and election of Donald Trump made Kaplan even more valuable to the company. He served as Zuckerberg’s policy consigliere, helping Facebook navigate the sea of lies and hate the former president conjured on the platform as well as the outraged public response to it. In December 2015, following a Facebook post from Trump calling for a “total and complete shutdown” of Muslims entering the US — the first of many that forced the company to grapple with the then-candidate’s racist and sometimes violent rhetoric — Kaplan and other executives advised Facebook’s CEO to do nothing.

“Don’t poke the bear,” Kaplan said, according to the New York Times, arguing that taking action against Trump’s account would invite a right-wing backlash and accusations that the site was limiting free speech. It’s an argument he’d repeat in various forms over the ensuing five years, with Zuckerberg often in agreement.

During that time, Kaplan rarely communicated openly on Facebook’s internal message boards or spoke at companywide meetings, according to current and former employees. When he did, however, his appearances were clouded in controversy.


After a Facebook team led by then–chief security officer Alex Stamos found evidence of Russian interference on the platform during and after the 2016 US presidential election, Kaplan was part of a leadership group that argued against disclosing the full extent of the Kremlin’s influence operation. When the company did end up publicly releasing further information about it in October 2017, it was Kaplan, not Stamos, who answered employee questions during an internal town hall.

“They could have sent me,” said Stamos, who subsequently left the company over disagreements related to Russian interference. “The person who was presenting [evidence of the Russian campaign] to VPs was me.”

It was Kaplan’s appearance at Kavanaugh’s September 2018 Senate confirmation hearings, however, that pushed him into the national spotlight. Sitting behind the nominee, he was visible in TV coverage of the event. Employees were furious; they believed Kaplan’s attendance made it look like Facebook supported the nominee, while dismissing the allegations of sexual assault against him.


Jim Bourg / Reuters

Joel Kaplan sits with family members, friends, and supporters of Supreme Court nominee Brett Kavanaugh during his testimony before a Senate Judiciary Committee confirmation hearing in Washington, DC, on September 27, 2018.

Kaplan subsequently addressed the incident at a companywide meeting via videoconference, where angry workers, who felt his on-camera appearance was intentional, hammered him with questions. But it was inside his own team that he faced some of the fiercest criticism. During a Facebook public policy team meeting that fall, a longtime manager tearfully accused him of lacking empathy for survivors of sexual assault.

“It doesn’t matter how well you know someone; it doesn’t mean they didn’t do what somebody said they did,” the policy manager said, according to two people in attendance. She subsequently wrote a public blog post about her experience of being sexually assaulted and left the company a year later.

None of this changed Kaplan’s standing with Zuckerberg. The CEO went to DC in September 2019 and was shepherded around by Kaplan on a trip that included a meeting with Trump. Kaplan remained friendly with the Trump White House, which at one point considered him to run the Office of Management and Budget.

“Many people feel that Joel Kaplan has too much power over our decisions.”

In May, when Zuckerberg decided to not touch Trump’s “when the looting starts, the shooting starts” incitement during the George Floyd protests, workers became incensed. At a subsequent companywide meeting, one of the most upvoted questions from employees directly called Kaplan out. “Many people feel that Joel Kaplan has too much power over our decisions,” the question read, asking that the vice president explain his role and values.

Zuckerberg seemed irked by the question and disputed the notion that any one person could influence the “rigorous” process by which the company made decisions. Diversity, the CEO argued, means taking into account all political views.

“That basically asked whether Joel can be in this role, or can be doing this role, on the basis of the fact that he is a Republican … and I have to say that I find that line of questioning to be very troubling,” Zuckerberg said, ignoring the question. “If we want to actually do a good job of serving people, [we have to take] into account that there are different views on different things.”

Facebook employees said Zuckerberg remains stalwart in his support for Kaplan, but internal pressure is building to reduce the public policy team’s influence. Colleagues “feel pressure to ensure their recommendations align with the interests of policymakers,” Samidh Chakrabarti, head of Facebook’s civic integrity team, wrote in an internal note in June, bemoaning the difficulty of balancing such interests while delivering on the team’s mandate: stopping abuse and election interference on the platform. The civic integrity team was disbanded shortly after the election, as reported by the Information.

“They attribute this to the organizational incentives of having the content policy and public policy teams share a common root,” Chakrabarti said. “As long as this is the case, we will be prematurely prioritizing regulatory interests over community protection.”

Stamos, who is now head of the Stanford Internet Observatory, said the policy team’s structure will always present a problem in its current form.

“You don’t want platform policy people reporting to someone who’s in charge of keeping people in government happy,” he said. “Joel comes from the Bush White House, and government relations does not have a neutral position on speech requests.”

“Fear Of Antagonizing Powerful Political Actors”


Josh Edelson / Getty Images

In August, a Facebook product manager who oversees the News Feed updated his colleagues on the company’s preparations for the 2020 US election.

Internal research had shown that people on Facebook were being polarized on the site in political discussion groups, which were also breeding grounds for misinformation and hate. To combat this, Facebook employees who were tasked with protecting election integrity proposed the platform stop recommending such groups in a module called “Groups You Should Join.”

But the public policy team was afraid of possible political blowback.

“Although the Product recommendation would have improved implementation of the civic filter, it would have created thrash in the political ecosystem during [the 2020 US election],” the product manager wrote on Facebook’s internal message board. “We have decided to not make any changes until the election is over.”

The social network eventually paused political group recommendations — just weeks before the November election — and removed them permanently only after the Capitol insurrection on Jan. 6. Current and former employees said Facebook’s decision to ignore its integrity team’s guidance and initially leave group recommendations untouched exemplifies how political calculations often quashed company initiatives that could have blunted misinformation and radicalization.

In that same update about group recommendations, the product manager also explained how leaders decided against making changes to a feature called In Feed Recommendations (IFR) due to potential political worries. Designed to insert posts into people’s feeds from accounts they don’t follow, IFR was intended to foster new connections or interests. For example, if a person followed the Facebook page for a football team like the Kansas City Chiefs, IFR might add a post from the NFL to their feed, even if that person didn’t follow the NFL.

One thing IFR was not supposed to do was recommend political content. But earlier that spring, Facebook users began complaining that they were seeing posts from conservative personalities including Ben Shapiro in their News Feeds even though they had never engaged with that type of content.

When the issue was flagged internally, Facebook’s content policy team warned that removing such suggestions for political content could reduce those pages’ engagement and traffic, and possibly inspire complaints from publishers. A News Feed product manager and a policy team member reiterated this argument in an August post to Facebook’s internal message board.

“A noticeable drop in distribution for these producers (via traffic insights for recommendations) is likely to result in high-profile escalations that could include accusations of shadow-banning and/or FB bias against certain political entities during the US 2020 election cycle,” they explained. Shadow-banning, or the limiting of a page’s circulation without informing its owners, is a common accusation leveled by right-wing personalities against social media platforms.

“In the US it appears that interventions have been almost exclusively on behalf of conservative publishers.”

Throughout 2020, the “fear of antagonizing powerful political actors,” as the former core data scientist put it in their memo, became a key public policy team rationalization for forgoing action on potentially violative content or rolling out product changes ahead of the US presidential election. They also said they had seen “a dozen proposals to measure the objective quality of content on News Feed diluted or killed because … they have a disproportionate impact across the US political spectrum, typically harming conservative content more.”

The data scientist, who spent more than five years at the company before leaving late last year, noted that while strides had been made since 2016, the state of political content on News Feed was “still generally agreed to be bad.” According to Facebook data, they added, 1 of every 100 views on content about US politics was for some type of hoax, while the majority of views for political materials were on partisan posts. Yet the company continued to give known spreaders of false and misleading information a pass if they were deemed “‘sensitive’ or likely to retaliate,” the data scientist said.

“In the US it appears that interventions have been almost exclusively on behalf of conservative publishers,” they wrote, attributing this to political pressure or a reluctance to upset sensitive publishers and high-profile users.

As BuzzFeed News reported last summer, members of Facebook’s policy team — including Kaplan — intervened on behalf of right-wing figures and publications such as Charlie Kirk, Breitbart, and Prager University, in some cases pushing for the removal of misinformation strikes against their pages or accounts. Strikes, which are applied at the recommendation of Facebook’s third-party fact-checkers, can result in a range of penalties, from a decrease in how far their posts are distributed to the removal of the page or account.

Kaplan’s other interventions are well documented. In 2018, the Wall Street Journal revealed that he helped kill a project to connect Americans who have political differences. The paper said Kaplan had objected “when briefed on internal Facebook research that found right-leaning users tended to be more polarized, or less exposed to different points of view, than those on the left.” Last year, the New York Times reported that policy executives declined to expand a feature called “correct the record” — which notified users when they interacted with content that was later labeled false by Facebook’s fact-checking partners — out of fear that it would “disproportionately show notifications to people who shared false news from right-wing websites.”

“It makes it hard to be visibly proud of where I work.”

Policy executives also reportedly helped override an initiative proposed by the company’s now-disbanded civic integrity unit to throttle the reach of misleading political posts, according to the Information.

Such interventions were hardly a surprise for those who have worked on efforts at the company to reduce harm and misinformation. In a December departure note previously reported by BuzzFeed News, an integrity researcher detailed how right-wing pages, including those for Breitbart and Fox News, had become hubs of discussion filled with death threats and hate speech — in clear violation of Facebook policy. They questioned why the company continued to work with such publications in official capacities.

“When the company has a very apparent interest in propping up actors who are fanning the flames of the very fire we are trying to put out, it makes it hard to be visibly proud of where I work,” the researcher wrote.

A Line From Alex Jones To The US Capitol


Kent Nishimura / Los Angeles Times via Getty Images

Alex Jones leaves after speaking at a Stop The Steal rally in front of the Supreme Court on January 5, 2021 in Washington, DC.

The strategic response team that had gathered evidence for the Alex Jones and Infowars ban in spring 2019 drew upon years of examples of his hate speech against Muslims, transgender people, and other groups. Under the company’s policies for dangerous individuals and organizations, Jones and Infowars would be permanently banned and Facebook would have to remove content that expressed support for the conspiracy theorist and his site.

In April 2019, a proposal for the recommended ban — complete with examples and comments from the public policy, legal, and communications teams — was sent by email to Monika Bickert, Facebook’s head of global policy management, and her boss, Kaplan. The proposal was then passed on to top company leadership, including Zuckerberg, sources said.

The Facebook CEO balked at removing posts that praised Jones and his ideas.

“Zuckerberg basically took the decision that he did not want to use this policy against Jones because he did not personally think he was a hate figure,” said a former policy employee.

The teams were directed to create an entirely new designation for Jones to fit the CEO’s request, and when the company announced the ban on May 2, it did not say it had changed its rules at Zuckerberg’s behest. The decisions, however, would have far-reaching implications, setting off a chain of events that ultimately contributed to the violent aftermath of the 2020 election.

Two former policy employees said the process made content policy teams hesitant to recommend new actions, resulting in a “freeze” on new designations for dangerous individuals and organizations for roughly a year. That delay, they said, effectively allowed extremist groups to use Facebook to recruit, organize, and grow their memberships through most of 2020.

“Once the Alex Jones thing had blown over, they froze designations, and that lasted for close to a year, and they were very rarely willing to push through anything. That impacted the lead-up to the election last year. Teams should have been reviewing the Oath Keepers and Three Percenters, and essentially these people weren’t allowed to,” said the policy employee, referring to right-wing militant organizations that Facebook started to remove in August 2020.

The Washington Post reported on Saturday that the Justice Department and FBI are investigating links between Jones and the Capitol rioters.

The company could have acted much earlier, one Facebook researcher wrote on the internal message board when they quit in August. The note came with a warning: “Integrity teams are facing increasing barriers to building safeguards.” They wrote of how proposed platform improvements that were backed by strong research and data had been “prematurely stifled or severely constrained … often based on fears of public and policy stakeholder responses.”

“We were willing to act only *after* things had spiraled into a dire state.”

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” they wrote, criticizing the company for being hesitant to take action against the QAnon mass delusion. “In the meantime, the fringe group/set of beliefs has grown to national prominence with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had spiraled into a dire state.”

Though the 2020 election is long over, current and former employees say politics continue to seep into Facebook product and feature decisions. Four sources said they were concerned about Kaplan’s influence over which content is recommended in News Feed. Given his role courting politicians, they said, there is a fundamental conflict of interest in both appeasing government officials or candidates and deciding what people see on the platform.

For weeks prior to the election, misinformation was spreading across Facebook, undermining trust in the integrity of how votes would be counted. To improve the quality of content in the News Feed, executives decided the site would emphasize News Ecosystem Quality (NEQ), an internal score given to publishers based on assessments of their journalism, in its ranking algorithm, according to the New York Times.

This and other “break glass” measures improved the quality of content on people’s News Feeds so much that John Hegeman, the vice president responsible for the feature, pushed to continue them indefinitely, according to three people familiar with the situation who spoke to BuzzFeed News. Yet Hegeman’s suggestion was opposed by Kaplan and members of the policy team. The temporary measures eventually expired.

Hegeman did not respond to a request for comment.

In the days following the insurrection, Facebook reemphasized NEQ in its News Feed ranking algorithm. Stone, the company spokesperson, said that change was temporary and has already been “rolled back.”

“Our Leadership Isn’t Doing Enough”

In the aftermath of the 2020 election, some departing Facebook employees have openly criticized leadership as they’ve exited. “I’ve grown more disillusioned about our company and the role we play in society,” a nearly eight-year veteran said, adding that they were saddened and infuriated by leadership’s failure to recognize or minimize the “real negatives” the company introduces to the world.

“I think the people working in these areas are working as hard as they can and I commend them for their efforts,” they wrote. “However, I do think our leadership isn’t doing enough.”

Beyond a profound concern over the influence of Kaplan’s policy team, a number of Facebook employees attributed the company’s content policy problems to Zuckerberg and his view that the platform must always be a balance of right and left.

“Ideology is not, and should not be, a protected class,” a content policy employee who left weeks after the election wrote. “White supremacy is an ideology; so is anarchism. Neither view is immutable, nor should either be beyond scrutiny. The idea that our content ranking decisions should be balanced on a scale from right to left is impracticable … and frankly can be dangerous, as one side of that scale actively challenges core democratic institutions and fails to recognize the results of a free and fair election.”

In October 2020, Facebook responded to ongoing criticism of its policy decisions by introducing an Oversight Board, an independent panel to hear appeals on content takedowns. But the former policy employee with insight into the Alex Jones ban said that significant changes to rules and enforcement will always come down to Zuckerberg.

“Joel [Kaplan] has influence for sure, but at the end of the day Mark owns this stuff,” they said. “Mark has consolidated so much of this political decision-making power in himself.” ●


Updating Special Ad Audiences for housing, employment, and credit advertisers


On June 21, 2022, we announced an important settlement with the US Department of Housing and Urban Development (HUD) that will change the way we deliver housing ads to people residing in the US. Specifically, we are building into our ads system a method designed to make sure the audience that ends up seeing a housing ad more closely reflects the eligible targeted audience for that ad.

As part of this agreement, we will also be sunsetting Special Ad Audiences, a tool that lets advertisers expand their audiences for ad sets related to housing. We are choosing to sunset the tool for employment and credit ads as well. In 2019, in addition to eliminating certain targeting options for housing, employment and credit ads, we introduced Special Ad Audiences as an alternative to Lookalike Audiences. But the field of fairness in machine learning is a dynamic and evolving one, and Special Ad Audiences was an early way to address concerns. Now, our focus will move to new approaches to improve fairness, including the method previously announced.

What’s happening: We’re removing the ability to create Special Ad Audiences via Ads Manager beginning on August 25, 2022.

Beginning October 12, 2022, we will pause any remaining ad sets that contain Special Ad Audiences. These ad sets may be restarted once advertisers have removed any and all Special Ad Audiences from them. We are providing a two-month window between preventing new Special Ad Audiences and pausing existing ones to give advertisers time to adjust budgets and strategies as needed.


For more details, please visit our Newsroom post.

Impact to Advertisers using Marketing API on September 13, 2022

For advertisers and partners using the API endpoints listed below, the blocking of new Special Ad Audience creation will be a breaking change on all versions. Beginning August 15, 2022, developers can start to implement the code changes and will have until September 13, 2022, when the non-versioning change occurs and prior values are deprecated. The impacted endpoints related to this deprecation are:

For reading audience:

  • endpoint gr:get:AdAccount/customaudiences
  • field operation_status

For adset creation:

  • endpoint gr:post:AdAccount/adsets
  • field subtype

For adset editing:

  • endpoint gr:post:AdCampaign
  • field subtype

For custom audience creation:

  • endpoint gr:post:AdAccount/customaudiences
  • field subtype

For custom audience editing:

  • endpoint gr:post:CustomAudience

Please refer to the developer documentation for further details to support code implementation.
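To make the audit concrete, below is a minimal sketch, not an official snippet, of how an advertiser might enumerate an ad account’s custom audiences using the read endpoint and fields named above. The access token, ad account ID, API version, and the exact subtype value that identifies a Special Ad Audience are assumptions; the developer documentation is the authoritative reference.

```python
# Minimal sketch (assumptions noted in comments): list an ad account's custom
# audiences so any that look like Special Ad Audiences can be found and removed
# from ad sets before the October 12, 2022 pause described above.
import requests

GRAPH_URL = "https://graph.facebook.com/v14.0"  # API version is an assumption
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"              # placeholder Marketing API token
AD_ACCOUNT_ID = "act_1234567890"                # placeholder ad account ID


def list_custom_audiences():
    """Read custom audiences with the fields named in the deprecation notice.

    Pagination is omitted for brevity; large accounts would need to follow
    the paging cursors in the response.
    """
    resp = requests.get(
        f"{GRAPH_URL}/{AD_ACCOUNT_ID}/customaudiences",
        params={
            "fields": "id,name,subtype,operation_status",
            "access_token": ACCESS_TOKEN,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("data", [])


if __name__ == "__main__":
    for audience in list_custom_audiences():
        subtype = audience.get("subtype", "")
        # The exact subtype naming for Special Ad Audiences is assumed here.
        if "SPECIAL" in subtype.upper():
            print(f"Review ad sets using audience {audience['id']} "
                  f"({audience.get('name')}), subtype={subtype}")
```

Any audiences flagged this way would need to be removed from their ad sets before October 12, 2022, to avoid the pause described above.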



Introducing an Update to the Data Protection Assessment


Over the coming year, some apps with access to certain types of user data on our platforms will be required to complete the annual Data Protection Assessment. We have made a number of improvements to this process since our launch last year, when we introduced our first iteration of the assessment.

The updated Data Protection Assessment will include a new developer experience that is enhanced through streamlined communications, direct support, and clear status updates. Today, we’re sharing what you can expect from these new updates and how you can best prepare for completing this important privacy requirement if your app is within scope.

If your app is in scope for the Data Protection Assessment, and you’re an app admin, you’ll receive an email and a message in your app’s Alert Inbox when it’s time to complete the annual assessment. You and your team of experts will then have 60 calendar days to complete the assessment. We’ve built a new platform that enhances the user experience of completing the Data Protection Assessment. These updates to the platform are based on learnings over the past year from our partnership with the developer community. When completing the assessment, you can expect:

  • Streamlined communication: All communications and required actions will be through the My Apps page. You’ll be notified of pending communications requiring your response via your Alerts Inbox, email, and notifications in the My Apps page.

    Note: Other programs may still communicate with you through the App Contact Email.

  • Available support: You can engage with Meta teams via the Support tool to seek clarification on the questions within the Data Protection Assessment prior to submission, to get help with any requests for more information, or to resolve violations.

    Note: To access this feature, you will need to add the app and app admins to your Business Manager. Please refer to those links for step-by-step guides.

  • Clear status updates: Easy to understand status and timeline indicators throughout the process in the App Dashboard, App Settings, and My Apps page.
  • Straightforward reviewer follow-ups: Streamlined experience for any follow-ups from our reviewers, all via developers.facebook.com.

We’ve included a brief video that provides a walkthrough of the experience you’ll have with the Data Protection Assessment.


The Data Protection Assessment elevates the importance of data security and helps us earn the trust of the billions of people who use our products and services around the world. That’s why we are committed to providing a seamless experience for our partners as they complete this important privacy requirement.

Here is what you can do now to prepare for the assessment:

  1. Make sure you are reachable: Update your developer or business account contact email and notification settings.
  2. Review the questions in the Data Protection Assessment and engage with your teams on how best to answer these questions. You may have to enlist the help of your legal and information security points of contact to answer some parts of the assessment.
  3. Review Meta Platform Terms and our Developer Policies.

We know that when people choose to share their data, we’re able to work with the developer community to safely deliver rich and relevant experiences that create value for people and businesses. It’s a privilege we share when people grant us access to their data, and it’s imperative that we protect that data in order to maintain and build upon their trust. This is why the Data Protection Assessment focuses on data use, data sharing and data security.

Data privacy is challenging and complex, and we’re dedicated to continuously improving the processes to safeguard user privacy on our platform. Thank you for partnering with us as we continue to build a safer, more sustainable platform.



Resources for Completing App Store Data Practice Questionnaires for Apps That Include the Facebook or Audience Network SDK



First seen at developers.facebook.com
