Facebook message encryption won’t happen until well into 2022

Facebook has said it will take until next year to implement a controversial plan to encrypt its messaging apps, amid concerns from police and child safety campaigners that the move will make it easier for criminals to share abuse images.

Antigone Davis, Facebook’s head of safety, told The Telegraph that the social networking giant expected to make the change at some point in 2022. A spokesperson added that this would be the earliest it would encrypt messages.

Facebook’s chief executive Mark Zuckerberg promised in 2019 that the company would allow its messaging apps – Messenger, Instagram, and WhatsApp – to send messages between one another, as well as introduce end-to-end encryption, which means only the sender and receiver can read the messages. Currently, only WhatsApp’s messages are end-to-end encrypted.

The move has been seen as a way for Facebook to burnish its privacy credentials and boost its messaging apps against rivals such as Apple’s iMessage. However, critics say it would make it more difficult for regulators to break up the company.

The plan has also alarmed security services and campaigners since Facebook is the world’s biggest reporter of child abuse, flagging thousands of cases to police last year.

“I think we’re looking at a launch of encryption that’s well into 2022,” Ms Davis said, adding that it was taking time to work through details of the move. Facebook has previously said that it would take years to make the switch but has not given a date.

Ms Davis added that Facebook had not taken a decision on whether Messenger Kids, a service for under-13s that allows parents to view their children’s messages, would be included in the company’s encryption plans. 

Facebook has not released Messenger Kids in the UK or Europe amid concerns from data regulators, but has said it hopes to expand it to more countries in future. The company has come under criticism for planning to encrypt messages, with MPs warning earlier this year that the changes would make it easier for paedophiles to share images.

Ministers were this week reported to be considering forcing Facebook to provide a “backdoor” that would let police access messages once encryption is introduced. The lengthy implementation timeline is believed to be down to the technical challenges of integrating the messaging services and making them work across different devices, as well as concerns from campaigners.

Ms Davis said Facebook was developing alternative ways to deter paedophiles from using the service once encryption is in place, such as preventing connections between abusers and children and encouraging users to report illegal imagery. Facebook says it has already been forced to turn off its child abuse detection systems in the EU due to recently introduced privacy laws.

Facebook oversight board rulings so far

Upheld

  • The content: A post used a slur to refer to Azerbaijanis in the caption of photos it said showed churches in the country’s capital.
  • Why Facebook removed it: The company said the post violated its policy against hate speech.
  • Why the board agreed: The way the term was used “makes clear it was meant to dehumanize its target.”

Overturned

  • The content: A user in Myanmar posted photos of a Syrian child who had drowned trying to reach Europe, and suggested that Muslims were disproportionately upset by killings in France over cartoon depictions of the Prophet Muhammad compared with China’s treatment of Uyghur Muslims.
  • Why Facebook removed it: The company said the content violated its policy against hate speech.
  • Why the board overruled Facebook: While the comments could be seen as offensive, they did not rise to the level of what Facebook considers hate speech.

  • The content: An Instagram post about breast cancer awareness from a user in Brazil showed women’s nipples.
  • Why Facebook removed it: The company’s automated content moderation system removed the post for violating a policy against sharing nude photos. Facebook restored the post after the oversight board decided to hear the case, but before it ruled.
  • Why the board overruled Facebook: Facebook’s policy on nudity contains an exception for “breast cancer awareness.” The board added that the automated removal showed a “lack of proper human oversight which raises human rights concerns.”

  • The content: A user posted a quote that the person misattributed to Nazi propagandist Joseph Goebbels.
  • Why Facebook removed it: The company said it violated its policy against “dangerous individuals and organizations.”
  • Why the board overruled Facebook: The board said the post did not promote Nazi propaganda but criticized Nazi rule.

  • The content: A user in France falsely claimed that a certain drug cocktail could cure Covid-19 and berated the French government for refusing to make the treatment available.
  • Why Facebook removed it: The company said the post violated its policy against misinformation that could cause real-world harm, arguing that it could lead people to ignore health guidance or attempt to self-medicate.
  • Why the board overruled Facebook: The post did not represent an imminent harm to people’s lives because its aim was to change a government policy and it did not advocate taking the drugs without a doctor’s prescription.

  • The content: A post in a Facebook group for Indian Muslims included a meme that appeared to threaten violence against non-Muslims. It also called French President Emmanuel Macron the devil and urged a boycott of French goods.
  • Why Facebook removed it: The company said the post contained a “veiled threat” and violated its policy against inciting violence.
  • Why the board overruled Facebook: The post, while incendiary, did not pose an imminent risk of violence and its removal overly restricted the user’s freedom of expression.

Trump faces a narrow path to victory against Facebook suspension

The key factors, people tracking the board’s deliberations said, will include whether the board thinks Facebook set clear enough rules and gave Trump a fair shake. Another will be what kind of case the board thinks it’s weighing — a narrow, “legalistic” debate about one person’s freedom of expression or a broader one about the public’s right to safety.

The board, often likened to Facebook’s Supreme Court, has the power to overrule decisions even by top executives like CEO Mark Zuckerberg. Its ruling on Trump will be the group’s highest-profile yet, with momentous implications for U.S. politics and potentially the company’s treatment of other world leaders.

Here are the make-or-break factors that could determine Trump’s fate on Facebook:

A point for Trump: The board’s early rulings bode well for his case

The oversight board’s decisions so far would seem to offer favorable omens for Trump: It has ruled against Facebook and ordered content restored in almost every case it has reviewed since its launch before the 2020 U.S. elections.

Two aspects of those decisions could work especially well for the former president: the board’s commitment to freedom of expression, and a big emphasis on whether Facebook made its policies clear enough for users.

The early rulings showed that the board values free expression “very highly,” said Evelyn Douek, a lecturer at Harvard Law School who has closely followed the oversight board’s work.

“They put a lot of weight on the importance of voice and the importance of free expression and free speech and they really put the onus on Facebook to heavily justify any restrictions that they wanted,” she said.

The board could decide that Facebook’s policy against incitement to violence isn’t clear enough. That policy was the company’s main justification for booting Trump after the assault on the Capitol, during which he had repeated his false claims of a stolen election and attacked Vice President Mike Pence for certifying Joe Biden’s victory.

“One thing that really struck me in their initial decisions was kind of how much of their analysis focused on lack of clarity in Facebook’s policies, and really pointing to that as a rationale for saying content has to be restored on the platform,” said Emma Llansó of the nonprofit Center for Democracy & Technology, which receives funding from Facebook and other tech companies.

When Facebook announced Trump’s suspension on Jan. 7, Zuckerberg said the risk of further violence if the platform allowed him to remain active was “simply too great.” The company’s rules say Facebook can “remove language that incites or facilitates serious violence” or “when we believe there is a genuine risk of physical harm or direct threats to public safety.” The policy also says Facebook may consider additional context in such cases, such as whether a user’s prominence adds to the danger.

But the board’s decision may turn on whether those policies gave Trump sufficient notice of what behavior would violate the rules — in other words, whether he received due process.

Under “the most narrow kind of legalistic interpretation,” Llansó said, “they might well conclude that Trump’s account should go back up.”

A point for Facebook: Trump got a lot of warnings

On the other hand, due process concerns may matter a lot less when dealing with Trump, a public figure who had repeated run-ins with the site’s rules.

“When it comes to [Facebook’s] decision making, it’s not really been clear to users, generally, about where the lines are drawn,” said David Kaye, a professor at the University of California at Irvine and a former United Nations special rapporteur. “But I don’t think any of that really applies to Trump. I mean, for months, all the platforms had been basically signaling to Trump pretty clearly that you are coming up to the line, if not crossing over it with respect to our rules.”

Trump spent years butting heads with Facebook over its standards, including posts before and after the election that the company either adorned with warning labels or took down entirely for making unfounded claims about the election or the coronavirus pandemic.

That should have made it clear to him and his accounts’ handlers that he was at risk for more forceful action, Douek said.

“There have been years of battle between Facebook and years of contestation around Trump’s presence on the platform, and it absolutely can’t be said that he didn’t have an idea that he was breaching Facebook’s policies,” she said.

Facebook took down more Trump posts immediately after the Capitol riots on Jan. 6, declaring it an “emergency situation” and warning that his online rhetoric “contributes to rather than diminishes the risk of ongoing violence.” It suspended him the following day.

A point for Trump: Critics say Facebook’s enforcement has been uneven

Facebook’s much-scrutinized track record in policing Trump’s posts could play in his favor, though.

Daniel Kreiss, a media professor at the University of North Carolina, argued that the social media giant spent years essentially ignoring Trump’s violations of its rules because the company stuck to an “overly narrow interpretation” of them.

That could hurt the company’s case, he said, if the board believes that the company suddenly adopted a broader interpretation of its policies in handling Trump’s posts on and after Jan. 6.

“A lot of this comes back to Facebook’s own failures over the last year,” Kreiss said.

In his Jan. 7 post, Zuckerberg said Facebook had let Trump use the platform “consistent with our own rules,” but that the storming of the Capitol dramatically changed the dynamics. “The current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government,” the CEO said.

But critics have skewered the company for not taking a more aggressive stance against Trump’s repeated, unsubstantiated claims of widespread voter fraud in the 2020 elections, as well as earlier posts such as his warning to racial justice protesters last May that “when the looting starts, the shooting starts.” Zuckerberg rejected such criticisms nearly a year ago, saying that “our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies.”

The perceived inconsistency, coupled with the oversight board’s initial decisions, could mean Trump is bound for a comeback, Kreiss argued.

“If I was a betting man, I would say that the early rulings would lead me to expect that the oversight board will overturn Facebook’s decisions,” he said.

A point for Facebook: Trump’s case defies precedent

Perhaps the biggest factor in Facebook’s favor is the fact that Trump’s case breaks any semblance of precedent the board could have established in its early rulings, the people tracking its deliberations said.

None of the previous cases directly involved a government leader — let alone the leader of the free world, or one accused of inciting a deadly attack in the seat of his own democracy. Plus, all the past disputes were about Facebook’s decisions to take down specific pieces of content, not the suspension of someone’s entire account.

“The thing about the Trump case is it’s so sui generis and exceptional,” Douek said.

“This just does seem a case that in some ways, is set apart … because of the magnitude of it in terms of how important this person is,” said University of North Carolina media professor Shannon McGregor, who co-wrote a piece with Kreiss calling for the oversight board to uphold Trump’s suspension.

Facebook in fact leaned on the unparalleled nature of the case when it referred Trump’s suspension to the oversight board on Jan. 21, kicking off the at-most 90-day review period.

“Our decision to suspend then-President Trump’s access was taken in extraordinary circumstances: a US president actively fomenting a violent insurrection designed to thwart the peaceful transition of power; five people killed; legislators fleeing the seat of democracy,” said Facebook global affairs chief Nick Clegg, a former British deputy prime minister.

He added, “This has never happened before — and we hope it will never happen again. It was an unprecedented set of events which called for unprecedented action.”

That could mean that even if the board takes issue with how Facebook arrived at its decision, it could still agree with its conclusion.

“I would probably fall on the side of: They will not order his account restored, but with an opinion that explains a lot of things Facebook needs to change about their policies to make that outcome clearer and more predictable in the future,” Llansó said.

A point for Facebook: The board is big on human rights

Trump and his conservative allies have long accused Facebook and other social media sites of trampling on free speech by unevenly restricting their content, a charge the companies deny. The criticism borrows from the American tradition of largely unfettered self-expression, a tradition that Zuckerberg himself has proclaimed as a core value for Facebook.

But researchers said they expect the oversight board to look at Trump’s suspension through a wider human rights lens, which would put a greater emphasis on how Trump’s speech could harm others.

“What human rights law does, when it comes to freedom of expression, is it looks at not just the freedom to impart information, but also the freedom to seek and receive it, and it provides a kind of framework for thinking about the impact that speech can have on others,” Kaye said.

That doesn’t bode well for Trump, Kaye said, because it would mean Trump’s right to express himself freely on Facebook wouldn’t necessarily be an overriding factor in the board’s decision.

Still, some aren’t convinced the board will take that broad an approach to the case.

Paul Barrett, deputy director at the NYU Stern Center for Business and Human Rights and a former Bloomberg columnist, argued in an article that the board’s earlier decisions “tended to frame the factual context of the disputed posts in a narrow way, an approach that can minimize the potential harm the speech in question could cause.”

He added, “If carried over to the Trump decision, these inclinations would help him.”

But onlookers should be careful not to read too much into the board’s initial rulings, Douek said.

“Predicting the future is always a bad idea, and it’s kind of stupid to do it on such a small sample,” she said.

Chris Cox earned cash and stock worth $69 million after rejoining Facebook last year

Facebook Chief Product Officer Chris Cox got paid approximately $69 million to rejoin the social media company in 2020.

Cox rejoined Facebook in June 2020 after a hiatus of more than a year. He is a key moral leader at Facebook and one of its top strategic executives, coordinating how the company’s various services work with one another.

According to the company’s 2021 proxy statement, Cox earned more than $421,000 in salary, more than $691,000 in bonus, and stock awards with a fair value of nearly $68 million, which will vest over the next four years.

Cox’s salary and bonus were down from his earnings in 2018, his last full year with the company. However, his stock compensation was up drastically, with Facebook noting that the massive award was made in connection with Cox rejoining the company.

The company also notes that Cox will earn an additional cash award of $4 million this year, to be paid within 30 days of the one-year anniversary of his return to Facebook.

Additionally, the proxy states that Facebook paid Cox $90,000 for serving as a strategic advisor while he was away from the company.
