This News Publisher Quit Facebook. Readership Went Up.

Last year, in the middle of the pandemic, Sinead Boucher offered $1 to buy Stuff, New Zealand’s largest news publisher.

Boucher was already the company’s chief executive and was worried that its Australian media owner would shut down the publisher. Things had started to look really grim: the economy had ground to a halt and advertising revenue had evaporated.

“I knew that they … would potentially just decide to wind us up,” said Boucher. “So it was just a punt.”

The punt worked. The Australian company accepted the offer of NZ$1, worth about 70 U.S. cents. Just like that, in May, Boucher found herself the owner of the most popular news website in New Zealand and around 50 local, regional and national newspapers and magazines.

One of the first decisions she made as owner was to break up with Facebook.

How Boucher reached that point illustrates just how complicated the relationship between Facebook and news publishers has become over the years.

Stopped all posts on Facebook and Instagram

Boucher had worked for years as a reporter and editor before moving into management. She saw the change of ownership as an opportunity to rethink Stuff’s approach to covering the news in New Zealand, and to focus the company on building trust with readers.

“People sometimes say to me, ‘It must be so stressful now, owning the company,'” she said. “Actually, in a lot of ways, it’s been liberating, because there is a freedom in being able to make your own decisions, for better or worse.”

Stuff’s decision to quit Facebook did not come out of the blue. In 2019, after a horrific mass shooting at mosques in Christchurch, New Zealand, was live streamed on Facebook, the company began to reconsider its relationship with the social network, and stopped spending money to advertise there.

Just over a year later, in July 2020, Stuff launched a new experiment: its website, newspapers and magazines stopped posting anything at all on Facebook and its subsidiary Instagram.

Boucher said the company was increasingly questioning whether being on the social network was “compatible” with its new focus on trust.

“We still felt uneasy with a lot of the decisions Facebook has made or a lot of the things they turned a blind eye to,” she said, such as hate speech and misinformation.

Overall traffic rose, even though social media traffic fell

But this decision was also risky: almost a quarter of Stuff’s traffic came from social media — and mainly from Facebook. “We were expecting that it would bring a significant drop in our traffic,” Boucher said.

Those fears never materialized. While Stuff’s social media traffic did drop, overall traffic went up.

Boucher attributes that in part to 2020 being such a busy news year, from the pandemic to global racial justice protests to big elections in New Zealand and, of course, the U.S.

“If we had remained on Facebook, we might have had another 5% growth,” she said. “But even if we throttled our growth … it’s brought us a lot of positives.”

Donations from readers went up, for example, once word of Stuff’s decision leaked out. Boucher said the company’s newsrooms felt the impact, too.

“We can definitely see a change in the way people react to us and talk to us,” she said. “Hearing anecdotally from journalists, they feel like they’ve been able to get interviews they would not have got before. They feel that it really has contributed to people trusting us more, thinking about us as an organization with a clear set of values.”

A one-month pause turns into a long-term breakup

The Facebook pause was supposed to last a few weeks. It’s now been more than eight months — and Boucher says she has no plans to return to Facebook.

Breaking up with Facebook is not an easy decision for any publisher. Many news outlets feel they need to be on Facebook to reach people. A third of Americans say they regularly get their news on the social network, according to the Pew Research Center. At the same time, the news business is in real financial trouble, because a big chunk of advertising dollars now goes to Facebook, as well as Google.

In fact, Facebook has become such a dominant force in how people get information that some governments are trying to force the social network to pay media outlets for news stories. Facebook recently protested a proposed Australian law by briefly cutting off all news in the country. (The social network backed down after the government made some changes to the measure.)

Facebook says the dynamics of its relationship with publishers are widely misunderstood — that, in fact, publishers need the social network more than it needs them.

But Boucher says her experience suggests publishers should prioritize a different relationship: the direct one they have with their audience.

Fears that Stuff would collapse were “baseless”

Still, she isn’t willing to say never. She describes the move as “an ongoing experiment rather than a black and white permanent decision.” That’s because she worries about one big downside: if Stuff is not posting its stories on Facebook, does that leave a hole that will be filled by false or misleading information?

“We’re really conscious that a lot of people still use the platform as a primary way to access news and information,” she said. “We wonder about the risk of withdrawing journalism from Facebook and what that leaves behind for people to be exposed to, particularly around things like COVID vaccine.”

Even so, Boucher says she is confident she made the right decision. And she says that while Stuff is one company, in one small country, it has a lesson for publishers of all sizes: there is life beyond Facebook.

“Taking that step and deciding to just give it a go taught us so much,” she said. “All our fears that without these platforms, we would just collapse — that was baseless.”

Editor’s note: Facebook and Google are among NPR’s financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Facebook oversight board rulings so far


A person photographs the sign outside of Facebook headquarters in Menlo Park, Calif. | Paul Sakuma/AP Photo

Upheld

  • The content: A post used a slur to refer to Azerbaijanis in the caption of photos it said showed churches in the country’s capital.
  • Why Facebook removed it: The company said the post violated its policy against hate speech.
  • Why the board agreed: The way the term was used “makes clear it was meant to dehumanize its target.”

Overturned

  • The content: A user in Myanmar posted photos of a Syrian child who had drowned trying to reach Europe, and suggested that Muslims were disproportionately upset by killings in France over cartoon depictions of the Prophet Muhammad compared with China’s treatment of Uyghur Muslims.
  • Why Facebook removed it: The company said the content violated its policy against hate speech.
  • Why the board overruled Facebook: While the comments could be seen as offensive, they did not rise to the level of what Facebook considers hate speech.

  • The content: An Instagram post about breast cancer awareness from a user in Brazil showed women’s nipples.
  • Why Facebook removed it: The company’s automated content moderation system removed the post for violating a policy against sharing nude photos. Facebook restored the post after the oversight board decided to hear the case, but before it ruled.
  • Why the board overruled Facebook: Facebook’s policy on nudity contains an exception for “breast cancer awareness.” The board added that the automated removal showed a “lack of proper human oversight which raises human rights concerns.”

  • The content: A user posted a quote that the person misattributed to Nazi propagandist Joseph Goebbels.
  • Why Facebook removed it: The company said it violated its policy against “dangerous individuals and organizations.”
  • Why the board overruled Facebook: The board said the post did not promote Nazi propaganda but criticized Nazi rule.

  • The content: A user in France falsely claimed that a certain drug cocktail could cure Covid-19 and berated the French government for refusing to make the treatment available.
  • Why Facebook removed it: The company said the post violated its policy against misinformation that could cause real-world harm, arguing that it could lead people to ignore health guidance or attempt to self-medicate.
  • Why the board overruled Facebook: The post did not represent an imminent harm to people’s lives because its aim was to change a government policy and it did not advocate taking the drugs without a doctor’s prescription.

  • The content: A post in a Facebook group for Indian Muslims included a meme that appeared to threaten violence against non-Muslims. It also called French President Emmanuel Macron the devil and urged a boycott of French goods.
  • Why Facebook removed it: The company said the post contained a “veiled threat” and violated its policy against inciting violence.
  • Why the board overruled Facebook: The post, while incendiary, did not pose an imminent risk of violence and its removal overly restricted the user’s freedom of expression.

Trump faces a narrow path to victory against Facebook suspension


The key factors, according to people tracking the board’s deliberations, will include whether the board thinks Facebook set clear enough rules and gave Trump a fair shake. Another will be what kind of case the board thinks it’s weighing — a narrow, “legalistic” debate about one person’s freedom of expression or a broader one about the public’s right to safety.

The board, often likened to Facebook’s Supreme Court, has the power to overrule decisions even by top executives like CEO Mark Zuckerberg. Its ruling on Trump will be the group’s highest-profile yet, with momentous implications for U.S. politics and potentially the company’s treatment of other world leaders.

Here are the make-or-break factors that could determine Trump’s fate on Facebook:

A point for Trump: The board’s early rulings bode well for his case

The oversight board’s decisions so far would seem to offer favorable omens for Trump: It has ruled against Facebook and ordered content restored in almost every case it has reviewed since its launch before the 2020 U.S. elections.

Two aspects of those decisions could work especially well for the former president: the board’s commitment to freedom of expression, and a big emphasis on whether Facebook made its policies clear enough for users.

The early rulings showed that the board values free expression “very highly,” said Evelyn Douek, a lecturer at Harvard Law School who has closely followed the oversight board’s work.

“They put a lot of weight on the importance of voice and the importance of free expression and free speech and they really put the onus on Facebook to heavily justify any restrictions that they wanted,” she said.

The board could decide that Facebook’s policy against incitement to violence isn’t clear enough. That policy was the company’s main justification for booting Trump after the assault on the Capitol, during which he had repeated his false claims of a stolen election and attacked Vice President Mike Pence for certifying Joe Biden’s victory.

“One thing that really struck me in their initial decisions was kind of how much of their analysis focused on lack of clarity in Facebook’s policies, and really pointing to that as a rationale for saying content has to be restored on the platform,” said Emma Llansó of the nonprofit Center for Democracy & Technology, which receives funding from Facebook and other tech companies.

When Facebook announced Trump’s suspension on Jan. 7, Zuckerberg said the risk of further violence if the platform allowed him to remain active was “simply too great.” The company’s rules say Facebook can “remove language that incites or facilitates serious violence” or “when we believe there is a genuine risk of physical harm or direct threats to public safety.” The policy also says Facebook may consider additional context in such cases, such as whether a user’s prominence adds to the danger.

But the board’s decision may turn on whether those policies gave Trump sufficient notice of what behavior would violate the rules — in other words, whether he received due process.

Under “the most narrow kind of legalistic interpretation,” Llansó said, “they might well conclude that Trump’s account should go back up.”

A point for Facebook: Trump got a lot of warnings

On the other hand, due process concerns may matter a lot less when dealing with Trump, a public figure who had repeated run-ins with the site’s rules.

“When it comes to [Facebook’s] decision making, it’s not really been clear to users, generally, about where the lines are drawn,” said David Kaye, a professor at the University of California at Irvine and a former United Nations special rapporteur. “But I don’t think any of that really applies to Trump. I mean, for months, all the platforms had been basically signaling to Trump pretty clearly that you are coming up to the line, if not crossing over it with respect to our rules.”

Trump spent years butting heads with Facebook over its standards, including posts before and after the election that the company either adorned with warning labels or took down entirely for making unfounded claims about the election or the coronavirus pandemic.

That should have made it clear to him and his accounts’ handlers that he was at risk for more forceful action, Douek said.

“There have been years of battle between Facebook and years of contestation around Trump’s presence on the platform, and it absolutely can’t be said that he didn’t have an idea that he was breaching Facebook’s policies,” she said.

Facebook took down more Trump posts immediately after the Capitol riots on Jan. 6, declaring it an “emergency situation” and warning that his online rhetoric “contributes to rather than diminishes the risk of ongoing violence.” It suspended him the following day.

A point for Trump: Critics say Facebook’s enforcement has been uneven

Facebook’s much-scrutinized track record in policing Trump’s posts could play in his favor, though.

Daniel Kreiss, a media professor at the University of North Carolina, argued that the social media giant spent years essentially ignoring Trump’s violations of its rules because the company stuck to an “overly narrow interpretation” of them.

That could hurt the company’s case, he said, if the board believes that the company suddenly adopted a broader interpretation of its policies in handling Trump’s posts on and after Jan. 6.

“A lot of this comes back to Facebook’s own failures over the last year,” Kreiss said.

In his Jan. 7 post, Zuckerberg said Facebook had let Trump use the platform “consistent with our own rules,” but that the storming of the Capitol dramatically changed the dynamics. “The current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government,” the CEO said.

But critics have skewered the company for not taking a more aggressive stance against Trump’s repeated, unsubstantiated claims of widespread voter fraud in the 2020 elections, as well as earlier posts such as his warning to racial justice protesters last May that “when the looting starts, the shooting starts.” Zuckerberg rejected such criticisms nearly a year ago, saying that “our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies.”

The perceived inconsistency, coupled with the oversight board’s initial decisions, could mean Trump is bound for a comeback, Kreiss argued.

“If I was a betting man, I would say that the early rulings would lead me to expect that the oversight board will overturn Facebook’s decisions,” he said.

A point for Facebook: Trump’s case defies precedent

Perhaps the biggest factor in Facebook’s favor is the fact that Trump’s case breaks any semblance of precedent the board could have established in its early rulings, the people tracking its deliberations said.

None of the previous cases directly involved a government leader — let alone the leader of the free world, or one accused of inciting a deadly attack in the seat of his own democracy. Plus, all the past disputes were about Facebook’s decisions to take down specific pieces of content, not the suspension of someone’s entire account.

“The thing about the Trump case is it’s so sui generis and exceptional,” Douek said.

“This just does seem a case that in some ways, is set apart … because of the magnitude of it in terms of how important this person is,” said University of North Carolina media professor Shannon McGregor, who co-wrote a piece with Kreiss calling for the oversight board to uphold Trump’s suspension.

Facebook in fact leaned on the unparalleled nature of the case when it referred Trump’s suspension to the oversight board on Jan. 21, kicking off a review period that can last at most 90 days.

“Our decision to suspend then-President Trump’s access was taken in extraordinary circumstances: a US president actively fomenting a violent insurrection designed to thwart the peaceful transition of power; five people killed; legislators fleeing the seat of democracy,” said Facebook global affairs chief Nick Clegg, a former British deputy prime minister.

He added, “This has never happened before — and we hope it will never happen again. It was an unprecedented set of events which called for unprecedented action.”

That could mean that even if the board takes issue with how Facebook arrived at its decision, it could still agree with its conclusion.

“I would probably fall on the side of: They will not order his account restored, but with an opinion that explains a lot of things Facebook needs to change about their policies to make that outcome clearer and more predictable in the future,” Llansó said.

A point for Facebook: The board is big on human rights

Trump and his conservative allies have long accused Facebook and other social media sites of trampling on free speech by unevenly restricting their content, a charge the companies deny. The criticism borrows from the American tradition of largely unfettered self-expression, a tradition that Zuckerberg himself has proclaimed as a core value for Facebook.

But researchers said they expect the oversight board to look at Trump’s suspension through a wider human rights lens, which would put a greater emphasis on how Trump’s speech could harm others.

“What human rights law does, when it comes to freedom of expression, is it looks at not just the freedom to impart information, but also the freedom to seek and receive it, and it provides a kind of framework for thinking about the impact that speech can have on others,” Kaye said.

That doesn’t bode well for Trump, Kaye said, because it would mean Trump’s right to express himself freely on Facebook wouldn’t necessarily be an overriding factor in the board’s decision.

Still, some aren’t convinced the board will take that broad an approach to the case.

Paul Barrett, deputy director at the NYU Stern Center for Business and Human Rights and a former Bloomberg columnist, argued in an article that the board’s earlier decisions “tended to frame the factual context of the disputed posts in a narrow way, an approach that can minimize the potential harm the speech in question could cause.”

He added, “If carried over to the Trump decision, these inclinations would help him.”

But onlookers should be careful not to read too much into the board’s initial rulings, Douek said.

“Predicting the future is always a bad idea, and it’s kind of stupid to do it on such a small sample,” she said.

Chris Cox earned cash and stock worth $69 million after rejoining Facebook last year


Chris Cox | Asa Mathat/Re/code

Facebook Chief Product Officer Chris Cox got paid approximately $69 million to rejoin the social media company in 2020.

Cox rejoined Facebook in June 2020 after a hiatus of more than a year. He is a key moral leader at Facebook and one of its top strategic executives, coordinating how the company’s various services work with one another.

According to the company’s 2021 proxy statement, Cox earned more than $421,000 in salary, more than $691,000 in bonus, and stock awards with a fair value of nearly $68 million, which will vest over the next four years.

Cox’s salary and bonus were down from his earnings in 2018, his last full year with the company. However, his stock compensation was up drastically, with Facebook noting that the massive award was made in connection with Cox rejoining the company.

The company also notes that Cox will earn an additional cash award of $4 million this year, to be paid within 30 days of the one-year anniversary of his return to Facebook.

Additionally, the proxy states that Facebook paid Cox $90,000 for serving as a strategic advisor while he was away from the company.
