FACEBOOK

Facebook Inc. Cl A stock underperforms Friday when compared to competitors – MarketWatch

Shares of Facebook Inc. Cl A (FB) slipped 2.24% to $364.72 Friday, on what proved to be an all-around grim trading session for the stock market, with the NASDAQ Composite Index (COMP) falling 0.91% to 15,043.97 and the Dow Jones Industrial Average (DJIA) falling 0.48% to 34,584.88. This was the stock’s third consecutive day of losses. Facebook Inc. Cl A closed $19.61 short of its 52-week high ($384.33), which the company reached on September 1st.

The stock underperformed when compared to some of its competitors Friday, as Microsoft Corp. (MSFT) fell 1.75% to $299.87, Alphabet Inc. Cl A (GOOGL) fell 1.96% to $2,816.00, and Twitter Inc. (TWTR) rose 0.29% to $62.47. Facebook’s trading volume (26.0 M) eclipsed its 50-day average volume of 12.4 M.


Editor’s Note: This story was auto-generated by Automated Insights, an automation technology provider, using data from Dow Jones and FactSet. See our market data terms of use.


FACEBOOK

‘Vaccine Talk’ Facebook Group Is A Carefully Moderated Forum For Vaccine Questions …


Kate Bilowitz moderates a Facebook group where people exchange views on vaccinations. She shares what moderating it has been like during the pandemic.


FACEBOOK

Facebook exec says stablecoins ‘probably’ require more regulation – Yahoo Finance


SEC Chair Gary Gensler put forward a wide-ranging view of potential cryptocurrency regulation at a Senate hearing this week, saying that a type of digital asset called stablecoins may be considered a security.

The comments come as the Treasury Department works with other federal agencies to draft a report by next month on potential regulations for stablecoins, a form of cryptocurrency that pegs its value to a commodity or currency, like the U.S. dollar.

New rules could draw support from a top industry player, Facebook’s (FB) David Marcus, who has spearheaded the tech giant’s soon-to-launch digital wallet called Novi. Marcus also sits on the board of the Diem Association, a coalition of corporate and non-profit members that aim to bring out a stablecoin called Diem that will be exchanged over the new digital wallet from Facebook.

In a new interview, taped prior to Gensler’s comments on Tuesday, Marcus told Yahoo Finance stablecoins “probably” will require additional regulation, which should focus on consumer protection as well as the prevention of illegal payments like money laundering.

“Do we need more regulation?” says Marcus, head of F2, also known as Facebook Financial. “The answer is probably ‘yes.'” 

“The first thing is really consumer protection,” he adds. “Do consumers understand what they’re buying? And what guarantees do they have to get their money out in an adverse event? And so that pertains to, if you’re talking about stable coins specifically, what are the reserves made of?

“Are the reserves fully backed? Or are they not fully backed? And if they are fully backed, what are they backed with?” he adds.

During Gensler’s testimony before the Senate Banking Committee on Tuesday, Democratic Senator Elizabeth Warren (D-MA) asked about the possibility of crypto investors attempting to withdraw money during a market crash. Gensler said the SEC could not do much to help investors since crypto exchanges like Coinbase (COIN) had not registered with the SEC. 

Treasury Secretary Janet Yellen last month urged speedy adoption of stablecoin rules in remarks to regulators.

Marcus said investor risks found in stablecoins depend on the commodities that back a given cryptocurrency.

“In my view, very high quality stable coins are only backed by cash and very short term treasuries,” he says. “That’s it.”

“Then you could add a capital buffer on top of that, to basically cover unexpected operational losses, or what have you to add another layer of protection,” he says.
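Marcus’s description of a well-backed stablecoin amounts to a simple arithmetic test: reserves of cash and short-term Treasuries should cover the coins in circulation, plus an extra capital buffer. The sketch below is purely illustrative and is not Diem or Novi code; the function name, the 2% buffer ratio, and all dollar figures are invented for the example.

```python
def is_fully_backed(cash, short_term_treasuries, coins_outstanding,
                    buffer_ratio=0.02):
    """Return True if reserves cover the outstanding coins plus a buffer.

    All amounts are in the same currency unit (e.g. USD), and coins are
    assumed pegged 1:1. buffer_ratio is a hypothetical extra cushion for
    "unexpected operational losses," in Marcus's phrase.
    """
    reserves = cash + short_term_treasuries
    required = coins_outstanding * (1 + buffer_ratio)
    return reserves >= required

# $1.02B in reserves backing 1B coins exactly meets a 2% buffer
print(is_fully_backed(520_000_000, 500_000_000, 1_000_000_000))  # True

# $900M in reserves falls short of full backing
print(is_fully_backed(400_000_000, 500_000_000, 1_000_000_000))  # False
```

The point of the buffer term is that a bare 1:1 reserve leaves no room for operational losses; requiring reserves above the peg is the "another layer of protection" Marcus describes.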

David Marcus, CEO of Facebook's Calibra digital wallet service, arrives for a House Financial Services Committee hearing on Facebook's proposed cryptocurrency on Capitol Hill in Washington, Wednesday, July 17, 2019. (AP Photo/Andrew Harnik)

Facebook aims to release Novi along with Diem by the end of the year, Marcus told Axios earlier this month. Diem, which emerged from Facebook’s effort to develop a cryptocurrency that began under the name Libra in 2017, will be pegged to the U.S. dollar, Marcus said.

Libra faced backlash from regulators and lawmakers when it was announced in 2019, and ultimately lost support from corporate backers like Visa (V) and PayPal (PYPL). 

Speaking to Yahoo Finance, Marcus said concerns over illicit payments with stablecoins offer an opportunity for regulators to improve the clarity of rules governing such transactions, even though stablecoins are rarely used for everyday payments today.

“We’re very motivated to solve the payments use case, but stable coins are mainly used right now for exchanges, when people are buying and selling other crypto assets,” he says.

“There are provisions around anti-money laundering, combating the financing of terrorism, sanctions enforcement — and I think the rules are pretty clear,” he says. “This actually offers an opportunity to get better at it than the current system is, which I think it will be.”


FACEBOOK

Facebook has an invisible system that shelters powerful rule-breakers. So do other online platforms.


Last week, the Wall Street Journal published Jeff Horwitz’s investigation into the inner workings of Facebook — with some troubling findings. Internal documents suggest that Facebook’s top management dismissed or downplayed an array of problems brought to their attention by product teams, internal researchers and their own Oversight Board. These include a report on what is known as the XCheck program, which reportedly allowed nearly any Facebook employee, at their own discretion, to whitelist users who were “newsworthy,” “influential or popular” or “PR risky.” The apparent result was that more than 5.8 million users were moderated according to different rules than ordinary Facebook users, or hardly moderated at all.

This system of “invisible elite tiers,” as the Journal describes it, meant that the speech of powerful and influential actors was protected while ordinary people’s speech was moderated by automated algorithms and overworked humans. As our research shows, that’s not surprising. Other platforms besides Facebook enforce different standards for different users, creating special classes of users as part of their business models.

Unequal and opaque standards can breed suspicion among users

In a recent research article, we explain how another important platform, YouTube, takes what we call a “tiered governance” approach, separating users into categories and applying different rules to each category’s videos. YouTube distinguishes among such categories as media partners, nonprofits and governments. Most important, it distinguishes between “creators” who get a slice of its ad revenue and ordinary users. Even among those paid creators, YouTube has a more subtle array of tiers according to popularity.
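At bottom, “tiered governance” means the rules applied to a piece of content depend on a lookup of the poster’s category rather than on one global rule set. The sketch below is hypothetical: the tier names and rule flags are invented for illustration and are not YouTube’s or Facebook’s actual categories.

```python
# Invented tiers loosely mirroring the article's description: ordinary users
# get automated moderation, partners get human review, and an elite few get
# a dedicated handler inside the company.
MODERATION_TIERS = {
    "ordinary":      {"auto_moderation": True,  "human_review": False, "handler": False},
    "partner":       {"auto_moderation": True,  "human_review": True,  "handler": False},
    "elite_creator": {"auto_moderation": False, "human_review": True,  "handler": True},
}

def rules_for(user_tier: str) -> dict:
    """Look up the moderation rules for a user's tier.

    Unknown tiers fall back to the strictest, fully automated treatment,
    matching the article's point that ordinary users are moderated by
    algorithms while elites get individual attention.
    """
    return MODERATION_TIERS.get(user_tier, MODERATION_TIERS["ordinary"])

print(rules_for("elite_creator")["handler"])        # True
print(rules_for("someone_new")["auto_moderation"])  # True
```

The opacity problem the researchers identify is not the lookup itself but that users cannot see the table: which tier they are in, or what each tier’s rules are.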

Facebook’s program began as a stopgap measure to avoid the public relations disasters that might happen if the platform hastily deleted content by someone powerful enough to fight back, such as a sitting president. YouTube’s program began when it created a special category of paid creators, the YouTube Partner Program, to give popular YouTubers incentives to stay on the site and make more content.

YouTube then began to create more intricate tiers, providing the most influential creators with special perks such as access to studios and camera equipment. An elite few had direct contact with handlers within the company who could help them deal with content moderation issues quickly, so that they didn’t lose money. But things changed when advertisers — YouTube’s main source of revenue — began to worry about their ads being shown together with offensive content. This drove YouTube to adjust its policies — over and over again — about which creators belonged to which tiers and what their benefits and responsibilities were, even if the creators didn’t like it.

Creators were understandably frustrated as these arrangements seemed to keep shifting under their feet. They didn’t object to different rules and sets of perks for different tiers of creators, but they did care that the whole system was opaque. Users like to know what to expect from platforms — whether they will enforce guidelines, and how much financial compensation they provide. They didn’t like the unpredictability of YouTube’s decisions, especially since those decisions had real social, financial and reputational impact.

Some were frustrated and suspicious about the platform’s real motives. Opacity and perceptions of unfairness provided fuel for conspiracy theories about why YouTube was doing what it was doing. Creators who didn’t know whether YouTube’s algorithms had demonetized or demoted their videos began to worry that their political leanings were being penalized. This led to anger and despair, worsened by YouTube’s clumsy appeals system. And it gave fodder to those eager to accuse YouTube of censorship, whether or not it was true.

It’s fair to be unfair, as long as you’re fair about it

Social media companies such as YouTube and Facebook have suggested that their platforms are open, meritocratic, impartial and evenhanded. This makes it hard for them to explain why they treat different people differently. However, other systems for adjudication make distinctions, too. For example, criminal law takes into account whether the accused is a child, impaired, a repeat offender, under the influence, responding in self-defense or under justifiable duress.

Similarly, there are plausible reasons platform companies might want to treat different tiers of users in different ways. For example, for postings about the coronavirus, it made sense to establish different rules for those who had established themselves as trustworthy. To decrease the spread of misinformation or harassment, platforms might reasonably want to impose higher standards rather than lower ones on users who had many followers, who held political office and had special obligations to the public, or who paid or received money to post.

But YouTube’s experience suggests that clarity about why different users are treated differently matters for public perception. When a company such as Facebook discriminates between different tiers of users just to avoid offending powerful people and mitigate possible PR disasters, observers will treat that reasoning as less legitimate than if the company were trying to hold the powerful to account. This is especially so if the differences are kept hidden from users, the public and even Facebook’s own Oversight Board.

These allegations are likely to breed distrust, accusations of bias and suspicions about Facebook’s intentions.

Robyn Caplan is a researcher at Data & Society Research Institute. Follow her @RobynCaplan.
