What If Regulating Facebook Fails?


What if nothing works? What if, after years of scholarship and journalism exposing the dominance, abrogations, duplicity, arrogance, and incompetence of Facebook, none of the policy tools we have come to rely on to rein in corporations make any difference at all?

We have to be prepared for just such an outcome.

On Tuesday a federal court tossed out federal and state cases against Facebook for violating US antitrust laws. The judge ruled that, because antitrust has precise definitions of concepts like “monopoly” and high burdens of proof for actions in restraint of fair competition, the governments had not come close to justifying why these cases should proceed now. After all, the judge pointed out, the US government had raised no objections in 2012 when Facebook bought Instagram, or in 2014 when it bought WhatsApp. Why should the government swoop in to raise objections now? The judge was not wrong to rule that way. But we have been very wrong to allow our defenses against corporate power to shrink over the past 40 years.

Within hours of that court decision, the total value of Facebook stock rose above 1 trillion dollars. It joined only Microsoft, Amazon, Apple, and Alphabet (the company that owns Google) in reaching that valuation, making 17-year-old Facebook the youngest company to do so. And as it fast approaches 3 billion users worldwide, Facebook seems as unstoppable as it does unmanageable, at least in the short term.

Nothing before in human history—with the possible exception of Google—has reached into the lives of 3 billion humans who speak more than 100 languages. Facebook derives its power from that scale and the degree to which we depend on the Blue App, Instagram, and WhatsApp for our social interactions and virtual identities. While its products have proven to be bad for democracy and other living things, they’re remarkably useful and justifiably popular. Facebook may be terrible for us in the aggregate, but it serves each of its nearly 3 billion users well enough individually that the vast majority are unlikely to walk away from it. At the same time, the US government, often charged with the task of protecting us from our own bad habits, doesn’t actually have a good set of tools to address the damage that Facebook (or Google, for that matter) does.

So what can we, as citizens, do? What should we demand from our governments?

There are currently three major areas of regulatory intervention in play: antitrust, content regulation, and data rights. For the most part, Facebook’s critics and those pushing for change have focused their attention on the first two. But while those avenues should be studied, considered, and pursued, neither is likely to solve our Facebook problem.

Antitrust is the angle that’s received the most attention recently, but we shouldn’t pin our hopes on it. This is not only because Republicans have spent 40 years gutting antitrust regulation to benefit their corporate patrons, and not only because an Obama-appointed federal judge tossed those cases out of court. Believing that competition could limit Facebook’s behavior, alter its design, or even stunt its growth demands a naive faith in markets and competition. Even in highly competitive markets like retail and groceries, big firms remain big. And big and small firms alike pollute, dodge taxes, exploit and mistreat labor, and distract people from civic and family commitments. That’s why we need other forms of regulation to impose costs or restrictions that can enrich our lives in ways that for-profit ventures cannot. Antitrust is at best a minor annoyance to companies and a minor boon to human beings.

Content regulation, meanwhile, is an even clumsier and less effective method of regulation. Many countries around the world explicitly ban Facebook, Google, and other publishers from distributing content that governments consider noxious. Despite all the flowery words about free speech that flow from Mark Zuckerberg, he and his company have never really championed—or even understood—what it means.

Of course, such direct state censorship is not on the table in the US, Canada, or other countries with long-standing liberal traditions. Another approach to content regulation in the US would be to limit the protection from liability that Facebook and other digital service providers currently enjoy for malicious content posted by users. That legal immunity is provided by Section 230 of the Communications Decency Act, and a number of critics have advocated changing it so that the platforms would be more concerned about the nasty things we post to and about one another.

But despite the flurry of recent commentary on Section 230, removing or reforming it is not likely to make Facebook significantly better for us. The provision gets too much credit for enabling the vibrant creativity that fueled the digital revolution, and it gets too much blame for the damage the digital revolution has caused. Although some countries offer different forms of a liability shield, most of the world does not offer as broad a shield as Section 230—and yet platforms continue to grow and thrive globally anyway. India, for instance, just abandoned its limited protection from liability, yet there is no sign—nor possibility—of Facebook abandoning its largest market and greatest source of growth.

In the US, the idea for providing this protection was actually that it would create a market incentive for platforms to keep themselves clean. That hasn’t exactly worked out as planned. Companies like Facebook and Google do lightly and inconsistently try to police the content on their platforms. But the scale and scope of those platforms undermine any effort to do so in ways subtle and effective enough to protect users.

Perhaps, then, you might assume that making these companies smaller would help. And that’s a reasonable assumption. Significantly smaller services like Reddit, with only about 430 million users, do a much better job these days (at least since 2020, when Reddit abandoned its naive commitment to radical free speech and encouraged strong community content-moderation techniques, banning some major subreddits that encouraged hatred). But it’s too late to reverse Facebook’s global reach. There is no way for one country’s policy to slow its growth.

Unlike at most other companies, growth and user engagement—not revenue, profit, or even market capitalization—have been Mark Zuckerberg’s driving metrics since the beginning. Any attempt to limit Facebook’s reach would have to contend with those core values. But we currently lack any methods to do so.

If all the external forces of regulation have failed, some say, maybe we can count on a “greenwashing” operation intended to assure regulators that the company can police itself: the Facebook Oversight Board. But this collection of civic leaders, all of whom seem sincere in their commitment to improving Facebook, is unable to do anything about the company’s core problems. The board only considers decisions to remove or retain content and accounts, as if those decisions were the reason Facebook threatens democracy and human rights around the world. The board pays no attention to algorithmic amplification of content. It does not concern itself with linguistic bias or limitations within the company. It does not question the commitment to growth and engagement. It does not examine the problems with Facebook’s commitment to artificial intelligence or virtual reality. The more seriously we take the impotent Oversight Board, the less seriously we are likely to take Facebook as a whole. The Oversight Board is mostly useless, and “self-regulation” is an oxymoron. Yet for some reason, many smart people continue to take it seriously, allowing Facebook itself to structure the public debate and avoid real accountability.

What about us? We are the 3 billion, after all. What if every Facebook user decided to be a better person, to think harder, to know more, to be kinder, more patient, and more tolerant? Well, we’ve been working on improving humanity for at least 2,000 years, and it’s not going that well. There is no reason to believe, even with “media education” or “media literacy” efforts aimed at young people in a few wealthy countries, that we can count on human improvement—especially when Facebook is designed to exploit our tendency to favor the shallow, emotional, and extreme expressions that our better angels eschew.

Facebook was designed for better animals than humans. It was designed for beings that don’t hate, exploit, harass, or terrorize each other—like golden retrievers. But we humans are nasty beasts. So we have to regulate and design our technologies to correct for our weaknesses. The challenge is figuring out how.

First, we must recognize that the threat of Facebook is not in some marginal aspect of its products or even in the nature of the content it distributes. It’s in those core values that Zuckerberg has embedded in every aspect of his company: a commitment to unrelenting growth and engagement. It’s enabled by the pervasive surveillance that Facebook exploits to target advertisements and content.

Mostly, it’s in the overall, deleterious effect of Facebook on our ability to think collectively.

That means we can’t organize a political movement around the mere fact that Donald Trump exploited Facebook to his benefit in 2016 or that Donald Trump got tossed off of Facebook in 2021 or even that Facebook contributed directly to the mass expulsion and murder of the Rohingya people in Myanmar. We can’t rally people around the idea that Facebook is dominant and coercive in the online advertising market around the world. We can’t explain the nuances of Section 230 of the Communications Decency Act and expect any sort of consensus about what to do about it (or even if reforming the law would make a difference to Facebook). None of that is sufficient.

Facebook is dangerous because of the collective impact of 3 billion people being surveilled constantly, then having their social connections, cultural stimuli, and political awareness managed by predictive algorithms that are biased toward constant, increasing, immersive engagement. The problem is not that some crank or president is popular on Facebook in one corner of the world. The problem with Facebook is Facebook.

Facebook is likely to be this powerful—if not more—for many decades. So while we strive to live better with it (and with each other), we must all spend the next few years imagining a more radical reform program. We must strike at the root of Facebook (and, while we are at it, Google). More specifically, there is one recent regulatory intervention, modest though it is, that could serve as a good first step. 

In 2018 the European Union began insisting that all companies that collect data respect certain basic rights of citizens. The resulting General Data Protection Regulation grants users some autonomy over the data that we generate, and it insists on minimal transparency when that data is used. While enforcement has been spotty, and the most visible sign of the GDPR has been extra warnings that demand we click through to accept terms, the law offers some potential to limit the power of big data vacuums like Facebook and Google. It should be studied closely, strengthened, and spread around the world. If the US Congress (and the parliaments of Canada, Australia, and India) would take citizens’ data rights more seriously than they do content regulation, there might be some hope.

Beyond the GDPR, an even more radical and useful approach would be to throttle Facebook’s (and any company’s) ability to track everything we do and say and limit the ways it can use our data to influence our social connections and political activities. We could limit the reach and power of Facebook without infringing speech rights. We could make Facebook matter less.

Imagine if we kept our focus on how Facebook actually works and why it’s as rich and powerful as it is. If we did that, instead of fluttering our attention to the latest example of bad content flowing across the platform and reaching some small fraction of users, we might have a chance. As Marshall McLuhan informed us 56 years ago, it’s the medium, not the message, that ultimately matters.


Facebook-Meta Earns the ‘Worst Company of 2021’ Title in This Survey

Facebook has had its share of controversies this year. The company was under more scrutiny after whistleblower Frances Haugen leaked a series of internal documents.

Facebook parent Meta has been named the Worst Company of the Year (2021) by Yahoo Finance respondents. According to the publication, an “open-ended” survey was run on Yahoo Finance on December 4 and 5, in which 1,541 respondents participated. Facebook received 8 percent of the write-in vote, though respondents were also unhappy with the Robinhood trading app. Electric truck startup Nikola, which was named last year’s worst company by the same publication, also drew respondents’ ire.

Yahoo Finance notes, “Facebook has had its share of controversies this year.” Starting in January, Meta-owned WhatsApp got caught up in a major controversy after the messaging app announced a new privacy policy (Terms of Service). WhatsApp said it would collect user information and share it with third-party apps for a better user experience. The app initially gave users no choice, though it later modified the policy under pressure. The company came under further scrutiny after whistleblower and former Facebook employee Frances Haugen leaked a series of internal documents showing its problematic practices. Among the revelations: Meta-owned Instagram had a negative impact on teenage girls, and the company did almost nothing to rectify the problem.

Yahoo Finance also highlights, “At the same time, some critics, including conservatives, say Facebook over-policed the platform’s speech and stifled their voices.” Critics likewise blame Facebook and other social media platforms for failing to curb the hate speech that led to the Capitol Building riot.

However, around 30 percent of Yahoo Finance readers said that Facebook or Meta could redeem itself. One respondent suggested that the company could issue a formal apology for negligence and donate a sizable amount of its profits to a foundation to help reverse its harm.

On the other hand, respondents chose Microsoft as the Company of the Year (2021). The Satya Nadella-led company touched the trillion-dollar valuation mark this year and introduced notable upgrades, most prominently the Windows 11 OS update that succeeds Windows 10.


Facebook pays 1.7 Cr fine to Russia after failing to delete content Moscow deems illegal


In the latest legal tussle with Russia over its controversial social media regulation laws, Facebook paid 17 million roubles (Rs 1.7 crore) for failing to remove content deemed illegal by Moscow. With the threat of potentially larger fines looming, Facebook parent company Meta, led by Mark Zuckerberg, is scheduled to face court next week over repeated violations of Russian legislation on content, the Interfax news agency reported. As per the latest updates, the social media giant could be fined a percentage of its annual revenue.

In October, Moscow sent state bailiffs to enforce collection of the 17 million roubles. Meanwhile, according to an Interfax report citing a federal bailiffs’ database, further enforcement proceedings were opened against the company on Sunday. Apart from Facebook, Telegram has also paid 15 million roubles in fines for failing to comply with the Russian social media legislation that came into force in 2016.

Facebook pays $53k fine to Russia over controversial social media laws

Facebook had locked horns with Moscow earlier, in November, paying 4 million roubles ($53,000) over its refusal to adhere to Russian data localisation laws, the Moscow Times reported. On November 25 the Moscow court said that Facebook had paid the fine, levied in February, following which all proceedings against the US-based social media giant were closed. The payment stems from litigation filed against the company, alongside Twitter, in 2018. The tech companies were also forced to pay an additional 3,000 roubles ($40) for failing to comply with the law’s user data sharing rules. Russian authorities have also previously blocked Microsoft-owned LinkedIn for failing to abide by the laws.

Russian social media laws

As per the Moscow Times, under the Russian social media regulation laws, all foreign technology companies are required to store data related to Russian customers and users on servers located in Russia. Companies also have to share encryption data with the federal authorities and record users’ calls, messages, and the conversations of civil society groups. Critics describe the apparatus as a severe breach of privacy rights and an unfettered back door to personal data that could be used to harass Kremlin critics.

Facebook Messenger Is Launching a Split Payments Feature for Users to Quickly Share Expenses


Meta has announced the arrival of a new Split Payments feature in Facebook Messenger. As the name suggests, it will let you calculate and split expenses with others right from Facebook Messenger, offering an easier way to share the cost of bills and expenses — for example, splitting a dinner bill with friends. Using Split Payments, Facebook Messenger users will be able to split bills evenly or modify the contribution for each individual, including their own.

The company announced the new Split Payments feature in a blog post. 9to5Mac reports that the bill-splitting feature is still in beta and will be exclusive to US users at first, with the rollout beginning early next week. As mentioned, it will help users share the cost of bills, expenses, and payments. The feature is especially useful for people who share an apartment and need to split the monthly rent and other expenses with flatmates. It could also come in handy at a group dinner with many people.

With Split Payments, users can add the number of people the expense needs to be divided among; by default, the amount entered is divided into equal parts. A user can also modify each person’s contribution, including their own. To use Split Payments, click the Get Started button in a group chat or the Payments Hub in Messenger. Users can adjust contributions in the Split Payments option and send a notification to everyone who needs to pay. After you enter a personalised message and confirm your Facebook Pay details, the request is sent and becomes viewable in the group chat thread.

See also  Facebook's Problems Aren't Only In Washington - Foreign Policy

Once someone has made the payment, you can mark their transaction as ‘completed’. Split Payments automatically takes your own share into account and calculates the amount each person owes accordingly.
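The mechanics described above boil down to simple arithmetic: set aside any manually adjusted shares, then divide what remains evenly among everyone else. Below is a minimal sketch of that logic in Python. It is purely illustrative; the function name, data shapes, and rounding behaviour are assumptions made for the example, not Messenger’s or Facebook Pay’s actual API.

```python
# Illustrative sketch of the split logic described above (not Messenger's API).
# Shares with explicit overrides are fixed; the rest of the bill is divided
# evenly among the remaining participants.
from decimal import Decimal, ROUND_HALF_UP


def split_expense(total, participants, overrides=None):
    """Return a {name: amount} map whose values sum to the total."""
    overrides = overrides or {}
    total = Decimal(str(total))
    fixed = sum(Decimal(str(v)) for v in overrides.values())
    shares = {name: Decimal(str(amt)) for name, amt in overrides.items()}
    remaining = [p for p in participants if p not in overrides]
    if remaining:
        # Divide the leftover amount evenly, rounded to the cent.
        even = ((total - fixed) / len(remaining)).quantize(
            Decimal("0.01"), rounding=ROUND_HALF_UP)
        for person in remaining:
            shares[person] = even
        # Absorb any rounding difference into the last share so the
        # amounts still add up to the original total.
        shares[remaining[-1]] += total - sum(shares.values())
    return shares


# Example: a 90.00 dinner split among three people, with your own
# contribution adjusted down to 20.00, as the feature allows.
print(split_expense("90.00", ["Ana", "Ben", "You"], overrides={"You": "20.00"}))
# {'You': Decimal('20.00'), 'Ana': Decimal('35.00'), 'Ben': Decimal('35.00')}
```

Handling the leftover cent explicitly matters: an even division of, say, 100.00 among three people yields 33.33 each, and the stray cent has to land on someone for the shares to add up to the bill.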



