How our culture and our government gave too much power to Facebook – New York Post
Capitalism offers an optimal paradigm to organize a virtuous society’s economic affairs. But virtue is a precondition for capitalism, not a product of it, and no modern phenomenon better highlights that distinction than the rise of addictive social-media platforms.

This past week, The Wall Street Journal reported on Facebook’s knowledge of the harmful impact of its Instagram platform on teen girls; on Facebook’s role in promoting anger on its platform; on Facebook’s weak response to employee-reported drug-cartel and human-trafficking activity on its platform; on how the Facebook platform thwarted Mark Zuckerberg’s desire to promote ­COVID-19 vaccinations; and on Facebook’s “XCheck” program, which exempts high-profile accounts and VIP users from Facebook’s enforcement actions. More stories are still forthcoming.

It’s easy to criticize Facebook for these apparent failures, but we should pause to assess what we as a society should hold Facebook accountable for — and not.

The real revelation about Facebook’s behavior is its rampant institutional lying. In the XCheck story, we learned that after Facebook spent more than $130 million to create an independent Oversight Board to review its content-moderation decisions, Facebook executives routinely lied to that board. Facebook told the Oversight Board that XCheck was used in only “a small number of decisions,” even though the program had grown to include 5.8 million users in 2020.

“We’re not actually doing what we say we do publicly,” and the company’s actions constitute a “breach of trust,” reads a confidential internal review done by Facebook.

We also learned — shockingly — that the CEO and COO of the trillion-dollar behemoth are regularly involved in deciding which posts to remove when those posts come from certain people exempted from Facebook’s community guidelines and content-moderation procedures. All the while, Facebook asserted that it applied the same standards to everyone.

Facebook CEO Mark Zuckerberg wields too much political and social influence in countries such as Israel and India.

Apparently, XCheck was created to mitigate “p.r. fires,” or negative media attention when Facebook takes the wrong action against a high-profile VIP. Even worse than the existence of the XCheck program was Facebook’s dishonesty about it, reflecting the state of mind of a company that knew it was doing something wrong — and still did it anyway.

These revelations strengthen the case that Facebook likely serves increasingly as the censorship arm of the US government, just as it does for other governments around the world.

In countries like India, Israel, Thailand, and Vietnam, Facebook frequently removes posts at the behest of the government to deter regulatory reprisal. Here at home, we know that Mark Zuckerberg and Sheryl Sandberg regularly correspond with US officials, ranging from e-mail exchanges with Dr. Anthony Fauci on COVID-19 policy to discussing “problematic posts” that “spread disinformation” with the White House.

Media outlets are now beginning to unpack the devastating effects Instagram has on today’s youth.

If Zuckerberg and Sandberg are also directly making decisions about which posts to censor versus permit, that makes it much more likely that they are responsive to the threats and inducements from government officials.

That’s what we should find alarming about the Journal’s reporting. But we should separate that from blaming Facebook for the anger of its users or for the self-esteem of teenage girls who suffer from body-image issues. The underlying cultural problems that create the conditions for anger, hostility and psychological insecurity should be addressed through spheres of public life that go beyond the purview of a social-media company — through family, faith and civic engagement.


To be sure, Facebook and other platforms amplify our pre-existing cultural failures and psychological vulnerabilities. A revival of, say, faith in God might address those issues more effectively than anything Mark Zuckerberg might do on a given day. The flawed premise that it’s Facebook’s job to address these cultural failures through its platform reinforces the uniquely postmodern problem that we have relocated our faith to new gods.

America’s teenagers face constant self-esteem issues, aggravated by Facebook’s lack of moderation.

Instagram has become a church for insecure teenage girls; Facebook has become a church for angry Americans. The Journal’s reporting indicts those churches for failing the faithful, whereas the real problem is that Facebook and Instagram should have never played the role of those churches in the first place.

Don’t like God? Fine — platonic virtue or civic identity can suffice. But our ability to find true meaning in the real world is a precondition for a healthy experience on the Internet. No Web site will ever fill our postmodern cultural void.

Yes, it’s true that Facebook magnifies our cultural failures, but assigning the responsibility to Facebook to fix these cultural problems wrongly empowers the very actors we should be stripping of social power instead.

Facebook deserves severe criticism for its rampant hypocrisy — claiming to make the world a better place while knowingly doing the opposite, and lying about its knowledge of it at every step along the way. It should be held liable both in the court of public opinion and in federal courts for its lies — drawing from legal doctrines of consumer fraud that punish companies for saying one thing and doing another, as well as doctrines of state action that recognize that private companies ought to be bound by the Constitution if they are working hand-in-glove with government actors to censor political speech that the government cannot itself censor. The social-media giants should also be responsible for illegal activities they allow, like gun-running, drug sales and child pornography.

Facebook lets drug dealers, human traffickers and pedophiles run freely on social media while targeting conservative media outlets for political reasons.

These actions would make Facebook less able to sway American democracy and dupe the public.

But forcing the company to assume responsibility for body image and anger-management issues will, ironically, make Facebook and other social-media players even more powerful in our culture. The government, through big-tech proxy, would soon control what we can and can’t say, politically and culturally. Is that really what we want?

Vivek Ramaswamy is the author of “Woke, Inc.: Inside Corporate America’s Social-Justice Scam.”

Facebook Adds New Trend Insights in Creator Studio, Which Could Help Shape Your Posting Strategy
Facebook’s looking to provide more content insight within Creator Studio with the rollout of a new ‘Inspiration Hub’ element, which highlights trending content and hashtags within categories related to your business Page.


As you can see in these screenshots, posted by social media expert Matt Navarra, when it becomes available to you, you’ll be able to access the new Inspiration Hub from the Home tab in Creator Studio.

At the right side of the screen, you can see the first of the new insights, with trending hashtags and videos from the last 24 hours, posted by Pages similar to yours, displayed above a ‘See more’ prompt.

When you tap through to the new hub, you’ll have a range of additional filters to check out trending content from across Facebook, including Page category, content type, region, and more.


That could be hugely valuable in learning what Facebook users are responding to, and what people within your target market are engaging with in the app.

The Hub also includes insights into trending hashtags, within your chosen timeframe, which may further assist in tapping into trending discussions.


How valuable hashtags are on Facebook is still up for debate. But you can also filter the displayed results by platform to show Instagram hashtag trends as well, which could be very valuable in maximizing your reach.

Much of this type of info has been available within CrowdTangle, Facebook’s analytics platform for journalists, for some time, but not everyone can access CrowdTangle data, which could make this an even more valuable proposition for many marketers.


Of course, overall performance really relates to your own creative, and thinking through the action that you want your audience to take when reading your posts. But in terms of detecting new content trends, including hashtag usage, caption length, videos versus image posts, and more, there’s a lot that could be gleaned from these tools and filters.

It’s a significant analytics addition – we’ve asked Facebook for more info on the rollout of the new option, and whether it’s already beyond test mode, etc. We’ll update this post if/when we hear back.

Meta Updates Policy on Cryptocurrency Ads, Opening the Door to More Crypto Promotions in its Apps

With cryptocurrencies gaining momentum, in line with the broader Web 3.0 push, Meta has today announced an update to its ad policies around cryptocurrencies, which will open the door to more crypto advertisers on its platforms.

As per Meta:

“Starting today, we’re updating our eligibility criteria for running ads about cryptocurrency on our platform by expanding the number of regulatory licenses we accept from three to 27. We are also making the list of eligible licenses publicly available on our policy page.”

Essentially, in order to run any crypto ads in Meta’s apps, that currency needs to adhere to regional licensing provisions, which vary by nation. With crypto becoming more accepted, Meta’s now looking to enable more crypto companies to publish ads on its platform, which will provide expanded opportunity for recognized crypto providers to promote their products, while also enabling Meta to make more money from crypto ads.

“Previously, advertisers could submit an application and include information such as any licenses they obtained, whether they are traded on a public stock exchange, and other relevant public background on their business. However, over the years the cryptocurrency landscape has matured and stabilized and experienced an increase in government regulation, which has helped to set clearer responsibilities and expectations for the industry. Going forward, we will be moving away from using a variety of signals to confirm eligibility and instead requiring one of these 27 licenses.”

Is that a good move? Well, as Meta notes, the crypto marketplace is maturing, and there’s now much wider recognition of cryptocurrencies as a legitimate form of payment. But most cryptocurrencies are still not backed by local financial regulators, which reduces transaction protection and oversight, and brings a level of risk to such transactions.


But then again, all crypto providers are required to clearly outline any such risks, and most also highlight the ongoing market volatility in the space. This expanded level of overall transparency means that most people who are investing in crypto have at least some awareness of these elements, which likely does diminish the risk factor in such promotions within Meta’s apps.

But as crypto adoption continues to expand, more of these risks will become apparent. While much of the crypto community is built on good faith and a sense of community around building something new, there are questions as to how well that can hold at scale, and what it will mean for evolving scams and criminal activity, especially as more vulnerable investors are brought into the mix.

Broader promotional capacity through Meta’s apps will certainly help to boost exposure in this respect – though again, the relative risk factors are lessened by expanded regulatory oversight outside of the company.

You can read more about Meta’s expanded crypto ad regulations here.

Meta Outlines Evolving Safety Measures in Messaging as it Seeks to Allay Fears Around the Expansion of E2E Encryption

Amid rising concern about Meta’s move to roll out end-to-end encryption by default to all of its messaging apps, Meta’s Global Head of Safety Antigone Davis has today sought to provide a level of reassurance that Meta is indeed aware of the risks and dangers that such protection can pose, and that it is building safeguards into its processes to protect against potential misuse.

The measures outlined, however, don’t address all the issues raised by analysts and safety groups around the world.

As a quick recap, back in 2019, Facebook announced its plan to merge the messaging functionalities of Messenger, Instagram and WhatsApp, which would then provide users with a universal inbox, with all of your message threads from each app accessible on either platform.

The idea is that this will simplify cross-connection, while also opening the door to more opportunities for brands to connect with users in the messaging tool of their choice – but it also, inherently, means that the data protection method for its messaging tools must rise to the level of WhatsApp, its most secure messaging platform, which already includes E2E encryption as the default.

Various child safety experts raised the alarm, and several months after Facebook’s initial announcement, representatives from the UK, US and Australian Governments sent an open letter to Facebook CEO Mark Zuckerberg requesting that the company abandon its integration plan.

Meta has pushed ahead, despite specific concerns that the expansion of encryption will see its messaging tools used by child trafficking and exploitation groups, and now, as it closes in on the next stage, Meta’s working to counter such claims, with Davis outlining six key elements which she believes will ensure safety within this push.


Davis has explained the various measures that Meta has added on this front, including:

  • Detection tools to stop adults from repeatedly setting up new profiles in an attempt to connect with minors they don’t know
  • Safety notices in Messenger, which provide tips on spotting suspicious behavior
  • The capacity to filter messages with selected keywords on Instagram
  • More filtering options in chat requests to help avoid unwanted contact
  • Improved education prompts to help detect spammers and scammers in messages
  • New processes to make it easier to report potential harm, including an option to select “involves a child”, which will then prioritize the report for review and action


These are all good, important steps in detection, and Davis also notes that Meta’s reporting process “decrypts portions of the conversation that were previously encrypted and unavailable to us so that we can take immediate action if violations are detected”.

That’ll no doubt raise an eyebrow or two among WhatsApp users. But the broader concern is that such protections will facilitate use by criminal groups, and that reliance on self-reporting will do little to stop those networks operating, at scale, under a more protected messaging framework within Meta’s app ecosystem.

Governments have called for ‘backdoor access’ to break Meta’s encryption for investigations into such activity, which Meta says is not possible and will not be built into its future framework. The elements outlined by Davis do little to address this specific need, and without the capacity to better detect such activity, it’s hard to see any of the groups opposed to Meta’s expanded encryption changing their stance and accepting that merging all of the platform’s DM options won’t also bring a rise in criminal activity organized via the same apps.


Of course, the counterargument could be that encryption is already available on WhatsApp, and that criminal activity of this type can already be undertaken within WhatsApp alone. But with a combined user count of 3.58 billion people per month across its family of apps, that’s a significantly broader interconnection of people than WhatsApp’s 2 billion active users, which, arguably, could open the door to far more potential harm and danger in this respect.

Really, there’s no right answer here. Privacy advocates will argue that encryption should be the standard, and that more people are actually more protected, on balance, by enhanced security measures. But there is also an undeniable risk in shielding even more criminal groups from detection.

Either way, right now, Meta seems determined to push ahead with the plan, which will weld all of its messaging tools together, and which will also make it more difficult to break up its network if antitrust decisions don’t go Meta’s way and it’s pressed to sell off Instagram or WhatsApp as a result.

But expect more debate to be had, in more countries, as Meta continues to justify its decision, and regulatory and law enforcement groups seek more options to help maintain a level of accessibility for criminal investigations and detection.
