‘Carol’s Journey’: What Facebook knew about how it radicalized users – NBC News

In the summer of 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting, and Christianity, and followed a few of her favorite brands, including Fox News and then-President Donald Trump. 

Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.

Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.

Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.

That researcher said Smith’s Facebook experience was “a barrage of extreme, conspiratorial, and graphic content.” 

The body of research consistently found Facebook pushed some users into “rabbit holes,” increasingly narrow echo-chambers where violent conspiracy theories thrived. People radicalized through these rabbit holes make up a small slice of total users, but at Facebook’s scale, that can mean millions of individuals.

The findings, communicated in a report titled “Carol’s Journey to QAnon,” were among thousands of pages of documents included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by legal counsel for Frances Haugen, who worked as a Facebook product manager until May. Haugen is now asserting whistleblower status and has filed several specific complaints alleging that Facebook puts profit over public safety. Earlier this month, she testified about her claims before a Senate subcommittee.

Versions of the disclosures — which redacted the names of researchers, including the author of “Carol’s Journey to QAnon” — were shared digitally and reviewed by a consortium of news organizations, including NBC News. The Wall Street Journal published a series of reports based on many of the documents last month. 

“While this was a study of one hypothetical user, it is a perfect example of research the company does to improve our systems and helped inform our decision to remove QAnon from the platform,” a Facebook spokesperson said in a response to emailed questions.

Facebook CEO Mark Zuckerberg has broadly denied Haugen’s claims, defending his company’s “industry-leading research program” and its commitment “to identify important issues and work on them.” The documents released by Haugen partly support those claims, but also highlight the frustrations of some of the employees engaged in that research. 

Among Haugen’s disclosures are research, reports and internal posts that suggest Facebook has long known that its algorithms and recommendation systems push some users to extremes. And while some managers and executives ignored the internal warnings, anti-vaccine groups, conspiracy theory movements and disinformation agents took advantage of that permissiveness, threatening public health, personal safety and democracy at large.

“These documents effectively confirm what outside researchers were saying for years prior, which was often dismissed by Facebook,” said Renée DiResta, technical research manager at the Stanford Internet Observatory and one of the earliest to warn of the risks of Facebook’s recommendation algorithms.

Facebook’s own research shows how easily a relatively small group of users has been able to hijack the platform, and for DiResta, settles any remaining question about Facebook’s role in the growth of conspiracy networks. 


“Facebook literally helped facilitate a cult,” she said. 

‘A pattern at Facebook’

For years, company researchers had been running experiments like Carol Smith’s to gauge the platform’s hand in radicalizing users, according to the documents seen by NBC News.

This internal work repeatedly found that recommendation tools pushed users into extremist groups, a series of disclosures that helped inform policy changes and tweaks to recommendations and newsfeed rankings. Those rankings are a tentacled, ever-evolving system widely known as “the algorithm” that pushes content to users. But the research at that time stopped well short of inspiring any movement to change the groups and pages themselves.
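The disclosures describe that ranking system only in broad strokes, but its engagement-driven logic can be caricatured in a few lines. The sketch below is a hypothetical illustration, not Facebook’s code; the fields and weights are invented, purely to show why engagement-weighted scoring tends to push provocative content upward.

```python
# Toy illustration of engagement-based feed ranking (not Facebook's actual
# system): each candidate post gets a score from predicted engagement, and
# the feed shows the highest scorers. All fields and weights are invented.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    p_click: float    # predicted probability of a click
    p_share: float    # predicted probability of a share
    p_comment: float  # predicted probability of a comment

def score(post: Post) -> float:
    # Engagement-weighted ranking: shares and comments count for more than
    # clicks, so content that provokes reactions rises to the top.
    return 1.0 * post.p_click + 5.0 * post.p_share + 3.0 * post.p_comment

candidates = [
    Post("Local bake sale this weekend", 0.20, 0.01, 0.02),
    Post("SHOCKING conspiracy they don't want you to see", 0.15, 0.12, 0.10),
]
feed = sorted(candidates, key=score, reverse=True)
print([p.text for p in feed])  # the provocative post ranks first
```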

That reluctance was indicative of “a pattern at Facebook,” Haugen told reporters this month. “They want the shortest path between their current policies and any action.”

Haugen added, “There is great hesitancy to proactively solve problems.” 

A Facebook spokesperson disputed that the research had not pushed the company to act and pointed to changes to groups announced in March.

While QAnon followers committed real-world violence in 2019 and 2020, groups and pages related to the conspiracy theory skyrocketed, according to internal documents. The documents also show how teams inside Facebook took concrete steps to understand and address those issues — some of which employees saw as too little, too late.  

By the summer of 2020, Facebook was hosting thousands of private QAnon groups and pages, with millions of members and followers, according to an unreleased internal investigation.

A year after the FBI designated QAnon as a potential domestic terrorist threat in the wake of armed standoffs, kidnappings, harassment campaigns and shootings, Facebook labeled QAnon a “Violence Inciting Conspiracy Network” and banned it from the platform, along with militias and other violent social movements. A small team working across several of Facebook’s departments found that its platforms had hosted hundreds of ads on Facebook and Instagram, worth thousands of dollars and millions of views, “praising, supporting, or representing” the conspiracy theory.

The Facebook spokesperson said in an email that the company has “taken a more aggressive approach in how we reduce content that is likely to violate our policies, in addition to not recommending Groups, Pages or people that regularly post content that is likely to violate our policies.”

For many employees inside Facebook, the enforcement came too late, according to posts left on Workplace, the company’s internal message board. 

“We’ve known for over a year now that our recommendation systems can very quickly lead users down the path to conspiracy theories and groups,” one integrity researcher, whose name had been redacted, wrote in a post announcing she was leaving the company. “This fringe group has grown to national prominence, with QAnon congressional candidates and QAnon hashtags and groups trending in the mainstream. We were willing to act only *after* things had spiraled into a dire state.”

‘We should be concerned’

While Facebook’s ban initially appeared effective, a problem remained. The removal of groups and pages didn’t wipe out QAnon’s most extreme followers, who continued to organize on the platform.

“There was enough evidence to raise red flags in the expert community that Facebook and other platforms failed to address QAnon’s violent extremist dimension,” said Marc-André Argentino, a research fellow at King’s College London’s International Centre for the Study of Radicalisation, who has extensively studied QAnon. 

Believers simply rebranded as anti-child trafficking groups or migrated to other communities, including those around the anti-vaccine movement. 

It was a natural fit. Researchers inside Facebook studying the platform’s niche communities found violent conspiratorial beliefs to be connected to Covid vaccine hesitancy. In one study, researchers found QAnon community members were also highly concentrated in anti-vaccine communities. Anti-vaccine influencers had similarly embraced the opportunity of the pandemic, and used Facebook’s features like groups and livestreaming to grow their movements. 


“We do not know if QAnon created the preconditions for vaccine hesitancy beliefs,” researchers wrote. “It may not matter either way. We should be concerned about people affected by both problems.”

QAnon believers also jumped to groups promoting President Donald Trump’s false claim that the 2020 election was stolen, groups that trafficked in a hodgepodge of baseless conspiracy theories alleging voters, Democrats and election officials were somehow cheating Trump out of a second term. This new coalition, largely organized on Facebook, ultimately stormed the U.S. Capitol on Jan. 6, according to a report included in the document trove and first reported by BuzzFeed News in April.

These conspiracy groups had become the fastest-growing groups on all of Facebook, according to the report, but Facebook wasn’t able to control their “meteoric growth,” the researchers wrote, “because we were looking at each entity individually, rather than as a cohesive movement.” A Facebook spokesperson told BuzzFeed News it took many steps to limit election misinformation but that it was unable to catch everything.

Facebook’s enforcement was “piecemeal,” the team of researchers wrote, noting, “we’re building tools and protocols and having policy discussions to help us do this better next time.” 

‘A head-heavy problem’

The attack on the Capitol invited harsh self-reflection from employees. 

One team invoked the lessons learned during QAnon’s moment to warn about permissiveness with anti-vaccine groups and content, which researchers found comprised up to half of all vaccine content impressions on the platform. 

“In rapidly-developing situations, we’ve often taken minimal action initially due to a combination of policy and product limitations making it extremely challenging to design, get approval for, and roll out new interventions quickly,” the report said. QAnon was offered as an example of a time when Facebook was “prompted by societal outcry at the resulting harms to implement entity takedowns” for a crisis on which “we initially took limited or no action.”

The effort to overturn the election also invigorated efforts to clean up the platform in a more proactive way. 

Facebook’s “Dangerous Content” team formed a working group in early 2021 to figure out ways to deal with the kind of users who had been a challenge for Facebook: communities including QAnon, Covid-denialists and the misogynist incel movement that weren’t obvious hate or terrorism groups, but that, by their nature, posed a risk to the safety of individuals and societies. 

The focus wasn’t to eradicate them, but to curb the growth of these newly branded “harmful topic communities,” with the same algorithmic tools that had allowed them to grow out of control. 

“We know how to detect and remove harmful content, adversarial actors, and malicious coordinated networks, but we have yet to understand the added harms associated with the formation of harmful communities, as well as how to deal with them,” the team wrote in a 2021 report.

In a February 2021 report, they got creative. An integrity team detailed an internal system meant to measure and protect users against societal harms, including radicalization, polarization and discrimination, that its own recommendation systems had helped cause. Building on a previous research effort dubbed “Project Rabbithole,” the new program was named Drebbel, for Cornelis Drebbel, a 17th-century Dutch engineer known for inventing the first navigable submarine and the first thermostat.

The Drebbel group was tasked with discovering and ultimately stopping the paths that moved users towards harmful content on Facebook and Instagram, including in anti-vax and QAnon groups. A post from the Drebbel team praised the earlier research on test users. “We believe Drebbel will be able to scale this up significantly,” they wrote.


“Group joins can be an important signal and pathway for people going towards harmful and disruptive communities,” the group stated in a post to Workplace. “Disrupting this path can prevent further harm.”

The Drebbel group features prominently in Facebook’s “Deamplification Roadmap,” a multi-step plan published on the company Workplace on Jan. 6 that includes a complete audit of recommendation algorithms.

In March, the Drebbel group posted a study on its progress and suggested a way forward. If researchers could systematically identify the “gateway groups,” those that fed into anti-vaccination and QAnon communities, they wrote, maybe Facebook could put up roadblocks to keep people from falling down the rabbit hole.

The Drebbel “Gateway Groups” study looked back at a collection of QAnon and anti-vaccine groups that had been removed for violating policies around misinformation and violence and incitement. It used the membership of these purged groups to study how users had been pulled in. Drebbel identified 5,931 QAnon groups with 2.2 million total members, half of which joined through so-called gateway groups. For 913 anti-vaccination groups with 1.7 million members, the study identified one million gateway groups (Facebook has said it recognizes the need to do more).
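The disclosures do not detail Drebbel’s methodology, but the backward-looking analysis described above, starting from the membership of purged groups and asking which groups those members joined first, can be sketched roughly as follows. This is a hypothetical illustration only; the data shapes, names and logic are assumptions, not the team’s actual pipeline.

```python
from collections import defaultdict

# Hypothetical sketch of the "gateway group" idea: given per-user group-join
# histories, count how often membership in a later-removed (harmful) group
# was preceded by joining some other group. All data here is invented.

# join_log: user_id -> list of (timestamp, group_id), e.g. loaded from logs
join_log = {
    "u1": [(1, "gardening"), (5, "qanon_123")],
    "u2": [(2, "wellness"), (6, "qanon_123")],
    "u3": [(3, "wellness"), (7, "antivax_9")],
}
removed_groups = {"qanon_123", "antivax_9"}  # purged for policy violations

gateway_counts = defaultdict(int)
for user, joins in join_log.items():
    joins.sort()  # chronological order
    seen_before = set()
    for ts, group in joins:
        if group in removed_groups:
            # every group this user joined earlier is a candidate gateway
            for g in seen_before:
                gateway_counts[g] += 1
        else:
            seen_before.add(group)

# Rank candidate gateways by how many users they funneled onward.
for group, n in sorted(gateway_counts.items(), key=lambda kv: -kv[1]):
    print(f"{group}: fed {n} user(s) into later-removed groups")
```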

Facebook integrity employees warned in an earlier report that anti-vaccine groups could become more extreme. 

“Expect to see a bridge between online and offline world,” the report said. “We might see motivated users create sub-communities with other highly motivated users to plan action to stop vaccination.”

A separate cross-department group reported this year that vaccine hesitancy in the U.S. “closely resembled” QAnon and Stop the Steal movements, “primarily driven by authentic actors and community building.” 

“We found, like many problems at FB,” the team wrote, “that this is a head-heavy problem with a relatively few number of actors creating a large percentage of the content and growth.”
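For readers unfamiliar with the jargon, “head-heavy” describes a distribution in which a small “head” of actors accounts for most of the volume. A toy calculation, using invented numbers rather than Facebook data, shows the pattern:

```python
# Illustrative only: what a "head-heavy" distribution looks like in numbers.
# posts_by_actor maps each actor to how many pieces of content they created;
# the figures are made up to show the pattern, not drawn from Facebook data.
posts_by_actor = {"a1": 900, "a2": 450, "a3": 300, **{f"b{i}": 2 for i in range(100)}}

total = sum(posts_by_actor.values())
top = sorted(posts_by_actor.values(), reverse=True)[:3]  # top 3 actors = the "head"

share = sum(top) / total
print(f"Top 3 of {len(posts_by_actor)} actors produced {share:.0%} of content")
# Here roughly 3% of actors account for about 89% of content: head-heavy.
```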

The Facebook spokesperson said that the company had “focused on outcomes” in relation to Covid-19 and that it had seen vaccine hesitancy decline by 50 percent, according to a survey it conducted with Carnegie Mellon University and the University of Maryland.

Whether Facebook’s newest integrity initiatives will be able to stop the next dangerous conspiracy theory movement or the violent organization of existing movements remains to be seen. But their policy recommendations may carry more weight now that the violence on Jan. 6 laid bare the outsized influence and dangers of even the smallest extremist communities and the misinformation that fuels them. 

“The power of community, when based on harmful topics or ideologies, potentially poses a greater threat to our users than any single piece of content, adversarial actor, or malicious network,” a 2021 report concluded.

The Facebook spokesperson said that the recommendations in the “Deamplification Roadmap” are on track: “This is important work and we have a long track record of using our research to inform changes to our apps,” the spokesperson wrote. “Drebbel is consistent with this approach, and its research helped inform our decision this year to permanently stop recommending civic, political or news Groups on our platforms. We are proud of this work and we expect it to continue to inform product and policy decisions going forward.”

CORRECTION (Oct. 22, 2021, 7:06 p.m. ET): A previous version of this article misstated the status of groups studied by Facebook’s Drebbel team. It looked at groups that Facebook had removed, not those that were currently active.


POV: Facebook’s Change to Meta Blurs Lines Even Further

COM’s Michelle Amazeen worries whether people will know the difference between real-world and virtual experiences

When Facebook announced it was changing its name to Meta in October, the 2008 Pixar movie WALL-E was the first thing that came to my mind. The sci-fi movie is about a robot left on an uninhabitable Earth to clean up the garbage left behind by humans. Rampant consumerism and corporate greed have left Earth a wasteland, and humans have been evacuated to outer space. In this same way, I envision Facebook abandoning the real world for the virtual “metaverse”—shared online environments where people can interact. They leave behind unimaginable quantities of disinformation, amplified by their algorithms, along with harassment, hate speech, and angry partisans.

To move beyond my initial reaction and gain more insight into the implications of Facebook’s name change (and strategic plans) from a communication research perspective, I turned to two research fellows who study emerging media within the Communication Research Center (CRC) at Boston University’s College of Communication (COM).

Media psychologist James Cummings, a COM assistant professor of emerging media studies, indicates that a metaverse—if successful—would produce new issues in information processing and would place a new emphasis on theories of interpersonal communication—rather than just mass communication. As I feared, he also says it has the potential to augment existing media effects of concern related to social networking, namely misinformation, persuasion, addiction, and distraction.

First, Cummings explains there would be major implications for how billions of people select, process, and are influenced by media content. To be successful, the metaverse platforms will need to transform current modes of information processing and digital communication interactions into much more immersive, cognitively absorbing experiences.

“For instance,” he says, “the mainstreaming of consumer-facing immersive ‘virtual reality’ [VR]—which typically places high demands on users’ processing—will be coming in an age of media multitasking. Interfaces will need to figure out how to immerse users while still permitting them to access different information streams.”


Similarly, he says, mainstreaming “augmented reality” (AR) experiences will also mean requiring users to skillfully juggle attentional demands. People will suddenly be forced to multitask between virtual and real-world stimuli. These are common practices for hobbyists, but may present more challenges for a broader population of users.

Thus, Cummings suggests, if the metaverse is the ecosystem of devices and experiences that Facebook CEO Mark Zuckerberg envisions, users will be switching back and forth between different types of immersive experiences and stimuli, from reality to augmented reality to virtual reality. This scenario may present some new and interesting psychological experiences, with the effects of in-person (e.g., chatting with a friend in the same room), mediated (e.g., reading a news alert on your phone), and augmented messages (e.g., a holographic personal assistant) all interdependent and blurring together.

Second, Cummings expects that a successful metaverse would mean exchanges with virtual content and people that are much more like face-to-face or interpersonal interactions. “This will require the designers of these platforms to master key elements of media richness theory and factors influencing users’ sense of spatial and social presence,” he explains. For instance, social networking in the metaverse may not only consist of the informational experiences we are used to today (e.g., reading text, watching videos, viewing pictures), but increasingly also perceptual experiences (e.g., a sense of being transported into the story, a feeling of being next to someone on the other side of the globe, noticing nonverbal behaviors).

Finally, Cummings indicates that immersive media are ripe for a whole new breed of covert persuasion—such as “native advertising,” or ads that mimic their surroundings—to the extent that users confuse the perceptually plausible with the real. He’s particularly interested in seeing the impact of immersion on users’ perceptions of message authorship and authorial intent.


Indeed, back in the real world, native advertising has been widely adopted to covertly promote not only commercial products, but also political candidates. Candidates are increasingly relying upon “influencers” to post supportive messages on Facebook and other social media without consistently disclosing they are being paid to do so, blurring the critical line between what is real news and what is merely paid advertising. As I have previously addressed here, if the regulatory agencies that oversee advertising—both commercial and political—have not been able to keep up with the digital transformation of our media ecosystem, how will they be able to regulate the metaverse?

For Chris Wells, a COM associate professor of emerging media studies, the promise and pitfalls of the metaverse depend entirely on how Facebook rolls it out. For example, the radical network effects we see from social media rely to some degree on the extremely shortened forms of communication—short texts and short videos—that allow information scanning and selection on a very rapid scale. He indicates the pseudo-social presence of virtual reality would seem to reduce the number of people you can actually interact with. “How will the metaverse be organized and who will you be able to interact with?” Wells asks. Are people going to have coffee virtually? Virtual meetings? He suggests that a site such as Second Life may offer rudimentary evidence of the kinds of interactions that emerge when people engage with strangers in a massive virtual world.

Presumably, Wells suggests, Facebook will still have to provide a great deal of content moderation in the metaverse if people are to have any interactions outside tightly defined networks. “Given Facebook’s track record with their current platform,” he says, “this could well be an unmitigated disaster; but expecting this may lead them to tightly control who interacts with whom and in what ways.”


Second Life notwithstanding, Wells also questions who will actually want to engage in such a virtual space. “My read of the pandemic is that people don’t particularly want to keep sitting in their bedrooms and interacting through Zoom,” he says.

“Will wearing an Oculus headset make that a lot better? I’m not sure,” he adds. “But I also suspect that there are at least a lot of people for whom going to a virtual concert or playing virtual chess with a friend in the park are paltry substitutes for the real thing.”

Wells concedes that there are a lot of millennials and Gen Zs who spend a lot of time in their bedrooms on video games, with digital avatars, and so forth. One possibility, he says, is that the metaverse becomes a niche space for these sorts of folks.

As these metaverse developments take shape, CRC fellows are well positioned to monitor these emerging media uses and perceptual effects. The CRC has multiple Oculus virtual reality headsets that can be paired with our psychophysiological measurement tools. For as technology takes us to new realms, we have a responsibility back in reality to analyze and understand how humans are affected.

Michelle Amazeen is a College of Communication associate professor and director of COM’s Communication Research Center.

“POV” is an opinion page that provides timely commentaries from students, faculty, and staff on a variety of issues: on-campus, local, state, national, or international. Anyone interested in submitting a piece, which should be about 700 words long, should contact John O’Rourke at orourkej@bu.edu. BU Today reserves the right to reject or edit submissions. The views expressed are solely those of the author and are not intended to represent the views of Boston University.

Facebook’s centralized metaverse a threat to the decentralized ecosystem?

Facebook has been planning its foray into the metaverse for some time now — possibly even several years. But it’s only recently that its ambitious expansion plans have catapulted the concept into mainstream headlines across the globe. Renaming the parent company to Meta was perhaps the biggest, boldest statement of intent the firm could make. Suddenly, major news outlets were awash with explainer articles, while finance websites have been bubbling with excitement about the investment opportunities in this newly emerging sector. 

However, within the crypto sphere, the response has been understandably more muted. After all, decentralized versions of the metaverse have been in development around these parts for several years now. Even worse, the tech giants’ cavalier attitude to user privacy and data harvesting has informed many of the most cherished principles in the blockchain and crypto sector.

Nevertheless, metaverse tokens such as Decentraland (MANA) and Sandbox (SAND) enjoyed extensive rallies on the back of the news, and within a few days of Facebook’s announcement, decentralized metaverse project The Sandbox received $93 million in funding from investors, including SoftBank.

But now that the dust has settled, do the company-formerly-known-as-Facebook’s plans represent good news for nonfungible token (NFT) and metaverse projects in crypto? Or does Meta have the potential to sink this still-nascent sector?

What is known so far?

Facebook hasn’t released many details about what can be expected from its version of the metaverse. A promotional video featuring company co-founder and CEO Mark Zuckerberg, along with his metaverse avatar, looked suitably glossy. Even so, it offered scant information about how things will actually work under the hood. However, based on precedent and what is known, some distinctions can be made between what Facebook is likely to be planning and the established decentralized metaverse projects.

Facebook’s efforts to launch a cryptocurrency offer some clues as to whether it will adopt decentralized infrastructure. Diem, formerly Libra, is a currency run by a permissioned network of centralized companies. David Marcus, who heads up Diem, has confirmed that the project, and by extension Facebook, is also considering NFTs integrated with Novi, the Diem-compatible wallet.

Based on all this, it’s fair to say that the Facebook metaverse would have an economy centered around the Diem currency, with NFT-based assets issued on the permissioned Diem network.

Announcing @Meta — the Facebook company’s new name. Meta is helping to build the metaverse, a place where we’ll play and connect in 3D. Welcome to the next chapter of social connection. pic.twitter.com/ywSJPLsCoD

— Meta (@Meta) October 28, 2021

The biggest difference between Facebook’s metaverse, and crypto’s metaverse projects, is that the latter operates on open, permissionless, blockchain architecture. Any developer can come and build a metaverse application on an open blockchain, and any user can acquire their own virtual real estate and engage with virtual assets.


Critically, one of the biggest benefits of a decentralized, open architecture is that users can join and move around barrier-free between different metaverses. Interoperability protocols reduce friction between blockchains, allowing assets, including cryptocurrencies, stablecoins, utility tokens, NFTs, loyalty points, or anything else to be transferable across chains.

So the most crucial question regarding Facebook’s plans is the extent to which the company intends its metaverse to be interoperable and its metaverse assets to be fungible with other, non-Facebook-issued assets.

From the standpoint of the decentralized metaverse, it doesn’t necessarily sound like great news. After all, Meta’s global user base dwarfs crypto’s. But there’s another way of looking at it, according to Robbie Ferguson, co-founder of Immutable, a layer two platform for NFTs:

“Even if [Meta] decides to pursue a closed ecosystem, it is still a fundamental core admission of the value that digital ownership provides — and the fact that the most valuable battleground of the future will be who owns the infrastructure of digital universes.”

Centralization could be the most limiting factor

Based on the fact that Diem is already a closed system, it seems likely that the Facebook metaverse will also be a closed ecosystem that won’t necessarily allow direct or easy interaction with decentralized metaverses. Such a “walled garden” approach would suit the company’s monopolistic tendencies but limit the potential for growth, or for Facebook-issued NFTs to attain any real-world value.

Furthermore, as Nick Rose Ntertsas, CEO and founder of the NFT marketplace Ethernity Chain, pointed out, users are growing weary of Facebook’s centralized dominance. He added in a conversation with Cointelegraph:

“Amidst [the pandemic-fuelled digital] transition, crypto adoption rose five-fold. At the same time, public opinion polling worldwide shows growing distrust of centralized tech platforms, and more favorable ratings of the very nature of what crypto and blockchain offer in protecting privacy, enabling peer-to-peer transactions, and championing transparency and immutability.”

This point is even more pertinent when considering that the utility of Diem has been preemptively limited by regulators before it has even launched. Regardless of how Diem could eventually be used in a Facebook metaverse, regulators have made it clear that Diem isn’t welcome in the established financial system.


So it seems evident that a closed Facebook metaverse will be limited to the point that it will be a completely different value proposition to what the decentralized metaverse projects are trying to achieve.

Meanwhile, decentralized digital platforms are already building and thriving. Does that mean there’s a risk that blockchain-based platforms could fall prey to the same fate as Instagram and WhatsApp, and get swallowed up as part of a Meta acquisition spree? Sebastien Borget, co-founder and chief operating officer of the Sandbox, believes that decentralized projects can take a different approach:

“Typically, big tech sits on the sidelines while new entrants fight for relevance and market share — and then swoops in to buy one of the strongest players. But that strategy only works if startups sell. So there has to be a different economic incentive, which is exactly why Web 3.0 is so powerful. It aligns the platform and the users to build a platform that stands on its own, where users have ownership over its governance — and ultimate success.”

A metaverse operated by tech giants?

Rather than attempting to dominate, Facebook may decide to integrate with established metaverses, games and crypto financial protocols — a potentially far more disruptive scenario. It could be seriously transformative for the crypto space, given the scale of Facebook’s user base.

Therefore, could there be a scenario where someone can move NFT assets between a Facebook metaverse and a decentralized network of metaverses? Sell Facebook-issued NFT assets on a DEX? Import a $69 million Beeple to the Facebook metaverse to exhibit in a virtual gallery?

This seems to be an unlikely scenario as it would entail substantial changes in mindset from Facebook. While it would create exponentially more economic opportunity, regulatory concerns, risk assessments, and Facebook’s historical attitude to consuming competitors rather than playing alongside them are likely to be significant blockers.



The most likely outcome seems to be that Facebook will attempt to play with established centralized tech and finance firms to bring value into its metaverse. Microsoft has already announced its own foray into the metaverse, but perhaps not as a direct competitor to what Facebook is attempting to achieve. Microsoft’s metaverse is focused on enhancing the “Teams” experience in comparison to Facebook’s VR-centric approach.

But it seems more plausible that the two firms would offer some kind of integration between their metaverse platforms than either of them would rush to partner with decentralized, open-source competitors. After all, Facebook’s original attempt to launch Libra involved other big tech and finance firms.

Make hay while the sun shines

Just as Libra created a lot of hype that was ultimately muted by regulators, the development of a Facebook metaverse seems likely to play out the same way with regard to its impact on the cryptocurrency sector.

Regulators will limit Facebook’s ability to get involved with money or finance, and the company isn’t likely to develop a sudden desire for open-source, decentralized solutions.

However, the one positive boost that Libra brought to crypto was publicity. Ntertsas believes that this, alone, is enough to provide a boost to the decentralized NFT sector, explaining:

“Meta’s plans will enable a surge in utility for NFT issuers and minters. NFTs can then be used as metaverse goods — from wearables to art, to collectibles, and even status symbols — there is an infinite use case and utility to NFTs and what they can become in the ever-growing NFT ecosystem.”

In this respect, there are plenty of opportunities for decentralized metaverse projects to muscle into the limelight with their own offerings and showcase how decentralized solutions are already delivering what Facebook is still developing. Borget urges the community to seize the moment:

“Now is the time for us to double down on building our vision of the open, decentralized and user-driven metaverse. We also have to invest time and money in explaining the benefits of our vision over what the Facebooks of the world have offered thus far.”

Facebook hackers target small business owners to scam money for ads

It took just 15 minutes for hackers to infiltrate Sydney single mum Sarah McTaggart’s Facebook page.

From there, they also took control of the account she uses to run her small business, wiping out 90 percent of the client base she has been building up for the past four years – almost in an instant.

Their target? The PayPal account she uses to buy Facebook ads for her business.

Sarah McTaggart has lost access to her business, which she runs through Facebook. (Supplied)

Ms McTaggart is among many small business owners who say they have had their Facebook pages hacked and fraudulent charges made on their PayPal or bank accounts as the scammers buy up ads with their money.

It was last Thursday evening when Ms McTaggart first noticed something was happening with her Facebook account.

“I was just watching TV and I opened up Facebook. I saw I had received and accepted a friend request from some guy in the US who I didn’t send a friend request to,” Ms McTaggart said.

“Then, about five minutes later, Facebook sent me an email saying my account had been disabled because I had breached community standards,” she said.

Hackers changed Ms McTaggart's Facebook profile picture to an image of a flag associated with ISIS.

The hackers had used a well-known technique, previously reported on by 9news.com.au, which involves changing the profile picture of the account they have hacked to that of a flag associated with the terrorist group ISIS.

The ISIS flag breaches Facebook’s community standards and automatically triggers an alert which causes Facebook to boot the user out of their account.

In another measure designed to keep her out, the hackers also changed Ms McTaggart’s age on her account, making her too young to own a Facebook account.

Ms McTaggart said she immediately took steps to report the hack to Facebook and prove her identity and age, but they were unsuccessful.


Next, the hackers took control of her business page.

“I woke up the next morning and I received an email from PayPal saying a payment of $320 had been authorised for Facebook ads,” Ms McTaggart said.

Ms McTaggart said she had been unable to get the money the hackers spent on Facebook ads through her account back from PayPal. (Supplied)

Ms McTaggart had previously used the PayPal account to buy ads for her dreadlock business – Better Off Dread – where she creates and maintains dreadlocks for clients as well as selling accessories.

The mother-of-one said she was devastated to lose access to both her personal and business page.

Her business, which is largely run out of Facebook, was her livelihood, Ms McTaggart said.

“It is so distressing. Close to 90 percent of my new business inquiries come through Facebook,” she said.

“Almost all of my communications with my clients is on Facebook, so disabling my account has completely cut off my capacity to talk to any of those people.

“I’m booked out with clients until mid-January, and I have no way of confirming appointments with those people. They’ve got no way of cancelling if they are sick.”

Ms McTaggart said she was initially confident she would be able to get access to her accounts back.

“I was thinking of course this will get resolved,” she said.

But, after exhausting all of the suggestions offered by Facebook’s customer service department online, Ms McTaggart said she was left frustrated by Facebook’s lack of accountability, with no number available to call the social media giant directly.

“It just dawned on me gradually that this was quite a complex situation, and there is actually no way to speak to a human at Facebook,” she said.


PayPal had also refused to refund the $320 the hackers spent on ads, she said.

“PayPal won’t refund that as I had an advertising agreement in place with Facebook,” she said.

“And I haven’t been able to communicate with anyone at Facebook to get them to refund it.”

A list of the charges Ianni Nicolaou found on his bank account statement after he was hacked. (Supplied)

Ms McTaggart’s story is familiar to Ianni Nicolaou, a US real estate agent from Alabama.

Mr Nicolaou had his personal Facebook page and his business page hacked two months ago in August and has been unable to regain access to them both ever since.

“It’s awful. I’m a realtor and it’s absolutely necessary to use the platform these days,” Mr Nicolaou told 9news.com.au.

“I have a business page that I run advertisements through.

“I have invested money for my following, and now it’s gone – out of nowhere.”

After his accounts were hacked, Mr Nicolaou said he had also been hit with about A$1800 in charges made to the bank account linked to his Facebook business page.

“There were charges; charges after charges. They started at about $100 each and then kept getting bigger and bigger,” he said.

“What frustrated me the most is that there is no acknowledgement from Facebook. There is no-one to call at Facebook and say you have got fraudulent charges.

“I have literally tried everything but it is robots you are talking to.

“The way I feel is this is actually fraud. I can’t talk to a human who wants to help me but they are happy to take my money just fine.”

When contacted by 9news.com.au, Meta Australia spokesperson Antonia Sanda said its investigations team was working to restore both Ms McTaggart’s and Mr Nicolaou’s accounts.


“We want to keep suspicious activity off our platform and protect people’s accounts, and are working to restore these accounts to the rightful owners,” she said.

“Online phishing techniques are not unique to Facebook, however we’re making significant investments in technology to protect the security of people’s accounts.

“We strongly encourage people to strengthen their online security by turning on app-based two-factor authentication and alerts for unrecognised logins.”

Tips to stop your Facebook page getting hacked

  • Take action and report an account: People can always report an account, an ad, or a post that they feel is suspicious.
  • Don’t click on suspicious links: Don’t trust messages demanding money, offering gifts or threatening to delete or ban your account (or verifying your account on Instagram). To help you identify phishing and spam emails, you can view official emails sent from your settings within the app.
  • Don’t click on suspicious links from Meta/Facebook/Instagram: If you get a suspicious email or message or see a post claiming to be from Facebook, don’t click any links or attachments. If the link is suspicious, you’ll see the name or URL at the top of the page in red with a red triangle.
  • Don’t respond to these messages/ emails: Don’t answer messages asking for your password, social security number, or credit card information.
  • Avoid phishing: If you accidentally entered your username or password into a strange link, someone else might be able to log in to your account. Change your password regularly and don’t use the same passwords for everything.
  • Get alerts and use extra security features: Turn on alerts about unrecognised logins, and turn on app-based two-factor authentication for additional account security.