On Facebook and foreign propaganda: Here’s how algorithms can manipulate you – Rappler

An internal Facebook report found that the social media platform’s algorithms – the rules its computers follow in deciding the content that you see – enabled disinformation campaigns based in Eastern Europe to reach nearly half of all Americans in the run-up to the 2020 presidential election, according to a report in Technology Review.

The campaigns produced the most popular pages for Christian and Black American content, and overall reached 140 million U.S. users per month. Seventy-five percent of the people exposed to the content hadn’t followed any of the pages. People saw the content because Facebook’s content-recommendation system put it into their news feeds.

Social media platforms rely heavily on people’s behavior to decide on the content that you see. In particular, they watch for content that people respond to or “engage” with by liking, commenting and sharing. Troll farms, organizations that spread provocative content, exploit this by copying high-engagement content and posting it as their own.

As a computer scientist who studies the ways large numbers of people interact using technology, I understand the logic of using the wisdom of the crowds in these algorithms. I also see substantial pitfalls in how the social media companies do so in practice.

From lions on the savanna to Likes on Facebook

Throughout millions of years of evolution, these principles have been coded into the human brain in the form of cognitive biases that come with names like familiarity, mere exposure and the bandwagon effect. If everyone starts running, you should also start running; maybe someone saw a lion, and running could save your life. You may not know why, but it's wiser to run first and ask questions later.


Your brain picks up clues from the environment – including your peers – and uses simple rules to quickly translate those signals into decisions: Go with the winner, follow the majority, copy your neighbor. These rules work remarkably well in typical situations because they are based on sound assumptions. For example, they assume that people often act rationally, it is unlikely that many are wrong, the past predicts the future, and so on.

Technology allows people to access signals from much larger numbers of other people, most of whom they do not know. Artificial intelligence applications make heavy use of these popularity or “engagement” signals, from selecting search engine results to recommending music and videos, and from suggesting friends to ranking posts on news feeds.

Not everything viral deserves to be

Our research shows that virtually all web technology platforms, such as social media and news recommendation systems, have a strong popularity bias. When applications are driven by cues like engagement rather than explicit search engine queries, popularity bias can lead to harmful unintended consequences.


Social media like Facebook, Instagram, Twitter, YouTube and TikTok rely heavily on AI algorithms to rank and recommend content. These algorithms take as input what you like, comment on and share – in other words, content you engage with. The goal of the algorithms is to maximize engagement by finding out what people like and ranking it at the top of their feeds.

On the surface this seems reasonable. If people like credible news, expert opinions and fun videos, these algorithms should identify such high-quality content. But the wisdom of the crowds makes a key assumption here: that recommending what is popular will help high-quality content “bubble up.”


We tested this assumption by studying an algorithm that ranks items using a mix of quality and popularity. We found that in general, popularity bias is more likely to lower the overall quality of content. The reason is that engagement is not a reliable indicator of quality when few people have been exposed to an item. In these cases, engagement generates a noisy signal, and the algorithm is likely to amplify this initial noise. Once the popularity of a low-quality item is large enough, it will keep getting amplified.
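The ranking logic described here can be sketched as a toy scorer (the item names, weights and engagement counts below are all hypothetical, not drawn from our study): each item has a hidden quality, but the feed ranks by a weighted mix dominated by engagement share, so an early-viral, low-quality item can outrank better content and then keep accumulating engagement.

```python
def rank(items, popularity_weight=0.8):
    """Score each item by a weighted mix of engagement share and quality."""
    total = sum(e for _, e in items.values()) or 1
    return sorted(
        items,
        key=lambda name: (
            popularity_weight * items[name][1] / total
            + (1 - popularity_weight) * items[name][0]
        ),
        reverse=True,
    )

feed = {
    # name: (hidden quality in [0, 1], engagement count)
    "credible-report": (0.9, 120),
    "junk-meme": (0.2, 900),   # got lucky early and went viral
    "expert-thread": (0.8, 60),
}

print(rank(feed))  # ['junk-meme', 'credible-report', 'expert-thread']

# Rich-get-richer feedback loop: every time the top item is viewed
# it gains engagement, so its popularity share only grows.
for _ in range(100):
    top = rank(feed)[0]
    quality, engagement = feed[top]
    feed[top] = (quality, engagement + 1)

print(rank(feed)[0])  # the early-viral, low-quality item stays on top
```

Note how quality alone cannot dislodge the viral item: with a popularity weight of 0.8, the engagement share term swamps the quality term once one item dominates the counts.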

Algorithms aren’t the only thing affected by engagement bias – it can affect people too. Evidence shows that information is transmitted via “complex contagion,” meaning the more times people are exposed to an idea online, the more likely they are to adopt and reshare it. When social media tells people an item is going viral, their cognitive biases kick in and translate into the irresistible urge to pay attention to it and share it.
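A toy version of complex contagion can make the mechanism concrete (the base rate, per-exposure boost and cap below are made-up illustrative numbers, not measured values): each additional exposure to an idea raises the probability that a person adopts and reshares it.

```python
def adoption_probability(exposures, base=0.05, per_exposure=0.08, cap=0.6):
    """Complex contagion, toy model: every additional exposure to an
    idea raises the chance of adopting and resharing it, up to a cap."""
    return min(cap, base + per_exposure * exposures)

for n in (1, 3, 10):
    print(n, "exposures ->", round(adoption_probability(n), 2))
```

Under these toy numbers, seeing an idea three times more than doubles the adoption chance of a single exposure, which is why amplification by ranking algorithms compounds with human resharing.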

We recently ran an experiment using a news literacy app called Fakey. It is a game developed by our lab, which simulates a news feed like those of Facebook and Twitter. Players see a mix of current articles from fake news, junk science, hyperpartisan and conspiratorial sources, as well as mainstream sources. They get points for sharing or liking news from reliable sources and for flagging low-credibility articles for fact-checking.


The wisdom of the crowds fails because it is built on the false assumption that the crowd is made up of diverse, independent sources. There may be several reasons this is not the case.


First, because of people’s tendency to associate with similar people, their online neighborhoods are not very diverse. The ease with which social media users can unfriend those with whom they disagree pushes people into homogeneous communities, often referred to as echo chambers.

Second, because many people’s friends are friends of one another, they influence one another. A famous experiment demonstrated that knowing what music your friends like affects your own stated preferences. Your social desire to conform distorts your independent judgment.

Third, popularity signals can be gamed. Over the years, search engines have developed sophisticated techniques to counter so-called “link farms” and other schemes to manipulate search algorithms. Social media platforms, on the other hand, are just beginning to learn about their own vulnerabilities.

A different, preventive approach would be to add friction. In other words, to slow down the process of spreading information. High-frequency behaviors such as automated liking and sharing could be inhibited by CAPTCHA tests or fees. Not only would this decrease opportunities for manipulation, but with less information people would be able to pay more attention to what they see. It would leave less room for engagement bias to affect people’s decisions.
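The friction idea can be sketched as a per-user token bucket (the capacity and refill rate below are hypothetical parameters, not anything a platform has published): shares within the allowance go through, and anything beyond it is held back for a challenge such as a CAPTCHA.

```python
import time

class ShareThrottle:
    """Token-bucket friction: each user may share up to `capacity` items,
    refilled at `rate` tokens per second; beyond that, a challenge
    (e.g. a CAPTCHA) is required before the share goes through."""

    def __init__(self, capacity=5, rate=0.1):
        self.capacity = capacity
        self.rate = rate
        self.buckets = {}  # user_id -> (tokens, last_timestamp)

    def allow_share(self, user_id, now=None):
        now = time.monotonic() if now is None else now
        tokens, last = self.buckets.get(user_id, (self.capacity, now))
        # Refill tokens for the time elapsed since the last attempt.
        tokens = min(self.capacity, tokens + (now - last) * self.rate)
        if tokens >= 1:
            self.buckets[user_id] = (tokens - 1, now)
            return True   # share proceeds
        self.buckets[user_id] = (tokens, now)
        return False      # friction: require a CAPTCHA or a pause

throttle = ShareThrottle(capacity=3, rate=0.5)
results = [throttle.allow_share("u1", now=100.0) for _ in range(5)]
print(results)  # [True, True, True, False, False]
```

The point is not the specific numbers but the shape: automated high-frequency sharing hits the limit almost immediately, while a person sharing occasionally never notices the friction.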


It would also help if social media companies adjusted their algorithms to rely less on engagement to determine the content they serve you. Perhaps the revelations of Facebook’s knowledge of troll farms exploiting engagement will provide the necessary impetus. – Rappler.com

Facebook Messenger Is Launching a Split Payments Feature for Users to Quickly Share Expenses


Meta has announced the arrival of a new Split Payments feature in Facebook Messenger. This feature, as the name suggests, will let you calculate and split expenses with others right from Facebook Messenger. This feature essentially looks to bring an easier method to share the cost of bills and expenses — for example, splitting a dinner bill with friends. Using this new Split Payment feature, Facebook Messenger users will be able to split bills evenly or modify the contribution for each individual, including their own.

The company announced the new Split Payment feature in a blog post. 9to5Mac reports that this new bill-splitting feature is still in beta and will be exclusive to US users at first. The rollout will begin early next week. As mentioned, it will help users share the cost of bills, expenses, and payments. This feature is especially useful for those who share an apartment and need to split the monthly rent and other expenses with their flatmates. It could also come in handy at a group dinner with many people.

With Split Payments, users can add the number of people the expense needs to be divided with and, by default, the amount entered will be divided in equal parts. A user can also modify each person’s contribution including their own. To use Split Payments, click the Get Started button in a group chat or the Payments Hub in Messenger. Users can modify the contribution in the Split Payments option and send a notification to all the users who need to make payments. After entering a personalised message and confirming your Facebook Pay details, the request will be sent and viewable in the group chat thread.


Once someone has made the payment, you can mark their transaction as ‘completed’. The Split Payment feature will automatically take into account your share as well and calculate the amount owed accordingly.
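The arithmetic behind this kind of bill splitting is straightforward. Here is a minimal sketch (the function name and behavior are illustrative, not Messenger's actual implementation); working in cents avoids floating-point rounding errors, and any leftover cents are assigned to the first payers:

```python
def split_bill(total_cents, participants, custom_shares=None):
    """Split a bill evenly, allowing per-person overrides.

    total_cents: bill amount in cents, to avoid float rounding issues.
    participants: list of names, including the requester.
    custom_shares: optional {name: cents} overrides; everyone else
    splits the remainder evenly, leftover cents going to the first payers.
    """
    custom_shares = custom_shares or {}
    remainder = total_cents - sum(custom_shares.values())
    others = [p for p in participants if p not in custom_shares]
    base, extra = divmod(remainder, len(others))
    shares = dict(custom_shares)
    for i, name in enumerate(others):
        shares[name] = base + (1 if i < extra else 0)
    return shares

# A $60.01 dinner among three people; Ana agreed to pay $25.
shares = split_bill(6001, ["me", "ana", "ben"], {"ana": 2500})
print(shares)  # {'ana': 2500, 'me': 1751, 'ben': 1750}
```

As in the Messenger feature, the requester's own share is included in the split, so the amounts always sum to the original total.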




Tasneem Akolawala is a Senior Reporter for Gadgets 360. Her reporting expertise encompasses smartphones, wearables, apps, social media, and the overall tech industry. She reports out of Mumbai, and also writes about the ups and downs in the Indian telecom sector. Tasneem can be reached on Twitter at @MuteRiot, and leads, tips, and releases can be sent to tasneema@ndtv.com.

Facebook Owner Meta Launches New Platform, Safety Hub to Protect Women in India


Meta (formerly Facebook) on Thursday announced a slew of steps to protect women users on its platform, including the launch of StopNCII.org in India, which aims to combat the spread of non-consensual intimate images (NCII).

Meta has also launched the Women’s Safety Hub, available in Hindi and 11 other Indian languages, which will enable more women users in India to access information about tools and resources that can help them make the most of their social media experience while staying safe online.

This initiative by Meta will ensure women do not face a language barrier in accessing this information, Karuna Nain, director of global safety policy at Meta Platforms, told reporters.

“Safety is an integral part of Meta’s commitment to building and offering a safe online experience across the platforms and over the years the company has introduced several industry leading initiatives to protect users online.

“Furthering our effort to bolster the safety of users, we are bringing in a number of initiatives to ensure online safety of women on our platforms,” she added.


StopNCII.org is a platform that aims to combat the spread of non-consensual intimate images (NCII).

“It gives victims control. People can come to this platform proactively, hash their intimate videos and images, share their hashes back with the platform and participating companies,” Nain said.

She explained that the platform doesn’t receive any photos or videos; instead, it gets the hash, a unique digital fingerprint, that tells the company that this is a known piece of violating content. “We can proactively keep a lookout for that content on our platforms and once it’s uploaded, our review team checks what’s really going on and takes appropriate action if it violates our policies,” she added.
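The hash-matching idea can be illustrated in a few lines. Note the hedge: real NCII and child-safety systems use perceptual hashes that survive resizing and re-encoding, not a cryptographic hash like SHA-256, and the function names below are invented for illustration; the sketch only shows the core property that the fingerprint, never the image itself, is shared and compared.

```python
import hashlib

def fingerprint(image_bytes):
    """Return a fixed-length digest of the image. Only this hash,
    never the image itself, is shared with the platform.
    (SHA-256 stands in here for a perceptual hash.)"""
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes submitted by victims form a blocklist on the platform side.
blocklist = {fingerprint(b"<bytes of a reported image>")}

def screen_upload(image_bytes):
    """Flag an upload for human review if its hash is on the blocklist."""
    return "flag-for-review" if fingerprint(image_bytes) in blocklist else "allow"

print(screen_upload(b"<bytes of a reported image>"))  # flag-for-review
print(screen_upload(b"<bytes of some other image>"))  # allow
```

Because only digests cross the wire, the platform can recognize a known violating image without ever possessing a copy of it, which is the privacy property Nain describes.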


In partnership with the UK Revenge Porn Helpline, StopNCII.org builds on Meta’s NCII Pilot, an emergency programme that allows potential victims to proactively hash their intimate images so they can’t be proliferated on its platforms.

The first-of-its-kind platform has partnered with global organisations to support the victims of NCII. In India, it has partnered with organisations such as Social Media Matters, Centre for Social Research, and Red Dot Foundation.


Nain added that the company hopes this becomes an industry-wide initiative, so that victims can come to one central place for help and support rather than approaching each tech platform one by one.

Also, Bishakha Datta (executive editor of Point of View) and Jyoti Vadehra from the Centre for Social Research are the first Indian members of Meta’s Global Women’s Safety Expert Advisors. The group comprises 12 other non-profit leaders, activists, and academic experts from different parts of the world, and consults Meta on the development of new policies, products and programmes to better support women on its apps.

“We are confident that with our ever-growing safety measures, women will be able to enjoy a social experience which will enable them to learn, engage and grow without any challenges.

“India is an important market for us and bringing Bishakha and Jyoti onboard to our Women’s Safety Expert Advisory Group will go a long way in further enhancing our efforts to make our platforms safer for women in India,” Nain said.


Facebook Adds New Trend Insights in Creator Studio, Which Could Help Shape Your Posting Strategy


Facebook’s looking to provide more content insight within Creator Studio with the rollout of a new ‘Inspiration Hub’ element, which highlights trending content and hashtags within categories related to your business Page.

Facebook Inspiration Hub

As you can see in these screenshots, posted by social media expert Matt Navarra, when it becomes available to you, you’ll be able to access the new Inspiration Hub from the Home tab in Creator Studio.

On the right side of the screen, you can see the first of the new insights, with trending hashtags and videos from the last 24 hours, posted by Pages similar to yours, displayed above a ‘See more’ prompt.

When you tap through to the new hub, you’ll have a range of additional filters to check out trending content from across Facebook, including Page category, content type, region, and more.

Facebook Inspiration Hub

That could be hugely valuable in learning what Facebook users are responding to, and what people within your target market are engaging with in the app.


The Hub also includes insights into trending hashtags, within your chosen timeframe, which may further assist in tapping into trending discussions.

Facebook Inspiration Hub

How valuable hashtags are on Facebook is still up for debate, but you’ll also note that you can filter the displayed results by platform, so you can additionally display Instagram hashtag trends as well, which could be very valuable in maximizing your reach.

Much of this type of info has been available within CrowdTangle, Facebook’s analytics platform for journalists, for some time, but not everyone can access CrowdTangle data, which could make this an even more valuable proposition for many marketers.


Of course, overall performance really relates to your own creative, and thinking through the action that you want your audience to take when reading your posts. But in terms of detecting new content trends, including hashtag usage, caption length, videos versus image posts, and more, there’s a lot that could be gleaned from these tools and filters.

It’s a significant analytics addition – we’ve asked Facebook for more info on the rollout of the new option, and whether it’s already beyond test mode, etc. We’ll update this post if/when we hear back.
