
Changes to Groups API Access

Since April of 2018, we’ve been reviewing the ways that people can use Facebook to share data with outside companies. We’ve removed or restricted a number of our developer APIs, such as the Groups API, which provides an interface between Facebook and apps that can integrate with a group.

Before April 2018, group admins could authorize an app for a group, which gave the app developer access to information in the group. As part of the changes to the Groups API after April 2018, an app that an admin authorized would only get information such as the group’s name, the number of users, and the content of posts. For an app to access additional information, such as members’ names and profile pictures in connection with group activity, group members had to opt in.
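
To make the distinction concrete, here is a rough sketch of what an admin-authorized app’s group-level access might look like through the Graph API. The group ID, access token, and API version are placeholders, and the exact field names are an assumption based on the public Graph API documentation rather than anything stated in this post:

```python
import requests

GRAPH_URL = "https://graph.facebook.com/v3.2"  # placeholder API version
GROUP_ID = "1234567890"                        # placeholder group ID
ACCESS_TOKEN = "EAA..."                        # placeholder token

# After April 2018, an admin-authorized app could read group-level
# fields such as the name and member count...
resp = requests.get(
    f"{GRAPH_URL}/{GROUP_ID}",
    params={"fields": "name,member_count", "access_token": ACCESS_TOKEN},
)
resp.raise_for_status()
print(resp.json())  # e.g. {"name": "...", "member_count": 1234, "id": "..."}

# ...and the content of posts via the group feed, but not member names
# or profile pictures unless each member had explicitly opted in.
feed = requests.get(
    f"{GRAPH_URL}/{GROUP_ID}/feed",
    params={"fields": "message,created_time", "access_token": ACCESS_TOKEN},
)
print(feed.json())
```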

As part of our ongoing review, we recently found that some apps retained access to group member information from the Groups API, like names and profile pictures in connection with group activity, for longer than we intended. We have since removed their access. Today we are also reaching out to roughly 100 partners who may have accessed this information since we announced restrictions to the Groups API, although it’s likely that the number that actually did is smaller and decreased over time. We know at least 11 partners accessed group members’ information in the last 60 days. Although we’ve seen no evidence of abuse, we will ask them to delete any member data they may have retained, and we will conduct audits to confirm that it has been deleted.

These were primarily social media management and video streaming apps, designed to help group admins manage their groups more effectively and to help members share videos to their groups. For example, a business managing a large community spread across multiple groups could use a social media management app to provide customer service, including customized responses, at scale. But while this access provided benefits to people and groups on Facebook, we made the decision to remove it and are following through on that approach.

We aim to maintain a high standard of security on our platform and to treat our developers fairly. As we’ve said in the past, the new framework under our agreement with the FTC means more accountability and transparency into how we build and maintain products. As we continue to work through this process we expect to find more examples of where we can improve, either through our products or changing how data is accessed. We are committed to this work and supporting the people on our platform.


Best Practices for Designing Great Messaging Experiences on Messenger

We recently reminded our community of the upcoming policy changes to the Messenger platform that will go into effect on March 4, 2020. These policy changes were designed to improve the messaging experience between people and businesses by driving timely and personally relevant conversations — prioritizing conversations started by people and related follow-up communications.

To help businesses adapt to these new policy changes, here are some best practices to adopt when designing Messenger experiences:

1. Respond quickly and set customer expectations on response times

People expect businesses to respond quickly and provide timely updates. We have found a strong correlation between responsiveness and successful business outcomes.

2. Make it short and sweet

Make sure to communicate your key points succinctly and early on in your message. This aligns with people’s expectations for messaging as a channel and increases readability. Messages that are short and to the point can also be read clearly in message previews.

3. Leverage Messenger features to send high-value messages outside the 24-hour standard messaging window

Successful businesses know the options available to send messages outside the standard messaging window and use them effectively.

  • Message tags – use tags to send personal, timely, and important non-promotional messages. Businesses can use tags to send account updates, post-purchase updates, confirmed event updates, and human agent responses (see the sketch after this list).
  • One-Time Notification – allows a Page to ask a person for permission to send one follow-up message after the 24-hour messaging window has ended. This can be used for cases such as back-in-stock alerts, where a person has explicitly asked the business to notify them. Make sure the message matches the topic the person agreed to be notified about and that it is fully communicated on the first attempt. You may also want to prompt people to interact with your notification in order to restart the standard messaging window.
  • Sponsored Messages – use sponsored messages to broadcast promotional updates to customers you’ve interacted with in Messenger. Sponsored messages support Facebook ads targeting and have built-in integrity controls to help us safeguard the user experience in Messenger.
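
As a rough sketch of the message tags option above: a tagged message goes through the Send API with `messaging_type` set to `MESSAGE_TAG` plus the tag itself. The access token, user ID, and API version below are placeholders, not values from this post:

```python
import requests

SEND_API = "https://graph.facebook.com/v6.0/me/messages"  # placeholder version
PAGE_TOKEN = "EAA..."  # placeholder Page access token
PSID = "2469..."       # placeholder page-scoped user ID

# A tagged message can be sent outside the 24-hour window, but only
# for the non-promotional use case that the tag covers.
payload = {
    "recipient": {"id": PSID},
    "messaging_type": "MESSAGE_TAG",
    "tag": "CONFIRMED_EVENT_UPDATE",  # e.g. a reminder for an event the person registered for
    "message": {"text": "Reminder: your class starts tomorrow at 10am."},
}
resp = requests.post(SEND_API, params={"access_token": PAGE_TOKEN}, json=payload)
print(resp.status_code, resp.json())
```

The other options use the same endpoint and payload shape; a one-time notification send, for instance, addresses the recipient by the one-time token granted when the person opted in, rather than by their user ID.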

4. Focus on customer value

Ensure your messages clearly communicate customer value – especially notifications sent outside the standard messaging window. Sending low-value messages makes it more likely that customers will tune out or block messages from your business altogether. Businesses using the Messenger platform should also consider adjusting push parameters for valuable messages that don’t require immediate action.
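
For instance, the Send API accepts a `notification_type` field; here is a sketch of a lower-urgency message delivered without a sound (placeholder user ID, same endpoint as the earlier sketch):

```python
PSID = "2469..."  # placeholder page-scoped user ID

payload = {
    "recipient": {"id": PSID},
    "messaging_type": "RESPONSE",
    # SILENT_PUSH delivers the message without a sound or vibration;
    # NO_PUSH skips the push notification entirely.
    "notification_type": "SILENT_PUSH",
    "message": {"text": "Your monthly usage summary is ready."},
}
```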

5. Provide audiences with options to choose from

Consider giving your audience additional control over the type of content they will receive via Messenger. For example, you might allow people to select the specific types of account alerts or post-purchase updates they want, provided these comply with the Messenger platform policies.
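
One lightweight way to offer this choice is with quick replies in the Send API. In this sketch (placeholder user ID and payload names), the option a person taps comes back to the business’s webhook as `message.quick_reply.payload`, which can be stored as a preference:

```python
PSID = "2469..."  # placeholder page-scoped user ID

payload = {
    "recipient": {"id": PSID},
    "messaging_type": "RESPONSE",
    "message": {
        "text": "Which updates would you like to receive?",
        "quick_replies": [
            {"content_type": "text", "title": "Shipping updates", "payload": "OPT_IN_SHIPPING"},
            {"content_type": "text", "title": "Account alerts", "payload": "OPT_IN_ACCOUNT"},
            {"content_type": "text", "title": "No thanks", "payload": "OPT_OUT"},
        ],
    },
}
```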

We believe following these simple guidelines will help ensure a business’s messaging efforts are effective and drive outcomes, while providing customers with pleasant and valuable interactions that encourage them to continue engaging with the business on Messenger.


Two Billion Users — Connecting the World Privately

We are excited to share that, as of today, WhatsApp supports more than 2 billion users around the world.

Mothers and fathers can reach their loved ones no matter where they are. Brothers and sisters can share moments that matter. Coworkers can collaborate, and businesses can grow by easily connecting with their customers.

Private conversations that once were only possible face-to-face can now take place across great distances through instant chats and video calling. There are so many significant and special moments that take place over WhatsApp and we are humbled and honored to reach this milestone.

We know that the more we connect, the more we have to protect. As we conduct more of our lives online, protecting our conversations is more important than ever.

That is why every private message sent using WhatsApp is secured with end-to-end encryption by default. Strong encryption acts like an unbreakable digital lock that keeps the information you send over WhatsApp secure, helping protect you from hackers and criminals. Messages are only kept on your phone, and no one in between can read your messages or listen to your calls, not even us. Your private conversations stay between you.
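
WhatsApp’s encryption is built on the Signal protocol; the sketch below is not that implementation, just a minimal illustration of the end-to-end idea using the PyNaCl library. Each party keeps a private key on their own device, and a server relaying the ciphertext learns nothing about the message:

```python
from nacl.public import PrivateKey, Box

# Each person generates a keypair on their own device; the private key
# never leaves the phone.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at 6?")

# A server relaying `ciphertext` sees only opaque bytes. Only Bob,
# holding his private key and Alice's public key, can decrypt it.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at 6?'
```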

Strong encryption is a necessity in modern life. We will not compromise on security because that would make people less safe. For even more protection, we work with top security experts, employ industry-leading technology to stop misuse, and provide controls and ways to report issues – all without sacrificing privacy.

WhatsApp started with the goal of creating a service that is simple, reliable and private for people to use. Today we remain as committed as when we started, to help connect the world privately and to protect the personal communication of 2 billion users all over the world.


Facebook, Instagram and YouTube: Government forcing companies to protect you online

Although many of the details have yet to be confirmed, it’s likely the new rules will apply to Facebook, Twitter, WhatsApp, Snapchat, and Instagram

We often talk about the risks you might find online and whether social media companies need to do more to make sure you don’t come across inappropriate content.

Well, now media regulator Ofcom is getting new powers to make sure companies protect both adults and children from harmful content online.

The media regulator makes sure everyone in media, including the BBC, is keeping to the rules.

Harmful content refers to things like violence, terrorism, cyber-bullying and child abuse.

The new rules will likely apply to Facebook (which also owns Instagram and WhatsApp), Snapchat, Twitter, YouTube and TikTok, and will cover things like comments, forums and video-sharing.

Platforms will need to ensure that illegal content is removed quickly, and may also have to “minimise the risks” of it appearing at all.

These plans have been talked about for a while now.

The idea of new rules to tackle ‘online harms’ was originally set out by the Department for Digital, Culture, Media and Sport in May 2018.

The government has now decided to give Ofcom these new powers following a public consultation, known as the ‘Online Harms consultation’, carried out in the UK in 2019.

Plans allowing Ofcom to take control of social media were first spoken of in August last year.

The government will officially announce these new powers for Ofcom on Wednesday 12 February.

But we won’t know right away exactly what new rules will be introduced, or what will happen to tech or social media companies who break the new rules.

Children’s charity the NSPCC has welcomed the news. It says trusting companies to keep children safe online has failed.

“Too many times social media companies have said: ‘We don’t like the idea of children being abused on our sites, we’ll do something, leave it to us,'” said chief executive Peter Wanless.

“Thirteen self-regulatory attempts to keep children safe online have failed.”

Back in February 2018, YouTube said it was “very sorry” after Newsround found several videos not suitable for children on the YouTube Kids app.

The UK government’s Digital Secretary, Baroness Nicky Morgan, said: “There are many platforms who ideally would not have wanted regulation, but I think that’s changing.”

“I think they understand now that actually regulation is coming.”

In many countries, social media platforms are allowed to regulate themselves, as long as they stick to local laws on illegal material.

But some, including Germany and Australia, have introduced strict rules to force social media platforms to do more to protect users online.

In Australia, social media companies have to pay big fines and bosses can even be sent to prison if they break the rules.

For more information and tips about staying safe online, go to BBC Own It, and find out how to make the internet a better place for all of us.
