N95 respirator masks being loaded onto an airplane bound for bushfire-affected areas of Australia.
Bushfires in Australia have killed more than 25 people, devastated the country’s wildlife, and are expected to rack up historically high damage costs in the billions of dollars — and they’re still burning.
During and after a natural disaster, response organizations need accurate information — every minute counts in saving lives. Real-time information helps paint a more complete picture of where affected people are located, so that resources like food, water, and medical supplies can be efficiently distributed where they are needed most. We launched Facebook Disaster Maps specifically to help fill information gaps during these events. So when the Australia fires began, we quickly shared real-time maps with our more than 100 Data for Good partners. Those maps illustrate how populations are evacuating and whether they have access to cellular networks, helping response organizations optimize their efforts. To help support and amplify the efforts of our community, we will also be matching up to AU$1 million in donations made to GlobalGiving and donating AU$250,000 to the Australian Red Cross.
Currently, Disaster Maps in Australia are being used by a range of national and international relief, response, and academic organizations. Direct Relief, a humanitarian aid organization focused on health and emergency response, is using these tools to analyze evacuation proceedings and has plans to distribute more than 500,000 respiratory masks to the Australian states of Victoria and New South Wales. Direct Relief first used Disaster Maps to respond to the Thomas Fire and Montecito mudslides of December 2017 and January 2018 to learn how large numbers of people behave during crisis events and to develop insights about how best to respond to medical needs.
How Facebook Disaster Maps Help
Traditional forms of data often do not provide an accurate real-time view of affected areas, which makes it extremely difficult to understand how to best direct response efforts. To combat gaps in information, Facebook Disaster Maps are generated within 24 hours of a disaster striking, then refreshed daily throughout the event. The data in these maps is gathered from people using the Facebook app who have chosen to turn on Location Services and opt into a feature called Location History, which can be modified at any time under Privacy Settings. Access to this near real-time data on evacuations, displacement, and network connectivity means disaster response agencies can act quickly and efficiently to save lives. Continuously updating information also allows them to respond to changing circumstances on the ground during and after the event. To assist with the bushfires, four maps have been shared: the South Coast of New South Wales; East Gippsland in Victoria; the Green Wattle Creek Fire in New South Wales; and the Cudlee Creek Fire in South Australia.
Such real-time information helps responders effectively deploy resources to serve the neediest survivors and protect vulnerable populations by painting a more complete picture of where affected people are located. Disaster Maps’ Facebook population density map for the bushfires clearly illustrates the quick and massive evacuation of Batemans Bay over the course of several days, from December 31 to January 3, as fire swept over the town.
These maps allow responders to quickly get a read on how people are actually behaving during a specific emergency, rather than making assumptions or predicting behaviors based on past events. They’re proving to be a remarkable tool for responders to the catastrophic Australian bushfires, the second-largest fire event ever recorded globally by area burned. The fires have so far destroyed more than 15 million acres, more than seven times the acreage of the California fires of 2018 or last year’s Amazon wildfires. At least a billion animals have been killed, wiping out multiple species of native Australian wildlife, including 30 percent of the world’s koala population.
Since its inception, Facebook Data for Good has generated Disaster Maps for hundreds of natural disasters, including Hurricanes Dorian and Barry, Typhoon Tisoy in the Philippines, and the recent earthquake in Puerto Rico. In addition to guiding response efforts, Disaster Maps are also being used by universities and researchers to analyze how disaster-affected populations utilize social services, what prompts them to obey evacuation orders, and how social ties affect their resilience after a disaster.
More Ways to Help
As part of our efforts to assist with the bushfires, we are donating AU$250,000 to the Australian Red Cross to support relief and recovery efforts. We will also match up to AU$1 million in donations made to GlobalGiving, which will distribute the money to local nonprofits working on recovery efforts.* Donations made through our Crisis Response pages for the bushfires across New South Wales or the bushfires across Victoria and South Australia will be matched up to AU$1 million.
Messenger API to support Instagram
Today, we are announcing updates to the Messenger API to support Instagram messaging, giving businesses new tools to manage their customer communications on Instagram at scale. The new API features enable businesses to integrate Instagram messaging with their preferred business applications and workflows, helping drive more meaningful conversations, increase customer satisfaction, and grow sales. The updated API is currently in beta with a limited number of developer partners and businesses.
Instagram is a place for emerging culture and trend creation, and discovering new brands is a valuable part of this experience. Messaging plays a central role in helping people connect with brands in personal ways through story replies, direct messages, and mentions. Over the last year, total daily conversations between people and businesses on Messenger and Instagram grew over 40 percent. For businesses, the opportunity to drive sales and improve customer satisfaction through meaningful interactions with people on Instagram messaging is huge.
“Instagram is a platform for community building, and we’ve long approached it as a way for us to connect with our customers in a place where they are already spending a lot of their time. With the newly launched Messenger API support for Instagram, we are now able to increase efficiency, drive even stronger user engagement, and easily maintain a two-way dialogue with our followers. This technology has helped us create a new pipeline for best-in-class service and allows for a direct line of communication that’s fast and easy for both customers and our internal team.” – Michael Kors Marketing
Works with your tools and workflows
Businesses want to use a single platform to respond to messages on multiple channels. The Messenger API now allows businesses to manage messages initiated by people throughout their Instagram presence, including Profile, Shops, and Stories. It will be possible for businesses to use information from core business systems right alongside Instagram messaging, enabling more personal conversations that drive better business outcomes. For example, businesses integrating with a CRM system can give agents a holistic view of customer loyalty. Furthermore, existing investments in people, tools, and workflows to manage other communication channels can be leveraged and extended to support customers on Instagram. This update will also bring Facebook Shops messaging features to the Messenger API so businesses can create more engaging and connected customer experiences.
To get started, businesses can easily work with developers to integrate Instagram messaging with their existing tools and systems.
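As a rough illustration of such an integration, a reply to an Instagram direct message could be assembled the way existing Messenger Platform Send API calls are. The sketch below assumes the Instagram beta follows that same request shape; the API version, page token, and IDs are all placeholders, not real values.

```python
import json

# Hypothetical sketch: endpoint and payload follow the existing
# Messenger Send API shape; the beta's exact details may differ.
GRAPH_URL = "https://graph.facebook.com/v8.0/me/messages"

def build_reply(recipient_id: str, text: str, page_token: str):
    """Assemble a Send API-style request for a one-off text reply.

    Returns the request URL and JSON body; actually POSTing it would
    require a real page access token with messaging permissions.
    """
    payload = {
        "recipient": {"id": recipient_id},   # Instagram-scoped user ID
        "message": {"text": text},
    }
    url = f"{GRAPH_URL}?access_token={page_token}"
    return url, json.dumps(payload)

# Placeholder IDs and token for illustration only.
url, body = build_reply("1234567890", "Thanks for reaching out!", "PAGE_TOKEN")
```

In practice a business system would fire this request from the same workflow that handles its other channels, which is the point of the unified API.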
Increases responsiveness and customer satisfaction
Customers value responsiveness when they have questions or need help from businesses. For the first time on Instagram, we’re introducing new features that will allow businesses to respond immediately to common inquiries using automation, while ensuring people are seamlessly connected to live support for more complex inquiries. One of our alpha partners, Clarabridge, reported their client brands had improved their response rate by up to 55% since being able to manage Instagram DMs through their platform.
The updates to the Messenger API are part of our overall effort to make it easier for businesses to reach their customers across our family of apps.
Messenger API support for Instagram is currently in beta with a focus on providing high-quality, personalized messaging experiences on Instagram while increasing business efficiency. Adidas, Amaro, Glossier, H&M, MagazineLuiza, Michael Kors, Nars, Sephora, TechStyle Fashion Group, and other consumer brands are already participating in the beta program. We are excited about the early results some businesses saw during alpha testing, including higher response rates, reduced resolution times, and deeper customer insights from their integrations. We’re also testing with a limited number of developer partners and are delighted at the initial response.
“On average, brands have saved at least four hours per agent per week by streamlining social community management within the Khoros platform, plus shortened response rates during business hours — which is crucial to meet as customers who message brands on social media expect a quick reply.” – Khoros
Required migration to token-based access for User Picture and oEmbed endpoints
As part of our Graph API 8.0 release, we announced breaking changes to how developers can access certain permissions and APIs. Starting October 24, 2020, developers need to migrate to token-based access in order to access User Picture and oEmbed endpoints for Facebook and Instagram.
This post outlines these changes and the necessary steps developers need to take to avoid disruption to their app.
Facebook will now require client or app access tokens to access a user’s profile picture. Beginning on October 24, 2020, queries for profile pictures made against user IDs without an access token will return a generic silhouette rather than a profile picture. This is a breaking change for partners. While client or app tokens will be required for user ID queries, they remain a best practice (but are not required) for ASID queries for the time being.
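For illustration, one way to satisfy the new requirement is to append an app access token (which can be composed as `app_id|app_secret`) to the picture request URL. This is a sketch under that assumption, and the IDs below are placeholders.

```python
def picture_url(user_id: str, app_id: str, app_secret: str) -> str:
    """Build a profile-picture request authorized with an app access token.

    An app access token can be composed as "app_id|app_secret"; without
    any token, user-ID queries now return a generic silhouette.
    """
    app_token = f"{app_id}|{app_secret}"
    return f"https://graph.facebook.com/{user_id}/picture?access_token={app_token}"

# Placeholder values for illustration only.
url = picture_url("1234567890", "MY_APP_ID", "MY_APP_SECRET")
```

Note that embedding the app secret in a URL is only appropriate in server-side calls; client-side code should use a client token instead.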
Facebook and Instagram oEmbed
We are also deprecating the existing Legacy API oEmbed endpoints for Facebook and Instagram on October 24, 2020, replacing them with new Graph API endpoints. Developers who don’t make this change and continue to call the existing oEmbed API will see their requests fail with an error message. The new endpoints will require client or app access tokens or ASID queries.
Ready to make the switch? You can read more about these changes in our developer documentation for User Picture, and visit our changelogs for Facebook oEmbed and Instagram oEmbed for details on how to start calling these Graph API endpoints.
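As a sketch of what a migrated call might look like, the Graph API oEmbed endpoints (`oembed_post` for Facebook posts, `instagram_oembed` for Instagram content) take the content URL and an access token as query parameters. The endpoint names here reflect the Graph API v8.0 changelog, but the post URLs and token below are placeholders.

```python
from urllib.parse import urlencode

GRAPH = "https://graph.facebook.com/v8.0"

def oembed_request(content_url: str, access_token: str, instagram: bool = False) -> str:
    """Build a Graph API oEmbed request for a public Facebook or Instagram post."""
    endpoint = "instagram_oembed" if instagram else "oembed_post"
    query = urlencode({"url": content_url, "access_token": access_token})
    return f"{GRAPH}/{endpoint}?{query}"

# Placeholder URLs and token for illustration only.
fb = oembed_request("https://www.facebook.com/page/posts/123", "APP_TOKEN")
ig = oembed_request("https://www.instagram.com/p/abc123/", "APP_TOKEN", instagram=True)
```

The response body keeps the familiar oEmbed shape (an `html` embed snippet plus metadata), so only the request URL and authentication need to change.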
PyTorch Tutorials Refresh – Behind the Scenes
Hi, I’m Jessica, a Developer Advocate on the Facebook Open Source team. In this blog, I’ll take you behind the scenes to show you how Facebook supports and sustains our open source products – specifically PyTorch, an open source deep learning library. With every new release version, PyTorch pushes out new features, updates existing ones, and adds documentation and tutorials that cover how to implement these new changes.
On May 5, 2020, PyTorch released improvements to the Tutorials homepage, shipping new content and a fresh usability experience to the community (see the Twitter thread). We introduced keyword-based search tags and a new recipes format (bite-sized, ready-to-deploy examples), and more clearly highlighted helpful resources, resulting in the homepage style you see today.
As the framework grows with each release, we’re continuously collaborating with our community to not only create more learning content, but also make learning the content easier.
The tutorials refresh project focused on re-envisioning the learning experience by updating the UX and updating the learning content itself.
Our 3 major goals for the refresh were:
- Reduce blocks of text and make it easy for users to find important resources (e.g. PyTorch Cheat Sheet, New to PyTorch tutorials)
- Improve discoverability of relevant tutorials and surface more information for users to know about the available tutorial content
- Create content that allows users to quickly learn and deploy commonly used code snippets
And we addressed these goals by:
- Adding callout blocks with direct links to highlight important resources, such as the beginner tutorial, the PyTorch Cheat Sheet, and new recipes
- Adding filterable tags to help users easily find relevant tutorials, and formatting the tutorial cards with summaries so users know what to expect without having to click in
- Creating a new learning format, Recipes, and 15 brand-new recipes covering some of the most popular PyTorch topics, such as interpretability and quantization, as well as basics such as how to load data in PyTorch
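To give a flavor of the recipe format, here is a minimal data-loading snippet of the kind a “load data in PyTorch” recipe covers: wrap tensors in a Dataset and iterate over mini-batches. This is an illustrative sketch, not an excerpt from a published recipe.

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

features = torch.randn(100, 3)          # 100 samples, 3 features each
labels = torch.randint(0, 2, (100,))    # binary labels

# TensorDataset pairs up samples; DataLoader handles batching and shuffling.
dataset = TensorDataset(features, labels)
loader = DataLoader(dataset, batch_size=25, shuffle=True)

for batch_features, batch_labels in loader:
    # each batch holds 25 samples ready for a training step
    assert batch_features.shape == (25, 3)
```

Recipes keep examples at roughly this scale, so a reader can run one in minutes and lift it straight into their own code.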
Why We Made These Changes
So what drove these changes? They were driven by feedback from the community, gathered through two main channels:
- UX Research study – Earlier in 2020, we conducted a UX research study of our website in collaboration with the Facebook UX Research team to understand how our developer community is using the website and evaluate ways we can improve it to better meet their needs
- In-person events and online feedback – The team gathered community feedback about existing tutorials to help identify learning gaps.
We used this input to re-envision our learning experience.
Rethinking the Learning Experience
Given the feedback from the UX Research study and the in-person workshop, we went back and rethought the current learning experience.
We settled on three levels of learning content:
- Level 1: API docs. These already exist, and each contains a short, easily understandable (and reproducible) code snippet that shows how to use a particular API
- Level 2: Recipes, the missing puzzle piece. We realized we lacked content in between that was short, informative, and actionable: bite-sized examples of how to use specific PyTorch features, distinct from our full-length tutorials. That’s how recipes were born
- Level 3: Tutorials. These ideally provide an end-to-end experience that shows users how to take data, train a model, and deploy it into a production setting using PyTorch. They existed already, but needed to be pruned of outdated content and cleaned up to better fit this model
What Was the Process?
This was a large team effort, and more of a marathon than a sprint. Let’s walk through the process:
Overall, the project took about 6 months, not including the UX research and prior feedback collection time. It started off with the kickoff discussion to align on the changes. We assessed the existing tutorials, pruned outdated content and decided on new recipe topics and assigned authors. In the meantime, marketing and documentation engineers collaborated with our web design team on the upcoming UI needs, created mocks to preview with the rest of the team and built out the infrastructure.
For logistics, we created a roadmap and set milestones for the team of authors. We held weekly standup meetings, and the team bounced ideas around in chat. The changes were all made in a staging branch on GitHub, which allowed us to create previews of the final build. Next came the build process. Many of the recipe authors were first-time content creators, so we held a live onboarding session covering teaching mindset, writing in an active voice, outlining, and code standards and requirements; all of this was captured in a new set of content creation documentation.
The bulk of the process was spent in building out the content, copy editing and implementing the UI experience.
With the product out the door, we took some time to hold a team retrospective, asking: What went well? What went poorly? What can we do better next time? In addition, we continue to gather ongoing feedback from the community through GitHub issues.
Moving forward, we are brainstorming and forming a longer-term plan for the PyTorch learning experience as it relates to docs and tutorials.
Ways to Improve
Looking back on ways we could have improved:
- Timeline – Our timeline ended up taking longer than anticipated because the project was coupled with a version release, and team members were serving double duty, working on release content as well as the tutorials refresh. As the release approached, we took a step back and realized we needed more time to put out a polished product.
- Testing – In software development, when a deadline looms, testing is typically the first thing to get squeezed; yet more focused testing always saves time in the long run. We would always welcome more time for CI tests of the tutorial build, as well as beta tests of the user experience. Both are ongoing works in progress as we continue to improve the tutorials experience overall.
So what’s next? We understand that this was just one change in a larger landscape of the overall PyTorch learning experience, but we are excited to keep improving this experience for you, our dedicated PyTorch user.
We would like to hear from you about your experience with the new tutorials. Found a tutorial you loved? Tweet about it and tag us (@PyTorch). Ran into an issue you can help fix? File an issue at https://github.com/pytorch/tutorials. We are excited to continue building the future of machine learning with you!