

Exclusive extract: how Facebook’s engineers spied on women



An Ugly Truth: Inside Facebook’s Battle for Domination is a behind-the-scenes exposé by journalists Sheera Frenkel and Cecilia Kang that offers the definitive account of Facebook’s fall from grace. In this exclusive extract, they show how engineers would access users’ private information – including women they were dating – for over a decade

It was late at night, hours after his colleagues at Menlo Park had left the office, when the Facebook engineer felt pulled back to his laptop. He had enjoyed a few beers, which was part of the reason, he thought, that his resolve was crumbling. He knew that with just a few taps at his keyboard, he could access the Facebook profile of a woman he had gone on a date with a few days ago. The date had gone well, in his opinion, but she had stopped answering his messages 24 hours after they parted ways. All he wanted to do was peek at her Facebook page to satisfy his curiosity, to see if maybe she had gotten sick, gone on vacation, or lost her dog – anything that would explain why she was not interested in a second date.

He logged on to his laptop and, using his access to Facebook’s stream of data on all its users, searched for his date. He knew enough details – first and last name, place of birth, and university – that finding her took only a few minutes. Facebook’s internal systems had a rich repository of information, including years of private conversations with friends over Facebook Messenger, events attended, photographs uploaded (including those she had deleted), and posts she had commented or clicked on. He saw the categories in which Facebook had placed her for advertisers: the company had decided that she was in her thirties, was politically left of centre, and led an active lifestyle. She had a wide range of interests, from a love of dogs to holidays in Southeast Asia. And through the Facebook app that she had installed on her phone, he saw her real-time location. It was more information than the engineer could possibly have gotten over the course of a dozen dinners.


Facebook’s managers stressed to their employees that anyone discovered taking advantage of their access to data for personal means, to look up a friend’s account or that of a family member, would be immediately fired. But the managers also knew there were no safeguards in place. The system had been designed to be open, transparent, and accessible to all employees. It was part of Zuckerberg’s founding ethos to cut away the red tape that slowed down engineers and prevented them from producing fast, independent work. This rule had been put in place when Facebook had fewer than one hundred employees. Yet, years later, with thousands of engineers across the company, nobody had revisited the practice. There was nothing but the goodwill of the employees themselves to stop them from abusing their access to users’ private information.

During a period spanning January 2014 to August 2015, the engineer who looked up his onetime date was just one of 52 Facebook employees fired for exploiting their access to user data. Men who looked up the Facebook profiles of women they were interested in made up the vast majority of engineers who abused their privileges. Most did little more than look up users’ information. But a few took it much further. One engineer used the data to confront a woman who had travelled with him on a European holiday; the two had gotten into a fight during the trip, and the engineer tracked her to her new hotel after she left the room they had been sharing. Another engineer accessed a woman’s Facebook page before they had even gone on a first date. He saw that she regularly visited Dolores Park, in San Francisco, and he found her there one day, enjoying the sun with her friends.

The fired engineers had used work laptops to look up specific accounts, and this unusual activity had triggered Facebook’s systems and alerted the engineers’ managers to their transgressions. Those employees were the ones who were found out after the fact. It was unknown how many others had gone undetected.


The problem was brought to Mark Zuckerberg’s attention for the first time in September 2015, three months after the arrival of Alex Stamos, Facebook’s new chief security officer. Gathered in the CEO’s conference room, “the Aquarium”, Zuckerberg’s top executives had braced themselves for potentially bad news: Stamos had a reputation for blunt speech and high standards. One of the first objectives he had set out when he was hired that summer was a comprehensive evaluation of Facebook’s current state of security. It would be the first such assessment ever completed by an outsider.

Among themselves, the executives whispered that it was impossible to make a thorough assessment within such a short period of time and that whatever report Stamos delivered would surely flag superficial problems and give the new head of security some easy wins at the start of his tenure. Everyone’s life would be easier if Stamos assumed the posture of boundless optimism that pervaded Facebook’s top ranks. The company had never been doing better, with ads recently expanded on Instagram and a new milestone of a billion users logging on to the platform every day.

Instead, Stamos had come armed with a presentation that detailed problems across Facebook’s core products, workforce, and company structure. The organisation was devoting too much of its security efforts to protecting its website, while its apps, including Instagram and WhatsApp, were being largely ignored, he told the group. Facebook had not made headway on its promises to encrypt user data at its data centres – unlike Yahoo, Stamos’s previous employer. Facebook’s security responsibilities were scattered across the company, and according to the report Stamos presented, the company was “not technically or culturally prepared to play against” its current level of adversary.

Worst of all, Stamos told them, was that despite firing dozens of employees over the last eighteen months for abusing their access, Facebook was doing nothing to solve or prevent what was clearly a systemic problem. In a chart, he highlighted how nearly every month, engineers had exploited the tools designed to give them easy access to data for building new products, to violate the privacy of Facebook users and infiltrate their lives. If the public knew about these transgressions, they would be outraged: for over a decade, thousands of Facebook’s engineers had been freely accessing users’ private data. The cases Stamos highlighted were only the ones the company knew about. Hundreds more may have slipped under the radar, he warned.


Zuckerberg was clearly taken aback by the figures, and upset that the issue had not been brought to his attention sooner. “Everybody in engineering management knew there were incidents where employees had inappropriately managed data. Nobody had pulled it into one place, and they were surprised at the volume of engineers who had abused data,” Stamos recalled. Why hadn’t anyone thought to reassess the system that gave engineers access to user data, Zuckerberg asked. No one in the room pointed out that it was a system that he himself had designed and implemented. Over the years, his employees had suggested alternative ways of structuring data retention, to no avail. “At various times in Facebook’s history there were paths we could have taken, decisions we could have made, which would have limited, or even cut back on, the user data we were collecting,” said one longtime employee. “But that was antithetical to Mark’s DNA. Even before we took those options to him, we knew it wasn’t a path he would choose.”

One executive was noticeably absent from the September 2015 meeting. Only four months had passed since the death of Sheryl Sandberg’s husband. Security was Sandberg’s responsibility, and Stamos technically fell under her purview. But she had never suggested, nor been consulted about, the sweeping changes he was proposing. Stamos prevailed that day, but he made several powerful enemies.


This exclusive extract is from An Ugly Truth: Inside Facebook’s Battle for Domination by Sheera Frenkel and Cecilia Kang (The Bridge Street Press). RRP £20. Buy now for £16.99 or call 0844 871 1514



Facebook Adds New Trend Insights in Creator Studio, Which Could Help Shape Your Posting Strategy





Facebook’s looking to provide more content insight within Creator Studio with the rollout of a new ‘Inspiration Hub’ element, which highlights trending content and hashtags within categories related to your business Page.

Facebook Inspiration Hub

As you can see in these screenshots, posted by social media expert Matt Navarra, when it becomes available to you, you’ll be able to access the new Inspiration Hub from the Home tab in Creator Studio.

At the right side of the screen, you can see the first of the new insights, with trending hashtags and videos from the last 24 hours, posted by Pages similar to yours, displayed above a ‘See more’ prompt.

When you tap through to the new hub, you’ll have a range of additional filters to check out trending content from across Facebook, including Page category, content type, region, and more.


That could be hugely valuable in learning what Facebook users are responding to, and what people within your target market are engaging with in the app.

The Hub also includes insights into trending hashtags, within your chosen timeframe, which may further assist in tapping into trending discussions.


How valuable hashtags are on Facebook is still up for debate, but you’ll also note that you can filter the displayed results by platform, so you can display Instagram hashtag trends as well, which could be valuable in maximizing your reach.

Much of this type of info has been available within CrowdTangle, Facebook’s analytics platform for journalists, for some time, but not everyone can access CrowdTangle data, which could make this an even more valuable proposition for many marketers.


Of course, overall performance really relates to your own creative, and thinking through the action that you want your audience to take when reading your posts. But in terms of detecting new content trends, including hashtag usage, caption length, videos versus image posts, and more, there’s a lot that could be gleaned from these tools and filters.

It’s a significant analytics addition – we’ve asked Facebook for more info on the rollout of the new option, and whether it’s already beyond test mode, etc. We’ll update this post if/when we hear back.



Meta Updates Policy on Cryptocurrency Ads, Opening the Door to More Crypto Promotions in its Apps





With cryptocurrencies gaining momentum, in line with the broader Web 3.0 push, Meta has today announced an update to its ad policies around cryptocurrencies, which will open the door to more crypto advertisers on its platforms.

As per Meta:

“Starting today, we’re updating our eligibility criteria for running ads about cryptocurrency on our platform by expanding the number of regulatory licenses we accept from three to 27. We are also making the list of eligible licenses publicly available on our policy page.”

Essentially, in order to run any crypto ads in Meta’s apps, the advertiser needs to hold a license that meets regional regulatory provisions, which vary by nation. With crypto becoming more accepted, Meta’s now looking to enable more crypto companies to publish ads on its platforms, which will provide expanded opportunity for recognized crypto providers to promote their products, while also enabling Meta to make more money from crypto ads.

“Previously, advertisers could submit an application and include information such as any licenses they obtained, whether they are traded on a public stock exchange, and other relevant public background on their business. However, over the years the cryptocurrency landscape has matured and stabilized and experienced an increase in government regulation, which has helped to set clearer responsibilities and expectations for the industry. Going forward, we will be moving away from using a variety of signals to confirm eligibility and instead requiring one of these 27 licenses.”

Is that a good move? Well, as Meta notes, the crypto marketplace is maturing, and there’s now much wider recognition of cryptocurrencies as a legitimate form of payment. But they’re also not supported by most local financial regulators, which reduces transaction protection and oversight, and brings a level of risk to such transactions.


But then again, all crypto providers are required to clearly outline any such risks, and most also highlight the ongoing market volatility in the space. This expanded level of overall transparency means that most people who are investing in crypto have at least some awareness of these elements, which likely does diminish the risk factor in such promotions within Meta’s apps.

But as crypto adoption continues to expand, more of these risks will become apparent, and while much of the crypto community is built on good faith, and a sense of community around building something new, there are questions as to how much that can hold at scale, and what that will then mean for evolving scams and criminal activity, especially as more vulnerable investors are brought into the mix.

Broader promotional capacity through Meta’s apps will certainly help to boost exposure in this respect – though again, the relative risk factors are lessened by expanded regulatory oversight outside of the company.

You can read more about Meta’s expanded crypto ad regulations here.



Meta Outlines Evolving Safety Measures in Messaging as it Seeks to Allay Fears Around the Expansion of E2E Encryption





Amid rising concern about Meta’s move to roll out end-to-end encryption by default to all of its messaging apps, Meta’s Global Head of Safety Antigone Davis has today sought to provide a level of reassurance that Meta is indeed aware of the risks and dangers that such protection can pose, and that it is building safeguards into its processes to protect against potential misuse.

Though the measures outlined don’t exactly address all the issues raised by analysts and safety groups around the world.

As a quick recap, back in 2019, Facebook announced its plan to merge the messaging functionalities of Messenger, Instagram and WhatsApp, which would then provide users with a universal inbox, with all of their message threads from each app accessible on any of the platforms.

The idea is that this will simplify cross-connection, while also opening the door to more opportunities for brands to connect with users in the messaging tool of their choice – but it also, inherently, means that the data protection method for its messaging tools must rise to the level of WhatsApp, its most secure messaging platform, which already includes E2E encryption as the default.

Various child safety experts raised the alarm, and several months after Facebook’s initial announcement, representatives from the UK, US and Australian Governments sent an open letter to Facebook CEO Mark Zuckerberg requesting that the company abandon its integration plan.

Meta has pushed ahead, despite specific concerns that the expansion of encryption will see its messaging tools used by child trafficking and exploitation groups, and now, as it closes in on the next stage, Meta’s working to counter such claims, with Davis outlining six key elements which she believes will ensure safety within this push.


Davis has explained the various measures that Meta has added on this front, including:

  • Detection tools to stop adults from repeatedly setting up new profiles in an attempt to connect with minors they don’t know
  • Safety notices in Messenger, which provide tips on spotting suspicious behavior
  • The capacity to filter messages with selected keywords on Instagram
  • More filtering options in chat requests to help avoid unwanted contact
  • Improved education prompts to help detect spammers and scammers in messages
  • New processes to make it easier to report potential harm, including an option to select “involves a child”, which will then prioritize the report for review and action

Meta messaging security options

These are all good, important steps in detection, and Davis also notes that its reporting process “decrypts portions of the conversation that were previously encrypted and unavailable to us so that we can take immediate action if violations are detected”.

That’ll no doubt raise an eyebrow or two among WhatsApp users – but the broader concern is that such protections will facilitate usage by criminal groups, and reliance on self-reporting is not going to have any impact on these networks operating, at scale, under a more protected messaging framework within Meta’s app ecosystem.

Governments have called for ‘backdoor access’ to break Meta’s encryption for investigations into such activity, which Meta says is both not possible and will not be built into its future framework. The elements outlined by Davis do little to address this specific need, and without the capacity to better detect such activity, it’s hard to see any of the groups opposed to Meta’s expanded encryption changing their stance, and accepting that the merging of all of the platform’s DM options will not also see a rise in criminal activity organized via the same apps.


Of course, the counterargument could be that encryption is already available on WhatsApp, and that criminal activity of this type can already be undertaken within WhatsApp alone. But with a combined user count of 3.58 billion people per month across its family of apps, that’s a significantly broader interconnection of people than WhatsApp’s 2 billion active users, which, arguably, could open the door to far more potential harm and danger in this respect.

Really, there’s no right answer here. Privacy advocates will argue that encryption should be the standard, and that more people are actually more protected, on balance, by enhanced security measures. But there is also an undeniable risk in shielding even more criminal groups from detection.

Either way, right now, Meta seems determined to push ahead with the plan, which will weld all of its messaging tools together, and also make it more difficult to break up its network if antitrust decisions don’t go Meta’s way and it’s pressed to sell off Instagram or WhatsApp as a result.

But expect more debate to be had, in more countries, as Meta continues to justify its decision, and regulatory and law enforcement groups seek more options to help maintain a level of accessibility for criminal investigations and detection.
