Feroza Aziz started her TikTok video like a typical makeup tutorial, telling viewers she would show them how to get long eyelashes. Then the 17-year-old stopped abruptly, calling instead on viewers to start researching the harrowing conditions facing Muslims in China’s detention camps.
The surprising bit of modern satire quickly went viral on TikTok, the short-video app and global phenomenon owned by a Beijing-based tech firm. But in the hours afterward, Aziz’s TikTok profile was suspended. By Tuesday, she told The Washington Post, she remained unable to access her account.
The videos, and Aziz’s suspension, have quickly touched off a public debate about one of the world’s fastest-growing social apps, including over its approach to political issues and its support of free speech in countries outside China, where its parent company ByteDance is headquartered.
The episode has highlighted a signature challenge facing TikTok: Famous for its lighthearted memes and singalong videos, the app increasingly finds itself facing scrutiny due to its close ties to a Chinese conglomerate that must adhere to the country’s strict censorship rules.
TikTok has said it makes decisions about the content it surfaces and suppresses for US users independent from the Chinese government. But its past practices and limited transparency have fueled deep skepticism among lawmakers, tech experts and some of its users.
The popularity of Aziz’s videos shows how TikTok has increasingly become a new home for discussion of politics and current events among young viewers on the Web. But the suspension has fueled concerns over how TikTok will respond to a growing level of acrimonious debate and discussion of issues critical of the Chinese government.
TikTok has said its audience prefers to use the video app for entertainment, not political debates, and that its executives have pushed to preserve the app as a refuge for positivity online. To abide by that mandate, former employees told The Post they were instructed to follow guidelines set by Chinese moderators and remove social or political content that would have been easily accepted elsewhere around the Web.
TikTok representatives said Tuesday that Aziz’s account was not suspended because of her criticism of China and that the company “does not moderate content due to political sensitivities.” Instead, they said she had broken the rules by registering a new account: A previous account of hers had been banned, they said, because she had posted a video referencing Osama bin Laden that had violated rules about promoting terrorist content.
Aziz, who said she is a high school junior in New Jersey, told The Post she never got any explanation about TikTok’s penalties on her account. The video TikTok referred to, she said, was an obvious bit of dark humor, and involved her singing in front of a series of men she suggested were attractive. A copy she shared with The Post shows bin Laden’s face appearing, for less than a second, as the surprise punchline.
“As Muslims, we’re ridiculed every day, so that was me making a joke to cope with the racism we face on a daily basis,” she said. “I’ve been told to go marry a terrorist, go marry bin Laden, so I thought: ‘Let me make a joke about this. We shouldn’t let these things get to us.'”
She said she found it “scary” that she was blocked for making what seemed to her like a harmless joke. And she said she felt it was “very suspicious” that her account was suspended only after she posted viral videos criticizing the home country of TikTok’s parent company.
Eric Han, the head of TikTok’s US Trust and Safety team, said in a statement to The Post that Aziz’s account was banned after the bin Laden post earlier this month. The app’s community guidelines, he said, strictly prohibit any videos that “promote and support” terrorist organizations.
Aziz’s lash-curling videos, which reference the camps in China’s Xinjiang region, can still be viewed on TikTok, where they have attracted more than 500,000 views. “TikTok does not moderate content due to political sensitivities and did not do so in this case,” Han said. Reviews of videos for moderation can be triggered by several factors, Han said, including if a video passes certain “virality benchmarks.”
Aziz said late Tuesday she could not access the account, and that TikTok has provided her no information about whether she can use the service again. When TikTok users have videos removed for violating guidelines, Han said, they are not told the specific reason but can appeal the removal. “Her previous account was banned, so we wouldn’t have had communication with her on that account,” he added.
Aziz’s other videos on TikTok resemble many of the unrestrained, boundary-pushing parodies that often go viral on the Web. In other videos, she jokes about marrying her cousin, living with a strict Muslim mother, and being profiled online as a terrorist. In one video criticizing TikTok as “racist,” she said she posts “relatable Muslim content, things that Muslims can laugh at.”
Kate Klonick, an assistant professor at St. John’s University School of Law who studies social media and free speech, said the incident illustrates the dangers when tech giants aren’t transparent about their practices – and aren’t regulated to be more forthcoming.
“It’s completely at the whim of these giant tech companies [as to] what they decide to tell us, and we have no way to fact check their account of things,” she said. “There’s no outside mechanism of enforcement.”
Still, Klonick said, the struggle for an app such as TikTok is striking the right balance in what she described as the “paradox of content moderation.”
“We want to be protected from certain kinds of content . . . like terrorists using Osama bin Laden’s face to propagandize radical Islam,” she said. “But at the same time it’s critical to have access to that kind of bad content in order to critique it, or make fun of it, or tear it down, or use it to build culture.”
TikTok’s practices and its Chinese origins have raised alarms in Washington, where lawmakers and regulators fear Beijing’s heavy digital hand might affect Americans’ speech online and leave their personal data at risk.
Members of Congress led by Sen. Josh Hawley, a Missouri Republican, sought to grill top TikTok executives at a congressional hearing earlier this month, though the social-media app declined to appear, further stoking lawmakers’ ire.
TikTok, which traces its origins to ByteDance’s purchase of the karaoke app Musical.ly in 2017, also faces an investigation by an arm of the US government that reviews such mergers for potential national security concerns.
Federal lawmakers had encouraged such a probe, and they’ve asked US intelligence officials to open an additional investigation to determine if the Chinese government might be able to force TikTok to turn over American users’ data. TikTok has said that it stores such information in Virginia and Singapore.
Aziz said she used the makeup routine as a way to get the attention of viewers who might otherwise ignore the news. But she said she worries about how TikTok’s rules could influence the kinds of information young viewers see online.
The suspension, she said, is “just another reason for me to speak louder.”
© The Washington Post 2019
TikTok accused by amputee model of deciding who is ‘vulnerable to bullying’
Amputee, model and body positivity advocate Jess Quinn has lashed out at ever-growing social media network TikTok for censoring disabled people because it deemed them “susceptible” to bullying.
The 27-year-old New Zealander, who lost her leg to cancer when she was nine years old, shared her message on Instagram on Thursday.
Accompanying a video of her dancing with her prosthesis and wearing a hoodie with the words “all bodies welcome here” was a stark message for the video-based social media network.
“I hear you have shadow banned videos by ‘disabled, fat or LGBTQ’ users because they’re ‘vulnerable to bullying if their videos reach a wide audience’,” Quinn wrote.
Shadow banning is the practice by social media platforms of blocking users so it is not obvious to the user that they or their comments have been blocked.
“Well, on behalf of all of those people, the only bullying is your exclusion of people who you believe are ‘vulnerable’,” Quinn wrote.
“I thought I’d add a little video to your app of my ‘vulnerable’ self, wearing a sweatshirt that says ALL BODIES WELCOME HERE, while removing one of my body parts.”
Her frustrations are due to an early edition of the app’s terms and conditions, which restricted content from people “susceptible to bullying or harassment based on their physical or mental condition”.
Though Quinn says this never affected her personally, she was distressed it could have.
In response, TikTok has conceded the policy was “blunt and temporary”.
“This was never designed to be a long-term solution, but rather a way to help manage a troubling trend,” a spokeswoman told 7NEWS.com.au.
“While the intention was good, it became clear that the approach was wrong.
“We want TikTok to be a space where users can safely and freely express themselves, and we have long since changed the policy in favour of more nuanced anti-bullying policies and in-app protections.”
A blog post on the social network’s website says the user is in control of who is able to respond to their content, with blocking and reporting features available.
Quinn said the overarching goal for people with disabilities was just to be treated like anyone else.
“I thank you for your attempt at being considerate to us ‘vulnerable’ people but quite frankly we just want to be treated like everyone else,” she wrote.
TikTok Prevented Disabled Users’ Videos From Going Viral: Report
Chinese short video-sharing app TikTok has acknowledged that content produced by disabled users was deliberately suppressed by the firm’s moderators in a bid to prevent these users from becoming victims of bullying, the media reported. Facing criticism, TikTok acknowledged that its approach had been flawed, the BBC reported on Tuesday, adding that the measure was exposed by the German digital rights news site Netzpolitik.
Disability rights campaigners termed the strategy “bizarre”.
A leaked extract from TikTok’s rulebook gave examples of what its moderators were instructed to be on the lookout for: disabled people, those with Down’s syndrome and autism, people with facial disfigurements, and people with other “facial problems” such as a birthmark or squint.
Such users were “susceptible to bullying or harassment based on their physical or mental condition”, according to the rulebook.
The moderators were instructed to restrict viewership of affected users’ videos to the country where they were uploaded, according to an unnamed TikTok source quoted by Netzpolitik.
The moderators were told to prevent the clips of vulnerable users from appearing in the app’s main video feed once they had reached between 6,000 and 10,000 views, said the report.
A spokesman for TikTok admitted that it had made the wrong choice, the BBC reported.
“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” he was quoted as saying.
“This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong,” the spokesperson told the BBC.
TikTok Admits Error After Penalising Teen Who Posted Political Videos
TikTok on Wednesday acknowledged it had erred in penalising a 17-year-old who had posted witty but incisive political videos, promising it would restore her ability to access her account on her personal device. The company’s apology – coupled with a new pledge to reevaluate its policies – still failed to satisfy the teen, Feroza Aziz, who again raised concerns that she’d been the victim of censorship by the fast-growing, Chinese-owned social-media app.
“TikTok is trying to cover up this whole mess,” she told The Washington Post. “I won’t let them get away with this.”
The saga started earlier this week, when Aziz tweeted that her profile had been temporarily suspended. She attributed the penalty to the fact she had recently shared a satirical video that urged viewers to research the harrowing conditions facing Muslims in China’s detention camps. Her comment quickly garnered widespread attention because TikTok is owned by a China-based tech conglomerate, ByteDance, though the company has sought to stress recently its US operations are independent from Beijing’s strict censorship rules.
TikTok, however, said it had penalised her not for her comments about China but rather for a video she’d shared earlier – a short clip, posted to a different account, that included a photo of Osama bin Laden. Aziz’s video violated the company’s policies against terrorist content, TikTok said, so the company took action against her device, making any of her other accounts unavailable on that device. TikTok said her videos about China did not violate its rules, had not been removed and had been viewed more than a million times.
But the video in question – a copy of which she shared with The Washington Post – actually was a comedic video about dating that the company had misinterpreted as terrorism, Aziz said.
By Wednesday evening, TikTok had reversed course: The company said it restored her ability to access her account on her personal device. TikTok also acknowledged that her video about China had been removed for 50 minutes on Wednesday morning, which it attributed to a “human moderation error.”
“We acknowledge that at times, this process will not be perfect. Humans will sometimes make mistakes, such as the one made today in the case of @getmefamouspartthree’s video,” wrote Eric Han, the head of safety at TikTok U.S., referring to Aziz’s account.
“When those mistakes happen, however, our commitment is to quickly address and fix them, undertake trainings or make changes to reduce the risk of the same mistakes being repeated, and fully own the responsibility for our errors,” Han continued.
In its response, TikTok for the first time offered detail about the actions it has taken to police its platform: In November, the company said, it banned 2,406 devices associated with accounts that violated rules about terrorism, child exploitation or spam. It was as part of that sweep that Aziz’s own device had been banned, locking her out of her account there.
Aziz, however, said late Wednesday she isn’t convinced.
“Do I believe they took it away because of a unrelated satirical video that was deleted on a previous deleted account of mine? Right after I finished posting a 3 part video about the Uyghurs? No,” she tweeted Wednesday.
TikTok’s policies have drawn critical attention in Washington, where investigations have begun into whether the platform presents a national security risk.
© The Washington Post 2019