- TikTok has invited British lawmakers into its offices to review its algorithm.
- The invite was extended after a parliamentary committee questioned TikTok on its links to China's ruling Communist Party.
- TikTok is the international version of Chinese app Douyin, which is also owned by China's ByteDance.
LONDON — TikTok has invited British lawmakers into its offices to review its algorithm after a parliamentary committee questioned the social media app's links to China's ruling Communist Party.
Elizabeth Kanter, TikTok's U.K. director of government relations and public policy, invited the committee to visit its "transparency center" and review its code, as well as how it moderates content.
The invite came after Conservative Member of Parliament Nusrat Ghani questioned Kanter on whether TikTok or its Chinese parent company ByteDance censors content that China doesn't like. That includes content related to the persecution of Uighur Muslims in Xinjiang, where over a million people are being held in so-called "re-education" camps.
"We do not censor content," Kanter told the committee Thursday on a video call that was broadcast online. "I would encourage you to open the app and search for Tiananmen Square, search for Tibet, you will find that content on TikTok."
"We do not moderate content based on political sensitivities or affiliation," Kanter said. "There's no influence of the Chinese government on TikTok."
While Kanter denied that TikTok currently censors this kind of content, she acknowledged the app had suppressed content on the Uighur crisis in Xinjiang in the past.
"In the early days of TikTok there was some policies in place that took what we call a 'blunt instrument' to the way in which content was censored," she said. "At that time we took a decision ... to not allow conflict on the platform, and so there was some incidents where content was not allowed on the platform, specifically with regard to the Uighur situation."
"If you look at the platform now and search for the term 'Uighur' on the TikTok app, you can find plenty of content about the Uighurs. There's plenty of content that's critical of China," Kanter added.
"We do not in any way, shape, or form censor content or moderate in a way that would be favorable to China. You are right to say that those were a couple of years ago the content moderation guidelines, but they're absolutely not our policy now."
Kanter said TikTok's policy changed at least a year ago, but she did not give an exact date. However, The Guardian newspaper obtained leaked TikTok documents in September 2019 that showed how TikTok used to censor videos critical of Beijing.
The documents reportedly listed Tibet and the Tiananmen Square massacre as examples of content to remove. When The Guardian article was published, TikTok said the documents were outdated and that the policies in them had been dropped in May 2019.
In a statement provided after the hearing, Kanter said: "During the course of today's hearing, I made an incorrect statement in response to a specific line of questioning about an outdated content policy."
Kanter added: "TikTok has previously acknowledged that in our very early days, we took a blunt approach to moderating content that promoted conflict, but we've also said we recognized this was the wrong approach and eliminated it. However, we want to be absolutely clear that even in those early policies, there was never a policy around the Uighur community, which is where I misspoke."
TikTok is the international version of Chinese app Douyin, which is also owned by ByteDance.
Kanter said that, due to its corporate structure, TikTok is under no obligation to share user data with the Chinese government, and that the company would refuse to hand over data if asked. TikTok user data is currently stored on servers in the U.S., with a backup in Singapore.
"None of our user data goes to China," she said. "The Chinese government has never asked us for any user data and of course if they did, we would not give them any data."
TikTok isn't the only tech company facing scrutiny over its content moderation decisions. Facebook, Twitter, YouTube and other social media companies are all being scrutinized more than ever.
The platforms, which have billions of users between them, are constantly trying to remove inappropriate content, including terrorist acts, nudity, hate speech, drug abuse and misinformation.