What a TikTok exec told the British government about the app that we didn’t already know

Key Points
  • Theo Bertram, TikTok's director of government relations and public policy in Europe, explained how groups operating on the dark web made plans to raid social media platforms including TikTok. 
  • In early September, TikTok struggled to deal with a graphic suicide video that kept being uploaded to the platform in different ways. 
  • Bertram also spoke about TikTok's corporate structure and where user data is stored. 
Theo Bertram, TikTok's director of government relations and public policy for Europe.
UK Government

LONDON — Theo Bertram, TikTok's director of government relations and public policy in Europe, provided the world with a few insights into the wildly popular short video app as he was grilled by British politicians in Westminster.

The former Google exec, who used to advise former Prime Ministers Tony Blair and Gordon Brown, attempted to fend off questions on everything from political censorship to how TikTok goes about removing harmful content.

Cool, calm, and collected, Bertram at times appeared to run rings around the "digitally literate" politicians as they asked basic questions about how TikTok works. At one point, one politician confused a MAC (media access control) address, a hardware identifier used to identify a device on a network, with an Apple Mac computer, while another said that TikTok's recommendation algorithm, which learns what you're interested in, was serving him "trashy" videos including "content where there's a mother and daughter pushing their tushy, if you know what I mean."

Giving evidence to the Commons committee for Digital, Culture, Media and Sport (DCMS) via a video call that was broadcast online, Bertram revealed how his company struggled to deal with a graphic suicide video that went viral on TikTok in early September after it was broadcast live on Facebook a week earlier.

"We learned that groups operating on the dark web made plans to raid social media platforms, including TikTok, in order to spread the video across the internet," Bertram said on Tuesday.

"What we saw was a group of users who were repeatedly attempting to upload the video to our platform, and splicing it, editing it, cutting it in different ways," he added. "I don't want to say too much publicly in this forum about how we detect and manage that, but our emergency machine-learning services kicked in, and they detected the videos."

TikTok's interim head Vanessa Pappas wrote a letter to other tech companies on Monday proposing a "global coalition" to help protect users from harmful content.

"Last night, we wrote to the CEOs of Facebook, Instagram, Google's YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit," Bertram said. "What we are proposing is that, in the same way these companies already work together around [child sexual abuse imagery] and terrorist-related content, we should now establish a partnership around dealing with this type of content."

If such a partnership had been in place in early September, Facebook could have shared technical information on the suicide video with TikTok that might have prevented it from being uploaded in the first place.

Policing content

Content moderation is a big deal for social media companies and striking the right balance between deleting and allowing content isn't always easy. Like Facebook and YouTube, TikTok relies on algorithms and thousands of human moderators to spot and remove content that breaches its policies.

In its second transparency report published Tuesday, TikTok said it removed over 104 million videos in the first six months of the year. It stressed that's less than 1% of all content uploaded to its platform.

Videos containing "adult nudity and sexual activities" accounted for 30.9% of all videos removed, while "suicide, self-harm and dangerous acts" made up 13.4%. Content featuring drug use and hate speech was also taken down.

Sometimes governments and law enforcement agencies ask TikTok to take down videos too. Between Jan. 1 and June 30, TikTok removed over 500 pieces of content following government requests. Around half of those were removed in Russia, which has a reputation for discriminating against LGBTQ communities.

"The Russian law is terrible... but unfortunately we have to comply with legal requests in the country we operate," said Bertram.

Growing business

Bertram said that TikTok now has more than 10,000 people working on trust and safety worldwide, with 363 moderators in the U.K., where users posted 1.6 million videos a day on average in the first half of the year.

In total, TikTok now has more than 1,600 staff in Europe spread across offices in London, Dublin, Paris, and Berlin. London is the biggest hub, however, with around 800 people based there. "We're probably one of the fastest growing businesses in the country," Bertram said.

Asked if TikTok would make London its international headquarters one day, as previous media reports had suggested it might, Bertram said: "We're focused on the challenge we have in the U.S. before we look at the international regime."

European user data

One member of the committee, Damian Green, accused TikTok of breaking the law by storing European user data on servers in the U.S. 

Bertram denied the company had done anything illegal and stressed that it's a complicated legal matter. "Your understanding of complexity of EU law is not quite right," said Bertram.

TikTok declined to comment when CNBC asked who owns and operates the servers, but a report from The Information in July said TikTok had agreed to spend more than $800 million on Google Cloud.

From 2022, European user data will be stored and processed in a new 420 million euro Irish data center, Bertram said.

Corporate structure

The committee also tried to understand TikTok's increasingly complex corporate structure.

TikTok Ltd and parent company ByteDance Ltd are both nominally headquartered in the Cayman Islands, where corporation tax is 0%.

An org chart on ByteDance's website shows there are four main TikTok entities that all sit under TikTok Ltd. They are: TikTok LLC; TikTok Australia Pty Ltd; TikTok Pte Ltd; and TikTok Information Technologies U.K. Ltd.

Here's how they're set up:

  • Located in the U.S., TikTok LLC owns TikTok Inc., which is the U.S. business.
  • Located in Australia, TikTok Australia Pty Ltd owns the Australia and New Zealand business.
  • Located in Singapore, TikTok Pte Ltd owns TikTok's operating entities in Southeast Asia and India.
  • Located in the U.K., TikTok Information Technologies U.K. Ltd owns TikTok's operating entities in the European Union.

However, there's talk of a new "TikTok Global" business being set up too if a deal with Oracle and Walmart goes ahead. Under the deal, which is yet to be approved by Beijing, Oracle would take a 12.5% stake in TikTok Global and Walmart would get a 7.5% stake.

ByteDance also confirmed that it would carry out a small round of pre-IPO (initial public offering) financing, after which TikTok Global would become an 80%-owned subsidiary of ByteDance, giving it majority control.

Over the weekend, Trump said the new TikTok Global will "have nothing to do with any outside land, any outside country, it will have nothing to do with China. It'll be totally secure. That'll be part of the deal."

ByteDance's majority ownership of TikTok appears to contradict that. But ByteDance is 40% owned by U.S. venture capital firms, so the Trump administration can technically claim TikTok Global is now majority owned by U.S. money: 40% of ByteDance's 80% stake works out to roughly 32% indirect U.S. ownership, which, added to Oracle and Walmart's combined 20%, comes to just over half.

— CNBC's Arjun Kharpal contributed to this article.