Behind the Scenes of TikTok’s Operations in Southeast Asia



In mid-February, ByteDance, the parent company behind the short-form video and social media platform TikTok, gave us the opportunity to visit its Singapore office, specifically its Transparency and Accountability Centre, and see a little of what goes on behind the scenes of the digital technology company’s operations.


ByteDance’s Singapore office is located at One Raffles Quay and has several floors. Our one-day visit there allowed us to see their Transparency and Accountability Centre, where we were briefed on a number of their operations at the office, including content moderation and user safety on the TikTok platform.


Content moderation on the platform was among the first topics discussed during our visit. TikTok confirmed that moderation is split into two parts: machine moderation, which relies on AI technology, and human moderation, which relies on human judgment.


To train TikTok’s AI moderation system, they used 100 images and videos to teach it what content is categorized as harmful. From these examples, the system learns to judge, in context, whether a piece of content is appropriate to watch or harmful.


Take a knife as an example: a camera connected to TikTok’s moderation system can detect the knife and determine whether it is being held in a threatening manner or a casual one, such as for cutting food.


We were not allowed to record these examples, but what I can tell you is that the detection rate for threatening content is very high, exceeding 90 percent. Content that the AI system may not be able to classify confidently is sent to a human moderation team for review.


TikTok’s reliance on its AI moderation system is not without basis. According to their Community Guidelines Enforcement Report for the third quarter of 2024, the system scans 1.6 million pieces of content every day worldwide with an accuracy exceeding 90 percent, and 80 percent of content found to violate the platform’s rules is removed automatically.
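The two-stage flow described above — automatic removal for high-confidence violations, human review for uncertain cases — can be sketched in a few lines. This is a toy illustration under assumed thresholds and field names; TikTok does not publish its actual pipeline or confidence values.

```python
# Toy sketch of a two-stage moderation pipeline: content the AI model
# flags with high confidence is removed automatically, while uncertain
# cases are queued for human review. All thresholds are illustrative.

from dataclasses import dataclass


@dataclass
class ModerationResult:
    content_id: str
    violation_score: float  # model confidence (0.0-1.0) that content violates rules


def route(result: ModerationResult,
          auto_remove_threshold: float = 0.9,
          review_threshold: float = 0.5) -> str:
    """Decide what happens to a piece of scanned content."""
    if result.violation_score >= auto_remove_threshold:
        return "auto_remove"   # removed without human involvement
    if result.violation_score >= review_threshold:
        return "human_review"  # sent to a human moderation team
    return "allow"             # left on the platform


print(route(ModerationResult("vid_1", 0.97)))  # auto_remove
print(route(ModerationResult("vid_2", 0.70)))  # human_review
print(route(ModerationResult("vid_3", 0.10)))  # allow
```

The key design point is that the human team only sees the ambiguous middle band, which keeps reviewer workload proportional to the model’s uncertainty rather than to total upload volume.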


Depending on the type of content involved, TikTok may also suspend accounts that upload violating material. A recent example is the incident in which 18 TikTok accounts belonging to local news outlets were suspended for uploading a video of a child being kidnapped in a surau.


We were told that this incident occurred because, while TikTok’s system is good at recognising what is happening in a video, at the time of writing (when we asked this question) it still could not distinguish why a particular clip was uploaded. We were also told that TikTok has since restored the media accounts, which are now accessible as normal.


TikTok is also working with a number of local NGOs to ensure that content that may be detrimental to national harmony is given the proper context before it is removed or allowed to remain on their platform.


Speaking of manual content moderation, we were given the opportunity to test our own ability to moderate such content, which ranged from weapons and drugs to content harmful to children and more.


In my opinion, policing this content is harder than expected, because our emotions can sometimes override the rules and criteria set out for handling the content on screen. TikTok’s content moderators have built-in tools, such as frame-by-frame video scrubbing and features for sharing opinions with colleagues, to make those decisions easier.


Yes, the mistaken suspension of the 18 media accounts does expose shortcomings in TikTok’s AI moderation system, but it also shows TikTok’s firmness in not allowing content depicting children in distress or being abused onto the platform.


From the same report, TikTok said that a total of 2.1 million videos from Malaysian users were removed between July and September 2024; 99.2 percent of these were removed proactively by the AI moderation system, and 89.1 percent were removed within 24 hours.


Meanwhile, TikTok also shared with us how it keeps children safe on the platform. For example, users aged 13-15 cannot access features such as private messaging, the For You page, and video downloads. Their accounts are also set to private, with no option to switch to public until they are 16-17 years old.


Last year, TikTok also introduced a Family Pairing feature that allows parents to control access to their minor children’s accounts and to stay aware of what content their children are watching.


For those who often watch mental health-related content, TikTok has also partnered with a number of local NGOs such as Crisis Helpline Heal 1555, Malaysian Mental Health Association, Doctor Anywhere Malaysia and MIASA to help people if they need professional help regarding their mental health.


We were also told how TikTok content goes viral. A common misconception involves the heavy use of hashtags: we were told that using many hashtags does make content easier to find, but it matters little in making a video go viral.


Signals such as follows, likes, comments, shares, the music used, and whether a video is watched to completion all affect whether your video will be fed to more users. In short, the more users interact with your video positively, the higher the probability that it will be shown to more users, and vice versa.
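One simple way to picture how these signals combine is as a weighted sum over interaction counts. The weights below are entirely made up for illustration; TikTok does not publish its actual ranking formula.

```python
# Illustrative engagement score built from the interaction signals the
# article lists. Weights are invented for demonstration only.

SIGNAL_WEIGHTS = {
    "follow_from_video": 5.0,
    "watched_to_end": 4.0,
    "share": 3.0,
    "comment": 2.0,
    "like": 1.0,
}


def engagement_score(events: dict) -> float:
    """Sum weighted interaction counts; a higher score suggests the
    video is more likely to be pushed to additional users."""
    return sum(SIGNAL_WEIGHTS.get(name, 0.0) * count
               for name, count in events.items())


video_a = {"like": 120, "comment": 10, "share": 5, "watched_to_end": 300}
video_b = {"like": 400}
print(engagement_score(video_a))  # 1355.0
print(engagement_score(video_b))  # 400.0
```

Note how `video_a` outranks `video_b` despite having fewer likes, because completions and shares carry more weight in this sketch — the same intuition as "positive interaction beats raw volume" in the paragraph above.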


We also asked how TikTok’s algorithm shows content to new users and learns what types of videos they are likely to watch. New users are shown eight videos, and depending on how they interact with those videos (as explained earlier), the algorithm infers which ones they preferred; the next batch of eight then shows similar videos, or videos from the same content creators.
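The cold-start behaviour described above can be sketched as batches of eight drawn from a catalog, with later batches biased toward what the user watched. Here "similarity" is reduced to a shared genre tag — a stand-in assumption, since the features TikTok actually uses are not public.

```python
# Toy sketch of cold-start recommendation: a first random batch of
# eight videos, then a second batch favouring genres the user watched.
# Genre tags stand in for whatever similarity features TikTok uses.

import random

CATALOG = [
    {"id": i, "genre": g}
    for i, g in enumerate(["comedy", "cooking", "sports", "music"] * 10)
]


def first_batch(catalog, size=8):
    """A new user starts with a random sample of the catalog."""
    return random.sample(catalog, size)


def next_batch(catalog, watched, size=8):
    """Favour videos sharing a genre with what the user watched."""
    liked_genres = {v["genre"] for v in watched}
    similar = [v for v in catalog if v["genre"] in liked_genres]
    pool = similar if len(similar) >= size else catalog
    return random.sample(pool, size)


batch1 = first_batch(CATALOG)
# Pretend the user only watched videos matching the first video's genre.
watched = [v for v in batch1 if v["genre"] == batch1[0]["genre"]]
batch2 = next_batch(CATALOG, watched)
print(all(v["genre"] in {w["genre"] for w in watched} for v in batch2))  # True
```

After a few such rounds, a real system would widen the pool again (as the article notes TikTok does) to avoid locking the user into a single genre.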


After a few of these batches have been shown, your For You page will show you videos it thinks you will like, drawn from a variety of genres, giving the impression of a feed that is not only fun to watch but also relevant to your tastes.


To make sure you’re not just watching the same kinds of videos, TikTok will also periodically reset your video feed so that you see a wider variety of content, not just what you think you want. TikTok says this refreshes your For You page and helps you discover more interesting new content.


In the meantime, if you want to refresh your For You page’s content feed yourself, you can also do so in the settings menu, and the data previously fed to TikTok’s algorithm will be cleared.


If you’re interested in learning more about how TikTok recommends content for you, you can visit this website. It also explains a bit about how the algorithm ensures that you’re seeing content that’s safe and non-offensive.


That’s a little of what we learned during our brief time at ByteDance and TikTok’s Singapore office. The office also houses a management division overseeing the Southeast Asia region, but the Transparency and Accountability Centre is among the most important operations conducted there.
