Earlier today, we reported on how racially and politically charged posts have been showing up on TikTok. This was despite an earlier report in October that TikTok would not allow any political advertising on its platform in the lead-up to GE15. TikTok has now sent us a statement.
“At TikTok, we have zero tolerance against any form of hate speech and violent extremism. As it relates to May 13 content, we quickly removed videos which were in violation of our Community Guidelines. We continue to be on high alert and will aggressively remove any violative content, including video, audio, livestream, images, comments, links, or other text,” wrote TikTok’s representative.
They also added that in the lead-up to the elections, they "have been in constant communication with the relevant bodies, including MCMC about accounts that are involved in severe or repeated on-platform violations". TikTok also asks community members to use the in-app reporting function to immediately report any harmful content. To do so, press and hold on a video, and a prompt will appear with the option to tap "Report".
When I wrote this article, posted at around 4.30pm, I could still find some hateful and polarising GE15-related content. Typing "13 May" into TikTok's search bar (a reference to the 13 May 1969 incident) surfaced videos inciting a possible repeat of the tragedy.
Now, the same search returns far fewer results, which suggests that much of that content has been removed from the platform. What remains includes videos urging users to report any harmful 13 May-related content.
I also asked TikTok about the paid partnership videos, which should have gone through some form of moderation before going live. However, the company only gave a boilerplate response, with no explanation of how the paid videos were approved in the first place.
“We have zero tolerance against any form of hate speech and violent extremism and this extends to branded content. Our branded content policy makes it clear that all branded content must comply with our Community Guidelines and Terms of Service,” they said.
According to a person close to the matter, some racially charged content initially stays on the platform because it is vague enough, lacking a clear threat that specifies who is being targeted and what exactly the poster intends to do. And yes, that applies even when the content shows weapons alongside the words "Wake up, Malays".