A person close to the matter has revealed to SoyaCincau why some racially and politically charged content on TikTok isn't being moderated, following the wave of such content seen and reported on the platform after GE15.
At the time of writing, typing "13 Mei" into TikTok's search bar still surfaces content that encourages and threatens violence with weapons in light of the GE15 election. "13 Mei" refers to the 13 May incident in 1969, a riot that broke out in the aftermath of that year's Malaysian general election.
The content has stayed up on TikTok until now, even after numerous media reports and a statement by the Royal Malaysian Police. Twitter users also reported that several TikTok videos encouraging race-based hatred post-GE15 were labelled as "paid partnerships", meaning the creators were being paid to produce the polarising content.
So, why are some of those videos still up on TikTok?
A person close to the matter revealed that many of the videos do not actually violate TikTok's policies because they are deliberately vague, and they stay up unless there are orders for them to be taken down. Some of the vaguer videos, such as those showing weapons alongside the phrase "Wake up, Malays", do not contain a clear threat specifying who is being targeted or what exactly the creators intend to do.
The same person stated that TikTok's moderators only review content flagged by the platform itself, which means TikTok's automated systems could miss important videos. That is a likely reason why some of this content is still up.
We've reached out to TikTok representatives, who have yet to give us a statement regarding the post-GE15 racially charged videos. They would also need to explain how some "paid partnership" TikToks carrying hateful political messages made it past TikTok's censors.
Paid content on the platform should have gone through some form of moderation, but it seems getting an ad approved on TikTok is easier than getting one approved on Facebook—which I've tried before. Even Facebook used to be far more lenient about approving questionable ads before it ramped up its review measures.