TikTok Risks Legal Action If It Fails to Tackle Cybercrime

KUALA LUMPUR, Sept 5 — Social media platform TikTok may face legal action if it fails to take appropriate measures to address online crimes, said Communications Minister Datuk Fahmi Fadzil.

He said the platform has so far failed to provide enough moderators to monitor and review harmful content, among other shortcomings.

“In general, I am very dissatisfied with TikTok’s lack of seriousness in taking actions to address certain issues we have previously raised.

“One example is the case of cyberbullying against the late Rajeswary, better known as Esha. TikTok had promised to increase the number of moderators to monitor content, including TikTok Live.

“However, during a meeting with TikTok today, they still failed to state the number of moderators that have been added to monitor and review content, including live broadcasts in Tamil,” Fahmi said.

He was speaking at a press conference after a meeting with the Royal Malaysia Police (PDRM), the Malaysian Communications and Multimedia Commission (MCMC), and TikTok at the PDRM headquarters today.

Fahmi added that the ministry had repeatedly asked TikTok to provide the figures (the number of moderators), but the platform had failed to do so.

“This is a very serious matter because the Deputy Minister (Teo Nie Ching), MCMC, and I continue to receive complaints from the Indian community regarding cyberbullying on TikTok.

“Failure to address this issue could result in TikTok facing legal action. I leave it to MCMC to review what actions can be taken,” he said.

The meeting also revealed that TikTok had been negligent and slow in providing information in cases involving scams, based on reports from the Criminal Investigation Department (CID) and the Commercial Crime Investigation Department (CCID).

Fahmi said the CID, the CCID, and MCMC have been entrusted with setting the priorities TikTok must address so that the matter can be resolved promptly.

Meanwhile, based on feedback from MCMC, 76,002 pieces of content were removed from TikTok’s platform between January 1 and August 31, while another 10,730 items requested for removal have not been taken down.

“It is important to stress that content removal is carried out by the platform itself, based on its published community guidelines.

“Therefore, even though MCMC may request the removal of certain content, if the platform deems that the content does not violate its community guidelines, it will not be taken down,” he said.