Video-sharing service TikTok has reiterated that it uses moderation mechanisms to remove potentially harmful or inappropriate content from its platform, saying Pakistan is among the top five markets with the largest volume of videos removed for violating its community guidelines.
TikTok is a Chinese social networking app that allows users to make video clips, lip-sync to songs and create short videos.
The company’s statement comes more than two weeks after the Pakistan Telecommunication Authority (PTA) issued a “final warning” to it over obscene and immoral content on the platform. The PTA last month also banned live streaming application Bigo.
In a statement on Thursday, TikTok said its latest transparency report showed that Pakistan is one of the five markets with the largest volume of videos removed for violating its community guidelines or terms of service.
“This demonstrates TikTok’s commitment to remove any potentially harmful or inappropriate content reported in Pakistan,” the video-sharing service said.
It added that being “the leading global platform for short videos, TikTok has grown increasingly popular in Pakistan by offering a space for fun and creative expression. While users enjoy creating content on TikTok, with it comes the responsibility to keep users safe on the platform.”
To address this, TikTok said it has published an updated version of its community guidelines in Urdu, which will help maintain "a supportive and welcoming environment on TikTok for users in Pakistan".
The guidelines provide general guidance on what is and what isn’t allowed on the platform, and are localised and implemented “in accordance with local laws and norms”, according to the press release.
The company said its teams remove content that violates the community guidelines and suspend or ban accounts involved in severe or repeated violations.
“Content moderation is performed by deploying a combination of policies, technologies, and moderation strategies to detect and review problematic content, accounts, and implement appropriate penalties,” it explained.
According to the statement, TikTok’s systems automatically flag certain types of content that may violate its community guidelines. But because “technology today isn’t so advanced that a platform can solely rely on it to enforce our policies”, the company uses a team of trained moderators to help review and remove content.
As an example, TikTok said context can be important when determining whether certain content, such as satire, violates its guidelines.
The TikTok team also moderates content based on reports it receives from users. "There is a simple in-app reporting feature for users to flag potentially inappropriate content or accounts to TikTok," the statement said.
TikTok, which is owned by China’s ByteDance, has been downloaded almost 39 million times in Pakistan and is the third most downloaded app over the past year after WhatsApp and Facebook, according to analytics firm Sensor Tower.
The video-sharing service says it deleted more than 49m videos which broke its rules between July and December 2019. About a quarter of those videos were deleted for containing adult nudity or sexual activity, according to its latest transparency report.
About one-third of the videos were from India, where the Chinese app is now banned, followed by the United States and Pakistan, where it removed over 3m videos for violating its community guidelines.