One of the world’s most popular smartphone apps has a child-safety problem that India isn’t prepared to deal with.
On Feb. 27, the US Federal Trade Commission (FTC) fined TikTok, a social network that allows users to create and share music videos with their followers, $5.7 million (Rs40 crore) to settle allegations that it violated children's privacy law. The app had been accused of collecting personal data from users under the age of 13 without seeking parental consent. This is the largest civil penalty the FTC has ever collected in a children's privacy case.
India is TikTok's biggest market, accounting for almost 40% of the app's 500 million users.
And the country, too, has reasons for concern over the app being used to spread hate speech, fake news, and child porn, apart from endangering users physically through various viral hashtag challenges. The government in the southern Indian state of Tamil Nadu has even suggested banning the app because of its often sexually explicit content, among other things.
However, experts point out that India has no cyber laws that specifically protect children's privacy.
“The protection under existing laws is restricted to content that exposes children in an obscene, indecent or sexually explicit manner, involves abuse, sexual harassment, or child pornography,” Suneeth Katarki, founding partner at Bengaluru-based Indus Law, told Quartz.
TikTok, developed by Beijing-based tech unicorn ByteDance, is all the rage among Bollywood-crazed Indians, who post videos of themselves lip-syncing to songs or reciting movie dialogues. Its ubiquity adds to the concerns over its safety lapses.
In response, the app has set up a moderation team in India that covers major regional languages, including Hindi, Tamil, Telugu, Bengali, and Gujarati, working out of its two offices in Mumbai and Delhi. Earlier this month, the app partnered with Jharkhand-based cybersecurity think tank Cyber Peace Foundation to launch educational posters on online safety to be distributed in schools and colleges.
It is also looking to appoint a “chief nodal officer” who would work with the Indian government to address concerns of child safety. Recently, TikTok India appointed Sandhya Sharma, a former Mastercard employee, as its public policy director.
“As a global community, safety is one of TikTok’s topmost priorities,” Sharma said in early February. “In addition to user education, we at TikTok are continuously working to introduce additional features to promote safety. TikTok’s first of a kind Digital Wellbeing feature, which limits the time users can spend on the app, is one such example.”
Putting kids first
TikTok isn’t alone in watching out for kids. Video-sharing platform YouTube decided to disable comments on videos featuring minors after video blogger Matt Watson detailed how paedophiles enter a “wormhole” of YouTube videos to see footage of children doing innocuous activities presented in sexually suggestive positions.
The Google-owned site has since banned over 400 accounts and taken down dozens of videos that put children at risk.
But moderation isn’t the sole issue to grapple with, according to Dylan Collins, CEO of child-tech company SuperAwesome. It’s “rather a lack of responsibility on behalf of platforms to implement the correct technology in order to protect kids online,” he said. Collins’ firm creates safe digital experiences for companies such as Disney, Mattel, Hasbro, and Cartoon Network, whose primary customers are children.
TikTok's app in India, meant for users aged 13 and above, includes age-gating measures at signup, the company told Quartz in a statement. It has also set a 12+ App Store rating, which lets parents block the app from a child's phone using device-based parental controls.
“Parents/legal guardians can help guide teens to use the app in an age-appropriate manner, and notification banners have been added to videos that may be inappropriate for younger audiences,” TikTok told Quartz. “We also have a ‘Digital Wellbeing’ function that allows users or parental guardians to control time spent on TikTok, as well as to limit the appearance of content that may not be appropriate for all audiences.”