CHENNAI | BENGALURU: Social media, content and gaming apps are grappling with heightened regulatory scrutiny as Indian courts step in to stem the flood of so-called objectionable content on these platforms in one of the world’s fastest-growing internet markets.
Companies such as Facebook and Google-owned YouTube, as well as Chinese apps such as TikTok and Bigo Live, will face even more regulatory scrutiny in the future, according to people in the know.
As India debates the contours of a proposed data law and awaits clarity on amendments to intermediary guidelines for technology platforms, those concerned about the potentially harmful effects of certain content on these apps are turning to the courts for redressal.
“The courts are filling a vacuum and that’s a good thing,” said a senior official at the ministry of electronics and IT.
‘NOT REGULATED BY ANY LEGISLATION CURRENTLY’
The official expects “there will be a vacuum until new laws are finalised”.
Apps such as TikTok and Bigo Live, whose short-video platforms are popular with youngsters, have been accused of allowing sexually explicit content to be beamed on their platforms. The result: an interim ban on downloads of TikTok — a popular video app among teenagers — by the Madras High Court. In Gujarat, several districts have banned PUBG, the multiplayer mobile game, saying it is addictive for youngsters. Legal experts, who expect scrutiny of digital platforms to only increase, say that such regulation must focus on removing illegal content rather than banning the app entirely.
“There is definitely more talk worldwide about regulating digital platforms and making them more accountable,” said Anirudh Rastogi, founder of Ikigai Law. “However, it is important to keep in mind that they are not liable legally for actions of their users,” he said.
Researchers examining cases at the intersection of internet freedom and privacy are of the view that the recent debates around the accountability of intermediaries such as TikTok, PUBG, Facebook and Instagram have been occasioned by the spread of inappropriate content on these platforms.
“Because the content could be obscene in many instances, the intermediaries are being brought under the purview (of the law) now. This is, however, not regulated by any legislation currently,” said Nitish Chandan, who manages training and policy at Cyber Peace Foundation.
Section 79 of the Indian Information Technology Act, 2000 exempts intermediaries from liability for obscene and illegal content, as long as they play no part in the creation of that content. The proposed amendments set stricter timelines for the takedown of harmful content once notified by the government.
The Madras High Court, in its April 3 order banning TikTok downloads, had also asked whether the government would enact a statute similar to the Children’s Online Privacy Protection Act in the United States to prevent children from becoming cyber/online victims.
The proposed data protection law drafted by the Justice BN Srikrishna Committee prohibits technology companies from profiling, tracking, behavioural monitoring or advertising directed at children. The draft also proposes parental consent for children signing up for apps.
TikTok said that it complies with local laws and ensures that it does not promote objectionable or illegal content on its platform.
In response to queries from ET, Sumedhas Rajgopal, lead of strategy and partnerships at TikTok India, said, “TikTok removed six million videos that violated TikTok’s Community Guidelines. Our moderation team is based in over 20 countries and regions including India, covers 15 major Indian languages, including Hindi, Tamil, Telugu, Bengali, Gujarati and more.”
Facebook, Google, WhatsApp and Tencent, which owns the developer of PUBG Mobile, did not respond to emailed queries on the measures they have taken to regulate content on their platforms, or on their views on proposed legislation to safeguard against the spread of illegal content.
Earlier, India had raised concerns over the spread of rumours on social media platforms that led to lynchings and violence. The country has asked Facebook-owned WhatsApp to allow traceability of messages so that rumours can be curbed.
In March, Facebook, WhatsApp, Google, Twitter, ShareChat and ByteDance agreed to India’s first ‘voluntary code’ on taking down ‘problematic content’ and to bring ‘transparency in political advertising’. Since February, ShareChat has taken down nearly half a million pieces of content and banned 54,400 accounts. In the first week of April, Facebook said it removed over 700 pages that violated its policies after it detected ‘coordinated inauthentic behaviour’ ahead of the general elections.
Voicing its concerns, the Internet Freedom Foundation (IFF), an Indian digital liberties organisation, on Wednesday sent a letter to MeitY arguing that outright bans on applications “provide no resolve”, and urging the ministry to consider alternative measures.
The organisation has requested the ministry to adopt a “rights respecting stand against app bans”, commence a transparent consultation process, and activate the defunct Cyber Regulations Advisory Committee, which was tasked with providing subject matter expertise to the government on issues related to the use of technology.
Despite citizen concerns over digital freedom, governments worldwide are moving towards greater regulatory control of content platforms. On April 3, Australia passed a law that makes social media companies liable for fines of up to 10% of profits, and their executives to arrest and jail terms of up to three years, if they fail to remove “abhorrent violent material” from their platforms.
Singapore has drafted a law under which internet and social media companies could face fines of up to S$1 million, and officials who do not comply with orders to remove fake news could face jail terms of up to six years. It also mandates that those who spread fake news file a correction online.
Source: Economic Times