In News
- Recently, the European Union (EU) gave final approval to the Digital Services Act (DSA), online safety-focused legislation that overhauls the region’s social media and e-commerce rules.
Key features of the Digital Services Act
- Faster removal of content:
- As part of the overhaul, social media companies will have to add “new procedures for faster removal” of content deemed illegal or harmful.
- They will also have to explain to users how their content takedown policy works.
- The DSA also allows users to challenge takedown decisions taken by platforms and seek out-of-court settlements.
- Bigger platforms have greater responsibility:
- One of the most crucial features of the legislation is that it avoids a one-size-fits-all approach and places increased accountability on Big Tech companies.
- Under the DSA, ‘Very Large Online Platforms’ (VLOPs) and ‘Very Large Online Search Engines’ (VLOSEs) — that is, platforms with more than 45 million users in the EU — will face more stringent requirements.
- Direct supervision by European Commission:
- These requirements and their enforcement will be centrally supervised by the European Commission itself, a key way to ensure that companies do not sidestep the legislation at the member-state level.
- More transparency on how algorithms work:
- VLOPs and VLOSEs will face transparency measures and scrutiny of how their algorithms work, and will be required to conduct systemic risk analysis and mitigation to drive accountability for the societal impacts of their products.
- VLOPs must allow regulators to access their data to assess compliance and let researchers access their data to identify systemic risks of illegal or harmful content.
- Clearer identifiers for ads and who’s paying for them:
- Online platforms must ensure that users can easily identify advertisements and understand who presents or pays for the advertisement.
- They must not display personalised advertising directed towards minors or based on sensitive personal data.
Significance of the move
- The law tightly regulates the way intermediaries, especially large platforms such as Google, Meta, Twitter, and YouTube, function in terms of moderating user content.
- It will give better protection to users and to fundamental rights online, establish a powerful transparency and accountability framework for online platforms and provide a single, uniform framework across the EU.
Comparison of EU’s DSA with India’s Online Laws
- India notified extensive changes to its social media regulations in the form of the Information Technology Rules, 2021 (IT Rules), which placed significant due diligence requirements on large social media platforms such as Meta and Twitter.
- These included appointing key personnel to handle law enforcement requests and user grievances.
- Enabling identification of the first originator of the information on its platform under certain conditions.
- A platform may be required to trace the originator if, for example, a user has shared child sexual abuse material on the platform.
- WhatsApp has alleged that this requirement would weaken encryption on its platform and could compromise the personal messages of millions of Indians.
- Deploying technology-based measures on a best-effort basis to identify certain types of content.
- One of the most contentious proposals is the creation of government-backed grievance appellate committees which would have the authority to review and revoke content moderation decisions taken by platforms.
Way Forward
- India is also working on a complete overhaul of its technology policies and is expected to soon replace the IT Act, 2000 with legislation that addresses, among other things, net neutrality and the algorithmic accountability of social media platforms.
Source: IE