Political agreement on the Digital Services Act (DSA)

    In News

    Recently, the European Parliament and European Union (EU) Member States reached a political agreement on the Digital Services Act (DSA).

    About

    • It is a landmark piece of legislation that will force big Internet companies to act against disinformation and illegal and harmful content, and to provide better protection for Internet users and their fundamental rights.
    • The Act, which is yet to become law, was proposed by the European Commission in December 2020.
    • The proposed Act will work in conjunction with the EU’s Digital Markets Act (DMA), which was approved earlier.

    Digital Services Act (DSA)

    • What it will do: The DSA will tightly regulate the way intermediaries, especially large platforms such as Google, Facebook, and YouTube, function when it comes to moderating user content. 
      • Instead of letting platforms decide how to deal with abusive or illegal content, the DSA will lay down specific rules and obligations for these companies to follow.
    • Applicable to: 
      • It will apply to a large category of online services, from simple websites to Internet infrastructure services and online platforms. 
      • The obligations for each of these will differ according to their size and role.
      • The legislation brings within its ambit platforms that provide Internet access, domain name registrars, and hosting services such as cloud computing and web-hosting services. More importantly, very large online platforms (VLOPs) and very large online search engines (VLOSEs) will face “more stringent requirements.”
      • Any service with more than 45 million monthly active users in the EU will fall into this category. Those with under 45 million monthly active users in the EU will be exempt from certain new obligations.

    Significance of New Rules

    • Faster action: Online platforms and intermediaries such as Facebook, Google and YouTube will have to add “new procedures for faster removal” of content deemed illegal or harmful. What counts as illegal or harmful can vary according to the laws of each EU Member State.
    • Accountable actions: These platforms will have to clearly explain their policy on taking down content; users will be able to challenge these takedowns as well. Platforms will need to have a clear mechanism to help users flag content that is illegal. Platforms will have to cooperate with “trusted flaggers”.
    • Proper information display: Marketplaces such as Amazon will have to “impose a duty of care” on sellers who are using their platform to sell products online. They will have to “collect and display information on the products and services sold in order to ensure that consumers are properly informed.”
    • Audits: Very large digital platforms and services will be obliged to “analyse systemic risks they create and to carry out risk reduction analysis”. For platforms such as Google and Facebook, this audit will need to take place every year.
    • Risk analysis: The Act proposes to allow vetted independent researchers access to public data from these platforms to carry out studies to understand these risks better.
    • Ban: The DSA proposes to ban ‘Dark Patterns’ or “misleading interfaces” that are designed to trick users into doing something that they would not agree to otherwise.
      • This includes forcible pop-up pages, giving greater prominence to a particular choice, etc. The proposed law requires that customers be offered a choice of a system which does not “recommend content based on their profiling”.
    • Crisis mechanism clause: Introduced in the context of the Russia-Ukraine conflict, this clause will be “activated by the Commission on the recommendation of the board of national Digital Services Coordinators”. However, these special measures will only be in place for three months.
      • This clause will make it “possible to analyse the impact of the activities of these platforms” on the crisis, and the Commission will decide the appropriate steps to be taken to ensure the fundamental rights of users are not violated.
    • Protection of minors: The law proposes stronger protection for minors, and aims to ban targeted advertising directed at them based on their personal data.
    • Transparency: It also proposes “transparency measures for online platforms on a variety of issues, including on the algorithms used for recommending content or products to users”.
    • Easy exit: It says that cancelling a subscription should be as easy as subscribing.

    Effect on Social Media

    • It has been clarified that platforms and other intermediaries will not be liable for the unlawful behaviour of users, so they still enjoy a ‘safe harbour’ of sorts.
    • However, if the platforms are “aware of illegal acts and fail to remove them,” they will be liable for this user behaviour. Small platforms that remove any illegal content they detect will not be liable.
    • India’s IT Rules announced last year make the social media intermediary and its executives liable if the company fails to carry out due diligence. 
      • Rule 4 (a) states that significant social media intermediaries — such as Facebook or Google — must appoint a chief compliance officer (CCO), who could be booked if a tweet or post that violates local laws is not removed within the stipulated period.
      • India’s Rules also introduce the need to publish a monthly compliance report. They include a clause on the need to trace the originator of a message; this provision has been challenged by WhatsApp in the Delhi High Court.

    Source: IE