As was the case with the burgeoning issue of consumer data privacy and the adoption of the GDPR, the European Union is taking the lead in the battle to regulate the internet. The adoption of the “Digital Markets Act” and “Digital Services Act” by ministers of the Council of the European Union is another key step toward stricter regulation of tech – including social media platforms, internet hosting providers, search engines, and ecommerce marketplaces. The measures aim to radically reframe the legal landscape governing the internet.
Rules governing Europe’s online landscape have been largely unchanged since 2000, when the EU eCommerce Directive came into effect. Meanwhile, significant new challenges – the widespread sale of illegal and counterfeit goods, hate speech, and disinformation – have proliferated in the internet age. Persistent controversies over the power of tech giants to influence and shape the world, online and offline, have made the need for new regulations apparent.
The twin legislative proposals aim to fulfill two main goals: “to create a safer digital space in which the fundamental rights of all users of digital services are protected and to establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.”
The Digital Markets Act lays out a regulatory framework for preserving competition in the digital space, while the Digital Services Act focuses on preventing the proliferation of illegal content and products to create a safer internet.
A Closer Look at the Digital Services Act
The Digital Services Act relates to issues surrounding the regulation of ecommerce, social media, and other digital service providers. The act establishes a broad range of regulations to govern the online space. Some of the most noteworthy elements are explored below:
The DSA seeks to regulate digital services that “act as intermediaries in their role of connecting consumers with goods, services, and content.” While this covers a wide range of service types, the most obvious players that fit into this category are ecommerce marketplaces and social media platforms. The act sets out new requirements for due diligence, monitoring, and reporting as they relate to sellers and content providers.
Of note for marketplaces are the new due-diligence requirements for third-party sellers. These include collecting and verifying seller contact, financial, identity and company details. Furthermore, platforms must share the following with consumers:
- Name, address, telephone number, email address of the trader/seller
- The company registration number in the relevant trade register
- A self-certification by the trader/seller committing to only offer products or services that comply with the applicable rules of EU law.
The new requirements are meant to offer transparency regarding the identity of sellers and weed out bad actors.
Heightened Focus on “Very Large Online Platforms”
Noting that certain platforms “have a particular impact on the economy and society and pose particular risks in the dissemination of illegal content and societal harms”, the Digital Services Act places heightened obligations on the largest platforms and players.
According to the DSA, “Very Large Online Platforms” (VLOPs) are defined as those with a user base of roughly 10% of the European Union population.
VLOPs will have additional reporting, monitoring and other obligations. These include:
- Yearly independent audits
- Structural risk assessments
- Sufficient content moderation mechanisms to address identified risks
- A dedicated DSA compliance officer
- Reporting provisions and data sharing with relevant authorities
Hosting Services – New Responsibility
A major shakeup brought about by the DSA is the formal assignment of responsibility to hosting providers. The act stipulates that not only platforms and marketplaces but also hosting providers are obligated to notify authorities of illegal activity. This seeks to remedy the current situation, in which companies hosting websites that spread hate, incite violence, or offer illegal products and services have largely fallen outside the regulatory framework.
Reporting and Flagging Mechanisms
Platforms will be required to have mechanisms in place for detecting and removing illicit content. These include the use of outside organizations deemed “trusted flaggers” as well as automated tools. The proper handling of user-generated reports is also addressed in the proposal.
Penalties for Non-Compliance
Companies deemed to violate the new acts could face severe fines. The proposed sanctions for violating their statutes can amount to up to 10% of a company’s global revenues.
When will it take effect?
While the path to finalizing and adopting the final texts of the two acts still faces hurdles, they could be adopted as early as mid-2022.
The Digital Services Act marks a turning point in the internet age and heralds further regulation and enforcement of the digital space. EverC is tracking these developments closely and working to help our clients meet these changing regulatory challenges and ensure a safer internet for all.
For more information regarding the Digital Services Act or how EverC can assist your organization, contact us at email@example.com or me directly at firstname.lastname@example.org.