The EU’s Incoming Digital Services Act And What It Means For Enterprises
Just like GDPR, the DSA will have world-changing consequences for how we deal with data
This week, Meta was fined €265mn by Ireland’s privacy watchdog over its handling of user data, bringing the total amount the technology giant has been fined by European regulators to nearly €1bn.
The fines are part of the wider enforcement of the General Data Protection Regulation, an EU-wide law that was seen as setting a global standard for online privacy (the so-called ‘Brussels effect’) when it came into force four years ago.
But GDPR is not the only tool at the EU’s disposal, and this month saw the introduction of the bloc’s Digital Services Act (DSA), a lengthy, 300-page piece of cross-sector legislation setting out rules and legal obligations for technology companies.
Notably, there is a focus on social media, user-oriented communities, and online services with an advertising-driven business model.
One of the central goals of the Digital Services Act is to put an end to the self-regulation of Big Tech and force companies to be more transparent, particularly in the realm of algorithmic accountability and content moderation.
To that end, the DSA sets out clear responsibilities for the EU and member states to enforce these rules and obligations.
So who exactly does the DSA apply to?
The Digital Services Act applies to hosting services, marketplaces, and online platforms that offer services in the EU, regardless of their place of establishment. Therefore, the effect of the Act and the expectation to comply will be felt globally.
There is a specific focus on Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), defined as services with more than 45 million average monthly active users in the EU, roughly 10% of the bloc’s population.
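As a minimal illustration of that threshold, assuming a purely numeric check (the actual designation is a formal decision by the European Commission), a service crosses into VLOP or VLOSE territory once its average monthly active EU user count exceeds 45 million:

```python
# Illustrative sketch only: designation is a formal Commission decision,
# not a simple numeric test.
VLOP_THRESHOLD = 45_000_000  # average monthly active users in the EU (~10% of the population)

def is_very_large_service(avg_monthly_active_eu_users: int) -> bool:
    """Rough check against the 45mn-user threshold for VLOPs and VLOSEs."""
    return avg_monthly_active_eu_users > VLOP_THRESHOLD

print(is_very_large_service(50_000_000))  # True: in scope of the VLOP/VLOSE obligations
print(is_very_large_service(2_000_000))   # False: still subject to the general DSA rules
```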
Key Provisions for Accountability & Compliance:
- All online platforms must publicly report how they use automated content moderation tools and the tools’ error rates.
- All online platforms will be required to disclose the number of removal orders issued by national authorities (enforcers) and the number of notices about the presence of illegal content submitted by trusted flaggers (content moderators) or obtained by automated means (a rough sketch of the data involved follows below).
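To make these reporting duties more concrete, here is a minimal sketch of the figures a platform might aggregate for such a disclosure. The class and field names are hypothetical and are not drawn from the DSA itself, and the error-rate definition shown is just one possible choice.

```python
from dataclasses import dataclass, field

@dataclass
class ModerationToolStats:
    """Hypothetical record for one automated content-moderation tool."""
    tool_name: str
    items_actioned: int
    wrongly_actioned: int  # items later overturned on appeal or review

    @property
    def error_rate(self) -> float:
        # One possible definition: share of actioned items later found to be wrong.
        return self.wrongly_actioned / self.items_actioned if self.items_actioned else 0.0

@dataclass
class TransparencyReport:
    """Illustrative aggregation of DSA-style disclosure figures."""
    removal_orders_from_authorities: int = 0
    trusted_flagger_notices: int = 0
    automated_detections: int = 0
    tools: list[ModerationToolStats] = field(default_factory=list)

    def summary(self) -> dict:
        return {
            "removal_orders": self.removal_orders_from_authorities,
            "trusted_flagger_notices": self.trusted_flagger_notices,
            "automated_detections": self.automated_detections,
            "tool_error_rates": {t.tool_name: round(t.error_rate, 4) for t in self.tools},
        }

report = TransparencyReport(
    removal_orders_from_authorities=120,
    trusted_flagger_notices=4_300,
    automated_detections=1_250_000,
    tools=[ModerationToolStats("hate-speech-classifier", 500_000, 12_500)],
)
print(report.summary())
```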
Risk Governance Approach
A risk governance approach underpins the EU Digital Services Act: it imposes regulatory responsibilities to address systemic issues such as disinformation, hoaxes and manipulation during pandemics, harm to vulnerable groups, and other emerging societal harms. These issues are categorised as online harm or harmful content in the legislation and are governed by Articles 26, 27 and 28.
Article 26
Referred to as the risk assessments provision, Article 26 stipulates that VLOPs and VLOSEs will need to conduct risk assessments annually, or whenever they deploy a relevant new functionality, to identify any systemic risks stemming from the design and provision of their services.
These assessments must identify risks related to all fundamental rights outlined in the EU Charter of Fundamental Rights, with a particular focus on freedom of expression, electoral processes and civic discourse, the protection of minors, public health, and sexual violence. As the technology-harms landscape shifts and evolves, these risks will evolve too, which means that keeping risk assessments agile is critical.
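One way to operationalise that cadence internally is a simple due-date check: an assessment is triggered at least once a year and again whenever a relevant new functionality is about to ship. The helper below is a hypothetical sketch of such an internal check, not wording from Article 26.

```python
from datetime import date, timedelta

# Hypothetical compliance helper: an assessment is due annually and whenever a
# new functionality likely to affect systemic risks is about to be deployed.
def risk_assessment_due(last_assessment: date,
                        deploying_new_functionality: bool,
                        today: date) -> bool:
    annual_deadline_passed = (today - last_assessment) >= timedelta(days=365)
    return annual_deadline_passed or deploying_new_functionality

# Example: last assessed in March 2023, and a new recommender feature is shipping.
print(risk_assessment_due(date(2023, 3, 1), True, today=date(2023, 11, 1)))  # True
```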
Article 27
As stipulated by Article 27, these risk assessments must be accompanied by mitigation measures that are reasonable, proportionate and effective. Efforts to mitigate the risks associated with harmful content should bear in mind that, to the extent such content is not illegal, it should not be treated in the same way as illegal content: the DSA’s rules only impose measures to remove, or encourage the removal of, illegal content, in full respect of freedom of expression.
Article 28
Article 28 will require VLOPs to undergo annual independent third-party audits certifying that they comply with Articles 26 and 27 as well as the overall reporting requirements. The audits would also verify compliance with Chapter III of the DSA, and the third-party auditor would have to prove independence from the industry for an audit to be considered valid.
Companies should also note that vetted researchers, including academics and civil society organizations, could gain access to relevant data to conduct their own research surrounding risk identification and mitigation.
Enforcement
The DSA will be enforced by national authorities and the European Commission: each member state must designate a competent authority to supervise and enforce the rules, while for VLOPs and VLOSEs the Commission itself will be the enforcement body.
Penalties
- Article 52: Non-compliance can attract penalties of up to 6% of annual worldwide revenue (see the sketch after this list).
- Article 54: Companies and platforms will also be exposed to civil suits and liability, as individuals, businesses, and other users can seek compensation for any damage or loss from non-compliance/infringement.
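For a sense of scale, the back-of-the-envelope calculation below applies the Article 52 cap to a purely hypothetical worldwide annual revenue of €50bn; the figure is illustrative, not a real company’s turnover.

```python
# Article 52 cap: fines of up to 6% of annual worldwide revenue.
MAX_PENALTY_RATE = 0.06

def max_dsa_penalty(worldwide_annual_revenue_eur: float) -> float:
    """Upper bound on a DSA fine for a given (hypothetical) revenue figure."""
    return worldwide_annual_revenue_eur * MAX_PENALTY_RATE

print(f"€{max_dsa_penalty(50e9):,.0f}")  # €3,000,000,000 on €50bn of revenue
```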
What comes next?
The Digital Services Act entered into force on November 16, 2022, and digital service providers now have three months to publish their numbers of active users. As per Article 93, the new rules will become applicable from February 17, 2024, although VLOPs and VLOSEs will have to comply sooner, four months after they are formally designated.
To understand this challenge in more depth, and to learn how your enterprise can be proactive in ensuring its data does not fall foul of incoming regulation, sign up for our next event: Explainable AI: How to Navigate The New World of Data Regulation.