Europe’s Big Tech law has been passed. Now comes the hard part

The potential gold standard for online content governance in the EU – the Digital Services Act – is now a reality after the European Parliament voted overwhelmingly in favor of the legislation earlier this week. The last hurdle, a formality, is for the EU’s Council of Ministers to sign the text in September.

The good news is that the groundbreaking legislation includes some of the most comprehensive transparency and platform liability obligations to date. It gives users real control and visibility over the content they interact with and protects against some of the most pervasive and harmful aspects of our online spaces.

The focus is now shifting to implementation of the sweeping law, as the European Commission starts seriously developing the enforcement mechanisms. The proposed regime is a complex structure in which responsibilities are shared between the European Commission and national regulators, in this case Digital Services Coordinators (DSCs). It will rely heavily on creating new roles, expanding existing responsibilities, and seamless collaboration across borders. What is clear is that, at the moment, the institutional capacity to effectively enforce this legislation simply does not exist.

In a “taster,” the Commission has given a glimpse of how it proposes to overcome some of the more obvious implementation challenges: how it plans to monitor major online platforms, and how it will try to avoid the problems that plague the GDPR, such as out-of-sync national regulators and selective enforcement. But the proposal only raises new questions. A huge number of new staff will need to be hired, and a new European Centre for Algorithmic Transparency will have to attract world-class data scientists and experts to help enforce the extensive new obligations on algorithmic transparency and data access. The Commission’s preliminary vision is to organize its regulatory responsibilities into thematic areas, including a social issues team that will oversee some of the new due diligence obligations. Insufficient resources are a real concern here, and risk turning these hard-won commitments into empty box-ticking exercises.

A crucial example is the obligation on platforms to conduct assessments addressing systemic risks on their services. This is a complex process that must take into account all fundamental rights protected by the EU Charter. To do this, tech companies will need to develop Human Rights Impact Assessments (HRIAs) – an assessment process designed to identify and mitigate potential human rights risks arising from a service or company, in this case a platform – something civil society urged them to do during the negotiations. However, it falls to the European Board for Digital Services, composed of the DSCs and chaired by the Commission, to annually assess the most prominent systemic risks identified and to lay out best practices for mitigation measures. As someone who has contributed to the development and review of HRIAs, I know this will be no small feat, even with independent auditors and researchers involved in the process.

If they are to make an impact, the assessments must establish comprehensive baselines, concrete impact analyses, evaluation procedures, and strategies for stakeholder engagement. The very best HRIAs take a gender-sensitive approach and pay specific attention to systemic risks that will disproportionately affect members of historically marginalized communities. This is the most concrete way to ensure that no potential rights violation is overlooked.

Fortunately, the international human rights framework, including the UN Guiding Principles on Business and Human Rights, provides guidance on how best to develop these assessments. Nevertheless, the provision’s success will depend on how platforms interpret and invest in these assessments, and even more so on how well the Commission and national regulators enforce the obligations. With their current capacity, however, the institutions’ ability to develop guidelines and best practices and to evaluate mitigation strategies is nowhere near the scale the DSA demands.