Ready or Not, Here Comes the AI Act!

D&I Alert

D&I’s summary of the changes coming your way

The European Parliament approved the Artificial Intelligence Act on 13 March 2024.

The AI Act is a huge step forward in creating a legal framework for AI technology throughout the European Union. It brings about substantial new obligations for both the developers and users of artificial intelligence (or, using the terminology of the Act itself, the providers, importers, deployers, authorised representatives and other parties listed in the Act). However, although the categorisation does cut a few corners, the AI Act can be seen as a type of “product safety” legislation. As such, it leaves a wide range of topics to be dealt with in other EU and/or national laws, or by the parties involved in a specific transaction.

Important dates: spring 2026

Although we can now rest assured that the AI Act will be adopted by the European Union, there is still work to be done. Most importantly, the Act needs to go through a final lawyer-linguist check and be formally endorsed by the Council. Once all this is done, it will enter into force on the 20th day following its publication in the Official Journal of the European Union.

As with the GDPR, the AI Act will not become applicable immediately. Instead, organisations are, as a main rule, given 24 months to prepare for it. The AI Act will, therefore, likely apply starting spring 2026 – with some exceptions, the most notable of which are:

  • The prohibitions on AI practices posing unacceptable risk will apply six months from the date of entry into force, and
  • The obligations on general-purpose AI governance will apply twelve months from the date of entry into force.

Changes coming your way

Although the European Parliament made its decision only yesterday, we at D&I have been working closely with the legal aspects of using, procuring and developing AI and other transformative technologies for years. Themes such as assessing data readiness, preparing AI and data strategies, carrying out IP risk assessments, and drafting, amending and negotiating IP, data and service agreements have kept us especially busy lately.

From a regulatory perspective, in addition to the upcoming AI Act, our work focuses especially on copyright (incl. OSS), trade secrets, employee privacy, confidentiality of communications, requirements on the contents of contracts, automated decision-making, and data protection – as well as compliance with sector-specific laws, especially those relating to the financial and health sectors and consumer commerce.

Zooming in on the AI Act itself, here are some thoughts on what companies should be focusing on in 2024.

1. Understand what is covered by the definitions of “AI system” and “general-purpose AI model” and create an AI Governance Strategy. One of the most challenging tasks of the legislators was finding a consensus on what types of systems should be regulated as “artificial intelligence”. Although the definitions are broad, there are also some limitations to the scope of applicability of the Act.

2. Map your use of AI and the risks involved with it. The adopted text takes a risk-based approach to AI: the higher the risk, the more stringent the obligations. The AI Act even ended up banning certain AI systems due to the unacceptable risk they are seen to pose to health, safety and fundamental rights. While low-risk systems are subject to rather lenient obligations revolving especially around transparency, high-risk systems must also comply with numerous other provisions on, among others, risk management, impact assessments, data governance requirements, technical documentation and record-keeping, cyber security, human oversight, and system robustness and accuracy. To comply with these new obligations, companies need to assess their current compliance level and perform a gap analysis to define a roadmap for meeting the new requirements. To do so, you must first know the risks involved with the AI you are using (or developing).

3. (Prepare to) communicate transparently. As mentioned above, regardless of what type of AI system you are using, you will most likely be subject to transparency obligations regarding your AI use. Therefore, prepare to communicate openly with your employees, customers and stakeholders on your use of AI technologies, and what effects such use has on those individuals.

4. Integrate AI Act compliance into your existing workflows. You know what they say: “If it isn’t broken, don’t fix it.” Instead of creating completely new compliance processes for AI, it is often more efficient to adapt your existing ones. Data protection, procurement and data security processes in particular often form a sturdy foundation for the use of AI systems as well. When choosing the “building blocks” of your internal AI compliance work, look especially to your policies, processes, training material, training events, and monitoring and supervision activities to ensure they all take AI into account.

5. Reuse, recycle, reduce – take good care of your data assets. AI is often only as good as the data it relies on. To ensure your AI tools can be used to their full potential, and that you stay compliant not only with the AI Act itself but with all other applicable laws, it is in many cases essential to take an early look at what data you have at your disposal, evaluate the quality and content of that data, and ensure you have sufficient rights to it.

6. Update your contracts and templates. Collaboration with your AI partners is key in ensuring compliance with the AI Act. In many situations such collaboration is best supported through a clear agreement framework which sets out unambiguous obligations for each party.

7. Think: business advantages. The Act creates new business opportunities and calls for innovative new services in an EU-wide harmonized market. Are you up for the challenge?
