Internet Regulation Updater
First compliance deadlines for the European Union’s AI Act pass – are you compliant?
The first compliance deadlines of the European Union’s Artificial Intelligence Act (‘Act’), which entered into force in August last year, are here. As of 2 February 2025, the AI literacy and Prohibited AI Practices provisions outlined in Articles 4 and 5 (respectively) of the Act became applicable.
What is the effect of the Prohibited AI Practices Article?
Under the AI Act’s risk-based approach, certain AI practices deemed to present an unacceptable risk to people’s health, safety or fundamental rights are now prohibited in the EU. Examples include using AI to manipulate someone in ways they are not aware of, or monitoring someone’s emotions in the workplace using their biometric data. Organisations either placing these systems on the market or using them in the EU face significant fines of up to €35 million or 7% of global annual turnover, whichever is greater.
In order to comply with the requirements, by 2 February organisations in scope of the Act should have:
- Identified any Prohibited AI Practices in use across their organisation; and
- Decommissioned or remediated them.
In order to manage ongoing compliance, organisations will need to:
- Establish a process, as part of their AI governance, to ensure that any future development or procurement of AI does not introduce prohibited practices.
Deloitte can support organisations in managing their compliance with these requirements, including by helping to put in place an AI inventory, designing a process for classifying AI systems against the risk categories under the EU AI Act, supporting legal determinations of potentially Prohibited AI Practices, and assisting with the decommissioning of AI systems.
What does AI Literacy under the Act mean?
In order to manage the risks of AI (and realise its benefits), the Act expects providers and deployers of AI systems to ensure a sufficient level of AI literacy among their staff and other persons dealing with the operation and use of AI systems on their behalf. Organisations in scope of these requirements need to plan an AI literacy programme that reflects the different ways in which people in their organisation interact with AI and manage compliance with it.
In order to comply with the requirements, by 2 February organisations in scope of the Act should have:
- Identified the AI literacy requirements relevant to their organisation, taking into account the technical knowledge and experience of those to be trained, as well as the context and intended audience of the AI systems being used; and
- Taken measures, aligned to the requirements identified, to ensure sufficient AI literacy across their organisation.
In order to manage ongoing compliance, organisations will need to:
- Establish an ongoing plan to ensure sufficient AI literacy across their organisation; and
- Deliver ongoing AI training and literacy content.
Deloitte can support organisations in managing the AI Act’s requirements relating to AI literacy, as well as the wider culture change and training needed to harness the benefits of AI and manage its risks in line with their strategy, by developing AI training plans and building AI training and awareness content. Our training programmes equip clients with the knowledge and skills to understand the regulation’s requirements, identify risks, and implement appropriate compliance measures as needed across the organisation.
What's next?
A wave of further provisions will come into effect from 2 August 2025, including:
- Regulations concerning general-purpose AI models (GPAI);
- Rules governing notified bodies responsible for conformity assessments;
- Confidentiality obligations; and
- Penalties for non-compliance with the Act.
While the full enforcement of penalties for non-compliance does not begin until 2 August 2025, the guidelines on prohibited AI practices published by the European Commission highlight that parties can still approach national courts to seek interim injunctions against non-compliant AI systems. This means that even though monitoring and fines are not yet fully applicable, legal action can be taken against organisations deemed to be violating the Act's prohibitions.
How we can help
Deloitte can assist you in navigating the complexities of the AI Act, as well as emerging global AI regulation, through comprehensive solutions. By leveraging our expertise, we help clients address the evolving regulatory landscape with confidence and foster responsible AI innovation.
The Act is a complex and evolving piece of legislation. Staying informed about its deadlines and understanding the potential for legal action is essential for any organisation developing or deploying AI systems within the EU. If you would like to find out more about how Deloitte can support you in managing compliance with these provisions and preparing for the implementation of the rest of the Act, please do get in touch.
Your contacts
If you would like to speak to the Deloitte team please contact:
Joey Conway, Internet Regulation Partner, Legal Lead
Nick Seeber, Partner, Global Internet Regulation Lead
Scott Bailey, Director, Global Internet Regulation Lead
Piyush Goraniya, Senior Associate, Global Internet Regulation Lead
Content from Deloitte's Internet Regulation blog can now be sent directly to your inbox. Choose the topic and frequency by subscribing here and selecting Internet Regulation.