Understanding requirements for AI systems and implementing them in a structured manner
The AI Act creates a uniform legal framework for the use of artificial intelligence across Europe for the first time. It focuses on the question of the conditions under which AI systems can be used in a safe, traceable and trustworthy manner.
Unlike traditional product regulation, the AI Act not only assesses the system itself, but also its context of use. The requirements are based on the risk posed by an AI system.
For companies, this means that the use of AI becomes a regulated task requiring verification - from development to operation.
Which AI systems fall under the AI Act
The AI Act applies to systems that are based on AI - especially if they support or automate decision-making. The decisive factor is not only the technology, but also its intended and reasonably foreseeable use.
The following are affected, for example:
Not affected by the AI Act
AI systems in the areas of national security, defense and military purposes are excluded from the AI Act. In addition, the Act does not apply to pure research, testing or development activities as long as systems or models have not yet been placed on the market or put into service.
Similar to the CRA, the AI Act distinguishes between different actors along the value chain:
Depending on the role, different obligations arise - especially between providers (manufacturers) and deployers (operators).
Requirements depend on the risk of the AI system
The AI Act distinguishes between different risk categories - from minimal risk through high-risk AI to prohibited practices.
High-risk systems are particularly relevant, for example in safety-critical applications or as a component of machines.
General-purpose AI (GPAI) - AI models that can be used for a wide range of applications - is also covered. Depending on the application and capabilities, extensive requirements may apply here as well.
Requirements along the AI life cycle
The requirements of the AI Act relate to both the development and operation of AI systems.
Development and design
Key requirements must already be met during development:
AI systems must be designed to be comprehensible and controllable.
Transparency and control
A key difference from other sets of rules is the requirement for how decisions are handled:
This ensures that AI systems do not remain "black boxes".
Conformity and placing on the market
The following evidence is required for regulated AI systems:
Operation and monitoring
There are also extensive obligations during operation:
The challenge lies not only in the technology, but also in the combination of technology, regulation and application context.
Practice shows that:
The combination of AI development, system integration and regulatory assessment is particularly challenging.
The AI Act will be fully applicable on August 2, 2027. However, parts of it apply earlier.

| Infringement | Penalty |
|---|---|
| Use of prohibited AI systems | Up to €35 million or 7% of the total worldwide annual turnover of the previous financial year, whichever is higher |
| Non-compliance with other obligations (excluding the use of prohibited AI systems) | Up to €15 million or 3% of the total worldwide annual turnover of the previous financial year, whichever is higher |
| Supply of false, misleading or incomplete information to notified bodies or market surveillance authorities | Up to €7.5 million or 1% of the total worldwide annual turnover of the previous financial year, whichever is higher |
| Non-compliance with obligations by providers of general-purpose AI models | Up to €15 million or 3% of the total worldwide annual turnover of the previous financial year, whichever is higher |
Note: For newly established companies and SMEs, the lower of the two amounts applies instead of the higher.
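The ceilings in the table follow a simple rule: the higher of the fixed amount and the turnover percentage applies, while for SMEs and start-ups the lower of the two applies. A minimal sketch of that arithmetic, using a hypothetical helper of our own (not part of the Act):

```python
def penalty_ceiling(fixed_eur: float, turnover_pct: float,
                    annual_turnover_eur: float, sme: bool = False) -> float:
    """Return the maximum possible fine for one penalty tier.

    Regular companies: the HIGHER of the fixed amount and the percentage
    of total worldwide annual turnover. SMEs and newly established
    companies: the LOWER of the two (per the note above).
    """
    pct_amount = turnover_pct * annual_turnover_eur
    return min(fixed_eur, pct_amount) if sme else max(fixed_eur, pct_amount)

# Prohibited-AI tier (up to EUR 35m or 7%), hypothetical EUR 1bn turnover:
print(penalty_ceiling(35_000_000, 0.07, 1_000_000_000))            # 70000000.0
print(penalty_ceiling(35_000_000, 0.07, 1_000_000_000, sme=True))  # 35000000.0
```

For a company with €1 billion turnover, 7% (€70 million) exceeds the fixed €35 million, so the higher figure sets the ceiling; for an SME of the same size, the lower figure would apply.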
Companies are often faced with the following questions:
The requirements of the AI Act not only affect individual components, but also the entire system and its use.
IABG provides support with:
