More and more businesses rely on automated decision-making processes, including the use of AI, for example when granting loans in the financial sector. Under Art. 22 GDPR, data subjects have the right not to be subject to a decision based solely on automated processing that produces legal effects concerning them or similarly significantly affects them, such as the rejection of a loan application. Where automated decision-making is used, the GDPR requires, in addition to the general requirements:
- Transparency in the privacy policy regarding the existence of automated decision-making and information about the logic involved, as well as the significance and the envisaged consequences for the data subject.
- Integration of human intervention into the decision-making process by a person who can exercise meaningful influence on the decision.
Current fines: What happened?
1. Hamburg: Fine imposed on financial company for non-transparent decisions
In September 2025, the Hamburg data protection authority imposed a fine of EUR 492,000 on a financial company. The reason: credit card applications were rejected on the basis of automated decisions, in some cases despite good credit ratings, without the data subjects being adequately informed of the reasons. The company did not sufficiently comply with its information and disclosure obligations and, when asked, failed to provide a comprehensible explanation of the logic involved in the decision-making process.
2. Netherlands: Fine for automated credit assessment
The Dutch data protection authority also recently imposed a fine of EUR 2,700,000 on the provider Experian, which used automated credit assessment without fairly informing the data subjects about the decision-making logic and its consequences. Experian had also built up a database in the Netherlands with information from various public and private sources, such as the commercial register and telecommunications companies. The necessity of the data collection could not be convincingly justified, and the use of sensitive data could have had serious consequences for data subjects. Experian has acknowledged the violation, is not appealing the fine, has ceased its business activities in the Netherlands and will delete the database this year.
Practice tips for your business
1. Create transparency:
Inform data subjects whether and how automated decision-making is used. Include meaningful information about the logic involved, as well as the significance and the envisaged consequences for data subjects.
2. Define internal processes:
Determine who in the business is responsible for access and information requests. Establish clear internal checklists and responsibilities.
3. Ensure you are able to provide information:
Prepare sample responses explaining the logic of automated decision-making to data subjects (e.g. which criteria are assessed and to what extent they influence the result).
4. Integrate human review:
Build human review mechanisms into the decision-making process by appointing a responsible person who can exercise meaningful influence on the decision (see the sketch after these tips).
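For teams implementing such a review step in software, the following is a minimal sketch, assuming a hypothetical lending system in which an automated scoring model produces a recommendation and every adverse outcome is routed to a named human reviewer who can overturn it before the applicant is notified. All names, fields and thresholds are illustrative and are not taken from the cases discussed above.

```python
from dataclasses import dataclass
from enum import Enum


class Outcome(Enum):
    APPROVED = "approved"
    REJECTED = "rejected"


@dataclass
class AutomatedResult:
    """Output of the (hypothetical) scoring model."""
    outcome: Outcome
    score: float               # illustrative credit score
    reasons: list[str]         # criteria that drove the result, usable for information requests


@dataclass
class FinalDecision:
    outcome: Outcome
    reasons: list[str]
    reviewed_by: str | None    # named responsible person; None only for approvals


def decide(result: AutomatedResult, reviewer: str, reviewer_overturns: bool) -> FinalDecision:
    """Route every adverse automated result to a human reviewer with real authority.

    Because the reviewer can confirm or overturn the rejection, the final decision
    is no longer based solely on automated processing in the sense of Art. 22(1) GDPR.
    """
    if result.outcome is Outcome.APPROVED:
        return FinalDecision(result.outcome, result.reasons, reviewed_by=None)

    # Adverse outcome: a human with meaningful influence confirms or overturns it.
    final = Outcome.APPROVED if reviewer_overturns else Outcome.REJECTED
    return FinalDecision(final, result.reasons, reviewed_by=reviewer)


# Example: an automated rejection is overturned after human review.
auto = AutomatedResult(Outcome.REJECTED, score=0.41,
                       reasons=["short credit history", "high existing credit utilisation"])
print(decide(auto, reviewer="credit officer J. Doe", reviewer_overturns=True))
```

The key design choice is that the reviewer's input determines the final outcome and is recorded together with the decision, so the business can demonstrate both the human intervention and the reasons it must disclose on request.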
We are happy to answer any questions you may have about specific implementation options or reviewing your processes.