Publication date
Monday, 03 March 2025

Publication type
Blog post

Author(s)
Dirk Brand (Dr)
Technology, Media & Telecommunications

Tags
#automateddecisionmaking #transparency #credit #GDPR
Transparency and automated decision-making – rights of data subjects

In a judgment of 27 February 2025, the European Court of Justice ("ECJ") clarified the meaning of transparency in the context of automated decision-making under the General Data Protection Regulation ("GDPR").[1] Case C-203/22 Dun & Bradstreet is important not only for the application of the GDPR, but also for the implementation of the EU AI Act (2024),[2] which contains specific transparency requirements. It is also of interest to countries with data protection legislation similar to the GDPR.

The matter before the ECJ originated in Austria, where a mobile phone operator denied a person a mobile phone contract due to an insufficient credit profile. Dun & Bradstreet Austria ("D&B"), which specialises in credit assessments, had provided a credit profile of the customer based on an AI model. The customer objected to this credit assessment and submitted a request to the Austrian Data Protection Authority to obtain meaningful information from D&B about the logic involved in processing her personal data to produce the credit profile.

The Austrian DPA allowed the request, but D&B objected, and the customer lodged an action in the Austrian Administrative Court. That court confirmed that the customer had a right under art. 15(1)(h) of the GDPR of access to accurate information about the processed personal data and the internal logic involved in the automated decision-making, so that she could understand and verify the decision about her credit assessment. The court found that D&B had provided information to the customer, but that it was not sufficient; notably, the information provided by D&B contradicted the outcome of the AI decision. The Austrian court referred the matter to the ECJ to seek clarity on the meaning of the right to ‘meaningful information about the logic involved’ in automated decision-making, and on the interpretation of the directive on trade secrets.[3]

The ECJ considered the meaning of transparency in the application of the right to meaningful information about the logic involved in automated decision-making under the GDPR. The Court held that providing the actual algorithmic code would not satisfy the requirement under art. 15(1)(h) of the GDPR, since it would not be meaningful information that a data subject could use. Under art. 15(1)(h), a data subject has a right to obtain from a controller confirmation as to whether personal data concerning him or her are being processed and, in the case of automated decision-making, to access meaningful information about the logic involved in that decision-making.

The ECJ held that the controller (the party that determines the purposes and means of the processing) must describe the procedure and principles actually applied in such a way that the data subject can understand how his or her personal data were processed in the automated decision-making. The explanatory information must be aligned with the actual decision, and it must be clear and understandable so that the data subject can act upon it.

If a controller claims that the requested information contains protected data of third parties or trade secrets, the controller must provide the allegedly protected information to the data protection authority or court, which will then decide whether that is indeed the case and determine the extent of the data subject’s right of access to the disputed information.

In South Africa, section 71(3)(b) of the Protection of Personal Information Act ("POPIA") stipulates that a responsible party (the POPIA equivalent of a controller under the GDPR) has an obligation to provide a data subject with ‘sufficient information about the underlying logic of the automated processing of the information relating to him or her to enable him or her to make representations’ regarding the automated decision-making used to create a profile of such a person. The wording of this section of POPIA differs from that of the comparable provision in the GDPR, but the implications are similar. Although the provision frames an obligation on the responsible party rather than a right of the data subject, it is argued that a data subject can indeed claim enforcement of that obligation in order to exercise his or her rights regarding automated decision-making. If the reasoning of the ECJ were applied in South Africa, compliance with the transparency obligation in sec. 71(3)(b) of POPIA would require that clear and understandable information about the logic of the automated decision-making be provided to the data subject. It is not necessary to provide the actual AI code, and the code itself would in any event not meet the requirement of clear and understandable information.

This court decision confirms the importance of algorithmic transparency. If a user of AI or a data subject impacted by automated decision-making does not have clear and understandable information about the logic of the specific AI model, the outcome of the automated decision cannot be meaningfully contested.

[1] Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).

[2] Regulation (EU) 2024/1689 of the European Parliament and of the Council of 13 June 2024, laying down harmonised rules on artificial intelligence and amending Regulations (EC) No 300/2008, (EU) No 167/2013, (EU) No 168/2013, (EU) 2018/858, (EU) 2018/1139 and (EU) 2019/2144 and Directives 2014/90/EU, (EU) 2016/797 and (EU) 2020/1828 (Artificial Intelligence Act).

[3] Directive (EU) 2016/943 of the European Parliament and of the Council of 8 June 2016 on the protection of undisclosed know-how and business information (trade secrets) against their unlawful acquisition, use and disclosure.
