ECJ ruling on SCHUFA scoring system: Rebooting the right to a human decision-maker in the AI Age?

The European Court of Justice holds that credit scoring constitutes an 'automated decision' that deserves protections under the GDPR

Summary of the ECJ decisions

In December 2023, the European Court of Justice (ECJ) addressed the issue of credit scoring systems in two important decisions involving SCHUFA, a private German company which provides financial institutions with credit scoring information.

The first decision concerned the scope of the right to be forgotten in connection with the processing of personal data that had been deleted from its original public sources. In this regard, the ECJ ruled that private credit agencies may not store data relating to the discharge of remaining debts for longer than public insolvency registers retain that information.

The second decision focused on the interpretation of automated decision-making as provided under Article 22 of the GDPR. This blog post will comment on the latter.

The case was referred for a preliminary ruling to the ECJ by a German administrative court in OQ v SCHUFA Holding AG. SCHUFA estimates the probability that persons will repay their loans based on their specific characteristics, using mathematical and statistical methodologies, and provides its clients with a "score value" representing a prediction of a person's creditworthiness based on that automated processing.

OQ, an individual whose loan request was refused based on scoring information provided by SCHUFA, requested that the company provide her with information on her personal data and erase some of her data, which was allegedly incorrect. The company provided only a general outline explaining the decision but refrained from disclosing the various data points taken into account for the purpose of the actual probability calculation and their respective weights. The local DPA rejected OQ's subsequent complaint seeking an order that SCHUFA comply with her request. OQ appealed against that decision to the Wiesbaden administrative court, which referred the matter to the ECJ to determine whether the processing at stake constitutes automated decision-making within the meaning of Article 22(1) of the GDPR, which affords data subjects the protections set out in Article 22.

The Court addressed the question whether scoring by SCHUFA constitutes automated decision-making within the meaning of Article 22 of the GDPR. SCHUFA argued that it merely provides scores to its clients, while the 'decision' itself is made by third parties – the financial institutions which receive the scoring information and decide whether to grant the loans. The ECJ rejected this position and determined that SCHUFA's credit scoring result constitutes an automated decision within the ambit of Article 22.

In its decision, the court outlined Article 22’s three cumulative conditions: (1) there must be a ‘decision’; (2) that decision must be ‘based solely on automated processing, including profiling’; and (3) the decision must produce ‘legal effects concerning [the interested party]’ or ‘similarly significantly [affect] him or her’.

The court broadly interpreted the term 'decision' in Article 22 as capable of encompassing "a number of acts which may affect the data subject in many ways", including, for example, digital recruiting practices conducted without human intervention. Accordingly, it can capture the calculation of a probability value signifying a person's creditworthiness. The court further found that the second condition is also met in the case of SCHUFA's activities, as they meet the definition of profiling under the GDPR. In other words, producing a "probability value" based on personal data and concerning a person's ability to meet payment commitments constitutes profiling and hence falls within the ambit of "automated processing".

The court then turned to the third condition. In examining whether the scoring value produced by the private company SCHUFA constitutes a decision that produces "legal effects" or "similarly significantly affects" the person, the Court relied on the referring court's findings. Since, in practice, an insufficient probability value (produced by SCHUFA and provided to a third party – the bank) leads, in almost all cases, to a refusal by that bank to grant the loan applied for, the court concluded that a scoring result provided by SCHUFA plays "a determining role" in the bank's granting of the loan.

Notably in this regard, the court explained that a narrow interpretation, according to which the "decision" is only that made by the bank based on the scoring result, would in fact circumvent Article 22 and leave a "legal lacuna" in the protection of data subjects with regard to automated processing.

Additionally, the court noted that an automated decision also covers automated processing of personal data that involves analysing aspects of a person's (the data subject's) performance at work, economic situation and health, as well as analysing reliability or behaviour. The court emphasised the risks that automated decisions entail, such as discrimination embedded in the data processing. This, according to the court, justifies the broad interpretation of the term "decision" and the safeguards it accordingly triggers.

It concluded that the SCHUFA scoring result is an "automated decision" which must comply with Articles 22, 5 and 6 of the GDPR in order to meet the required legal basis. It follows, then, that in such cases the data subject is entitled to the protection of being able to "obtain human intervention on the part of the controller, to express his or her point of view and to challenge the decision taken in his or her regard".

Preliminary thoughts

The broad interpretation taken by the Court is a key point in its decision. According to the Court's interpretation, every scoring evaluation or profiling producing a "probability value" will be considered an "automated decision", even if that result is transferred to a non-automated, human decision-maker, provided it can be empirically inferred that the latter "draws strongly" on that value.

In terms of compliance, the SCHUFA case stresses that automated decision-making processes cannot be deemed to have a human in the loop if that human substantially relies on the automated results. It thus increases companies' compliance responsibilities.

Another notable point of the SCHUFA case is that the right not to be subject to automated decision-making was invoked not in connection with the actual exercise of that right, but in order to exercise the transparency rights under Article 15(1)(h) of the GDPR, pursuant to which the data subject has the right to obtain from the controller meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. It should be noted that under the forthcoming EU AI Act (based on the recent draft agreement), the right of an affected person to a meaningful explanation of individual decision-making is extended to decisions based on the output of a high-risk AI system, which lowers the threshold set in the GDPR – as interpreted in SCHUFA.

Furthermore, under Article 14 of the forthcoming AI Act, high-risk AI systems (which include credit scoring systems such as those developed by SCHUFA) must remain subject to meaningful human oversight. Under Article 14, human oversight measures must ensure that the individual human overseers are able to decide not to use the system or to disregard its output, to intervene in its operation, or to stop it altogether. Future cases on automated decision-making are therefore likely to focus on the likelihood and extent to which these oversight abilities can be actualised.

With the advent of automated decision systems – in banks, courts, and other civil and administrative systems – and as these technologies become more accurate and advanced, so rises the temptation to shift crucial decisions regarding individual persons from the hands of humans to the robotic talons of machines. As such, crucial ethical, legal and human-rights questions arise. In this light, the SCHUFA decision, which in effect reanimates the somewhat dormant Article 22 of the GDPR, seems more relevant than ever.

Dafna and Amir