CHAPTER III

Article 27: Fundamental rights impact assessment for high-risk AI systems


Plain English Summary

What this article does: Requires deployers of high-risk AI systems used in certain public-facing contexts to carry out a fundamental rights impact assessment (FRIA) before first use.

Who it applies to: Deployers that are bodies governed by public law or private entities providing public services, and deployers of high-risk AI systems referred to in points 5(b) and (c) of Annex III.

Key requirements: Para 1: Applicable deployers must perform a fundamental rights impact assessment before putting the system into use. The assessment must include: a description of the deployer's processes in which the AI system will be used; the period and frequency of intended use; the categories of natural persons and groups likely to be affected; the specific risks of harm likely to affect those persons or groups; a description of human oversight measures; and the measures to be taken if those risks materialise, including internal governance and complaint mechanisms. Para 2: The obligation applies to the first use of the system; in similar cases the deployer may rely on previously conducted assessments, and must update the assessment if any element changes or is no longer up to date. Para 3: The results of the assessment must be notified to the market surveillance authority, subject to a limited exemption. Para 4: Where a data protection impact assessment (DPIA) already covers any of these obligations, the FRIA complements that DPIA. Para 5: The AI Office will develop a questionnaire template, including an automated tool, to simplify compliance.

1. Prior to deploying a high-risk AI system referred to in Article 6(2), with the exception of high-risk AI systems intended to be used in the area listed in point 2 of Annex III, deployers that are bodies governed by public law, or are private entities providing public services, and deployers of high-risk AI systems referred to in points 5(b) and (c) of Annex III, shall perform an assessment of the impact on fundamental rights that the use of such system may produce. For that purpose, deployers shall perform an assessment consisting of:

(a) a description of the deployer’s processes in which the high-risk AI system will be used in line with its intended purpose;

(b) a description of the period of time within which, and the frequency with which, each high-risk AI system is intended to be used;

(c) the categories of natural persons and groups likely to be affected by its use in the specific context;

(d) the specific risks of harm likely to have an impact on the categories of natural persons or groups of persons identified pursuant to point (c) of this paragraph, taking into account the information given by the provider pursuant to Article 13;

(e) a description of the implementation of human oversight measures, according to the instructions for use;

(f) the measures to be taken in the case of the materialisation of those risks, including the arrangements for internal governance and complaint mechanisms.

2. The obligation laid down in paragraph 1 applies to the first use of the high-risk AI system. The deployer may, in similar cases, rely on previously conducted fundamental rights impact assessments or existing impact assessments carried out by the provider. If, during the use of the high-risk AI system, the deployer considers that any of the elements listed in paragraph 1 has changed or is no longer up to date, the deployer shall take the necessary steps to update the information.

3. Once the assessment referred to in paragraph 1 of this Article has been performed, the deployer shall notify the market surveillance authority of its results, submitting the filled-out template referred to in paragraph 5 of this Article as part of the notification. In the case referred to in Article 46(1), deployers may be exempt from that obligation to notify.

4. If any of the obligations laid down in this Article is already met through the data protection impact assessment conducted pursuant to Article 35 of Regulation (EU) 2016/679 or Article 27 of Directive (EU) 2016/680, the fundamental rights impact assessment referred to in paragraph 1 of this Article shall complement that data protection impact assessment.

5. The AI Office shall develop a template for a questionnaire, including through an automated tool, to facilitate deployers in complying with their obligations under this Article in a simplified manner.


Built by Paul McCormack — lawyer, product leader, and founder of Kormoon. This site is an independent informational resource only and does not constitute legal advice. No reliance should be placed on its contents. For the authoritative text, refer to the official EUR-Lex source linked in the Annexes tab, or consult your legal advisor.