Guide10 min read

General-Purpose AI Models: What Articles 53 and 55 Require

Provider obligations for GPAI models — from technical documentation to systemic risk, explained for model developers and downstream integrators

P

Paul McCormack

AI Governance & Compliance · 1 March 2026

General-purpose AI models — think foundation models, large language models, and multimodal systems — have their own dedicated compliance framework under the EU AI Act. Articles 53 and 55 establish what providers of these models must do, with obligations that have applied since 2 August 2025. This guide breaks down the requirements for both standard GPAI models and those with systemic risk.

What Is a General-Purpose AI Model?

defines a general-purpose AI model as an AI model — including where trained with a large amount of data using self-supervision at scale — that displays significant generality and is capable of competently performing a wide range of distinct tasks regardless of the way the model is placed on the market. This captures foundation models, large language models (LLMs), and multimodal models like GPT, Claude, Gemini, Llama, and Mistral.

The key distinction: GPAI obligations apply to the model itself, not to specific AI systems built on top of it. A provider who fine-tunes a GPAI model and deploys it as a customer service chatbot has obligations both as a GPAI provider (for the model) and potentially as a high-risk AI system provider (for the deployed system).

Article 53: Obligations for All GPAI Providers

Every provider of a GPAI model placed on the EU market must comply with four core obligations under :

1

Technical documentation (Annex XI)

Prepare and maintain technical documentation of the model, including its training and testing process, and the results of its evaluation. This must be provided to the AI Office on request. Annex XI specifies the required content in detail.

2

Downstream provider information (Annex XII)

Provide information and documentation to downstream providers who integrate the GPAI model into their AI systems. Annex XII specifies what must be shared — capabilities, limitations, intended use, and integration guidance.

3

Copyright compliance policy

Put in place a policy to comply with EU copyright law, in particular Directive (EU) 2019/790. This includes respecting opt-out rights under of the Directive — where rights holders have expressly reserved the right to opt out of text and data mining.

4

Training data summary

Draw up and make publicly available a sufficiently detailed summary of the content used for training the GPAI model. The AI Office has published a template for this summary.

The Open-Source Exception

provides a limited exception for open-source GPAI models — those whose parameters, including weights, architecture, and model usage information, are made publicly available. Open-source providers are exempt from the downstream information (Annex XII) and technical documentation obligations, but must still comply with copyright policy and the training data summary requirements.

This exception does not apply if the model is classified as having systemic risk under — in that case, the full obligations apply regardless of licensing.

When Does a GPAI Model Have Systemic Risk?

establishes two routes to systemic risk classification:

  • Presumption based on compute: A GPAI model is presumed to have systemic risk if the cumulative compute used for its training exceeds 10^25 FLOPs (Annex XIII). This threshold can be updated by the Commission.
  • Commission designation: The Commission can designate a model as having systemic risk based on criteria in Annex XIII, considering capabilities, reach, number of registered users, cross-border availability, and other indicators.

Once classified, the provider must notify the Commission within two weeks. The classification can be challenged through the procedure in .

Article 55: Additional Obligations for Systemic Risk Models

Providers of GPAI models with systemic risk must comply with all obligations plus additional requirements under :

  • Model evaluation — perform standardised state-of-the-art evaluations, including adversarial testing, to identify and mitigate systemic risks
  • Systemic risk assessment and mitigation — assess and mitigate possible systemic risks at Union level, including their sources
  • Serious incident tracking — document and report serious incidents and possible corrective measures to the AI Office and national competent authorities
  • Adequate cybersecurity — ensure an adequate level of cybersecurity protection for the model and its physical infrastructure

Codes of Practice

introduces codes of practice as the primary compliance mechanism for GPAI obligations. The AI Office has been coordinating the development of these codes with GPAI providers, downstream providers, and other stakeholders. Adherence to an approved code of practice creates a presumption of conformity with the corresponding obligations.

For providers who choose not to follow a code of practice, requires them to demonstrate an alternative adequate means of compliance — which the AI Office must approve.

Timeline

GPAI obligations under Articles 53-55 have applied since 2 August 2025. Providers who placed GPAI models on the market before that date had a transition period to comply. Any new GPAI model placed on the EU market must now comply from day one.

Key Takeaways

GPAI obligations apply to model providers — separate from any high-risk AI system obligations on downstream deployers

All GPAI providers must produce technical documentation, downstream information, copyright policies, and training data summaries

Open-source models get a partial exemption — but not if they have systemic risk

Systemic risk is presumed above 10^25 FLOPs and triggers additional evaluation, mitigation, and reporting obligations

GPAI obligations have applied since 2 August 2025 — compliance is already required

Tags

GPAIArticle 53Article 55Systemic RiskProvider Obligations

Not sure if you need a FRIA?

Use our free FRIA Screening Tool to find out in under 5 minutes.

Built by Paul McCormack — lawyer, product leader, and founder of Kormoon. This site is an independent informational resource only and does not constitute legal advice. No reliance should be placed on its contents. For the authoritative text, refer to the official EUR-Lex source linked in the Annexes tab, or consult your legal advisor.