
EU AI Act Compliance Hub: Risk Tiers, Obligations, Evidence

Figure out what applies to you. Generate checklists and audit-ready artifacts you can forward to auditors, counsel, and your board.

Orientation only. Not legal advice.

Run the 3‑minute Risk + Obligations Check
Timeline

What changed and when

Key dates in plain language, with a visible freshness signal.

Last updated: Dec 15, 2025
12 Jul 2024
Published in the Official Journal
Start of the countdown. Use this date to sanity-check phased applicability timelines.
1 Aug 2024
Entered into force
The regulation is in force, with many obligations phasing in later.
2 Feb 2025
Prohibited practices apply (Article 5)
High-risk or not, banned use cases should be removed or redesigned.
2 Aug 2025
General-purpose AI (GPAI) obligations begin
Provider-side duties start phasing in for GPAI models and systemic-risk models.
2 Aug 2026
Most obligations apply
Core operational compliance programs should be live by this date, not “in planning”.
2 Aug 2027
Some high-risk rules fully apply
Later-stage requirements and category-specific obligations phase in.
Unacceptable · High-risk · Limited · Minimal

Risk tiers (operational view)

Enough to orient. Not an encyclopedia.

Unacceptable

Unacceptable (prohibited)

Stop-ship risks. Remove or redesign and keep remediation evidence.

Typical examples

  • Prohibited practices (Article 5)
  • Certain biometric or manipulative use patterns

What you need to have

  • A “fail-closed” policy gate (blocks prohibited paths)
  • A remediation decision trail (tickets, approvals, releases)
  • Evidence of removal/redesign and regression tests
High-risk

High-risk (Annex III)

Operationalize controls and build an audit-ready evidence package.

Typical examples

  • Hiring/HR decision support
  • Credit/insurance decisions
  • Sensitive biometrics
  • Healthcare ops

What you need to have

  • Risk management + verification evidence
  • Annex IV-aligned technical documentation
  • Logging/traceability and human oversight records
  • Quality processes (QMS) + monitoring cadence
Limited

Limited risk (transparency)

Add disclosures and retain evidence that you did.

Typical examples

  • Chatbots and conversational agents
  • AI-generated or manipulated content

What you need to have

  • User disclosures in the flow
  • Proof of disclosure (screenshots + telemetry events)
  • Change control for model/policy updates
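For “proof of disclosure”, a screenshot alone is weak evidence; a structured telemetry event tied to the exact disclosure wording is easier to defend. A minimal sketch, assuming a hypothetical event name (`ai_disclosure_shown`) and field set — adapt to your own logging pipeline:

```python
import time
import uuid

def emit_disclosure_event(session_id: str, disclosure_version: str, surface: str) -> dict:
    """Build a structured telemetry event recording that the AI disclosure
    was actually shown to the user in this session. Field names are
    illustrative, not a prescribed schema."""
    return {
        "event": "ai_disclosure_shown",            # hypothetical event name
        "event_id": str(uuid.uuid4()),             # unique id for de-duplication
        "session_id": session_id,
        "disclosure_version": disclosure_version,  # ties proof to the exact wording shown
        "surface": surface,                        # where it appeared, e.g. "chat_widget"
        "ts_unix": time.time(),                    # when it was shown
    }

# Usage: serialize, ship to your log pipeline, and retain per your policy.
evt = emit_disclosure_event("sess-123", "v2.1", "chat_widget")
```

Versioning the disclosure text matters: when wording changes under change control, the event shows which version each user saw.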
Minimal

Minimal risk

Baseline governance so you can prove what ran and why.

Typical examples

  • Internal tools and low-stakes automation (in most cases)

What you need to have

  • System card + lightweight risk review
  • Basic logging (what ran when) and retention
  • Monitoring + incident handling path
3 minutes

Risk + obligations checker

Fast orientation that produces an obligations summary and recommended next steps.


EU AI Act Risk + Obligations Check

Orientation only. Results are shown immediately; export packs are gated behind email and formatted so you can forward them to stakeholders.

This 3‑minute checker estimates your likely risk tier and the obligations you’ll need to operationalize — plus the evidence artifacts you should keep.

  • Shows results ungated
  • Gates export packs behind email
  • Links to role-specific templates + deep dives

Not legal advice.

Directory

Popular deep dives

The highest-intent spokes: operational steps and evidence checklists.

Annex III high-risk list (with examples)

Providers & deployers mapping use cases to high-risk categories

A practical “does this look like Annex III?” checklist and evidence pointers.

7 min read

Article 5 prohibited AI practices (operational checklist)

Anyone shipping AI features into the EU market

A fast filter for “stop-ship” risks and how to document remediation.

6 min read

Article 50 transparency obligations (plain language)

Chatbots, conversational agents, deepfakes, and content generation teams

What disclosures to add, and what evidence to keep that you did.

7 min read

GPAI + foundation model obligations (orientation)

Teams building on, providing, or deploying general-purpose AI models

Provider vs deployer obligations and what to request from vendors.

8 min read

Conformity assessment explained

High-risk teams preparing for audits and go-live

What “assessment” means operationally and what artifacts to assemble.

7 min read

Technical documentation checklist (Annex IV-aligned)

Engineering and compliance teams drafting documentation packages

A skeleton you can use to build a defensible dossier.

9 min read

Quality management system (QMS) essentials

Providers operationalizing repeatable compliance

Minimum viable QMS processes and evidence to retain.

8 min read

Fines and penalties (plain English)

Executives and risk owners budgeting compliance work

A non-alarmist view of enforcement signals and how to reduce exposure.

6 min read
Lead magnet

Templates + artifacts

Forwardable outputs that unlock internal buy-in.

Compliance Starter Pack

  • AI System Card template
  • Risk assessment outline (risk register style)
  • Human oversight plan checklist
  • Technical documentation skeleton
  • Vendor due diligence checklist (deployer-focused)
  • Logging/audit event checklist (evidence-ready)

Fictional samples. Not legal advice.

Need an evidence path?

KLA Digital turns obligations into controls, controls into measurements, and measurements into defensible evidence you can export.

Control plane

Operationalize compliance

From obligations → controls. From controls → measurements. From measurements → defensible evidence.

Govern

Policy-as-code checkpoints pause risky steps for human review, enforce disclosures, and block prohibited paths.

  • Checkpoints + escalation rules
  • Change control tied to releases
  • Exception approvals with expiry
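The checkpoint idea above can be sketched in a few lines: a fail-closed gate that blocks prohibited paths outright, escalates risky ones for human review, and refuses to proceed when a required disclosure is missing. The use-case labels below are illustrative placeholders, not a classification scheme:

```python
from dataclasses import dataclass

# Illustrative labels only -- map these to your own use-case taxonomy.
PROHIBITED = {"social_scoring", "subliminal_manipulation"}
NEEDS_REVIEW = {"hiring_screen", "credit_decision"}

@dataclass
class Decision:
    action: str   # "block" | "escalate" | "allow"
    reason: str

def checkpoint(use_case: str, disclosure_shown: bool) -> Decision:
    """Fail-closed policy gate: anything not explicitly cleared is stopped."""
    if use_case in PROHIBITED:
        return Decision("block", f"prohibited use case: {use_case}")
    if use_case in NEEDS_REVIEW:
        return Decision("escalate", "human review required before release")
    if not disclosure_shown:
        return Decision("block", "required user disclosure missing")  # fail closed
    return Decision("allow", "passed all checkpoints")
```

The point of returning a `reason` with every decision is evidentiary: each gate outcome becomes a loggable record you can tie to tickets and approvals.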

Measure

Sampling evaluations check accuracy and grounding, track near-misses, and make drift visible.

  • Sampling evaluations + thresholds
  • Near-miss and violation trends
  • Alerting + escalation evidence

Prove

Tamper-proof, append-only audit trail with Evidence Room exports you can hand to auditors.

  • Tamper-proof audit trail
  • Evidence Room export bundles
  • Integrity proofs for defensibility
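One common way to make an audit trail tamper-evident is hash chaining: each entry commits to the hash of the previous one, so altering any past record breaks every hash after it. A minimal sketch of the idea (not KLA Digital's implementation):

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash before the first entry

class AuditTrail:
    """Append-only log where each entry commits to the previous entry's hash,
    so later tampering with any record is detectable on verification."""

    def __init__(self):
        self.entries = []
        self._prev = GENESIS

    def append(self, record: dict) -> str:
        # Hash the record together with the previous hash to form the chain.
        payload = json.dumps({"prev": self._prev, "record": record}, sort_keys=True)
        digest = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"prev": self._prev, "record": record, "hash": digest})
        self._prev = digest
        return digest

    def verify(self) -> bool:
        # Recompute every hash from the start; any edit breaks the chain.
        prev = GENESIS
        for e in self.entries:
            payload = json.dumps({"prev": prev, "record": e["record"]}, sort_keys=True)
            if e["prev"] != prev or hashlib.sha256(payload.encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

Exporting the final chain hash alongside an evidence bundle gives a recipient a cheap integrity check: re-verify the chain and compare the tip.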
Request a pilot
Not legal advice

FAQ

Fast answers to common scope and implementation questions.

Next step

Ready to turn confusion into evidence?

Run the checker, download templates, or book a short readiness call.