
EU AI Act

The EU AI Act (Regulation (EU) 2024/1689) introduces mandatory requirements for providers and deployers of AI systems placed on the market in the European Union. VeriProof is designed specifically to address the audit trail, documentation, and risk management record requirements that apply to high-risk AI systems under the Act.

The EU AI Act became fully applicable to most high-risk AI systems on 2 August 2026. Prohibitions on unacceptable-risk practices applied from 2 February 2025, obligations for general-purpose AI models applied from 2 August 2025, and transparency obligations for limited-risk systems applied from 2 August 2026 alongside the high-risk requirements.


Applicability

This guidance applies to you if you are:

  • A provider placing a high-risk AI system on the EU market or putting it into service
  • A deployer using a high-risk AI system whose outputs affect individuals in the EU
  • A provider of a General Purpose AI (GPAI) model with systemic risk

If you are building AI decision-support systems in any of the Annex III categories — including employment decisions, access to essential services, biometric identification, or law enforcement — the Act’s high-risk requirements apply.


Article 9 — Risk Management System

The Act requires a documented, ongoing risk management process covering identified risks, their estimation, mitigation measures, and residual risk evaluation.

How VeriProof capabilities support Article 9:

  • Immutable session records: provide the evidence base for evaluating how your system actually behaves in production, not just in testing
  • Governance scoring: automated scoring against configurable thresholds flags sessions that exceed risk parameters
  • Alert rules: real-time notification when sessions breach governance rules (bias thresholds, confidence scores, refusals)
  • Time-machine queries: historical analysis to identify whether risk patterns changed after model updates or deployment changes
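As an illustration, the threshold-based flagging described above can be sketched in plain Python. The names here (`GovernanceThresholds`, `flag_session`, and the session fields) are hypothetical, not VeriProof's actual SDK; real thresholds are configured in the portal to match your risk assessment.

```python
from dataclasses import dataclass

# Hypothetical governance thresholds -- align these with the risk
# parameters from your Article 9 risk assessment.
@dataclass
class GovernanceThresholds:
    min_confidence: float = 0.7
    max_bias_score: float = 0.2

def flag_session(session: dict, thresholds: GovernanceThresholds) -> list[str]:
    """Return the list of governance rules a session record breaches."""
    breaches = []
    if session.get("confidence", 1.0) < thresholds.min_confidence:
        breaches.append("low_confidence")
    if session.get("bias_score", 0.0) > thresholds.max_bias_score:
        breaches.append("bias_threshold_exceeded")
    if session.get("refused", False):
        breaches.append("refusal")
    return breaches

# Example: a production session with a low confidence score
session = {"confidence": 0.55, "bias_score": 0.1, "refused": False}
print(flag_session(session, GovernanceThresholds()))  # ['low_confidence']
```

Sessions that return a non-empty breach list are the ones your risk management records should capture, alongside the mitigation taken.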

Article 10 — Data Governance

Article 10 requires that training, validation, and test datasets meet quality criteria and that data governance practices are documented.

VeriProof’s focus is production inference observability rather than training data management. Where Article 10 intersects with VeriProof is in production data monitoring:

  • VeriProof captures the inputs actually received in production and the outputs actually produced, creating a ground-truth record of real-world data distribution
  • This record supports post-deployment validation that production data remains within the distribution of the system’s training and test sets
  • Detected anomalies (unusual input patterns, unexpected output distributions) are surfaced through alert rules and governance dashboards
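A minimal sketch of the kind of distribution check this record enables, assuming you export a numeric input feature from captured sessions. The `drift_score` helper is illustrative (a simple z-score-style statistic), not part of VeriProof:

```python
import statistics

def drift_score(reference: list[float], production: list[float]) -> float:
    """Crude drift signal: distance of the production mean from the
    reference mean, measured in reference standard deviations."""
    ref_mean = statistics.mean(reference)
    ref_std = statistics.stdev(reference)
    return abs(statistics.mean(production) - ref_mean) / ref_std

# Reference: feature values observed during validation/testing
reference = [0.48, 0.50, 0.52, 0.49, 0.51, 0.50]
# Production: the same feature, from captured session inputs
production = [0.70, 0.72, 0.69, 0.71]

print(drift_score(reference, production) > 3.0)  # True -> inputs have drifted
```

In practice you would run a proper statistical test (e.g. population stability index or KS test) per feature, but the principle is the same: the captured production record is what makes the comparison possible at all.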

Article 11 — Technical Documentation

Article 11 requires comprehensive technical documentation before a high-risk AI system is placed on the market. The documentation must include system architecture, training data description, accuracy metrics, and ongoing performance data.

VeriProof’s current evidence artifacts support Article 11 documentation work by providing:

  • Session-level decision records with full input/output provenance
  • Blockchain-anchored Merkle proofs demonstrating records haven’t been altered
  • Governance scoring and monitoring outputs from the portal
  • Export metadata attached to the downloaded evidence artifacts

Today, teams typically assemble an EU AI Act package from session evidence JSON/PDF, application-level evidence ZIP exports, blockchain audit certificates, and their own system documentation. See the Evidence Packaging Walkthrough for the current workflow.
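For context, a Merkle proof of the kind referenced above can be verified with nothing more than a hash function. This sketch assumes a simple SHA-256 tree; VeriProof's actual anchoring scheme and proof format may differ:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_merkle_proof(leaf: bytes, proof: list[tuple[bytes, str]], root: bytes) -> bool:
    """Recompute the root from a leaf and its sibling path.
    Each proof step is (sibling_hash, side), side being 'left' or 'right'."""
    node = sha256(leaf)
    for sibling, side in proof:
        if side == "left":
            node = sha256(sibling + node)
        else:
            node = sha256(node + sibling)
    return node == root

# Build a tiny two-leaf tree to demonstrate verification
leaf_a, leaf_b = b"session-record-1", b"session-record-2"
root = sha256(sha256(leaf_a) + sha256(leaf_b))
print(verify_merkle_proof(leaf_a, [(sha256(leaf_b), "right")], root))  # True
```

The point for Article 11 is that an auditor can independently recompute the root and compare it to the blockchain-anchored value, so altered session records are detectable.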


Article 13 — Transparency and Provision of Information

Article 13 requires that high-risk AI systems are designed to allow deployers to understand the system’s capabilities and limitations. It also mandates instructions for use.

VeriProof supports Article 13 by providing:

  • Full decision traceability: every input, chain-of-thought step, tool call, and output can be retrieved for any session within the retention window
  • Confidence and quality metadata captured alongside each decision
  • Integration with your documentation: exported evidence artifacts can be attached to your own Article 13 documentation package

Article 17 — Quality Management System

Article 17 requires a quality management system covering procedures for change management, incident handling, and corrective actions when issues arise post-deployment.

The VeriProof features most directly relevant to Article 17 are:

  • Alert rules: Define thresholds that, when exceeded, trigger notifications. This is the operational mechanism for detecting quality issues in production
  • Governance scoring: Configured scoring thresholds give you a quantitative quality signal continuously, not just during incident reviews
  • Compliance monitoring guide: See the Compliance Monitoring guide for how to set up a complete Article 17-compatible quality management workflow
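As a sketch of how an alert rule like those above might behave, here is a hypothetical sliding-window rule that fires when an event type recurs too often. `AlertRule` and its parameters are illustrative, not VeriProof's API; real rules are configured in the portal:

```python
from collections import deque

# Hypothetical sliding-window alert rule: fire when an event type
# occurs more than `limit` times within `window_s` seconds.
class AlertRule:
    def __init__(self, event_type: str, limit: int, window_s: float):
        self.event_type = event_type
        self.limit = limit
        self.window_s = window_s
        self._timestamps: deque[float] = deque()

    def observe(self, event_type: str, ts: float) -> bool:
        """Record an event; return True when the rule fires."""
        if event_type != self.event_type:
            return False
        self._timestamps.append(ts)
        # Drop events that have aged out of the window
        while self._timestamps and ts - self._timestamps[0] > self.window_s:
            self._timestamps.popleft()
        return len(self._timestamps) > self.limit

rule = AlertRule("refusal", limit=2, window_s=60.0)
fired = [rule.observe("refusal", t) for t in (0.0, 10.0, 20.0)]
print(fired)  # [False, False, True]
```

A fired rule is the trigger point for the Article 17 incident-handling and corrective-action procedures in your quality management system.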

General Purpose AI — Article 53 / 55

If you use a GPAI model (such as OpenAI GPT-4o, Anthropic Claude, or Google Gemini) in a system that falls within the Act’s scope, VeriProof helps you create the logging and audit infrastructure required under Article 53 (for all GPAI models) and Article 55 (for GPAI models with systemic risk).

Key obligations supported:

  • Keep technical documentation: session evidence artifacts and audit exports
  • Provide information to downstream providers: evidence artifacts can be shared alongside your own technical documentation
  • Implement a copyright policy for training data: out of scope for VeriProof
  • Report serious incidents: alert rules and incident record export

EU AI Act Readiness Checklist

Use this checklist to assess your current readiness using VeriProof:

  • SDK integrated and capturing sessions for all production deployments
  • Data subjects registered for any system processing identifiable individuals
  • Governance scoring configured with thresholds aligned to your risk assessment
  • Alert rules active for high-risk event types (refusals, low confidence, fairness flags)
  • Evidence export workflow tested and output validated against your documentation requirements
  • Retention period set to at least the period required by your post-market monitoring plan
  • DPA in place covering VeriProof as a processor
