FORTISEU
DORA · 8 October 2025 · 10 min read · Attila Bognar

DORA After Go-Live: What Auditors and Regulators Request First

DORA has been in effect since January 17, 2025. Here are the first artifacts auditors and regulators ask for, and how to respond with evidence instead of narrative.

Tags: DORA · ICT risk · TPRM · Resilience testing · Audit readiness

DORA has been live since January 17, 2025. Nine months in, nobody is grading intent. Regulators and auditors are grading control — specifically, your ability to produce structured evidence that your ICT risk management framework is not just documented but operating. If your team still needs a war room to assemble review evidence, the program is not mature. It is exposed.

The first wave of DORA supervisory engagement and audit activity is revealing a consistent pattern: regulators are not asking surprising questions. They are asking predictable questions and finding that organizations cannot answer them quickly, consistently, or with evidence that holds up to scrutiny. The gap is not knowledge. It is operational evidence architecture.

The Predictable First Requests

Across financial entities in the EU — credit institutions, investment firms, insurance undertakings, payment institutions — the initial DORA-related requests from auditors and supervisors follow a remarkably consistent pattern. Understanding what they ask first allows you to prioritize evidence readiness.

ICT Risk Management Framework Documentation

The first and most foundational request is for the ICT risk management framework required under DORA Art. 6. Auditors want to see the complete framework document, but more importantly, they want evidence that it is operationalized — not just written. Specific sub-requests typically include:

  • The governance structure: who owns ICT risk management, what is the reporting line to the management body, how frequently does the management body review ICT risk (per Art. 5(2))?
  • The risk identification methodology: how do you identify ICT risks across business functions, and how is the identification kept current?
  • The risk assessment and classification approach: what severity model do you use, how are ICT risks rated, and how do ratings feed into treatment decisions?
  • Evidence of management body approval and oversight, satisfying Art. 5(2) requirements that the management body defines, approves, oversees, and is responsible for the ICT risk management framework.

The common failure point is not the absence of a framework document. Most organizations have one. The failure is that the document describes an idealized operating model that diverges from actual practice. Auditors test this by cross-referencing framework claims against operational evidence: "Your framework says ICT risk is reviewed quarterly by the management body. Show me the last four meeting minutes with ICT risk agenda items."

Register of Information

DORA Art. 28(3) requires financial entities to maintain a register of information in relation to all contractual arrangements on the use of ICT services provided by ICT third-party service providers. This register is not a vendor list. It is a structured dataset that must include, at minimum: the ICT service provider identity, the services provided, the criticality assessment, the sub-outsourcing chain, and the contractual terms governing each arrangement.

The ESAs published technical standards specifying the register format, and supervisors expect this register to be current, complete, and producible on demand. In practice, "on demand" means within days, not weeks.

Organizations that maintained traditional vendor inventories before DORA find that their existing data does not map cleanly to the Art. 28(3) requirements. The register demands a level of granularity — particularly around sub-outsourcing chains and service-level criticality — that most legacy TPRM programs did not capture. FortisEU's vendor risk management module structures this register natively, mapping providers to services, criticality tiers, and contractual terms in the format supervisors expect.

Incident Classification Methodology

DORA Art. 18 requires financial entities to classify ICT-related incidents based on criteria specified by the ESAs, including: the number of clients and counterparts affected, the duration, the geographic spread, data losses, the criticality of services affected, and the economic impact. Auditors ask for the classification methodology document, but they also ask for evidence of application: show me how you classified your last three ICT incidents using this methodology.

The typical gap is that organizations have adopted a classification scheme on paper but apply it inconsistently in practice. Different teams interpret severity criteria differently, classification decisions are not documented with rationale, and the date of the methodology's last update is unclear.
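One way to make classification consistent and auditable is to encode the criteria so every decision carries its own documented rationale. The sketch below is illustrative only: the thresholds, field names, and the major/non-major split are hypothetical placeholders, not the ESAs' actual classification criteria.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Severity dimensions loosely follow the DORA Art. 18 criteria listed above.
# All thresholds here are HYPOTHETICAL — real values come from the ESAs' RTS
# on incident classification, not from this sketch.

@dataclass
class IncidentFacts:
    clients_affected: int
    duration_hours: float
    countries_affected: int
    data_lost: bool
    critical_service_affected: bool
    economic_impact_eur: float

def classify(facts: IncidentFacts) -> tuple[str, list[str]]:
    """Return (classification, rationale) so the decision is documented, not just made."""
    rationale = []
    major = False
    if facts.clients_affected > 1000:  # placeholder threshold
        rationale.append(f"clients affected {facts.clients_affected} > 1000")
        major = True
    if facts.duration_hours > 24:  # placeholder threshold
        rationale.append(f"duration {facts.duration_hours}h > 24h")
        major = True
    if facts.critical_service_affected and facts.data_lost:
        rationale.append("data loss on a critical service")
        major = True
    label = "major" if major else "non-major"
    rationale.append(f"classified at {datetime.now(timezone.utc).isoformat()}")
    return label, rationale
```

The point of the pattern is the returned rationale list: when an auditor asks "show me how you classified your last three incidents," each classification already carries the criteria that drove it and the timestamp of the decision.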

TLPT Scope and Planning

For financial entities subject to threat-led penetration testing under DORA Art. 26-27, auditors and supervisors are asking about scope determination. Which critical functions are included in the TLPT program? What methodology was used to determine scope? What is the testing timeline? Who are the designated threat intelligence and red team providers?

Even for organizations not yet in their first TLPT cycle, supervisors want to see evidence of scope planning and methodology selection. The expectation is that TLPT is being actively planned, not deferred.

Business Continuity and Recovery

DORA Art. 11-12 require ICT business continuity management, including business impact analysis, recovery plans, and regular testing. Auditors ask for: the business impact analysis covering all ICT-dependent critical functions, recovery time and recovery point objectives for each critical function, test results from the most recent continuity exercise, and lessons learned with remediation tracking.

The consistent finding is that business continuity plans exist but testing evidence is thin. Organizations that test annually produce stale results. Organizations that test more frequently often do not document outcomes in a format that satisfies regulatory review.

Why Evidence Architecture Matters More Than Policy

The pattern across all these requests is consistent: auditors are not primarily evaluating whether policies and frameworks exist. They are evaluating whether those frameworks operate — whether decisions are made, documented, and traceable according to the framework's own design.

This is the fundamental shift that DORA introduces. Previous regulatory frameworks for financial entities tended to focus on whether appropriate governance structures and policies were in place. DORA focuses on whether those structures produce verifiable operational outcomes. Art. 6(5) requires financial entities to "identify, classify and adequately document all ICT supported business functions." Art. 5(2) requires the management body to bear "ultimate responsibility" for ICT risk management. These are operational obligations with evidence requirements, not policy obligations with documentation requirements.

For compliance officers and risk managers, this means the evidence architecture — how evidence is generated, stored, linked to controls, and retrievable — is as important as the control framework itself. A well-designed control with no evidence is, from a regulatory perspective, indistinguishable from no control at all.

Building Audit-Ready Evidence Packs

Organizations that perform well in DORA reviews share a common approach: they maintain standing evidence packs that can be assembled into regulatory responses within hours, not weeks. The architecture behind this capability involves several components.

Continuous Evidence Collection

Rather than assembling evidence retroactively for each review, mature programs generate evidence continuously as controls operate. Access reviews produce timestamped completion records. Risk assessments generate versioned artifacts. Management body meetings produce minutes with ICT risk agenda items flagged and tagged. FortisEU's evidence collection module automates this capture across your control framework, generating audit-ready artifacts as a byproduct of normal operations.

Control-Evidence Mapping

Every control in the DORA framework should be mapped to one or more evidence artifacts, with defined freshness requirements. This mapping serves as a standing index: when an auditor requests evidence for a specific Art. 6 requirement, the compliance team can immediately identify which artifacts are relevant, where they are stored, and whether they are current.
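A standing control-evidence index can be as simple as a lookup table with freshness windows. In this sketch the DORA article references come from the text above, but the artifact names, freshness windows, and review cadence are hypothetical examples, not prescribed values.

```python
from datetime import date, timedelta

# Illustrative control-to-evidence index. Article references are real;
# artifact names and max_age_days windows are ASSUMED examples.
EVIDENCE_MAP = {
    "DORA Art. 5(2) management body oversight": {
        "artifacts": ["management-body-minutes"],
        "max_age_days": 120,  # assumes a quarterly review cadence
    },
    "DORA Art. 6 ICT risk framework": {
        "artifacts": ["risk-framework-doc", "risk-register"],
        "max_age_days": 365,
    },
}

def stale_evidence(last_produced: dict[str, date], today: date) -> list[str]:
    """Return controls whose newest artifact is older than its freshness window."""
    stale = []
    for control, spec in EVIDENCE_MAP.items():
        newest = max(last_produced.get(a, date.min) for a in spec["artifacts"])
        if today - newest > timedelta(days=spec["max_age_days"]):
            stale.append(control)
    return stale
```

Running a staleness check like this on a schedule turns "is our evidence current?" from an audit-time scramble into a routine report.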

Version Control and Audit Trail

DORA evidence must be timestamped and version-controlled. When a risk assessment is updated, the previous version must be retained. When a policy is amended, the change history must be traceable. When a management body decision is recorded, the meeting date, attendees, and decision text must be preserved.

This is not optional documentation discipline — it is a regulatory requirement. Art. 6(8) requires financial entities to document their ICT risk management framework, including all changes and the dates of those changes. Auditors will test version control as part of their assessment.
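The retention discipline Art. 6(8) implies can be modeled as an append-only version history: amendments create new versions and never overwrite old ones. The schema and storage below are a minimal sketch, not a prescribed format.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class VersionedDocument:
    """Append-only version history for a compliance artifact (illustrative schema)."""
    name: str
    _versions: list[dict] = field(default_factory=list)

    def amend(self, text: str, author: str) -> None:
        """Append a new version; earlier versions are never overwritten."""
        self._versions.append({
            "version": len(self._versions) + 1,
            "text": text,
            "author": author,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        })

    def current(self) -> dict:
        return self._versions[-1]

    def history(self) -> list[dict]:
        """The full change history, in order — what an auditor asks for."""
        return list(self._versions)
```

Whatever the actual storage backend, the invariant is the same: a change produces a new timestamped record, and the prior record remains retrievable.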

Cross-Domain Linkage

DORA obligations are interconnected. An incident (Art. 17-23) may trigger review of the risk management framework (Art. 6), updates to the register of information (Art. 28), and modifications to business continuity plans (Art. 11). Evidence that links these domains — showing how an incident finding propagated to framework updates — demonstrates operational maturity that auditors value.

The Register of Information Deep Dive

The Art. 28(3) register of information deserves particular attention because it is the single most commonly requested artifact in early DORA reviews, and the one where organizations most frequently fall short.

The register must cover all contractual arrangements for ICT services provided by third-party providers. "All" is the operative word — not just critical providers, not just outsourced services, but all ICT service arrangements. The ESAs' technical standards (RTS on the register of information) specify the data fields, which include:

  • Provider identification and LEI (Legal Entity Identifier) where available
  • Description of ICT services provided
  • Function supported and its criticality assessment
  • Start date, duration, and renewal terms
  • Sub-outsourcing arrangements, including the full chain
  • Data storage and processing locations
  • Substitutability assessment
  • Contractual terms regarding audit rights, exit provisions, and incident notification

For large financial groups, this register may contain hundreds or thousands of entries. Maintaining it manually is operationally unsustainable. Automated integration with contract management, procurement, and vendor risk management systems is necessary to keep the register current.

Supervisors are using the register as a starting point for deeper inquiry. They select entries from the register and request supporting documentation: show me the contract. Show me the criticality assessment. Show me the last due diligence review. Show me the exit plan. The register is not the endpoint — it is the index that enables structured supervisory examination.

Responding to On-Site Supervisory Visits

For financial entities subject to on-site supervisory visits — particularly significant institutions under ECB direct supervision — DORA examinations have specific characteristics that differ from traditional IT audit engagements.

Supervisors arrive with pre-formed hypotheses based on off-site analysis of the institution's reporting, peer comparisons, and thematic review priorities. They request specific evidence to test those hypotheses. Responses that are narrative-heavy and evidence-light will result in follow-up requests and extended examination timelines.

Practical preparation includes: designating a single point of contact for supervisory evidence requests (not the entire compliance team); maintaining a DORA evidence repository that the designated contact can navigate quickly; preparing briefing materials that connect your control framework to DORA articles (so you can respond to article-specific questions without translation delay); and ensuring that control owners are available and briefed on their responsibilities during the examination period.

Managing the NIS2-DORA Overlap

Financial entities that are also subject to NIS2 — and many are, particularly payment institutions and market infrastructures — face overlapping supervisory expectations. DORA Art. 1(2) addresses this by establishing that DORA is lex specialis for financial entities, meaning DORA requirements take precedence where they overlap with NIS2.

In practice, the overlap creates operational complexity in areas like incident reporting (DORA Art. 19 vs. NIS2 Art. 23), supply chain security (DORA Art. 28 vs. NIS2 Art. 21(2)(d)), and governance (DORA Art. 5 vs. NIS2 Art. 20). Organizations subject to both should maintain a single integrated control framework with mapping to both regulations, rather than operating parallel compliance programs. FortisEU's compliance automation platform supports this cross-regulation mapping natively.
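A single integrated framework can be represented as a cross-regulation index over those overlaps. The article pairings below come straight from the text; the internal control IDs are hypothetical placeholders.

```python
# Cross-regulation index for the overlaps named above.
# DORA/NIS2 article pairings are from the text; control IDs are ASSUMED.
OVERLAP_MAP = [
    {"domain": "incident reporting", "dora": "Art. 19",
     "nis2": "Art. 23", "control": "CTL-IR-01"},
    {"domain": "supply chain security", "dora": "Art. 28",
     "nis2": "Art. 21(2)(d)", "control": "CTL-SC-01"},
    {"domain": "governance", "dora": "Art. 5",
     "nis2": "Art. 20", "control": "CTL-GOV-01"},
]

def controls_for(regulation: str, article: str) -> list[str]:
    """Answer an article-specific question from either regulator with one lookup."""
    key = regulation.lower()
    return [row["control"] for row in OVERLAP_MAP if row.get(key) == article]
```

The payoff is that a question framed in either regulation's article numbers resolves to the same underlying control, so evidence is collected once and reused for both supervisors.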

The Maturity Signal Auditors Look For

Beyond specific evidence artifacts, auditors evaluate overall program maturity. The signals they look for include:

Speed of evidence production. Can you produce the requested artifact within hours, or does it take weeks? Speed correlates with operational integration — evidence that is produced quickly was generated by operational processes, not assembled for the audit.

Consistency across domains. Does your risk management framework language match your incident classification language? Do your register of information entries align with your business continuity scope? Inconsistency suggests that different teams built different components without integration.

Evidence of learning loops. Has your framework been updated based on operational experience? Can you show how an incident finding led to a control improvement? Can you demonstrate how a TLPT result changed your risk assessment? Supervisors value evidence that the framework is living, not static.

Management body engagement. Not just meeting minutes, but evidence of substantive engagement. Did the management body ask questions? Request follow-up? Make decisions? Challenge assumptions? Supervisors can distinguish perfunctory oversight from genuine governance.

Key Takeaways

  • The first DORA requests are predictable. ICT risk management framework, register of information, incident classification methodology, TLPT scope, and business continuity test results. Prepare standing evidence packs for each.

  • Evidence architecture is more important than policy quality. Auditors test whether frameworks operate, not just whether they exist. A well-written policy with no operational evidence is a finding, not an asset.

  • The register of information is the most common gap. Art. 28(3) requires comprehensive, granular, current data on all ICT third-party arrangements. Manual maintenance does not scale. Invest in automated register management.

  • Speed of evidence production signals maturity. Organizations that produce evidence in hours demonstrate operational integration. Organizations that need weeks demonstrate audit-preparation dependence. Auditors notice the difference.

  • Integrate DORA and NIS2 compliance. For entities subject to both, a single control framework with dual-regulation mapping is more efficient and defensible than parallel programs.

Next Step

Turn guidance into evidence.

If procurement is involved, start with the Trust Center. If you want to see the product, create an account or launch a live demo.