The EU regulatory environment in 2025 is placing demands on compliance teams that the traditional operating model was never designed to handle. NIS2 requires cybersecurity risk management with management body accountability. DORA requires ICT third-party risk management, initial major-incident notification within four hours of classification, and threat-led penetration testing. The EU AI Act requires risk classification, conformity assessments, and governance structures for high-risk AI systems. And most in-scope entities need to comply with at least two of these simultaneously.
The compliance function that worked in 2020 — a small team of legal-adjacent professionals maintaining policy documents and managing annual audits — is structurally insufficient for this operating environment. The question is not whether to invest in compliance capability. The question is how to build a team that can operate across multiple regulatory frameworks, bridge the gap between legal interpretation and technical implementation, and sustain continuous compliance rather than periodic compliance theatre.
The Demand Surge: Three Regulations, Not Enough People
The numbers illustrate the problem. NIS2 brings approximately 160,000 entities into scope across the EU — up from roughly 10,000 under NIS1. DORA applies to over 22,000 financial entities and their ICT service providers. The EU AI Act affects any organisation deploying or developing AI systems classified as high-risk, a category that spans healthcare, financial services, education, employment, and critical infrastructure.
Each regulation requires specialised knowledge. NIS2 demands understanding of cybersecurity risk management frameworks, incident handling procedures, and supply chain security. DORA demands familiarity with ICT risk management, digital operational resilience testing, and financial sector supervisory practices. The AI Act demands knowledge of AI risk classification, technical documentation, and conformity assessment procedures.
The labour market has not adjusted. LinkedIn job postings for "NIS2 compliance" roles tripled between Q4 2024 and Q2 2025. Demand for DORA expertise in financial services has produced salary inflation of 25-40% for experienced GRC professionals in major financial centres. And the AI Act has created an entirely new role category — AI governance specialist — for which the supply of qualified candidates is negligible because the regulation is less than a year old.
This supply-demand imbalance is not temporary. The regulations are permanent. The compliance obligations are continuous. And the supervisory apparatus is still scaling up, meaning enforcement intensity will increase, not decrease, over the coming years.
The Modern Compliance Team: Roles Beyond the Traditional Compliance Officer
The traditional compliance team operated on a legal-advisory model: interpret the regulation, write the policy, train the staff, manage the audit. This model assumed that compliance was primarily a documentation and advisory function, with limited need for technical depth or operational involvement.
The NIS2/DORA/AI Act landscape breaks this assumption in three ways.
First, the regulations require technical implementation, not just policy. NIS2 Article 21 requires actual cybersecurity measures — incident detection, access control, encryption, supply chain security controls. DORA requires a functioning ICT risk management framework, operational resilience testing, and a maintained register of information. You cannot comply through documentation alone.
Second, the regulations require continuous evidence, not periodic attestation. DORA requires an initial notification within four hours of classifying an incident as major. NIS2 requires an early warning within 24 hours of becoming aware of a significant incident. Supply chain risk must be continuously assessed, not annually reviewed. The compliance function must be operationally embedded, not periodically consulted.
Third, the regulations span domains that no single professional profile covers. Legal interpretation of directive text. Technical implementation of cybersecurity controls. Data analysis for risk assessment. Vendor management for third-party oversight. AI governance for the AI Act. No individual has all of these skills. The team must be designed as a composite capability.
The emerging roles that successful compliance teams are building around include:
Head of GRC / Chief Compliance Officer. The strategic leader. Owns the compliance programme, reports to the management body (as required by NIS2 Article 20 and DORA Article 5), and manages regulatory relationships. This role increasingly requires both legal and technical fluency — the ability to translate supervisory expectations into operational requirements and vice versa. For CISOs in organisations where compliance reports into the security function, this is becoming a dual mandate.
GRC Engineer. This is the role that did not exist five years ago and is now the hardest to fill. The GRC engineer sits at the intersection of compliance requirements and technical systems. They build and maintain the tooling that automates evidence collection, control monitoring, and regulatory reporting. They integrate compliance automation platforms with the organisation's technical infrastructure. They write the queries that populate the DORA register of information from procurement and IT asset management systems. They are neither pure engineers nor pure compliance professionals — they are the bridge.
Regulatory Intelligence Analyst. Responsible for monitoring the regulatory landscape, tracking transposition progress across Member States (critical for NIS2), analysing ENISA guidance and ESA technical standards, and translating regulatory developments into impact assessments for the organisation. In a fast-moving regulatory environment, this role prevents the team from being surprised by new requirements or supervisory expectations.
Third-Party Risk Manager. DORA Chapter V and NIS2 Article 21(2)(d) both require structured vendor risk management. This role owns the vendor assessment process, maintains the register of information, conducts due diligence on new ICT service providers, monitors existing providers, and manages exit strategy development for critical dependencies. In organisations with significant vendor portfolios, this is a full-time function, not a task assigned to the compliance officer.
Compliance Automation Specialist. Focused on designing and operating the workflows, dashboards, and automated processes that make continuous compliance feasible. This role works with questionnaire automation tools, evidence collection pipelines, and reporting templates. They reduce the manual burden on the rest of the team and ensure that compliance evidence is generated as a by-product of operations, not assembled manually for audits.
AI Governance Specialist. For organisations deploying high-risk AI systems under the EU AI Act, this role manages risk classification, maintains technical documentation, coordinates conformity assessments, and implements the governance structures required by Article 9 (risk management system) and Article 17 (quality management system). This is the newest and rarest profile.
The GRC Engineer: Where Compliance Meets Engineering
The GRC engineer role deserves particular attention because it represents the most significant structural shift in compliance team composition.
Historically, compliance teams consumed technology — they used GRC platforms configured by IT or vendors. The GRC engineer produces compliance technology. They write code or configure systems to automate evidence collection. They build integrations between the compliance platform and the organisation's identity provider, cloud infrastructure, ticketing systems, and monitoring tools. They ensure that when a supervisor asks for evidence of access control reviews, the answer is not "we will compile it" but "here is the automated report generated last Tuesday."
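To make this concrete, here is a minimal sketch of the kind of evidence-collection automation a GRC engineer builds. The record shape and the `fetch_access_grants` stub are invented for illustration — in practice the data would come from the identity provider's API — but the pattern (pull, filter, timestamp, emit) is the point: the evidence report exists as a by-product of operations, ready whenever a supervisor asks.

```python
import json
from datetime import datetime, timezone

def fetch_access_grants():
    """Stub standing in for a real identity-provider API call.
    In production this would page through the provider's admin or SCIM API."""
    return [
        {"user": "alice", "system": "payments-db", "role": "admin", "last_review": "2025-05-01"},
        {"user": "bob", "system": "payments-db", "role": "read", "last_review": "2024-06-15"},
    ]

def access_review_report(grants, max_age_days=90, now=None):
    """Flag grants whose last review is older than the policy window,
    and wrap the result in a timestamped evidence record."""
    now = now or datetime.now(timezone.utc)
    stale = [
        g for g in grants
        if (now - datetime.fromisoformat(g["last_review"]).replace(tzinfo=timezone.utc)).days > max_age_days
    ]
    return {
        "generated_at": now.isoformat(),
        "control": "access-review",   # hypothetical control ID in the GRC platform
        "population": len(grants),
        "stale_grants": stale,
        "compliant": not stale,
    }

report = access_review_report(fetch_access_grants())
print(json.dumps(report, indent=2, default=str))
```

Run on a schedule and archived, each report is a dated, reproducible piece of evidence rather than a spreadsheet assembled under audit pressure.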
The skills profile is distinctive:
- Understanding of regulatory requirements sufficient to identify what constitutes valid evidence
- Technical skills to build integrations, write queries, and configure automation workflows
- Data engineering capability to aggregate and normalise evidence from multiple source systems
- Scripting and API fluency (Python, SQL, REST APIs are the common toolkit)
- Familiarity with GRC platforms and their extensibility models
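As an illustration of the "queries that populate the register of information", a sketch using an in-memory SQLite database. The table and column names are invented for the example — real procurement and asset management systems will differ — but the join shape is typical: contractual relationships from procurement, criticality ratings from the asset inventory, combined into register rows.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Stand-ins for the procurement and IT asset management systems.
    CREATE TABLE contracts (vendor TEXT, service TEXT, expiry TEXT);
    CREATE TABLE assets    (service TEXT, criticality TEXT);
    INSERT INTO contracts VALUES
        ('CloudCo',  'hosting', '2026-12-31'),
        ('MailCorp', 'email',   '2025-09-30');
    INSERT INTO assets VALUES
        ('hosting', 'critical'),
        ('email',   'important');
""")

# One row per ICT service provider relationship, enriched with the
# criticality rating the register of information needs to record.
rows = conn.execute("""
    SELECT c.vendor, c.service, a.criticality, c.expiry
    FROM contracts c
    JOIN assets a ON a.service = c.service
    ORDER BY c.vendor
""").fetchall()

for row in rows:
    print(row)
```

The real value is in keeping this query live against the source systems, so the register is regenerated on demand rather than maintained by hand.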
The market for GRC engineers is extremely tight because the role requires a combination that formal education rarely produces. Most GRC engineers are either security engineers who developed regulatory knowledge or compliance professionals who taught themselves to code. Neither pathway is common.
Organisations that cannot hire GRC engineers have two alternatives: train them internally (pairing a motivated engineer with a compliance mentor, or vice versa) or outsource the function to managed GRC service providers that offer engineering capability as a service. The second option is viable for initial setup but creates dependency if not managed carefully.
Team Structures by Organisation Size
The right team structure depends on the organisation's size, regulatory exposure, and technical complexity. The following structures are illustrative, not prescriptive. They assume the organisation is in scope for at least NIS2 or DORA, and potentially both.
The 50-Person Organisation
A 50-person entity newly in scope under NIS2 — say, a managed service provider or a mid-sized SaaS company in a critical sector — typically cannot justify a dedicated compliance team. The realistic structure is:
- One senior hire combining the Head of GRC and regulatory intelligence functions. This person owns the compliance programme, reports to the CEO or CTO, and interfaces with the competent authority. They need both legal literacy and technical fluency.
- Fractional or outsourced GRC engineering to set up compliance tooling, build evidence collection integrations, and configure the automation workflows. This can be a part-time contractor or a managed service.
- Existing IT/security staff absorb operational compliance tasks — running vulnerability scans, maintaining access reviews, operating incident response — with guidance from the compliance lead on what constitutes adequate evidence.
- External MSSP provides monitoring and incident detection capability, with contractual requirements aligned to NIS2 reporting timelines.
Total dedicated compliance headcount: 1-1.5 FTE. Total compliance-adjacent time from existing staff: 0.5-1 FTE.
The key risk at this size is single-point-of-failure dependency on the compliance lead. Documented procedures, automated evidence collection, and clear delegation arrangements are essential to mitigate this risk.
The 200-Person Organisation
A 200-person financial services firm subject to both NIS2 and DORA — an investment firm, a payment institution, or an insurance intermediary — needs a dedicated function:
- Head of GRC (1 FTE) — strategic leadership, management body reporting, supervisory relationships
- GRC Engineer (1 FTE) — tooling, automation, evidence pipeline, register of information maintenance
- Third-Party Risk Manager (1 FTE) — DORA register, vendor assessments, exit strategy management, concentration risk analysis
- Compliance Officer (1 FTE) — policy management, training, incident classification and reporting, audit coordination
- Shared resources: IT security team handles operational controls; legal counsel provides regulatory interpretation on demand
Total dedicated compliance headcount: 4 FTE. This structure handles NIS2 and DORA requirements without requiring every team member to be a domain expert in both frameworks.
The 1000+ Person Organisation
A large bank, energy utility, or telecommunications operator subject to NIS2 (and DORA for financial entities) requires a structured compliance department:
- Chief Compliance Officer or VP of GRC — executive leadership, board reporting, cross-functional coordination
- Regulatory Intelligence Team (2-3 analysts) — monitoring 27 Member States' transposition, tracking ESA technical standards, analysing ENISA guidance
- GRC Engineering Team (2-4 engineers) — platform management, automation development, integration with enterprise systems
- Third-Party Risk Management Team (2-4 specialists) — managing hundreds or thousands of vendor relationships across multiple DORA/NIS2 requirements
- Operational Compliance Team (3-5 professionals) — policy lifecycle management, training delivery, incident reporting operations, audit management
- AI Governance Specialist (1 FTE, if deploying high-risk AI systems)
- Regional/jurisdictional leads for entities operating across multiple Member States with varying NIS2 transposition status
Total dedicated compliance headcount: 12-20 FTE, depending on regulatory exposure and geographic complexity.
Skills to Hire for vs Skills to Automate
Not every compliance task requires a human with specialised skills. The efficient team invests in human capability where judgment is required and automates where the task is repeatable, data-driven, or time-sensitive.
Hire for:
- Regulatory interpretation and impact assessment (requires contextual judgment)
- Supervisory relationship management (requires interpersonal skill and domain authority)
- Risk acceptance decisions (requires business context that automated systems lack)
- Incident classification under ambiguity (the four-hour DORA window demands human judgment on borderline cases)
- Exit strategy development (requires understanding of operational dependencies that resist automation)
- Management body training and engagement (requires communication skills adapted to a non-technical audience)
Automate:
- Evidence collection from technical controls (access logs, patch status, encryption configuration)
- Control monitoring and drift detection (automated comparison of current state against baseline)
- Vendor questionnaire distribution, tracking, and reminder workflows
- Policy review scheduling and attestation tracking
- Regulatory change monitoring and alert distribution
- Report generation for supervisory submissions
- Training completion tracking and certification management
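The control-monitoring and drift-detection items above reduce to a simple pattern: snapshot current state, diff it against an approved baseline, alert on divergence. A minimal sketch (the control names and state values are invented for illustration):

```python
def detect_drift(baseline: dict, current: dict) -> list:
    """Compare current control state against the approved baseline.
    Returns (control, expected, actual) tuples for anything that has
    drifted from the baseline or gone missing entirely."""
    drift = []
    for control, expected in baseline.items():
        actual = current.get(control, "<missing>")
        if actual != expected:
            drift.append((control, expected, actual))
    return drift

# Hypothetical baseline, e.g. exported from the GRC platform.
baseline = {
    "mfa_enforced": True,
    "disk_encryption": "AES-256",
    "log_retention_days": 365,
}
# Hypothetical current state, e.g. collected by the evidence pipeline.
current = {
    "mfa_enforced": True,
    "disk_encryption": "AES-128",   # drifted from baseline
    "log_retention_days": 365,
}

for control, expected, actual in detect_drift(baseline, current):
    print(f"DRIFT: {control}: expected {expected!r}, got {actual!r}")
```

Everything beyond the diff — where the state snapshots come from, who gets the alert, how remediation is tracked — is workflow plumbing that automation platforms handle; the judgment calls stay with the humans.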
The boundary between hire and automate is not static. As compliance automation platforms mature, tasks that previously required human analysis — such as vendor risk scoring based on questionnaire responses or control gap identification based on framework mapping — are becoming automatable. The team should be designed with this trajectory in mind: invest in human roles that are durable, and build automation for roles that are transitional.
The Outsourcing Question: MSPs, Consultants, and What to Keep In-House
The talent shortage makes outsourcing tempting. And for specific functions, outsourcing is the right answer. But the NIS2 and DORA frameworks impose limits on what can be outsourced without creating compliance risk.
Appropriate to outsource:
- Security monitoring and incident detection (MSSP) — provided contractual notification timelines align with regulatory reporting requirements
- Penetration testing and TLPT execution — typically outsourced to specialised testing firms, as DORA Articles 26 and 27 impose strict criteria on TLPT testers that most entities satisfy by contracting external specialists
- GRC platform implementation and initial configuration — a one-time or periodic engagement
- Regulatory gap assessments — useful for establishing baseline, though the entity must own the remediation programme
- Vendor questionnaire administration — the operational mechanics of distributing and collecting questionnaire responses
Must keep in-house:
- Management body accountability and governance decisions — NIS2 Article 20 and DORA Article 5 place non-delegable responsibilities on the management body
- Risk acceptance decisions — the entity must own these; outsourced advisors can inform but not decide
- Supervisory relationships — regulators expect to engage with the entity's own staff, not its consultants
- Incident classification and regulatory notification — the judgment call on whether an incident meets notification thresholds must be made by personnel with authority to commit the entity
- Strategic compliance programme design — the consultant can advise, but the programme must reflect the entity's specific risk profile and business context
The hybrid model — a lean in-house compliance core supported by outsourced operational capabilities — works well for entities in the 50-200 person range. Larger entities should bring more capability in-house as the cost-effectiveness crossover favours permanent staff over sustained consulting engagements.
One caution: over-reliance on consultants for compliance programme design creates a particular risk. The programme looks professional on paper but has no institutional knowledge underpinning it. When the consultants disengage, the in-house team may not understand the rationale behind the design decisions, making the programme fragile and difficult to maintain. Any consulting engagement should include explicit knowledge transfer as a deliverable, not an afterthought.
Key Takeaways
- The NIS2/DORA/AI Act regulatory wave requires compliance teams that blend legal, technical, and operational skills. The traditional policy-and-audit model is insufficient for regulations that require technical implementation, continuous evidence, and cross-domain expertise.
- The GRC engineer — bridging compliance knowledge and technical implementation — is the most important new role and the hardest to hire. Organisations that cannot recruit externally should invest in internal training programmes that pair engineers with compliance mentors.
- Team size scales with organisational complexity: 1-1.5 dedicated FTE for a 50-person entity, 4 FTE for a 200-person entity, and 12-20 FTE for a 1000+ person enterprise. These are not aspirational — they reflect the operational demands of the regulations.
- Automate evidence collection, control monitoring, and workflow management. Hire for regulatory judgment, supervisory relationships, and risk decisions. The boundary will shift as automation matures, so design the team with this trajectory in mind.
- Outsourcing is appropriate for monitoring, testing, and operational administration. Governance, risk acceptance, supervisory engagement, and incident classification must remain in-house. Over-reliance on consultants without knowledge transfer creates fragile programmes.
