ENISA's 2025 NIS Investments report confirms what experienced security leaders already suspect: the correlation between cybersecurity spending levels and operational resilience outcomes is weaker than boardrooms assume. Organizations across the EU are investing heavily in security technology while systematically underinvesting in the people and operational processes required to make that technology deliver results. The execution gap between purchased capability and operational reality is where resilience actually fails.
What the 2025 Report Reveals
ENISA's annual NIS Investments report surveys essential and important entities across EU Member States to understand cybersecurity spending patterns, resource allocation, and capability maturity. The 2025 edition, covering data from the 2024 fiscal year, reveals several patterns that should concern security leaders and the boards they report to.
Technology spending continues to grow, but unevenly. Across surveyed entities, cybersecurity budgets as a percentage of IT spending have increased modestly year over year. However, the distribution is heavily skewed toward technology acquisition: endpoint protection, network security, SIEM/SOAR platforms, and cloud security tooling. These categories consistently consume 60-70% of cybersecurity budgets across sectors.
Talent investment lags dramatically. Spending on cybersecurity personnel, training, and capability development accounts for less than 25% of security budgets in most sectors. This ratio has remained stubbornly consistent across multiple years of ENISA reporting, despite widespread acknowledgment of the skills shortage.
Operational process maturity trails technology deployment. Perhaps the most telling finding is the gap between technology deployment rates and the operational maturity scores for the processes that should leverage those technologies. Organizations report high deployment rates for security tooling but significantly lower maturity scores for incident response, vulnerability management, and supply chain risk assessment processes.
Sector variation is significant. Financial services and energy sectors show higher absolute spending and better talent-to-technology ratios than healthcare, water management, and digital infrastructure sectors. This tracks with the regulatory pressure gradient: sectors that faced earlier, more prescriptive regulation (financial services under DORA, energy under pre-NIS2 requirements) invested in operational capability earlier.
The Execution Gap: Why More Tools Can Mean Slower Decisions
The execution gap is not a mystery. It follows a predictable pattern that ENISA's data illuminates across sectors.
An organization acquires a new security platform. Deployment proceeds. The tool generates alerts, dashboards, and data. But the team operating the tool has not grown proportionally. The processes consuming the tool's output have not been redesigned. The decision frameworks that should translate tool output into risk reduction actions remain unchanged.
The result is more visibility and slower decisions. Alert volumes increase. Dashboard complexity grows. The cognitive load on security teams rises. Mean time to triage worsens because each alert now competes with output from multiple overlapping tools. Executive reporting becomes more data-rich and less decision-useful, because the reporting process aggregates tool output rather than synthesizing risk posture.
This pattern is not theoretical. ENISA's survey data shows that organizations with high technology deployment scores but low process maturity scores consistently report longer incident response times, higher vulnerability remediation backlogs, and lower confidence in their risk posture assessments. More tools, worse outcomes. The paradox is real and measurable.
The Talent Dimension: Not Just a Headcount Problem
The talent gap in EU cybersecurity is frequently discussed in headcount terms: unfilled positions, competition for candidates, salary inflation. These are real constraints, but the ENISA data suggests the talent problem is more nuanced than a simple supply-demand imbalance.
Skills composition matters more than team size. Organizations with smaller but well-composed security teams (combining technical depth with governance, risk, and communication skills) consistently outperform larger teams composed primarily of technical operators. The most effective security programs have people who can translate between technical reality and business context. Those translators are scarcer than any technical specialty.
Retention is a bigger problem than recruitment. ENISA's survey data indicates that cybersecurity staff turnover rates in EU essential entities average 15-20% annually. At that rate, organizational knowledge degrades faster than it accumulates. The investment in onboarding, tool training, and relationship building that makes a security team effective is repeatedly written off. Retention investment, including career development, workload management, and decision authority, yields higher returns than recruitment spending.
Training investment is backward-looking. Most cybersecurity training budgets focus on certification maintenance (ISO 27001 Lead Auditor, CISSP, CISM) rather than forward-looking capability development. Certifications are necessary but not sufficient. The skills gap that actually impairs operations is in areas like cloud security architecture, AI risk assessment, supply chain threat modeling, and regulatory interpretation. These do not map cleanly to existing certification programs.
Practical Budget Allocation: The 40-30-30 Framework
Based on the patterns ENISA's data reveals, security leaders should reconsider the conventional budget allocation that concentrates spending on technology acquisition. A more resilient allocation follows a 40-30-30 model:
40% Technology. This covers tooling acquisition, licensing, and maintenance. The key discipline is rationalization: before adding a new tool, audit whether existing tools are being used to their full capability. ENISA's data consistently shows that organizations typically use only 40-60% of the features in their deployed security tools. Increasing utilization of existing investments is cheaper and less operationally disruptive than deploying new platforms.
30% People. This covers personnel costs, training, and development, but also two areas that budgets frequently ignore: operational process design (the work of structuring how tools are used, not just who uses them) and decision framework development (building the triage models, escalation protocols, and reporting structures that convert tool output into risk decisions). Investing in process design is investing in people effectiveness.
30% Operations. This covers the ongoing operational costs that neither technology licensing nor personnel costs capture: exercise programs, tabletop scenarios, evidence management, audit preparation, third-party assessments, and the continuous improvement work that makes the other 70% deliver outcomes. This category is the most frequently underfunded and the most directly correlated with operational resilience.
The exact ratios will vary by organizational maturity and sector. A newly regulated NIS2 entity building from a low baseline may need to front-load technology spending. A mature financial services entity may need to shift toward operations and people. The principle is constant: technology without proportional investment in people and operational process creates complexity, not capability.
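The 40-30-30 check above can be sketched in a few lines. The following is an illustrative example only: the category names, tolerance band, and sample budget figures are hypothetical choices for this sketch, not part of ENISA's methodology.

```python
# Sketch: compare a security budget against the 40-30-30 target model.
# Category names, tolerance, and figures are hypothetical.

TARGET = {"technology": 0.40, "people": 0.30, "operations": 0.30}

def allocation_gaps(spend: dict[str, float], tolerance: float = 0.05) -> dict[str, float]:
    """Return per-category deviation from the 40-30-30 target shares.

    Positive values mean over-allocation relative to the target;
    deviations within `tolerance` are reported as 0.0.
    """
    total = sum(spend.values())
    gaps = {}
    for category, target_share in TARGET.items():
        actual_share = spend.get(category, 0.0) / total
        deviation = actual_share - target_share
        gaps[category] = deviation if abs(deviation) > tolerance else 0.0
    return gaps

# A tool-heavy budget (EUR thousands), typical of the pattern ENISA describes:
current = {"technology": 1400, "people": 400, "operations": 200}
print(allocation_gaps(current))
```

Running this against the sample budget flags technology as roughly 30 percentage points over target and operations as 20 points under, which is the imbalance the framework is designed to surface.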
The Evidence Latency Problem
One of the most operationally significant findings in ENISA's report is the prevalence of evidence latency: the time between a control being tested and the evidence of that test being available for governance consumption.
In organizations with high technology spending but low process maturity, evidence latency averages weeks to months. A vulnerability scan runs, but the results are not triaged, risk-contextualized, and made available for executive reporting for weeks. A control is tested, but the test evidence is not formatted, stored, and linked to the relevant compliance requirement until an audit is imminent.
This latency has compounding costs. It delays risk decisions because leadership lacks current information. It creates audit preparation scrambles that consume operational capacity. It undermines the credibility of continuous control monitoring claims because the monitoring may be continuous but the governance visibility is periodic.
Reducing evidence latency is one of the highest-return investments a security program can make. It does not require new technology. It requires process redesign: standardizing evidence formats, automating evidence collection from existing tools, and integrating evidence workflows into operational processes rather than treating evidence as a separate compliance activity.
What High-Performance Programs Do Differently
ENISA's data reveals consistent patterns among the organizations that achieve strong operational resilience outcomes relative to their spending levels. These organizations share several characteristics:
They measure outcomes, not activity. Instead of tracking dashboard metrics (alerts processed, tickets closed, tools deployed), they measure risk posture change: mean exposure duration for critical vulnerabilities, time to detect and contain incidents, percentage of critical controls operating within tolerance, and evidence currency across the control framework.
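One of the outcome metrics above, mean exposure duration for critical vulnerabilities, can be computed directly from finding timestamps. This is a minimal sketch with made-up data; the key design point is that open findings accrue exposure up to the reporting date rather than being excluded.

```python
# Sketch: mean exposure duration for critical findings (detection to
# remediation). Dates are illustrative; None means the finding is still open.
from datetime import date

findings = [
    (date(2025, 1, 10), date(2025, 1, 25)),
    (date(2025, 2, 1), date(2025, 2, 11)),
    (date(2025, 2, 20), None),  # still open at reporting time
]

def mean_exposure_days(findings, as_of: date) -> float:
    """Mean days critical findings stayed exposed; open findings count up to `as_of`."""
    durations = [((fixed or as_of) - found).days for found, fixed in findings]
    return sum(durations) / len(durations)

print(round(mean_exposure_days(findings, as_of=date(2025, 3, 1)), 1))
```

Counting open findings is what makes this an outcome metric rather than an activity metric: closing tickets without fixing the oldest exposures does not improve the number.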
They maintain disciplined decision forums. A monthly security steering committee with real decision authority, where trade-offs are made explicitly and documented, is more valuable than weekly status meetings that review metrics without making resource allocation decisions. The decision forum is where technology investment converts to risk reduction, and without it, investment becomes spending.
They reduce duplicate control tracking. Overlapping compliance obligations under NIS2, DORA, ISO 27001, and sector-specific requirements create a temptation to build separate control tracking for each framework. High-performance programs build a single control baseline and map it to multiple regulatory requirements, eliminating the duplication that consumes operator time and creates inconsistency risk.
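The single-baseline approach is, at its core, a mapping data structure. The sketch below is illustrative only: the control ID, description, and requirement references are placeholders, not an official crosswalk between the frameworks named.

```python
# Sketch: one control baseline mapped to multiple regulatory frameworks.
# Control IDs and requirement references are illustrative placeholders.

BASELINE = {
    "AC-01": {
        "description": "Privileged access is reviewed quarterly",
        "maps_to": {
            "NIS2": ["access-control requirement (placeholder ref)"],
            "ISO 27001": ["Annex A access control (placeholder ref)"],
            "DORA": ["ICT protection requirement (placeholder ref)"],
        },
    },
}

def controls_for(framework: str) -> list[str]:
    """Return baseline control IDs that satisfy a given framework."""
    return [cid for cid, ctrl in BASELINE.items() if framework in ctrl["maps_to"]]

print(controls_for("NIS2"))  # → ['AC-01']
```

Each control is tested once and its evidence reused for every framework that references it, which is where the duplication savings come from.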
They invest in communication capability. The ability to explain security posture, risk decisions, and investment rationale to non-technical stakeholders is treated as a core team capability, not a nice-to-have. This investment pays dividends in board confidence, budget approval velocity, and the quality of CISO-to-board communication that determines whether the program receives sustained executive support.
The Board Conversation: Reframing Security Investment
The execution gap ultimately persists because boards approve budgets based on an incomplete model of how cybersecurity spending creates resilience. The conventional model is: threats increase, therefore we need more tools to detect and prevent those threats, therefore the budget should grow in proportion to the threat landscape.
This model is not wrong, but it is incomplete. It omits the operational capacity required to convert tool output into risk reduction. A more complete model acknowledges that security spending has diminishing returns unless it is balanced across technology, people, and operational process.
The board conversation should shift from "how much are we spending on cybersecurity?" to "what is our operational capacity to convert security spending into risk reduction?" This reframing aligns with the governance provisions in NIS2 Article 20, which make management bodies personally accountable for approving and overseeing the implementation of cybersecurity risk-management measures, and with DORA's requirement that the management body maintain sufficient knowledge of ICT risk.
When boards understand that their compliance investment ROI depends as much on operational execution as on technology acquisition, the talent and process investment conversation becomes significantly easier.
Key Takeaways
- ENISA's data confirms the execution gap is structural, not anecdotal. High technology deployment scores combined with low process maturity scores correlate with worse operational outcomes. More tools can mean slower decisions.
- Rebalance budgets toward a 40-30-30 model (technology, people, operations). The conventional 70%+ allocation to technology creates capability that the organization cannot operationally absorb.
- Invest in evidence latency reduction as a force multiplier. Faster evidence flows improve governance decisions, reduce audit preparation costs, and enable genuine continuous monitoring.
- Retention investment yields higher returns than recruitment spending. At 15-20% annual turnover, organizational knowledge degrades faster than it accumulates. Career development, workload management, and decision authority are retention tools.
- Shift the board conversation from spending levels to operational conversion capacity. NIS2 Article 20 management accountability makes this not just a strategic argument but a regulatory expectation.
The ENISA NIS Investments report is published annually, and annually it tells the same story: EU organizations are better at buying security than operating it. The organizations that break this pattern are not the ones with the largest budgets. They are the ones that invest proportionally in the people and processes that convert technology into resilience. That distinction is where competitive advantage and regulatory confidence are built.
