User Access Reviews: The Complete Guide
Comprehensive guide to user access reviews covering regulatory requirements under NIS2, DORA, and ISO 27001, the end-to-end review process from planning to remediation, automation strategies, and how to avoid common pitfalls like rubber-stamping.
1. User access reviews are a legal obligation under NIS2 Article 21(2)(i), DORA Article 9(4)(c), ISO 27001 A.5.15, and GDPR Article 32 — not a discretionary best practice.
2. Tier your review frequency by risk: quarterly for critical systems, semi-annual for business applications, annual for general tools.
3. Combat rubber-stamping by limiting reviewer scope, embedding usage analytics, and tracking reviewer behaviour metrics.
4. Extend review scope beyond core applications to include SaaS, service accounts, shared accounts, and infrastructure-level access.
5. Use review findings as a feedback loop to improve upstream provisioning and de-provisioning processes.
1. What Are User Access Reviews and Why They Matter
A user access review (UAR) is a periodic, structured evaluation of who has access to what systems, data, and resources within an organisation, and whether that access remains appropriate. The process requires designated reviewers — typically line managers, application owners, or data custodians — to examine each user's entitlements and either certify that access is still justified or flag it for revocation. Access reviews are sometimes referred to as access recertification, entitlement reviews, or attestation campaigns, but the underlying mechanism is the same: a human-in-the-loop validation that the principle of least privilege is being maintained in practice, not just in policy.
Access reviews matter because entitlements drift. Organisations grant access reactively — a new project, a temporary assignment, a troubleshooting escalation — and rarely revoke it proactively. Studies consistently show that the average enterprise user accumulates 30-40% more access than their role requires within two years of hire. This entitlement bloat creates excessive blast radius during account compromise, violates least privilege principles embedded in every major security framework, and in regulated sectors exposes the organisation to supervisory findings and fines. The 2023 Verizon Data Breach Investigations Report attributed 74% of breaches to the human element, with privilege misuse and credential abuse as leading vectors — access reviews are the primary preventive control against both.
For EU-regulated entities, user access reviews are not a discretionary best practice but a legal obligation. NIS2, DORA, ISO 27001, and GDPR each impose access control requirements that are operationally satisfied through periodic access reviews. The failure to conduct reviews is not a theoretical risk: supervisory authorities across the EU have cited inadequate access control practices in enforcement actions, and auditors consistently flag the absence of documented review evidence as a material finding. Establishing a robust access review programme is foundational to demonstrating compliance across multiple frameworks simultaneously.
2. Regulatory Requirements for Access Reviews
NIS2 Article 21(2)(i) explicitly mandates human resources security, access control policies, and asset management as part of the cybersecurity risk-management measures that essential and important entities must implement. This is not a vague principle — the Directive specifically enumerates access control alongside identity management and authentication (including multi-factor authentication under Article 21(2)(j)) as a required measure category. The Commission Implementing Regulation (EU) 2024/2690, which details the technical and methodological requirements for NIS2 measures, further specifies that entities must establish and maintain access rights management procedures, including periodic reviews of access rights. Supervisory authorities will expect evidence of regular, documented access reviews as the primary mechanism for operationalising this requirement.
DORA Article 9(4)(c) requires financial entities to implement policies and procedures for managing access rights, alongside strong authentication mechanisms. The Regulatory Technical Standards under DORA elaborate on this, mandating that financial entities conduct periodic reviews of access rights allocation to ensure that entitlements remain aligned with job functions and are promptly revoked upon role changes or employment termination. For financial entities subject to both DORA and NIS2, the access review programme must satisfy the more prescriptive of the two frameworks — in practice, DORA's requirements for financial entities are more granular than NIS2's baseline, particularly regarding privileged access and segregation of duties.
ISO 27001 Annex A Control A.5.15 (Access control) and A.5.18 (Access rights) establish the international baseline for access review practices. A.5.18 specifically requires that access rights be reviewed at regular intervals and after any change such as promotion, demotion, or termination. The control further specifies that access rights to privileged processing facilities should be reviewed at more frequent intervals than standard access. GDPR, while not a cybersecurity framework, imposes its own access review obligation through Article 5(1)(f) (integrity and confidentiality) and Article 32 (security of processing), which together require data controllers to ensure that access to personal data is limited to authorised personnel with a legitimate processing purpose — a requirement that is operationally verified through periodic reviews.
For organisations subject to multiple frameworks, a single access review campaign can satisfy NIS2, DORA, ISO 27001, and GDPR requirements simultaneously — provided the scope, frequency, and documentation meet the most prescriptive standard applicable to each system.
3. The Access Review Process: Planning, Execution, Certification, Remediation
Planning is where most access review programmes succeed or fail. Before launching a review campaign, define the scope (which systems, applications, and data repositories are in scope), the review population (all users, or segmented by risk tier), the reviewer assignments (who will review each user's access — typically the user's direct manager for role-based access and the application or data owner for system-specific entitlements), the certification window (how long reviewers have to complete their attestations), and the escalation path for overdue or contested reviews. Segment your application estate into risk tiers: Tier 1 (critical and regulated systems — quarterly reviews), Tier 2 (business applications with sensitive data — semi-annual reviews), and Tier 3 (general productivity tools — annual reviews). This tiered approach focuses review effort where risk is highest and avoids reviewer fatigue from over-frequent campaigns on low-risk systems.
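The tiered schedule above reduces to a simple mapping from tier to review interval. A minimal sketch in Python; the tier names and exact interval lengths are assumptions drawn from the guidance in this section, not a prescribed implementation:

```python
from datetime import date, timedelta

# Illustrative tier model: tier name -> review interval (assumed values)
REVIEW_INTERVALS = {
    "tier1": timedelta(days=91),   # critical/regulated systems: quarterly
    "tier2": timedelta(days=182),  # sensitive business applications: semi-annual
    "tier3": timedelta(days=365),  # general productivity tools: annual
}

def next_review_due(last_review: date, tier: str) -> date:
    """Date by which the next campaign for this tier must start."""
    return last_review + REVIEW_INTERVALS[tier]
```

In practice the same lookup can drive automated campaign creation, so that overdue tiers surface without anyone maintaining a calendar by hand.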
Execution requires presenting reviewers with clear, actionable information. For each user-entitlement pair, the reviewer should see the user's current role and department, the specific entitlement or role assigned, when it was granted and by whom, when it was last used (if usage data is available), and whether the entitlement is flagged as privileged or carries segregation-of-duties risk. Reviewers then make a binary decision for each entitlement: certify (access is appropriate and should continue) or revoke (access is no longer justified). Avoid three-way decisions that include a 'modify' option — this introduces ambiguity and delays. If access needs modification, revoke the current entitlement and raise a separate access request for the adjusted level.
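The review item described above can be represented as a small record. A sketch with field names assumed for illustration; the key design point it captures is that the decision space is strictly binary:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum
from typing import Optional

class Decision(Enum):
    CERTIFY = "certify"  # access is appropriate and continues
    REVOKE = "revoke"    # no longer justified; 'modify' = revoke + new request

@dataclass
class ReviewItem:
    """One user-entitlement pair presented to a reviewer (illustrative fields)."""
    user: str
    role: str
    department: str
    entitlement: str
    granted_on: date
    granted_by: str
    last_used: Optional[date]  # None when usage data is unavailable
    privileged: bool
    sod_risk: bool             # segregation-of-duties flag
```

Keeping `Decision` to two members enforces in code the same rule the paragraph states in prose: there is no ambiguous 'modify' path.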
Remediation is the step that converts a review from a compliance exercise into a security control. Every revocation decision must be actioned within a defined SLA — best practice is 72 hours for standard access and 24 hours for privileged access. Track remediation to completion and report on the revocation rate (percentage of entitlements revoked per campaign), remediation SLA compliance, and any exceptions or overrides. A review programme with a consistently low revocation rate (below 2-3%) should be investigated — it may indicate rubber-stamping rather than genuine scrutiny. Conversely, a high revocation rate in early campaigns is healthy and should normalise over time as entitlement hygiene improves.
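The remediation metrics in this paragraph reduce to a few lines of arithmetic. A sketch — the SLA values and the ~2% floor come from the text above; the function names are illustrative:

```python
from datetime import datetime, timedelta

# SLA windows from the guidance above: 72h standard, 24h privileged
SLA = {"standard": timedelta(hours=72), "privileged": timedelta(hours=24)}

def within_sla(decided_at: datetime, actioned_at: datetime, privileged: bool) -> bool:
    """Was the revocation actioned inside its SLA window?"""
    return actioned_at - decided_at <= SLA["privileged" if privileged else "standard"]

def revocation_rate(decisions: list) -> float:
    """Share of reviewed entitlements revoked in a campaign."""
    return decisions.count("revoke") / len(decisions)

def possible_rubber_stamping(decisions: list, floor: float = 0.02) -> bool:
    """Flag campaigns whose revocation rate falls below the ~2% floor."""
    return revocation_rate(decisions) < floor
```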
Include 'last accessed' timestamps alongside each entitlement during review. Reviewers who can see that a privilege has not been used in 90 days are far more likely to make accurate revocation decisions than those reviewing entitlements without usage context.
4. Automation and Continuous Access Monitoring
Manual, spreadsheet-based access reviews do not scale beyond a few hundred users and a handful of applications. As the organisation grows, the combinatorial explosion of user-entitlement pairs — a 500-person organisation with 30 applications and an average of 5 entitlements per application generates 75,000 review decisions per cycle — makes manual processes unworkable. Invest in an access review platform that automates campaign creation, reviewer assignment, entitlement aggregation from source systems (IAM, AD/Entra ID, SaaS applications via SCIM/API), and remediation ticket generation. The platform should support risk-based micro-certifications (reviewing only changed or high-risk entitlements) alongside full periodic campaigns.
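Both the scale problem and the micro-certification idea can be shown in a few lines. A sketch with assumed field names; the first block simply reproduces the arithmetic from the paragraph above:

```python
# The combinatorial explosion described above
users, apps, entitlements_per_app = 500, 30, 5
decisions_per_cycle = users * apps * entitlements_per_app  # 75,000

def micro_certification_scope(items: list) -> list:
    """Risk-based micro-certification: only entitlements that changed since
    the last campaign, or that are high-risk, go to reviewers -- instead of
    the full estate every cycle."""
    return [i for i in items if i["changed"] or i["privileged"]]
```

The point of the filter is that reviewer effort scales with churn and risk rather than with headcount times application count.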
Continuous access monitoring complements periodic reviews by detecting anomalous access events between review cycles. Implement automated alerts for entitlement grants that bypass the standard request-approval workflow, privilege escalations that are not associated with an approved change ticket, dormant accounts that become active after extended inactivity, and access patterns that violate segregation-of-duties policies. These real-time signals reduce the window of exposure between periodic reviews and provide a defence-in-depth layer that satisfies the 'continuous improvement' expectations embedded in NIS2 Article 21's proportionality principle.
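The four alert conditions listed here can be expressed as simple rules over access events. A sketch only; the event field names are assumptions, not any product's schema:

```python
def continuous_monitoring_alerts(event: dict) -> list:
    """Evaluate one access event against the rules described above."""
    findings = []
    if not event.get("via_approved_workflow", True):
        findings.append("entitlement granted outside request-approval workflow")
    if event.get("privilege_escalation") and not event.get("change_ticket"):
        findings.append("privilege escalation without approved change ticket")
    if event.get("dormant_days", 0) > 90 and event.get("now_active"):
        findings.append("dormant account reactivated")
    if event.get("sod_violation"):
        findings.append("segregation-of-duties policy violated")
    return findings
```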
The integration between access review automation and your broader identity governance stack is critical. Access review findings should feed back into your role-mining engine to refine role definitions, your provisioning system to enforce revocation decisions automatically, your risk scoring model to adjust user risk profiles based on review outcomes, and your compliance evidence repository to generate audit-ready documentation. A well-integrated access review programme transforms from a periodic compliance exercise into a continuous identity governance capability that reduces risk, improves operational efficiency, and generates the evidence trail that supervisory authorities expect.
5. Common Pitfalls: Rubber-Stamping, Scope Gaps, and Beyond
Rubber-stamping — the practice of approving all entitlements without genuine scrutiny — is the most pervasive failure mode in access review programmes. It occurs when reviewers face too many decisions in too short a window, when the review interface presents entitlements without context (no usage data, no risk indicators, no role relevance), or when there are no consequences for negligent certification. Combat rubber-stamping through several mechanisms: limit campaign scope so that no reviewer faces more than 50-100 decisions per cycle, embed usage analytics and risk flags directly into the review interface, introduce random spot-check audits where a compliance officer independently validates a sample of certify decisions, and track reviewer behaviour metrics such as time-per-decision and approval rate to identify reviewers who are certifying without reading.
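Reviewer behaviour tracking amounts to two metrics per reviewer. A sketch; the 97% approval and five-second thresholds are illustrative heuristics consistent with the red-flag guidance elsewhere in this guide, not fixed standards:

```python
def reviewer_behaviour(decisions: list) -> dict:
    """decisions: (decision, seconds_spent) pairs for one reviewer's campaign."""
    approvals = sum(1 for d, _ in decisions if d == "certify")
    approval_rate = approvals / len(decisions)
    avg_seconds = sum(s for _, s in decisions) / len(decisions)
    return {
        "approval_rate": approval_rate,
        "avg_seconds_per_decision": avg_seconds,
        # Heuristic: near-total approval or implausibly fast decisions
        # suggests certifying without reading -- route to a spot-check audit.
        "flag_for_spot_check": approval_rate > 0.97 or avg_seconds < 5,
    }
```

Flagged reviewers are exactly the sample the compliance officer's spot-check audits should draw from.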
Scope gaps are the second most common pitfall. Many organisations review access to their core enterprise applications but neglect SaaS applications provisioned outside IT governance (shadow IT), service accounts and non-human identities that may hold privileged access, shared or generic accounts that multiple users access with a single credential, and database-level access that is granted directly rather than through application roles. Each of these gaps represents unreviewed access that can be exploited. Expand your review scope progressively: start with critical and regulated systems, then extend to SaaS applications discovered through CASB or SSO logs, then to non-human identities, and finally to infrastructure-level access (cloud IAM roles, database grants, network device accounts).
A third pitfall is treating the access review as a standalone compliance exercise disconnected from the identity lifecycle. Reviews that find access to revoke are identifying a symptom — the root cause is a provisioning process that grants excessive access upfront or a de-provisioning process that fails to remove access promptly when roles change. Use review findings to drive upstream improvements: if the same application consistently shows high revocation rates, investigate whether its role model is too coarse; if leavers consistently appear in reviews with active accounts, investigate the offboarding process integration with HR systems. The access review should be a feedback loop that continuously improves the identity governance programme, not a periodic checkbox exercise.
A review programme with a consistent approval rate above 97% across all campaigns should be treated as a red flag. Investigate whether reviewers are genuinely scrutinising entitlements or rubber-stamping to clear their queue.
6. Evidence and Documentation for Audit Readiness
Supervisory authorities and auditors evaluate access review programmes on three dimensions: design effectiveness (is the programme designed to address the regulatory requirements?), operating effectiveness (is it running as designed, consistently and on schedule?), and outcome evidence (are review findings actioned and do they demonstrably reduce access risk?). Your documentation must address all three dimensions to withstand scrutiny.
For design effectiveness, maintain a formal access review policy that specifies scope, frequency, reviewer assignment methodology, certification and revocation procedures, escalation paths, and exception handling. Cross-reference the policy to the specific regulatory requirements it addresses (NIS2 Article 21(2)(i), DORA Article 9(4)(c), ISO 27001 A.5.15/A.5.18, GDPR Article 32). For operating effectiveness, retain campaign records showing that reviews were initiated on schedule, completed within the certification window, and that all reviewers participated. Track campaign completion rates — a campaign where 20% of reviewers did not complete their attestations is an operating effectiveness failure regardless of the results from the 80% who did.
For outcome evidence, generate per-campaign reports showing total entitlements reviewed, certification and revocation decisions, revocation rate by application and risk tier, remediation SLA compliance, and trend analysis across campaigns. Auditors are particularly interested in trends: a declining revocation rate over successive campaigns indicates that the programme is improving entitlement hygiene, while a stable or increasing rate may indicate that root causes are not being addressed. Retain all review artefacts for the period required by your applicable regulations — NIS2 does not specify a retention period, but DORA's five-year record-keeping requirement sets the bar for financial entities, and three years is a sensible minimum for access review records elsewhere.
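The outcome evidence described in this section is a per-campaign rollup plus a cross-campaign trend check. A sketch with assumed record fields:

```python
def campaign_report(decisions: list, reviewers_assigned: int,
                    reviewers_completed: int) -> dict:
    """Per-campaign evidence rollup for the audit file."""
    return {
        "entitlements_reviewed": len(decisions),
        "revoked": decisions.count("revoke"),
        "revocation_rate": decisions.count("revoke") / len(decisions),
        # Operating effectiveness: incomplete attestations fail the campaign
        "completion_rate": reviewers_completed / reviewers_assigned,
    }

def hygiene_improving(revocation_rates: list) -> bool:
    """Declining revocation rate across successive campaigns suggests
    entitlement hygiene is improving."""
    return revocation_rates[-1] < revocation_rates[0]
```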
How often should user access reviews be conducted under EU regulations?
There is no single mandated frequency across all EU frameworks. ISO 27001 A.5.18 requires reviews at 'regular intervals' and after role changes, with more frequent reviews for privileged access. DORA's RTS expects periodic reviews aligned with the entity's risk appetite. Best practice for EU-regulated entities is quarterly reviews for critical and privileged systems, semi-annual for business applications handling sensitive data, and annual for general productivity tools. The key is demonstrating a risk-based rationale for your chosen frequency.
Who should be the reviewer in an access review campaign?
The reviewer should be the person best positioned to assess whether access is appropriate for the user's current role. For role-based access (e.g., HR system access for HR staff), the user's direct line manager is typically the appropriate reviewer. For application-specific entitlements (e.g., admin privileges in a finance system), the application owner or data custodian should review. For privileged access, consider a two-tier model where both the line manager and a security team member must certify.
Can a single access review programme satisfy multiple framework requirements?
Yes, and this is strongly recommended to avoid duplication. A well-designed access review programme can simultaneously satisfy NIS2, DORA, ISO 27001, and GDPR requirements provided the scope covers all regulated systems, the frequency meets the most prescriptive applicable standard, documentation cross-references all relevant regulatory articles, and remediation SLAs meet the tightest applicable deadline. Map each framework's specific requirements to your review programme design and document the coverage explicitly.
What is the difference between an access review and an access audit?
An access review (or access certification) is a preventive control where designated reviewers validate the appropriateness of current access entitlements and decide whether to certify or revoke each one. An access audit is a detective control where an independent party (internal audit, external auditor, or supervisory authority) examines the access review programme itself — its design, operating effectiveness, and outcomes. Both are necessary: the review controls access, and the audit controls the review.
How do we handle service accounts in access reviews?
Service accounts and non-human identities should be included in your access review programme with dedicated review procedures. Assign each service account an owner (typically the application or infrastructure team lead who manages the system the account serves). Reviewers should validate that the service account is still required, that its permissions have not expanded beyond the minimum necessary, that credentials are being rotated per policy, and that the account's activity logs show only expected behaviour. Review service accounts at least as frequently as the systems they access.