This article evaluates the NCSC’s Cybersecurity Risk Management Guidance through the lens of human factors and cyber psychology. While the framework excels in technical prevention, it falls short in addressing the human elements crucial to recovery, such as morale, stress, and organisational culture. Recommendations include integrating resilience-focused metrics, adapting attack trees for human-centric scenarios, balancing prevention with recovery, and leveraging cyber psychology insights. These enhancements would align the guidance with the realities of human behaviour, creating a more effective and comprehensive approach to cyber risk management.
When evaluating the NCSC’s Cybersecurity Risk Management Guidance framework from a human factors and cyber psychology perspective, certain inconsistencies and limitations come to light. The system/component-driven approach to risk management appears to fall short when applied to people and human behaviour, as it doesn’t fully capture the nuances of human response, organisational culture, and resilience. Moreover, the integration of “attack trees” into this framework raises questions about their practical value. The approach risks becoming an exercise in theoretical modelling, potentially providing opportunities for consultants to create overly complex scenarios that are unlikely to materialise in reality.
A key observation at Psyber, Inc. is that the root causes of attacks and breaches are usually technological misconfigurations or vulnerabilities, while an organisation’s recoverability hinges on human factors such as staff morale, self-organisational ability, and the collective capacity to “bounce back.” These human elements dictate how effectively an organisation can respond to and recover from an incident, underscoring the need for a more balanced framework that fully integrates human-centric risk management principles. This article explores these challenges and suggests how organisations can enhance and streamline the framework to address these gaps.
1. Limitations of the System/Component Approach for Human Factors
- People Are Not “Components”:
- The guidance’s system/component-driven framework treats people as if they are another part of the “machine” (e.g., components like hardware, software, or data). However, humans are not deterministic like machines—they are influenced by emotions, fatigue, stress, and morale, none of which are adequately addressed by the framework.
- Treating people as static “components” ignores the dynamic and context-dependent nature of human behaviour in cyber security incidents.
- Complex Interactions Are Underexplored:
- System-driven approaches aim to model complex interactions but fall short when considering human dynamics. For example:
- How staff respond under pressure during an incident.
- The cascading effects of low morale on communication and decision-making.
- The influence of organisational culture on adherence to risk management protocols.
- These factors are critical to recoverability but are not captured effectively in a system or component-driven model.
2. Attack Trees: An Analytical Blind Spot
- Overly Mechanistic View:
- Attack trees focus on technical paths to compromise, assuming attackers operate in a predictable and structured manner. This misses the messy, adaptive, and often opportunistic nature of real-world attacks, particularly social engineering or insider threats.
- For example:
- A phishing attack isn’t just about bypassing authentication. It also involves understanding why a person might click a malicious link (e.g., time pressure, lack of training, or fatigue).
- Neglect of Recovery Factors:
- Attack trees are inherently about identifying points of failure rather than points of recovery. They don’t address how an organisation’s human and cultural attributes might mitigate or exacerbate the consequences of an attack.
- A significant opportunity is missed to model scenarios that examine how staff morale, empowerment, and cohesion influence recovery time and effectiveness.
- Potential for Misguided Effort:
- The detailed modelling of hypothetical attack paths often leads to a focus on obscure or unlikely scenarios. This creates a fertile ground for consultants to build complex models that don’t reflect operational realities, diverting resources from practical, people-focused risk reduction measures.
3. Technology Drives Breach; People Drive Recovery
Our ongoing research suggests that technology weaknesses dictate breaches while human factors dictate recovery. The guidance does not adequately acknowledge this dichotomy:
- Over-Emphasis on Prevention:
- The framework is heavily weighted toward preventing attacks through technical controls, leaving recovery largely to generic incident response plans and assurance models.
- This approach assumes that technical measures can “solve” security problems while ignoring the role of human factors in enabling recovery and minimising long-term impact.
- Lack of Resilience-Oriented Metrics:
- There is little focus on measuring or improving the human aspects of resilience, such as:
- Staff morale during a crisis.
- Psychological safety in reporting and responding to incidents.
- Organisational trust and cohesion under pressure.
- Realistic Recovery Scenarios Are Missing:
- The guidance should explicitly model scenarios where recovery hinges on human performance:
- How well staff can self-organise during a ransomware attack.
- How quickly decision-makers can regain situational awareness after a major breach.
- How a strong organisational culture can mitigate panic and confusion.
4. Implications for Cyber Psychology and Human Factors
- Neglected Insights from Cyber Psychology:
- The guidance does not leverage key findings from cyber psychology, such as:
- Cognitive overload: How staff under stress might fail to follow established protocols.
- Social dynamics: How trust and collaboration between teams influence incident response.
- Decision-making under pressure: How biases and heuristics might lead to poor choices during an attack.
- Missed Opportunities in Training:
- Training is treated as a checkbox item rather than as an opportunity to build psychological resilience and adaptive capacity.
- For example, instead of just training staff to recognise phishing emails, organisations could focus on:
- Building confidence in reporting suspected threats.
- Practicing adaptive problem-solving during simulated crises.
- Resilience as a Human Quality:
- Resilience is often discussed as a technical quality (e.g., redundancy, failover). The human dimension of resilience—self-organisational ability, creativity, and adaptability under stress—is undervalued and underexplored.
5. Practical Recommendations
To address these gaps and better integrate human factors into the framework, the following changes should be considered:
Reframe Risk Management Around People
- Recognise that people are not components. Develop models that incorporate dynamic human factors, such as morale, stress, and group cohesion.
- Use resilience-focused metrics to measure and improve human recovery capabilities.
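One way to make resilience-focused metrics concrete is a simple weighted composite of human-resilience indicators. The sketch below is illustrative only: the indicator names, weights, and sample scores are assumptions for demonstration, not figures drawn from the NCSC guidance.

```python
# Illustrative assumptions: indicators and weights are hypothetical,
# not taken from the NCSC guidance.
RESILIENCE_WEIGHTS = {
    "staff_morale": 0.30,          # e.g. pulse-survey average, scaled 0-1
    "psychological_safety": 0.25,  # willingness to report incidents
    "cross_team_trust": 0.25,      # cohesion under pressure
    "self_organisation": 0.20,     # performance in unscripted exercises
}

def resilience_score(scores: dict[str, float]) -> float:
    """Weighted composite of human-resilience indicators, each in [0, 1]."""
    missing = set(RESILIENCE_WEIGHTS) - set(scores)
    if missing:
        raise ValueError(f"missing indicators: {sorted(missing)}")
    return sum(RESILIENCE_WEIGHTS[k] * scores[k] for k in RESILIENCE_WEIGHTS)

quarterly = {
    "staff_morale": 0.62,
    "psychological_safety": 0.80,
    "cross_team_trust": 0.71,
    "self_organisation": 0.55,
}
print(f"Human resilience index: {resilience_score(quarterly):.2f}")
```

Tracking such an index over successive quarters, and alongside incident post-mortems, would give recovery capability the same measurable footing that technical controls already enjoy.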
Adapt Attack Trees for Human-Centric Scenarios
- Extend attack trees to include human-centric paths (e.g., emotional triggers leading to phishing success, insider threats driven by discontent).
- Model recovery pathways alongside attack paths, focusing on the role of human decision-making and teamwork.
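As an illustration of what such an extended tree might look like, the sketch below represents attack and recovery pathways in the same structure. The node labels, likelihood values, and the AND/OR combination rules (which assume independent steps) are assumptions for demonstration, not part of the NCSC framework.

```python
import math
from dataclasses import dataclass, field

@dataclass
class Node:
    """A node in a combined attack/recovery tree (illustrative sketch)."""
    label: str
    likelihood: float = 1.0          # probability this step itself succeeds
    children: list["Node"] = field(default_factory=list)
    conjunctive: bool = False        # True = AND-node, False = OR-node

    def path_likelihood(self) -> float:
        """Combine child likelihoods assuming independence:
        AND multiplies them; OR is 1 - P(all children fail)."""
        if not self.children:
            return self.likelihood
        vals = [c.path_likelihood() for c in self.children]
        if self.conjunctive:
            combined = math.prod(vals)
        else:
            combined = 1 - math.prod(1 - v for v in vals)
        return self.likelihood * combined

# Attack side: phishing success depends on human context, not just controls.
phish = Node("Credential phish succeeds", conjunctive=True, children=[
    Node("Bypasses email filtering", 0.3),
    Node("Clicked under time pressure or fatigue", 0.5),
])

# Recovery side: model the human pathway alongside the attack pathway.
recovery = Node("Organisation recovers within 24h", conjunctive=True, children=[
    Node("Staff self-organise into response roles", 0.7),
    Node("Decision-makers regain situational awareness", 0.6),
])

print(f"Breach path likelihood:   {phish.path_likelihood():.2f}")
print(f"Recovery path likelihood: {recovery.path_likelihood():.2f}")
```

The point of the sketch is not the arithmetic (treating steps as independent is a crude simplification) but the symmetry: once recovery pathways sit in the same model as attack pathways, investments in morale, training, and team cohesion become visible as changes to the recovery likelihood rather than remaining outside the analysis.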
Balance Prevention with Recovery
- Shift the emphasis from solely preventing breaches to preparing for recovery. For example:
- Train staff in self-organisation during incidents.
- Develop playbooks that emphasise team collaboration under pressure.
Incorporate Cyber Psychology Principles
- Integrate insights from cyber psychology into all phases of the framework:
- Scenario planning should include psychological stressors and their impact on decision-making.
- Assurance models should account for organisational culture and its effect on compliance and resilience.
Invest in Realistic Training
- Move beyond theoretical exercises to immersive simulations that test both technical and human responses.
- Focus on building psychological safety and morale through regular practice of adaptive responses.
Encourage a Culture of Learning
- Instead of just mitigating risks, foster a culture that values learning from incidents. This includes debriefing exercises that emphasise personal and team growth.
Conclusion
The current NCSC guidance struggles to align its system/component-driven approach with the realities of human factors and cyber psychology. It overemphasises technical solutions to prevent breaches while underestimating the critical role of people in ensuring recovery. By integrating human-centric models, leveraging insights from cyber psychology, and balancing prevention with resilience, the framework can become a more realistic and actionable guide for organisations to manage cyber risks effectively. This shift would not only address the gaps identified here but also position the framework as a leading example of human-technology integration in cyber security.