Cybersecurity Leaders Are Addressing Alert Fatigue With A Focus On Analyst Well-Being

The Security Digest - News Team
Published
April 20, 2026

Jojo D. and Ty Hughes, experts in cybersecurity and human-centered transformation, discuss alert fatigue and suggest a hybrid approach that combines AI tools with leadership support to alleviate the costly burden of employee exhaustion.

Key Points

  • The traditional cybersecurity approach of adding more monitoring tools is leading to severe alert fatigue, where operators are too overwhelmed to identify real threats.

  • Jojo D. and Ty Hughes, experts in cybersecurity and human-centered transformation, warn that these constant "task saturation" conditions cause security operators to ignore critical warnings.

  • To combat this, experts recommend shifting to a new strategy that treats human attention as a finite resource, requiring leaders to build in recovery time and protect operators from burnout with intelligent tools like AI.

We take for granted that whatever problems arise in IT, humans will be able to handle them. But in reality, system performance is downstream of human performance.

Jojo D., "The Chaos Guru"
Cybersecurity Strategist

For years, the standard playbook for security teams was simple: buy more tools, get more visibility. But that approach has hit a very human bottleneck. Teams drown in dashboards, spending more time sorting through alerts than responding to actual incidents, and a worsening signal-to-noise ratio leaves operators stuck in constant triage of questionable data. As leaders try to break free from alert fatigue, they are realizing that maximizing visibility without human-centric filtering quickly leads to information overload for the people on the other side of the screen. For a growing number of security teams, the real constraint is not visibility but the finite human attention required to manage it.

We spoke with two veteran technology leaders who are actively tackling the problem. Jothi Dugar, otherwise known as Jojo D., "The Chaos Guru", is a globally recognized executive strategist with over 25 years of experience advising leaders at the intersection of technology, cybersecurity, and human-centered transformation. An international bestselling author, her work focuses on building resilient, high-performing leaders and teams by integrating strategic clarity, emotional intelligence, and sustainable leadership practices. Ty Hughes, the "AI Alchemist," is a transformational technologist and storyteller with nearly two decades of experience helping organizations navigate complexity and disruption through human-centered technology leadership. As a President's Management Council Interagency Rotation Fellow, his work focuses on the intersection of advanced technologies, emotional intelligence, and resilience. Together, they have reshaped the conversation around security culture by treating the people behind the systems as a core part of operational resilience.

Historically, organizations have maintained rigorous discipline in measuring system uptime and latency while paying far less attention to the cognitive load on their people. Dugar suggests that human fatigue, precisely because it goes unmeasured, drives major operational breakdowns.

  • Hardware meets wetware: After years of building cybersecurity programs, Dugar has found that organizations are good at measuring IT systems but far less so at measuring the people responsible for them. "We take for granted that whatever problems arise in IT, humans will be able to handle them. But in reality, system performance is downstream of human performance. If your operators are fatigued, overloaded, or disengaged, your entire holistic system is already degraded."
  • Bandwidth on empty: Hughes remarks on the glut of available data and suggests that visibility isn't necessarily the issue. "The risk isn't in what we can't see. It's in what we can't focus on. More visibility without clarity becomes well-instrumented confusion."

"If you just stepped away for five minutes, took a quick study break, and then came back to it, we've all experienced the clarity that comes after that pause. Why would we think it would be any different when it comes to cybersecurity?" - Ty Hughes

When operators hit their cognitive limits, the environment pushes them into a mode where reaction replaces analysis. Dugar argues that this problem is compounded by tool sprawl, and Hughes adds that uncalibrated systems encourage staff to develop workarounds that route around the very tools the organization invested in. Research on the cognitive dynamics of cybersecurity alert handling reinforces this point: once input volume exceeds a certain threshold, analyst performance degrades sharply. Organizations then end up with alert fatigue, where legitimate security events get lost in the din of false alarms, something other practitioners have described as a predictable byproduct of poorly calibrated security theater.

  • Cockpit chaos: Hughes likens the modern security environment to a commercial airplane simulator where an overwhelming volume of indicators suddenly stacks up. "Alarms are sounding, indicators are flashing, and messages are stacking up across multiple displays. The aircraft is actually functioning fine, but the cockpit warnings are completely overwhelming. In the aviation industry, this is called task saturation."
  • Trickle-up exhaustion: Such conditioning can introduce vulnerabilities that reach across the organization, including senior leadership. "Maybe there are increased false positives, or analysts and even our CISOs are bypassing systems because they are just tired," Dugar says.

The burnout often shows up as a slow slide from active verification to total apathy, as constant noise pollution conditions operators to tune out. Industry surveys on cybersecurity burnout confirm the pattern at scale: extended exposure to high-alert environments correlates with declining accuracy, rising attrition, and weakened incident response. Because everyone from the SOC floor to the C-suite and the board is fatigued, teams must ruthlessly filter alerts based on business and reputational risk.
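The risk-based filtering the experts describe can be sketched in code. The weights, asset names, and threshold below are purely illustrative assumptions, not values from Dugar's or Hughes's programs: the idea is simply that an alert's priority should combine its raw severity with the business impact of the affected asset and the detector's historical reliability, so low-stakes noise never reaches a human queue.

```python
from dataclasses import dataclass

# Hypothetical weights. In practice these would come from an asset
# inventory (business impact) and tracked alert outcomes (detector
# precision), not from hard-coded tables like these.
ASSET_IMPACT = {"payment-api": 1.0, "hr-portal": 0.6, "dev-sandbox": 0.2}
DETECTOR_PRECISION = {"edr": 0.8, "dlp": 0.5, "ids": 0.35}

@dataclass
class Alert:
    asset: str
    detector: str
    severity: float  # vendor-assigned, normalized to 0..1

def triage_score(alert: Alert) -> float:
    """Weight raw severity by business impact and detector reliability."""
    impact = ASSET_IMPACT.get(alert.asset, 0.5)            # unknown asset: middle weight
    precision = DETECTOR_PRECISION.get(alert.detector, 0.5)
    return alert.severity * impact * precision

def filter_queue(alerts: list[Alert], threshold: float = 0.3) -> list[Alert]:
    """Surface only alerts above the threshold, highest score first."""
    kept = [a for a in alerts if triage_score(a) >= threshold]
    return sorted(kept, key=triage_score, reverse=True)
```

With this scoring, a high-severity IDS alert on a sandbox machine scores far below a comparable EDR alert on a payment system, so the sandbox noise is suppressed before it ever competes for an analyst's attention.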

To address this, organizations are deploying AI-powered contextual filtering to triage the noise and restore a clear division of labor between what AI is trusted to handle and what humans are. Dugar sees a role for AI as a filter, not another firehose. By focusing on where the business is at greatest risk, the technology can assess which alerts actually matter and relieve SOC analysts and other specialists from having to address every alert as it comes in.

  • Sanctuary in the SOC: Dugar emphasizes that leaders should align workflows to how humans actually process information, not how they wish they did. "When you have an incident, having a space where people feel like they can take a moment to do some sort of mental, physical, or energetic exercise to recover and then jump back into the fire."
  • Taking a break: Hughes agrees: "If you just stepped away for five minutes, took a quick study break, and then came back to it, we've all experienced the clarity that comes after that pause. Why would we think it would be any different when it comes to cybersecurity?"

This is the point where "visibility" shifts from visibility into incidents to visibility into the people handling them. And, according to Dugar, this responsibility falls specifically on leadership, because organizations that treat workforce resilience as a governance priority outperform those that don't.

  • Visibility starts at the top: Where does leadership fit into understanding their team's well-being? According to Dugar, it is up to the leadership team to coordinate proactively. "It's not that leaders don't care; they just may not be close to the people actually running the organization for them. They might not be aware that people are getting burnt out or that morale is low unless one of the leaders brings that to the board's attention. And I strongly believe we shouldn't just be presenting problems to leadership. We should also be presenting solutions."

Dugar also challenges the instinct to keep adding tools by drawing a parallel to the Western medical system, where treating symptoms without investigating root causes produces a cascade of new problems. And this is the heart of Dugar and Hughes's argument: by the time an organization sees the symptoms of fatigue, the problem is already well established, and missed threats are a real issue. "In the cyber world, there's a vulnerability, and instead of figuring out where it came from, we just throw a bunch of tools at it. It fixes the superficial layer, but it's not fixing the root causes."