From the Book
Explore the ideas that are transforming how organizations think about security culture. These excerpts offer a window into The Giroux Methodology.
From the Author
A personal message from Jamie Giroux on the inspiration behind three decades of work that culminated in Stop Training Robots.
Walk into any security awareness training session and watch what happens. Eyes glaze over. Phones appear under tables. People click "Next" without reading, racing toward the completion certificate that proves they've been "trained."
We've built an entire industry around treating humans like faulty machines that need reprogramming. Install the right code—policies, procedures, annual training modules—and surely they'll execute correctly. When they don't, we blame the hardware. "Human error." The weakest link.
But here's what thirty years in this field has taught me: the problem isn't the humans. It's the assumption that humans should behave like robots in the first place.
"When we treat employees like machines to be programmed, we get compliance without comprehension. We get checkboxes instead of commitment. We get the appearance of security without its substance."
Consider the phishing simulation—the crown jewel of modern security awareness. An employee clicks the simulated malicious link. What happens next? In most organizations: shame, mandatory retraining, perhaps a meeting with their manager. The message is clear: you failed. You are the vulnerability.
Now consider what we've actually accomplished. We've taught that employee to hide their mistakes. To feel anxious rather than vigilant. To see security as something done to them rather than with them. We've made the problem worse while convincing ourselves we're making it better.
There is another way.
Emotional intelligence isn't soft. It's not about being nice, avoiding conflict, or making everyone feel good. In the context of security, emotional intelligence is a tactical advantage—one that threat actors have been exploiting for years while defenders pretend it doesn't exist.
Think about what makes social engineering so devastatingly effective. It's not technical sophistication. It's psychological sophistication. Attackers understand fear, urgency, authority, social proof, reciprocity. They understand that a person who feels rushed makes different decisions than a person who feels calm. They understand that humans don't operate on logic alone.
"The attackers have been studying psychology for decades. It's time defenders did the same—not to manipulate our people, but to empower them."
The five pillars of emotional intelligence—self-awareness, self-regulation, motivation, empathy, and social skills—map directly onto the capabilities we need to build genuine security cultures:
Self-awareness means knowing your own biases before attackers exploit them. It means recognizing when you're tired, stressed, or distracted—the moments when you're most vulnerable. Security programs that build self-awareness don't just train people what to look for; they train people to know when they're not looking clearly.
Empathy means understanding that the colleague who clicked the phishing link wasn't stupid—they were probably overwhelmed, multitasking, trying to be responsive. It means designing security controls that account for how people actually work, not how policy documents say they should work.
This isn't about being soft on security. It's about being smart about humans.
Fear is the default setting for security communication. Scary statistics. Breach horror stories. "You could be next." The assumption is simple: if people are scared enough, they'll be careful.
The assumption is also wrong.
Fear doesn't create vigilance. Fear creates anxiety. And anxiety is the enemy of good judgment.
When people are anxious, they make faster decisions to reduce their discomfort. They're more susceptible to urgency tactics—exactly what phishing emails exploit. They avoid situations that trigger their anxiety, which means avoiding reporting potential issues. They develop learned helplessness: "I can't possibly keep up with all these threats, so why try?"
"Fear-based security creates the very vulnerability it claims to address. It's not just ineffective—it's counterproductive."
I've seen this play out hundreds of times. An organization sends increasingly alarming security communications. Click rates on phishing simulations go down—success! Except incident reports also go down. People aren't more secure; they're more silent. They've learned that engaging with security only brings trouble.
The alternative isn't removing all consequence or pretending threats don't exist. It's building a culture where people feel confident rather than scared, capable rather than helpless, supported rather than surveilled.
The research is clear: positive emotions broaden attention and build resilience. People who feel safe are more likely to notice anomalies, report concerns, and help colleagues. Fear narrows focus to immediate escape; confidence enables the peripheral awareness that catches threats.
Here's a question I ask every security leader: "What happens when someone in your organization falls for a phishing email and reports it immediately?"
The answers reveal everything about security culture. In some organizations: "They have to retake training." In others: "We log it and move on." In the best: "We thank them enthusiastically and use it as a learning opportunity for everyone."
Now consider the math. A phishing email arrives. The employee clicks. In the first few seconds, the damage—if any—is done. But the learning opportunity is just beginning. If that employee reports immediately, security can warn others, analyze the attack, improve defenses, and build organizational knowledge.
"The moment someone reports a security incident—even one they caused—is the moment they become your strongest ally. How you respond determines whether they ever do it again."
This is the reporting paradox: the behavior we most need to encourage (immediate reporting) is the behavior our systems most often punish. We've created an environment where the rational choice is to stay quiet and hope no one notices.
I worked with an organization that transformed their culture by making one simple change: public celebration of reports. Not private acknowledgment—public celebration. "Sarah from accounting spotted a sophisticated phishing attempt today and reported it within minutes. Thanks to her vigilance, we were able to warn the entire organization before anyone else was targeted."
Within six months, their report-to-click ratio improved by 400%. Not because people clicked less, but because people reported more. They'd made security everyone's job—and everyone's win.
Security culture flows downhill. Every time.
You can have the most sophisticated training program, the most elegant policies, the most advanced technology. If leadership doesn't model security behavior, none of it matters. Employees watch what leaders do, not what security teams say.
The CEO who shares passwords "just this once." The VP who demands exemptions from security controls that slow her down. The manager who treats security training as an interruption to "real work." Every one of these moments teaches the organization what security actually means here.
"A security leader who cannot read a room will never secure it. Technical excellence means nothing if you can't bring people along."
But here's what's often missed: leadership modeling isn't just about following rules. It's about demonstrating the vulnerability that security culture requires.
The most powerful security moment I've witnessed came from a CEO who stood up at an all-hands and said: "Last week, I almost fell for a phishing email. Here's what it looked like, here's why it almost worked, and here's what I learned." The room was silent. Then someone else shared their near-miss. Then another.
In five minutes, that CEO did more for security culture than a year of training modules. He made it safe to be human. He made vigilance something to aspire to rather than perfection to fake.
This is the leadership equation: visible vulnerability plus genuine commitment equals cultural permission. Give people permission to be imperfect humans in the security conversation, and they'll show up as engaged partners rather than reluctant subjects.
These excerpts are just the beginning. The complete book provides the full methodology, implementation frameworks, and practical tools.