After Gartner released “Top Cybersecurity Trends for 2023”, those of us in Security Awareness, Behaviour and Culture did our stampy feet and spewed our rage into the ether. Or was that just me?
I promised vengeance, and said I would publish a response, so here it is. My response:
Security leaders should rethink their balance of investments across technology and human-centric security design practices.
2020 called, and wants its talk punchline back! Or there’s my blog on “Security Worker vs Secure Tech”? Ok, so I’m not a big deal, maybe no one saw my slides, which look something like this:
Yes! Security leaders should rethink their balance of investments across technology and human-centric security design practices, but they won’t, because the majority of security leaders are sold on “technology solves all”, and people are the weakest link, blah, blah, and all that jazz! One Gartner report, behind a paywall, won’t change the things we’ve been trying to encourage for years! I know what you’re shouting: not all security leaders. You’re wrong. All security leaders! Look at the chart here. Can you honestly say your investment split doesn’t look like this? Is it closer to 50:50? If it is, can we be friends?
“A human-centred approach to cybersecurity is essential to reduce security failures,” said Richard Addiscott, Sr Director Analyst at Gartner. “Focusing on people in control design and implementation, as well as through business communications and cybersecurity talent management, will help to improve business-risk decisions and cybersecurity staff retention.”
I feel like we’re either missing some punctuation, or this doesn’t actually say anything. What are you on about, Richard? A human-centred approach reduces security failures? Which security failures? The failure to secure, or the failures of security? Open to interpretation, maybe?
“Focusing on people in control design and implementation, as well as through business communications and cybersecurity talent management, will help to improve business-risk decisions and cybersecurity staff retention.”
So, I think I know what he’s getting at. Security people roll out technical controls without considering the human use case. If that’s what he means, then yes. They do. You need the cultural and behavioural piece to go hand in hand with the technical control. Spot on. Simple enough, but that’s not what I see in the real world. Flip the switch for the control, without a communication campaign, and staff bypass the control, because the clever people didn’t stop to consider what it means for the people trying to do their jobs. Or, do all the awareness, and don’t implement controls, leaving staff to “do the right thing”. A balance is needed here, so yep, focus on people in control design and implementation. Understand the use case, and try to preempt the control bypass that will inevitably come.
Cybersecurity talent management, in this context, reads to me as: cybersecurity talent must have a human-centred approach, and be managed into this “shoe”, which leads to cybersecurity staff retention. What happened to “you cannot put the same shoe on every foot”? Cue my internal meltdown on what’s right for security, and what’s right for equality vs equity. Perhaps I’ve misunderstood Richard’s point, but what I’m reading is that cybersecurity people must think in a certain way, yet what makes a high-performance team is a collection of people who think in different ways. There are shelves and shelves of books about this; The Five Dysfunctions of a Team: A Leadership Fable, for one. Not to stereotype, but in my own experience, high-performing security people aren’t blessed with the greatest soft skills. They are computer wizards, and a computer is their safe space. So, to retain them, we have to force them outside of their safe space, instead of collaborating with people who do think human-centred first? Explain to me how making a skilled and capable person behave in a less than optimal way keeps them happy in their job, and not wanting to leave.
“Human-centric security design prioritises the role of employee experience across the controls management life cycle. By 2027, 50% of large enterprise chief information security officers (CISOs) will have adopted human-centric security design practices to minimise cybersecurity-induced friction and maximise control adoption.”
Prioritising employee experience? What a novel idea. Minimise cybersecurity-induced friction? How is a statement like this, slipped into a report in such a blasé fashion, acceptable? As if cybersecurity-induced friction is the norm? Why? Oh… the rage! I’m honestly at a loss for words. By 2027, CISOs will… minimise cybersecurity-induced friction. If this friction is a known known among CISOs, what the hell are you playing at? I’m not naive; I spend a good amount of my time sweeping up the carnage left behind by a CISO, while shouting “nothing to see here”, but I am shocked and appalled that cybersecurity-induced friction can be something we’ll just get to eventually, while wondering why the rest of the trends in this report even need to be mentioned.
“By 2027” feels a really long way away. I know time passes more quickly as you get older, but can we afford to drag this out until 2027? Employees already dislike security teams, and being told to do things securely, so why not look at human-centric security design to prioritise the role of employee experience across the controls management life cycle now? That way, you understand your employees, how they work, what your controls mean for them, and how those controls could create an insider threat.
As Helen Lovejoy, from The Simpsons said: "Ohhh, won't somebody please think of the frictionless relationship with cybersecurity!" (I paraphrase).
“Traditional security awareness programs have failed to reduce unsecure employee behaviour,” said Addiscott. “CISOs must review past cybersecurity incidents to identify major sources of cybersecurity induced-friction and determine where they can ease the burden for employees through more human-centric controls or retire controls that add friction without meaningfully reducing risk.”
I wish I had the budget for a Gartner subscription, so I could read what they consider to be a security awareness program, but I don’t, so we’ll have to assume they mean the “basics”.
The basic and traditional security awareness programs are not security awareness programs. They are annual training tick-box exercises, with a monthly blog if you’re lucky. They are not designed to reduce unsecure behaviour; they are designed to fill a very specific need, whether that be audit prep, investigation evidence, or the minimum someone needs to do to pass their probation.
Absolutely, root cause analysis is key, and again, no offence Richard, but I think you took those words from a talk I did many, many years ago! If you missed it, good job on catching up with the entire Security Awareness community! When you analyse the incident root cause, and produce a program that mitigates that cause, you mature the business, security, and its people. Again we get the friction here, with the addition of retiring controls that add friction without meaningfully reducing risk. Stop using the control playbook of switches to flip. People will just bypass the nuisance ones anyway. Use facts, risk, and evidence to inform security. It just makes sense.
“Traditionally, cybersecurity leaders have focused on improving technology and processes that support their programs, with little focus on the people that create these changes. CISOs who take a human-centric talent management approach to attract and retain talent have seen improvements in their functional and technical maturity. By 2026, Gartner predicts that 60% of organisations will shift from external hiring to “quiet hiring” from internal talent markets to address systemic cybersecurity and recruitment challenges.”
Quiet hiring of transferable skills within an organisation? Now, I’m no expert on recruitment challenges, but I’ve been part of a fair few recruitment campaigns with my clients, and I’m not sure Gartner understands the recruitment challenges out there. Or maybe they understand them from one side of the fence? The recruitment challenges, as I see them, are that the right skills aren’t in the open market, so you could interview 100 potential candidates and end up hiring 0. You could see the most academic CVs, yet the candidates have no on-the-job experience, so there’s a huge cost in providing that experience. Salaries are either overly competitive or too low, so candidates can pick and choose where they want to work. And so on.
Now, I’m all in favour of hiring transferable skills, and I routinely do this. I don’t hire cyber people; I hire people who know stuff that I don’t know. I hire people who have a different life experience to me, so we can be a high-performance team, but that’s really hard. It takes a lot of time in mentoring, and costs a lot in training, and hiring an off-the-shelf security person may be all you have the time and money to do. I absolutely agree that hiring the people who improve technology and processes will deliver improvements in functional and technical maturity, but I’m not sure shifting from external hiring to “quiet hiring” is the answer to this problem. It may be the answer to some of the problems, but my crystal ball says that’s unlikely.
“Technology is moving from central IT functions to lines of business, corporate functions, fusion teams and individual employees. A Gartner survey found that 41% of employees perform some kind of technology work, a trend that is expected to continue growing over the next five years.”
“Business leaders now widely accept that cybersecurity risk is a top business risk to manage – not a technology problem to solve,” said Addiscott. “Supporting and accelerating business outcomes is a core cybersecurity priority, yet remains a top challenge.”
“CISOs must modify their cybersecurity’s operating model to integrate how work gets done. Employees must know how to balance a number of risks including cybersecurity, financial, reputational, competitive and legal risks. Cybersecurity must also connect to business value by measuring and reporting success against business outcomes and priorities.”
41% of employees perform some kind of technology work? Give me a minute, while I try to come up with any employee, in any place, who doesn’t use any type of technology for their work… Let’s consider someone who doesn’t touch a computer to do their job. Um, ok, let’s say the cleaning crew who arrive at 5am. They don’t use a computer to clean, perhaps? They probably do use the building access controls, aka a physical security control. They probably create some sort of log about the outcomes and checks they carried out, which, in fairness, may be pen-and-paper records, and may not use any technology. My staff use a clocking in and out app on their phones, so let’s assume this cleaning crew does that too. Anyway, I digress. “41% of employees perform some kind of technology work using a system the cybersecurity team knows exists” is probably more accurate.
Moving on. Good CISOs do understand they are in the business of supporting and accelerating business outcomes, and it remains a top challenge because, year on year, cybersecurity budgets have been cut. That old “do more, with less” thing. So CISOs are forced to beg, borrow, and steal talent and resources from across the organisation to make a difference to business value.
On the “employees must know how to balance a number of risks” statement, I took a few minutes to stare at a wall and consider what I think about this, remembering back to when I was an “employee who must know about this”. So, what do I think? I think that you need to know how to do the things that relate to your job, and that when someone stomps around with a risk hammer of any flavour (cybersecurity, financial, reputational, competitive or legal), it will feel like they’re talking with the mic off. Real security behaviour and culture change isn’t this. Employees should have a general awareness of all the risk flavours, and willingly care about them. The very fact you have to try to make people care about something means people actively don’t care. How many charity collectors do you avoid in the street, versus the ones you engage with?
“Cybersecurity must also connect to business value by measuring and reporting success against business outcomes and priorities.”
Yep. Bang on. Yet how many cybersecurity people do you know who are capable of doing this? It takes a security culture and behaviour person, or a cyber leader with the ability to translate security stuff into business outcomes and priorities. It’s not an easy job, but it’s well worth doing. (Welcome to Jemma’s toast talk, 2020.)
“The board’s increased focus on cybersecurity is being driven by the trend toward explicit-level accountability for cybersecurity to include enhanced responsibilities for board members in their governance activities. Cybersecurity leaders must provide boards with reporting that demonstrates the impact of cybersecurity programs on the organisation’s goals and objectives.”
“SRM leaders must encourage active board participation and engagement in cybersecurity decision making,” said Addiscott. “Act as a strategic advisor, providing recommendations for actions to be taken by the board, including allocation of budgets and resources for security.”
And, it’s as easy as that! I can’t be the only one who’s attended a talk titled “what boards care about” or “how to speak to the board” at some cyber conference? What infuriates me about these talks is that there is never a member of a board leading them, or taking part in the panel. Why? Is it because they don’t want to give you this crucial information? Because they want to see 31 slides of words when they asked for a RAG status? No! It’s because we security people never ask them! We assume their brains work in the same way as ours. You get 5 minutes with the board, if you’re lucky, and you went with 31 slides! For once, get off your ass and ask them how they want the info. It’s the first thing I do! I ask them how they best digest information, and I give them the information in that way. It’s how you influence business decisions through security, and again, I did a talk on this a very long time ago!
So, where was I going with my splurge of angry words? When I was repeatedly tagged in the LinkedIn post relating to the report, there were a huge number of people like me hollering “we’ve been saying this for years!” When I shared the post, I had a few interesting questions from my network, and every time I read a question, I re-read the piece to refresh my memory, and the rage grew. I don’t know who Richard is, and I’m sure I’m off his seasonal card list, but things like this are just not helpful. I guess my rage stems from the throwaway comments that make this the norm. The far-flung dates that leave “I don’t need to think about this for a while” sitting in someone’s mind.
Cybersecurity is in the mess it’s in because we prioritise technology and processes, and we always forget the people aspect. Cybersecurity still believes that people are the problem and the weakest link. Cybersecurity people are amazing, and I love most of you, but you cannot be all the things, all of the time. You cannot make a blanket statement that says people all need to behave or think in a set way. You cannot expect cybersecurity people to work in the same way as internal comms or PR teams, just like you don’t expect the receptionist to monitor the SIEM while the analyst is on lunch.
Gartner, you are not helping anyone with this type of content.