Studies consistently point to human factors as a primary root cause of cyber-attacks. Organizations are taking note, increasingly turning to security training to improve security knowledge and avoid putting users and the organization at risk. But the effectiveness of security training has long been debated: most employees tend to tune out of training sessions or find them uninteresting. This is because so-called ‘security awareness’ is something of a misnomer. Simple knowledge transfer is not the goal; knowing is not the same as doing. The goal of a human-focused program should instead be behavior change and a resulting reduction of risk. This is why security programs need to be more ‘humanized’ and practical in their approach.
Let’s look at the key elements of a human-centric cybersecurity awareness program:
4 key elements of a human-centric cybersecurity program
Think back to when a parent taught us to eat with a spoon: a very basic early-life skill that required patience, and some cleanup afterwards, because we didn’t always hit our target. Similarly, teaching employees security best practices requires patience. There is a right way to teach: start by listening, don’t make people feel ignorant or embarrassed, and build a relationship on a personal level that fosters trust. If employees trust you, they’re more likely to engage with you, and more likely to report a security incident rather than hide it for fear of reprisal or embarrassment.
Awareness training was built on the premise of helping users connect the dots: how do we alert people to a risk they don’t see as evident when they click a malicious link, are redirected to a bogus website, or download a malware-laden attachment? Security teams may tell hundreds of users about the different risk scenarios they might encounter; however, they fail to connect these risks to the activities in users’ daily work. If your house is on fire, you can see flames and smell smoke, and the instinct for protection and survival kicks in. So far, we really haven’t done a great job of activating those senses for digital risk. For example, reusing the same password is a bad security habit. Put that in the context of someone’s home and it becomes instantly relatable: if the same key could open multiple houses, would you feel safe at night? Passwords are likewise the keys to our data, money, identity, and privacy. Such examples immediately help people connect with and visualize a digital world they cannot see, making the abstract tangible by linking it to something they already know and care about.
Culture is an important part of what makes people tick, so keep cultural sensitivity in mind when designing training and enablement programs. Focus on building the right relationships with the right teams across departments and regions so that security teams better understand the cultural context of security. For example, when working with global regions it’s important to localize your content, not merely translate it. Content should use local formats, local examples, and the local attack vectors prevalent in those regions; a straight-up translation won’t do, because many of the original examples may not apply there.
Similarly, the culture of your legal team is probably not the same as the culture of your marketing team, so security teams will have to adapt their communication style and approach to each cultural context. And don’t forget: culture is infectious, and it can work either for or against your program. Get the foundation right and a domino effect will bring more people on board with the program; get it wrong and more people will feel disengaged.
Each of us, in our role, department, and function, has limitations, and security teams should take the time to understand what those limitations and frustrations may be. Is there too much bureaucracy? Are there too many steps, or too many redundant approvals, to get things done? How can the process be improved, whether for a particular team, a particular leader, or a business function? Viewing your security program and your employees’ security struggles through an empathetic lens is key to understanding why someone behaves a certain way. For example, shadow IT (the unauthorized use of technology) is dangerous in organizations because it’s uncontrolled. IT often lashes out at the business: you’re not allowed to use that server or tool because it hasn’t been scanned, tested, or approved, and we don’t know whether it’s secure or meets our compliance standards.
But the question they often forget to ask is: why are you using it, and how can I enable you with something that performs just as well and still lets you do your job? Ask that question first, and shadow IT should gradually fade as a problem while employees finally come to see IT as an enabler.
To summarize: it’s important to understand and explain the technical concepts of cybersecurity, but it’s equally important (if not more so) to communicate patiently with people. That skill is currently lacking in many cybersecurity practitioners. If security teams develop their soft skills and work to humanize their training and security programs, they will have a better chance of building a stronger, more resilient, and infectious culture of security.