ignosecond – ig·no·sec·ond – \ˈig-nə-ˌse-kənd\ – The time between the moment one does something inherently stupid and the moment one realizes that it is too late to stop the results of that action. Examples include pushing a locked car door closed and realizing that the keys are in the ignition, or opening an attachment or clicking a link in an email message supposedly from a business associate or friend and then recognizing the telltale signs of a phishing scam.
It turns out that the latest breaches were the result of an ‘ignosecond’ on the part of one or more employees, which in turn made a security breach possible. All it took was an email message to personnel containing a piece of malware hidden in a file attachment; the malware exploited a vulnerability, allowed the installation of a backdoor and, voilà, another compromise.
This should be a wake-up call to all security professionals. It does not matter how sophisticated your security technology is; it only takes one person to cancel all of it out. This is why the PCI DSS dedicates requirement 12.6 to security awareness. Requirement 12.6 states: “Implement a formal security awareness program to make all personnel aware of the importance of cardholder data security.”
- Educate personnel upon hire and at least annually. Note: Methods can vary depending on the role of the personnel and their level of access to the cardholder data.
- Require personnel to acknowledge at least annually that they have read and understood the security policy and procedures.
The problem is that a lot of security professionals give only lip service to security awareness training. Let us face it: security awareness training is not as sexy as security technologies like SIEM and WIPS. And besides, our users are, well, users. Even if you train them, they still make mistakes, so why bother with security awareness training? However, at the end of the day, everything in an organization’s security posture comes down to the people who interact with the information you are trying to protect. As I stated earlier, it only takes one person having a bad day, or one “bad apple,” to render all of an organization’s security technology and other controls impotent.
This is why security awareness is such an important part of an organization’s security posture. Whether you like it or not, there are human beings in the equation and human beings are fallible. The only way to address this situation is to educate your fellow employees on how to make things secure and avoid being taken in. But remember, human beings are fallible, so no matter how hard you press security awareness in your organization, you are still going to have incidents. Therefore, the goal of security awareness training is to minimize the number and impact of those “ignoseconds.”
But we need to be honest about all of this. Human beings are fallible and we all have our “moments.” As a result, even with plenty of appropriate security awareness training and periodic reminders, one or more personnel can have a “moment” and create the possibility of a breach. Even with defense in depth, all it takes is one well-crafted attack reaching one fallible human, and your security is breached. As I have repeatedly stated, security is not and never will be perfect. And this is particularly true when human beings are involved.
My favorite story about such a situation is from years ago, when I was conducting a social engineering attack against a subsidiary of a Fortune 500 company. We had crafted a very real-looking email message from a known Human Resources consulting firm indicating that the firm was conducting a survey of the subsidiary’s employees on behalf of the corporate parent. We instructed recipients to log into a phony Web site and take the survey. All they had to do was use their network logon credentials to gain access to the survey. We only got two hits before the parent company’s HR department sent out an urgent email message telling employees that our message was bogus. One of the two people caught was the CFO of the subsidiary that had hired us. His comment when confronted with the fact of his “moment”? “I suppose I shouldn’t have done that.” What an understatement!
But this story points out the problem all security professionals face, and it is one problem that technology is not going to solve. In the end, people are always going to make mistakes, and all we can do is minimize the impact of those mistakes. Minimizing the impact means real security awareness training coupled with social engineering testing to assess how well the security awareness training is working. In addition, you need to structure your preventative, detective and corrective controls so that you address any points where one “moment” results in a compromise. In some cases, you may need to restrict people’s access to certain resources or divide up responsibilities.
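If you do couple awareness training with social engineering testing, the natural metric is how employee responses change from one test to the next. As a rough illustration only (the data model, campaign names and numbers below are entirely hypothetical, not from any real engagement), here is a minimal sketch in Python of comparing phishing-simulation click rates before and after training:

```python
from dataclasses import dataclass

@dataclass
class CampaignResult:
    """Outcome of one simulated phishing campaign (hypothetical data model)."""
    name: str
    emails_sent: int
    clicked_link: int         # recipients who clicked the simulated phishing link
    entered_credentials: int  # recipients who went further and typed a password

def click_rate(result: CampaignResult) -> float:
    """Fraction of recipients who clicked the simulated phishing link."""
    if result.emails_sent == 0:
        return 0.0
    return result.clicked_link / result.emails_sent

def improvement(before: CampaignResult, after: CampaignResult) -> float:
    """Relative drop in click rate between two campaigns (positive = improvement)."""
    base = click_rate(before)
    if base == 0.0:
        return 0.0
    return (base - click_rate(after)) / base

# Hypothetical numbers: one campaign before awareness training, one after.
before = CampaignResult("Q1 baseline", emails_sent=200, clicked_link=50,
                        entered_credentials=12)
after = CampaignResult("Q3 post-training", emails_sent=200, clicked_link=20,
                       entered_credentials=3)

print(f"Baseline click rate:      {click_rate(before):.1%}")
print(f"Post-training click rate: {click_rate(after):.1%}")
print(f"Relative improvement:     {improvement(before, after):.0%}")
```

The point of tracking a trend rather than a single number is that, as noted above, the click rate will never reach zero; what you can demonstrate is that the number and impact of those “ignoseconds” is shrinking.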
Most security professionals loathe social engineering tests and rightly so. As someone famously said a while back, “When on a witch hunt, you are always going to find at least one witch.” As I have already stated, everyone has their moments and as social engineers such as Kevin Mitnick have shown, there are always ways to social engineer your way into any organization. Not that organizations have done themselves any favors in this area. For the last quarter of a century, most organizations have been focused on customer service improvements. A by-product of this customer service improvement focus has been to train employees to be customer friendly to a fault. It is those faults that are now being used against them by social engineers. While good customer service is necessary, customer service training needs to be coupled with a healthy dose of skepticism to ensure that information is not provided without proper authorization.
The best example of customer service gone awry is from the 2010 DEF CON “How Strong Is Your Schmooze” contest. This contest was a social engineering exercise against large companies that produced some very embarrassing results. Contestants had two weeks to prepare for their social engineering exercise by conducting research on their targets. Of the 15 organizations contacted, 14 gave up one or more of the 25 available “flags.” To add insult to injury, the social engineers had only 25 minutes to make their telephone calls in front of a live audience. If you have read the report, you may have issues with the 25 “flags” that were used (God knows the FBI was very concerned and advised the DEF CON people on what they considered acceptable information to obtain), but you must remember that if this sort of information was obtainable, then probably just about anything could be obtained.
The lesson to be learned in all of this is that if you are not worrying about social engineering and conducting security awareness training, then you are kidding yourself if you think your organization is truly secure. Yes, there is little you can do to stop human beings from having “ignoseconds.” But you can take steps to minimize the impact and one of the most important is to get serious about your security awareness training and to follow that training up with social engineering testing. Just acting on those two items can make a significant difference in the impact of a social engineering attack.
UPDATE: If you think I’m blowing smoke, here are the results of a survey that confirms what I am saying.