To Eyre Is Human
12 June 2017
Insider threats don’t spring forth from nothing, wholly formed and evil. Business Reporter's resident U.S. 'blogger Keil Hubert argues that signs and symptoms of potential insider threat behaviour can be noticed and acted on before an incident, if only someone is actively paying attention to the users.
Quick: name some personality traits that indicate a potential ‘insider threat.’ I’ll bet five quid that you instantly named several of the common traits that the U.S. National Cybersecurity and Communications Integration Center listed in their 2014 publication ‘Combating the Insider Threat.’ Right up front: ‘greedy.’ That’s a classic. You probably listed narcissism or possessing an ‘entitlement mentality.’ Those are great. Maybe ‘egocentric’ or ‘arrogant’ too. Well done, you. We’re all familiar with these sorts of malcontents; they’re what we expect to hear in real-world news stories about rogue employees. Remember in 2008 when the City of San Francisco was locked out of their own fibre optic network by a disgruntled sysadmin? Yeah. These people are real. We have examples.
One thing, though: when you assembled that list of insider threat personality traits, did you include the trait ‘lonely’? Because you really should have …
I just got around to reading Brendan Koerner’s article ‘For the Love of Duke’ from the October 2015 edition of WIRED, and … wow. This is a gripping read. The thrust of the article is right there in its subtitle: ‘How a lonely Appalachian woman met the man of her dreams online – and became a pawn in a global crime scheme.’ If you don’t have time to read the original, I’ll summarize: in 2012, a middle-aged woman named Audrey was contacted by a Nigerian ‘Yahoo Boy’ scammer over social media under the false pretence of romantic interest. As their online ‘relationship’ blossomed, the lonely woman was convinced to send all the money that she had (and could raise) to Nigerian scammers pretending to be ‘Duke,’ a down-on-his-luck widowed Scottish oil rig worker. By the end of the saga, Audrey was laundering hundreds of thousands of dollars of fraud proceeds for the scammers through her friends’ and family members’ bank accounts. She was arrested. There was no happy ending.
If this had been a Hollywood romantic comedy, the ‘criminal fraud’ subplot would have been a wacky misunderstanding in the second act, resolved and forgotten by the joyous wedding in the third act.
Funny thing … the Combating the Insider Threat report never uses the words ‘lonely,’ ‘romance,’ or ‘isolated’ once. Same thing in most of the other insider threat articles that I’ve collected as handouts for my IT Troubleshooter course students. They don’t mention those motivating factors either. The thing is, just because a lonely person may not be as likely to evolve into an insider threat as an egocentric, aggressive, and angry employee, that doesn’t mean that they’re not vulnerable to being exploited by cybercriminals. Remember: a criminal’s ultimate objective is to secure an outcome as efficiently as possible. Ideally, a baddie would just sneak into the target’s systems and steal whatever information or money he or she came for. When that’s infeasible, though, they co-opt helpers – some volunteers, some dupes – to do their work for them. A user who’s starved for positive attention and who lacks the technical sophistication to recognize a lie can be turned over time into an involuntary quisling … and, thereby, can be made to do the criminal’s bidding inside the network.
This is not meant as a slam against vulnerable users, middle-aged female workers, or true romantics. Not at all! Koerner’s article relates one story of a user who was tricked into facilitating cybercrime. There are thousands of other stories out there of people who were manipulated by criminals along their own unique biases into doing things that they shouldn’t have done. That’s what the practice of social engineering is all about: taking advantage of decent people who are distracted, misdirected, over-friendly, or otherwise vulnerable. Legendary hacker Kevin Mitnick’s book The Art of Deception is dedicated entirely to the craft of compromising people rather than defences. It works.
Back when I was in the military, I used to teach my techs Operations Security awareness. As IT people, we had a significant role when it came to protecting sensitive information. While most military members have to be perpetually tight-lipped, our IT support guys also had to be on the lookout for signs of adversary data collection activities. We had to know how the bad guys went after our data so that we could implement our own active and passive countermeasures to thwart them when one of our users inevitably made a mistake or got tricked by an adversary.
I’ve been trying to incorporate that same sort of impactful education for my IT Troubleshooter Course students because they – unlike my troopers – aren’t likely to get that sort of exposure in their technical classes. Most conventional IT education is over-focused on technical skills. There’s really no equivalent of an ‘IT liberal arts curriculum.’ A student is force-fed highly-concentrated bursts of facts, definitions, procedure steps, and trivia unique to either specific technology products (e.g., Windows XP) or else performing specific functions (e.g., recovering deleted data). Students get the bare minimum of ‘what’ and ‘how,’ with only the occasional glimmerings of ‘why.’ This practice makes the majority of entry-level IT workers qualified to perform technical actions under strict supervision … and little else.
We exacerbate the problem when we take promising young techs and entomb them in the data centre, isolated from all user contact.
I’ve seen this often with new hires fresh from university. Interns and new entry-level techs come in with a CV full of certifications and a transcript full of technical classes, but with no meaningful job experience, no practical interpersonal experience, and no sense of perspective. The first deficiency is to be expected; the other two are indicators of inadequate education. To be fair, these are completely manageable deficiencies. Since we know to expect them, we can mitigate our newbies’ skill gaps.
The first thing that we have to teach is how to see and comprehend the ‘bigger picture’ in information services. That means teaching things like office politics, organisational structure, interpersonal dynamics, IT logistics, and (most importantly) user behaviour. Most high school, vocational, community college, and university degree programmes don’t build any social science requirements into their IT training curricula. So, we teach it. Or, at least, I do, because I want my new hires to be useful as soon as they hit the ground.
That’s why I’m teaching and emphasizing cross-disciplinary skills in my IT Troubleshooter Course. I’m investing about half of my total instruction hours in the social science stuff – the culture and context knowledge – that will make these guys immensely valuable to their next hiring manager. That’s because a very large part of social science training involves understanding users, not just technology. Yes, we cover a bunch of IT facts, but that’s only ever half of the equation. It’s how real people use their IT kit that matters. Since I’m optimizing the programme specifically for first tier tech support jobs, that focused instruction is crucial to the students’ success.
I argue that the first tier of any tech support model is always the most important tier. It doesn’t matter which support model you subscribe to; the lowest tier in every model is the one that operates closest to the users. These are the techs who interact with users where they live. They get to directly observe user behaviour and interact beyond just dry, functional support requests. First tier technical support is the tip of the allegorical spear for detecting, correcting, and understanding problems related to behaviour. They’re often the first people who find the subtle clues that expose bigger issues.
You don’t have to flirt with users to get them to open up to you. Sometimes, the simplest act of human kindness is enough.
That proximity means that they’re the IT organisation’s eyes and ears for information security vulnerability detection. By interacting with users, junior field techs come to understand the people that they serve. As they discover personality quirks and traits, they can alert the InfoSec department to potential indicators of compromise. They also have a fair fighting chance to pre-emptively intercede with the user to mitigate whatever potential threat they might casually discover.
This doesn’t have to be a covert or sinister thing. It’s the exact same function that beat cops and combat medics employ. By immersing oneself in the community and interacting with people, the representative of a larger function learns to spot signs and symptoms of potentially disruptive behaviour, and then directly address the person or people involved. Just like a medic noticing that a squaddie is looking pale and wan, or a cop noticing that a normally-cheerful kid is becoming sullen and aggressive, a tech support agent notices when a user is uncharacteristically upset, distracted, distant, or otherwise atypical. The sharp field tech can then address the behaviour change directly. This isn’t a new idea. It’s entirely old-school policing: know your people, know your ground, and pay attention to changes that might signal trouble brewing.
When you look at most IT employee training, though, the human side of IT rarely gets covered. There’s tons of fault and failure content, procedural instruction and definitions, but little (if any) attention paid in lecture to user behaviour. No wonder, then, that so many organisations find themselves blindsided by insider threat events. Rather than pay attention to (and comprehend the importance of) the early warning signs of peculiar employee behaviour, companies in general (and IT departments in particular) tend to treat rogue users like the classic ‘madwoman in the attic’ literary trope. Think Jane Eyre’s out-of-nowhere villain Bertha Rochester. Everything was fine, until one day when everyone was completely surprised by User X’s sudden and wholly unexpected violation of company regulations. The merciless incident review highlighted a bunch of obvious clues that management and security should have noticed, but it seemed like no one had wanted to get involved …
I call malarkey on that. At the very least, the assigned managers should damned well have known their own people well enough to spot the early warning signs of inappropriate behaviour. So, too, should the tech support agents who supported the office. The person responsible for keeping the user operational had a duty to notice warning signs and strange personality traits. The question to ask (I argue) is: was that tech support agent properly trained to recognize and to report on such behaviour?
Imagine if Audrey from the aforementioned WIRED article had been one of your organisation’s users. Knowing just a little about her background and personality, it would have been easy for a manager or co-worker to notice when she changed from being quietly sad to passionately thrilled about a new long-distance romantic relationship. Her demeanour and attitude changed. There’s nothing wrong with that, no; be happy for your users’ new-found joy. But Audrey’s corresponding change in communications habits – that is, significantly increased social media use and international video calling – should have triggered an immediate warning, especially for your InfoSec department. It would only have taken one field tech chatting with Audrey to learn about her new ‘Scottish’ boyfriend, and then correlate that with Audrey’s Skype connections from the US to Nigeria in order to recognize a crucial discrepancy and sound the alarm.
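To make that correlation concrete, here’s a minimal sketch of the kind of check a curious InfoSec boffin (or an automated alert) might run. Everything here is illustrative: the function name, the threshold, and the sample data are my own assumptions, not anything from the article, and a real deployment would pull its inputs from call or flow logs plus a GeoIP lookup service rather than hand-typed lists.

```python
# Hypothetical sketch: flag a mismatch between where a user SAYS a new
# contact lives and where their call traffic ACTUALLY terminates.
# Country codes, sample data, and the 0.5 threshold are all illustrative.
from collections import Counter

def flag_location_mismatch(claimed_country, connection_countries, threshold=0.5):
    """Return True if more than `threshold` of the observed connections
    terminate somewhere other than the claimed country."""
    if not connection_countries:
        return False  # no traffic observed, nothing to flag
    counts = Counter(connection_countries)
    mismatched = sum(n for country, n in counts.items() if country != claimed_country)
    return mismatched / len(connection_countries) > threshold

# Audrey's 'boyfriend' is supposedly in Scotland (GB), but most of her
# video-call sessions resolve to Nigeria (NG):
observed_sessions = ["NG", "NG", "NG", "GB", "NG"]
print(flag_location_mismatch("GB", observed_sessions))  # True: alarm-worthy
```

The point isn’t the code; it’s that the human clue (the ‘Scottish boyfriend’ story, learned in casual conversation) and the technical evidence (connection geography, sitting in the logs all along) only become an indicator of compromise when someone bothers to put them side by side.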
Most security people are insatiably curious people. Spark one boffin’s interest, and you frequently get the entire security department working on your case.
Audrey could have been saved a ton of grief and legal woes (not to mention her house, her savings, and her freedom). Unfortunately, Audrey wasn’t working in a typical 9-to-5 office job with attentive tech support people when the Nigerian scammers targeted her. By the time her friends and family realized that she’d been scammed, it was already too late to save her from the consequences of her well-intentioned criminal actions.
That’s not the case for the people working with you, though. It’s both possible and necessary to detect potential vulnerabilities in your users and possible indicators of compromise in their communications and work habits. As a responsible member of the IT team, you can engage with your users, learn who they are, and pay attention to the odd changes in their behaviour patterns. Managers are the best-positioned sensors for this, but never underestimate the ability of Tier One tech support agents to correlate human behaviour clues with supporting technical evidence. Curious people are often the best detectors of threat indicators. Sometimes, it’s the humble young tech support guy or gal who makes all the difference between an attempted breach and a realized one.
If, that is, we train those Tier One tech support workers how and why to do so. We can’t expect our entry-level workers to have the mind-set of a seasoned veteran immediately upon entering the labour force. Someone has to teach the new folks the required skills and then continuously coach them thereafter on how to best employ those skills in the messy, confusing, and chaotic workplace.
I know. I’m being a terrible nerd right now. I just got around to reading the October 2015 issue of WIRED over lunch last week, and I’ve been subscribing to this magazine since its first year of publication. Somehow I’ve gotten waaaaaaaaaay behind in my professional reading.
Operations Security – or OPSEC – is the practice of denying your enemies the ability to interpret friendly intentions or activities through collecting snippets of unclassified information.
With the notable exception of the SANS family of courses. They delve deeply into the topic, which is why I’m strongly positively biased towards them as a training provider.
Title Allusion: Charlotte Brontë, Jane Eyre (1847 book)
Images under licence from thinkstockphotos.co.uk, copyright: depressed, MarinaZg; happy couple, Mckyartstudio; IT engineer, zhuyufang; delighted happy woman, yacobchuk; policemen, KatarzynaBialasiewicz.
POC is Keil Hubert, firstname.lastname@example.org.
Follow him on Twitter at @keilhubert.
Keil Hubert is a retired U.S. Air Force ‘Cyberspace Operations’ officer, with over ten years of military command experience. He currently consults on business, security and technology issues in Texas. He’s built dot-com start-ups for KPMG Consulting, created an in-house consulting practice for Yahoo!, and helped to launch four small businesses (including his own).
Keil’s experience creating and leading IT teams in the defense, healthcare, media, government and non-profit sectors has afforded him an eclectic perspective on the integration of business needs, technical services and creative employee development. This serves him well as Business Reporter’s resident U.S. blogger.