Well that stinks, doesn't it? Sony Pictures goes and scrubs the launch of a $44 million movie after being hacked, potentially by North Korea. Almost reads more like a James Bond plot than a news story, but there it is. And this time, it doesn't seem likely that Bond, James Bond, is going to show up at the eleventh hour to save the day.
Sure, Sony's hands may well have been tied when a couple of major movie theater chains announced they wouldn't be showing the movie. And I have no doubt that there are plenty of facts in this situation that I'm simply not privy to.
But the movie in question, The Interview, is a fictional comedy, after all. Where is the righteous indignation? Where is the "screw you, we're going to run it anyway" spirit? Perhaps it's worth considering how the U.S. would respond if another country filmed a "comedy" that depicted a standing U.S. president being assassinated. Still, the whole situation stinks, and I sure wish that Sony had had the spine to stand up to this cyberterrorism and run its movie anyway, even if that meant taking it straight to online streaming services and bypassing the equally spineless theaters.
But that's not really what I'm here to talk about today. Instead of dwelling on that which we can't change, let's instead consider how companies like Sony could turn a negative into a positive. After all, make no mistake about it: This time it's Sony, but it could be any company next time. And now that the precedent of capitulation has been set, I fully expect the bad guys to be emboldened more than just a little.
So then, how could Sony have turned things around here, even after agreeing to not run the movie?
First and foremost, let's take the time to learn from what happened here. By that, I mean let's study how the attacks took place and ensure we can prevent similar attacks in the future. To do that, the techniques and tools must, wherever possible, be understood by the broader IT security community, not just the handful of incident responders at Sony and elsewhere who worked this particular case.
In the incident-response world, through organizations like the Forum of Incident Response and Security Teams (FIRST), technical case studies from incidents like this one are often shared and discussed. FIRST itself holds several technical colloquia each year for that sort of purpose. Without a doubt, the most compelling sessions at these events, which are generally open only to the FIRST member teams, are the case studies of actual incidents.
The incident-response community uses an information-sharing model similar to that used by the medical community: It's acceptable to discuss a disease, its symptoms and its cures, but it's not acceptable to discuss patients directly.
I would hope that someone either at Sony or at another computer security incident-response team (CSIRT) that worked on this incident would be able to present a technical case study at an event like a FIRST "TC" so that others can understand and prepare for similar attacks.
But we can't stop there. It's one thing for the incident responders to understand new attacks and tools, assuming that's what was actually used against Sony, but it's also vital that that information works its way upstream, all the way to the people who develop our software.
I can vouch for the difficulty encountered in trying to get information like that all the way back to software developers. Indeed, I've spent the greater part of the last decade doing exactly that in my consulting practice.
But if we have any hope of getting to a point where a company like Sony can confidently summon the backbone to say "hell no" to a cyberbully, or cyberterrorist, we simply have to learn from our mistakes more effectively and share that knowledge with all the key stakeholders in the security of our systems. That set of stakeholders extends far beyond the confines of the traditional IT security department.
So, to you fellow incident responders out there, I say this: Find ways of spreading the knowledge you've amassed through dealing with incidents, and make sure that knowledge can be consumed by software developers, testers, system architects, business owners, and the entire cast of characters with a stake in our business systems. We'll never succeed by simply putting out today's fires without taking the time to systematically improve our practices.
With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.