We've lost something precious -- perhaps irretrievably. We've lost trust in our information systems, thanks to Edward Snowden's leaks of National Security Agency documents.
The loss of trust runs deep. We've lost trust in our security APIs, our crypto, our random-number generators, our network security protocols, our operating systems, our firmware, our hardware. To paraphrase my colleague Gary McGraw, it really is turtles all the way down, only these are snapping turtles, and they are bearing down on our most private of private data.
In the six months or so since Snowden's revelations began, our trust has been hog-tied, beaten to a pulp and left for dead.
Restoring that trust will take far longer than it took to lose it. It's going to be a long and arduous process -- if it's even possible, it is going to take years.
While we work to rebuild our lost trust, we have to continue to use the systems we no longer trust. We can't just abandon them and start from scratch. It will be something like completely refurbishing an airplane, engines and all, while it's already in the air.
I'm not here to pass judgment on Snowden, the NSA or any of the other parties caught up in this mess. Let the right and wrong be argued elsewhere. For the purposes of this column, the important point is that the whipped cream is out of the can and there's no getting it back in. So let's focus on how we can start to fix what has been broken.
Since we can't just throw everything away and start over, I think we have to tackle the problem from two directions: from the top down and from the bottom up.
Working from the top down, standards organizations, including the Internet Engineering Task Force, have to assemble the brightest minds to develop security standards that are outside the reach of any one government or agency. Everything, from our crypto algorithms to security protocols, needs to be re-evaluated, and fixed or rewritten if necessary. This will take years of hard work by smart people -- digital patriots, I'll call them.
And that's just the start. Just how paranoid do we need to be? If there have been hardware and firmware compromises, as some stories have suggested, the answer might be more than we can bear. But we've got to start somewhere, and our critical evaluations should certainly include supply chain management from the chip level upwards.
And then there's the bottom-up side of things. That effort will mostly rely on software developers. How do we write software that can be trusted at all? We start by always questioning our trust. Why should we trust a given API or algorithm? I've covered quite a bit of this ground here in my columns, but the principles we need to strive for include doing our own key management whenever possible, building our code on robust, rigorously peer-reviewed security foundations, and so on.
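To make the principle concrete, here is a minimal sketch of "trust the peer-reviewed primitive, not your own invention," using only Python's standard library. The function names are illustrative, not from any particular codebase; the point is that every cryptographic decision here defers to a vetted building block rather than a hand-rolled one.

```python
# Sketch: build on peer-reviewed foundations (illustrative names).
import hashlib
import hmac
import secrets


def new_key(length: int = 32) -> bytes:
    # secrets draws from the OS CSPRNG; never use random.random() for keys.
    return secrets.token_bytes(length)


def sign(key: bytes, message: bytes) -> bytes:
    # HMAC-SHA-256 is a rigorously analyzed construction (RFC 2104),
    # unlike ad hoc hash(key + message) schemes.
    return hmac.new(key, message, hashlib.sha256).digest()


def verify(key: bytes, message: bytes, tag: bytes) -> bool:
    # compare_digest does a constant-time comparison; a plain ==
    # would leak timing information to an attacker.
    return hmac.compare_digest(sign(key, message), tag)
```

Nothing here is novel, and that is exactly the design choice: the fewer decisions we make ourselves, the fewer places a subtle weakness can hide.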
But, as we learned from the famous Ken Thompson paper "Reflections on Trusting Trust," we need to be very aware of what we are trusting and why we are trusting it. Compilers can be compromised, for example, so that even otherwise secure software can end up having weaknesses in it.
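One practical response to Thompson's attack is diverse compilation and reproducible builds: if two independent toolchains produce byte-identical artifacts from the same source, a compiler backdoor would have to exist in both to survive. A minimal sketch of the comparison step, with hypothetical artifact paths, might look like this:

```python
# Sketch: compare build artifacts from two independent toolchains.
# If the digests match, a trusting-trust backdoor would have had to
# be planted in both compilers. Paths are hypothetical.
import hashlib


def artifact_digest(path: str) -> str:
    # Hash the artifact in chunks so large binaries don't need to
    # fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()


def builds_match(path_a: str, path_b: str) -> bool:
    # Byte-identical output from independent builds is the evidence
    # we want; any mismatch demands investigation.
    return artifact_digest(path_a) == artifact_digest(path_b)
```

This doesn't prove the absence of a backdoor, but it raises the bar considerably, which is the best we can do when trust in the toolchain itself is in question.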
That means we need to place an extra emphasis on security testing of the code that matters to us. Dynamic validation testing that verifies our security requirements are actually being enforced simply must be done.
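As a small illustration of what dynamic validation can look like, here is a sketch in Python: rather than assuming a security requirement holds, write a test that exercises it. The requirement here is "TLS connections must verify certificates and hostnames"; `make_tls_context` stands in for whatever factory an application actually uses to create its client TLS contexts.

```python
# Sketch: dynamically validate a security requirement instead of
# assuming it. make_tls_context is a stand-in for an application's
# own TLS context factory.
import ssl


def make_tls_context() -> ssl.SSLContext:
    # create_default_context() enables certificate and hostname
    # verification by default -- but we test that, not assume it.
    return ssl.create_default_context()


def test_tls_requirements() -> None:
    ctx = make_tls_context()
    assert ctx.verify_mode == ssl.CERT_REQUIRED, "certificates must be verified"
    assert ctx.check_hostname, "hostnames must be checked"
```

A test like this catches the all-too-common regression where someone disables verification "temporarily" to get past a self-signed certificate in development and the change quietly ships.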
As a community, our "security virginity" is lost -- if ever we had it in the first place. We can no longer merely rely on things like SSL/TLS to keep our data (in transit) secure. We have to question everything, and we have to double-check our work to ensure that even the most basic levels of trustworthiness are being met.
It is a sad state of affairs that has brought us to this point, and restoring our confidence is going to require a tremendous and concerted effort. We cannot simply go back to the status quo and hope for the best.
But if we focus our top-down and bottom-up efforts appropriately, perhaps we can bore a tunnel from both ends and actually meet halfway across, much as they built the Chunnel years ago. But, unlike the Chunnel, we don't only have to span the English Channel. We have to span the Atlantic Ocean, the Pacific Ocean, the Indian Ocean, the...
With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.