Google’s Vice President of Security and Privacy Engineering, Gerhard Eschelbeck, spoke yesterday to a packed house at the RSA Security Conference about his professional life. Google operates in a fishbowl because its business model depends on consumers, enterprise users and privacy regulators trusting it to store vast amounts of data in its data centers. This scrutiny, combined with gigantic computing scale, makes Google intriguing: it is a benchmark for establishing best security practices.
Eschelbeck’s stark mission statement, “to protect users’ data,” speaks to the alignment of his security group with the company’s cloud services and advertising business model.
Scale at Google, according to Eschelbeck, is like everything else at the company: intense. He said that when he interviewed for his job two years ago he had no idea of the extent of the responsibility or the scale of the operation. He confessed that when he was asked during interviews for his CSO position to estimate the size of Google’s data centers and networks, his estimates were two to three orders of magnitude too low.
Like most security pros, Eschelbeck kept details close to the vest; however, he did offer Google’s source code base as an example of scale, one that he guards both from theft and from intrusion and infection with malicious code. Google’s source code repository holds 2 billion lines of code, adding up to 85 terabytes, that 85,000 engineers interact with every day.
Eschelbeck put his role in context with his own experience. Since his career began in the mid-1990s, the types of attacks and the defenses against them have evolved in four-year cycles in response to the evolution of platforms. Viruses became worms, which became spyware and then ransomware, as networked PCs evolved into a pervasive web of cloud-interconnected PCs. Then mobile, and now mobile-cloud, started a parallel cycle presenting some of the same challenges along with new ones.
He is confident that the cloud security challenges have largely been met. But looking forward to 2020, he sees a huge network of interconnected things presenting still-unsolved problems that need engineering breakthroughs in encryption, authentication and software updates. These challenges should be solved now, not in four years when those systems reach scale.
Eschelbeck has a team of 600 people: half developers, and half product managers, project managers, administrators and operations staff. It runs 24x7, around the world, under the assumption that Google’s users and infrastructure are always under attack.
Some members of the Google security team work on building tools to protect the data. Others study the threat landscape: the constantly evolving threats to the constantly evolving technologies used by consumers and in data centers. Always mindful of the next exploit, they ask the question: what new techniques are the bad actors using, and how must the team respond?
Eschelbeck cited the discovery of a serious buffer overflow condition in the widely used Linux glibc library as an example of the Google security team’s speed and agility. Multiple teams were scrambled and worked in parallel streams. Google’s signal team monitored the infrastructure for an attack. Researchers proved the flaw could be exploited while other researchers sought as-yet-undetected variants. In parallel, Google engineers worked with engineers at Red Hat to create and test a patch. The final step was to notify the Linux community of the risk of the glibc exploit and the availability of a patch.
Eschelbeck said that his greatest responsibility is to continue to conduct research and innovate to build better security and privacy into products. It is the same innovation model used in self-driving cars and Project Loon, the novel internet access network suspended from balloons, applied to privacy and security.
An example of the innovation commitment is Project Zero. A zero-day exploit is one discovered by bad actors, criminals and nation states, that has never before been used to breach sovereign or enterprise security. Such an exploit is undetectable because it has no known signature for which signal teams can be on the alert. Google formed a full-time team dedicated to detecting as-yet-undetected vulnerabilities, not only in Google software but in any software its users rely on. He also said that funding the security research community and participating in related open source communities is important.
Eschelbeck’s team also works with all the product teams to advise them on building secure software and to scrutinize code prior to release. They can veto a release if they perceive the potential for an exploit. Living in the fishbowl, there is no difference in the impact on users’ trust between a small breach and a large one.
One of his constant worries is how to validate what is delivered by Google’s global hardware supply chain. Like most platform companies, such as Facebook and Amazon, Google assembles its computing infrastructure from commodity hardware components. As more companies grow into large platform companies and adopt similar commodity hardware infrastructures, the attack surface becomes larger and riskier and validation becomes a bigger problem.
One could speculate that the security team might someday produce a security-as-a-service product. Or perhaps Eschelbeck was simply empathizing with the plight of small and midsized businesses as he wondered aloud how they can survive without Google’s scale and a CSO organization.
CSOs would be wise to follow Eschelbeck and Google, because the company’s infrastructure is years ahead of most enterprises’ and it has the engineering resources and scale to illuminate pending threats and evolving defenses.