The Ontario Cancer Institute in Canada has deployed what it's calling Canada's fastest research supercomputer to help discover more effective cancer treatments.
The goal of the research, which has been under way for 10 years, is to understand cancer by developing and applying algorithms to analyze large quantities of complex data. Besides the Ontario Cancer Institute, the project team includes scientists from Princess Margaret Hospital, the University Health Network and the Hauptman-Woodward Medical Research Institute in Buffalo.
The project's lead scientist, Igor Jurisica, said that designing new cancer drugs requires analyzing protein interactions that, when displayed on a screen, look much like "a huge black hairball." Different algorithms must then be applied to help interpret the mass of data.
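Protein interaction data of the kind Jurisica describes is commonly modeled as a graph, with proteins as nodes and observed interactions as edges; one simple way to start untangling the "hairball" is to find highly connected hub proteins. A minimal sketch in Python, with invented protein names and interactions for illustration (not the team's actual data or method):

```python
from collections import defaultdict

# Hypothetical interaction pairs: each tuple is an observed
# interaction between two proteins (names are invented).
interactions = [
    ("TP53", "MDM2"), ("TP53", "BRCA1"), ("TP53", "EP300"),
    ("BRCA1", "RAD51"), ("MDM2", "MDM4"), ("EP300", "CREB1"),
]

def build_graph(pairs):
    """Build an undirected adjacency list from interaction pairs."""
    graph = defaultdict(set)
    for a, b in pairs:
        graph[a].add(b)
        graph[b].add(a)
    return graph

def hub_proteins(graph, min_degree=2):
    """Return proteins with at least min_degree interaction partners,
    sorted from most to least connected."""
    hubs = [(p, len(nbrs)) for p, nbrs in graph.items()
            if len(nbrs) >= min_degree]
    return sorted(hubs, key=lambda h: (-h[1], h[0]))

graph = build_graph(interactions)
print(hub_proteins(graph))  # TP53 is the most connected node here
```

At real scale, with tens of thousands of proteins and millions of candidate interactions, this is where specialized algorithms and serious computing power come in.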
Another part of the research is understanding the structure of proteins in order to devise cancer treatments. This requires growing protein crystals, which in turn requires determining the optimal conditions for producing quality crystals. But the approach creates a "massive information technology problem," said Jurisica, with the combinations of proteins and conditions resulting in more than 90 million images to analyze and interpret.
"So our task is to have algorithms that look through all those images and classify them to find the results of the experiment. That's where we need this massive computing power to be able to handle this complexity," said Jurisica.
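A toy version of that classification step might look like the following Python sketch, in which each crystallization image is reduced to a single summary feature and thresholded. The images, feature and threshold here are invented for illustration and are not the team's actual algorithm:

```python
def contrast(image):
    """Mean absolute difference between horizontally adjacent pixels,
    a crude proxy for the sharp edges a crystal produces."""
    diffs = [abs(row[i] - row[i + 1])
             for row in image for i in range(len(row) - 1)]
    return sum(diffs) / len(diffs)

def classify(image, threshold=30):
    """Label an image 'crystal' if its contrast exceeds the threshold,
    'clear' otherwise (threshold chosen arbitrarily for this sketch)."""
    return "crystal" if contrast(image) > threshold else "clear"

# Two tiny 3x3 grayscale "images": one with sharp edges, one flat.
sharp = [[0, 255, 0], [255, 0, 255], [0, 255, 0]]
flat = [[100, 101, 100], [100, 100, 101], [101, 100, 100]]

print(classify(sharp), classify(flat))  # crystal clear
```

Running even a cheap per-image computation like this over 90 million images is what turns the experiment into the massive computing problem Jurisica describes.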
The system is an IBM System Cluster 1350 supercomputer that incorporates the DCS9550 Disk Storage System, along with Deep Computing Visualization to create the high-resolution images required for the research analysis. The Linux cluster comprises 1,344 processor cores delivering 12.5 teraflops (trillion calculations per second) and 150TB of storage.
The deployment was made possible by grants from the Canada Foundation for Innovation and the Ontario Ministry of Research and Innovation. IBM provided in-kind donation for the hardware, software and services.
Chris Pratt, strategic initiatives executive with the US-based company, sees the long-standing relationship with the research team as a collaborative one, in which IBM's role goes beyond merely providing the IT infrastructure. "It's the conceptual design and scoping of the problem to provisioning and supplying of equipment, and making sure it works and does what it's supposed to do," said Pratt.
"This is not grandma's exchange server," he said, "This is a very complex type of problem that requires specific skills."
And, as of last November, the project was running its 90 million analyses on the World Community Grid's network of 250,000 PCs. "Even with that power, we will finish in 2014," said Jurisica, adding that the data garnered from the World Community Grid would itself require massive computing power to analyze.
Computations that took months on the old infrastructure now take days. "So it's really an order of magnitude change how quickly we can run these analyses," said Jurisica.
IBM's interest in research projects reflects its belief that innovation in one domain is applicable to others. "Across research, we think of it as being very specifically focused, but very often discoveries in one area of research lead to leaps forward in other areas of research that were not related," said Pratt.