Harvard University and Yale University's football teams meet in Cambridge, Mass., this Saturday, Nov. 17, 2012, for the 129th time in their history. Blood will be spilled, grass will be torn up, alums will reconnect and memories will be made. It's safe to say that not many in attendance will be thinking about the network and computing research efforts underway at these Ivy League schools, so we're here to give a quick rundown of some recent efforts as a sort of pre-game warmup.
*Really, really moving to e-voting
A slew of Harvard researchers are exploring the potential for online elections, covering everything from U.S. presidential elections to community votes on questions like building a school vs. a pool. Among them is Lirong Xia, a postdoctoral researcher at the Center for Research on Computation and Society at the Harvard School of Engineering and Applied Sciences. Computational scientists like Xia are asking whether society would get a truer sense of the population's wishes if ballots were cast via the Internet and if questions were spread out over multiple days. Some questions may be too difficult to vote on at the same time, such as those where the outcome of one question might affect how people vote on the others. Without online voting, "You can't say, 'Today you'll come in and vote on the first issue, and then we'll announce the result, and tomorrow you'll come back again and vote on the second issue.' That's too costly," he says. Online voting might also make it easier to tabulate a voting system that lets citizens rank candidates, rather than choosing one from a field of two or more.
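Tabulating a ranked-candidate election is the kind of job computers handle easily. As a hypothetical illustration (using instant-runoff rules, not any specific scheme from Xia's research), a minimal sketch in Python:

```python
from collections import Counter

def instant_runoff(ballots):
    """Tally ranked ballots: repeatedly eliminate the candidate with the
    fewest first-choice votes until one candidate holds a majority."""
    candidates = {c for ballot in ballots for c in ballot}
    while True:
        # Count each ballot toward its highest-ranked surviving candidate.
        tally = Counter(
            next(c for c in ballot if c in candidates)
            for ballot in ballots
            if any(c in candidates for c in ballot)
        )
        total = sum(tally.values())
        leader, votes = tally.most_common(1)[0]
        if votes * 2 > total or len(candidates) == 1:
            return leader
        candidates.remove(min(tally, key=tally.get))

ballots = [
    ["A", "B", "C"], ["A", "C", "B"],
    ["B", "C", "A"], ["B", "A", "C"],
    ["C", "B", "A"],
]
print(instant_runoff(ballots))  # B wins after C is eliminated
```

With paper ballots, running this kind of multi-round elimination means scanning every ballot again each round; with electronic ballots the re-tally is instantaneous, which is part of the appeal Xia describes.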
But of course major concerns about security and privacy have so far prevented widespread online voting from getting off the ground. Xia's research includes work in computational complexity aimed at discouraging fraudulent voting behavior by making manipulation, if not impossible, then at least computationally difficult; the initial focus is on traditional, non-online voting.
(See also: "If the Internet's magic, why can't we vote on it?" by Harvard's own Scott Bradner)
*Spreading the research wealth
Harvard is receiving a 4-year, roughly $5 million grant from the National Science Foundation's Secure and Trustworthy Cyberspace program to study and improve upon the privacy of research data online.
"The Internet and, in particular, social networking sites, provide an amazingly powerful platform for researchers to gather, mine, and share data on human behavior and interactions," says Salil Vadhan, a professor of computer science and applied mathematics at the Harvard School of Engineering and Applied Sciences. "Even with the best intentions and safeguards in place, however, the risk of personal information leaking out remains high."
The "Privacy Tools for Sharing Research Data" project will develop methods, tools and policies to take advantage of the data available but address the risks of dealing with it. They'll be looking at how researchers can share data they have access to locally or regionally, but perhaps not nationally or globally, and also at how to include access to data from commercial outfits, like Netflix or Facebook, without impinging on customer privacy.
The new tools will be tested and deployed at the IQSS Dataverse Network, a huge open-source digital repository of social science datasets.
*Faster Wi-Fi and LTE
Researchers from Harvard have joined forces with those from MIT, Caltech and European schools to come up with a way to boost wireless network performance by as much as 10 times without resorting to more base stations, power or spectrum.
Their coded TCP innovation virtually eliminates the packet loss that makes wireless networks so much slower than wired ones. Coded TCP is designed to reduce latency and avoid the resending of dropped packets. According to an ExtremeTech report on the research, blocks of packets are replaced with algebraic equations describing them, enabling the receiving system to recover the transmitted data by solving the equations rather than fishing around for dropped packets.
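The flavor of the idea can be shown with a toy version of random linear network coding over GF(2), the general family of techniques the work draws on. This is a simplified sketch for illustration, not the researchers' actual coded-TCP implementation: the sender transmits random XOR combinations of its packets, and the receiver recovers the originals by solving the resulting linear system, so no single lost packet has to be retransmitted.

```python
import random

def encode(packets, n_coded):
    """Transmit random XOR combinations of the k original packets
    (coefficients over GF(2)) instead of the packets themselves."""
    k = len(packets)
    coded = []
    for _ in range(n_coded):
        coeffs = [random.randint(0, 1) for _ in range(k)]
        payload = 0
        for c, p in zip(coeffs, packets):
            if c:
                payload ^= p
        coded.append((coeffs, payload))
    return coded

def decode(received, k):
    """Recover the k originals by Gaussian elimination over GF(2).
    Returns None until k linearly independent combinations arrive."""
    rows = [(coeffs[:], payload) for coeffs, payload in received]
    for col in range(k):
        pivot = next((r for r in range(col, len(rows)) if rows[r][0][col]), None)
        if pivot is None:
            return None  # rank deficient: need more coded packets
        rows[col], rows[pivot] = rows[pivot], rows[col]
        for r in range(len(rows)):
            if r != col and rows[r][0][col]:
                rows[r] = ([a ^ b for a, b in zip(rows[r][0], rows[col][0])],
                           rows[r][1] ^ rows[col][1])
    return [rows[i][1] for i in range(k)]

packets = [0x48, 0x69, 0x21]          # three original data bytes
recovered = None
while recovered is None:              # retry if the combos weren't independent
    coded = encode(packets, 6)        # send 6 combinations for 3 packets
    random.shuffle(coded)             # packets reordered/lost in transit
    recovered = decode(coded[:4], 3)  # any 4 survivors usually suffice
print(recovered == packets)           # → True
```

The point of the demo: the receiver never asks "where is packet 2?" It just collects enough equations, which is why loss on a flaky wireless link stops stalling the connection.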
*Less messy quantum computing
Yale physicists are working to make quantum computers that could process information exponentially faster than today's machines a reality.
Their latest work, described in a paper published online in Nature, "Realization of three-qubit quantum error correction with superconducting circuits," addresses a key shortcoming of early quantum computers: their susceptibility to errors. Their breakthrough is the demonstration of quantum error correction in a solid-state system.
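The simplest case underlying such schemes is the three-qubit bit-flip code: the logical state is spread across three physical qubits, and parity checks reveal which qubit flipped without ever reading out, and thereby destroying, the encoded information. A toy state-vector simulation in Python (a pedagogical sketch of that one error type, not the Yale group's superconducting-circuit experiment):

```python
import random

def apply_x(state, qubit):
    """Pauli-X (bit flip) on one qubit of an 8-amplitude, 3-qubit state."""
    new = state[:]
    for i in range(8):
        new[i ^ (1 << qubit)] = state[i]
    return new

def syndrome(state):
    """Parity checks Z0Z1 and Z1Z2. For a codeword plus at most one bit
    flip, every nonzero basis state gives the same parities, so the checks
    identify the error without collapsing the logical superposition."""
    support = next(i for i, amp in enumerate(state) if amp)
    bit = lambda q: (support >> q) & 1
    return bit(0) ^ bit(1), bit(1) ^ bit(2)

# Encode the logical qubit a|0> + b|1> as a|000> + b|111>.
a, b = 0.6, 0.8
state = [0.0] * 8
state[0b000], state[0b111] = a, b

state = apply_x(state, random.randrange(3))  # noise flips one random qubit

# Map the two parity bits to the qubit that must be flipped back.
correction = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(state))
if correction is not None:
    state = apply_x(state, correction)

print(state[0b000], state[0b111])  # → 0.6 0.8: logical state restored
```

Real quantum errors also include phase flips, which is why full codes like the one demonstrated at Yale need more machinery than this bit-flip example, but the principle of diagnosing errors from parity measurements alone is the same.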
"Without error correction, you couldn't make a quantum computer that had an exponential speed-up," said Matthew Reed, a fifth-year Ph.D. student in physics at Yale who is the paper's first author. "Small errors would otherwise inexorably build up and cause the computation to fail."
*Making research data fly
The National Science Foundation awarded Yale a roughly $500,000 grant to craft a 100Gbps network for shuttling gobs of scientific data around the school and beyond, via a link to Internet2. The new network will run 10 times faster than the current network, which is shared with the rest of the university. Importantly, part of the project is a DMZ to prevent unauthorized parties from dipping into the data.
According to the NSF, the broader impact of the award is: "In essence, this project is creating a virtual facility for scientific collaboration and data sharing that will allow new research communities and collaborations to form within its walls. By leveraging the Science Network and Science DMZ, Yale will be able to pioneer new modes of research collaboration; host and serve up 'big data' scientific data stores; and work with institutions in its region, across the nation, and around the world to broaden the opportunities for scientific research, and support excellent education and training for students at all levels." (More from The Yale Daily News.)
*MacArthur Fellow into more than theory
Yale's Daniel Spielman, a professor of computer science, mathematics and applied mathematics, was named one of this year's 23 MacArthur Fellows - in other words, he's the recipient of a "genius grant." He gets $500,000 to work with over five years.
According to the MacArthur Foundation: "Spielman is a theoretical computer scientist studying abstract questions that nonetheless affect the essential aspects of daily life in modern society: how we communicate and how we measure, predict, and regulate our environment and our behavior."
Spielman's early research included coding theory, which is at the mathematical heart of ensuring reliable electronic communications, including for HD TV transmissions (Related: Network coding - networking's next revolution?).
Spielman's work in computer science has included explaining optimization algorithms in such a way that improvements could be made in everything from transportation scheduling to operating system design.
Bob Brown tracks network research in his Alpha Doggs blog and Facebook page, as well as on Twitter and Google+.