Beeps, blips and IT: Making sense of sensor data

As sensors proliferate in every industry, companies wrestle to turn the fire hose of real-time data into usable business intelligence.

It's no exaggeration to say the '00s have been the decade when the electronic sensor left the factory floor and went, well, everywhere.

Manufacturers and retailers use RFID tags to track inventory. Food and pharmaceutical companies use temperature and humidity sensors during transportation and in warehouses. Government and civil engineering firms use wireless sensors to monitor public infrastructure like bridges and dams for structural integrity. And electric utilities and corporate consumers are using sensor technology to help them manage, distribute and use power more cost-effectively.

In the face of such a rapid-fire proliferation of beeps and buzzes, technology executives are wondering how they can tie sensor data into the IT network, says Jeff Platon, vice president of marketing for emerging vertical markets at Cisco Systems.

Previously, most sensors were built into machines and transmitted information to closed systems using proprietary communications protocols. But over the past five years, new technologies have given rise to a new generation of sensors that are mobile and networkable, enabling their use in a much wider variety of applications.

As sensors spread, vendors and corporate IT managers alike are exploring ways to integrate information from such devices into their overall IT networks, Platon says. The goal? To use sensors not just for tracking, counting or otherwise monitoring things, but to combine and analyze sensor data with other business indicators to identify long-term trends and gain competitive insight.

A single corporation may have a plant manager overseeing automation and control sensors, a facilities manager monitoring building-automation sensors, and a distribution manager with supply-chain sensors, Platon says. The natural question becomes, why not merge all this information on the central network and see it all together in a business context?

Answer: because it's too complicated -- for right now, at least. There are both technical and cultural barriers to integration, says Platon.

First, the sensor industry has spawned a rat's nest of protocols. In the manufacturing arena alone, there are 250 different proprietary protocols. Even wireless sensor networks that use IP often have to rely on nonstandard variants because of the sensors' low power budgets and limited processing capability, Platon notes.

Cisco and sensor network vendor Arch Rock co-chair an Internet Engineering Task Force working group to develop a standard for IP-based routing techniques over these wireless sensor networks, but the process could take years, Platon says. Perhaps even more significant, there is a culture gap in most corporations between the plant or supply-chain manager who typically controls sensor networks and the CIO who runs enterprise IT, says Platon.

Operations and facilities staff focus on watching sensor networks for alerts and alarms and responding quickly to solve immediate, typically physical problems -- such as a malfunctioning refrigeration unit, explains Chet Namboodri, Cisco's global director of manufacturing industry solutions, who calls such decision-makers the "concrete" side of operations.

IT -- the "carpet" side of the company -- is more oriented toward processing and analyzing information in a standardized way over time. "The cultural differences are the biggest challenge in terms of converging the networks and the capabilities of the concrete side and the carpeted side" of a company, says Namboodri.

And yet a few companies with particularly pressing needs are starting to investigate the possibility of joining concrete with carpet. The two projects profiled below -- both driven by the rising costs and short supplies of energy -- represent two different approaches to linking sensors to the overall IT network, but the long-term goal of these projects is the same: to integrate that data into the process of managing the business in order to increase efficiency and cut costs.


Utility taps the power of smart metering

Constrained energy capacity, an aging infrastructure and security concerns are driving electric utilities to build a smarter grid -- one with processing power and communications technologies that will enable the utilities to monitor, manage and distribute energy more efficiently.

A key element of the smart grid is the smart meter, which collects detailed information on energy use at individual buildings and has two-way communications with the utility. With smart meters, utilities can monitor how much power a particular house or office is using and, under terms to which customers agree, can throttle power down to certain buildings or even certain systems within buildings at particular times in order to better manage electricity during peak use periods.

Allegheny Power, the distribution unit of Allegheny Energy that delivers electrical service to approximately 1.5 million customers in Maryland, Pennsylvania, Virginia and West Virginia, is launching a smart grid pilot project in May in which controls for a six-building office park in Morgantown, West Virginia, will be integrated directly into the utility's infrastructure.

In addition to the smart meters, Allegheny is installing new sensors on power lines and integrating old sensors into one network, all of which will feed into Allegheny's overall enterprise network, explains Harley Mayfield, planning engineer at Allegheny Energy.

Unlike traditional meters, which merely tally kilowatt-hours, these smart meters will track kilowatt-hours over time, telling the utility when a given building or neighborhood is using a peak amount of electricity, or even which systems in a building are drawing the most power at a particular time.
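The idea of tracking kilowatt-hours over time rather than as a single running total can be sketched in a few lines. The interval length and data layout below are assumptions for illustration, not Allegheny's actual meter format:

```python
from datetime import datetime

# Hypothetical 15-minute interval readings from one smart meter:
# (interval start time, kilowatt-hours consumed in that interval).
readings = [
    (datetime(2008, 5, 1, 13, 0), 1.2),
    (datetime(2008, 5, 1, 13, 15), 1.9),
    (datetime(2008, 5, 1, 13, 30), 3.4),
    (datetime(2008, 5, 1, 13, 45), 2.1),
]

def peak_interval(readings):
    """Return the interval with the highest energy use --
    the kind of peak a traditional cumulative meter can't reveal."""
    return max(readings, key=lambda r: r[1])

start, kwh = peak_interval(readings)
print(start, kwh)  # the 13:30 interval, at 3.4 kWh
```

A traditional meter would report only the 8.6 kWh total for the hour; the interval data is what lets the utility see exactly when the peak occurred.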

Combining that information with data on power-line loads and the demands on various other parts of the network accomplishes two goals, Mayfield notes. First, it helps the company manage power distribution in the short-term -- by ratcheting down power to certain nonessential areas at peak times to avoid a brownout, for example. Second, it allows Allegheny to build a database to help it make more intelligent decisions about its distribution system in the long term, he says.

But the utility currently has a mishmash of sensors in the field -- including old analogue sensors that predate the digital age and rudimentary digital sensors that are now years old -- all from different manufacturers and using different protocols. "One of the problems that has plagued us," says Mayfield, "is how to integrate these different sensors to get this information into the system" -- that is, into Allegheny's distribution network.

To solve the problem, Allegheny is using sensor hardware and middleware from Augusta Systems, a maker of enterprise sensor networks. The products work like universal translators for various sensor standards and protocols -- they can convert proprietary protocols to IP, which in turn enables that information to be fed into an IP-based enterprise network, explains Patrick Esposito, Augusta president and chief operating officer. "For many enterprises, that is a huge problem -- wrestling with all that data and getting the non-IP data onto the IP network," he notes.

Allegheny will use the Augusta Systems products as regional processors in the distribution network around the office complex. These processors will receive data via Wi-Fi from the smart meters and various sensor devices, translate that data into IP, cull the relevant information and forward it to the mainframe, says Mayfield.
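The translate-and-forward step those regional processors perform can be sketched as follows. The binary frame layout and JSON fields here are invented for illustration; a real gateway like Augusta's handles hundreds of proprietary formats:

```python
import json
import struct

# Hypothetical proprietary sensor frame: 6 bytes, big-endian,
# a 16-bit sensor ID followed by a 32-bit raw reading.
def translate_frame(frame: bytes) -> str:
    """Convert one proprietary binary frame into an IP-friendly
    JSON message an enterprise application can consume."""
    sensor_id, raw = struct.unpack(">HI", frame)
    return json.dumps({"sensor": sensor_id, "reading": raw})

# Simulate a frame arriving over Wi-Fi from a field sensor.
msg = translate_frame(struct.pack(">HI", 42, 1750))
print(msg)  # {"sensor": 42, "reading": 1750}
```

Once the data is in a standard IP-transportable form like this, the gateway can filter out the irrelevant readings and forward only what the back-end systems need.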

Long term, the system could help Allegheny decide when or whether to increase its distribution capacity, says Mayfield. Today, the utility has only isolated pieces of information -- the condition of a given power line or the fact that a given neighborhood uses a certain amount of power over a six-month period, for example.

The new system will provide more specific information and combine it with other data that gives Allegheny a more accurate and complete picture. "We'll be able to identify the peak load on the substation and whether we need to increase capacity," says Mayfield. "That's data that we don't have right now."


Tracking data center energy costs

Ever since the data center became the poster child for IT energy waste, companies have been looking for ways to make it more efficient. Nobody knows that better than Alan McNab, president of consulting firm McNab and Associates, who has been working with a large financial services firm in New York on a project to identify and monitor cost drivers in its data centers.

Operating expenditures over the lifetime of data centers can be two times the amount spent to build them, and yet most companies have very little concrete information on what's driving those operating costs, says McNab. "There's a real need to get more cost transparency around operational expenditures," he says.

If a company can understand what's driving operational costs, then it can make more intelligent decisions about how, when and where to deploy some of the new cost-saving technologies like virtualization, he notes. In addition, the company can use cost data to design more cost-efficient centers in the future.

As a first step toward identifying data center operating costs for the New York financial services firm, which he declines to identify, McNab is running a trial program using wireless sensor networks from Arch Rock to identify temperature and other metrics in specific areas of the data center. At the same time, McNab is working to tie in all sorts of other data that is relevant to operational costs, including the model and age of specific servers, when each was deployed, how it's being used, how much electricity it uses, and how many trouble tickets have been issued for that server.

Arch Rock has specifically designed its wireless sensor products for integration into IP networks via Web services. "The sensor is an IP device on the enterprise network," explains Arch Rock CEO Roland Acra. "IT managers can access the sensors over the Web and manage them just as they would any other type of IP device. The IT person can pick their programming language, their development environment, their operating system."

That means IT staff can program the devices to do what they want without having to learn a proprietary language or go through the operations or manufacturing specialist, he notes. "And they can make calls into the sensors through Web services directories just as they do with databases and other applications in the IT infrastructure," Acra says.
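Because the sensor behaves like any other IP device answering web-service calls, polling it looks like ordinary HTTP work rather than proprietary sensor programming. The hostname and JSON response fields below are assumptions for illustration, not Arch Rock's actual API:

```python
import json
from urllib.request import urlopen  # standard-library HTTP client

# Hypothetical IP-addressable temperature sensor that answers a
# plain HTTP GET with a small JSON body.
SENSOR_URL = "http://sensor-17.datacenter.example/reading"

def parse_reading(body: str) -> float:
    """Extract the temperature from a sensor's JSON response."""
    return json.loads(body)["temperature_c"]

# In production: parse_reading(urlopen(SENSOR_URL).read().decode())
# Here we parse a canned response to show the shape of the data.
print(parse_reading('{"temperature_c": 31.5}'))  # 31.5
```

The point Acra makes is exactly this: no proprietary toolchain is involved, so any IT developer can fold sensor readings into existing applications with the languages and libraries they already use.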

Ultimately, integration and analysis of all this information could enable data center operators to cut their costs substantially. If one area is running too hot, for example, a building-automation system could open or reconfigure vents to increase the air conditioning in that one area only -- rather than unnecessarily cooling the entire room. Or if the data center uses virtualization, certain applications could be moved from hot aisles of servers to cooler aisles. Right now, however, such sensor data is still a far cry from being integrated into the IT infrastructure.

"This stuff with the sensors, this is way out there," McNab stresses. "It's not happening today."

Even though companies have the ability to track power use down to the level of the individual power strip, few if any are collecting that data or integrating it into the enterprise yet, agrees Joshua Aaron, president of Business Technology Partners, a New York-based IT infrastructure consultancy. Some clients are beginning to ask about it, however.

"A lot of companies are collecting a lot of data, but not a lot of companies are sophisticated to the point of using it for real-time data mining or iterative planning. It's not usually integrated in real-time applications where people are looking at it in relation to other things every day," he says.

But Arch Rock's Acra says companies could do cost accounting down to the level of specific servers that are performing specific applications. "Through sensors, you can get electricity consumption data from a cubicle, a rack of servers or even each individual server," he says.

"But you don't want to take that data in isolation. You'd want to tie electricity function to particular business functions," Acra continues. "In a financial firm, for example, was this energy used by traders who were doing a lot of activity? Or was it the quantitative guys who were doing technical modeling on stocks? Or were the investment bankers banging away at M&A?"
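Acra's cost-accounting idea -- tying electricity use back to business functions -- amounts to a simple roll-up once per-server readings exist. The server names, readings and function mapping below are hypothetical:

```python
# Hypothetical per-server energy readings (kWh over some period)
# and a mapping from each server to the business function it serves.
server_kwh = {"srv-01": 12.0, "srv-02": 8.5, "srv-03": 20.0}
server_owner = {"srv-01": "trading", "srv-02": "trading",
                "srv-03": "quant-modeling"}

def energy_by_function(kwh, owner):
    """Roll up electricity consumption per business function,
    so energy cost can be charged back like any other expense."""
    totals = {}
    for server, used in kwh.items():
        fn = owner[server]
        totals[fn] = totals.get(fn, 0.0) + used
    return totals

print(energy_by_function(server_kwh, server_owner))
# {'trading': 20.5, 'quant-modeling': 20.0}
```

With a roll-up like this, the firm could answer Acra's question directly: whether the traders, the quants or the investment bankers drove a given month's energy bill.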

If energy costs keep going up as fast as they have in the past year, such sensor applications could well be among the first to link up with enterprise IT. No one right now seems to know what sensors' killer app might be. What's clear is that the ubiquity of sensors is producing lots of data that companies would like to use in more ways.

"The data coming off of these sensor networks is becoming more important in overall business management, whether that's asset tracking, whether it's just-in-time manufacturing, or whether it's connecting electric demand to the supply side," says Cisco's Platon. "The last 20 years we've spent connecting some 4 billion computers. The next 20 years are going to be spent connecting some tens of billions of industrial objects."