The Standard Performance Evaluation Corporation (SPEC), well known for its computer system benchmarks, is planning to extend its testing methodology to measure cloud deployments as well.
The organization has formed a new working group to develop metrics for cloud computing. The group will be a subgroup of the Open Systems Group (OSG), and will include participants from companies such as AMD, Dell, IBM, Intel, Oracle, Red Hat and VMware. Long-time SPEC benchmark developer Yun Chao will help the effort, as will members from the SPEC Research Cloud group.
With cloud computing "you don't just have a single-sized computer most of the time. You are allowed to vary to your needs," said Rema Hariharan, chair of OSGCloud. As a result, new metrics are needed to quantify operational concepts that mainly apply in cloud computing.
As with other SPEC benchmarks, vendors of cloud software and services could use the benchmarks to measure and publicize their own offerings, the group hopes.
While existing SPEC benchmarks measure many aspects of operations that take place in the cloud, such as server throughput, response time and power consumption, the new benchmarks would target attributes unique to the cloud, Hariharan said. They could include factors such as elasticity, throughput, response time and variability.
Elasticity, for instance, could be defined as how quickly a service can be changed to meet customer needs. It could consist of specific metrics such as how long it takes to spin up a particular workload or the time it takes to add more resources such as CPUs or memory to an existing workload. Elasticity could also incorporate a test for agility, which would measure how closely a vendor can match the users' needs in terms of required resources for the job.
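The two measurements described above, spin-up time and agility, could be sketched in code. This is only an illustration of the idea, not anything OSGCloud has specified: the provisioning function here is a hypothetical stand-in, and a real benchmark would call an actual cloud API.

```python
import time

def measure_spin_up(provision, workload):
    """Time how long a provisioning call takes to complete (spin-up latency)."""
    start = time.monotonic()
    provision(workload)
    return time.monotonic() - start

def agility(requested, granted):
    """Fraction of requested resources the provider actually delivered.

    1.0 means the provider matched the request exactly; lower values
    mean the workload got fewer resources than it asked for.
    """
    return min(granted, requested) / requested

# Hypothetical stand-in provisioner: sleeps briefly to simulate setup delay.
def fake_provision(workload):
    time.sleep(0.01)

elapsed = measure_spin_up(fake_provision, "web-tier")
match = agility(requested=8, granted=6)  # e.g. asked for 8 vCPUs, got 6
```

In this sketch, `elapsed` would feed an elasticity metric and `match` an agility score; an actual benchmark would also have to define how such numbers are aggregated across workloads and repeated runs.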
The benchmarks will have to be designed so they can be applied to a wide range of services, Hariharan said. The group foresees the cloud metrics spanning two different types of cloud services. One type, which they call black box, will not offer any details about the underlying hardware and software, while the other, nicknamed white box, will provide specific details about the hardware and software used in the service.
At first, OSGCloud will focus on developing benchmarks for measuring IaaS (infrastructure as a service), services now provided by companies such as Amazon and Microsoft. Later, it may tackle benchmarks for PaaS (platform as a service) and SaaS (software as a service) as well, Hariharan said.
The group hopes to publish the first benchmark definitions in about a year. It has already posted a 50-page report outlining the issues and goals of setting a cloud benchmark.
A nonprofit organization, SPEC develops and maintains standardized benchmarks for evaluating enterprise computing systems.