Computerworld gets an exclusive behind-the-scenes look inside Internode's Adelaide data centre with network guru Mark Newton.
The centre was officially opened in April 2007 by South Australian Premier Mike Rann.
Battery room for the UPS systems.
The UPSes run on mains electricity, but they are always charging the batteries.
Internode is 16 years old; when Mark Newton started 10 years ago, the company had 1000 customers.
Internode network engineer Mark Newton designed and ran the project to build the data centre.
Newton says that in building a data centre, between 20 and 25 per cent of the costs come from backup equipment.
Newton says: “Keeping your hot air and cold air separate increases the efficiency of your air conditioning, which in turn reduces the amount of energy that they need to consume and lowers your operational costs - and you can get away with having less air conditioning, which means lower costs to deploy the facility as well.”
This cage separates Internode’s infrastructure from its customers' infrastructure.
Forty technical staff have access to the data centre.
The gas suppression system is the first line of defence: if a fire breaks out, the room fills with fire-suppressant gas.
The humidity is maintained at 50 per cent, plus or minus 10, and the temperature at 22 degrees Celsius, plus or minus one.
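Those tolerances amount to a simple in-band check on each sensor reading; a minimal sketch of that logic (the thresholds are the figures quoted above, while the function name and sample readings are illustrative, not from Internode's monitoring system):

```python
def within_tolerance(temp_c, humidity_pct,
                     temp_target=22.0, temp_tol=1.0,
                     hum_target=50.0, hum_tol=10.0):
    """Return True if a reading sits inside the stated
    environmental envelope: 22 C +/- 1, 50% RH +/- 10."""
    return (abs(temp_c - temp_target) <= temp_tol and
            abs(humidity_pct - hum_target) <= hum_tol)

# Example readings (hypothetical)
print(within_tolerance(22.4, 55.0))  # inside both bands -> True
print(within_tolerance(23.5, 55.0))  # temperature out of band -> False
```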
All the air conditioners in the data centre are cooled by this equipment: chilled water runs through the pipes and is pumped under the floor to the air conditioners, which cool the room.
Internode's second Adelaide data centre is housed in a facility that was once used by Westpac for cheque processing.
The back-up generators start within 10 seconds of a power interruption. Internode runs the generators on full load for six hours every six months, and monthly for maintenance.
There are two power systems in the data centre, and this is the UPS room for the 'B' power leg. If something catastrophic happened in this room, says Newton, power would still be provided to the data centre by the 'A' power leg.
The blue cables have 288 fibres in them.
If you’re an Internode ADSL customer in South Australia or Northern Territory, there’s a 50/50 chance you’ll be on one of these routers at the moment.
The facility serves two purposes: meeting Internode's own needs and those of its clients.
Internode's operations are continually expanding and the company is approaching 400 staff. "As other tenants in the building move out, Internode are taking over their lease and expanding the office," Newton said.
Internode's co-location customer offerings: every rack is dual-powered.
The pump sucks air through the tubes into a detection apparatus. Acting like a hyper-sensitive smoke detector, the white tube samples the air for smoke. Once a fire is detected, it triggers the release of the gas stored in the steel pipes.
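The sampling loop described here is essentially a threshold trigger: air is continuously drawn in, measured, and the gas release fires once smoke crosses a set level. A minimal sketch of that decision (the smoke-obscuration threshold and names are hypothetical, not the settings of the detector pictured):

```python
def check_sample(smoke_obscuration_pct, trigger_pct=0.1):
    """One sampling cycle of an aspirating smoke detector:
    air drawn through the tube is measured, and the gas
    release is triggered when smoke obscuration crosses
    the (hypothetical) threshold."""
    if smoke_obscuration_pct >= trigger_pct:
        return "release gas"
    return "normal"

print(check_sample(0.0))    # clean air -> normal
print(check_sample(0.25))   # smoke detected -> release gas
```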
The air conditioners in the data centre pull air in at the top, cool it on its way through and then blow the cold air out the bottom.
The blue cable is approximately two kilometres long, stretching out to Internode's other data centre on the other side of Adelaide’s CBD.
The purpose of the UPSes in this facility is to keep everything running for the time it takes the generators to start – which is less than 30 seconds.
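The sizing logic here is simple arithmetic: the UPS batteries only need to carry the load across the generator start-up window, with some margin. A rough sketch using the figures quoted in these captions (the helper and its safety factor are illustrative, not Internode's actual sizing method):

```python
def ups_covers_generator_start(ups_runtime_s, generator_start_s,
                               safety_factor=2.0):
    """True if UPS battery runtime covers the generator
    start-up window with a (hypothetical) safety factor."""
    return ups_runtime_s >= generator_start_s * safety_factor

# Generators start within ~10 s; the UPS carries the load for < 30 s.
print(ups_covers_generator_start(30, 10))  # 30 s >= 2 x 10 s -> True
```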
This is where the national network comes into Internode's facility – 10 gig to Melbourne and Sydney and 1.2 gig to Perth.
The air conditioners discharge under the floor, and cool air flows up through the grills. Every second aisle is a “return air” aisle.
The fibre system provides greater connectivity.
This data centre boosted Internode's broadband capacity by 400 per cent.
Internode customers can either take the space in the form of a cage or they can lease space on a rack by rack basis.
Internode runs a double conversion UPS system.
The dry-pipe sprinkler system. Under normal conditions, the pipe is filled with compressed air. If it gets hot enough for a sprinkler head to burst, the compressed air is released; once the pressure drops, a pressure detector starts the water flowing. Water flows as a back-up.
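The dry-pipe sequence described above is effectively a pressure-driven interlock: charged pipe holds water back, a burst head vents the air, and the falling pressure opens the water valve. A minimal sketch of that logic (the pressure thresholds and state names are hypothetical):

```python
def dry_pipe_state(air_pressure_kpa, charged_kpa=275.0, trip_kpa=140.0):
    """Model of a dry-pipe sprinkler: the pipe holds compressed
    air; a burst head vents it, and the pressure detector admits
    water once pressure falls below the trip point.
    Thresholds are illustrative, not from the system pictured."""
    if air_pressure_kpa >= charged_kpa:
        return "standby"        # pipe fully charged, water held back
    elif air_pressure_kpa > trip_kpa:
        return "venting"        # head burst, air escaping
    else:
        return "water flowing"  # detector tripped, valve open

print(dry_pipe_state(275.0))   # standby
print(dry_pipe_state(200.0))   # venting
print(dry_pipe_state(100.0))   # water flowing
```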
Operating dual high-end data centres means that Internode duplicates its core routing, switching and server assets.
The $3 million data centre incorporates a $500,000 fibre-cross-connected multi-terabyte Sun Storage Area Network (SAN) system.
The data centre has an on-site power generation capacity of one megawatt.
This carrier-neutral facility provides carrier services to customers from Silk, Optus, Uecomm and Telstra.
A diesel-powered generator can operate for as long as six days with on-site fuel.
The data centre has an FM200 fire suppression system that uses gas to extinguish a fire without disrupting computer equipment.
"Our data centres end up being nexuses where we can patch from one cable system to another. So if we need to link a building that’s on one of our cables, to a building that’s on another cable in another part of the city – we pick a couple of parts in town where the two cables can come together and patch one to the other," Newton said of Internode's fibre system.
The data centre area is 400 square metres.
Internode's second data centre was established due to growth in customer interest in co-locating servers attached to the company's international network.
The cables that feed the two UPS rooms don’t cross over, and neither do the cables that run from the UPS rooms into the data centre.