As businesses race ahead to deliver on the promise of AI, IoT and other data-intensive applications, a recent survey revealed that many Australian businesses are still battling age-old issues when it comes to storing the vast amounts of data these applications require. Those surveyed indicated that scalability (51%), management costs (42%) and acquisition costs (40%) were the top storage issues their organisations face today, followed closely by data duplication (39%).
The survey, conducted by Lenovo, gathered responses from 134 IT professionals working in companies ranging in size from a single employee to more than 2,500. It examined the challenges created by growing volumes of data, as well as how and where that data is being stored.
While scalability topped the list of current issues, organisations foresee cost and data protection becoming bigger issues as the volume and velocity of data grow. Nearly half (48%) were concerned about the need to spend money just to keep up with demand, while 47% anticipated challenges with protecting and retaining all of their data – despite having limited insight into what that data is and what it is worth to the business.
More surprising is that only 43% anticipated challenges in using stored data as a source of value. This may reflect respondents’ confidence in their data management strategies and in the technologies available to unlock data insights. At the same time, it points to a tension between cost and innovation that needs to be addressed.
IDC research predicts there will be 44 trillion gigabytes of data in existence by 2020. Processing this information calls for deep learning and inference capabilities, built on a foundation of high-performance computing (HPC) infrastructure, that can generate new, actionable insights and underpin key business and scientific advances. Innovation, however, will be stifled if organisations cannot overcome the traditional barriers of scalability, affordability and speed.
When asked about their existing storage solutions, respondents indicated they are storing their data on-premise (41%), in the public cloud (11%), with a service provider (6%) or a combination of all three (42%).
Looking to the future, 62% were considering cloud technologies to tackle data growth. Trailing behind were SAN and Fibre Channel solutions (42%), software-defined storage (29%) and object storage (19%).
Edge computing to solve some of businesses’ current woes
As data grows and workloads become more intense, there is a growing debate about edge computing versus the cloud. The idea that edge could replace cloud is premature, but edge computing can help address a simple problem: as IoT devices multiply and AI is embedded into more applications, the volume of data to be analysed will grow exponentially. At the same time, processing must be lightning fast. Waiting for data to be transferred to the cloud, processed and then analysed will impair the business’s ability to act on the insights gained.
With this in mind, it’s no surprise that the top challenges businesses encounter with remote access to storage systems include system speeds that are not up to the task (49%) and process delays due to data volume (46%).
Edge computing provides a potential solution to these challenges by putting processing power into local computing devices so specific data can be analysed right then and there, instead of being sent to the cloud. However, as the research suggests, the ultimate solution to these challenges will need to deliver a solid return on investment while scaling to support the explosive growth in data anticipated for the fourth industrial revolution.
Mike Whatley is ANZ enterprise sales manager at Lenovo Data Centre Group.