The global pandemic has created new challenges, including a massive change in the way we work. TIME Magazine calls it the world’s largest work-from-home experiment, and for enterprises it has meant re-prioritising IT resources while planning for business recovery. In this commentary for Harbour IT’s RE:IMAGINE, I will outline what a modern data experience looks like for companies in a pandemic world, and how they can draw greater efficiency and performance from a work-from-home employee base.
To be sure, digitalisation was already taking place even before the pandemic. According to IDC, by the year 2022, half of Asia Pacific’s economy will be digitalised. With COVID-19, this prediction will likely be revised with governments and enterprises now accelerating their digitalisation efforts.
Central to this digitalisation migration is data. Organisations are realising the importance of a data-driven approach, which prioritises and transforms data into an agile, seamless resource that is delivered with velocity to produce trusted insights.
Behind the scenes of this digitalisation migration are three defining trends that have shaped the evolution of storage: simplicity, seamlessness and sustainability. Today, users must deal with increasingly large volumes of data, resulting in the need for an architecture that allows data to be stored and accessed easily over time.
As digital transformation matures, how will organisations manage their data effectively and efficiently to secure a competitive edge in the digital economy?
Several trends will emerge that will define the modern data experience. These include the swing to operating expenditure (OpEx) models; the emergence of fast, object storage; the rising adoption of containers in mainstream applications; and greater automation through modern analytics.
Demand for Storage-as-a-Service will increase
As-a-Service models may have existed since the beginning of public cloud, but we are seeing rising demand in response to the adoption of hybrid cloud.
For most storage consumers, hybrid cloud is both the reality and the future. According to a 451 Research whitepaper, more than 90% of businesses in APAC run multiple cloud environments with varying degrees of interoperability, and more than half are already on hybrid cloud. Hybrid cloud allows organisations to combine the best of both worlds: the enterprise capabilities and control of on-premises infrastructure, together with the simplicity and automation of the cloud.
Investments in subscription-based OpEx models will increase. Organisations will have to balance the operational and purchasing aspects to deliver a non-disruptive, evergreen experience that can scale as needed. This protects investments in the technology architecture for at least ten years, eliminating the need to replace the infrastructure every three to four years or undertake arduous data migrations.
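As a rough illustration of that trade-off, the sketch below compares a ten-year subscription against repeated hardware refreshes and migrations. Every figure here is invented purely for illustration; real pricing varies widely by vendor and deal.

```python
# Hypothetical figures, for illustration only -- not real pricing.
YEARS = 10

opex_per_year = 120_000      # assumed annual subscription fee
capex_per_refresh = 400_000  # assumed up-front array purchase
refresh_every = 4            # assumed refresh cycle, in years
migration_cost = 50_000      # assumed cost of each data migration

# Subscription: a flat annual fee, hardware kept evergreen by the vendor.
opex_total = opex_per_year * YEARS

# Purchase model: one array per refresh cycle, plus a migration
# between each pair of consecutive arrays.
refreshes = -(-YEARS // refresh_every)  # ceiling division: 3 purchases
capex_total = refreshes * capex_per_refresh + (refreshes - 1) * migration_cost

print(f"OpEx over {YEARS} years:  ${opex_total:,}")
print(f"CapEx over {YEARS} years: ${capex_total:,}")
```

The point is not the totals themselves but the shape of the comparison: the purchase model's cost arrives in lumps and includes migration overhead each cycle, which is what the evergreen subscription model avoids.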
Fast object storage will emerge
Object storage has become the standard for cloud-native applications, thanks to its ability to support highly parallel, distributed access to large data sets. As applications are developed or re-platformed for cloud-friendly architectures, object storage will become the go-to approach for decoupling applications, disaggregating them and their compute resources from a pool of shared storage.
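The properties that make this decoupling work can be sketched with a toy, in-memory object store: a flat key space, whole-object puts and gets, and no file-system hierarchy, so any number of stateless compute nodes can share one storage pool. The class and method names below are illustrative, not any vendor's API.

```python
class ObjectStore:
    """Toy in-memory object store: flat keys, whole-object reads/writes."""

    def __init__(self):
        self._objects = {}  # key -> (data bytes, metadata dict)

    def put(self, key, data, metadata=None):
        # Objects are immutable blobs; a put replaces the whole object.
        self._objects[key] = (bytes(data), dict(metadata or {}))

    def get(self, key):
        data, _meta = self._objects[key]
        return data

    def head(self, key):
        # Metadata-only lookup, analogous to an HTTP HEAD request.
        _data, meta = self._objects[key]
        return meta

    def list_keys(self, prefix=""):
        # There are no directories: "folders" are just key prefixes.
        return sorted(k for k in self._objects if k.startswith(prefix))


store = ObjectStore()
store.put("logs/2020/01/app.log", b"started", {"content-type": "text/plain"})
store.put("logs/2020/02/app.log", b"stopped")
print(store.list_keys("logs/2020/"))
```

Because all state lives behind this simple key-value interface, compute can be scaled, replaced, or containerised independently of the data, which is exactly the disaggregation described above.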
This will be accompanied by new solid-state technologies such as Storage Class Memory (SCM) and quad-level cell (QLC) flash coming online, stratifying the memory space. At the high end, combining SCM with high-speed protocols like NVMe-oF allows shared storage arrays to give servers direct-attached-storage-like performance for the most latency-sensitive applications.
At the same time, the impending introduction of QLC is bringing flash to tiers of storage that have largely stayed on magnetic disk to date. Recognising the emerging demand for QLC technology, Pure Storage launched FlashArray//C – one of the industry’s first QLC-ready all-flash storage systems – at its Accelerate conference last year. The cost reduction QLC brings lets applications take advantage of the benefits of flash beyond performance: simplicity, reliability, and reduced data centre power and space.
Smart provisioning of container storage to gain momentum
Containers were created to make deploying stateless applications as simple and cost-efficient as possible. In recent years, the soaring popularity of Kubernetes and VMware’s endorsement of containers have contributed to rising container usage in mainstream applications.
Containerised application environments are fluid and scale rapidly. Organisations with the ability to keep up will see Artificial Intelligence (AI) operations move from advisory roles to automated actions, allowing customers to take a hands-free approach to decision-making.
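The shift from advisory to automated can be sketched with a deliberately simple, rule-based stand-in for AI-driven operations: a function that inspects volume utilisation and produces resize actions. All names and thresholds below are hypothetical; in advisory mode the output would be shown to an operator, while in automated mode a control loop would apply it directly.

```python
def plan_expansions(volumes, threshold=0.8, growth_factor=1.5):
    """Return resize actions for volumes above the utilisation threshold.

    A rule-based stand-in for AI-driven storage operations: the same
    output serves as a recommendation (advisory) or as input to an
    automated control loop (hands-free).
    """
    actions = []
    for vol in volumes:
        utilisation = vol["used_gb"] / vol["capacity_gb"]
        if utilisation >= threshold:
            actions.append({
                "volume": vol["name"],
                "new_capacity_gb": int(vol["capacity_gb"] * growth_factor),
            })
    return actions


# Hypothetical fleet of container volumes.
fleet = [
    {"name": "pv-analytics", "capacity_gb": 100, "used_gb": 90},
    {"name": "pv-logs", "capacity_gb": 200, "used_gb": 40},
]
print(plan_expansions(fleet))
```

Real AI operations would replace the fixed threshold with learned forecasts of growth, but the architectural point is the same: once decisions are expressed as machine-applicable actions, the human can step out of the loop.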
Organisations that have successfully integrated a container storage platform into their application environments include Cogo Labs, a technology incubator. Cogo worked with Pure to build ‘blindingly fast storage’ that improved productivity for its data scientists and developers and reduced operating costs. It achieved this with Pure Storage FlashBlade, which acted as a data hub and accelerated the move to Kubernetes, a container orchestration platform, in support of new business opportunities.
The era of next-gen analytics is now
With more affordable infrastructure options – stronger central processing units, consumption-based infrastructure, lower-priced flash memory, and open-source and commercial stream-analytics platforms – modern analytics can now reach a larger scale on cloud-native analytics architectures.
By taking a critical look at the future of storage infrastructure and technologies, organisations can craft a forward-looking digital strategy that delivers a modern data experience – fit for a new, exciting decade in tech innovation.