
4 Requirements of Modern Backup and Archive

by Catherine Chiang – February 20, 2018

As enterprise datasets grow at unprecedented rates, with the majority of that data being unstructured, the requirements for modern backup and archive have expanded beyond the capabilities of legacy secondary storage systems.

A modern backup and archive solution must meet the following four requirements to handle massive unstructured data:

Policy-Based Data Management

Policy-based workflows for backup and archive streamline data management by allowing administrators to easily set policies for automatic backup and tiering to cloud.
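As an illustration, a policy like this can be sketched as a small declarative object that an automation engine evaluates. The class, field names, and thresholds below are hypothetical, not any specific vendor's API:

```python
from dataclasses import dataclass
from datetime import timedelta

# Hypothetical policy object: administrators declare intent once,
# and the system applies it automatically to matching shares.
@dataclass
class DataPolicy:
    name: str
    backup_interval: timedelta      # how often to back up the share
    tier_to_cloud_after: timedelta  # tier data to cloud past this age
    cloud_target: str               # e.g. an object-store bucket name

def should_tier_to_cloud(policy: DataPolicy, file_age: timedelta) -> bool:
    """Return True when a file is old enough to tier under this policy."""
    return file_age >= policy.tier_to_cloud_after

policy = DataPolicy(
    name="engineering-home-dirs",
    backup_interval=timedelta(hours=24),
    tier_to_cloud_after=timedelta(days=90),
    cloud_target="s3://example-cold-tier",
)
```

The point of the declarative form is that adding a new share means attaching an existing policy rather than scripting a new workflow.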

A related problem with massive datasets is simply not knowing what data exists, so a single consolidated tier with indexing and search is invaluable. Autodiscovery of shares and exports helps administrators find the data that needs to be protected.
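A minimal sketch of the indexing idea, assuming shares are already mounted as local paths (a real system would enumerate NFS or SMB exports over the wire):

```python
import os

def build_index(shares):
    """Walk discovered shares and build a filename -> paths index."""
    index = {}
    for share in shares:
        for root, _dirs, files in os.walk(share):
            for name in files:
                index.setdefault(name, []).append(os.path.join(root, name))
    return index

def search(index, filename):
    """Look up every known copy of a file by name."""
    return index.get(filename, [])
```

Even this toy index answers the basic question administrators face at scale: "where does this data live, and is it protected?"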

Data Movement Engine

Moving data becomes extremely difficult when datasets are large. Legacy backup solutions move data over single-threaded protocols, which cannot keep pace with petabyte-scale datasets.

A modern backup and archive solution must move data over highly parallel streams. It must also be latency-aware, backing off when production workloads are active, so that backups can run continuously without dedicated backup windows.
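The two ideas above can be sketched together: copy files over several concurrent streams, and throttle each stream when a latency probe suggests the source is busy. The probe, threshold, and function names are assumptions for illustration, not a vendor implementation:

```python
import os
import shutil
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical threshold: back off when the source responds slowly,
# which suggests production I/O is competing for the filesystem.
LATENCY_BUDGET = 0.050  # seconds

def probe_latency(path):
    """Time a cheap metadata call as a proxy for source load."""
    start = time.monotonic()
    os.stat(path)
    return time.monotonic() - start

def copy_one(src, dst):
    """Copy one file, yielding to production I/O when the source is busy."""
    while probe_latency(src) > LATENCY_BUDGET:
        time.sleep(1)  # latency awareness: wait rather than compete
    shutil.copy2(src, dst)
    return dst

def parallel_backup(pairs, streams=8):
    """Copy (src, dst) pairs over several concurrent streams."""
    with ThreadPoolExecutor(max_workers=streams) as pool:
        return list(pool.map(lambda p: copy_one(*p), pairs))
```

Because each stream throttles independently, the copy job degrades gracefully under load instead of requiring a fixed nightly window.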

Cloud-Native Services

As data grows quickly both on-premises and in cloud, enterprises face the challenge of data management in a hybrid world. Cloud-native architecture for on-premises infrastructure helps to bring cloud benefits to enterprise datacenters.

These cloud benefits include scale-out architecture, resiliency, and agility. Scale-out architecture allows enterprises to scale their solution without creating silos, which becomes essential as data grows large. The distributed nature of cloud-native architecture results in resiliency, protecting against potential failures. Lastly, users experience agility due to nondisruptive software updates.


as-a-Service

Finally, as the demands of managing massive unstructured data grow, as-a-Service providers offer a more scalable, more agile alternative to legacy systems.

“as-a-Service” means that instead of buying hardware and then dealing with the day-to-day maintenance and troubleshooting of the secondary storage solution, customers pay for an all-inclusive service that alleviates management overhead.

as-a-Service offerings handle monitoring, diagnostics, failure management, and software updates. This is particularly valuable as datasets grow, because it lowers total cost of ownership (TCO). And unlike traditional managed services, which rely on customer service teams that grow expensive as data grows, as-a-Service vendors use software to deliver services efficiently and cost-effectively.
