Executive Viewpoint 2014 Prediction: Sepaton - Data Protection Trends to Watch Out for in 2014

By Peter Quirk
Friday, December 13th 2013

As 2013 draws to a close, it seems an appropriate time to reflect on the past year and to look ahead to the data protection challenges, trends, and emerging technologies that will shape the coming year. As IT professionals, we divide our time between looking back (analyzing our systems to identify ways to improve them, save money, and do more with fewer resources) and looking ahead to the latest technologies that keep our data centers efficient, flexible, and responsive in today's dynamic marketplace. Here are the trends to watch for in 2014:

Trend #1: Data Growth Hockey Stick

Data growth is nothing new in enterprise data centers. What will be new in 2014 is the hockey-stick growth driven by increased use of big data analytics, a growing number of applications, and increased use of large database-driven applications.

What this means:

Rapid data growth causes three major (and myriad minor) issues. First, backing up large and fast-growing data volumes within backup windows becomes an almost daily struggle.

Second, your costs skyrocket. Capital spending increases as you quickly (and often unexpectedly) outgrow your backup system and start adding systems for capacity and performance. Labor costs go up too, as IT administrators spend hours moving data off disk-based systems onto tape archives, adding and load balancing additional systems, and tuning every possible part of the data center to squeeze out every last drop of performance.

Third, you start compromising protection levels: you can't move massive data volumes offsite efficiently for disaster protection, you can't back up data as frequently as you would like, and you can't encrypt data at rest.
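To make the backup-window pressure concrete, here is a back-of-the-envelope sketch. The data volume, growth rate, and window length below are illustrative assumptions, not figures from this article:

```python
# Back-of-the-envelope backup-window math. All figures below are
# illustrative assumptions, not vendor specifications.

def required_throughput_tb_per_hr(data_tb: float, window_hr: float) -> float:
    """Sustained throughput needed to back up data_tb within window_hr."""
    return data_tb / window_hr

data_tb = 500.0   # assumed protected data volume today
growth = 0.50     # assumed 50% annual growth (the "hockey stick")
window_hr = 8.0   # assumed fixed nightly backup window

for year in range(4):
    need = required_throughput_tb_per_hr(data_tb, window_hr)
    print(f"Year {year}: {data_tb:7.1f} TB -> {need:6.1f} TB/hr sustained")
    data_tb *= 1 + growth
```

Even at these modest assumed rates, the required sustained throughput more than triples in three years while the window stays fixed, which is why window overruns creep up on teams almost daily rather than arriving all at once.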

What you should do:

Use systems that are specifically designed to handle enterprise data protection needs. These systems let you add performance and capacity as you need them to protect massive data volumes without data center sprawl. They automatically load balance and tune themselves for optimal efficiency, and they are built to deliver the performance you need to meet backup windows with ease.

Trend #2: Increased Reliance on Large Database-Driven Applications in a Sea of Unstructured Data

Enterprises are moving more and more business operations onto large Oracle, SQL Server, and DB2 database-driven systems. As a result, they are backing up a wider mixture of data types and more database data than ever before, and their tolerance for downtime for this data is nearly zero.

What this means:

Database data is notoriously difficult to deduplicate efficiently for two reasons. First, databases store data in very small segments (<8KB) that inline deduplication systems cannot process without slowing backup performance. Second, to meet backup windows, databases are typically backed up using multiplexing and multistreaming, which are not supported by inline, hash-based deduplication systems. As a result, data centers are being inundated with massive volumes of under-deduplicated database data.
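The alignment problem can be seen with a toy fixed-chunk, hash-based deduplicator. This is a simplified sketch for illustration, not any vendor's implementation; the chunk size, interleave block size, and generated sample data are all assumptions:

```python
import hashlib

def dedup_ratio(data: bytes, chunk_size: int = 8192) -> float:
    """Fraction of fixed-size chunks whose hash matches an earlier chunk."""
    seen, dupes, total = set(), 0, 0
    for i in range(0, len(data), chunk_size):
        digest = hashlib.sha256(data[i:i + chunk_size]).digest()
        if digest in seen:
            dupes += 1
        seen.add(digest)
        total += 1
    return dupes / total

# 256 KB of deterministic pseudorandom "database" data.
stream = b"".join(hashlib.sha256(str(i).encode()).digest() for i in range(8192))

# Two identical full backups written back to back: the second copy
# lines up with the chunk grid, so half of all chunks deduplicate.
plain = stream + stream

# The same two copies multiplexed: 1000-byte blocks from the two jobs
# are interleaved on the backup target. The duplicate data is still
# all there, but no 8 KB chunk lines up with one seen before.
block = 1000
mux = b"".join(stream[i:i + block] * 2 for i in range(0, len(stream), block))

print(f"back-to-back copies: {dedup_ratio(plain):.0%} duplicate chunks")
print(f"multiplexed copies:  {dedup_ratio(mux):.0%} duplicate chunks")
```

Real inline deduplicators are more sophisticated than this sketch, but the underlying tension is the same: multiplexed streams and sub-8KB segments shift data relative to whatever boundaries the hash index keys on, so duplicate data stops matching.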