Backup Problems That Companies Need To Overcome In Order To Capitalize On Big Data

The “Mainframe Age” saw the birth of databases and business process automation. Then came the “PC Age”, which put the power of computing on every worker's desktop. Next came the “Internet Age”, which created the virtual world in which most business takes place today. Now, we're entering the age of “Big Data”.

A major shift is taking place right now in data centers across the world, and it's ushering in a new era in the history of business computing. Until recently, business data mainly supported business decisions. Now, data is becoming the business itself.

A combination of cheap storage and extremely powerful processing and analysis capability is creating value-generating opportunities that were previously the domain of science fiction.

Of course, any business that wants to leverage the power of big data will also have to protect that data from disasters, accidents, and mechanical failures. This means certain backup challenges must be overcome in order to support the activities associated with Big Data computing.

Manual Labour Costs

Hardware costs are falling even as system complexity rises. Without automation, the manual labour needed to manage backups grows along with that complexity, and labour quickly becomes the dominant cost. Automating scheduling, verification, and reporting, as in the sketch below, is essential to keep those costs flat.
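
As a concrete illustration, here is a minimal sketch of the kind of unattended backup job that automation replaces manual effort with: create an archive, record a checksum for later verification, and log the result. Everything here, the paths, the schedule, the verification step, is an assumption for illustration, not a prescription.

```python
#!/usr/bin/env python3
"""Minimal sketch of an unattended backup job, meant to run from cron.

All paths are hypothetical; verification here is a simple checksum.
"""
import hashlib
import logging
import tarfile
from datetime import datetime
from pathlib import Path

SRC_DIR = Path("/var/data")        # hypothetical data directory
DEST_DIR = Path("/backups/daily")  # hypothetical backup target

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def run_backup() -> Path:
    """Create a timestamped tar.gz archive of SRC_DIR."""
    DEST_DIR.mkdir(parents=True, exist_ok=True)
    archive = DEST_DIR / f"backup-{datetime.now():%Y%m%d-%H%M%S}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(SRC_DIR, arcname=SRC_DIR.name)
    return archive

def record_checksum(archive: Path) -> None:
    """Store a SHA-256 next to the archive so restores can be verified."""
    # read_bytes() is fine for a sketch; stream in chunks for huge archives.
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    archive.with_suffix(archive.suffix + ".sha256").write_text(digest)
    logging.info("backup %s ok, sha256=%s", archive.name, digest[:12])

if __name__ == "__main__":
    record_checksum(run_backup())
```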

Data Transfer Speeds

Transferring data to off-site storage on tape is extremely slow for both backup and recovery. Big data systems will require that data be backed up at network speeds. And where data grows faster than the public Internet infrastructure can support, companies will need dedicated connections to their remote backup facilities or service providers.
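
A back-of-the-envelope calculation shows why shared Internet links fall short at scale. This sketch estimates transfer time from data volume and link speed; the 70% efficiency factor and the example volumes are illustrative assumptions, not measurements.

```python
def transfer_hours(data_tb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Hours to move `data_tb` terabytes over a `link_mbps` link.

    `efficiency` discounts protocol overhead and contention (assumed 70%).
    """
    bits = data_tb * 1e12 * 8                      # terabytes -> bits
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 3600

# 50 TB over a shared 100 Mbps uplink vs a dedicated 10 Gbps line:
print(f"{transfer_hours(50, 100):.0f} h")     # ~1587 h, roughly 66 days
print(f"{transfer_hours(50, 10_000):.1f} h")  # ~15.9 h, under a day
```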

Recovery Speeds

As the strategic importance of corporate data increases, so will the potential risks and costs associated with unplanned downtime. Big data applications will require data backup strategies which are optimized for maximum recovery speeds. This may mean combining both local and remote backups, and it could also mean maintaining a mirrored emergency failover facility at a remote site.
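
One way to reason about that trade-off is to estimate each option's recovery time up front: setup time (locating media, provisioning links, promoting a mirror) plus bulk-copy time. The sources and figures below are invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class RestoreSource:
    name: str
    data_tb: float          # data that must be copied back
    throughput_mbps: float  # effective restore throughput
    setup_hours: float      # time before the copy can even start

    def recovery_hours(self) -> float:
        seconds = (self.data_tb * 1e12 * 8) / (self.throughput_mbps * 1e6)
        return self.setup_hours + seconds / 3600

# Hypothetical options: local snapshot, remote vault, mirrored failover site.
options = [
    RestoreSource("local snapshot", 10, 8_000, 0.25),
    RestoreSource("remote vault", 10, 1_000, 1.0),
    RestoreSource("mirrored failover site", 0, 1, 0.5),  # no bulk copy at all
]
for o in options:
    print(f"{o.name:24s} {o.recovery_hours():5.1f} h")
print("fastest:", min(options, key=RestoreSource.recovery_hours).name)
```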

Recovery Point Objectives

Because data loses value with age, the most recent data is also the most valuable. This is especially true for big data apps that perform real-time analysis. Backup plans will therefore need to be designed to minimize data loss in the event of a disaster. It's no longer acceptable to lose 24 hours of data to a daily backup cycle; higher backup frequencies will become the norm.
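
The arithmetic behind that shift is simple: with periodic backups, a failure just before the next run loses everything written since the last one. The transaction rate below is a hypothetical workload, used only to make the intervals concrete.

```python
def worst_case_loss(interval_hours: float, tx_per_hour: int) -> int:
    """Transactions lost if failure strikes just before the next backup."""
    return int(interval_hours * tx_per_hour)

# Assumed workload of 50,000 transactions/hour:
for interval in (24, 6, 1, 1 / 12):  # daily, 6-hourly, hourly, 5-minute
    lost = worst_case_loss(interval, 50_000)
    print(f"backup every {interval * 60:6.0f} min -> up to {lost:>9,} tx lost")
```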

Archiving

In order to maximize server performance and adhere to regulatory requirements, inactive, older or low-value data will need to be stored separately from live working data. Data archiving and electronic discovery will become much more important in the big data age.
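
A common starting point is age-based tiering: move anything not modified within a retention window out of live storage, preserving the directory layout so archived data remains discoverable. The paths and the one-year boundary below are assumptions for illustration.

```python
import shutil
import time
from pathlib import Path

LIVE = Path("/data/live")        # hypothetical working storage
ARCHIVE = Path("/data/archive")  # hypothetical archive tier

def archive_inactive(max_age_days: int = 365) -> None:
    """Move files not modified in `max_age_days` out of live storage."""
    cutoff = time.time() - max_age_days * 86400
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    for path in list(LIVE.rglob("*")):  # snapshot the tree before moving
        if path.is_file() and path.stat().st_mtime < cutoff:
            target = ARCHIVE / path.relative_to(LIVE)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), target)  # keep layout for e-discovery
```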

Backup Windows

Big data takes a long time to back up, and it's simply not practical to take a system offline for eight hours every day. Tomorrow's big data apps will need backups that run more often, finish faster, and interrupt production less.
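
Block-level incremental backup, mentioned again below under compression and deduplication, is one way to shrink the window: hash each fixed-size block and copy only the blocks whose hashes changed since the last run. The 4 MiB block size here is an illustrative choice.

```python
import hashlib
from pathlib import Path

BLOCK = 4 * 1024 * 1024  # 4 MiB blocks; size is an assumption

def block_hashes(path: Path) -> list[str]:
    """SHA-256 of each fixed-size block in the file."""
    hashes = []
    with path.open("rb") as f:
        while chunk := f.read(BLOCK):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def changed_blocks(path: Path, previous: list[str]) -> list[int]:
    """Indices of blocks that differ from the last run's hash list,
    i.e. the only blocks that need to be copied this cycle."""
    current = block_hashes(path)
    return [i for i, h in enumerate(current)
            if i >= len(previous) or previous[i] != h]
```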

Data Security

It's striking that, even today, many businesses don't encrypt their backups. Backups should be encrypted, especially when stored off-site: backup media is lost surprisingly often, and unencrypted media hands hackers and identity thieves sensitive, private information. Big data will only worsen this threat.
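
Encrypting an archive before it leaves the building is not hard. This sketch uses the Fernet recipe from the third-party Python `cryptography` package; the archive name is hypothetical, and in practice the key must be kept somewhere separate from the backups (for example, a key-management system), or the encryption achieves nothing.

```python
# Requires the third-party package: pip install cryptography
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_backup(archive: Path, key: bytes) -> Path:
    """Write an encrypted copy of `archive`; ship only the .enc off-site."""
    out = archive.with_suffix(archive.suffix + ".enc")
    # read_bytes() is fine for a sketch; stream for very large archives.
    out.write_bytes(Fernet(key).encrypt(archive.read_bytes()))
    return out

# Generate the key once and store it apart from the backup media.
key = Fernet.generate_key()
encrypted = encrypt_backup(Path("backup-20240101.tar.gz"), key)  # hypothetical file
```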

Compression and Deduplication

No matter how cheap storage is today, it will always be cheaper tomorrow. That's why it's important to budget accurately and maximize storage utilization to minimize waste. Technologies like storage virtualization, block-level incremental backups, compression, and deduplication will grow in importance thanks to big data. But these techniques aren't free: compression, deduplication, and encryption all consume processing power, so they must be balanced against the need for fast, native storage.
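
The core idea behind block-level deduplication with compression fits in a few lines: hash each block, store every unique block once (compressed), and keep the ordered list of hashes as the recipe for reconstruction. The blocks below are synthetic, chosen only to show the savings.

```python
import hashlib
import zlib

def dedup_and_compress(blocks: list[bytes]) -> tuple[dict[str, bytes], list[str]]:
    """Store each unique block once, compressed; the hash recipe
    is enough to reconstruct the original stream in order."""
    store: dict[str, bytes] = {}
    recipe: list[str] = []
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        if digest not in store:
            store[digest] = zlib.compress(block)
        recipe.append(digest)
    return store, recipe

# Ten copies of the same 1 MiB block deduplicate to one stored entry:
store, recipe = dedup_and_compress([b"\x00" * 2**20] * 10)
print(len(store), "unique block(s) for", len(recipe), "logical blocks")
```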

These are just a few of the many new challenges that organizations will need to overcome as they begin to leverage the opportunities presented by big data computing. That’s why many organizations are choosing to outsource their data backup and online storage requirements and partner with industry experts.

About The Author: Storagepipe Solutions has been a leader in data center backup for over 10 years, with deep expertise in protecting large volumes of complex data center data. More than just an online backup software company, Storagepipe leads the industry in disaster recovery, regulatory compliance, and business continuity backup solutions, anticipating rapidly changing data management trends. Its comprehensive suite of hosted disaster recovery solutions gives organizations greater control over their retention policies while helping them overcome complex backup management challenges through cost-effective automation.


Data Center Talk updates its resources every day. Visit us to keep up with the latest technology and standards from the data center world.

Please leave your views and comments on DCT Forum.