Backup validation

Backup validation is the process by which owners of computer data examine how their data was backed up in order to understand their risk of data loss. It also covers the optimization of such processes, charging for them, and estimating future requirements, sometimes called capacity planning.

History

Over the past several decades (leading up to 2005), organizations (banks, governments, schools, manufacturers and others) have come to rely more on "Open Systems" and less on "Closed Systems". For example, 25 years ago, a large bank might have had most if not all of its critical data housed in an IBM mainframe computer (a "Closed System"), but today, that same bank might store a substantially greater portion of its critical data in spreadsheets, databases, or even word processing documents (i.e., "Open Systems"). The primary problem with Open Systems is their unpredictable nature. By definition, an Open System is exposed to potentially thousands if not millions of variables, ranging from network overloads to computer virus attacks to simple software incompatibility. Any one of these factors, or several in combination, may result in lost data, compromised backup attempts, or both. These types of problems do not generally occur on Closed Systems, or at least not in such unpredictable ways. In the "old days", backups were a neatly contained affair. Today, because of the ubiquity of, and dependence upon, Open Systems, an entire industry has developed around data protection. Three key elements of such data protection are Validation, Optimization and Chargeback.

Validation

Validation is the process of finding out whether a backup attempt succeeded or failed, or whether the data is backed up thoroughly enough to be considered "protected". This process usually involves the examination of log files, the "smoking gun" often left behind after a backup attempt takes place, as well as media databases, data traffic and even magnetic tapes. Patterns can be detected, key error messages identified and statistics extracted in order to determine which backups worked and which did not.
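The log-scanning step can be sketched in a few lines. The log format, job names and error codes below are invented for illustration; real backup tools each have their own log layouts, so the regular expression would need to match the tool in use.

```python
import re

# Hypothetical log lines; a real validation run would read these
# from the backup tool's log files.
LOG_LINES = [
    "2024-01-05 02:00:01 JOB fileserver-full STATUS success SIZE 120GB",
    "2024-01-05 02:10:44 JOB db-nightly STATUS error E1234 media write failure",
    "2024-01-06 02:00:02 JOB fileserver-full STATUS success SIZE 121GB",
]

STATUS_RE = re.compile(r"JOB (?P<job>\S+) STATUS (?P<status>\w+)")

def summarize(lines):
    """Count successes and failures per job from raw log lines."""
    summary = {}
    for line in lines:
        match = STATUS_RE.search(line)
        if not match:
            continue  # an unparseable line is itself a validation finding
        job, status = match.group("job"), match.group("status")
        ok, fail = summary.get(job, (0, 0))
        summary[job] = (ok + 1, fail) if status == "success" else (ok, fail + 1)
    return summary

print(summarize(LOG_LINES))
# {'fileserver-full': (2, 0), 'db-nightly': (0, 1)}
```

A summary like this is what lets an administrator say which data is "protected" and which jobs need attention.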

Optimization

Optimization is the process of examining productivity patterns in the backup process to determine where improvements can be made and, often, where certain less important backup jobs may be eliminated entirely.
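One simple optimization heuristic is to flag jobs whose backed-up data is never actually restored. The job records and threshold below are illustrative assumptions; in practice the statistics would come from the backup tool's media database or logs.

```python
# Hypothetical per-job statistics gathered over a year.
jobs = [
    {"name": "fileserver-full", "avg_minutes": 240, "restores_last_year": 3},
    {"name": "temp-scratch",    "avg_minutes": 90,  "restores_last_year": 0},
    {"name": "db-nightly",      "avg_minutes": 45,  "restores_last_year": 12},
]

def optimization_candidates(jobs, min_restores=1):
    """Return jobs whose data was never restored: candidates for
    elimination or a less frequent schedule."""
    return [j["name"] for j in jobs if j["restores_last_year"] < min_restores]

print(optimization_candidates(jobs))
# ['temp-scratch']
```

Restore frequency is only one signal; job duration and backup-window pressure would feed into the same analysis.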

Chargeback

Very often, data backup is performed by one party as a service to others, the owners of the data. Charging those services back to the data owner(s) is also becoming more prevalent. A simple fee per backup might be agreed upon or, as is more often the case, a complex charge based on success rates, speed, size, frequency and retention (how long the copy is kept) is put in place. Usually some form of service level agreement (SLA) exists between the backup service provider and the data owner, specifying what is to be done and how the service is to be charged.
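A chargeback formula of the kind described might combine those factors as follows. All rates here are made-up placeholders, not industry figures, and the proportional SLA discount is just one of many possible schemes.

```python
def monthly_charge(size_gb, backups_per_month, retention_days, success_rate,
                   rate_per_gb=0.05, rate_per_job=1.00,
                   retention_rate_per_gb_day=0.001):
    """Illustrative chargeback: bill for data moved, per-job handling,
    and retention, discounted when the success-rate target is missed."""
    moved = size_gb * backups_per_month * rate_per_gb
    handling = backups_per_month * rate_per_job
    retention = size_gb * retention_days * retention_rate_per_gb_day
    subtotal = moved + handling + retention
    # Hypothetical SLA term: 95% success earns the full fee; lower
    # success reduces the charge proportionally.
    sla_factor = min(success_rate / 0.95, 1.0)
    return round(subtotal * sla_factor, 2)

print(monthly_charge(size_gb=100, backups_per_month=30,
                     retention_days=30, success_rate=0.98))
# 183.0
```

In a real SLA the rates, targets and penalty curve would all be negotiated between provider and owner.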

See also

* Backup
* Backup software


Wikimedia Foundation. 2010.
