What do we mean by large? A simple question with a not-so-simple answer. If your total data footprint is 5 TB or more, that's generally considered large. But what kind of data is it? How many actual files are there? How frequently do they change? How well do they compress? Two different 5 TB environments will likely require different data protection schemes if they are composed of different file types that change at different rates. Bandwidth capacity restrictions, on the other hand, are a common denominator for all environments.

The question boils down to this: How should you back up data so that it can be reliably recovered through a process that doesn't interfere with daily workloads traveling across the network? IT pros on the front lines have no single tool for determining the impact that backing up large datasets will have on bandwidth. It's a process of trial and error, even for the experts who do it daily.

You can only protect as much data as your network will allow. And there's little use backing up data that can't be recovered in a timely fashion. Before you consider how you're going to back up large datasets, first consider how you may need to recover the data.
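Even without a single tool for predicting bandwidth impact, you can do a back-of-envelope estimate before the trial-and-error begins. The sketch below is a hypothetical calculation (the function name, the 1 Gbps link, the 2:1 compression ratio, and the 5% daily change rate are all illustrative assumptions, not figures from any vendor), showing how dataset size, link speed, compression, and change rate combine to determine a backup window:

```python
def backup_hours(data_tb, link_mbps, compression_ratio=2.0, change_rate=1.0):
    """Rough hours needed to move `data_tb` terabytes over a `link_mbps` link.

    compression_ratio: 2.0 means the data shrinks to half its size on the wire.
    change_rate: fraction of the dataset actually transferred
                 (1.0 = full backup, 0.05 = incremental covering 5% daily change).
    Assumes the link is fully dedicated to the backup -- real-world throughput
    shared with daily workloads will be lower.
    """
    bytes_to_send = data_tb * 1e12 * change_rate / compression_ratio
    bits_to_send = bytes_to_send * 8
    seconds = bits_to_send / (link_mbps * 1e6)
    return seconds / 3600

# A full 5 TB backup over a dedicated 1 Gbps link with 2:1 compression:
full_hours = backup_hours(5, 1000)                       # roughly 5.6 hours
# A daily incremental covering an assumed 5% change rate:
incr_hours = backup_hours(5, 1000, change_rate=0.05)     # well under an hour
```

The same arithmetic run in reverse answers the recovery-side question: if restoring 5 TB over that link takes most of a working day, the network, not the backup software, sets your recovery time.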