Hi, I have a customer with multiple sites, and we are now looking to replicate those sites to the cloud, so we need to evaluate the bandwidth required for each site.
Is there a way to generate a report for each site telling us the equivalent of a full backup for that site? I have the Total Application Size and Total Data Size on Disks, but I cannot really rely on these since some sites have 15 days of retention and others have 60 days. Is there a report or another way to find this kind of information?
I’ll have to check on a report, but a good way to identify the amount of data that needs to be copied is to look at the “Baseline Size” statistic on the DDB. If all their remote sites use a single DDB, or only a few, it should be quick to check this in the UI. This should be an accurate measure of the initial transfer - similar to what you would see if you sealed the deduplication store and started from scratch. It’s measured at 120% to account for some incremental change, so it may be a “worst case scenario” rather than an under-estimate.
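Once you have the Baseline Size per site, a quick back-of-the-envelope calculation gives the sustained bandwidth needed to seed each cloud copy within a chosen window. A minimal sketch (the 1.2 factor mirrors the ~120% padding mentioned above; the example figures are made up, not from any real report):

```python
def required_mbps(baseline_tb: float, window_hours: float, padding: float = 1.2) -> float:
    """Sustained throughput (Mbit/s) needed to move baseline_tb terabytes
    within window_hours, padded for incremental change during the seed."""
    bits = baseline_tb * padding * 8 * 1000**4        # TB -> bits (decimal units)
    return bits / (window_hours * 3600) / 1000**2     # bits/s -> Mbit/s

# Example: a hypothetical 10 TB baseline seeded over a 7-day (168 h) window
print(round(required_mbps(10, 168), 1))  # → 158.7
```

Running this per site against its Baseline Size gives a rough per-link requirement; real throughput will also depend on latency, concurrency, and any WAN optimization in the path.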