Solved

Primary Copy with Azure Immutability or Data Domain (DD): Which Is Better and Why?

  • 6 November 2023
  • 3 replies
  • 121 views

Kavya:

Which is the better solution for a primary copy only, considering cost, space, dedup parameters, and throughput?

  1. Primary copy: daily incrementals and weekly fulls on Azure immutable storage, with Commvault dedup and periodic DDB sealing, so data is retained longer than my basic 30-day retention. (The aux copy of full backups later goes to an Azure immutable container.)
  2. Primary copy: daily incrementals and weekly fulls on a DD with no dedup and a basic 30-day retention, with the full backups later aux-copied to an Azure immutable container.

FYI: my aux copy of full backups will stay on Azure immutable storage; that part of the design does not change.

I just want to make sure that using a DD for the primary-copy daily backups would be beneficial given the parameters above.

Can you please help me decide?


Best answer by SMD 7 November 2023, 07:39


3 replies

SMD:

Hi Kavya,

 

For option 1: there is no need to create a secondary copy; you can use extended retention on the same primary copy for the full backup jobs. The only cost involved will be the storage cost associated with the Azure immutable library.

 

For option 2: you will have to purchase the DD box, which is expensive. If you already have a DD box, this option makes sense. If you are looking to purchase a new DD box, consider the size of the data that would otherwise sit in the cloud: if it is huge, buying the DD box makes sense, as it eliminates the cloud storage cost and provides better deduplication (since it is hardware-based deduplication).
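To put rough numbers on that trade-off, here is a back-of-the-envelope sketch (my own, not from Commvault or Dell docs). Every price and size in it is a hypothetical placeholder; substitute your real Azure rate card and DD quote.

```python
# Break-even between recurring Azure storage spend and a one-time DD purchase.
# ALL figures are hypothetical placeholders -- plug in your actual numbers.

AZURE_COST_PER_TB_MONTH = 20.0   # assumed $/TB/month for immutable blob storage
DD_PURCHASE_COST = 150_000.0     # assumed one-time $ for a DD appliance
STORED_TB_ON_AZURE = 200.0       # e.g. 100 TB doubled by DDB-sealing baselines

def months_to_break_even(dd_cost: float, azure_tb: float, rate: float) -> float:
    """Months after which cumulative Azure spend exceeds the DD purchase price."""
    return dd_cost / (azure_tb * rate)

monthly = STORED_TB_ON_AZURE * AZURE_COST_PER_TB_MONTH
print(f"Azure monthly spend: ${monthly:,.0f}")
print(f"Break-even vs DD:    "
      f"{months_to_break_even(DD_PURCHASE_COST, STORED_TB_ON_AZURE, AZURE_COST_PER_TB_MONTH):.1f} months")
```

At these placeholder rates the appliance pays for itself in roughly three years; the answer flips quickly as the stored volume or the per-TB rate changes.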

 

Kavya:

For example:

I have 100 TB of data to be backed up, with daily schedules running indefinitely.

  1. Is Azure immutable storage with 30-day basic retention and DDB sealing cost-effective?
  2. Or is a one-time purchase of a DD more effective?

Please also help me weigh the other parameters apart from cost: space, dedup parameters, throughput, bandwidth, and network throttling.

SMD:

Hi Kavya,

Sealing the DDB will keep jobs for up to twice the retention period (because micro pruning does not work on a sealed DDB), so with 100 TB of data, usage might grow to about 200 TB.

 

Whenever the DDB is sealed, a new baseline is created, which can consume up to twice the library space.
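As a rough illustration of that doubling (my sketch; the store size and retention below are assumptions, not measurements):

```python
# Why sealing can roughly double on-disk usage: just after a seal, the old
# (sealed) store still holds every unexpired job's baseline while the new
# active store rebuilds the same baseline. Sizes here are assumptions.

BASELINE_TB = 100.0      # unique data held by one DDB store after dedup
RETENTION_DAYS = 30      # basic retention; the sealed store lingers this long

sealed_store_tb = BASELINE_TB   # no micro pruning; shrinks only at age-off
active_store_tb = BASELINE_TB   # new baseline written after the seal

peak_tb = sealed_store_tb + active_store_tb
print(f"Peak library usage during the {RETENTION_DAYS}-day overlap: ~{peak_tb:.0f} TB")
```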

 

In comparison with Commvault deduplication, Data Domain will provide better deduplication savings, because it has dedicated hardware for it.

 

So far, cost, space, and deduplication all favour DD.

 

Note: DD is a one-time investment, whereas cloud storage is billed monthly.

 

Throughput, bandwidth, and network throttling depend mainly on your internet speed: the higher the speed, the better the throughput.
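A quick way to sanity-check your backup window against that link speed (my sketch; the bandwidth, change rate, and efficiency factor are assumptions to replace with your own figures):

```python
# Backup-window estimate: throughput to Azure is bounded by the internet link.

LINK_MBPS = 1000.0           # assumed WAN bandwidth, megabits per second
DAILY_INCREMENTAL_TB = 2.0   # assumed post-dedup daily incremental size

def hours_to_transfer(tb: float, mbps: float, efficiency: float = 0.7) -> float:
    """Hours to move `tb` terabytes over an `mbps` link at a given efficiency."""
    bits = tb * 8e12                          # decimal TB -> bits
    return bits / (mbps * 1e6 * efficiency) / 3600

print(f"Daily incremental: {hours_to_transfer(DAILY_INCREMENTAL_TB, LINK_MBPS):.1f} h")
print(f"100 TB full seed:  {hours_to_transfer(100, LINK_MBPS):.0f} h")
```

The 0.7 efficiency factor is a guess at protocol overhead and throttling; tune it to what you actually observe on the link.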

 

 

Note:

Micro pruning: deletes individual chunks as and when they are aged.

Macro pruning: deletes the entire data only once all the jobs are aged.
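A toy model of the difference (my sketch, not Commvault internals): chunks are shared across jobs, and a chunk is only reclaimable when no live job references it.

```python
chunks = {                    # chunk -> jobs still referencing it
    "c1": {"job1"},
    "c2": {"job1", "job2"},
    "c3": {"job2"},
}
aged = {"job1"}               # jobs past retention today

# Micro pruning (active DDB): a chunk is dropped as soon as its last job ages.
kept = {c for c, jobs in chunks.items() if jobs - aged}
print("micro pruning keeps:", sorted(kept))          # ['c2', 'c3']

# Macro pruning (sealed DDB): the store is deleted only when ALL jobs have aged,
# which is why a sealed store holds its full size until the last job expires.
print("macro pruning deletes store now?", all(jobs <= aged for jobs in chunks.values()))
```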

 

 
