Solved

Aux copy secondary copy backup to S3 cloud library from tape

  • November 21, 2024
  • 3 replies
  • 57 views


I would like to create a secondary copy of our tape backups, i.e. aux copy the existing tape archive backups to a newly configured AWS cloud library. The setup uses a Windows Server 2022 VM as a new MediaAgent.

We have an AuxCopy schedule which copies primary data from all of these tape SP copies in one data centre to a matching copy in a second data centre:
a) 12 months - LTO5, b) 5 years - LTO5, c) 7 years - LTO5, d) 10 years - LTO5
There is no new data, but we need to retain it, and I would like to decommission the tape library as soon as possible.

1) If I add a new DDB disk to the MediaAgent and enable deduplication as per this doc, Creating a Selective Copy Using a Library, will that reduce the size of the data going to the cloud library?

I ask because if I use the recommended method in Creating a Storage Policy Copy with Deduplication, it says "Secondary copies are automatically associated with the System Created Autocopy schedule policy."

I take that to mean I would need to immediately edit the association, so that only the selected jobs are copied via a one-time manual aux copy job?

2) Is there any advantage to using the local deduplication option, with a local DDB on a second volume on the MA associated with the cloud library? I could skip deduplication altogether, because I don't really want to create a new Global Deduplication Policy, which is what this doc, Configuring a Cloud Network Storage Pool, states will happen if I create a cloud network storage pool.

We just need a one-off copy of the jobs in the above SP copies rather than a fully scheduled aux copy, as I don't want to copy everything.
        

Best answer by johanningk

Hi @smuv59,

If your copy source is tape, the data is not deduplicated (at least I doubt that you use deduplication with tape).

If you add a new cloud copy, it should use a deduplication block size of 512 KB instead of the 64 KB used for disk library deduplication.
Therefore a new DDB is required anyway. Furthermore, you cannot use the same DDB (or storage pool) twice within the same Storage Policy.
Each Storage Policy Copy requires its own DDB.
Whether it is advisable to place this on the same physical disk as others on the same MediaAgent depends on the load (performance) and size (capacity) of the DDB and disk.
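To get a feel for why the block size matters for DDB footprint, here is a rough back-of-envelope sketch in Python. The 512 KB / 64 KB figures come from the paragraph above; the bytes-per-record constant and the 50 TiB data size are made-up placeholders for illustration, not Commvault specifications.

```python
# Larger dedup block size means fewer unique blocks to track, so the
# cloud copy's DDB is far smaller than a disk copy's DDB for the same
# front-end data. The ~200 bytes/record figure is a hypothetical
# placeholder, not a documented Commvault value.

def ddb_records(front_end_bytes: int, block_size_bytes: int) -> int:
    """Upper bound on DDB primary records if every block were unique."""
    return -(-front_end_bytes // block_size_bytes)  # ceiling division

def ddb_size_gib(front_end_bytes: int, block_size_bytes: int,
                 bytes_per_record: int = 200) -> float:
    return ddb_records(front_end_bytes, block_size_bytes) * bytes_per_record / 2**30

TIB = 2**40
data = 50 * TIB  # example: 50 TiB of tape data to copy

print(f"64 KB blocks : ~{ddb_size_gib(data, 64 * 2**10):.2f} GiB DDB")   # ~156.25
print(f"512 KB blocks: ~{ddb_size_gib(data, 512 * 2**10):.2f} GiB DDB")  # ~19.53
```

The 8x ratio between the two estimates is just the ratio of the block sizes; the absolute numbers depend entirely on the assumed per-record overhead.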

If you don’t want to add all data from tape to the cloud copy, it might make sense to create the cloud copy as selective without automatic selection. This allows you to manually select the jobs you want to copy, even though this can end up in a lot of manual activity.

Or create it as selective with only monthly/weekly fulls, which will copy only the matching jobs.

Another option is to define the new copy as synchronous and set its start date to a date in the last week. This will select the latest backups to be copied to the cloud.
I think in general a synchronous copy is the correct type if you intend to move all tape jobs to the cloud.
The selection was already made when the data was copied from primary to tape:
only jobs available on the source copy are included in the selection criteria.
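As a toy illustration of the difference between the two copy types (this is just the selection logic described above modelled in Python, with made-up job records, not Commvault code):

```python
from datetime import date

# Simplified stand-ins for backup jobs on the source (tape) copy.
jobs = [
    {"id": 101, "start": date(2024, 9, 1),   "type": "full"},
    {"id": 102, "start": date(2024, 10, 6),  "type": "full"},
    {"id": 103, "start": date(2024, 10, 13), "type": "incremental"},
    {"id": 104, "start": date(2024, 11, 3),  "type": "full"},
]

def synchronous_selection(jobs, start_date):
    """Synchronous copy: every job on or after the copy's start date."""
    return [j["id"] for j in jobs if j["start"] >= start_date]

def selective_monthly_fulls(jobs):
    """Selective copy: only the first full backup of each month."""
    picked, seen_months = [], set()
    for j in sorted(jobs, key=lambda j: j["start"]):
        month = (j["start"].year, j["start"].month)
        if j["type"] == "full" and month not in seen_months:
            seen_months.add(month)
            picked.append(j["id"])
    return picked

print(synchronous_selection(jobs, date(2024, 10, 1)))  # [102, 103, 104]
print(selective_monthly_fulls(jobs))                   # [101, 102, 104]
```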

Don’t forget to define the tape copy as the source for the newly created cloud copy.

The automatic aux copy schedule is always used to associate newly created SPCs. If you want it scheduled differently, you need to remove the SPC association from the Auto Copy schedule policy and assign it to your individually configured schedule policy.

Keep in mind that retention is maintained and defined per Storage Policy Copy. You need to either define the same retention rule set on the cloud copy as on the tape copy, or inherit the retention settings from the source copy.
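A minimal sketch of the per-copy retention idea, assuming simple days-based retention (extended retention and cycle rules are left out, and the dates are invented):

```python
from datetime import date, timedelta

# Each copy applies its own retention: a job ages off a given copy once
# it is older than that copy's retention window. The 10-year figure
# matches the longest tier in the question above.

def aged_off(job_date: date, retention_days: int, today: date) -> bool:
    return today - job_date > timedelta(days=retention_days)

job = date(2014, 6, 1)
print(aged_off(job, 365 * 10, date(2024, 11, 21)))  # True: past 10 years
print(aged_off(job, 365 * 10, date(2024, 5, 1)))    # False: still retained
```

If the cloud copy were created with a shorter window than the tape copy, the same job would age off the cloud copy first, which is exactly why the rule sets need to match.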

rgds
Klaus


3 replies


  • Author
  • November 25, 2024

Thanks for your reply johanningk, that’s useful info. A follow-up question: could I use one DDB if I run each aux copy separately, by changing the source SP? This should mean that the DDB stores information about the selective copy jobs. If I use a global tape DDB, would I need to seal it when the jobs finally age off?


  • November 26, 2024

Hi smuv59,

A DDB is always associated with a Storage Policy → Storage Policy Copy.

Within a storage policy, you cannot use the same DDB twice.

To be honest, I have never seen or used the Silo option (DDB with tapes) in any of my installations (since early CV Version 9!).

Why would you like to use it?
In the worst case you might end up restoring from 100+ tapes to recover a single VM of size 20 GB, because the unique blocks are spread across all your tape media!

I would not consider this a valid scenario.
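To illustrate the warning numerically, here is a toy simulation (the tape and block counts are arbitrary, chosen only to show the spread): when each unique block a restore needs may have been first written to any earlier tape, even a modest restore ends up touching almost every tape in the silo.

```python
import random

random.seed(42)

num_tapes = 120       # tapes holding the deduplicated silo data
blocks_needed = 5000  # unique blocks referenced by one ~20 GB VM restore

# Each needed block lives on whichever tape first wrote it; model that
# as a uniformly random tape per block.
tapes_touched = {random.randrange(num_tapes) for _ in range(blocks_needed)}

print(f"restore must mount {len(tapes_touched)} of {num_tapes} tapes")
```

With thousands of blocks spread over ~100 tapes, essentially every tape has to be mounted, which is the restore-time cost the reply warns about.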

