Solved

Cloud library migration from one Azure tenant to another

  • 8 March 2023
  • 8 replies
  • 217 views

Badge +3

We are running on Commvault 11.20.

Currently, backup jobs are using an Azure Blob cloud disk library with the default container setting. (On Azure, the container type is Cool.)

We would like to move this storage to another tenant, using a different storage account with a Cool/Archive type container. We are looking for the best approach to migrate the storage, ideally done from Commvault rather than from Azure.


Best answer by Niall 9 March 2023, 19:11


8 replies

Userlevel 3
Badge +8

@ameersyed When you use the Cool/Archive tier of storage with Commvault, you configure the Azure storage as Cool, and the application then tags the chunks of data to be written to Archive storage. (The associated metadata for the chunks, which is a small percentage of the data, is written to Cool storage.)
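
For illustration, a quick way to see this split for yourself is to count the blobs in the library's container by access tier. This is a minimal sketch, not Commvault code, assuming the azure-storage-blob Python SDK; the connection string and container name are placeholders:

```python
# Sketch: list the blobs in the cloud library's container and summarise them
# by access tier. With a Cool/Archive library you would expect the bulk of
# the chunk data in Archive and the small metadata blobs in Cool.
from collections import Counter

from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder
CONTAINER_NAME = "<commvault-library-container>"           # placeholder

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client(CONTAINER_NAME)

tier_counts = Counter()
tier_bytes = Counter()
for blob in container.list_blobs():
    tier = blob.blob_tier or "Unknown"
    tier_counts[tier] += 1
    tier_bytes[tier] += blob.size or 0

for tier, count in tier_counts.items():
    print(f"{tier}: {count} blobs, {tier_bytes[tier] / 1024**3:.1f} GiB")
```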

This means that to migrate from the Cool tier to the Archive tier, you will need to create a new cloud library within Commvault (configured to use the Archive tier of storage) with a new (global) deduplication database pointing to it. Once these are created, you can create new copies of your storage policies pointing to the new (Archive) cloud library and DASH copy the data over. When that is complete, you can delete the old copies.

It is actually quite straightforward, but depending on your retention you may choose to let the old jobs age out naturally rather than copying them over (the cost of rehydration versus the savings made by using Archive storage).

I would also recommend that you review the version of indexing being used by the clients. Anything that is not on V2 indexing, I would migrate to V2 indexing first, as this ensures that any restores will use a workflow to recall the data, rather than you having to recall the data manually first.

HTH

Niall

Badge +3

We do not want to use the existing storage account in Tenant A. We would like to use a storage account in Tenant B (new).

If I leave the existing storage to age out as per retention: on the new storage (Tenant B), do I have to create a new cloud disk library with the storage class set to Cool, Archive, or Cool/Archive? Is one cloud library enough, or should I create one for Cool and one for Archive?

 

Userlevel 3
Badge +8

Hi @ameersyed 

Configure your new Azure storage account (Tenant B) as Cool storage. In Commvault, you then need to configure a new cloud library on that new (Tenant B) Azure storage account and define it as Cool/Archive. This way, although you have set the account to Cool in Azure, when Commvault uses this cloud library it will tell Azure to put the data in Archive and the metadata in Cool.
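
To illustrate the mechanism: the per-blob access tier in Azure Blob Storage is set independently of the account's default tier, which is how chunk data can land in Archive while the account itself is configured as Cool. A sketch with the azure-storage-blob Python SDK (this is the underlying Azure operation, not Commvault's own code; all names are placeholders):

```python
# Sketch of per-blob tiering: the storage account default is Cool, but an
# individual blob can be moved to Archive. This is the kind of call a backup
# application can issue for chunk data while leaving metadata blobs in Cool.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<tenant-b-connection-string>")
blob = service.get_blob_client("<library-container>", "<chunk-blob-name>")

blob.set_standard_blob_tier("Archive")        # data chunk -> Archive tier
print(blob.get_blob_properties().blob_tier)   # prints "Archive"
```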

Does that clarify it for you?

Niall

Badge +3

@Niall After this, I can still use DASH copy if I want to migrate data from the old storage. Correct?

Userlevel 3
Badge +8

Hi @ameersyed 

After you have created your new Cloud Library with Cool/Archive storage, you will need to create a Storage Pool/DDB to use it, and then you can create new Storage Policy Copies and pick the data to AUX (DASH) copy into it.

HTH

Niall

 

Badge +3

@Niall One last question: I would like to store the primary copy for 60 days on Cool, and move any data older than 60 days to Archive. Is this possible in Commvault, or is it something we have to explore on Azure?

 

 

Userlevel 3
Badge +8

@ameersyed This is possible; you just need two cloud libraries, one configured as Cool and the second configured as Cool/Archive. Configure your Storage Policy primary copy to write data to the Cloud Library configured as “Cool” with your 60-day retention, then create a secondary copy pointing to the Cloud Library configured as “Cool/Archive” with your long-term retention. Assuming you are deduplicating the data, you will need two DDBs: one for the primary backups and one for the secondary copy.
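
On the Azure side of your question: Azure can also tier blobs by age through Storage lifecycle management, but the two-copy design above keeps the tiering under Commvault's control, so restores know which chunks need to be recalled (as mentioned earlier). Purely to illustrate the Azure-side mechanism, here is a sketch of a lifecycle rule that would tier blobs to Archive 60 days after last modification; the container prefix, storage account, and resource group are placeholders:

```python
# Sketch only: emit an Azure Storage lifecycle management rule that tiers
# block blobs to Archive 60 days after they were last modified. The prefix,
# account name and resource group are placeholders; the Commvault-side
# design (two copies, two libraries) is what was actually recommended above.
import json

lifecycle_policy = {
    "rules": [
        {
            "enabled": True,
            "name": "tier-to-archive-after-60-days",
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["<library-container>/"],
                },
                "actions": {
                    "baseBlob": {
                        "tierToArchive": {"daysAfterModificationGreaterThan": 60}
                    }
                },
            },
        }
    ]
}

with open("lifecycle-policy.json", "w") as fh:
    json.dump(lifecycle_policy, fh, indent=2)

# The file could then be applied with the Azure CLI, for example:
#   az storage account management-policy create \
#       --account-name <account> --resource-group <rg> \
#       --policy @lifecycle-policy.json
```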

HTH

Niall

Badge +3

@Niall I have configured it as mentioned above.

  1. Storage policy with the primary copy going to the Azure Cool tier, using the existing global dedup policy.
  2. Secondary copy going to the Azure Cool/Archive tier. I had to create a new DDB partition for this.

I tested a backup on the primary copy and it was successful. However, the aux copy is failing with:

 

Error Code: [13:138] Description: Error occurred while processing chunk [xxxxx] in media [xxxxxx], at the time of error in library [xxxx] and mount path [[xxxx.x.xx.com] xx], for storage policy [xxxxxxx] copy [xxxx] MediaAgent [xxxx]: Backup Job [xxxxx]. Undefined software error occurred. Source: xxxxx.xx.xxxxx.xx, Process: CVJobReplicatorODS 

 

In the logs I see:

 

Failed to read the remote file  This operation is not permitted on an archived blob.
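
For reference, a sketch (using the azure-storage-blob Python SDK, with placeholder names in place of the real chunk path from the log) of how to confirm whether the blob named in the error is in the Archive tier, and how a rehydration back to Cool could be started on the Azure side:

```python
# Sketch: check whether the blob referenced in the error is archived, and if
# so start a rehydration back to the Cool tier. Names are placeholders; the
# real chunk path comes from the CVJobReplicatorODS log entry.
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client("<library-container>", "<chunk-blob-path>")

props = blob.get_blob_properties()
print("tier:", props.blob_tier)                 # e.g. "Archive"
print("archive status:", props.archive_status)  # e.g. "rehydrate-pending-to-cool"

if props.blob_tier == "Archive" and props.archive_status is None:
    # Rehydration can take hours; "High" priority is faster but costs more.
    blob.set_standard_blob_tier("Cool", rehydrate_priority="High")
```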
