Storage and Deduplication
Discuss any topic related to storage or deduplication with fellow community members
- 647 Topics
- 3,303 Replies
Snap copy to Server Plan
Hello CV community! I see that from 11.24 you can add snapshot copies to server plans: https://documentation.commvault.com/v11/essential/139040_new_features_for_snapshot_management_in_1124.html I'm not sure whether this snap copy is supported only with specific types of storage. Does anyone actually use it? Please share your feedback. Nikos
HyperScale Migration using Auxiliary copy to Azure / Metallic storage - DDB block size
Hi, I'm working on a project where a large HyperScale environment needs to be migrated to Azure. We're looking at using either Azure storage or Metallic Recovery Reserve, or possibly a mix of the two. There's short-term retention (30 days) as well as LTR (5 to 7 years), so we will use Hot/Cool tiers (possibly combined storage tiers if Metallic is not used). HyperScale is set up using the default DDB block size of 128 KB. In an ideal world, one could just set up a secondary copy for the cloud storage (either Metallic or Azure), kick off the Aux copy to cloud, let it run to get the data over into Azure, and eventually promote it to primary copy. However, as the cloud storage will then be used as a primary copy, we ideally want to configure it with a 512 KB DDB block size. Media Agents will be set up in Azure, as they will eventually become the production MAs once things get cut over. Some key questions on the above: when copying between storage policies with different DDB block sizes, how will this affect overall dedupl…
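One relevant piece of arithmetic for the question above: blocks hashed at 128 KB cannot produce the same signatures as blocks hashed at 512 KB, so the two stores dedupe independently, but the larger block size also shrinks the number of records the DDB has to track. A minimal sketch, assuming a hypothetical 100 TiB of back-end data for illustration:

```python
# Hypothetical sketch: rough upper bound on DDB record counts for the two
# block sizes discussed above. The 100 TiB data size is an assumption, not
# a figure from the environment in question.

def ddb_records(data_bytes: int, block_size_bytes: int) -> int:
    """Upper bound on unique-block records the DDB must track."""
    return data_bytes // block_size_bytes

TIB = 2**40
data = 100 * TIB  # assumed back-end data size

records_128k = ddb_records(data, 128 * 1024)
records_512k = ddb_records(data, 512 * 1024)

print(records_128k)                   # 838860800 records at 128 KB
print(records_512k)                   # 209715200 records at 512 KB
print(records_128k // records_512k)   # 4x fewer records at 512 KB
```

The 4x reduction in DDB records is the usual motivation for 512 KB on large cloud stores; the trade-off is that the aux copy to the new store effectively re-baselines, since no 128 KB signatures can be reused.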
Problem creating new storage copy - Internal Error. Incorrect parameter passed to the SIDB engine
Hello, I've set up a new Media Agent, v11 SP24.34, and installed the CV MA software. I've also created my MagLib. I've created a storage policy and my primary copy. What I want to do now is create a secondary copy. I've done that, but when I click OK I get an error: "Internal Error. Incorrect parameter passed to the SIDB engine". When I click OK on that error, I get another error: "Invalid library". My secondary copy does not get created. Has anyone experienced this before? Regards, Fergus
Copy data of first week in SSD disks and copy data from 2nd week to 4th week to NLSAS disks
Hi, we have data to back up with a retention period of 4 weeks. The challenge is the following:
- the data within the first week of the retention period must be copied to SSD disks
- the data from the 2nd week to the 4th week of retention must go to NL-SAS disks.
So the goal is to not keep the first week's data on the NL-SAS disks, to reduce the space used. Is there a way to reach this goal? Thanks. Regards,
PTL mount/unmount error during Oracle DB recovery
Hello. At the request of the client company, I performed a recovery of an Oracle DB backed up to PTL media, as follows:
1. Backed up the original DB to PTL media (SAN).
2. Restored the DB backed up to PTL to the recovery server (over the network, using the backup server's MA).
Different drives were used for backup and recovery. No issues occurred during the backup. The following message is output with a failed unmount error during the recovery process:
"The path is being accessed by another application. Advice: please make sure that no other device explorer application like SAN explorer is running on the machine."
The PTL devices are used by Commvault only. I would appreciate any advice on what to check to resolve the issue.
AuxCopy free space is less than primary
We have two Synology 24 TB storage libraries, one for the primary and the other at another site for the AuxCopy. The primary has Free Space 15.34 TB, Size on Disk 8.59 TB, Total Application Size 30.5 TB. The AuxCopy has Free Space 3.62 TB, Size on Disk 20.33 TB, Total Application Size 30.33 TB. Shouldn't the AuxCopy be an exact replica of the primary? Why would it be larger? Thanks for any suggestions. Larry
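The figures quoted already hint at the answer: the two copies hold roughly the same application data, but each copy uses its own deduplication store, and the stores are achieving very different savings. A quick check using the numbers from the post:

```python
# Quick arithmetic on the figures quoted above: deduplication ratio is
# application size divided by size on disk, computed per copy.

def dedup_ratio(app_size_tb: float, on_disk_tb: float) -> float:
    return round(app_size_tb / on_disk_tb, 2)

primary = dedup_ratio(30.5, 8.59)    # primary copy
aux = dedup_ratio(30.33, 20.33)      # AuxCopy

print(primary)  # 3.55 -> primary store dedupes roughly 3.5:1
print(aux)      # 1.49 -> aux store only about 1.5:1
```

So the AuxCopy is not a block-for-block replica; it is an independent store, and factors like a younger baseline, different sealing history, or pending data aging can leave it substantially larger for the same application data.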
Deduplication block level factor
Hello, in the documentation, the Deduplication Building Block Guide (commvault.com), it is mentioned that: "The DDBs created for Windows Media Agent should be formatted at 32 KB block size to reduce the impact of NTFS fragmentation over a time period." Is the 32 KB format referring to the block size of the disk volume itself (the NTFS allocation unit size), or to the "Block Level Deduplication Factor" parameter of the storage policy as shown below? Thanks
Seeding a Deduplicated Storage Policy
Has anyone ever transferred their initial baseline backup between two sites using a removable disk drive (Seeding a Deduplicated Storage Policy)? It looks like we may have to use this method to get a good Aux copy for some remote sites. How did it go? Did you have any issues? Thanks for your input.
Import media from catalogic app
Hi, do I have an option to import media from a Catalogic app? I have a customer that migrated to Commvault from Catalogic, and he wants to know if he can import the Catalogic tapes into a Commvault library. He has tapes with the last backup from Catalogic. I think he will have to maintain his old backup system, but I'm not 100% sure.
Disk Library Mount path space usage monitoring
Hi Team, is there any API or report with which we can monitor individual library mount paths? CV alerts cover the complete library, but I need to configure an alert specific to mount paths (coming from different Media Agents or servers) that are part of the same disk library. We receive the alert below when one or a few of the library's mount paths hit the reserved space limit:
"Failure Reason: Insufficient disk space. Available mount paths are not enabled for write or have met reserved space limit. Enable/add more mount paths or add more disk space to existing mount paths. Please check mount paths on the library."
We need to configure alerts at the mount path level so that we can disable the mount paths in advance. Regards, Mohit
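One hedged workaround while a native per-mount-path alert is pending: poll the mount path file systems directly from the Media Agent with the standard library. The paths and the 15% reserve threshold below are assumptions, not values from the environment in question:

```python
# Hedged sketch: OS-level free-space check per mount path, as a stand-in
# for a per-mount-path CV alert. MOUNT_PATHS and RESERVE_PCT are
# hypothetical and would need to match the actual library configuration.

import shutil

MOUNT_PATHS = ["/mnt/disklib/mp1", "/mnt/disklib/mp2"]  # hypothetical
RESERVE_PCT = 15.0  # alert when free space drops below this percentage

def needs_alert(free_bytes: int, total_bytes: int, reserve_pct: float) -> bool:
    """True when a mount path's free space is below the reserve threshold."""
    return (free_bytes / total_bytes) * 100 < reserve_pct

def check(paths: list[str]) -> list[str]:
    """Return the mount paths that should raise an alert."""
    alerts = []
    for path in paths:
        usage = shutil.disk_usage(path)
        if needs_alert(usage.free, usage.total, RESERVE_PCT):
            alerts.append(path)
    return alerts
```

A script like this could run on a schedule and feed whatever alerting channel is already in place; it does not replace a CV-side answer about mount-path-level alert criteria.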
Problem with media agent
Hello, I ran into a strange error today: "HP Ultrium 7-SCSI_2 - A SCSI command to the drive is stuck on the active drive controlling MediaAgent." I restarted the tape library, but the problem is not resolved. I assume I need to restart the MediaAgent, but I don't know exactly how. Can anyone help?
Verify DDB Reconstruction Jobs
Hello Community, I am new to Commvault. I am trying to check the status of a failed DDB Reconstruction job. I checked the storage policy, but I don't see the job that created the internal ticket:
Type: Job Management - DeDup DB Reconstruction
Detected Criteria: Job Started
Thanks.
Planning / sizing for cloud storage tiers and documentation question
Hi, our documentation around combined storage tiers in the cloud could be a bit misleading around sizing the Warm and Archive tier storage requirements. See the quoted passage below. What's the ideal ratio to size with (or do we just need to go 10% on the warm tier to be safe)? https://documentation.commvault.com/2022e/expert/139246_combined_storage_tier.html
"Backups: Commvault's combined storage tiers work by placing the Commvault metadata used in both deduplicated and non-deduplicated backups in the warmer or frequent-access tiers. This allows you to perform a simple Browse and Restore of the archival data, without the delay of a cloud archive storage recall. As a result, more than 90% of the backup data gets stored in the Archive tier, while only up to 10% of the data is stored in the warmer tiers. Deduplicated and non-deduplicated data is supported in Commvault combined storage tiers. For deduplicated data, the backup indexes and deduplication metadata are stored in the warmer tier. For non-deduplicated data, th…"
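Taking the documentation's "up to 10% warm / more than 90% archive" split at face value, the sizing arithmetic is straightforward. A minimal sketch, assuming a hypothetical 200 TB back-end figure for illustration (whether 10% is always a safe warm-tier ceiling is exactly the open question above):

```python
# Hedged sizing sketch using the 90/10 split quoted from the documentation
# above. The 200 TB total is an assumption for illustration only.

def combined_tier_split(total_tb: float, warm_fraction: float = 0.10):
    """Split a total back-end size into (warm_tb, archive_tb)."""
    warm = total_tb * warm_fraction
    return warm, total_tb - warm

warm, archive = combined_tier_split(200.0)
print(warm)     # 20.0 TB in the Hot/Cool (warm) tier
print(archive)  # 180.0 TB in the Archive tier
```

Sizing the warm tier at the full 10% ceiling is the conservative reading; the actual metadata fraction depends on dedup ratio and job mix, which is why a firmer ratio from the documentation would help.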
My customer is currently at 11.24.60, should I upgrade the environment to 2022E prior to start deploying the HSX Cluster ?
Hi, a quick question on the ISO 2.3 for reference architecture deployment, dvd_10072022_113351.iso. I don't know if I'm in the right place for this question, but: which FR is this ISO based on? FR 24? My customer is currently at 11.24.60; should I upgrade the environment to 2022E prior to starting to deploy the HSX cluster? I've seen a lot of new features for monitoring and securing nodes. Thank you,
Managing Pending Actions in Vault Tracker
There seems to be some question surrounding Vault Tracker and how to manage pending actions. This is the correct process for managing pending Vault Tracker actions: https://documentation.commvault.com/11.26/essential/111089_managing_pending_vault_tracker_actions.html Since some organizations are retaining their tape footprint for archival and for data protection against ransomware, Vault Tracker is an excellent tool for tape management. Dwayne