Solved

How to optimize deduplication to get the best savings %



Hello,

 

I have a question about deduplication savings. What is the best-practice setup/settings to get above 90% savings from dedup?

 

For example, I have a lot of VM backups where yesterday the savings were 90%, and the next day, on the same VM backup, they dropped below 90%.

 

What is the cause of bad dedup?

Best answer by Onno van den Berg (see the accepted reply below).

6 replies

Onno van den Berg · Commvault Certified Expert

@Egor Skepko there is not much to optimize from a Commvault perspective other than making sure you organize the data so that it is aligned correctly towards the DDB engines. The amount of savings depends on the amount of data that is eligible for deduplication. For example, if there is a lot of encrypted data, your savings will go down. The same goes for large databases with a high change rate.
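To illustrate that point, here is a small hypothetical Python sketch of block-level dedup (a toy model, not Commvault's actual DDB logic; the 128 KB block size and block counts are made-up example values). It shows why repetitive data deduplicates almost completely while encrypted or compressed data, which looks random, barely deduplicates at all:

```python
import hashlib
import os

BLOCK_SIZE = 128 * 1024  # illustrative block size only

def dedup_savings(blocks):
    """Fraction of blocks whose hash was already seen, i.e. blocks needing no new storage."""
    seen, duplicates = set(), 0
    for block in blocks:
        digest = hashlib.sha256(block).digest()
        if digest in seen:
            duplicates += 1
        else:
            seen.add(digest)
    return duplicates / len(blocks)

# Repetitive data (think identical OS files across many VMs) dedupes almost entirely...
repetitive = [b"A" * BLOCK_SIZE for _ in range(1000)]
# ...while encrypted/compressed data looks random, so virtually no block repeats.
random_like = [os.urandom(BLOCK_SIZE) for _ in range(1000)]

print(f"repetitive data:  {dedup_savings(repetitive):.0%} savings")   # ~100%
print(f"random-like data: {dedup_savings(random_like):.0%} savings")  # ~0%
```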


Egor Skepko (Author) · Byte · 164 replies · July 13, 2022

@Onno van den Berg Thanks Onno. Would a different backup method improve the low dedup ratio for those backups? For example, a file system backup instead of a VM backup?


Onno van den Berg · Commvault Certified Expert

It depends on the configuration. If you just switch to FS backup you will still back up the same data, so it only gets better once you start using separate subclients or filters, but that requires you to know where this data resides. I'm, however, curious to understand why you are so keen on getting a 90% or higher deduplication ratio. For me this would never be a key indicator, but something that is really nice to have, as it saves cost on storage and network bandwidth during backup.


Egor Skepko (Author) · Byte · 164 replies · July 13, 2022

@Onno van den Berg I am trying to do it to save some space on storage. I am seeing the dedup issue on clients that are 15 TB, where the backup size with bad dedup is around 4-6 TB.
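For reference, with those numbers the savings work out roughly as follows (assuming 15 TB is the front-end/application size and 4-6 TB is the data actually written to storage):

```python
# Savings % = 1 - (data written / application size), using the sizes mentioned above.
protected_tb = 15
for written_tb in (4, 6):
    savings = 1 - written_tb / protected_tb
    print(f"{written_tb} TB written from {protected_tb} TB protected -> {savings:.0%} savings")
# 4 TB written -> 73% savings; 6 TB written -> 60% savings, well below the 90% target.
```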


Onno van den Berg · Commvault Certified Expert · 1227 replies · Accepted answer · July 13, 2022

So what's up with these clients? Do they contain databases? If so, I would switch to database agent protection. Depending on the version you are running, it could result in less data being protected via VM backup, because Commvault is aware of the DB agent backup and thus excludes the database files from the VM backup. Do note this is limited to just a few supported agents. You could of course also exclude the volumes that contain the DB files; I do hope you have separated the database files from the rest.
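To see why a high-churn database volume drags the day-over-day ratio down (and why carving it out of the VM backup helps), here is a hypothetical sketch of two consecutive backups against the same toy block store. The 5% and 60% change rates are made-up example values, not measurements:

```python
import hashlib
import os
import random

BLOCK_SIZE = 128 * 1024  # toy block size for illustration only

def backup(blocks, store):
    """Hash each block into a shared store; return the fraction already present (the savings)."""
    dupes = 0
    for block in blocks:
        digest = hashlib.sha256(block).digest()
        if digest in store:
            dupes += 1
        else:
            store.add(digest)
    return dupes / len(blocks)

store = set()
day1 = [os.urandom(BLOCK_SIZE) for _ in range(1000)]
backup(day1, store)  # baseline backup: everything is new

# Day 2, low-churn data (e.g. OS and file data): only ~5% of blocks changed.
low_churn = [os.urandom(BLOCK_SIZE) if random.random() < 0.05 else b for b in day1]
print(f"low-churn volume:  {backup(low_churn, store):.0%} savings")   # ~95%

# Day 2, high-churn data (e.g. a busy database volume): ~60% of blocks changed.
high_churn = [os.urandom(BLOCK_SIZE) if random.random() < 0.60 else b for b in day1]
print(f"high-churn volume: {backup(high_churn, store):.0%} savings")  # ~40%
```

Excluding the high-churn volume from the VM subclient, or protecting it with a DB agent instead, keeps those poorly deduplicating blocks from pulling down the ratio of everything else in the backup.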


Egor Skepko (Author) · Byte · 164 replies · July 19, 2022

@Onno van den Berg Yes, we are separating SQL backups from VM backups as well.



