Solved

Backing up large files

  • 10 October 2023
  • 3 replies
  • 50 views

I have a subclient with a single 4.6 TB file that will only get larger. Since it's one file, I don't think additional data readers are going to help. I'm wondering if anyone has come across this challenge and has some tuning suggestions.

The client is Linux, and the filesystem holding this file uses 4K blocks. Based on that, I didn't think increasing network agents from 2 to 4 or increasing the application read size to, say, 4 MB would help, but I tried anyway and saw no improvement in performance during testing.

The target storage does dedup and compression, so both are currently turned off at the client level.
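To illustrate my thinking on the readers: without extent support, file-level multistreaming divides work per file, so a single file always lands on one reader no matter how many are configured. A rough Python sketch of that idea (purely illustrative; the function and numbers are hypothetical, not Commvault internals):

```python
# Toy model of per-file work division: whole files are assigned to readers,
# so a subclient containing one file can never use more than one stream.
from itertools import cycle

def assign_files_to_readers(files, num_readers):
    """Round-robin whole files across readers; a file is never split."""
    readers = {r: [] for r in range(num_readers)}
    for reader_id, path in zip(cycle(range(num_readers)), files):
        readers[reader_id].append(path)
    return readers

# One 4.6 TB file and 4 data readers: only reader 0 gets any work.
print(assign_files_to_readers(["/data/bigfile.img"], 4))
# {0: ['/data/bigfile.img'], 1: [], 2: [], 3: []}
```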

3 replies


Hi @elvizzi,

Did you try extent-based backups?

https://documentation.commvault.com/2023e/expert/130822_optimized_backups_using_extent_based_technology.html
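For a rough picture of what extent-based backups change: the single large file is carved into fixed-size extents, and multiple data readers can then stream different extents in parallel. A minimal Python sketch of that idea; the 64 MB extent size, reader count, and function names are illustrative assumptions, not Commvault's actual implementation:

```python
# Illustrative sketch: split one large file into extents and read them in
# parallel with a pool of workers, standing in for multiple data readers.
import os
from concurrent.futures import ThreadPoolExecutor

EXTENT_SIZE = 64 * 1024 * 1024  # hypothetical extent size

def read_extent(path, offset, length):
    # Read one extent; a real backup would hand this buffer off to its stream.
    with open(path, "rb") as f:
        f.seek(offset)
        return offset, f.read(length)

def backup_file_by_extents(path, num_readers=4):
    size = os.path.getsize(path)
    with ThreadPoolExecutor(max_workers=num_readers) as pool:
        futures = [pool.submit(read_extent, path, off, EXTENT_SIZE)
                   for off in range(0, size, EXTENT_SIZE)]
        for fut in futures:
            offset, data = fut.result()
            # ...send the extent to the media agent / storage target here...
```

This is also why adding a data reader only pays off once extents are in play: the extra reader finally has its own chunks of the file to work on.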

 

Thanks,

Sunil


Thanks! I was unaware of that setting and the change in behavior at our version. Extent-based backups (bEnableFileExtentBackup) were already enabled, and I simply had to add an additional data reader, which indeed increased throughput.


Great! Glad to know it helped.

 

Thanks,

Sunil
