Solved

Backing up large files



I have a subclient with a single 4.6TB file that will only get larger. Since it’s one file, I don’t think additional data readers are going to help. I’m wondering if anyone has come across this challenge and has some tuning suggestions.

The client is Linux, and the filesystem hosting this file uses 4K blocks. Based on that, I didn’t think increasing network agents from 2 to 4 or increasing the application read size to, say, 4MB would help, but I tried anyway and saw no increase in performance during testing.

The target storage does dedup and compression, so both are currently turned off at the client level.
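
For reference, the block size the filesystem reports can be confirmed with a short Python sketch like the one below (the path is a hypothetical stand-in for the 4.6TB file); os.stat gives the preferred I/O size for the file itself and os.statvfs the block size of the underlying filesystem.

```python
import os

# Hypothetical path standing in for the single 4.6TB file discussed above.
BIG_FILE = "/data/bigfile.dat"

# Preferred I/O block size reported for this particular file.
st = os.stat(BIG_FILE)
print(f"Preferred I/O block size: {st.st_blksize} bytes")

# Block size of the filesystem holding the file.
vfs = os.statvfs(BIG_FILE)
print(f"Filesystem block size:    {vfs.f_bsize} bytes")
```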

Best answer by Sunil

Hi @elvizzi,

Did you try extent-based backups?

https://documentation.commvault.com/2023e/expert/130822_optimized_backups_using_extent_based_technology.html

 

Thanks,

Sunil
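
The linked page covers the product-side configuration; conceptually, extent-based backup divides one large file into fixed-size extents so that several data readers can work on the same file in parallel. The Python sketch below is not Commvault code, and the extent size, reader count, and path are made-up values; it only illustrates why adding readers starts to pay off once a single file can be split this way.

```python
import os
from concurrent.futures import ThreadPoolExecutor

# Illustrative values only; the real extent size and reader counts are
# controlled by the backup product, not by this script.
EXTENT_SIZE = 256 * 1024 * 1024   # 256 MiB extents (hypothetical)
DATA_READERS = 4                  # analogous to subclient data readers
BIG_FILE = "/data/bigfile.dat"    # hypothetical path

def read_extent(offset: int, length: int) -> int:
    """Read one extent of the file and return the number of bytes read."""
    with open(BIG_FILE, "rb") as f:
        f.seek(offset)
        return len(f.read(length))

def backup_by_extents() -> int:
    size = os.path.getsize(BIG_FILE)
    # Split the single large file into independent extents.
    extents = [(off, min(EXTENT_SIZE, size - off))
               for off in range(0, size, EXTENT_SIZE)]
    # Each "data reader" processes extents in parallel, so throughput can
    # scale with readers even though there is only one file.
    with ThreadPoolExecutor(max_workers=DATA_READERS) as pool:
        return sum(pool.map(lambda e: read_extent(*e), extents))

if __name__ == "__main__":
    print(f"Read {backup_by_extents()} bytes using {DATA_READERS} readers")
```

In this toy model, raising DATA_READERS plays the same role as adding data readers to the subclient once extents are in play, which matches the outcome reported in the reply below.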


3 replies

Sunil
  • Vaulter
  • Answer
  • October 10, 2023

elvizzi
  • Author
  • Byte
  • October 10, 2023

Thanks! I was unaware of that setting and the change in behavior in our version. Extent-based backups (bEnableFileExtentBackup) were already enabled, and I simply had to add an additional data reader, which indeed increased throughput.


Sunil
  • Vaulter
  • October 11, 2023

Great! Glad to know it helped.

 

Thanks,

Sunil



