Solved

System Created DDB Verification schedule policy - use of streams



Hi there,

 

Is it safe to limit the number of streams used in parallel by the system-created DDB Verification schedule policy? I ask because in our case this verification policy consumes all 30 of our 30 available streams. The policy is set to use the maximum number of streams, so where does the limit of 30 come from?

Secondly, I would like to discuss the Data Verification options. Which option do you prefer? Would it be enough to run only "Verification of Deduplication Database" instead of "Verification of existing jobs on disk and deduplication database"?

 

 

Best answer by Laurent


3 replies

  • Byte
  • 386 replies
  • Answer
  • July 22, 2021

Unless you have other jobs running in parallel against the same storage and storage policies at the time the verification job executes, you can keep the default setting of using the maximum number of streams.

There are many (perhaps too many) places where stream limits can be defined: storage/mount paths, storage policies, storage policy copies, and so on. The effective maximum is the lowest limit along that chain, which is likely why you see a cap of 30 even with "use maximum number of streams" selected.

I recommend scheduling an incremental verification of both jobs on disk and the DDB, because that is the way to make sure your new blocks are usable for copies, restores, and future backups that reference the same signatures.

You should schedule such data verification as soon as you create a DDB: the longer you wait, the more data there is to verify, and that can take a very long time depending on many things, such as storage performance, data volume, and your MediaAgent's resources.
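To illustrate how stacked stream limits interact, here is a small sketch. This is purely hypothetical (the names, values, and function are made up for the example, not Commvault's actual allocation logic): the number of streams a job can actually use is bounded by the most restrictive limit configured at any layer.

```python
# Hypothetical illustration: the effective number of parallel streams is
# capped by the most restrictive limit along the configuration chain.
stream_limits = {
    "mount_path": 50,           # example limit on the mount path
    "storage_policy": 30,       # example limit on the storage policy
    "storage_policy_copy": 40,  # example limit on the policy copy
}

def effective_streams(limits, requested=None):
    """Return the stream count a job can actually use.

    `requested=None` models "use maximum number of streams":
    the job takes whatever the tightest configured limit allows.
    """
    cap = min(limits.values())
    return cap if requested is None else min(cap, requested)

print(effective_streams(stream_limits))                # -> 30
print(effective_streams(stream_limits, requested=10))  # -> 10
```

With "use maximum" selected, the job still tops out at 30 here because the storage policy is the tightest limit in the chain; explicitly requesting fewer streams only lowers that further.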


  • Author
  • Byte
  • 111 replies
  • July 23, 2021


@Laurent Thank you very much for the explanation!


  • Byte
  • 386 replies
  • July 23, 2021

@drPhil Thank you too, glad to help :wink:



