
System Created DDB Verification schedule policy - use of streams

  • July 22, 2021
  • 3 replies
  • 445 views


Hi there,

 

Is it safe to limit the number of streams used in parallel by the System Created DDB Verification schedule policy? I am asking because, in our case, this verification policy consumes 30 out of 30 streams. The policy is set to use the maximum number of streams, so why is it capped at 30?

Secondly, I would like to discuss the Data Verification options. Which option do you prefer? Would it be enough to use only Verification of Deduplication Database instead of Verification of existing jobs on disk and deduplication database?

 

 

Best answer by Laurent

Unless other parallel jobs are using the same storage and storage policies at the time the verification job runs, you can leave the default setting to use the maximum.

There are many (perhaps too many) places where stream limits can be defined: storage/mount paths, storage policies, storage policy copies, and so on.
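As a rough illustration of why a job set to "use maximum" can still end up with only 30 streams, the sketch below models the effective stream count as the minimum across all configured limits. The level names and the min-based resolution shown here are assumptions for illustration, not Commvault's documented precedence rules:

```python
# Illustrative sketch (not Commvault's actual algorithm): when stream
# limits exist at several levels -- mount path, storage policy, storage
# policy copy -- a job's parallelism is capped by the most restrictive one.

def effective_streams(requested: int, limits: dict[str, int]) -> int:
    """Return the stream count a job actually gets: the requested
    number, capped by every configured limit along the data path."""
    return min(requested, *limits.values())

# A verification job set to "use maximum" (say, 100 streams) still
# runs with 30 streams if any level caps it at 30.
caps = {"mount_path": 50, "storage_policy": 30, "policy_copy": 40}
print(effective_streams(100, caps))  # -> 30
```

So if the job always lands on exactly 30 streams, it is worth checking each of these levels for a limit of 30 rather than assuming the schedule policy itself is the cap.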

I recommend scheduling an incremental verification of both jobs on disk and the DDB, because that is the way to make sure your new blocks are usable for copies, restores, and subsequent backups that reuse the same signatures.

You should always schedule such data verification as soon as you create a DDB: the longer you wait, the more data has to be verified, and that can take a very long time depending on factors such as storage performance, data volumes, and your MediaAgent's resources.
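A back-of-the-envelope sketch of why postponing the first verification hurts; the growth rate and verification throughput used below are made-up assumptions, not measurements:

```python
# Sketch: the longer verification is postponed, the larger the
# unverified backlog, so the longer the first run takes.

def first_run_hours(daily_new_tb: float, days_waited: int,
                    verify_tb_per_hour: float) -> float:
    """Hours the first verification run needs to clear the backlog
    accumulated while verification was not scheduled."""
    backlog_tb = daily_new_tb * days_waited
    return backlog_tb / verify_tb_per_hour

# Starting verification on day one keeps each run small; waiting a
# year at 1 TB/day of new blocks, with 2 TB/h verification throughput:
print(first_run_hours(1.0, 365, 2.0))  # -> 182.5 hours
```

The exact numbers will vary widely with storage performance and MediaAgent resources, but the linear growth of the backlog is the point: scheduling verification early keeps every run short.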

3 replies


  • Author
  • Byte
  • July 23, 2021



@Laurent Thank you very much for the explanation!


  • Byte
  • July 23, 2021

@drPhil Thank you too, glad to help :wink: