Windows media agent BoostFS to DD6900

  • 7 December 2022
  • 3 replies


Is anyone using a Data Domain (we are using a DD6900)? It's attached to a Windows media agent using BoostFS. It's new, and we started doing backups about two weeks ago. All the backups completed successfully. Then we tried to do a tape aux copy and started getting data verification errors on some of the jobs. I'd like to compare settings with someone because I can't figure out what's causing it.


Best answer by Collin Harper 7 December 2022, 19:50




@gibby101 , can you share some of the log excerpts with the errors?  I can search our internal DB for some info.


Thank you. We got an email saying two jobs failed data verification.

Here are the logs from the job:

Failed to Copy or verify Chunk [12872086] in media [CV_MAGNETIC], Storage Policy [30 Day to Data Domain], Copy [Primary], Host [], Path [W:\\EYRY1U_11.10.2022_21.34\CV_MAGNETIC\V_303989], File Number [407], Backup Jobs [ 573826]. Data Integrity validation failed for the data read from media. Data Integrity validation failed for the data read from media.

I ran a restore for this job. It's a virtual subclient with 14 VMs. It got 7 VMs restored, then this error popped up:

Data Integrity validation failed for the data read from media.
Source: nrc3712, Process: cvd


The job never stopped running; it just slowed down and stopped making progress.


There are a couple of other jobs that failed data verification too. I've only seen this since we started backing up to the BoostFS mount two weeks ago.




Hello @gibby101 

We see this happen when environmental factors are at play, usually antivirus, hardware, or OS-related issues. Commvault computes a CRC when we write the data to make sure that what we needed to write was actually written.
Once it's been written, if some environmental factor corrupts the data, we encounter these read errors. Even though the backup was successful, something has affected the data and it can no longer be read.
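To illustrate the mechanism (not Commvault's actual implementation, just a minimal sketch): a CRC is stored alongside each chunk at write time, and re-computed at read time. If anything corrupts the stored bytes after a successful write, the read-side check fails even though the original backup reported success. The `store` dict, function names, and chunk ID below are all hypothetical.

```python
import zlib

def write_chunk(store: dict, chunk_id: int, data: bytes) -> None:
    """Store the chunk along with a CRC32 computed at write time."""
    store[chunk_id] = (data, zlib.crc32(data))

def read_chunk(store: dict, chunk_id: int) -> bytes:
    """Re-compute the CRC on read; a mismatch means the data was corrupted after writing."""
    data, crc = store[chunk_id]
    if zlib.crc32(data) != crc:
        raise IOError(f"Data Integrity validation failed for chunk {chunk_id}")
    return data

store = {}
write_chunk(store, 12872086, b"backup payload")
assert read_chunk(store, 12872086) == b"backup payload"  # write + read back cleanly

# Simulate environmental corruption of the stored bytes after a successful write:
data, crc = store[12872086]
store[12872086] = (b"backup pAyload", crc)  # one flipped byte, CRC unchanged
try:
    read_chunk(store, 12872086)
except IOError as e:
    print(e)  # the read-side CRC check catches the corruption
```

This is why the backup job can complete successfully and yet the aux copy or restore fails later: the validation error surfaces only when the corrupted chunk is read back.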


I would recommend:

  • Performing a Full DDB Verification against the DDB associated with these jobs, then deleting any jobs that fail verification.

See: Performing a Data Verification Operation on Deduplicated Data

Thank you,