Solved

Backup job for empty DDB completed immediately as none of the stores are eligible for backup

  • 18 July 2022
  • 6 replies
  • 480 views


Hi Guys,

 

Hope you’re doing well!

We have some DDB backup jobs that complete with errors.

The job lasts only 1 second. In the events, it shows the following message:

“job completed immediately as none of the stores are eligible for backup”

 

Do you have any idea what could be causing this?

Thanks a lot for your usual help.

 

Best Regards


Best answer by Matt Medvedeff 18 July 2022, 16:02


6 replies


Good morning. In the properties of the DDB, can you please verify whether the DDB is configured to seal and start a new DDB automatically when there is an issue, or to pause and recover the current DDB?

 

 


Hi @Orazan ,

 

In the DDB engine properties, it shows: Pause and Recover current DDB.

In “Deduplication Database creation”, “Create new DDB every” is selected and set to 7.

 

Sealing is enabled since we use WORM on our cloud storage.

Regards

 


Hi @Adel BOUKHATEM 

Are the DDBs that are having this issue active? Check under Storage Resources → Deduplication Engines → find the DDBs in question:

Do they actually show metrics in the right-hand pane in the Console, like size on disk, primary blocks, number of jobs, etc.?

If a DDB isn’t managing any existing data, then there’s nothing to back up, which can cause the behavior you are seeing.


Hi @Matt Medvedeff ,

 

Here is what I found when checking the metrics for the DDB engine:

It seems that the DDB is empty for now, I guess because the DDB is sealed every 7 days.

Does that explain the behavior?

Regards


Yes, this is expected behavior for an empty DDB. If you complete some backup jobs to a storage policy associated with this DDB, you will be able to run a DDB backup.
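To illustrate why this happens, here is a minimal Python sketch of the gating logic described above: a store that manages no data is simply not eligible, so a backup job over only-empty stores finishes immediately. This is not Commvault’s actual implementation; all names and fields are hypothetical stand-ins for the metrics visible in the Console (primary blocks, size on disk).

```python
from dataclasses import dataclass

@dataclass
class DDBStore:
    """Hypothetical summary of a deduplication store's metrics."""
    name: str
    primary_blocks: int    # unique blocks the store currently manages
    size_on_disk_mb: int

def eligible_for_backup(store: DDBStore) -> bool:
    """A store managing no data has nothing to protect, so it is skipped."""
    return store.primary_blocks > 0

def run_ddb_backup(stores: list[DDBStore]) -> str:
    eligible = [s for s in stores if eligible_for_backup(s)]
    if not eligible:
        # Mirrors the event message seen in the thread.
        return "job completed immediately as none of the stores are eligible for backup"
    return f"backed up {len(eligible)} store(s)"
```

Once backup jobs land data into the DDB (primary blocks > 0), the same job would find an eligible store and run normally.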

Let me know if this answers your question. Thanks!

 

 


Hi @Matt Medvedeff ,

 

Thanks a lot for your help.

 

Best Regards

Reply