I think the answer’s probably no, but I feel like I need to ask this anyway:

We recently discovered that a content filter was missed on one subclient while configuring the Commvault agent on a database server, which caused the file system agent to back up an extra ~1.5TB of non-deduped data nightly for several months.  New nightly backups shrank considerably after we caught the oversight and set the appropriate filter, but not before Commvault wrote around 120TB of redundant Oracle database backups to disk that won’t begin to age out until sometime next year.

I know we can go in and manually expire those jobs to immediately free up space on our long-term storage, but is there a way to apply the content filter to that subclient retroactively so that we can expire just the 120TB or so that should have been excluded from the start, while keeping everything else?

Hello @John Hind 

I believe you will be able to achieve this using the DeleteDataFromBackup workflow. However, I would strongly recommend testing it on a sample client first, and if you have any doubts or concerns, raising a case with us:

https://documentation.commvault.com/2023e/expert/137363_deleting_backup_data_and_archive_data_using_deletedatafrombackup_workflow.html
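If you end up scripting the cleanup rather than launching the workflow from the CommCell Console, a rough sketch with the open-source cvpysdk Python SDK could look like the following. Treat it as an assumption-laden outline only: the hostname, credentials, client/subclient names, paths, and every workflow input key are placeholders, and the real input names are defined by the workflow's own form, so confirm them (and the cvpysdk workflow helpers) in your environment before running anything.

```python
# Rough sketch, not a tested procedure: run the DeleteDataFromBackup workflow
# via the cvpysdk SDK. All names, credentials and input keys below are
# placeholders -- the actual inputs are defined by the workflow form in your
# CommCell, so verify them there (and test on a sample client) first.
from cvpysdk.commcell import Commcell

commcell = Commcell('commserve.example.com', 'backup_admin', 'password')

# Fetch the workflow by the name used in the linked documentation.
workflow = commcell.workflows.get('DeleteDataFromBackup')

# Hypothetical inputs: point the workflow at the subclient and the paths
# that should have been filtered from the start.
workflow.execute_workflow({
    'clientName': 'oracle-db-01',
    'subclientName': 'default',
    'paths': '/u01/app/oracle/oradata',
})

commcell.logout()
```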

Secondly, to make sure this data is filtered out of all backups going forward, you can use global filters and add the files/folders you need to exclude:

https://documentation.commvault.com/2023e/expert/12734_filter_data_from_all_backups.html
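Global filters themselves are configured at the CommCell level as described in that link. If you also want to script or audit the per-subclient filter that was missed (for example, to check other database servers for the same oversight), a minimal sketch with cvpysdk might look like the one below. Every hostname, credential, client/backupset/subclient name and path is a placeholder, and it's worth confirming the filter_content property against your SDK version.

```python
# Minimal sketch, assuming the cvpysdk SDK: set a content filter on a file
# system subclient. Hostnames, credentials, client / backupset / subclient
# names and paths are all placeholders.
from cvpysdk.commcell import Commcell

commcell = Commcell('commserve.example.com', 'backup_admin', 'password')

subclient = (
    commcell.clients.get('oracle-db-01')
    .agents.get('File System')
    .backupsets.get('defaultBackupSet')
    .subclients.get('default')
)

# Assigning filter_content replaces the subclient's existing filter list,
# so include any filters that are already configured as well.
subclient.filter_content = [
    '/u01/app/oracle/oradata',   # placeholder Oracle datafile path
    '/u02/oraarch',              # placeholder archive log path
]

commcell.logout()
```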

Best,

Rajiv Singal

