
Good morning.   

 

Our SQL administrators back up their databases via a tool that produces flat-file backups to a share. We then back up that share using CV. I am wondering whether there is a canned script or set of regular-expression “filters” we can have CV use to grab only the files that have been created or modified within a certain window, say the last 72 hours, for example. That way, when we run a full, or synthetic full, against that share, our capacity license impact will be reduced.

 

@Lynn Kearns Depending on retention settings, your capacity license usage would be reduced as data is aged off. However, a script, as @Damian Andre noted, would probably be the best method.

It might also be worth looking into the SQL disk caching method for backup. This leverages native SQL dumps but ingests them into Commvault, so you can view backup history, and it helps manage space as well.

https://documentation.commvault.com/11.24/expert/114691_disk_caching_for_frequent_log_backups.html


If you are handy with scripting, there is a concept called an on-demand backupset. This is designed so you can tell Commvault exactly what you want to back up - useful for a situation when you need to be very specific about what gets protected. It uses a directive file, passed as an input parameter for the job, that lists out the content you want protected.

Using a simple batch file, VBS or PowerShell script, you could easily list the files modified within the last x hours or days, and then build this file to direct Commvault what to back up (see the sketch below the documentation link).

https://documentation.commvault.com/11.24/expert/62541_content_file_format_for_on_demand_backups.html
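
As an illustration only, here is a minimal PowerShell sketch of that idea. The share path, directive-file path and 72-hour window are hypothetical, and the directive file is assumed to simply list one full path per line - check the content-file format documentation above for the exact syntax your version expects.

# Hypothetical paths - adjust for your environment.
$SourceShare   = '\\backupshare\sqldumps'    # share holding the SQL flat-file dumps
$DirectiveFile = 'C:\Scripts\cv_content.txt' # content file handed to the on-demand backup job
$WindowHours   = 72                          # only include files modified within this window

$cutoff = (Get-Date).AddHours(-$WindowHours)

# Collect every file under the share modified after the cutoff
# and write its full path, one per line, to the directive file.
Get-ChildItem -Path $SourceShare -Recurse -File |
    Where-Object { $_.LastWriteTime -ge $cutoff } |
    Select-Object -ExpandProperty FullName |
    Set-Content -Path $DirectiveFile -Encoding ASCII

You would then point the on-demand backup job at that directive file as its content file.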

From memory, on-demand backupsets only support traditional full backups - how this is calculated for capacity licensing I am not sure.

Beyond this, you could maybe look at filesystem archiving with the cleanup rules set to delete any file older than x (I think days is the minimum) - but honestly, these solutions all come with an element of risk, since you are circumventing a backup that is working well in order to, let's say, optimize capacity usage :)


Could you set your Storage Policy to retain only 3 days’ worth of data? Assuming your DBAs are creating the flat files on a regular schedule, your amount of data should stay consistent. Set your storage policy to retain 3 days’ worth of backups, and anything older than that will just age off.


@Lynn Kearns, welcome to our community! Thank you for joining and sharing with us!

I have a question for you.

Is the data (flat files) within the share staying there forever once created, or is there a script outside of CV that prunes off old copies?

An Incremental will grab the new/changed files since the last backup, but there’s no simple way to say ‘only grab this file if it changed between last Tuesday and now’.

What I would suggest is using scripts to adjust what is on the share. You can create a post-backup script to move the content to another folder (that you can purge as needed), or to delete the contents altogether once the backup finishes; a rough sketch of that follows.
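
As a rough illustration only, here is a minimal PowerShell sketch of the move-then-purge approach. The share path, archive folder and retention window are hypothetical; you would register it as a post-backup script on the subclient so it runs only after the backup completes.

# Hypothetical paths and retention - adjust for your environment.
$SourceShare    = '\\backupshare\sqldumps'     # share Commvault has just backed up
$ArchiveFolder  = '\\backupshare\sqldumps_old' # holding area that is not part of the subclient content
$PurgeAfterDays = 7                            # delete archived copies older than this

# Move the files the backup just protected out of the active share.
Get-ChildItem -Path $SourceShare -File |
    Move-Item -Destination $ArchiveFolder -Force

# Purge archived copies once they are older than the retention window.
$cutoff = (Get-Date).AddDays(-$PurgeAfterDays)
Get-ChildItem -Path $ArchiveFolder -File |
    Where-Object { $_.LastWriteTime -lt $cutoff } |
    Remove-Item -Force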

