Solved

Is there a way to run a full Backup that only captures files modified or created during a certain time period?

  • 16 November 2021
  • 4 replies
  • 325 views

Badge +2

Good morning.

Our SQL administrators save their DBs via a tool that produces flat-file backups to a share, which we then back up using CV. I am wondering whether there is a canned script or a set of regular-expression “filters” we could have CV use to grab only the files created or modified within a certain window, say the last 72 hours. That way, when we run a full or synthetic full against that share, our capacity-license impact would be reduced.


Best answer by Scott Reynolds 17 November 2021, 17:43


4 replies

Userlevel 7
Badge +23

@Lynn Kearns, welcome to our community! Thank you for joining and sharing with us!

I have a question for you.

Is the data (flat files) within the share staying there forever once created, or is there a script outside of CV that prunes off old copies?

An Incremental will grab the new and changed files since the last backup, but there’s no simple way to say “only grab this file if it changed between last Tuesday and now.”

What I would suggest is using scripts to adjust what is on the share. You can create a post-backup script that moves the content to another folder (which you can purge as needed), or that deletes the contents altogether once the backup finishes.
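As a minimal PowerShell sketch of that post-backup step, assuming the dumps land in a single flat folder (the paths and the seven-day purge window here are hypothetical placeholders):

```powershell
# Post-backup cleanup sketch: move the just-protected dumps out of the
# backed-up share into an archive folder, then purge archived copies
# older than seven days. Paths and the window are placeholders.
$shareRoot = '\\fileserver\sqldumps'          # the share CV backs up (hypothetical)
$archive   = '\\fileserver\sqldumps_archive'  # staging folder to purge later

# Move everything that was just backed up out of the live share.
Get-ChildItem -Path $shareRoot -File |
    Move-Item -Destination $archive -Force

# Purge archived dumps older than seven days.
Get-ChildItem -Path $archive -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-7) } |
    Remove-Item -Force
```

Either way, the next full only sees what is still sitting on the share.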

Badge +2

Could you set your Storage Policy to retain only three days’ worth of data? Assuming your DBAs are creating the flat files on a regular schedule, your amount of data should stay consistent. Set the storage policy to retain three days’ worth of backups, and anything older than that will just age off.

Userlevel 7
Badge +23

If you are handy with scripting, there is a concept called an on-demand backupset. It is designed so you can tell Commvault exactly what you want to back up, which is useful when you need to be very specific about what gets protected. It uses a directive file that lists the content you want protected as an input parameter for the job.

Using a simple batch file, VBScript, or PowerShell script, you could easily list the files modified within the last x hours or days, then build this file to direct Commvault what to back up.

https://documentation.commvault.com/11.24/expert/62541_content_file_format_for_on_demand_backups.html
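As a rough PowerShell sketch of that idea, assuming the content file accepts one full path per line (the linked documentation is authoritative on the exact format, and the share path here is hypothetical):

```powershell
# Sketch: build an on-demand content/directive file listing every file
# modified in the last 72 hours. Assumes one full path per line is
# acceptable -- verify against the content-file format documentation above.
$shareRoot   = '\\fileserver\sqldumps'     # hypothetical share path
$cutoff      = (Get-Date).AddHours(-72)    # the 72-hour window from the question
$contentFile = 'C:\scripts\ondemand_content.txt'

Get-ChildItem -Path $shareRoot -Recurse -File |
    Where-Object { $_.LastWriteTime -ge $cutoff } |
    Select-Object -ExpandProperty FullName |
    Set-Content -Path $contentFile

# Point the on-demand backup job at $contentFile as its content file.
```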

From memory, on-demand backupsets only support traditional full backups; how that is calculated for capacity licensing, I am not sure.

Beyond this, you could maybe look at filesystem archiving with the cleanup rules set to delete any file older than x (I think days is the minimum), but honestly these solutions all come with an element of risk, since you are circumventing a well-working backup to, let’s say, optimize capacity usage :)

Userlevel 6
Badge +14

@Lynn Kearns Depending on your retention settings, your capacity license usage would be reduced as data is aged off. However, a script as @Damian Andre noted would probably be the best method.

It might also be worth looking into the SQL disk caching method for backups. This leverages native SQL dumps but ingests them into Commvault, so you can view backup history, and it helps manage space as well.

https://documentation.commvault.com/11.24/expert/114691_disk_caching_for_frequent_log_backups.html
