
Hi,

I was wondering how people schedule backups (file system backups, for example).

Do you only run incrementals and synthetic fulls, or do you sometimes run a “real” full?

For example: one “real” full a month, a daily incremental, and a weekly synthetic full.
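Just to make that rotation concrete, here is a small illustrative sketch (plain Python, not Commvault scheduling syntax; the rotation rules are the example above, the function name is made up):

```python
from datetime import date

def job_type(d: date) -> str:
    """Pick the backup type for a given day under the example rotation:
    a "real" full on the 1st of each month, a synthetic full every
    Sunday, and an incremental on all other days."""
    if d.day == 1:
        return "full"            # monthly "real" full re-reads the client
    if d.weekday() == 6:         # Sunday
        return "synthetic_full"  # built from existing backups
    return "incremental"         # only data changed since the last backup

# job_type(date(2024, 5, 1)) -> "full"
# job_type(date(2024, 5, 5)) -> "synthetic_full" (a Sunday)
# job_type(date(2024, 5, 2)) -> "incremental"
```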

I’ve always heard never to “only” use incrementals and synthetic fulls, and to slip in a “real” full backup once in a while. What would be the reasoning behind it?

Looking forward to hearing your feedback.

@Jeremy 

The first backup will always be a FULL backup, as it scans all the files and backs them up.

 

A full backup contains all the data in the subclient content. If a client computer has multiple agents installed, then the subclients of each agent require a full backup in order to secure all of the data on that client. Backups can also be performed at the backup set or instance level, and will apply to all of the subclients within the selected backup set or instance.

An incremental backup contains only data that is new or has changed since the last backup, regardless of the type. On average, incremental backups consume far less media and place less of a burden on resources than full backups.

Synthetic full backups consolidate the data from the latest full backup or synthetic full backup together with any subsequent incremental backups, instead of reading and backing up data directly from the client computer. Since synthetic full backups do not back up data from the client computer, this operation imposes no load on the client computer.

A synthetic FULL will also save disk space.

Synthetic full backups do not back up data from the client computer directly; instead, they use the list of the latest objects from the previous backups to build a new backup image. A synthetic full therefore incorporates the latest full or synthetic full with any incremental backups that followed. Because it does not read data directly from the client computers, it decreases the load on your production environment.

 

A full backup is the baseline, or starting point, for a client. To run an incremental backup, a full backup must first have been performed. This backup type contains all the data in the subclient and is read directly from the client computer.
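To make the consolidation step concrete, here is a minimal sketch (plain Python with hypothetical data structures, not Commvault internals) of how a synthetic full merges the latest full with the incrementals that followed, without touching the client:

```python
def synthetic_full(last_full: dict, incrementals: list) -> dict:
    """Build a new full image from existing backups only.

    last_full:    {path: object_version} from the latest (synthetic) full
    incrementals: later backups, each {path: object_version}, oldest first

    No client data is read: we only merge lists of already-backed-up
    objects, which is why the operation imposes no load on the client.
    """
    image = dict(last_full)
    for inc in incrementals:
        image.update(inc)  # newer versions win
    return image

full = {"/etc/hosts": "v1", "/home/a.txt": "v1"}
incs = [{"/home/a.txt": "v2"}, {"/home/b.txt": "v1"}]
new_full = synthetic_full(full, incs)
# new_full == {"/etc/hosts": "v1", "/home/a.txt": "v2", "/home/b.txt": "v1"}
```

The result can then serve as the baseline for the next cycle, which is why only incrementals ever need to read the client after the initial full.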

https://documentation.commvault.com/11.24/expert/11694_synthetic_full_backups.html#advantages-of-synthetic-full-backups-over-full-backups

 

Advantages of Synthetic Full Backups Over Full Backups

Synthetic full backups have the following advantages over full backups:

  • They impose a lighter load on the production environment because they are created on the backup repository.

  • They have the ability to carry forward older or deleted versions of the objects backed up during the previous backup cycles.


Hello @Navneet Singh 

Thanks for this info! I already knew most of it 🙂. What I actually wanted to know is whether it’s best practice to first run a “full” backup (to create a baseline) and afterwards run only incremental and synthetic full backups. Will this pose issues down the line? I’m talking years into the future.

The reason I’m asking is that I’m having an issue with one of my file system backups: the incremental backups take ages to complete, yet they are only a couple of GB in size and contain only about 90,000 files, which isn’t that much. We ran a “real” full backup to create the baseline months ago, but since then we’ve only run incremental and synthetic full backups. The scan phase of the incremental takes about 95% of the total backup duration, and 90% of the load is DDB lookups. I’m wondering whether the incremental/synthetic full schedule is the reason behind it. We might need to trigger a new real full backup, but I wanted to hear from the community first.


@Jeremy 

The first backup should be a FULL backup and then you can run only Synth FULL and incremental.

It’s not mandatory to run a FULL backup monthly.


Hello @Jeremy,

 

If you create a plan, there is no notion of running a ‘regular’ FULL anymore; you only run incrementals and synthetic (DASH, with deduplication enabled) fulls.

 

This pattern has been the main configuration for years now with Commvault and indeed should impose the least amount of stress on the source system.

 

If your incrementals are running long, or unexpectedly long, I’d suggest investigating whether the file system journal is working as expected. Commvault uses the journal to find the files changed since the last protection job; if that fails, it falls back to a CRC or recursive scan of all files to ascertain which ones were changed.

As you can imagine, the latter is a long and annoying process.
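The difference between the two scan strategies can be sketched roughly like this (illustrative Python only; the journal format and checksum fallback here are hypothetical stand-ins for what the file system and agent actually do):

```python
import hashlib
import os

def changed_files_from_journal(journal: list) -> list:
    # Fast path: the journal already lists the paths modified since the
    # last job, so cost is proportional to the number of changed files.
    return list(journal)

def changed_files_by_scan(root: str, known_checksums: dict) -> list:
    # Slow fallback: walk every file under root and checksum it, so cost
    # is proportional to the total number of files and bytes on disk.
    changed = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            with open(path, "rb") as f:
                digest = hashlib.md5(f.read()).hexdigest()
            if known_checksums.get(path) != digest:
                changed.append(path)
    return changed
```

With a few changed files out of millions, the journal path examines only those few, while the recursive scan still has to read and hash everything, which is why a broken journal makes the scan phase dominate the job.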

 

If you suspect there is an issue or cannot find a reason, the best course is to ask Commvault support to look into it. They have seen lots of similar cases and have a huge collection of previous cases they can search through to find possible causes.

 

Coming back to the choice between a regular FULL and a synthetic FULL, and beyond removing the burden placed on client systems: there is another reason not to want a regular FULL, but only if you use advanced features like “keep the last X versions of files”.

Those advanced features only work correctly with synthetic fulls (carry forward) and are reset by a regular FULL, which might not be what you want.
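A toy illustration of that reset (hypothetical version lists, not Commvault’s actual retention logic): a synthetic full carries forward the retained versions of each object, while a regular full re-reads the client and restarts each file’s history at its current version.

```python
def synthetic_full_versions(history: dict, keep_last: int) -> dict:
    # Carry forward up to `keep_last` versions per file from prior cycles.
    return {path: versions[-keep_last:] for path, versions in history.items()}

def regular_full_versions(live_files: dict) -> dict:
    # A regular full reads the client directly: older versions are not
    # carried forward, so each file's history restarts at one version.
    return {path: [version] for path, version in live_files.items()}

history = {"report.doc": ["v1", "v2", "v3"]}
# synthetic_full_versions(history, keep_last=2) keeps ["v2", "v3"]
# regular_full_versions({"report.doc": "v3"}) resets to ["v3"]
```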

 

Hope this helps.

 

Regards,

Mike



Jeremy,

I am not sure about “Commvault best practices”, but running only incremental and synthetic full backups is “my best practice” in some of my environments.

All of my laptop clients have been running incremental and synthetic full backups for over five years. I haven’t run into any issues backing up or restoring data on a laptop client unless there is something wrong with the client’s file system or network connection. Each laptop gets only one initial full; after that it is a daily incremental and a weekly synthetic full.

Not only does this option take hardware load off the client, it also works best when the client is roaming, such as a laptop that may end up at a location with a slow internet connection. Trying to run a full backup of a client somewhere with poor internet is not always ideal for speed, and sometimes for data-usage reasons.

I have never tested server clients with synthetic full backups, so I do not know how a single-file or bare-metal restore would perform in terms of speed and reliability. I wouldn’t expect a problem as long as the whole backup chain is free from any kind of corruption.

