We have some Aux copies that go to our AWS S3 bucket. The storage policy they fall under has 30-day on-prem retention and 365-day cloud retention. The 30-day on-prem (primary) copy has data aging turned on and appears to be pruning jobs older than 30 days as expected. When I looked at the properties of the Aux copy, though, I noticed that the data aging checkbox was not selected, and when I viewed all jobs for this Aux copy it unfortunately showed jobs going back years. So that tells me nothing is aging out or getting cleaned up on the cloud side.
Our S3 bucket is getting very large, and we need to clean up all of these old jobs to bring it down to a reasonable size. My question is: what's the best way to do this cleanup? Can I just view the jobs under the Aux copy, select everything past our retention, and delete them? And would doing that also delete the data out of the S3 bucket?
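For what it's worth, while digging into this I've been sizing up the bucket with a quick boto3 script so I can compare before and after any cleanup. A rough sketch, assuming AWS credentials are already configured; the bucket name is a placeholder:

```python
# Rough sketch: tally object count and bytes per S3 storage class so you
# can see how much of the bucket is Glacier vs. Standard before cleanup.
# "my-commvault-bucket" is a placeholder name.
from collections import defaultdict

import boto3

s3 = boto3.client("s3")
totals = defaultdict(lambda: [0, 0])  # storage class -> [object count, bytes]

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="my-commvault-bucket"):
    for obj in page.get("Contents", []):
        sc = obj.get("StorageClass", "STANDARD")
        totals[sc][0] += 1
        totals[sc][1] += obj["Size"]

for sc, (count, size) in sorted(totals.items()):
    print(f"{sc}: {count} objects, {size / 1024**4:.2f} TiB")
```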
I have since selected the data aging checkbox and hit OK, then ran a data aging job from the CommCell root against just our cloud library. That didn't seem to do much: the job ran for about four minutes and completed successfully, but as far as I can tell it didn't clean out any of the jobs or age anything out.
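In case it helps whoever answers: to check whether a data aging run actually reclaims anything, I've been pulling the daily BucketSizeBytes storage metric from CloudWatch. A sketch, with the bucket name again a placeholder; note these metrics only report about once a day, so any change lags:

```python
# Sketch: pull the daily BucketSizeBytes metric for the Glacier storage
# class to confirm whether a data aging run actually shrank the bucket.
from datetime import datetime, timedelta, timezone

import boto3

cw = boto3.client("cloudwatch")
resp = cw.get_metric_statistics(
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-commvault-bucket"},  # placeholder
        {"Name": "StorageType", "Value": "GlacierStorage"},
    ],
    StartTime=datetime.now(timezone.utc) - timedelta(days=14),
    EndTime=datetime.now(timezone.utc),
    Period=86400,  # one datapoint per day
    Statistics=["Average"],
)
for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"].date(), f"{point['Average'] / 1024**4:.2f} TiB")
```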
What's the best way to quickly clean up this old Aux copy/cloud data?
Edit - I should add that this data goes straight to Glacier. The process is that the last on-prem full backup gets sent to the S3 bucket under the Glacier storage class.
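One Glacier wrinkle worth flagging: objects in the Glacier storage class have a 90-day minimum storage duration, so deleting anything younger than that incurs an early deletion charge. That shouldn't matter for the years-old jobs here, but it's worth knowing. To scope how much GLACIER data is past our 365-day retention, I've been using the sketch below. It only looks at raw object age, not Commvault's job metadata, so it's purely for sizing the problem; I'm assuming the actual deleting should be left to Commvault so its metadata stays consistent. Bucket name is a placeholder:

```python
# Sketch: count objects in the GLACIER storage class last modified more
# than 365 days ago, to scope what ought to be eligible to age out.
# This checks raw object age only, not Commvault's job metadata, so use
# it for sizing -- let Commvault itself do any actual deleting.
from datetime import datetime, timedelta, timezone

import boto3

cutoff = datetime.now(timezone.utc) - timedelta(days=365)
s3 = boto3.client("s3")

stale_count, stale_bytes = 0, 0
paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="my-commvault-bucket"):  # placeholder
    for obj in page.get("Contents", []):
        if obj.get("StorageClass") == "GLACIER" and obj["LastModified"] < cutoff:
            stale_count += 1
            stale_bytes += obj["Size"]

print(f"GLACIER objects older than 365 days: {stale_count}, "
      f"{stale_bytes / 1024**4:.2f} TiB")
```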