We are using the AWS S3 GIR (Glacier Instant Retrieval) public endpoint for our aux copy to a remote site.
We are having trouble with the SD-WAN, and I have a large amount of data to catch up over a 100 Mb link… it's not enough.
The idea is to use a local Snowball to catch up the aux copy (a rough calculation below shows why the link alone won't do it). Has anyone experienced this?
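For context, here is a back-of-the-envelope sketch of the transfer time over the WAN link. The 50 TB backlog figure is purely hypothetical (the thread only says "a large amount of data"):

```python
# Rough transfer-time estimate: a 100 Mb/s WAN link vs. a Snowball seed.
# The 50 TB backlog below is a hypothetical figure, not stated in the thread.
backlog_tb = 50
link_mbps = 100

backlog_bits = backlog_tb * 1e12 * 8          # decimal TB -> bits
seconds = backlog_bits / (link_mbps * 1e6)    # ideal case, no protocol overhead
days = seconds / 86400
print(f"{backlog_tb} TB over a {link_mbps} Mb/s link ≈ {days:.0f} days at 100% utilisation")
```

Even at full line rate with zero overhead, that is weeks of continuous transfer, which is why a physically shipped device starts to make sense.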
Best answer by JosserandM
Thx !!
The only thing is that we already have a cloud library on S3 using GIR, so the CS already knows the bucket and we have some directory trees in it… I'm afraid of having two libraries on it at the same time.
Or do I have to pull the entire bucket back to continue the aux copy on it in the meantime?
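Before deciding, it can help to see what the existing library has already written to the bucket. A minimal sketch with boto3, assuming default AWS credentials are configured; the bucket name is a placeholder:

```python
# List the top-level "directory trees" the existing cloud library has already
# created in the bucket. Bucket name is a placeholder, not from the thread.
import boto3

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket="my-auxcopy-bucket", Delimiter="/")

for prefix in resp.get("CommonPrefixes", []):
    print(prefix["Prefix"])
```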
I did this exact same scenario about 5-6 years ago: I seeded a locally racked Snowball unit and then copied it to an AWS S3 bucket. Here are the steps I used, updated with today's newer processes.
Step-by-Step: NAS to Amazon S3 Cold Storage via Snowball
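To give a flavour of the seeding part: a Snowball Edge exposes a local S3-compatible endpoint, so the copy can be scripted against it instead of the public AWS endpoint. This is only a sketch; the endpoint IP, port, credentials, bucket name and NAS path are placeholders, and the real values come from the Snowball unlock/credentials step:

```python
# Sketch of pushing seed data to the Snowball Edge's local S3-compatible
# endpoint. IP, port, credentials, bucket and NAS path are placeholders.
import os
import boto3

snowball = boto3.client(
    "s3",
    endpoint_url="https://192.168.1.100:8443",   # local Snowball Edge endpoint (placeholder)
    aws_access_key_id="SNOWBALL_ACCESS_KEY",
    aws_secret_access_key="SNOWBALL_SECRET_KEY",
    verify=False,                                # or the path to the device certificate
)

# Walk the local NAS export and upload each file, preserving the key layout
# that should land in the target S3 bucket when AWS imports the device.
nas_root = "/mnt/nas/seed"
for dirpath, _, filenames in os.walk(nas_root):
    for name in filenames:
        local_path = os.path.join(dirpath, name)
        key = os.path.relpath(local_path, nas_root)
        snowball.upload_file(local_path, "my-auxcopy-bucket", key)
```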