Hi all,
we have recently set up a new Linux server with a 6+ TB DB2 database. The normal CV backup works well, much faster than on our old system. We do have a problem with the database logs, though. Under load the system can produce a new log file every three seconds. Our old system had an open pipe to the backup system and could archive the files fast enough that this was never an issue. CommVault takes around 15 seconds per file, so when the system is under load we run into trouble with our log space. For now we have added a temporary log area (2 TB), but we need to give it back.
The way I understand it, CV opens its stream to DB2, archives the log, then closes the stream and checks that everything has been properly archived. The problem is the overhead involved: the actual archiving of the file is fast enough, it's just that everything around it is too slow.
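To put rough numbers on it, here is a back-of-the-envelope sketch of how fast the backlog grows at the rates above. It assumes a single archiver stream, steady load, and a hypothetical log file size of 64 MB (ours may differ):

```python
# Rough estimate of the log backlog under load.
# Assumptions: one new log file every 3 s, one archiver stream
# needing ~15 s per file (mostly per-file stream overhead).

LOG_INTERVAL_S = 3    # new log file produced every 3 seconds
ARCHIVE_TIME_S = 15   # CommVault time per file, end to end
LOG_SIZE_MB = 64      # hypothetical log file size -- adjust to your LOGFILSIZ

produced_per_min = 60 / LOG_INTERVAL_S                 # 20 files/min
archived_per_min = 60 / ARCHIVE_TIME_S                 # 4 files/min
backlog_per_min = produced_per_min - archived_per_min  # 16 files/min

# How long until a 2 TB temporary log area fills at that rate?
temp_area_mb = 2 * 1024 * 1024
minutes_to_full = temp_area_mb / (backlog_per_min * LOG_SIZE_MB)

print(f"Backlog grows by {backlog_per_min:.0f} files/min")
print(f"2 TB temp area full after ~{minutes_to_full / 60:.0f} hours")
```

So even with the 2 TB area, a sustained peak eats through the headroom in a little over a day under these assumptions.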
Anybody have any suggestions for how we can accelerate the process?
Thanks in advance!