
Just wondering…

Background: I’ve recently been running “CVDiskPerf -PATH (a folder) -OUTFILE (a log file)” because it appears we might be getting low read throughput to locally mounted storage on my media agents, and I’m getting vastly different readings: 400 GB/hr on one run, then 1500 GB/hr a few minutes later. I know there will be “some variance,” but I was under the impression this tool would “run for a bit” and give a more solid reading. Neither our storage nor our servers running CommVault show CPU or IOPS pressure in any metric — nothing suggests they can’t handle more load/throughput. Before I started running this tool I assumed it would “work a little harder” to gather throughput stats and not finish in a minute (i.e., really try to push a CPU/throughput limit, so the storage system would notice it getting fed a lot of read/write requests). Currently the storage system doesn’t register a blip when it runs, which makes me feel like the defaults don’t give the best numbers to work with.

Question: Does anyone run this executable/troubleshooting tool with different arguments “as a general rule” to get better/more accurate results? I know there are several options one could use; I’m just wondering whether there’s an undocumented/unspoken “yeah, this is probably the best combination of options to get good, reliable, repeatable stats from it.” Yes, I know I could just run it a few times with the defaults and take an average of the readings.
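For what it’s worth, if I do go the multiple-runs route, I’d summarize the readings with something like this (400 and 1500 are my two actual readings; the 900 is made up to illustrate):

```python
import statistics

# Throughput readings (GB/hr) from repeated CVDiskPerf runs.
# 400 and 1500 are the two runs I saw; 900 is a hypothetical third run.
readings = [400, 1500, 900]

mean = statistics.mean(readings)
stdev = statistics.stdev(readings)
cv = stdev / mean  # coefficient of variation; a high value means the runs aren't stable

print(f"mean={mean:.0f} GB/hr, stdev={stdev:.0f}, cv={cv:.2f}")
# prints "mean=933 GB/hr, stdev=551, cv=0.59"
```

A spread that wide (cv well above 0.5) is exactly why I don’t trust a single default run.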

Hi @tigger2,

There are some additional parameters you can use to increase the test duration (block count / file count) and intensity (threadcount). Check out this video for more details:
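As a rough sketch, an invocation along these lines should make the test run longer and harder. Note the flag spellings below are assumptions based on the parameter names above (block count, file count, thread count) — verify the exact names and defaults against the tool’s own usage/help output on your version:

```shell
# Hypothetical example only: longer, heavier read test against a media agent mount.
# -THREADCOUNT / -BLOCKCOUNT / -FILECOUNT spellings are assumptions;
# check CVDiskPerf's built-in usage text for the real option names.
CVDiskPerf -PATH /mnt/media01 -OUTFILE /tmp/cvdiskperf.log \
           -THREADCOUNT 8 -BLOCKCOUNT 65536 -FILECOUNT 16
```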

 

