Share Commvault best practices
Share use cases, tips & ideas with others
- 97 Topics
- 420 Replies
Recover a CSDB to a higher level version?
Hi guys, I’m planning an upgrade and move of a CommServe, version 11.20 on Windows Server 2012 R2, to a new server on Windows Server 2019 installed with CommServe 11.28 (2022E), to get MSSQL upgraded as well. I will use the DR Assistant Tool as I usually do, and as far as I can see in the docs this should work, according to this statement: “Verify that the destination CommServe host is installed with the same (or higher) service pack and hotfix pack as the database that is available in the DR backup that you plan to restore.” Has anyone done a similar upgrade/move between versions? Regards, Patrik
What are you currently working on?
Since we launched a few months ago, we’ve had thousands of members sharing their tips and tricks, as well as helping each other out. Each and every one of you has helped a peer empower themselves through this amazing community. Take a moment to introduce yourself, share what project or challenge you are currently focused on, and let us know how we can help. Let’s use the power of our awesome community to boost each other ever upwards!
Building Custom Report
Hi fellas, we have a BI server that we use in our own environment, and we want to build a report on it. The report should show information such as client name, hostname, type of agent installed, subclient content, last backup date per agent, whether there is an aux copy, and the next backup date. Which tables in the CommServ DB should we read for this information? Or is there a SQL query you know of for this? Best regards.
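Not an official answer, but as a starting point: community CSDB queries commonly reference APP_Client (clients), APP_Application (subclients/agents) and JMBkpStats (job statistics). The sketch below assumes those tables and these column names exist on your version; every name here is an assumption to verify against your own CSDB, and the aux-copy and next-backup-date columns would need the schedule and storage-policy-copy tables on top of this. Run it against a read-only copy, not production.

```sql
-- Hypothetical sketch: last backup end time per subclient, with client and
-- agent-type info. Table/column names are assumptions from community
-- examples and may differ between Commvault versions.
USE CommServ;
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT
    cl.name            AS clientName,
    cl.net_hostname    AS hostName,
    app.subclientName  AS subclient,
    app.appTypeId      AS agentTypeId,  -- join to the agent-type lookup table on your version for a readable name
    MAX(DATEADD(ss, js.servEndDate, '1970-01-01')) AS lastBackupUtc  -- assuming Unix-epoch seconds
FROM APP_Client      cl
JOIN APP_Application app ON app.clientId = cl.id
JOIN JMBkpStats      js  ON js.appId     = app.id
GROUP BY cl.name, cl.net_hostname, app.subclientName, app.appTypeId
ORDER BY cl.name;
```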
HANA Logcommandline backup errors at a single glance
A SAP HANA log (command line) backup is invoked from the HANA side (HANA Studio configuration). It is automatically converted into Commvault backup jobs, and normally there is nothing we need to do; it runs every 15 minutes by default per HANA’s setting. But this job is slightly different from other “normal” jobs: on any interim error, such as a network disconnection or a shortage of Commvault resources (typically the number of streams at the library or storage policy copy level), the job will fail. As mentioned above, this job repeats every 15 minutes (by default), so any failure is recovered quickly and the end user typically won’t lose any data. But sometimes there are other issues on the CS/MA side that are hard to keep watching, even with alerts, job monitoring, etc. set up. This query lists all failure reasons from the CSDB quickly (written for a Japanese customer); if any suspicious errors are identified, you can dig into the specific job for detailed research: use CommServ SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
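The original query was cut off in the excerpt above. As a rough sketch of the idea only (JMBkpStats and its columns are assumptions from community CSDB examples, and the status codes vary by version — verify everything against your own CSDB before trusting the output):

```sql
-- Hypothetical sketch: surface recent non-completed backup jobs so that
-- their failure reasons can be looked up in Job History. All table/column
-- names and the status filter are assumptions.
USE CommServ;
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT TOP 100
    js.jobId,
    js.appId,
    js.status,                                           -- completion status code (meaning version-dependent)
    DATEADD(ss, js.servStartDate, '1970-01-01') AS startTimeUtc
FROM JMBkpStats js
WHERE js.status <> 1                                     -- assuming 1 = completed on your version
ORDER BY js.servStartDate DESC;
```

The job IDs returned can then be opened in the Job History GUI to read the human-readable failure reason for each one.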
GCP CROSS REGION/ZONE BACKUP RESTORES USING MULTI-NIC MEDIA AGENT
Introduction: As part of a backup infrastructure implementation, we used GCP (Google Cloud Platform) shared VPCs (Virtual Private Clouds) to build a multi-NIC media agent setup for the Commvault infrastructure. This opened the door to backing up data from one tier and restoring it across another tier, which reduced the overall time and cost to migrate data to a separate client over the GCP network. It also ensured network isolation between production and non-production tiers, as no additional firewall ports had to be opened to transfer the backups. Media agent in GCP with two VPCs: Below is a screenshot from the GCP console showing the Commvault media agent with two NICs associated with different subnets from two separate shared VPCs. Restore using a Commvault multi-NIC media agent: Below is a high-level diagram showing the refresh between production and non-production tiers (segregated across VPCs) using the multi-NIC media agent. The media agent will read data from the storage bucket by
Slow performance of Intellisnap VM backup with NetApp
Hello, we are getting slow throughput for VMware backups using IntelliSnap: around 150 GB per hour with 5 readers. The datastore is on NetApp, and the VSA/media agent is a VM. It is currently using NBD transport mode. Can HotAdd or another transport mode be used, and will that be faster? Are there any other best practices?
Best practices for Read-Only Domain Controller server backup
Hi all, are there any best practices from Commvault for backing up an RODC server?
- Backup as VSA with the app-aware option
- Usual FSA and AD agent-level backup
- No backup
I didn’t find any documentation on this, so I am looking for suggestions. Thanks, Sudeep
2FA and Okta
Hi all. Just wanted to share here that although Okta is not officially supported, we have been able to get it to work for 2FA as a basic “Other” standalone time-based PIN. We set 2FA against a group and then place users in it one at a time. I wrote a couple of docs about it if anyone would like the details.
Improving Backup Performance of NFS Share?
I am backing up a fairly large NFS volume off a NetApp using the network share (NAS) method. There is a mix of large files and many small files. Is it normal for a full backup to take several days (3-4) to complete? How can I improve the speed? Currently I have a Linux media agent configured as the data access node. Would performance increase if I added additional access nodes?
S3 Bucket task failing for the CloudApps job
Hi, I am still learning Commvault and I am trying to fix an issue with a Cloud Apps job that has three buckets as subclient content. Two S3 buckets are backing up OK, but the other one gets errors in the job history, sometimes [82:129] and sometimes [19:583]: Description: Another backup is running for client [Custom_AWS_S3], iDataAgent [Cloud Apps], Backup Set [defaultBackupSet], Subclient. I don’t know if it has anything to do with the credentials used for the S3_EU region, but the client readiness check says “ready”. Two buckets back up fine using the same IAM credentials, so why is just the one bucket’s backup task failing for the job? Please give me some insight into it. Thanks, Suj
CommVault Hidden GEMS - Share the CommVault links you have bookmarked that made your life easier.
Commvault has the classic problem of a product with such wide-ranging utility that it is practically impossible to keep track of every aspect of it. That said, I am sure we have all had the experience of finding a Commvault tool or website that is a game changer for a use case. Please share those links here:
- VSA feature compatibility matrix
- Tape storage matrix
- Additional settings database
Retrieve information from CSDB - Whereabouts of the data
When creating workflows or reports you need to query the CSDB directly. If you can find appropriate views or examples (in this community, for instance), that helps, but in most cases there is no clue to the whereabouts of the data you need to retrieve. This is a technique for finding data in the CSDB. Suppose you want to identify which table stores a subclient content entry. First, prepare a “full-text search” of the CSDB using the kind of technique described in “Search all tables, all columns for a specific value SQL Server [duplicate]”: that query can search all tables for specific text, and I create it as a stored procedure for convenience and pass in literals to search for. Next, create an easy-to-search text value, then search the entire CSDB using the query above. Bingo: the subclient content must be stored in a table named APP_ScFilterFile. You need some knowledge to exclude unnecessary results; for instance, the 3rd one indicates Au
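The stored procedure itself wasn’t included in the excerpt, but the standard technique it is based on can be sketched with dynamic T-SQL over the system catalog views. This is a generic SQL Server pattern, not Commvault-specific; the search value is a made-up placeholder, and a search like this scans every string column, so run it against a copy of the CSDB, never production.

```sql
-- Sketch of a "search every string column in the database" query.
-- Builds one SELECT per (n)varchar column and reports which
-- schema.table.column contains the search text.
DECLARE @search nvarchar(200) = N'MyUniqueSubclientContent';  -- placeholder value to locate
DECLARE @sql    nvarchar(max) = N'';

SELECT @sql = @sql
    + N' UNION ALL SELECT ''' + s.name + '.' + t.name + '.' + c.name + N''' AS [location]'
    + N' FROM ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
    + N' WHERE ' + QUOTENAME(c.name) + N' LIKE N''%'' + @p + N''%'''
FROM sys.tables  t
JOIN sys.schemas s  ON s.schema_id      = t.schema_id
JOIN sys.columns c  ON c.object_id      = t.object_id
JOIN sys.types   ty ON ty.user_type_id  = c.user_type_id
WHERE ty.name IN (N'varchar', N'nvarchar');

SET @sql = STUFF(@sql, 1, 11, N'');  -- drop the leading ' UNION ALL ' separator
EXEC sp_executesql @sql, N'@p nvarchar(200)', @p = @search;
```

Wrapping this in a stored procedure that takes @search as a parameter, as the post describes, makes it easy to reuse whenever you need to locate where the CSDB stores a value.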
SSO login Issue for Cross Domain
8332 18a0 05/05 11:06:42 ####### GetFromUsersPropDB() - Enter
8332 18a0 05/05 11:06:42 ####### GetFromUsersPropDB() - Exit
8332 18a0 05/05 11:06:42 ####### ::processAdUser() - Blobsize returned from processSSORequest = , dwErr=[0xc000019b]
8332 18a0 05/05 11:06:42 ####### ::processAdUser() - Unexpected return code [0xc000019b] from processSSORequest
8332 18a0 05/05 11:06:42 ####### ::processAdUser() - error not retrieved from formatMessageA(..)
8332 18a0 05/05 11:06:42 ####### EvSecurityMgr::userLogin() - processAdUser returned [-1], "error not retrieved from formatMessageA(..)"
8332 18a0 05/05 11:06:42 ####### EvSecurityMgr::userLogin() - Socket [0x0000000000004028]: Database error [-1/].
8332 18a0 05/05 11:06:42 ####### ::sendResponse() - FAILED [DataBase Error.]
8332 18a0 05/05 11:06:42 ####### handleLoginOperations() - Encrypted Login Failed. Browser Session Id
8332 6310 05/05 11:06:43 ####### dropConnection() - Socket [0x0000000000004028]: Closing Browser Sess
DB2 user account: how to change it on multiple servers/instances?
Hi. With MS SQL we can set the account used for connecting to SQL at a group level. Any idea how to do the same for DB2? This seems to involve some scripting, which I suck at (unless it starts with an @echo off) :-) Our DB2 team is considering creating a single DB2 user for all DB2 servers/instances to be used by Commvault. This user will have its password changed frequently, but we do not want to traverse every single DB2 server/instance to change the password (we’ve got quite a few). Any ideas or solutions would be highly appreciated :-) Thank you. Kind regards, Rubeck
Use Token for legacy CLI access
This is a simple but working trick for maintaining Commvault. When you want to start jobs (typically backups) via the CLI instead of schedule policies, the first step is qlogin to log into the CommCell. This command is mostly straightforward to use, but when invoking multiple jobs from one server it can raise errors such as Error 0x10b: User not logged in, Error 0x208: Token file is corrupted, or related errors. BOL explains the -f parameter, which specifies the token file to use. When qlogin is used without the -f option, it generates a file named "qsessions.OS-User" directly under the Commvault installation directory. Creating this file requires administrative privileges on the server (to modify the installation directory), and it must exist until the shell has finished calling qoperations (like qlist job); it is removed when the shell calls qlogout. This default token file is generated per OS user, not per Commvault user, so if multiple shells are run simultaneously they use the same qsession file. So when
Backing up Nutanix files
Hi, looking for some advice on backing up Nutanix Files.
- Nutanix ver 5.20.2
- 30 TB of capacity
- Commvault HyperScale 3-node cluster
- 10 Gb Ethernet connectivity
- CIFS and NFS enabled
Documentation states the backups run through access nodes, with at minimum 1 access node per protocol. Is there any integration with IntelliSnap for Nutanix Files? I did not see this covered in the documentation, but maybe I’m not looking in the right place.
Hey all, I have a question about the following scenario: a Windows MA with an NVMe flash card. Currently the flash card is formatted with a 4K block size, and we are planning to reformat the disk with a 32KB block size as written in BOL ( https://documentation.commvault.com/11.24/expert/12411_deduplication_building_block_guide.html ). On the flash card we are hosting 3 DDBs:
- one DDB for backups to a Cloudian backup device over the S3 protocol in the local datacenter
- one DDB for backups to SAN-attached storage
- one DDB for backups to S3 storage outside the datacenter
My first question: can we use one Windows partition with a 32K block size, or should we make 3 partitions with different Windows block sizes? The second question: which block size do we need for the DDBs (block-level deduplication factor) for all 3 DDBs?