Ask questions, give answers, get good karma
- 2,903 Topics
- 14,151 Replies
We have a major project starting. For various reasons, some of our compute and applications will move to the Azure Gov cloud. At present we have two CommCells serving the enterprise on-prem, and one of them contains backups for the clients that need to move. There are local libraries for the primary copies and Azure libraries (commercial cloud) for the secondary copies. We would prefer to retain backups, at least the ones that already exist in Azure. I know that clients can be migrated to a new CommCell, but the backup data isn't migrated. Has anyone gone through something like this? If so, how did you tackle it? We have just started to plan this change and are throwing ideas around.

BR, Henke
Good morning,

I'm about to do my first ServiceNow integration with Commvault. The SNOW admin is asking whether Commvault can make use of webhooks for this integration. From what I'm seeing, this is an available option. Is it possible to confirm this before I continue? I may have more questions once I start.

Thank you in advance.

Regards,
Mauro
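For what it's worth, a webhook is just an authenticated HTTP POST, so the receiving side mainly needs to verify a signature and parse the JSON body. A minimal sketch in Python; the shared secret, the HMAC-SHA256 signature scheme, and the payload fields are illustrative assumptions, not Commvault or ServiceNow specifics:

```python
import hashlib
import hmac
import json

# Illustrative shared secret -- in practice this would come from the
# integration's configuration, not a hard-coded literal.
SECRET = b"shared-webhook-secret"

def verify_signature(body: bytes, signature_hex: str) -> bool:
    """Constant-time check of a hex HMAC-SHA256 signature over the raw body."""
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature_hex)

def handle_webhook(body: bytes, signature_hex: str) -> dict:
    """Reject unsigned/tampered payloads, then return the parsed JSON event."""
    if not verify_signature(body, signature_hex):
        raise PermissionError("bad webhook signature")
    return json.loads(body)
```

Whatever endpoint receives the POST (in SNOW or elsewhere) would call something like `handle_webhook` on the raw request body before acting on the event.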
When loading the dashboard through the Command Center, I get errors for every field saying "Failed to Authenticate Token." I'm able to authenticate through SSO, but nothing on the dashboard loads from a client computer, or even from localhost when remoting into the Commvault server (a different issue). Has anyone experienced this, or does anyone know which logs on the Commvault server to look at as a starting point for troubleshooting?

-Andrew
On CV 11.20.9 we backed up an Oracle DB from SPARC Solaris to a MediaAgent on RedHat 8.4. If we set "Optimized for concurrent LAN backups" on the MA, the speed is 1,800 GB per hour; if we unset it, the speed is 5,500 GB per hour. All other settings are identical. What changes in the MA configuration when "Optimized for concurrent LAN backups" is set?
Hi all,

We recently had an issue where one client's Commvault certificate did not get renewed. Our customer is a bit on edge and wants a report of all Commvault client certificates, when they were issued, and their expiry dates. I understand that Commvault auto-renews certificates two weeks before expiry, and the details I need can be found in the CommCell Console, but I need to know whether such a report can be created in .csv or HTML format, whether from the Web Console, the CommCell Console, or via the Commvault CLI.

Any help would be appreciated.

Thanks,
Tom
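If the issued/expiry details can be pulled out of the console (or any export) into rows, turning them into a CSV report is straightforward. A sketch with hypothetical field names; the 14-day flag mirrors the auto-renewal window mentioned above:

```python
import csv
import io
from datetime import datetime, timedelta

# Commvault is said to renew client certs ~2 weeks before expiry,
# so flag anything already inside that window.
RENEWAL_WINDOW = timedelta(days=14)

def cert_report_csv(rows, today):
    """rows: per-client certificate details (hypothetical field names),
    e.g. {"client": str, "issued": datetime, "expires": datetime}."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["client", "issued", "expires", "due_for_renewal"])
    for row in rows:
        due = row["expires"] - today <= RENEWAL_WINDOW
        writer.writerow([row["client"], row["issued"].date(),
                         row["expires"].date(), due])
    return buf.getvalue()
```

The `due_for_renewal` column makes it easy to spot any client whose certificate should already have been renewed, like the one in the incident described above.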
Hi folks,

I am constantly battling the System State backup error below on many of my Windows hosts:

ERROR CODE [6:64]: System state backup of component [System Protected Files;] failed. Source: servername, Process: clBackup

What is everyone doing to automate the remediation of this problem on your hosts? I have a moderately sized environment (~1,500 hosts), so hand-holding each one when it pops up is a time-consuming and arduous process.

Thanks in advance!
Sean
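One low-tech first step toward automating this at ~1,500-host scale is scraping the failure events into a deduplicated host list that remediation tooling can iterate over. A sketch, assuming the events are exported as text lines in the format quoted above:

```python
import re

# Matches the "Source: <host>, Process: clBackup" tail of the event
# text quoted in the post.
SOURCE_RE = re.compile(r"Source: (?P<host>[^,]+), Process: clBackup")

def failed_system_state_hosts(event_lines):
    """Return the distinct hosts reporting the 6:64 system-state failure,
    sorted so the output is stable for scripting."""
    hosts = set()
    for line in event_lines:
        if "ERROR CODE [6:64]" in line:
            match = SOURCE_RE.search(line)
            if match:
                hosts.add(match.group("host").strip())
    return sorted(hosts)
```

The resulting list could then be fed to whatever per-host fix you settle on, so only the affected machines get touched.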
Has anyone experimented with "Automatically use optimal number of data readers"? I set this on some test subclients and it defaults to 1 regardless of the manual setting, which results in jobs taking dramatically longer. Does anyone know how the readers are optimised? Why does it default to 1 regardless of the starting position?
Hi,

I have a question about creating an Oracle iDA instance. I tried to create an instance for mopsDS-02, but got: "Unable to establish connectivity with these instance properties for [WROCMOPS]. [Unknown Event [^1%lu]. Do you want to modify properties again?" However, when I ran "Discover Instance," the object was created with the values I had entered when trying to configure it manually via "All Tasks" | "New Instance (ORACLE SID)."

In our environment we have, among others, two Windows 2008 R2 servers with Oracle 10.2.0.4. Both servers are on a different network and we use the Network Gateway. On the DS-01 server the agent was configured without any problems and backups run. On DS-02, when we execute "Discover Instance," CVD.log shows:

1928 1478 08/17 08:04:59 ### [CVD] Remote Command Request from remotehost = <cuisvm56.um.wroc>, RemoteClient = <cuisvm56>, RemoteIP (Sock) = <127.0.0.1>. Launched Process: <"C:\Program Files\Commvault\ContentStore\Base\ClOraAgent
I am investigating a performance issue on a server that is both a client and a MediaAgent. Looking in the PerformanceMetrics.log file, I see metrics for CPU load of some CV processes, but they always seem to be at zero. Aren't those collected into the log file, or does something need to be enabled for them to appear? I've attached a screenshot of that section of the log file.
Hello to all!

In a fresh Commvault installation for O365 backup, I'm trying to find a way to manually exclude some user licenses from SharePoint Online. When SharePoint Online was first configured, it automatically assigned a SharePoint Online license to all of the O365 users (in the Application User Licenses tab). From Command Center I am able to manually exclude users from Exchange and also OneDrive, but I can't find a way to exclude users from SharePoint Online, in order to save some user licenses. :)

Any ideas?
Hi all,

Is there any way to display the overall throughput within a CommCell? In the Job Controller I can see a list of running jobs, and I want the sum of all the values in the Current Throughput column. Is this feasible? Is there a tool that can help?

I have tried running qcommands from cmd and summing the throughput in a script; however, the start date of the backup job is not the real one, so I am not able to calculate the current throughput. In general, how can I calculate current throughput if it is not possible to display it directly? Example: in the script output, job ID 35478 has jobEndTime 1629021648, but the CV Console shows 13:08:43 for the same job.

Thanks for any ideas.
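A sketch of the two calculations involved, assuming the job list can be parsed into per-job records (the field name below is illustrative). Note that fields like jobEndTime are Unix epochs, so they need converting before being compared with the times the console displays:

```python
from datetime import datetime, timezone

def total_throughput_gb_per_hr(jobs):
    """Sum the Current Throughput column over all running jobs.
    `jobs` is a hypothetical parsed job listing, one dict per job."""
    return sum(job["throughput_gb_per_hr"] for job in jobs)

def epoch_to_utc(epoch_seconds):
    """Fields like jobEndTime are Unix epochs; convert them before
    comparing with the local times shown in the CV Console."""
    return datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
```

For example, epoch 1629021648 is 2021-08-15 10:00:48 UTC, which will render differently in a console configured for another time zone.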
Stub attachments replaced with an "Archived Attachment List.txt" file containing the names of the files that were archived
Hi,

I have a user with many older emails whose stubbed attachments have been removed and replaced with a single file, "Archived Attachment List.txt". This file is a list of the files that were archived. It appears to affect emails prior to April 2021; more recently archived email has stubs as it should, and they can be recalled. I cannot find any mention of this behaviour in the documentation, and I do not know how widespread the problem might be.

Any help would be greatly appreciated.

Cheers
Hi,

I have an issue where a user has an application whose license is tied to the MAC address of the VM. When I restore the VM (overwrite existing), the restored VM has a different MAC address from the original. Is this supposed to happen? Is there any way I can preserve the original MAC address when restoring the whole VM? This is a VSA Nutanix VM backup.

Regards,
Fauzi
Hello,

We recently found that the decoupled RPM package cannot be used for upgrades via yum update; even worse, it can break the yum update process. I found a way to create such a package. The solution is to build the install and upgrade packages from the same source and merge them into one (how to capture the spec file is in my previous post). Then copy the binaries from both packages into one; for the upgrade binaries I changed the path from /opt/commvaultUpgrade to /opt/commvault/Upgrade so that everything lives under the single path /opt/commvault. Then merge the spec files: I used the install spec file as the destination and added in the differences from the upgrade spec. Here is how to recognize the type of process (install/upgrade). I only worked with v11 FR20 MR60 (Linux), which is why this post is so generic.

Filip
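For the install/upgrade detection, the standard RPM mechanism is the scriptlet argument: RPM calls %post with the number of installed package instances in $1. A sketch of what the merged spec's scriptlet could look like (the echo lines are placeholders, not Commvault logic):

```spec
%post
# RPM passes the instance count as $1:
#   $1 == 1  -> fresh install
#   $1 >= 2  -> upgrade (the old package is still counted until its %postun)
if [ "$1" -ge 2 ]; then
    echo "upgrade: keep existing data under /opt/commvault"
else
    echo "fresh install"
fi
```

The same convention applies in reverse for %preun/%postun, where $1 is 0 on erase and 1 on upgrade.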
Good morning all,

I'm using a Plan to configure backup policies, etc. I would like to disable incremental backups and only run fulls, but I've not been able to do this within the Plan; I have to go into the GUI and disable it in the schedule. The Plan does update with an 'invalid data' error, which isn't a massive issue, but I'd prefer to use a Plan setting to disable this. Am I missing something? I've attached a copy of my current setting; I have not made the change yet.
Hello,

We are testing full instance restores of EC2, and of VMware to EC2. Both VMs fail with the following error:

Import task failed. FirstBootFailure: This import request failed because the instance failed to boot and establish network connectivity. Please refer to http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/VMImportTroubleshooting.html for further troubleshooting.

Am I to assume this is something with the source instance, or possibly the permissions to finish the import? We created all the necessary roles and applied them to the user we created. The other thought is Windows licensing? The user in my test environment is assigned the admin role (I know, insecure) and there my restore finished successfully. I also created the vmimport role and assigned it to this user. The CommServe and VSA are running in AWS, so I don't think we have firewall issues; backups run just fine. Any pointers on this would be appreciated!

Thanks!
Melissa
Hi,

I was looking at the CommServe hardware requirements: "You can have a combination of servers, virtual machines, and laptops in a single CommCell environment. The total workload must equal the maximum number of servers, per CommCell environment type (Extra Large, Large, Medium, or Small). The workload equivalent for each entity is as follows: Server: 1, Virtual Machine: 0.5, Laptop: 0.2."

I fully understand this calculation of environment size. However, what if a CommCell has, say, 500 deconfigured clients? Would these need to be counted, and if so, do they count as full "servers" or as something less? Technically they are not active, but they are still in the CommServe DB.

Thanks in advance, kind regards,
Thos Gieskes
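The sizing arithmetic itself is just a weighted sum; whether deconfigured clients carry any weight is the open question. A quick sketch of the documented part:

```python
# Workload equivalents quoted from the sizing guideline above.
WEIGHTS = {"server": 1.0, "virtual_machine": 0.5, "laptop": 0.2}

def commcell_workload(counts):
    """Total workload in server equivalents for one CommCell."""
    return sum(WEIGHTS[kind] * count for kind, count in counts.items())

# e.g. 100 servers + 400 VMs + 50 laptops
#   -> 100*1.0 + 400*0.5 + 50*0.2 = 310 server equivalents
```

If deconfigured clients do count, they would simply be another entry in the weights table, at whatever weight the sizing guidance assigns them.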
Hi,

I have a client who is extremely serious about security. Knowing what information is contained in the CSDB, and following a "trust no one" policy, they do not allow the CS database to be sent anywhere, including to Commvault support. There is an option to scrub log files which, besides scrubbing the logs, should also scrub the CSDB, but the documentation doesn't mention exactly what is scrubbed in the CS DB: https://documentation.commvault.com/11.23/essential/130818_configuring_data_masking_for_log_files.html

I'm thinking of client names, client hostnames, domain names, user names, user passwords, encryption keys, and so on. Can anyone share some insight into what data is masked in the CSDB?
We are helping a customer restore data from their old CommServe environment. We had a VM-level backup of the CommServe; we restored it and the services are working, but the credentials supplied by the customer are not working. Is there a way around this? The Commvault services appear to be fine (looking at the Simpana services console, everything shows as running). Also, is there a log file where I can check what's going on? SQL is installed on the same VM. And since this is an old Commvault system that had to be stood up to look for historical data, can we raise it with the support team?
Hi,

I would appreciate some clarification/confirmation here. What is the difference between the iTeamsUseProtectedAPIs and the client1_iTeamsUseProtectedAPIs additional settings? Does "client1" need to be replaced by something? Is it correct that these additional settings must (or can) be placed on an access node alone (i.e., not on the CommServe)?

Thanks!
Hello all,

I have configured a file system backup for a server, and after starting its full backup I get the error "encountered failure in receiving data [The SDT data transfer was terminated on a request from the Job Manager.]":

17708 2e14 08/11 13:47:45 4370936 CPipelayer::SendPipelineBuffer() - Tail has reported error [The SDT data transfer was terminated on a request from the Job Manager.]. Cannot continue.
17708 2e14 08/11 13:47:45 4370936 [PIPELAYER ] Error in flushing the current buffer.
17708 2e14 08/11 13:47:45 4370936 CVArchive::WriteBuffer() - Cannot send the buffer. Ret
17708 2e14 08/11 13:47:45 4370936 CFileBackup::WriteBuffer(1683) - writeBuffer failed
17708 2e14 08/11 13:47:45 4370936 CFileBackup::HandleReadAndSendFileDataError(1497) - WriteBuffer failed
17708 2e14 08/11 13:47:45 4370936 CBackupBase::DoBackup(3689) - ReadAndSendFileData indicates FAIL_BACKUP
17708 2e14 08/11 13:47:45 4370936 CBackupBase::DoBackup(2608) - --- 8:29.499975 17708 2e14