Question

Hadoop configuration || Unable to get index server information for client

  • May 2, 2026
  • 3 replies
  • 27 views


I have configured a Hadoop backup in Commvault. The scan phase completes successfully and files are being detected correctly.

However, before the backup starts, the job fails with the following error:

"Unable to get index server information for client [120]"

Because of this, the backup never starts. I have waited for a long time, but no events or additional logs are being generated.

Environment details:

Hadoop client installed with required packages

MediaAgents are already configured

Communication services appear to be running fine

Has anyone faced this issue before? Any suggestions on what to check or how to resolve it would be really helpful.
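For reference, the environment checks above can be scripted as a quick sanity pass from the Hadoop access node (a sketch; the process name and the port list are assumptions based on a default Commvault Linux install — adjust for your environment):

```shell
# Are Commvault services (cvd) running on this node?
# The "[c]vd" pattern keeps grep from matching its own process entry.
ps -ef | grep -i "[c]vd" || echo "cvd not running"

# Are the usual Commvault ports (CVD 8400, 8402, firewall 8403) listening?
ss -ltn 2>/dev/null | grep -E ':(8400|8402|8403)\b' || echo "no Commvault ports listening"
```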

3 replies


Logs 

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Machine : ndcpbkpcsa

File : Licensing.log

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

10340 2b88 04/29 16:18:04 205709 Check license: appType [64] Name [Big Data Apps] on client/library [118]- Valid

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Machine : ndcpbkpcsa

File : JobManager.log

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

10340 2b88 04/29 16:18:04 205709 CVBkpJobSvr::backupinit This is a Distributed Cluster backup.

10340 2b88 04/29 16:18:05 205709 ArchiveManagerCS::createJobCopyInfo input params: jobId [205709], appId [314], appType [64], archGroupId [13], startDate [Wed Apr 29 16:18:05 2026

10340 2b88 04/29 16:18:05 205709 Servant [---- IMMEDIATE BACKUP REQUEST ----], taskid [1887] Clnt[hadoop] AppType[Big Data Apps][64] BkpSet[HDFS] SubClnt[default] BkpLevel[Full][1]

10340 2628 04/29 16:18:05 205709 AppManager Unable to get index server information for client [118]. Reason : [100- ].

10340 2628 04/29 16:18:05 205709 JobSvr Obj Instance [61]

10340 2628 04/29 16:18:05 205709 JobSvr Obj clients [112]

10340 2184 04/29 16:18:05 205709 Scheduler Phase [4-Scan] (0,0) started on [Hadoop1.apir162.nic.cloud.local] in [0] second(s) - CVDistributor.exe -j 205709 -a 2:314 -t 1 -d NDCPBKPMA2*NDCPBKPMA2*8400*8402 -r 0 -ab 0 -i 0 -cs ndcpbkpcsa -s "default" -jt 205709:4:1:0:0:21918 -scan -pkg Hadoop -mountPath -seb -scannedFiles 0 -scannedFolders 0 -ltr 0 -sct 1777366231 -lf 0 -li 0 -ls 0 -lsf 0 -attrEx 65536

10340 1254 04/29 16:18:21 205709 Servant Reg [Control] received. Client [Hadoop1] plattype = 4. Token [205709:4:1:0:0:21918]

10340 1254 04/29 16:18:34 205709 Servant Reg [Control] received. Client [Hadoop1] plattype = 4. Token [205709:4:1:0:0:21918]

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

Machine : Hadoop1

File : FileScan.log

@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@

1212835 1281a3 04/29 16:18:21 205709 DistributedIDA::CMaster::InitializeEvEvent(383) - =======================

1212835 1281a3 04/29 16:18:21 205709 DistributedIDA::CMaster::InitializeEvEvent(384) - STARTING DistributedIDA

1212835 1281a3 04/29 16:18:21 205709 DistributedIDA::CMaster::InitializeEvEvent(385) - -----------------------

1212835 1281a3 04/29 16:18:21 205709 ::GetSubclientDir() - The subclient directory is [/opt/commvault/iDataAgent/jobResults/2/314].

1212835 1281a3 04/29 16:18:21 205709 CvUCS::init_static() - Native charset="UTF-8".

1212835 1281a3 04/29 16:18:21 205709 CvUCS::init_static() - Great! This system is using ISO/IEC 10646 character set for wchar_t. We will not be using iconv() API and will be converting between MBE and UTF/UCS using C MBR functions.

1212835 1281a3 04/29 16:18:21 205709 CvUCS::init_static() - MBE happens to be UTF-8. Will use strcpy for MBE<=>UTF-8 conversion.

1212835 1281a3 04/29 16:18:21 205709 ::GetSubclientDir() - The subclient directory is [/opt/commvault/iDataAgent/jobResults/2/314].

1212835 1281a3 04/29 16:18:21 205709 initializeAdditionalSettings() - Initialize additional settings requested for entity id [314] entity type [7]

1212835 1281a3 04/29 16:18:21 205709 initializeAdditionalSettings() - No additional settings set for entity id [314] entity type [7].

1212835 1281a3 04/29 16:18:21 205709 CVDC_FileSystem::CScan::ShouldDistributeSubclientDirectory(1121) - JobResults files distribution is enabled

1212835 1281a3 04/29 16:18:21 205709 CVDC_FileSystem::CScan::CheckSCDirStatus(705) - Last job ID is 0, assuming this is the first multi-node job for the subclient.

1212835 1281a3 04/29 16:18:21 205709 CVDC_FileSystem::CScan::DeRegisterWithJM(1098) - Successfully unregistered the process

1212835 1281a3 04/29 16:18:21 205709 DistributedIDA::CMaster::startUserCommand(4413) - starting /opt/commvault/iDataAgent/ifind -j 205709 -a 2:314 -t 1 -d NDCPBKPMA2*NDCPBKPMA2*8400*8402 -r 0 -ab 0 -i 0 -cs ndcpbkpcsa -s default -jt 205709:4:1:0:0:21918 -scan -pkg Hadoop -mountPath -seb -scannedFiles 0 -scannedFolders 0 -ltr 0 -sct 1777366231 -lf 0 -li 0 -ls 0 -lsf 0 -attrEx 65536 -cn Hadoop1 -vm Instance001 -diDAFS

1212835 1281a3 04/29 16:18:21 205709 DistributedIDA::CMaster::startUserCommand(4419) - execv for '/opt/commvault/iDataAgent/ifind'

1212835 1281a3 04/29 16:18:34 205709 CFileScan::Process(239) - JobToken=205709:4:1:0:0:21918

1212835 1281a3 04/29 16:18:34 205709 EvEvent::SetupConnectionWithCVD() - Loading library under Base32

1212835 1281a3 04/29 16:18:34 205709 EvEvent::SetupConnectionWithCVD() - Loading library under Base64

1212835 1281a3 04/29 16:18:34 205709 CScanJob::ProcessCommandLine(151) - DistributedPkgType Hadoop

1212835 1281a3 04/29 16:18:34 205709 CScanJob::ProcessCommandLine(180) - Received Scanned Folders [0] Files [0]

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::ProcessCommandLine(285) - === JOB REFERENCE TIME is Clear ===

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::ProcessCommandLine(290) - FollowMountPoints=[1]

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::ProcessCommandLine(296) - Will skip incremental backup if there is no content to backup

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::ProcessCommandLine(444) - This is a Distributed(Multi-Node) backup job

1212835 1281a3 04/29 16:18:34 205709 CScanJob::GetScanStartTime(821) - Scan Start Time is set to 2026/04/29 10:48:34.549266Z||1777459714

1212835 1281a3 04/29 16:18:34 205709 CFileScan::Process(380) - Scan Start Time is 2026/04/29 10:48:34.549266Z||1777459714

1212835 1281a3 04/29 16:18:34 205709 CFileScan::Process(422) - Running ScanType - [ScanJob/FileSystem/FileSystemUnix] CallbackType - [CUnixScanJobCallback]

1212835 1281a3 04/29 16:18:34 205709 ::GetSubclientDir() - The subclient directory is [/opt/commvault/iDataAgent/jobResults/2/314].

1212835 1281a3 04/29 16:18:34 205709 CvUCS::init_static() - Native charset="UTF-8".

1212835 1281a3 04/29 16:18:34 205709 CvUCS::init_static() - Great! This system is using ISO/IEC 10646 character set for wchar_t. We will not be using iconv() API and will be converting between MBE and UTF/UCS using C MBR functions.

1212835 1281a3 04/29 16:18:34 205709 CvUCS::init_static() - MBE happens to be UTF-8. Will use strcpy for MBE<=>UTF-8 conversion.

1212835 1281a3 04/29 16:18:34 205709 CScanJob::Configure(411) - ResultPathJob=/opt/commvault/iDataAgent/jobResults/CV_JobResults/2/0/205709

1212835 1281a3 04/29 16:18:34 205709 CScanJob::Configure(412) - ResultPathSubclient=/opt/commvault/iDataAgent/jobResults/2/314

1212835 1281a3 04/29 16:18:34 205709 CScanJobCallback::initJobControl(158) - Initializing Job Manager interface using token '205709:4:1:0:0:21918'

1212835 1281a3 04/29 16:18:34 205709 initializeAdditionalSettings() - Initialize additional settings requested for entity id [314] entity type [7]

1212835 1281a3 04/29 16:18:34 205709 initializeAdditionalSettings() - No additional settings set for entity id [314] entity type [7].

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(204) - Subclient age [1] days < TrueUp threshold age [90] days

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(232) - TrueUpSkipped [No] TrueUpEnabled[No] TrueUpForced [No] DaysToRunTrueup[0] JobsToRunTrueup[0] TrueUpSkipAfterSynth[No] SkipDC[No]

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::IsClientEligibleForSmartStreamComputation(994) - Client is eligible for smart stream computation

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::ConfigureStreams(1022) - Stream count will be decided at the end of scan phase, based on content size.

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadScanType(1187) - UseDataClassification=0

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(305) - Client ID - 118

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(386) - Enabling incremental image mode for IndexV2

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(450) - Expand symbolic links path is disabled.

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::CheckAndSetFollowSymlinkOption(1404) - Follow symbolic links path is disabled

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(483) - Apple Double Support: Disabled

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(495) - Cluster file system filter enabled

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(511) - Will backup sub-mounts under skipped file systems

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(520) - SkipNonLocalMountDetails: no

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(526) - ClientBaseVersion [11]

1212835 1281a3 04/29 16:18:34 205709 CHadoopInstance::Load(67) - HadoopConfig - '<App_HadoopConfig><hadoopSites hdfsHost="hdfs://mycluster" hdfsNativeLibPath="/opt/hadoop/lib/native" hdfsPort="0" hdfsUser="hadoop" jvmLibPath="/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64/server" siteType="1"/><coordinatorNode clientId="112" clientName="Hadoop1" displayName="Hadoop1"/><hadoopApps/></App_HadoopConfig>'

1212835 1281a3 04/29 16:18:34 205709 CHadoopInstance::Load(125) - Hadoop native lib path='/opt/hadoop/lib/native'

1212835 1281a3 04/29 16:18:34 205709 CHadoopInstance::Load(126) - JVM lib path='/usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64/server'

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(622) - ======================================= this is a Hadoop Backup =======================================

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(623) - Hadoop URI: hdfs://mycluster

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(625) - Hadoop User: hadoop

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(670) - EnableFolderLevelMultithread : NO

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(723) - This is a Distributed subclient of type: 2

1212835 1281a3 04/29 16:18:34 205709 CScannerOptions::LoadConfiguration(782) - We'll not enumerate failed files from Index to re-try in this backup

1212835 1281a3 04/29 16:18:34 205709 FileExtentOptions::ConfigFileExtentOptions(91) - Extent backup is enabled for Hadoop

1212835 1281a3 04/29 16:18:34 205709 CUnixScannerOptions::LoadConfiguration(216) - HardLink Backup Disabled

1212835 1281a3 04/29 16:18:34 205709 CPostProcessingOptions::LoadConfiguration(78) - The number of subclient streams to use is [0]

1212835 1281a3 04/29 16:18:34 205709 CPostProcessingOptions::LoadConfiguration(85) - The number of subclient streams to use is [0]

1212835 1281a3 04/29 16:18:34 205709 CPostProcessingOptions::LoadConfiguration(126) - Collect files will not be split since number of streams is 0

1212835 1281a3 04/29 16:18:34 205709 CScanJob::Configure_AddTasks_MemoryManagementTask(689) - Additional setting [bEnableMemoryManagementForFileScan] is not set. Memory management task will not be added

1212835 1281d6 04/29 16:18:34 205709 CReportingThread::Run(23) - +++

1212835 1281a3 04/29 16:18:34 205709 ::GetSubclientDir() - The subclient directory is [/opt/commvault/iDataAgent/jobResults/2/314].

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Configure_DistributedBackup(842) - Deleting Job JR state file : [/opt/commvault/iDataAgent/jobResults/2/314/SubclientdirState.cvf]

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Configure_DistributedBackup(848) - Deleting Attempt JR state file : [/opt/commvault/iDataAgent/jobResults/2/314/ScanAttemptTimeStamp.cvf]

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Configure_JobReferenceTime(930) - Full backup will not query for new subclient content

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystemUnix::GetImpersonationOption(230) - EnforceImpersonationForBackup []

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Configure_ArchiveBit(899) - ArchiveBit=[0]

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Configure_SubclientContentParameters(1039) - This is default subclient

1212835 1281a3 04/29 16:18:34 205709 CSubclientContent::Process(164) - +++

1212835 1281a3 04/29 16:18:34 205709 FSSubclientConf::setAllowOverlapSCContent(2447) - Allow overlapped content is enabled.

1212835 1281a3 04/29 16:18:34 205709 FSSubclientConf::load(699) - +++

1212835 1281a3 04/29 16:18:34 205709 FSSubclientConf::load(699) - --- 0:00.035669

1212835 1281a3 04/29 16:18:34 205709 CSubclientContent::LoadSystemFilters(667) - +++

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::Process(147) - +++

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::LoadFilters(227) - +++

1212835 1281a3 04/29 16:18:34 205709 CVGetHoneypotsCleanupList(147) - Failed to check if trap is legacy. Error: [0x80070306:{CCVDRansomwareRegistry::IsLegacyTrap(301)} + {CCVDRansomwareRegistry::GetRawTrapString(341)/W32.0.(The operation completed successfully. (ERROR_SUCCESS.0))}]. Assuming new trap anyway

1212835 1281a3 04/29 16:18:34 205709 CFilterFilesNotToBackup::BuildExclusionList(139) - Filtering OnePass snap directory [/opt/commvault/ProductionData/OnePassSnap]

1212835 1281a3 04/29 16:18:34 205709 CVPlatformInfo::GetCLDBRootFolder() - GetJob Results Directory returned.

1212835 1281a3 04/29 16:18:34 205709 CVPlatformInfo::GetCLDBRootFolder() - /opt/commvault/iDataAgent/jobResults/CV_CLDB

1212835 1281a3 04/29 16:18:34 205709 CFilterApplicationSpecificFiles::Process(31) - +++

1212835 1281a3 04/29 16:18:34 205709 CFilterApplicationSpecificFiles::Process(31) - --- 0:00.000100

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::LoadFilters(369) - Loading DDB Snapshot filter

1212835 1281a3 04/29 16:18:34 205709 CVPI_ApplicationAgentBase::setSectionName() - Unknown AgentType=117

1212835 1281a3 04/29 16:18:34 205709 CFilterBlockLevelMounts::Process(35) - Filtering Block Level Mount Path [/opt/commvault/iDataAgent/jobResults/cvblk_mounts]

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::LoadFilters(382) - Loading Block Level Mounts filter

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::LoadFilters(415) - Loading CVFS Directory

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::LoadFilters(227) - --- 0:00.034710

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::Process(172) - 278 filters, 4 filter exceptions

1212835 1281a3 04/29 16:18:34 205709 CSystemFilterEnumerator::Process(147) - --- 0:00.034845

1212835 1281a3 04/29 16:18:34 205709 CSubclientContent::LoadSystemFilters(667) - --- 0:00.035181

1212835 1281a3 04/29 16:18:34 205709 CSubclientContent::Process(164) - --- 0:00.099806

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystemUnix::Configure_ExtentLevelBackup(2355) - HardLink Backup and Restore disabled for Extent Backup

1212835 1281d7 04/29 16:18:34 205709 CScanJobFileSystem::ReportAlive(4709) - Report alive thread started

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Prepare(1757) - +++

1212835 1281a3 04/29 16:18:34 205709 CScanJobCallback::ShouldDisableScanRestart(199) - Scan restarting disabled for this distributed cluster type

1212835 1281a3 04/29 16:18:34 205709 ScanRestart::Prepare(156) - restartable scan not enabled or supported

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Prepare(1843) - Last job was not a full job

1212835 1281a3 04/29 16:18:34 205709 CUnixTrueUpJobHandle::InitializeTrueUpJob(132) - +++

1212835 1281a3 04/29 16:18:34 205709 CUnixTrueUpJobHandle::IsJobQualifiedForRunningTrueUp(593) - TrueUp is enabled via subclient Option [0], Forced TrueUp [1], TrueUp reg [0] Incremental after SynthFull [0], State file set to run scan [0], State file set to run index query [0]

1212835 1281a3 04/29 16:18:34 205709 CUnixTrueUpJobHandle::InitializeTrueUpJob(179) - Job doesn't qualify for TrueUp

1212835 1281a3 04/29 16:18:34 205709 CUnixTrueUpJobHandle::InitializeTrueUpJob(132) - --- 0:00.000764

1212835 1281a3 04/29 16:18:34 205709 CPathCacheFlatFileIndexPrivate::InitDirChangeCache(151) - Cache Updation feature : [Enabled]

1212835 1281a3 04/29 16:18:34 205709 CPathCacheFlatFileIndexPrivate::LoadFromCurrentBackingFile(254) - +++

1212835 1281a3 04/29 16:18:34 205709 CPathCacheFlatFileIndexPrivate::LoadFromCurrentBackingFile(276) - CurrentBackingFile=[/opt/commvault/iDataAgent/jobResults/2/314/DCTmp.cvf]

1212835 1281a3 04/29 16:18:34 205709 CPathCacheFlatFileIndexPrivate::LoadFromCurrentBackingFile(254) - --- 0:00.000144

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Prepare(2003) - previous DirChange [/opt/commvault/iDataAgent/jobResults/2/314/DCInc.cvf] not found

1212835 1281a3 04/29 16:18:34 205709 CSubclientContent::HasSubclientContentChanged(1402) - Failed to open content file [/opt/commvault/iDataAgent/jobResults/2/314/ScanView.txt]

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Prepare(1757) - --- 0:00.009223

1212835 1281a3 04/29 16:18:34 205709 CvErrorControl::CvErrorControl(413) - Going to fetch rules, ClientId=118, AppType=64, EntityString=[<?xml version="1.0" encoding="UTF-8" standalone="no" ?><CvEntities_GenericEntity _SubclType_="0" _type_="3" clientId="118"/>]

1212835 1281a3 04/29 16:18:34 205709 CvErrorControl::ParseXMLRulesFromXMLString(528) - No error rules were parsed

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::Run1(2384) - Starting scan...

1212835 1281a3 04/29 16:18:34 205709 CWorkUnitManager::AssignWorkUnit(336) - Hadoop Scan

1212835 1281a3 04/29 16:18:34 205709 CWorkUnitManager::AssignWorkUnit(336) - Hadoop Scan

1212835 1281a3 04/29 16:18:34 205709 CWorkUnitManager::AssignWorkUnit(336) - Hadoop Scan

1212835 1281a3 04/29 16:18:34 205709 CWorkUnitManager::AssignWorkUnit(336) - Hadoop Scan

1212835 1281a3 04/29 16:18:34 205709 CWorkUnitManager::AssignWorkUnit(336) - Hadoop Scan

1212835 1281a3 04/29 16:18:34 205709 CWorkUnitManager::AssignWorkUnit(336) - Hadoop Scan

1212835 1281a3 04/29 16:18:34 205709 CScanJobFileSystem::ConfigureWorkersCount(4354) - Parallel WorkUnits is 15 but only 1 WorkUnits available, resetting it to 1

1212835 1281d8 04/29 16:18:34 205709 CUnixScanJobCallback::GetHadoopFileSystemInterface(2402) - Create Hadoop FS Interface

1212835 1281d8 04/29 16:18:34 205709 SetClasspath(177) - Getting Hadoop classpath, command='for i in $(hadoop classpath | sed -e "s/:/ /g"); do echo $i; done | sort | uniq | awk '{val=val":"$1} END {print val}''

1212835 1281d8 04/29 16:18:34 205709 CvProcess::system() - for i in $(hadoop classpath | sed -e "s/:/ /g"); do echo $i; done | sort | uniq | awk '{val=val":"$1} END {print val}'

1212835 1281d8 04/29 16:18:34 205709 CvProcess::system() - Command completed with rc=0

1212835 1281d8 04/29 16:18:34 205709 CvProcess::system() - hadoop envvars | grep HADOOP_CONF_DIR | awk -F "'" '{print $2}'

1212835 1281d8 04/29 16:18:35 205709 CvProcess::system() - Command completed with rc=0

1212835 1281d8 04/29 16:18:35 205709 GetConfPath(381) - Config Path: /opt/hadoop/etc/hadoop

1212835 1281d8 04/29 16:18:35 205709 CvProcess::system() - hadoop conftest -conffile /opt/hadoop/etc/hadoop/core-site.xml

1212835 1281d8 04/29 16:18:36 205709 CvProcess::system() - Command completed with rc=0

1212835 1281d8 04/29 16:18:36 205709 CvProcess::system() - hadoop conftest -conffile /opt/hadoop/etc/hadoop/hdfs-site.xml

1212835 1281d8 04/29 16:18:36 205709 CvProcess::system() - Command completed with rc=0

1212835 1281d8 04/29 16:18:36 205709 SetClasspath(202) - Setting CLASSPATH='/opt/commvault/iDataAgent/jobResults/2/314:/opt/hadoop/etc/hadoop:/opt/hadoop/etc/hadoop:/opt/hadoop/share/hadoop/common/hadoop-common-3.3.6-tests.jar:/opt/hadoop/share/hadoop/common/hadoop-common-3.3.6.jar:/opt/hadoop/share/hadoop/common/hadoop-kms-3.3.6.jar:/opt/hadoop/share/hadoop/common/hadoop-nfs-3.3.6.jar:/opt/hadoop/share/hadoop/com



Mohammed Ramadan
Vaulter

Hi @Laxman.Puvvalla,

Confirm the Index Server is online. If it is, check the Hadoop subclient properties to verify that the correct Index Server is selected, and make sure the Hadoop client and MediaAgent can communicate with the Index Server on ports 8400, 8403, and 20000. It is very difficult to identify the issue without reviewing the logs, so I also recommend opening a support case.
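A quick way to test the port connectivity suggested above, run from the Hadoop access node (a sketch; the host name is a placeholder taken from the MediaAgent name in the logs — substitute your actual Index Server):

```shell
# Probe each Commvault port using bash's built-in /dev/tcp redirection.
# NDCPBKPMA2 is an assumption from this thread's logs -- replace it.
IDX=${IDX:-NDCPBKPMA2}
for port in 8400 8403 20000; do
  if timeout 3 bash -c "</dev/tcp/${IDX}/${port}" 2>/dev/null; then
    echo "port ${port}: reachable"
  else
    echo "port ${port}: NOT reachable"
  fi
done
```

If a port shows as not reachable, check firewalls on both ends and any Commvault network topology/tunnel configuration between the client and the Index Server.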

Thanks and Regards,
Mohammed Ramadan



Hi Mohammed Ramadan,

The issue is resolved. Thanks for your response!

Root cause: a hostname / communication error on the cluster (cluster name: Hadoop), which led to a remote agent failure between the nodes.

Fix: passwordless SSH login between the nodes. Summary of the steps:

Generated SSH key on Hadoop1 using ssh-keygen

Copied public key to target node (or same node) using ssh-copy-id

Updated ~/.ssh/authorized_keys with the public key

Set correct permissions for .ssh directory and files

Verified SSH configuration allows public key authentication

Restarted SSH service

Tested login using ssh Hadoop1 (no password prompt)
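The steps above can be sketched as follows (the node names come from this thread; the demo key path is an assumption — adjust for your cluster):

```shell
# Minimal passwordless-SSH setup sketch. Run on the master node (Hadoop1).
KEY="$HOME/.ssh/id_rsa_hadoop_demo"   # demo key path (assumption)
mkdir -p "$HOME/.ssh" && chmod 700 "$HOME/.ssh"

# 1. Generate a key pair (empty passphrase so backups can run unattended).
[ -f "$KEY" ] || ssh-keygen -t rsa -b 4096 -N "" -f "$KEY" -q \
  || echo "ssh-keygen not available"

# 2. Copy the public key to every node (ssh-copy-id appends it to the
#    target's ~/.ssh/authorized_keys and sets sane permissions there).
for node in Hadoop1 Hadoop2 Hadoop3 Hadoop4 Hadoop5; do
  ssh-copy-id -i "$KEY.pub" -o ConnectTimeout=3 "$node" 2>/dev/null \
    || echo "could not reach $node"
done

# 3. Verify: BatchMode makes ssh fail instead of prompting, so success
#    proves key-based auth works with no password.
ssh -o BatchMode=yes -o ConnectTimeout=3 -i "$KEY" Hadoop1 true 2>/dev/null \
  && echo "passwordless login OK" || echo "key auth not working yet"
```

Note that sshd silently ignores keys when `~/.ssh` or `authorized_keys` is group- or world-writable, which is why the permission step in the summary above matters.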

 

Cluster layout: 5 nodes total. Master nodes: Hadoop1 and Hadoop2 (HA). Data nodes: Hadoop3, Hadoop4, and Hadoop5, all online.

Content: /tmp

 

HDFS backup completed successfully.