I’m setting up Activate, which I’m hoping to use at my company for file share optimization. My initial index storage, which was about 600 GB, is now full. I have not been able to find anything in the configuration settings that lets me modify the current index storage or add new storage to it. I even attempted to create a new index server, but it will not let me use the same node (media agent) as before so that I can assign new storage, in this case the local F: drive. Does anyone know where the configuration is that lets me change the target for the Activate index files?
I can help you with this query. Can you please tell me more about your environment, including your current service pack level and the hardware configuration of the index server? A large index server configuration calls for 500 GB per 500 million files, but it looks like yours has gone over 600 GB. Can you also let me know how many files have been indexed so far?
PraveenV,
Approximately 4.97 million files for the current suspended job, which is at 57% complete.
Another filer in the index has 11.5 million files and completed successfully, so roughly 16.5 million files in total. The current index drive is an internal SSD that cannot be grown in size (800 GB raw, 750 GB usable).
The new SAN storage I have added to the media agent is 2.5 TB (drive F:), which I was going to assign and begin using for this index.
I must have deleted the information about my current service pack; it is 11.20.23.
Can you please verify whether you enabled Content Indexing as part of the plan options? File Storage Optimization is based on metadata only and should be able to hold information for more than 500 million files in under 500 GB.
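As a rough back-of-envelope based on that guideline (about 1 KB of index metadata per file is my assumption, derived from the 500 GB / 500 million figure), your roughly 16.5 million files should need only on the order of 16 GB for metadata, so a full 600 GB drive strongly suggests content indexing is what consumed the space:

```python
# Rough sizing check based on the guideline above: 500 GB per 500 million files,
# i.e. roughly 1 KB of index metadata per file (my assumption). File counts are
# the ones from this thread (4.97 M suspended job + 11.5 M completed filer).
files_indexed = 4_970_000 + 11_500_000
bytes_per_file = 500 * 1024**3 / 500_000_000      # ~1 KB of metadata per file
metadata_gb = files_indexed * bytes_per_file / 1024**3
print(f"~{metadata_gb:.0f} GB expected for metadata alone")   # prints ~16 GB
```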
Praveen,
Yes, I confirm I enabled Content Indexing; I wasn’t aware I should not have. Additionally, looking at the duplicate files, it is treating multiple shares as duplicates when they are the same path. The NetApp filer shares User1$ and filername1$ point to the same file path. How can I configure Activate not to see these as two locations? As it stands, my duplicate file count is double what it should be. I guess I need to delete my index and start completely over.
Yes, I would recommend doing the following:
- Delete the existing data sources configured from FSO. Please make sure there are no backups configured from these subclients.
- Configure a new Data Classification plan with content indexing turned off.
- Configure the data sources again from FSO and select the plan without content indexing.
Regarding the duplicate file dashboard: we qualify files as duplicates when the file name, modified time, and size are an exact match. Are you sure they are not actually duplicates?
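To illustrate that rule (a minimal Python sketch, not the actual product code), files are keyed on exactly those three attributes, and any group sharing a key is reported as a duplicate set:

```python
from collections import defaultdict
from pathlib import Path

def duplicate_candidates(paths):
    """Group files whose (name, modified time, size) match exactly --
    the same three attributes the duplicate file dashboard compares."""
    groups = defaultdict(list)
    for p in map(Path, paths):
        st = p.stat()
        groups[(p.name, int(st.st_mtime), st.st_size)].append(str(p))
    # Only groups containing more than one path are duplicate candidates.
    return {key: files for key, files in groups.items() if len(files) > 1}
```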
I will start over with my configuration and delete all plans and indexes. The next FSO configuration will not include content indexing. Here is my example of a duplicate that is the exact same file location: it’s just a different share name pointing to the same location.
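As a quick sanity check (a rough Python sketch run from a Windows host that can reach both shares; the filer and file names below are placeholders, not my real paths), os.path.samefile confirms the two UNC paths land on the same underlying file:

```python
import os

# Placeholder UNC paths: the same file reached through the two share names
# mentioned above (User1$ and filername1$). Filer and file names are made up.
path_a = r"\\filer01\User1$\jdoe\budget.xlsx"
path_b = r"\\filer01\filername1$\jdoe\budget.xlsx"

# os.path.samefile compares the underlying file identity (st_dev/st_ino, which
# Windows derives from the volume serial number and file index), so it returns
# True when both share names expose the same physical file.
print(os.path.samefile(path_a, path_b))
```

Both shares resolve to the same file, which is why every file is showing up twice in the duplicate report.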
Is there any reason why you are adding the same content twice? Are you adding it to the same subclient or to different subclients?