NetBackup IT Analytics DC Installation Guide For File Analytics v11.4
Release 11.4
NetBackup IT Analytics Data Collector installation
guide for File Analytics
Last updated: 2025-02-03
Legal Notice
Copyright © 2025 Veritas Technologies LLC. All rights reserved.
Veritas and the Veritas Logo are trademarks or registered trademarks of Veritas Technologies
LLC or its affiliates in the U.S. and other countries. Other names may be trademarks of their
respective owners.
This product may contain third-party software for which Veritas is required to provide attribution
to the third party (“Third-party Programs”). Some of the Third-party Programs are available
under open source or free software licenses. The License Agreement accompanying the
Software does not alter any rights or obligations you may have under those open source or
free software licenses. Refer to the Third-party Legal Notices document accompanying this
Veritas product or available at:
https://www.veritas.com/about/legal/license-agreements
The product described in this document is distributed under licenses restricting its use, copying,
distribution, and decompilation/reverse engineering. No part of this document may be
reproduced in any form by any means without prior written authorization of Veritas Technologies
LLC and its licensors, if any.
The Licensed Software and Documentation are deemed to be commercial computer software
as defined in FAR 12.212 and subject to restricted rights as defined in FAR Section 52.227-19
"Commercial Computer Software - Restricted Rights" and DFARS 227.7202, et seq.
"Commercial Computer Software and Commercial Computer Software Documentation," as
applicable, and any successor regulations, whether delivered by Veritas as on premises or
hosted services. Any use, modification, reproduction release, performance, display or disclosure
of the Licensed Software and Documentation by the U.S. Government shall be solely in
accordance with the terms of this Agreement.
Technical Support
Technical Support maintains support centers globally. All support services will be delivered
in accordance with your support agreement and the then-current enterprise technical support
policies. For information about our support offerings and how to contact Technical Support,
visit our website:
https://www.veritas.com/support
You can manage your Veritas account information at the following URL:
https://my.veritas.com
If you have questions regarding an existing support agreement, please email the support
agreement administration team for your region as follows:
Japan [email protected]
Documentation
Make sure that you have the current version of the documentation. Each document displays
the date of the last update on page 2. The latest documentation is available on the Veritas
website.
https://sort.veritas.com/data/support/SORT_Data_Sheet.pdf
Contents
■ CIFS shares
CIFS shares
■ This Data Collector can collect both Linux and Windows shares. The
recommended Windows Data Collector server operating system is Windows
Server. See the Data Collector Supported Operating Systems sections of the
NetBackup IT Analytics Certified Configurations Guide for supported OS versions.
■ The Windows LAN Manager authentication level, in the local security policy
security options, must be modified to: Send LM & NTLM - use NTLMv2 session
security if negotiated. This allows the Data Collector to invoke the net use
command with the password supplied on the command line (see the example after
this list). Without this setting, later versions of Windows terminate with system
error 86 (invalid password).
■ Windows CIFS Shares collection requires the Windows Domain User ID. This
User ID must have Administrative privileges.
■ Linux CIFS Shares collection requires super-user root privileges. Access control
commands, such as sudo, sesudo, and pbrun, are also supported. If using any
of the access control commands, verify that the User ID has sudo, sesudo, or
pbrun privileges (a sample sudoers entry is shown after this list).
■ Collection of owner data for Windows and CIFS is configurable via Advanced
Parameters. Data collection completes faster when owner data is not collected
(the default). Although not recommended, you can set the Advanced Parameter
FA_RESOLVE_OWNERS to Y to enable owner collection. To
access Advanced Parameters in the Portal, select Admin > Advanced >
Parameters.
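The following examples illustrate two of the points above. First, a minimal sketch of the kind of net use invocation the Data Collector issues on Windows; the server, share, domain, user, and password shown are hypothetical placeholders:

net use \\fileserver01\HOME MyP@ssw0rd /user:EXAMPLEDOMAIN\svc_collector

Second, a hypothetical sudoers entry (added with visudo) that grants a dedicated Linux collection account the elevated access described above; the account name is an assumption for illustration only and should be adapted to your site policy:

# /etc/sudoers.d/itanalytics -- hypothetical example
svc_collector ALL=(root) NOPASSWD: ALL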
Host Discovery and Collection File Analytics probe
Field Description
Collector Domain The domain of the collector to which the collector backup policy is being added.
This is a read-only field. By default, the domain for a new policy will be the same
as the domain for the collector. This field is set when you add a collector.
Policy Domain The Collector Domain is the domain that was supplied during the Data Collector
installation process. The Policy Domain is the domain of the policy that is being
configured for the Data Collector. The Policy Domain must be set to the same
value as the Collector Domain. The domain identifies the top level of your host
group hierarchy. All newly discovered hosts are added to the root host group
associated with the Policy Domain.
Typically, only one Policy Domain will be available in the drop-down list. If you
are a Managed Services Provider, each of your customers will have a unique
domain with its own host group hierarchy.
To find your Domain name, click your login name and select My Profile from the
menu. Your Domain name is displayed in your profile settings.
Name* Enter a name that will be displayed in the list of Data Collector policies.
Schedule* Click the clock icon to create a schedule. Every Minute, Hourly, Daily, Weekly,
and Monthly schedules may be created. Advanced use of native CRON strings
is also available (additional examples appear after this table).
*/20 9-18 * * * means every 20 minutes between the hours of 9am and 6pm
Explicit schedules set for a Collector policy are relative to the time on the Collector
server. Schedules with frequencies are relative to the time that the Data Collector
was restarted.
Shares* Click Add to configure the CIFS shares that the collector will probe.
Notes Enter or edit notes for your data collector policy. The maximum number of
characters is 1024. Policy notes are retained along with the policy information
for the specific vendor and displayed on the Collector Administration page as a
column making them searchable as well.
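Additional illustrative CRON strings for the Schedule field, using the same five-field syntax as the example above (the schedules themselves are arbitrary):

0 */4 * * * means at the start of every fourth hour
30 22 * * 1-5 means at 10:30pm, Monday through Friday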
Adding a File Analytics Data Collector policy
Host/Device* Enter the host IP address or host name for the device that is being
probed for CIFS shares. This also could be a non-host device, such as
a NetApp array. Example: 172.1.1.1
Share* Enter the name of the CIFS share that the Data Collector will probe. Example: HOME
If you are using credentials, click Add to configure the CIFS share
credentials, or select an existing credential definition and click Edit.
Name* Assign a name to identify the set of credentials that you are defining.
Account* Enter the login account name used to log in to the hosts. If the policy includes
a group of Windows hosts, use the Windows Domain user ID. This user ID
must have administrative privileges. Example: root
For Linux hosts, super-user root privileges are required. You also could use
an access control command, such as sudo, sesudo, or pbrun. If using any of
these access commands, ensure that the user ID has sudo, sesudo, or pbrun
privileges. Some enterprises prefer to create a new user and provide access
to commands via an access control command.
Windows Domain* For Windows hosts only: Specify the Windows domain name. If the host is
not a member of a domain, or to specify a local user account, use a period
(.)
Private Key File For Linux hosts only: If you have configured a Public/Private Key between
your Data Collector server and the hosts you intend to monitor, specify the
location of the Private Key file on the Data Collector Server.
Known Hosts File For Linux hosts only: If you have configured a Public Key/Private Key between
your Data Collector server and the hosts you intend to monitor, specify the
location of the Known Hosts file on the Data Collector Server.
The Credential Names, already configured for the current Domain, are displayed
at the top of the window.
Note: Steps to set up a Windows file server with FIPS and Kerberos are beyond
the scope of this document. Refer to the relevant product documentation for those
procedures.
Chapter 2
File Analytics Export Folder Size and Folder Depth
This chapter includes the following topics:
java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="<ita-install-path>/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport
where the value of the APTARE_HOME property is the absolute path of the aptare
directory.
For example:
java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="/opt/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport
Output format:
Server Name, Volume Name, Folder name, Size in MB, Last Modified
Where:
■ Folder name: The root-level folders in the volume
■ Size in MB: Sum of all the file sizes in the folder (recursively)
■ Last Modified: Maximum modified time stamp from within all the files in the
folder (recursively)
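For illustration, a hypothetical line of output in this format (the server, volume, and folder names, size, and timestamp are placeholders):

fileserver01, /vol/home, users, 10240.5, 2025-01-15 08:32:11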
This table illustrates the expected results given the different parameter values:
Table 2-1
fa.export.folderDepth | fa.export.includeParents | Directories included in report
0 | N/A | D1, D2, D3
1 | N/A | D1, D1/SD1, D2, D2/SD3, D3
2 | No | D1/SD1, D1/SD1/SD2, D2/SD3, D3
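For readability, the directory layout implied by the table above can be reconstructed as a volume with three top-level folders, where D1 contains SD1, SD1 contains SD2, and D2 contains SD3:

<volume>
  D1
    SD1
      SD2
  D2
    SD3
  D3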
Chapter 3
Installing the Data Collector software
This chapter includes the following topics:
■ Introduction
Introduction
This section includes the instructions for installing the Data Collector software on
the Data Collector Server. Data Collector software is supported in various flavors
of Linux and Windows. On Windows, if you are collecting data from host resources,
you may need to install the WMI Proxy Service. The WMI Proxy Service is installed
by default, as part of the Data Collector installation on a Windows server.
A GUI-based installer is available for Windows and a console (command-line)
interface is available for Linux.
When the NetBackup IT Analytics system collects data from any vendor subsystem,
the collection process expects name/value pairs to be in US English, and requires
the installation to be done by an Administrator with a US English locale. The server’s
language version can be non-US English.
Note: Log in as a Local Administrator to have the necessary permissions for this
installation.
4. In the Connect window, preface the Namespace entry with the IP address or
hostname of the target remote server in the following format:
\\<IP Address>\root\cimv2
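For example, with a hypothetical target server at 192.0.2.15, the Namespace entry would read:

\\192.0.2.15\root\cimv2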
5. Complete the following fields in the Connect window and then click Connect.
■ User - Enter the credentials for accessing the remote computer. This may
require you to enable RPC (the remote procedure call protocol) on the
remote computer.
■ Password
■ Authority: Enter NTLMDOMAIN:<NameOfDomain>
where NameOfDomain is the domain of the user account specified in the
User field.
Non-English Linux OS
On a non-English Linux host:
■ The user locale can be one of the non-English supported locales if the Data
Collector will collect only from a Veritas product.
■ The user locale must be English if the Data Collector will be used to collect from
any non-Veritas product.
To install the Data Collector in one of the supported locales, verify whether the host
OS has multiple languages and then add the preferred locale for the installation.
The procedure below guides you to set one of the supported languages as the
system locale.
To set one of the supported languages as the system locale for Data Collector
installation, set the preferred language as described below:
1 Check the current language.
#locale
#locale -a
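For reference, abbreviated sample output on a host currently set to US English (actual values vary by system):

# locale
LANG=en_US.UTF-8
...
# locale -a
en_US.utf8
fr_FR.utf8
ja_JP.utf8
...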
3 To change the system locale to one of the supported languages, run the
command #vi /etc/profile and add the following at the end of the file, based
on your preferred language:
■ To add Simplified Chinese:
export LANG=zh_CN.utf8
export LC_ALL=zh_CN.utf8
■ To add French:
export LANG=fr_FR.utf8
export LC_ALL=fr_FR.utf8
■ To add Korean
export LANG=ko_KR.utf8
export LC_ALL=ko_KR.utf8
■ To add Japanese
export LANG=ja_JP.utf8
export LC_ALL=ja_JP.utf8
4 Reboot the host to set the desired system locale for the Data Collector
installation.
Having completed setting the system locale, proceed with the Data Collector
installation, with the appropriate user locale.
See “Install Data Collector software on Linux” on page 36.
Non-English Windows OS
Veritas recommends that the user locale be set to English while installing the
Data Collector on a non-English Windows host, whether for a Veritas or a
non-Veritas product.
To verify the user locale and system locale respectively before the Data Collector
installation, run the get-culture and get-winsystemlocale commands from
Windows PowerShell. This way, you can decide which user locale to set for the
Data Collector installation.
If you must run the Data Collector installer in one of the supported locales, ensure
the Windows OS is installed in Simplified Chinese, French, Korean, or Japanese.
Avoid installing the Windows OS in English with a language pack and then changing
the locale later. The Data Collector installer detects the locale from the Windows
Language Settings and launches the installer in the respective locale. If the
Windows Time & Language setting is set to a language other than Simplified
Chinese, French, Korean, or Japanese, the installer is launched in English.
See “Install Data Collector Software on Windows” on page 27.
5 Review the End User License Agreement (EULA), select I accept the terms
of the license agreement, and click Next.
Installing the Data Collector software 30
Install Data Collector Software on Windows
6 Specify the directory where you would like to install the Data Collector software
and click Next. The default Windows path is
C:\Program Files\Veritas\AnalyticsCollector. Accepting the default path
is recommended.
If you specify a custom directory, the installer creates the AnalyticsCollector
folder within the specified directory.
7 Provide accurate details as described below and then click Next.
Data Collector Registration File: Enter the absolute path of the registration
file downloaded from the NetBackup IT Analytics Portal.
8 Review the installation summary and the available disk space before you
proceed with the installation.
If you wish to run checkinstall.bat later, you can run the script from the
command prompt.
Install Data Collector software on Linux
For SUSE:
■ Access the command prompt.
■ Install the rng-tools.
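For example, on SUSE the package can typically be installed with zypper (shown as a sketch; package names and availability depend on your configured repositories):

zypper install rng-tools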
mkdir /mnt/diska
mount -o loop <itanalytics_datacollector_linux_xxxxx.iso>
/mnt/diska
cd /
/mnt/diska/dc_installer.sh
7 Review the End User License Agreement (EULA) and enter accept to agree.
8 Provide the install location. The default location is
/usr/openv/analyticscollector. Accepting the default path is
recommended.
If you specify a custom location, the analyticscollector directory is created
at the specified location.
9 The installer requests the following details.
■ Data Collector Registration File Path: Enter the absolute file path of the
registration file generated and downloaded from the NetBackup IT Analytics
Portal.
■ Web Proxy (HTTP) settings can be configured. Enter y to configure proxy.
The installer prompts for:
■ HTTP Proxy IP Address: Enter the hostname or IP address and a port
number.
■ HTTP Proxy Port: Enter the proxy port number for HTTP proxy.
■ Proxy UserId and password: Enter the credentials for the proxy server.
■ No Proxy For: Enter the host names or IP addresses separated by
commas that will not be routed through the proxy.
■ Obtain the following Data Collector details before deploying the Data Collector
in a native Kubernetes environment. You are required to supply these details
to the installer during the installation process.
■ Registry: The name of the registry to which you want to push the installer
images.
■ Absolute path of Data Receiver Certificate file: Absolute path of the data
receiver certificate file downloaded from NetBackup IT Analytics Portal.
■ Absolute path of the Data Collector Registration File: Absolute path of the
Data Collector registration file downloaded from the NetBackup IT Analytics
Portal.
■ Proxy settings:
■ Portal IP address: IP address of the system hosting the NetBackup IT
Analytics Portal.
■ Portal HostName: aptareportal.<DOMAIN> or itanalyticsportal.<DOMAIN>
■ Agent HostName: aptareagent.<DOMAIN> or itanalyticsagent.<DOMAIN>
■ StorageClass Name: Name of the Kubernetes storage class to be used.
cd scripts/
sh itanalytics_dc_installer.sh
5 Provide the Data Collector configuration details when asked by the installer in
the following order.
■ Registry
The installer asks for confirmation after you provide the registry name, before
it pushes the images. Enter y for a fresh installation. If you need to re-run the
installation and this step has already completed successfully for the same
cluster node, you can enter n to avoid a rewrite and bypass this step.
■ Absolute path of Data Receiver Certificate file (if you have set an https://
URL for the data receiver)
■ Absolute path of the Data Collector registration file
■ Proxy settings
■ Portal IP address
■ Portal HostName
■ Agent HostName
■ StorageClass Name
6 The installer asks you to confirm the configuration details before proceeding
with the installation. Enter y to proceed with the Data Collector installation.
After a successful installation, verify that the Data Collector status appears
as Online on the NetBackup IT Analytics Portal.
■ Validation methods
Validation methods
Validation methods are initiated differently based on the subsystem vendor
associated with the Data Collector policy, but they perform essentially the same
functions. Refer to the following table for vendor-specific validation methods.
■ Test Connection - Initiates a connection attempt directly from a data collector
policy screen that attempts to connect to the subsystem using the IP addresses
and credentials supplied in the policy. This validation process returns either a
success message or a list of specific connection errors.
■ On-Demand data collection run - Initiates an immediate end-to-end run of the
collection process from the Portal without waiting for the scheduled launch. This
on-demand run also serves to validate the policy and its values (the same as
Test Connection), providing a high-level check of the installation at the individual
policy level, including a check for the domain, host group, URL, Data Collector
policy and database connectivity. This is initiated at the policy-level from
Admin>Data Collection>Collector Administration.
See “Working with on-demand Data Collection” on page 45.
■ CLI Checkinstall utility - This legacy command-line utility performs both the Test
Connection function and an On-Demand data collection run from the Data
Collector server.
See “Using the CLI check install utility” on page 50.
Note: NetBackup IT Analytics does not recommend using the CLI Checkinstall
utility for any Data Collector subsystem vendor which supports On-Demand
runs.
Brocade Switch x
Cisco Switch x
Cohesity DataProtect x x
Commvault Simpana x
Compute Resources x
Dell Compellent x
EMC Avamar x
EMC Isilon x
EMC Symmetrix x x
EMC VNX x x
EMC VPLEX x
EMC XtremIO x x
HDS HCP x x
HDS HNAS x
HP 3PAR x
HP Data Protector x
HP EVA x
Hitachi Block x
Hitachi NAS x x
IBM Enterprise x
IBM SVC x
IBM VIO x x
IBM XIV x
Microsoft Azure x x
Microsoft Hyper-V x x
NetApp E Series x
NetApp x
OpenStack Ceilometer x x
OpenStack Swift x x
(Test Connection is included with the Get Nodes function.)
Pure FlashArray x x
VMWare x
Veritas NetBackup x x
On-Demand data collection serves multiple purposes. You can use it to:
■ Validate the collection process is working end-to-end when you create a data
collector policy
■ Launch an immediate run of the collection process without waiting for the
scheduled run
■ Populate your database with new/fresh data
■ Choose to view the collection logs on the portal while performing an on-demand
run.
To initiate an on-demand data collection
1 Select Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Click Expand All to browse for a policy or use Search.
3 Select a data collector policy from the list. If the vendor is supported, the Run
button is displayed on the action bar.
4 Click Run. A dialog allowing you to select servers and individual probes to test
the collection run is displayed. The following example shows the Amazon Web
Services dialog. See the vendor specific content for details on probes and
servers.
6 The portal enables you to log messages at various levels during the collection
process. The following options are available:
■ Enable Real-Time Logs: Select this option to log generally useful
information in real time while the collection is in progress.
■ Enable Debug Logs: Select this option to log information at a granular
level.
7 Click Start. Data is collected just as in a scheduled run, plus additional logging
information for troubleshooting. Once started, you can monitor the status of
the run through to completion.
Note: If there is another data collection run currently in progress when you
click Start, the On-Demand run will wait to start until the in-progress run is
completed.
You can use the filter on the console to selectively view the logs of your choice.
The Collection Console icon is not visible if the data collection is not in
progress.
Note: The generated log files on the Data Collector server are located in:
<APTARE_HOME>/mbs/logs/validation/
Note: NetBackup IT Analytics does not recommend using the CLI Checkinstall
utility for any Data Collector subsystem vendor which supports On-Demand runs.
The following directions assume that the Data Collector files have been installed
in their default location:
■ Windows: C:\Program Files\Veritas\AnalyticsCollector
■ Linux: /usr/openv/analyticscollector
If you have installed the files in a different directory, make the necessary path
translations in the following instructions.
Note: Some of the following commands can take up to several hours, depending
on the size of your enterprise.
To run Checkinstall
1 Open a session on the Data Collector server.
Windows: Open a command prompt window.
Linux: Open a terminal/SSH session logged in as root to the Data Collector
Server.
2 Change to the directory where you’ll run the validation script.
Windows: At the command prompt, type:
cd "C:\Program Files\Veritas\AnalyticsCollector" <enter>
Linux: In the terminal session, type:
cd /usr/openv/analyticscollector <enter>
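The utility is then launched from that directory. As a sketch, with the Linux script name assumed by analogy to the checkinstall.bat name given earlier for Windows:

Windows: checkinstall.bat
Linux: ./checkinstall.sh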
■ Linux: /usr/openv/analyticscollector/mbs/logs/validation/
For example, to uninstall the Data Collector on Linux, run the uninstall script:
/opt/aptare/UninstallerData/uninstall_dc.sh
Manually starting the Data Collector
■ Introduction
Introduction
The installer configures the Data Collector to start automatically; however, it does
not actually start the service upon completion of the installation, because you must
first validate the installation. Follow these steps for the relevant operating system
to manually start the Data Collector service.
This also starts the Aptare Agent process, Zookeeper, and Kafka services on the
respective systems.
On Windows
The installer configures the Data Collector process as a Service.
On Linux
The installer automatically copies the Data Collector “start” and “stop” scripts to the
appropriate directory, based on the vendor operating system.
/opt/aptare/mbs/bin/aptare_agent start
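The same script also accepts a stop argument, per the start and stop scripts mentioned above. For example, to stop and then start the Data Collector:

/opt/aptare/mbs/bin/aptare_agent stop
/opt/aptare/mbs/bin/aptare_agent start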
Chapter 7
File Analytics Export folder size and folder depth
This chapter includes the following topics:
■ Data export
java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="<ita-install-path>/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport
where the value of the APTARE_HOME property is the absolute path of the aptare
directory.
For example:
java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="/opt/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport
Output format:
Server Name, Volume Name, Folder name, Size in MB, Last Modified
Where:
■ Folder name: The root-level folders in the volume
■ Size in MB: Sum of all the file sizes in the folder (recursively)
■ Last Modified: Maximum modified time stamp from within all the files in the
folder (recursively)
Table 7-1
fa.export.folderDepth | fa.export.includeParents | Directories included in report
0 | N/A | D1, D2, D3
1 | N/A | D1, D1/SD1, D2, D2/SD3, D3
2 | No | D1/SD1, D1/SD1/SD2, D2/SD3, D3
Data export
The data collected for File Analytics is stored in Bantam format on the portal server
in the /opt/aptare/fa/db folder. Separate folders named by timestamp are created,
and each data file name is appended with the host ID and the .bam3 extension.
Each folder also contains a context file (.bam3.cxt) for each host. You can verify
the host ID from Inventory > Backup Servers > Hosts on the portal. Summaries
of this data are stored in the Oracle database, which enables reporting from Reports
> File Analytics.
The export mechanism reads the data from /opt/aptare/fa/db during the export.
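For illustration, a hypothetical layout of this directory (the timestamps, host ID, and exact file names are placeholders):

/opt/aptare/fa/db/
  20250115103000/
    host_101234.bam3
    host_101234.bam3.cxt
  20250116103000/
    host_101234.bam3
    host_101234.bam3.cxt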
To export the File Analytics data:
1 From the portal, go to Admin > Export. A File List Export window is displayed.
2 Click New Export Request. The New Export Request window is displayed.
3 Provide a Name to the export request and click Modify to define the scope of
your export.
4 On the Report Scope Selector window, select the data sources. You can
select file shares and volumes from the Groups tab, or use the Devices tab
to select individual endpoints.
5 Click OK to save your scope and return to the New Export Request window.
6 Once you have defined the scope, you can optionally apply the following filters to
your data export:
■ Owners
■ File Categories
■ Create, Modified, and Accessed date ranges
■ Directory Paths
■ File Extensions
■ File Size
■ File Name
7 Click OK to save the export request. The export request is saved and is
displayed on the File List Export window, with details about the exported files.
To download the File Analytics data locally, you can also select the export
request on the File List Export window and click Download. A compressed
zip file is saved to your default download location.
Appendix A
Firewall configuration: Default ports
This appendix includes the following topics:
https 443
Kafka 9092
MS Exchange 389
MS SQL 1433
Oracle 1521
WMI 135
ZooKeeper 2181
Note: NetBackup IT Analytics uses a standalone installation of a single-node
Apache ZooKeeper server. For secure communications, the ZooKeeper single-node
cluster must be protected from external traffic using network security such as a
firewall. This is remediated by ensuring that the ZooKeeper port (2181) is accessible
only on the local host where the NetBackup IT Analytics Portal/Data Collector
(which includes Apache ZooKeeper) is installed.
EMC VNX (Celerra) XML API 443, 2163, 6389, 6390, 6391, 6392
HP EVA 2372
Hitachi Vantara All-Flash and Hybrid Flash Storage Hitachi Ops Center Configuration Manager REST API: 23450 for HTTP and 23451 for HTTPS
DSCLI
WMI 135
80/443
DCOM >1023
Dell EMC NetWorker Backup & Recovery Port used for Dell EMC NetWorker REST API connection. Default: 9090
SSH 22
80/443
SSH 22
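A minimal sketch of one way to restrict the ZooKeeper port noted above to local traffic with iptables, assuming only the local Portal/Data Collector host needs access to 2181 (adapt to your firewall tooling and security policies):

# allow loopback connections to ZooKeeper, drop everything else on port 2181
iptables -A INPUT -p tcp --dport 2181 -s 127.0.0.1 -j ACCEPT
iptables -A INPUT -p tcp --dport 2181 -j DROP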