NetBackup IT Analytics DC Installation Guide For File Analytics v11.4

The NetBackup IT Analytics Data Collector Installation Guide for File Analytics provides detailed instructions for setting up and managing data collection of unstructured data across enterprise environments. It includes information on pre-installation setup, installation procedures for both Windows and Linux systems, and policies for data collection. The guide emphasizes the importance of understanding system requirements and configurations to optimize data collection and management processes.


NetBackup IT Analytics
Data Collector Installation Guide for File Analytics
Release 11.4

Last updated: 2025-02-03

Legal Notice
Copyright © 2025 Veritas Technologies LLC. All rights reserved.

Veritas and the Veritas Logo are trademarks or registered trademarks of Veritas Technologies
LLC or its affiliates in the U.S. and other countries. Other names may be trademarks of their
respective owners.

This product may contain third-party software for which Veritas is required to provide attribution
to the third party (“Third-party Programs”). Some of the Third-party Programs are available
under open source or free software licenses. The License Agreement accompanying the
Software does not alter any rights or obligations you may have under those open source or
free software licenses. Refer to the Third-party Legal Notices document accompanying this
Veritas product or available at:

https://www.veritas.com/about/legal/license-agreements

The product described in this document is distributed under licenses restricting its use, copying,
distribution, and decompilation/reverse engineering. No part of this document may be
reproduced in any form by any means without prior written authorization of Veritas Technologies
LLC and its licensors, if any.

THE DOCUMENTATION IS PROVIDED "AS IS" AND ALL EXPRESS OR IMPLIED
CONDITIONS, REPRESENTATIONS AND WARRANTIES, INCLUDING ANY IMPLIED
WARRANTY OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE OR
NON-INFRINGEMENT, ARE DISCLAIMED, EXCEPT TO THE EXTENT THAT SUCH
DISCLAIMERS ARE HELD TO BE LEGALLY INVALID. VERITAS TECHNOLOGIES LLC
SHALL NOT BE LIABLE FOR INCIDENTAL OR CONSEQUENTIAL DAMAGES IN
CONNECTION WITH THE FURNISHING, PERFORMANCE, OR USE OF THIS
DOCUMENTATION. THE INFORMATION CONTAINED IN THIS DOCUMENTATION IS
SUBJECT TO CHANGE WITHOUT NOTICE.

The Licensed Software and Documentation are deemed to be commercial computer software
as defined in FAR 12.212 and subject to restricted rights as defined in FAR Section 52.227-19
"Commercial Computer Software - Restricted Rights" and DFARS 227.7202, et seq.
"Commercial Computer Software and Commercial Computer Software Documentation," as
applicable, and any successor regulations, whether delivered by Veritas as on premises or
hosted services. Any use, modification, reproduction, release, performance, display or disclosure
of the Licensed Software and Documentation by the U.S. Government shall be solely in
accordance with the terms of this Agreement.

Veritas Technologies LLC


2625 Augustine Drive
Santa Clara, CA 95054
https://www.veritas.com

Technical Support
Technical Support maintains support centers globally. All support services will be delivered
in accordance with your support agreement and the then-current enterprise technical support
policies. For information about our support offerings and how to contact Technical Support,
visit our website:

https://www.veritas.com/support

You can manage your Veritas account information at the following URL:

https://my.veritas.com

If you have questions regarding an existing support agreement, please email the support
agreement administration team for your region as follows:

Worldwide (except Japan) [email protected]

Japan [email protected]

Documentation
Make sure that you have the current version of the documentation. Each document displays
the date of the last update on page 2. The latest documentation is available on the Veritas
website.

Veritas Services and Operations Readiness Tools (SORT)


Veritas Services and Operations Readiness Tools (SORT) is a website that provides information
and tools to automate and simplify certain time-consuming administrative tasks. Depending
on the product, SORT helps you prepare for installations and upgrades, identify risks in your
datacenters, and improve operational efficiency. To see what services and tools SORT provides
for your product, see the data sheet:

https://sort.veritas.com/data/support/SORT_Data_Sheet.pdf
Contents

Chapter 1    Pre-Installation setup for File Analytics ........................... 6
                 Pre-Installation setup for File Analytics ....................... 6
                 File Analytics Data Collection overview ......................... 7
                 File Analytics Data Collection architecture ..................... 8
                 File Analytics Data Collector policies .......................... 8
                 Prerequisites for adding Data Collectors (File Analytics) ....... 9
                 CIFS shares ..................................................... 9
                 Host Discovery and Collection File Analytics probe ............. 10
                     File Analytics Probe Configurations by operating system .... 10
                     Both Windows and Linux servers ............................. 10
                     Best practices for host Inventory File Analytics probes .... 11
                 Adding a File Analytics Data Collector policy .................. 11
                     Importing the CIFS share configuration ..................... 16
                 Set up FIPS compliant Data Collector for File Analytics ........ 16

Chapter 2    File Analytics Export Folder Size and Folder Depth ................ 57
                 Extracting File Analytics export folder size ................... 57
                 Specifying the File Analytics folder depth ..................... 58

Chapter 3    Installing the Data Collector software ............................ 21
                 Introduction ................................................... 21
                 Installing the WMI Proxy service (Windows host resources only) . 22
                 Testing WMI connectivity ....................................... 22
                 Considerations to install Data Collector on non-English systems  25
                 Install Data Collector Software on Windows ..................... 27
                 Install Data Collector software on Linux ....................... 36
                 Deploy Data Collector in native Kubernetes environment ......... 39

Chapter 4    Validating Data Collection ........................................ 42
                 Validation methods ............................................. 42
                 Data Collectors: Vendor-Specific validation methods ............ 43
                 Working with on-demand Data Collection ......................... 45
                     View real-time logging during an on-demand collection ...... 47
                     Generating debug level logs during an on-demand collection . 48
                 Using the CLI check install utility ............................ 50
                 List Data Collector configurations ............................. 52

Chapter 5    Uninstalling the Data Collector ................................... 53
                 Uninstall the Data Collector on Linux .......................... 53
                 Uninstall the Data Collector on Windows ........................ 54

Chapter 6    Manually starting the Data Collector .............................. 55
                 Introduction ................................................... 55

Chapter 7    File Analytics Export folder size and folder depth ................ 57
                 Extracting File Analytics export folder size ................... 57
                 Specifying the File Analytics folder depth ..................... 58
                 Data export .................................................... 59

Appendix A   Firewall configuration: Default ports ............................. 61
                 Firewall configuration: Default ports .......................... 61


Chapter 1
Pre-Installation setup for File Analytics

This chapter includes the following topics:

■ Pre-Installation setup for File Analytics

■ File Analytics Data Collection overview

■ File Analytics Data Collection architecture

■ File Analytics Data Collector policies

■ Prerequisites for adding Data Collectors (File Analytics)

■ CIFS shares

■ Host Discovery and Collection File Analytics probe

■ Adding a File Analytics Data Collector policy

■ Set up FIPS compliant Data Collector for File Analytics

Pre-Installation setup for File Analytics


In most cases, a single instance of the Data Collector can support any number of
enterprise objects. However, each environment has its own unique deployment
configuration requirements, so it is important to understand where the Data Collector
software must be installed so that you can determine how many Data Collectors
must be installed and which servers are best suited for the deployment.
When the NetBackup IT Analytics system collects data from any vendor subsystem, the collection process expects name/value pairs to be in US English, and requires the installation to be done by an Administrator with a US English locale. The server's language version can be non-US English.

File Analytics Data Collection overview


File Analytics offers a highly optimized, lightweight data collector that uncovers files
that are needlessly consuming storage. The bulk of wasted space often can be
attributed to unstructured data, that is, files that contain data such as email, word
processing documents, presentations, and digital media. Unlike structured data,
which is managed by database administrators, unstructured data tends to proliferate
with reckless abandon, resulting in stagnant and duplicated data. To assess the
value of data files, whether they are critical to operations or are merely forgotten
files, File Analytics provides data profiling. This data classification enables
aggregation of file types and data growth forecasting. In addition, storage
administrators can make archiving decisions based on the age of the data and its
relevance to business objectives.
Once data collection completes, a number of reports provide insight into file profiles
in your enterprise. Using built-in categorization and also file categories that you can
customize, File Analytics becomes part of your arsenal of storage management
tools. File Analytics aggregates the data, making it easy for you to identify offenders,
such as:
■ Deduplication candidates
■ File ownership (or lack thereof)
■ Largest files
■ File Categories (Top file types by volume)

File Analytics Data Collection architecture

File Analytics Data Collector policies


File Analytics data collection can be configured in different ways, depending on the
location of the files you want to profile in your enterprise. Determine which of the
following approaches is relevant for your environment:
■ Create a File Analytics Data Collector policy to configure a file shares probe.
This Data Collector traverses the network CIFS shares to collect and categorize
filesystem storage consumption metadata. This highly optimized traversal profiles
unstructured data to enable you to identify storage that can be reclaimed. Use
this for any CIFS share that is serving up files that you want to profile, regardless
of manufacturer of the appliance.
■ The Host Discovery and Collection option directly interrogates hosts’ attached storage
to profile files. Configure a host probe for Windows (WMI Proxy) and Linux (SSH)
collection of a host’s locally attached storage.
See “Host Discovery and Collection File Analytics probe” on page 10.
■ Create a Veritas NetBackup Collection policy and configure File Analytics in it
to collect file information of backed up clients from the NetBackup Primary
Server. Refer to the Configuring File Analytics in NetBackup Data Collector
Policy section of the Data Collector installation Guide for Backup Manager for
the configuration steps.

Prerequisites for adding Data Collectors (File Analytics)
■ 64-bit OS. See the Certified Configurations Guide for supported operating
systems.
■ Supports Amazon Corretto 17. Amazon Corretto is a no-cost, multi-platform,
production-ready distribution of the Open Java Development Kit (OpenJDK).
■ For performance reasons, do not install Data Collectors on the same server as
the NetBackup IT Analytics Portal. However, if you must have both on the same
server, verify that the Portal and Data Collector software do not reside in the
same directory.
■ Install only one Data Collector on a server (or OS instance).

CIFS shares
■ This Data Collector can collect both Linux and Windows shares. The
recommended Windows Data Collector server operating system is Windows
Server. See the Data Collector Supported Operating Systems sections of the
NetBackup IT Analytics Certified Configurations Guide for supported OS versions.
■ The Windows LAN Manager authentication level, in the local security policy
security options, must be modified to: Send LM & NTLM - use NTLMv2 session
security if negotiated. This allows the Data Collector to invoke the net use
command with the password supplied on the command line. Without this setting,
later versions of Windows will terminate with a system error 86 (invalid password).
■ Windows CIFS Shares collection requires the Windows Domain User ID. This
User ID must have Administrative privileges.
■ Linux CIFS Shares collection requires super-user root privileges. Access control
commands, such as sudo, sesudo, and pbrun are also supported. If using any
of the access control commands, verify that the User ID has sudo, sesudo, or
pbrun privileges.
■ Collection of owner data for Windows and CIFS is configurable via Advanced
Parameters. Data collection completes faster when owner data is not collected
(the default). Although not recommended, you can configure an Advanced
Parameter, FA_RESOLVE_OWNERS set to Y, to enable owner collection. To
access Advanced Parameters in the Portal, select Admin > Advanced >
Parameters.
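The LAN Manager authentication level described above can also be set through the registry. As a hedged sketch (the mapping of this policy name to LmCompatibilityLevel = 1 is an assumption to verify against Microsoft's security policy documentation before applying), from an elevated command prompt:

```shell
REM Assumption: "Send LM & NTLM - use NTLMv2 session security if negotiated"
REM corresponds to LmCompatibilityLevel value 1. Verify in your environment.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Lsa" ^
    /v LmCompatibilityLevel /t REG_DWORD /d 1 /f
```

Alternatively, set the option interactively under Local Security Policy > Security Options.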

Host Discovery and Collection File Analytics probe
Using Host Resources data collection, hosts are discovered and added to the Host
Discovery and Collection list. Once a host is listed in the inventory, it can be selected
and the File Analytics probe can be configured. To access the Host Inventory to
enable File Analytics probes: Admin > Data Collection > Host Discovery and
Collection
Note that by design, File Analytics host resources data collection occurs via
activation of the probe in the Host Discovery and Collection window in the Portal.

File Analytics Probe Configurations by operating system


Refer to the Certified Configurations Guide for complete details.
Windows servers: A Domain Administrator user ID is required when collecting
file-level data for File Analytics.
Linux servers: Linux is only supported with the following requirements:
■ Root user access is supported.
■ Non-root user access with sudo access control is supported.
■ Non-root user access without sudo is not supported.
■ Running collection with a sudo user on a Linux server requires the addition of
an access control command for the server in the Host Discovery and Collection’s
Manage Access Control window: Admin > Data Collection > Host Discovery
and Collection
Also, an advanced parameter must be created: FA_USE_SUDO set to Y.
To access Advanced Parameters in the Portal, select Admin > Advanced >
Parameters.

Both Windows and Linux servers


If running collection via the checkinstall utility, verify the following:
■ An advanced parameter must be created: FA_HOST_VALIDATE set to Y. To
access Advanced Parameters in the Portal, select Admin > Advanced >
Parameters.

Best practices for host Inventory File Analytics probes


Since most environments have hundreds, even thousands of hosts, it is
recommended that File Analytics probes be configured in a staggered schedule so
as not to overload the Data Collector server.

Adding a File Analytics Data Collector policy


One of the types of data collection that can be configured for File Analytics is
collection of CIFS shares. The Data Collector will take the configuration that you
specify, including the share names and credentials, and then traverse the file system
structure to identify these shared resources on your network and collect the relevant
metadata.
Before adding the policy: A Data Collector must exist in the Portal, to which you
will add Data Collector Policies.
For specific prerequisites and supported configurations for a specific vendor, see
the Certified Configurations Guide.
To add the policy
1 Select Admin > Data Collection > Collector Administration. Currently
configured Portal Data Collectors are displayed.
2 Search for a Collector if required.
3 Select a Data Collector from the list.

4 Click Add Policy, and then select File Analytics policy.



5 Enter or select the parameters. Mandatory parameters are denoted by an asterisk (*):

Field Description

Collector Domain The domain of the collector to which the collector backup policy is being added.
This is a read-only field. By default, the domain for a new policy will be the same
as the domain for the collector. This field is set when you add a collector.

Policy Domain The Collector Domain is the domain that was supplied during the Data Collector
installation process. The Policy Domain is the domain of the policy that is being
configured for the Data Collector. The Policy Domain must be set to the same
value as the Collector Domain. The domain identifies the top level of your host
group hierarchy. All newly discovered hosts are added to the root host group
associated with the Policy Domain.

Typically, only one Policy Domain will be available in the drop-down list. If you
are a Managed Services Provider, each of your customers will have a unique
domain with its own host group hierarchy.

To find your Domain name, click your login name and select My Profile from the
menu. Your Domain name is displayed in your profile settings.

Name* Enter a name that will be displayed in the list of Data Collector policies.

Schedule* Click the clock icon to create a schedule. Every Minute, Hourly, Daily, Weekly,
and Monthly schedules may be created. Advanced use of native CRON strings
is also available.

Examples of CRON expressions:


*/30 * * * * means every 30 minutes

*/20 9-18 * * * means every 20 minutes between the hours of 9am and 6pm

*/10 * * * 1-5 means every 10 minutes Mon - Fri.

Explicit schedules set for a Collector policy are relative to the time on the Collector
server. Schedules with frequencies are relative to the time that the Data Collector
was restarted.

Shares* Click Add to configure the CIFS shares that the collector will probe.

Click Edit to modify a CIFS share configuration.


Note that the Import button in this window enables bulk loading of CIFS shares.

See “Importing the CIFS share configuration” on page 16.

Notes Enter or edit notes for your data collector policy. The maximum number of
characters is 1024. Policy notes are retained along with the policy information
for the specific vendor and displayed on the Collector Administration page as a
column making them searchable as well.
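The CRON field syntax used in the Schedule examples can be illustrated with a small sketch (this is not the product's scheduler; it only shows how the "*", "*/step", and "a-b" forms in one field expand):

```python
# Sketch of CRON field expansion (illustration only, not the product's parser).
def expand_field(field, lo, hi):
    """Expand one CRON field over its valid range [lo, hi]."""
    if field == "*":
        return list(range(lo, hi + 1))
    if field.startswith("*/"):
        # "*/30" in the minute field means minutes 0 and 30.
        return list(range(lo, hi + 1, int(field[2:])))
    if "-" in field:
        # "9-18" in the hour field covers 9am through 6pm inclusive.
        a, b = map(int, field.split("-"))
        return list(range(a, b + 1))
    return [int(field)]

print(expand_field("*/30", 0, 59))  # [0, 30]
print(expand_field("9-18", 0, 23))  # [9, 10, 11, 12, 13, 14, 15, 16, 17, 18]
```

So `*/30 * * * *` fires whenever the minute is 0 or 30, i.e. every 30 minutes.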

6 Enter or select CIFS shares configuration parameters in the File Analytics Shares window.

Field Description Sample Value

Host/Device* Enter the host IP address or host name for the device that is being 172.1.1.1
probed for CIFS shares. This also could be a non-host device, such as
a NetApp array.

Share* Enter the name of the CIFS share that the Data Collector will probe. HOME

Protocol* CIFS is currently the only option.

Authentication Click either Anonymous or Use Credentials.

If you are using credentials, click Add to configure the CIFS share
credentials, or select an existing credential definition and click Edit.

7 Enter credentials in the Credentials window.

Field Description Sample Value

Name* Assign a name to identify the set of credentials that you are defining.

Account* Enter the login account name used to log in to the hosts. If the policy includes root
a group of Windows hosts, use the Windows Domain user ID. This user ID
must have administrative privileges.

For Linux hosts, super-user root privileges are required. You also could use
an access control command, such as sudo, sesudo, or pbrun. If using any of
these access commands, ensure that the user ID has sudo, sesudo, or pbrun
privileges. Some enterprises prefer to create a new user and provide access
to commands via an access control command.

Description Enter a note to help identify the credential.

Password Enter the password for the account.

OS Type* Select either Windows, Linux, or NAS.

Windows Domain* For Windows hosts only: Specify the Windows domain name. If the host is
not a member of a domain, or to specify a local user account, use a period
(.)

Field Description Sample Value

Private Key File For Linux hosts only: If you have configured a Public/Private Key between
your Data Collector server and the hosts you intend to monitor, specify the
location of the Private Key file on the Data Collector Server.

Known Hosts File For Linux hosts only: If you have configured a Public Key/Private Key between
your Data Collector server and the hosts you intend to monitor, specify the
location of the Known Hosts file on the Data Collector Server.

8 Click OK to close and save the configuration in each window.

Importing the CIFS share configuration


The import feature facilitates entry of a large number of CIFS shares. Simply paste
the details in comma-separated format into the window and click OK.
Data Format:

host, share, protocol (CIFS), credential name

The Credential Names, already configured for the current Domain, are displayed
at the top of the window.
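For example, an import paste for two hypothetical filers (the host names, share names, and credential names below are illustrative only) could look like:

```
nas01.example.com,HOME,CIFS,NasCredentials
nas01.example.com,ENGINEERING,CIFS,NasCredentials
filer02.example.com,FINANCE,CIFS,FinanceCreds
```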

Set up FIPS compliant Data Collector for File Analytics
To become FIPS 140-2 compliant, you must configure the Data Collector for File
Analytics as recommended below:

1. To enable FIPS compliance, you must install the Data Collector on a FIPS-compliant system.
2. Ensure that the Data Collector and the target Windows file server both are
configured in FIPS mode.
3. Specify vers=2 as the protocol version used between the collector and the
target system.
4. Ensure Kerberos authentication is used on the target system.

Note: Steps to set up a Windows file server in FIPS mode and with Kerberos are beyond the
scope of this document. Refer to the relevant product documentation.
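On a Linux collector, the protocol version and Kerberos requirements from steps 3 and 4 would typically surface as CIFS mount options. A sketch under the assumption that the share is reached with mount.cifs (server, share, and mount point names are illustrative):

```shell
# Assumption: the target share is mounted via mount.cifs; adjust names.
# vers=2.0 pins the SMB protocol version; sec=krb5 forces Kerberos auth.
mount -t cifs //fileserver.example.com/share /mnt/fa-share \
    -o vers=2.0,sec=krb5
```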
Chapter 2
File Analytics Export Folder Size and Folder Depth
This chapter includes the following topics:

■ Extracting File Analytics export folder size

■ Specifying the File Analytics folder depth

Extracting File Analytics export folder size


To extract the first-level folder size information from the File Analytics database:
1. At the Linux command prompt, run the following command:

java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="<ita-install-path>/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport

where the value of APTARE_HOME property is the absolute path of the aptare
directory.
For example:

java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="/opt/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport

This generates an output file: report.csv



Output format:

Server Name, Volume Name, Folder name, Size in MB, Last Modified

Where:
■ Folder name: The root-level folders in the volume
■ Size in MB: Sum of all the file sizes in the folder (recursively)
■ Last Modified: Maximum modified time stamp from within all the files in the
folder (recursively)
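As a quick illustration of consuming this output (a hypothetical helper, not part of the product; the server and volume names are made up), the five columns can be parsed and the sizes totaled per server:

```python
import csv
import io

# Made-up rows in the report.csv column order documented above:
# Server Name, Volume Name, Folder name, Size in MB, Last Modified
REPORT = """\
filer01,vol1,HOME,1024,2025-01-15
filer01,vol1,DATA,2048,2025-01-20
filer02,vol2,ARCHIVE,512,2024-12-31
"""

def size_per_server(csv_text):
    """Sum the Size in MB column for each server."""
    totals = {}
    for server, volume, folder, size_mb, last_modified in csv.reader(io.StringIO(csv_text)):
        totals[server] = totals.get(server, 0.0) + float(size_mb)
    return totals

print(size_per_server(REPORT))  # {'filer01': 3072.0, 'filer02': 512.0}
```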

Specifying the File Analytics folder depth


A family of Java system properties, -Dfa.export.*, is available to specify folder depth for File Analytics.
■ To specify the folder depth for the report summary, add the following parameter
when executing the command -Dfa.export.folderDepth=x where "x" is the
depth. By default the depth is set to 1.
■ To turn off reporting on parents, add the following parameter when executing
the command -Dfa.export.includeParents=No. By default reporting on parents
is turned on.
■ To specify the name of the output file use
-Dfa.export.reportFileName=SomeReportName.csv. If this parameter is not
specified the default output file will be report.csv.
For example:

java -classpath /opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-Dfa.export.folderDepth=2 -Dfa.export.includeParents=No
-Dtest.resourceLocation=/opt/aptare/portal/WEB-INF/classes/
com.aptare.sc.service.fa.FaSubDirectoryReport

Sample Directory Structures and Results

As an example, the table that follows uses these directory structures to show the
results of different parameter values:
■ D1
■ D1/SD1
■ D1/SD1/SD2
■ D2/SD3
■ D3

This table illustrates the expected results given the different parameter values:

Table 2-1

fa.export.folderDepth   fa.export.includeParents   Directories Included in Report
0                       N/A                        D1, D2, D3
1                       N/A                        D1, D1/SD1, D2, D2/SD3, D3
2                       No                         D1/SD1, D1/SD1/SD2, D2/SD3, D3
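The depth column can be sketched in a few lines. This is an assumption about the selection rule inferred from the table above for the default includeParents behavior (the table remains authoritative, in particular for the includeParents=No row), using the sample structure where D1/SD1/SD2, D2/SD3, and D3 are the deepest directories:

```python
# Sketch of folderDepth selection (inferred, not the shipped implementation).
# Depth 0 is the top level of the volume.
def dirs_at_depth(leaf_paths, depth):
    """Return every directory whose depth is <= the export depth."""
    dirs = set()
    for path in leaf_paths:
        parts = path.split("/")
        # Record the directory and each of its ancestors.
        for i in range(1, len(parts) + 1):
            dirs.add("/".join(parts[:i]))
    return sorted(d for d in dirs if d.count("/") <= depth)

SAMPLE = ["D1/SD1/SD2", "D2/SD3", "D3"]
print(dirs_at_depth(SAMPLE, 0))  # ['D1', 'D2', 'D3']
print(dirs_at_depth(SAMPLE, 1))  # ['D1', 'D1/SD1', 'D2', 'D2/SD3', 'D3']
```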
Chapter 3
Installing the Data Collector software
This chapter includes the following topics:

■ Introduction

■ Installing the WMI Proxy service (Windows host resources only)

■ Testing WMI connectivity

■ Considerations to install Data Collector on non-English systems

■ Install Data Collector Software on Windows

■ Install Data Collector software on Linux

■ Deploy Data Collector in native Kubernetes environment

Introduction
This section includes the instructions for installing the Data Collector software on
the Data Collector Server. Data Collector software is supported on various flavors
of Linux and Windows. On Windows, if you are collecting data from host resources,
you may need to install the WMI Proxy Service. The WMI Proxy Service is installed
by default, as part of the Data Collector installation on a Windows server.
A GUI based version is available for Windows and a console (command line) based
interface is available for Linux.
When the NetBackup IT Analytics system collects data from any vendor subsystem,
the collection process expects name/value pairs to be in US English, and requires
the installation to be done by an Administrator with a US English locale. The server’s
language version can be non-US English.

Note: Log in as a Local Administrator to have the necessary permissions for this
installation.

Installing the WMI Proxy service (Windows host resources only)
To collect data from Windows hosts, choose a Windows host on which to install
the WMI proxy.
■ This is required only if you are collecting data from Windows Host Resources.
■ The WMI Proxy needs to be installed on only one Windows host.
■ If the Data Collector is on a Windows server, the WMI Proxy will be installed
there as part of the Data Collector installation.
■ If the Data Collector is on a Linux server, you must identify a Windows server
on which to install the WMI proxy service.
See “Install Data Collector Software on Windows” on page 27.

Testing WMI connectivity


The Windows Management Instrumentation (WMI) Proxy is used by NetBackup IT
Analytics to collect data from Windows hosts. Should you have connectivity issues,
these steps can be taken to test and troubleshoot connectivity.
To verify that WMI is working properly, take the following steps:
1. Log in to the Data Collector server as an Administrator.
2. From the Windows Start menu, type Run in the search box to launch the
following window where you will enter wbemtest.exe and click OK.

3. In the Windows Management Instrumentation Tester window, click Connect.

4. In the Connect window, preface the Namespace entry with the IP address or
hostname of the target remote server in the following format:

\\<IP Address>\root\cimv2

5. Complete the following fields in the Connect window and then click Connect.
■ User - Enter the credentials for accessing the remote computer. This may
require you to enable RPC (the remote procedure call protocol) on the
remote computer.
■ Password
■ Authority: Enter NTLMDOMAIN:<NameOfDomain>
where NameOfDomain is the domain of the user account specified in the
User field.

6. Click Enum Classes.


7. In the Superclass Info window, select the Recursive radio button, but do not
enter a superclass name. Then, click OK.
8. The WMI Tester will generate a list of classes. If this list does not appear, go
to the Microsoft Developer Network web site for troubleshooting help.
https://msdn.microsoft.com/en-us/library/ms735120.aspx

Considerations to install Data Collector on non-English systems
This section describes the prerequisites of NetBackup IT Analytics Data Collector
installation on a non-English Windows or a non-English Linux host. Apart from
English, Data Collector installation is supported in the following locales, provided
the Data Collector host system locale is set to any one of these languages:
■ Simplified Chinese
■ French
■ Korean
■ Japanese
After you have set one of the above as system locale, the installation progress and
responses appear in the preferred locale. If the system locale is set to any other
non-supported locale, the installation progress and responses appear in English.
The OS-specific requirements are mentioned below.

Non-English Linux OS
On a non-English Linux host:
■ The user locale can be one of the non-English supported locales if the Data
Collector will collect only from a Veritas product.
■ The user locale must be English if the Data Collector will be used to collect from
any non-Veritas product.
To install the Data Collector in one of the supported locales, verify whether the host
OS has multiple languages and then add the preferred locale for the installation.
The procedure below guides you to set one of the supported languages as the
system locale.
To set one of the supported languages as the system locale for Data Collector
installation, set the preferred language as described below:
1 Check the current language.

#locale

2 Check whether your system has multiple languages:

#locale -a

3 To change the System locale into one of the supported languages, run the
command #vi /etc/profile and add the following at the end of the file based
on your preferred language:
■ To add Simplified Chinese:

export LANG=zh_CN.utf8
export LC_ALL=zh_CN.utf8

■ To add French:

export LANG=fr_FR.utf8
export LC_ALL=fr_FR.utf8

■ To add Korean

export LANG=ko_KR.utf8
export LC_ALL=ko_KR.utf8

■ To add Japanese

export LANG=ja_JP.utf8
export LC_ALL=ja_JP.utf8

4 Reboot the host to set the desired system locale for the Data Collector
installation.
Having completed setting the system locale, proceed with the Data Collector
installation, with the appropriate user locale.
See “Install Data Collector software on Linux” on page 36.
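Before editing /etc/profile, you can confirm that the preferred locale is actually installed on the host. This sketch checks for French; swap in any of the supported locales listed above:

```shell
# Check whether the preferred locale is available before setting it system-wide.
want="fr_FR.utf8"
if locale -a 2>/dev/null | grep -qix "$want"; then
    status="available"
else
    status="missing"   # install the language pack for this locale, then re-check
fi
printf '%s: %s\n' "$want" "$status"
```

If the locale is missing, add it through your distribution's language-pack tooling before appending the LANG and LC_ALL exports.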

Non-English Windows OS
Veritas recommends that the user locale to be set to English while installing the
Data Collector on a non-English Windows host, be it for a Veritas or a non-Veritas
product.
To verify the user locale and system locale respectively before the Data Collector
installation, run the get-culture and get-winsystemlocale commands from
PowerShell Windows. This way, you can decide which user locale to set for the
Data Collector installation.
If you must run the Data Collector installer in one of the supported locales, ensure
the Windows OS is installed in either Simplified Chinese, French, Korean, or
Japanese. Avoid having Windows OS in English, installed with language pack and
changing the locale later. The Data Collector installer detects the locale from the
Windows Language Settings and launches the installer in the respective locale. If

the Windows Time & Language Setting is set to a language other than Simplified
Chinese, French, Korean, or Japanese, the installer is launched in English.
See “Install Data Collector Software on Windows” on page 27.

Install Data Collector Software on Windows


To install your Data Collector software:
1 Log in to the Data Collector server as a local administrator.
2 Go to the downloads section under Support on www.veritas.com and click the
relevant download link.
Once downloaded, the Data Collector Installation Wizard launches
automatically. If it does not, navigate to its directory and double-click the
executable file Setup.exe.
3 Review the recommendations on the welcome page and click Next.
You are advised to close all other programs during this installation.

4 The installation wizard validates the system environment. On successful
validation, click Next.

5 Review the End User License Agreement (EULA), select I accept the terms
of the license agreement, and click Next.

6 Specify the directory where you would like to install the Data Collector software
and click Next. The default Windows path is
C:\Program Files\Veritas\AnalyticsCollector. Accepting the default path
is recommended.
If you specify a custom directory, the installer creates the AnalyticsCollector
folder within the specified directory.

7 Provide accurate details as described below and then click Next.

Data Collection Task: Select Data Collector (includes WMI Proxy) or
WMI Proxy Server (only) from the list. A single Data Collector can be
installed for multiple vendor subsystems on a single server.

Data Collector Registration File: Enter the absolute path of the registration
file downloaded from the NetBackup IT Analytics Portal. If a registration
file is not available, generate and download it from the Portal and provide
its path. This will auto-populate the next three fields.

Data Collector Name: Read-only and auto-populated.

Data Collector Passcode: Read-only and auto-populated.

Data Receiver URL: Read-only and auto-populated.

Proxy Settings:

1 HTTP/HTTPS: Enter the hostname or IP address and a port number.

2 UserId: User ID of the proxy server.

3 Password: Password of the proxy server.

4 No Proxy For: Enter the host names or IP addresses, separated by
commas, that will not be routed through the proxy.

8 Review the installation summary and the available disk space before you
proceed with the installation.

9 Click Next to initiate the installation.



10 Review the post install details and click Next.



11 To validate the Data Collector installation, run the
C:\Program Files\Veritas\AnalyticsCollector\mbs\bin\checkinstall.bat
batch file.
Close the terminal window once the validation is complete and then click Next.

If you wish to run checkinstall.bat later, you can run the script from the
command prompt.

12 On successful installation of NetBackup IT Analytics Data Collector, click Finish.


Your Data Collector installation is complete.

Install Data Collector software on Linux


To install Data Collector software on Linux:
1 Log in as root on the server where the NetBackup IT Analytics Data Collector
is to be installed.
2 If the Data Collector system has low entropy, the performance of cryptographic
functions can be affected and such steps can take a considerable amount of
time to complete. You can identify the entropy level of the system from the
content of the /proc/sys/kernel/random/entropy_avail file using the command
# cat /proc/sys/kernel/random/entropy_avail. If this value is not more
than 400 consistently, install the rng-tools and start the services as described
below on the Data Collector system.
For RHEL or OEL:
■ Access the command prompt.

■ Install the rng-tools.

yum install rng-tools

■ Start the services.

systemctl start rngd

■ Enable the services.

systemctl enable rngd

For SUSE:
■ Access the command prompt.
■ Install the rng-tools.

zypper install rng-tools

■ Start the services.

systemctl start rng-tools

■ Enable the services.

systemctl enable rng-tools
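The entropy check from step 2 can be wrapped in a small script. This is a sketch: the 400 threshold comes from this guide, and the file path is the standard Linux kernel interface.

```shell
# Warn when available entropy is at or below the threshold from this guide.
ENTROPY_FILE=/proc/sys/kernel/random/entropy_avail
THRESHOLD=400

if [ -r "$ENTROPY_FILE" ]; then
    entropy=$(cat "$ENTROPY_FILE")
else
    entropy=0   # treat an unreadable file as low entropy
fi

if [ "$entropy" -gt "$THRESHOLD" ]; then
    status="OK"
else
    status="LOW"   # install rng-tools and start rngd as described above
fi
printf '%s: entropy=%s\n' "$status" "$entropy"
```

Run the script a few times over several minutes; a consistently LOW result is the signal to install rng-tools.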

3 Ensure the following rpms are present on the system:


On SUSE: libXrender1, libXtst6, and insserv-compat
On other Linux systems: libXtst, libXrender, and chkconfig
Since the above rpms are essential for the proper functioning of the Data
Collector, you can run the commands below on the Data Collector server to
check whether the rpms are present.
On SUSE: rpm -q libXrender1 libXtst6 insserv-compat
On other Linux systems: rpm -q libXtst libXrender chkconfig
The output of the above commands will print the rpms that are present on the
system.
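The per-distribution checks above can be combined into one sketch that reports any missing rpms. The non-SUSE package list is shown; on a host without the rpm tool it simply reports that:

```shell
# Report any required rpms that are missing (non-SUSE list shown; for SUSE
# check libXrender1 libXtst6 insserv-compat instead).
missing=""
if command -v rpm >/dev/null 2>&1; then
    for pkg in libXtst libXrender chkconfig; do
        rpm -q "$pkg" >/dev/null 2>&1 || missing="$missing $pkg"
    done
    if [ -z "$missing" ]; then
        result="all-present"
    else
        result="missing:$missing"
    fi
else
    result="rpm-unavailable"
fi
printf '%s\n' "$result"
```

Install any package the script reports as missing before starting the installer.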
4 Go to the downloads section under Support on www.veritas.com and click the
relevant download link.

5 Mount the ISO image that you downloaded.

mkdir /mnt/diska
mount -o loop <itanalytics_datacollector_linux_xxxxx.iso>
/mnt/diska

Substitute the name of the ISO image you have downloaded.


6 Start the installer:

cd /
/mnt/diska/dc_installer.sh

7 Review the End User License Agreement (EULA) and enter accept to agree.
8 Provide the install location. The default location is
/usr/openv/analyticscollector. Accepting the default path is
recommended.
If you specify a custom location, an analyticscollector directory is created
at the specified location.
9 The installer requests the following details.
■ Data Collector Registration File Path: Enter the absolute file path of the
registration file generated and downloaded from the NetBackup IT Analytics
Portal.
■ Web Proxy (HTTP) settings can be configured. Enter y to configure proxy.
The installer prompts for:
■ HTTP Proxy IP Address: Enter the hostname or IP address and a port
number.
■ HTTP Proxy Port: Enter the proxy port number for HTTP proxy.
■ Proxy UserId and password: Enter the credentials for the proxy server.
■ No Proxy For: Enter the host names or IP addresses separated by
commas that will not be routed through the proxy.

The Data Collector installation is complete. You can run the
<Data_Collector_Install_Location>/analyticscollector/mbs/bin/checkinstall.sh
file for verification.

Deploy Data Collector in native Kubernetes environment
This procedure provides the steps to deploy the Data Collector Docker image on
a Kubernetes cluster through an operator with the required configuration on Linux
hosts. This method enables efficient Data Collector installation and reduces the
human errors caused during manual or ISO-based installations.

Prerequisites and dependencies


System requirements and installation dependencies for the system on which Data
Collector will be installed are listed below:
■ Obtain the Docker image generated from the CI/CD build.
■ Kubernetes must be pre-installed on the system.
■ Assume root role on the host system.
■ Kubernetes cluster must be accessible on the system.
■ Ensure that the file system supporting the /data directory has enough free
space as recommended in the NetBackup IT Analytics Certified Configurations
Guide for Data Collector.
The /data directory in the host system will be mounted inside the container as
/usr/openv/analyticscollector.

■ Obtain the following Data Collector details. You are required to supply these
details to the installer during the installation process.
■ Registry: The name of the registry to which you want to push the installer
images.
■ Absolute path of Data Receiver Certificate file: Absolute path of the data
receiver certificate file downloaded from NetBackup IT Analytics Portal.
■ Absolute path of the Data Collector Registration File: Absolute path of the
Data Collector registration file downloaded from the NetBackup IT Analytics
Portal.
■ Proxy settings:
■ Portal IP address: IP address of the system hosting the NetBackup IT
Analytics Portal.
■ Portal HostName: aptareportal.<DOMAIN> or itanalyticsportal.<DOMAIN>
■ Agent HostName: aptareagent.<DOMAIN> or itanalyticsagent.<DOMAIN>
■ StorageClass Name: Name of the Kubernetes storage class to be used.

■ Obtain the itanalytics_k8s_artifacts.tar from the Veritas Download
Center. The tarball has the container image, operator image, a set of .yaml
files, and the scripts.

Deploy the Data Collector in Kubernetes environment


To deploy the Data Collector in Kubernetes environment:
1 Login to the Kubernetes cluster.
2 Run this command on the primary node to label the node on which you want
to deploy the Data Collector.

kubectl label node <worker_node_name> itaDcNodeKey=itaDcDeploymentNode

3 From the itanalytics_k8s_artifacts.tar location, run this command to
initiate the Data Collector installation.

tar -xvf itanalytics_k8s_artifacts.tar scripts

This saves a scripts folder at the itanalytics_k8s_artifacts.tar file
location.
4 From the scripts folder, run this script.

cd scripts/
sh itanalytics_dc_installer.sh

Note: The installation logs are saved to
itanalytics_dc_installer_<time_stamp>.log.

5 Provide the Data Collector configuration details when asked by the installer in
the following order.
■ Registry
The installer asks for a confirmation after you provide the registry name,
to proceed with pushing the images. Enter y for a fresh installation.
If for any reason you must re-run the installation and this step was
completed successfully at any time before for the same cluster node, you
can enter n to avoid a rewrite and bypass this step.
■ Absolute path of Data Receiver Certificate file (if you have set an https://
URL for the data receiver)
■ Absolute path of the Data Collector registration file
■ Proxy settings

■ Portal IP address
■ Portal HostName
■ Agent HostName
■ StorageClass Name

6 The installer asks you to confirm the configuration details before proceeding
with the installation. Enter y to proceed with the Data Collector installation.
After a successful installation, verify whether the Data Collector status appears
Online on the NetBackup IT Analytics Portal.

Connect to the pod instance


Run this command to connect to the pod instance and also to facilitate debugging
when required.

# kubectl exec -it <pod ID> -- bash
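The labeling, extraction, and installer steps above can be summarized as a dry-run helper that only prints the commands in order; it executes nothing, and <worker_node_name> remains the placeholder from step 2:

```shell
# Print the Kubernetes deployment commands in order without executing them.
node="<worker_node_name>"   # placeholder; substitute the labeled worker node
for cmd in \
    "kubectl label node $node itaDcNodeKey=itaDcDeploymentNode" \
    "tar -xvf itanalytics_k8s_artifacts.tar scripts" \
    "sh scripts/itanalytics_dc_installer.sh"; do
    printf '%s\n' "$cmd"
done
```

Reviewing the printed sequence before a real run helps catch a mislabeled node or a missing tarball early.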


Chapter 4
Validating Data Collection
This chapter includes the following topics:

■ Validation methods

■ Data Collectors: Vendor-Specific validation methods

■ Working with on-demand Data Collection

■ Using the CLI check install utility

■ List Data Collector configurations

Validation methods
Validation methods are initiated differently based on the subsystem vendor
associated with the Data Collector policy, but they perform essentially the same
functions. Refer to the following table for vendor-specific validation methods.
■ Test Connection - Initiates a connection attempt directly from a data collector
policy screen that attempts to connect to the subsystem using the IP addresses
and credentials supplied in the policy. This validation process returns either a
success message or a list of specific connection errors.
■ On-Demand data collection run - Initiates an immediate end-to-end run of the
collection process from the Portal without waiting for the scheduled launch. This
on-demand run also serves to validate the policy and its values (the same as
Test Connection), providing a high-level check of the installation at the individual
policy level, including a check for the domain, host group, URL, Data Collector
policy and database connectivity. This is initiated at the policy level from
Admin > Data Collection > Collector Administration.
See “Working with on-demand Data Collection” on page 45.

■ CLI Checkinstall Utility - This legacy command line utility performs both the
Test Connection function and the On-Demand data collection run from the Data
Collector server.
See “Using the CLI check install utility” on page 50.

Note: NetBackup IT Analytics does not recommend using the CLI Checkinstall
utility for any Data Collector subsystem vendor which supports On-Demand
runs.

Data Collectors: Vendor-Specific validation methods
Table 4-1 Vendor-specific validation requirements.

Vendor Name    Test Connection    On-Demand    CLI Checkinstall Utility

Amazon Web Services (AWS) x x

Brocade Switch x

Brocade Zone Alias x x

Cisco Switch x

Cisco Zone Alias x x

Cohesity DataProtect x x

Commvault Simpana x

Compute Resources x

Dell Compellent x

Dell EMC Elastic Cloud Storage (ECS) x x

Dell EMC NetWorker Backup & Recovery x

Dell EMC Unity x x

EMC Avamar x

EMC Data Domain Backup x x

EMC Data Domain Storage x x



EMC Isilon x

EMC Symmetrix x x

EMC VNX x x

EMC VNX Celerra x

EMC VPLEX x

EMC XtremIO x x

HDS HCP x x

HDS HNAS x

HP 3PAR x

HP Data Protector x

HP EVA x

HPE Nimble Storage x x

Hitachi Block x

Hitachi Content Platform (HCP) x x

Hitachi NAS x x

IBM Enterprise x

IBM SVC x

IBM Spectrum Protect (TSM) x

IBM VIO x x

IBM XIV x

Microsoft Azure x x

Microsoft Hyper-V x x

Microsoft Windows Server x x

NAKIVO Backup & Replication x x



NetApp E Series x

Netapp x

Netapp Cluster Mode x

OpenStack Ceilometer x x

OpenStack Swift x x (Test Connection is included with the Get Nodes function.)

Oracle Recovery Manager (RMAN) x x

Pure FlashArray x x

Rubrik Cloud Data Management x x

VMWare x

Veeam Backup & Replication x x

Veritas Backup Exec x

Veritas NetBackup x x

Veritas NetBackup Appliance x x

Working with on-demand Data Collection


Collections can run on a schedule or on-demand using the Run button on the action
bar. An on-demand run allows you to select which probes and devices to collect
from, and collects data just like a scheduled run, plus additional logging
information for troubleshooting. A stopped policy still allows an on-demand
collection run, provided the policy is assigned to one of the supported vendors
and the collector is online.

Note: On-demand data collection is not available for all policies.

On-Demand data collection serves multiple purposes. You can use it to:

■ Validate the collection process is working end-to-end when you create a data
collector policy
■ Launch an immediate run of the collection process without waiting for the
scheduled run
■ Populate your database with new/fresh data
■ Choose to view the collection logs on the portal while performing an on-demand
run.
To initiate an on-demand data collection
1 Select Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Click Expand All to browse for a policy or use Search.
3 Select a data collector policy from the list. If the vendor is supported, the Run
button is displayed on the action bar.
4 Click Run. A dialog allowing you to select servers and individual probes to test
the collection run is displayed. The following example shows the Amazon Web
Services dialog. See the vendor specific content for details on probes and
servers.

5 Select the servers and probes for data collection.



6 The portal enables you to log messages at various levels during the collection
process. The following options are available:
■ Enable Real-Time Logs: Select this option to log generally useful
information in real-time while the collection is in progress.
■ Enable Debug Logs: Select this option to log information at a granular
level.

7 Click Start. Data is collected just like a scheduled run plus additional logging
information for troubleshooting. Once started, you can monitor the status of
the run through to completion.

Note: If there is another data collection run currently in progress when you
click Start, the On-Demand run will wait to start until the in-progress run is
completed.

See “View real-time logging during an on-demand collection” on page 47.


See “Generating debug level logs during an on-demand collection” on page 48.

View real-time logging during an on-demand collection


By default, real-time logging is enabled when you initiate an on-demand collection
for a data collector. Admin > Data Collection > Collector Administration provides
a window to view the logs in real-time as the collection progresses.

Follow these steps to view real-time logging:


1 Go to Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Initiate an on-demand data collection as described under Working with
on-demand Data Collection with Enable Real-Time Logs selected.
The Policy State column displays status as Collecting and an icon to open
the Collection Console pop-up.
3 Click the icon next to the Collecting link to view the real-time logs in the
Collection Console. Real-time logs are visible as long as the data collection
is in progress and the Collection Console is open.

You can use the filter on the console to selectively view the logs of your choice.
The Collection Console icon is not visible if the data collection is not in
progress.

Generating debug level logs during an on-demand collection


By default, the Enable Debug Logs (Backend only) option is not selected when
you initiate an on-demand collection for a data collector. The Collector
Administration page provides a window to generate debug-level information as the
collection progresses.

Follow these steps to enable debug-level log file generation:


1 Go to Admin > Data Collection > Collector Administration. All Data
Collectors are displayed.
2 Initiate an on-demand data collection as described under Working with
on-demand Data Collection with Enable Debug logs (Backend only) option
selected.

Note: The generated log files on the Data Collector server are located at:
<APTARE_HOME>/mbs/logs/validation/

Using the CLI check install utility


This legacy utility performs both the Test Connection function and On-Demand data
collection run from a command line interface launched from the Data Collector
server.

Note: NetBackup IT Analytics does not recommend using the CLI Checkinstall
utility for any Data Collector subsystem vendor which supports On-Demand runs.

The following directions assume that the Data Collector files have been installed
in their default location:
■ Windows: C:\Program Files\Veritas\AnalyticsCollector
■ Linux: /usr/openv/analyticscollector
If you have installed the files in a different directory, make the necessary path
translations in the following instructions.

Note: Some of the following commands can take up to several hours, depending
on the size of your enterprise.

To run Checkinstall
1 Open a session on the Data Collector server.
Windows: Open a command prompt window.
Linux: Open a terminal/SSH session logged in as root to the Data Collector
Server.
2 Change to the directory where you’ll run the validation script.
Windows: At the command prompt, type:

cd C:\Program Files\Veritas\AnalyticsCollector <enter>

Linux: In the SSH session, type:

cd /usr/openv/analyticscollector <enter>

3 Execute the validation script.


Windows: At the command prompt, type: checkinstall.bat <enter>

Linux: In the SSH session, type: ./checkinstall.sh <enter>


The checkinstall utility performs a high-level check of the installation, including
a check for the domain, host group and URL, Data Collector policy, and database
connectivity. For a component check, specifically for Host Resources, run the
hostresourcedetail.sh|bat utility.
This utility fails if:
■ The Data Collector policy has NOT been configured in the Portal.
OR
■ The version of the Data Collector does not match the version of the
NetBackup IT Analytics Portal.
To upgrade the Data Collector version to match the version of the NetBackup IT
Analytics Portal:
■ Log in to the Portal.
■ Navigate to the Admin > Data Collection > Collector Updates page.
■ Trigger the upgrade.
■ Ensure that the Data Collector services are online.
Checkinstall includes an option to run a probe for one or more specific devices.
Note that certain Data Collectors will not allow individual selection of devices.
Typically these are collectors that allow the entry of multiple server addresses
or ranges of addresses in a single text box.
These collectors include: Cisco Switch, EMC Data Domain, EMC VNX arrays,
HP 3PAR, IBM mid-range arrays, IBM XIV arrays and VMware.
Data Collectors that probe all devices that are attached to a management
server also do not allow individual selection of devices: EMC Symmetrix, File
Analytics, Hitachi arrays and IBM VIO.
4 If the output in the previous steps contains the word FAILED and the reason
of failure is NOT because of version mismatch, then contact Support and have
the following files ready for review:
■ Windows: C:\Program
Files\Veritas\AnalyticsCollector\mbs\logs\validation\

■ Linux: /usr/openv/analyticscollector/mbs/logs/validation/

List Data Collector configurations


Use this utility to list the various child threads and their configurations encapsulated
within a data collector configuration. This utility can be used in conjunction with
other scripts, such as checkinstall.[sh|bat].
On Linux: ./listcollectors.sh
On Windows: listcollectors.bat
Chapter 5
Uninstalling the Data
Collector
This chapter includes the following topics:

■ Uninstall the Data Collector on Linux

■ Uninstall the Data Collector on Windows

Uninstall the Data Collector on Linux


This uninstall process assumes that the Data Collector was installed using the
standard installation process.
To uninstall the Data Collector software from a Linux host:
1 Log in to the Data Collector server as root.
2 For NetBackup IT Analytics Data Collector version 10.6 or lower, execute the
Uninstall APTARE IT Analytics Data Collector Agent script located at
<Data Collector home folder>/UninstallerData

For example:

/opt/aptare/UninstallerData/Uninstall APTARE IT Analytics Data Collector Agent

3 For NetBackup IT Analytics Data Collector version 11.0 or later, execute the
uninstall_dc.sh script located at <Data Collector home
folder>/UninstallerData/uninstall_dc.sh

For example:

/opt/aptare/UninstallerData/uninstall_dc.sh

Uninstall the Data Collector on Windows


This uninstall process assumes that the Data Collector was installed using the
standard installation process.
To uninstall the Data Collector software from a Windows host:
1 Log in to the Data Collector server as an administrator.
2 Go to Control Panel > Add or Remove Programs > Programs and Features
and uninstall NetBackup IT Analytics Data Collector.
The uninstaller may not delete the entire Data Collector directory structure.
Sometimes new files that were created after the installation are retained along with
their parent directories. If the Data Collector was upgraded from version 10.6 or
older (default install location C:\Program Files\Aptare), you may find entries for
the Kafka and Zookeeper services in the Services panel even after the uninstallation
of the Data Collector. You must manually delete these services and reboot the system.
Chapter 6
Manually starting the Data
Collector
This chapter includes the following topics:

■ Introduction

Introduction
The installer configures the Data Collector to start automatically; however, it does
not actually start the collector upon completion of the installation because you must first validate
the installation. Follow these steps, for the relevant operating system, to manually
start the Data Collector service.
This also starts the Aptare Agent process, Zookeeper, and Kafka services on the
respective systems.

On Windows
The installer configures the Data Collector process as a Service.

To view the Data Collector status and start the service:


1. Click Start > Settings > Control Panel
2. Click Administrative Tools.
3. Click Services. The Services dialog is displayed.
4. Start the Aptare Agent service.

On Linux
The installer automatically copies the Data Collector “start” and “stop” scripts to the
appropriate directory, based on the vendor operating system.

To start the data collector, use the following command:

/opt/aptare/mbs/bin/aptare_agent start
Chapter 7
File Analytics Export folder
size and folder depth
This chapter includes the following topics:

■ Extracting File Analytics export folder size

■ Specifying the File Analytics folder depth

■ Data export

Extracting File Analytics export folder size


To extract the first-level folder size information from the File Analytics database:
1. At the Linux command prompt, run the following command:

java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="<ita-install-path>/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport

where the value of the APTARE_HOME property is the absolute path of the
aptare directory.
For example:

java -classpath
/opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-DAPTARE_HOME="/opt/aptare"
com.aptare.sc.service.fa.FaSubDirectoryReport

This generates an output file: report.csv


Output format:

Server Name, Volume Name, Folder name, Size in MB, Last Modified

Where:
■ Folder name: The root-level folders in the volume
■ Size in MB: Sum of all the file sizes in the folder (recursively)
■ Last Modified: Maximum modified time stamp from within all the files in the
folder (recursively)
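Once report.csv is generated, standard tools can summarize it. The sketch below creates a two-row sample file in the documented format (the server, volume, and folder names are made up) and totals the Size in MB column:

```shell
# Build a sample report.csv in the documented format, then total "Size in MB".
cat > report.csv <<'EOF'
Server Name,Volume Name,Folder name,Size in MB,Last Modified
filer01,vol1,/home,2048,2024-11-02
filer01,vol1,/projects,1024,2024-12-15
EOF

# Sum the fourth comma-separated field, skipping the header row.
total=$(awk -F, 'NR > 1 { sum += $4 } END { print sum }' report.csv)
printf 'Total MB: %s\n' "$total"
```

For the sample rows above this prints Total MB: 3072; the same awk line works on a real report.csv.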

Specifying the File Analytics folder depth


A set of parameters with the Dfa.export prefix is available to specify folder depth
for File Analytics.
■ To specify the folder depth for the report summary, add the following parameter
when executing the command -Dfa.export.folderDepth=x where "x" is the
depth. By default the depth is set to 1.
■ To turn off reporting on parents, add the following parameter when executing
the command -Dfa.export.includeParents=No. By default reporting on parents
is turned on.
■ To specify the name of the output file use
-Dfa.export.reportFileName=SomeReportName.csv. If this parameter is not
specified the default output file will be report.csv.
For example:

java -classpath /opt/aptare/portal/WEB-INF/lib/*:/opt/aptare/portal/WEB-INF/classes/
-Dfa.export.folderDepth=2 -Dfa.export.includeParents=No
-Dtest.resourceLocation=/opt/aptare/portal/WEB-INF/classes/
com.aptare.sc.service.fa.FaSubDirectoryReport

Sample Directory Structures and Results


As an example, the table that follows uses these directory structures to show the
results of different parameter values:
■ D1
■ D1/SD1
■ D1/SD1/SD2
■ D2/SD3
■ D3
This table illustrates the expected results given the different parameter values:

Table 7-1

fa.export.folderDepth   fa.export.includeParents   Directories Included in Report

0                       N/A                        D1, D2, D3

1                       N/A                        D1, D1/SD1, D2, D2/SD3, D3

2                       No                         D1/SD1, D1/SD1/SD2, D2/SD3, D3
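The depth semantics in the table can be approximated with find: folderDepth=0 corresponds to the top-level folders of the volume, and each increment descends one more level. The sketch below recreates the sample structure from the example above under a scratch directory.

```shell
# Recreate the sample directory tree from the example
mkdir -p tree/D1/SD1/SD2 tree/D2/SD3 tree/D3

# folderDepth=0: top-level folders only (D1, D2, D3)
find tree -mindepth 1 -maxdepth 1 -type d | sort

# folderDepth=1: top-level folders plus one level of subfolders
find tree -mindepth 1 -maxdepth 2 -type d | sort
```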

Data export
The data collected for File Analytics is stored in Bantam format on the portal server in the /opt/aptare/fa/db folder. Separate folders named by timestamp are created, and each data file name includes the host ID and the .bam3 extension. Each folder also contains a context file (.bam3.cxt) for each host. You can verify the host ID from Inventory > Backup Servers > Hosts on the portal. Summaries of this data are stored in the Oracle database, which enables reporting from Reports > File Analytics.
The export mechanism reads the data from /opt/aptare/fa/db during the export.
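A quick way to see what the export will read is to list the collected .bam3 files on the portal server. The sketch below uses a mock directory in place of /opt/aptare/fa/db; the timestamp folder and host ID are invented for the example.

```shell
# Mock of the /opt/aptare/fa/db layout: timestamped folders holding
# one <hostId>.bam3 data file and one <hostId>.bam3.cxt context file per host
mkdir -p fa/db/2024-11-02_1200
touch fa/db/2024-11-02_1200/1001.bam3 fa/db/2024-11-02_1200/1001.bam3.cxt

# List the data files; context files end in .cxt and do not match the pattern
find fa/db -name '*.bam3' -type f
```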
To export the File Analytics data:
1 From the portal, go to Admin > Export. A File List Export window is displayed.
2 Click New Export Request. The New Export Request window is displayed.
3 Provide a Name to the export request and click Modify to define the scope of
your export.
4 On the Report Scope Selector window, select the data sources. You can
select file shares and volumes from the Groups tab, or use the Devices tab
to select individual endpoints.

5 Click OK to save your scope and return to the New Export Request window.
6 Once you have defined the scope, you can optionally apply the following filters
to your data export:
■ Owners
■ File Categories
■ Create, Modified, and Accessed date ranges
■ Directory Paths
■ File Extensions
■ File Size
■ File Name

7 Click OK to save the export request. The export request is saved and
displayed on the File List Export window, with details about the exported files.
To download the File Analytics data locally, select the export request on the
File List Export window and click Download. A compressed zip file is saved
to your default download location.
Appendix A
Firewall configuration:
Default ports
This appendix includes the following topics:

■ Firewall configuration: Default ports

Firewall configuration: Default ports


The following tables describe the standard ports used by the Portal servers, the
Data Collector servers, and any embedded third-party software products as part of
a standard “out-of-the-box” installation.

Table A-1 Components: Default Ports

Component Default Ports

Apache Web Server http 80

https 443

Jetty Server on Data Collector Server 443

Kafka 9092

Linux Hosts SSH 22

Managed Applications Oracle ASM 1521

MS Exchange 389

MS SQL 1433

File Analytics CIFS 137, 139


Oracle 1521

Oracle TNS listener port

Tomcat - Data Receiver 8011, 8017

Apache connector port and shutdown port for the Data Receiver instance of Tomcat

Tomcat - Portal 8009, 8015

Apache connector port and shutdown port for the Portal instance of Tomcat

Windows Hosts TCP/IP 1248

WMI 135

DCOM TCP/UDP > 1023

SMB TCP 445

ZooKeeper 2181
Note: NetBackup IT Analytics uses a
standalone, single-node installation of
the Apache ZooKeeper server. For
secure communications, the
single-node ZooKeeper cluster must
be protected from external traffic using
network security such as a firewall.
Ensure that the ZooKeeper port (2181)
is accessible only on the local host
where the NetBackup IT Analytics
Portal/Data Collector (which includes
Apache ZooKeeper) is installed.
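One way to enforce the restriction described in the note is a pair of host firewall rules that accept port 2181 only on the loopback interface. The iptables commands below are a generic sketch, not taken from this guide; they require root, and should be adapted if firewalld or nftables manages the host firewall.

```shell
# Sketch: allow ZooKeeper (2181) only via the loopback interface,
# and drop connection attempts arriving on any other interface.
iptables -A INPUT -p tcp --dport 2181 -i lo -j ACCEPT
iptables -A INPUT -p tcp --dport 2181 -j DROP
```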

Table A-2 Storage Vendors: Default Ports

Storage Vendor Default Ports and Notes

Dell Compellent 1433

SMI-S http (5988)

SMI-S https (5989)

Dell EMC Elastic Cloud Storage (ECS) REST API 4443



Dell EMC Unity REST API version 4.3.0 on 443 or 8443

EMC Data Domain Storage SSH 22

EMC Isilon SSH 22

EMC Symmetrix SymCLI over Fibre Channel 2707

EMC VNX NaviCLI 443, 2163, 6389, 6390, 6391, 6392

EMC VNX (Celerra) XML API 443, 2163, 6389, 6390, 6391, 6392

EMC VPLEX https TCP 443

EMC XtremIO REST API https 443

HP 3PAR 22 for CLI

HP EVA 2372

HPE Nimble Storage 5392, REST API Reference Version 5.0.1.0

Hitachi Block Storage TCP 2001

For the HIAA probe: 22015 is used for HTTP and 22016 is used for HTTPS.

Hitachi Content Platform (HCP) SNMP 161

REST API https 9090

Hitachi NAS (HNAS) SSC 206

Hitachi Vantara All-Flash and Hybrid Flash Storage Hitachi Ops Center Configuration
Manager REST API: 23450 for HTTP and 23451 for HTTPS.

HIAA: 22015 for HTTP and 22016 for HTTPS.

IBM Enterprise TCP 1751, 1750, 1718 (DSCLI)

IBM SVC SSPC w/CIMOM 5988, 5989

IBM XIV XCLI TCP 7778



Microsoft Windows Server 2016

WMI 135

DCOM TCP/UDP > 1023

NetApp E-Series SMCLI 2436

NetApp ONTAP 7-Mode and Cluster-Mode ONTAP API 80/443

Pure Storage FlashArray REST API https 443

Table A-3 Data protection: Default ports

Data Protection Vendor Default Ports and Notes

Cohesity DataProtect REST API on Port 80 or 443

Commvault Simpana 1433, 135 (skipped files)

445 (CIFS over TCP)

DCOM >1023

Dell EMC Networker Backup & Recovery Port used for Dell EMC NetWorker
REST API connection. Default: 9090.

EMC Avamar 5555

SSH 22

EMC Data Domain Backup SSH 22

HP Data Protector 5555; WMI ports; SSH 22 (Linux)

IBM Spectrum Protect (TSM) 1500

NAKIVO Backup & Replication Director Web UI port (Default: 4443)

Oracle Recovery Manager (RMAN) 1521

Rubrik Cloud Data Management REST API 443

Veeam Backup & Replication 9392

Veritas Backup Exec 1433



Table A-4 Network & Fabrics: Default Ports

Network & Fabrics Vendor Default Ports and Notes

Brocade Switch SMI-S 5988/5989

Cisco Switch SMI-S 5988/5989

Table A-5 Virtualization Vendors: Default Ports

Virtualization Vendor Default Ports and Notes

IBM VIO SSH 22

Microsoft Hyper-V WMI 135

DCOM TCP/UDP > 1023

VMware ESX or ESXi,vCenter,vSphere vSphere VI SDK

https TCP 443

Table A-6 Replication Vendors: Default Ports

Replication Vendor Default Ports and Notes

NetApp ONTAP 7-Mode ONTAP API 80/443

Table A-7 Cloud Vendors: Default Ports

Cloud Vendor Default Ports and Notes

Microsoft Azure https 443

OpenStack Ceilometer 8774, 8777

Keystone Admin 35357

Keystone Public 5000

OpenStack Swift Keystone Admin 35357


Keystone Public 5000

SSH 22

Google Cloud Platform https 443
