DIH 10.5 Installation Guide
This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC.
U.S. GOVERNMENT RIGHTS Programs, software, databases, and related documentation and technical data delivered to U.S. Government customers are "commercial
computer software" or "commercial technical data" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such,
the use, duplication, disclosure, modification, and adaptation is subject to the restrictions and license terms set forth in the applicable Government contract, and, to the
extent applicable by the terms of the Government contract, the additional rights set forth in FAR 52.227-19, Commercial Computer Software License.
Informatica, the Informatica logo, PowerCenter, and PowerExchange are trademarks or registered trademarks of Informatica LLC in the United States and many
jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://www.informatica.com/trademarks.html. Other company
and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties. Required third party notices are included with the product.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation
is subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES
OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH
OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
The information in this documentation is subject to change without notice. If you find any problems in this documentation, report them to us at
[email protected].
Informatica products are warranted according to the terms and conditions of the agreements under which they are provided. INFORMATICA PROVIDES THE
INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT.
Table of Contents 3
Step 8. Configure Informatica Data Quality Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Step 9. Complete the Installation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 46
Installing Data Integration Hub on a UNIX Operating System in Console Mode . . . . . . . . . . . . . . 47
Step 1. Run the Installer. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Step 2. Define Installation Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Step 3. Configure Data Integration Hub Repositories. . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
Step 4. Set Up the Operational Data Store. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Step 5. Configure User Authentication. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Step 6. Configure Document Store and Web Server . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Step 7. Configure PowerCenter Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Step 8. Configure Processing Engine Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Step 9. Complete the Installation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Installing Data Integration Hub in a Silent Mode. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Configuring the Installation Properties. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Sample of the Installer Properties Configuration. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Running the Silent Installer. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Step 5. Configure the Web Server . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Step 6. Configure PowerCenter Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Step 7. Configure Data Quality Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Step 8. Complete the Installation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Upgrading Data Integration Hub on a UNIX Operating System. . . . . . . . . . . . . . . . . . . . . . . . . 98
Step 1. Run the Installer. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Step 2. Define Installation Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Step 3. Configure Data Integration Hub Repositories. . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Step 4. Set Up the Operational Data Store. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Step 5. Configure the Web Server . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 102
Step 6. Configure PowerCenter Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
Step 7. Configure Processing Engine Settings. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
Step 8. Complete the Installation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
After You Upgrade. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Reapplying Configuration Modifications. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Registering the Dashboard and Reports License. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Replacing the Operational Data Store Loader Workflow. . . . . . . . . . . . . . . . . . . . . . . . . . 107
Configure Credentials for Windows Authentication. . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Updating the Security Configuration Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Updating the Run Publication Subscription Web Service API. . . . . . . . . . . . . . . . . . . . . . 109
Restart the Data Integration Hub Services . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Validating Invalidated Subscriptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Chapter 9: Installing and Configuring the Data Integration Hub Accelerator
for Data Archive. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Installing and Configuring Data Integration Hub Accelerator for Data Archive Overview. . . . . . . . 123
Pre-Installation Steps. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Installing the Data Integration Hub Accelerator for Data Archive. . . . . . . . . . . . . . . . . . . . . . . 124
Configuring the Data Archive Source Connection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 125
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
Preface
Follow the instructions in the Data Integration Hub Installation and Configuration Guide to install and
configure Data Integration Hub. The guide also includes information about the pre-install tasks and post-install tasks that you need to perform to complete the installation.
Informatica Resources
Informatica provides you with a range of product resources through the Informatica Network and other online
portals. Use the resources to get the most from your Informatica products and solutions and to learn from
other Informatica users and subject matter experts.
Informatica Network
The Informatica Network is the gateway to many resources, including the Informatica Knowledge Base and
Informatica Global Customer Support. To enter the Informatica Network, visit
https://network.informatica.com.
To search the Knowledge Base, visit https://search.informatica.com. If you have questions, comments, or
ideas about the Knowledge Base, contact the Informatica Knowledge Base team at
[email protected].
Informatica Documentation
Use the Informatica Documentation Portal to explore an extensive library of documentation for current and
recent product releases. To explore the Documentation Portal, visit https://docs.informatica.com.
If you have questions, comments, or ideas about the product documentation, contact the Informatica
Documentation team at [email protected].
Informatica Product Availability Matrices
Product Availability Matrices (PAMs) indicate the versions of the operating systems, databases, and types of
data sources and targets that a product release supports. You can browse the Informatica PAMs at
https://network.informatica.com/community/informatica-network/product-availability-matrices.
Informatica Velocity
Informatica Velocity is a collection of tips and best practices developed by Informatica Professional Services
and based on real-world experiences from hundreds of data management projects. Informatica Velocity
represents the collective knowledge of Informatica consultants who work with organizations around the
world to plan, develop, deploy, and maintain successful data management solutions.
You can find Informatica Velocity resources at http://velocity.informatica.com. If you have questions,
comments, or ideas about Informatica Velocity, contact Informatica Professional Services at
[email protected].
Informatica Marketplace
The Informatica Marketplace is a forum where you can find solutions that extend and enhance your
Informatica implementations. Leverage any of the hundreds of solutions from Informatica developers and
partners on the Marketplace to improve your productivity and speed up time to implementation on your
projects. You can find the Informatica Marketplace at https://marketplace.informatica.com.
Informatica Global Customer Support
To find your local Informatica Global Customer Support telephone number, visit the Informatica website at the following link:
https://www.informatica.com/services-and-training/customer-success-services/contact-us.html.
To find online support resources on the Informatica Network, visit https://network.informatica.com and
select the eSupport option.
Chapter 1
Installation Overview
This chapter includes the following topics:
Core application component. Includes the Operation Console, Data Integration Hub server, the Data
Integration Hub repository, and the publication repository. The PowerCenter services must be running
when you install Data Integration Hub. You must set up the database user accounts before you install
this component.
You can make the Dashboard available in Data Integration Hub in two ways. When you install Data Integration Hub, you can enable the dashboard that uses the operational data store. Otherwise, you can use the dashboard that uses the metadata repository, which is available by default.
Dashboard and Reports component
Business activity monitoring component. Includes the dashboard application and the operational data store repository. You must set up a user account that is different from the user account that you use for the Data Integration Hub repository.
You must install the Data Integration Hub component to install this component.
Data Integration Hub PowerCenter server plug-in
PowerCenter repository plug-in that Data Integration Hub uses to run Data Integration Hub
transformations in PowerCenter. The installation includes files to add to the classpath of the
PowerCenter Integration Service. You must install this plug-in on the same machine as the PowerCenter
services.
After you install this component, you must register the plug-in to the PowerCenter repository before you
create and run Data Integration Hub workflows.
Data Integration Hub PowerCenter Client plug-in
PowerCenter Client plug-in that displays Data Integration Hub transformation properties in PowerCenter
mappings. You install this plug-in on all PowerCenter Client machines that you plan to use to build
mappings and workflows for Data Integration Hub transformations.
Data Integration Hub Hadoop Service
Available when you install Data Integration Hub on a Linux operating system.
Connecting module between Data Integration Hub and the Hadoop cluster. Enables Data Integration Hub
to perform operations on the Hadoop publication repository.
Select this component if you want to define a Hadoop-based publication repository and manage some of
your topics on Hadoop.
Note: The Data Integration Hub Hadoop Service must be installed on an edge node in the Hadoop
cluster. Unless you are installing Data Integration Hub on a Hadoop edge node, do not select this
component during Data Integration Hub installation. In this case, after you complete the Data Integration
Hub installation, run the installation again on the edge node and select to install only the Data Integration
Hub Hadoop Service component on the node.
Enables Data Integration Hub to use Data Engineering Integration mappings for custom publications and
subscriptions that publish and subscribe to a Hadoop-based publication repository.
Available when you install Data Integration Hub on a Linux operating system.
Enables Data Integration Hub to use Data Quality mappings for custom publications and subscriptions
from and to on-premises applications.
Data Integration Hub includes the following additional applications and components:
Data Integration Hub server
Server environment that manages publication and subscription processing in Data Integration Hub.
Operation Console
Web interface to customize and monitor processing, manage users, and set preferences.
Apache Tomcat
Java JDK
Java run-time environment in which the Data Integration Hub server, Data Integration Hub Operation
Console, and Data Integration Hub command line client tools run.
• User Accounts, 12
• Port Numbers, 14
User Accounts
Before you install, verify that you have the user names and passwords for the required database and domain
accounts.
Database
Database user account that you use to log in to the database server and create tables and views for the Data Integration Hub repository and the publication repository. If you install the Dashboard and Reports component, you also use a user account for the operational data store.
You must install all the repositories on the same type of database server. You must
create a separate user account for each repository.
The user accounts must have privileges to perform the following actions:
- Select data from tables and views.
- Insert data into tables, delete data from tables, and update data in tables.
- Create, change, and delete the following elements:
- Tables
- Views
- Synonyms
- Indexes
- Custom data types
- Triggers
- Create, change, delete, and run stored procedures and functions.
If you use a Microsoft SQL Server database, you must set up separate databases for
each repository. It is recommended that you grant database owner privileges to the user
accounts.
Informatica security domain (if you use Data Integration Hub with Informatica domain authentication)
User account for Informatica domain authentication. The user account must be created in the Informatica Administrator tool with the manage roles/groups/users privileges. The Data Integration Hub administrator synchronizes the user account after the installation.
PowerCenter Repository Service
User account that Data Integration Hub uses to perform operations in the PowerCenter Repository Service. The user account must have privileges and permissions to perform the following actions:
General
- Access Repository Manager privilege
Folders
- Read on folder permission
- Create privilege
- Copy privilege
Design Objects
- Read on folder permission
- Read on shared folder permission
- Read and Write on destination folder permission
- Create, Edit, and Delete privilege with the Read on original folder, Read and Write on
destination folder, and Read and Write on folder permissions
Sources and Targets
- Read on folder permission
- Read on shared folder permission
- Read and Write on destination folder permission
- Create, Edit, and Delete privilege with the Read on original folder, Read and Write on
destination folder, and Read and Write on folder permissions
Run-time Objects
- Read on folder permission
- Create, Edit, and Delete privilege with the Read on original folder, Read and Write on
destination folder, Read on connection object, and Read and Write on folder
permissions
- Monitor privilege with the Read on folder permission
- Execute privilege with the Read and Execute on folder permission
Global Objects
- Read on connection object permission
- Read and Write on connection object
- Create Connections privilege
- Execute privilege with the Read and Execute on folder permission
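As an illustration of the database privileges listed for the repository accounts, an account on Oracle might be created along these lines. This is a sketch only: the user name, password, and tablespace are hypothetical examples, not values from this guide, and your DBA may grant privileges differently.

```sql
-- Illustrative only: create one repository account with privileges that cover
-- the actions listed above (run as a DBA user; names are placeholders).
CREATE USER dih_repo IDENTIFIED BY "ChangeMe_1";
GRANT CREATE SESSION TO dih_repo;
GRANT CREATE TABLE, CREATE VIEW, CREATE SYNONYM TO dih_repo;  -- tables, views, synonyms (indexes come with CREATE TABLE)
GRANT CREATE TYPE, CREATE TRIGGER TO dih_repo;                -- custom data types, triggers
GRANT CREATE PROCEDURE TO dih_repo;                           -- stored procedures and functions
ALTER USER dih_repo QUOTA UNLIMITED ON users;                 -- let the account store data in its tablespace
```

Create one such account per repository, as required above.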
Port Numbers
The installer sets the default port numbers for the installation components. If another application uses the
same port number as one of the installation components, a port conflict might prevent the component from
running correctly or cause errors.
You can change the port numbers after installation. Before you start Data Integration Hub, verify that the port
numbers do not conflict with other applications and change the port numbers in Data Integration Hub to
prevent port conflicts.
18000 UDP multicast port that Data Integration Hub uses for internal communications.
18005 Operation Console shutdown port. Only required to be available on the machine where Data
Integration Hub is installed.
18050 Port that the Operation Console uses for internal communications.
18080 Operation Console HTTP port. Required only if you use an HTTP port for the Operation Console.
18095 and 18096 RMI ports that the Operation Console and PowerCenter workflows use to communicate with the Data Integration Hub server.
18100 Port that the Data Integration Hub server uses for internal communications.
18443 Operation Console HTTPS port. Required only if you use an HTTPS port for the Operation Console.
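Before you start Data Integration Hub, you can probe the default ports from the host with a sketch like the following. It assumes bash on Linux, where a successful TCP connection means another application already holds the port; note that the 18000 multicast port is UDP and is not covered by this TCP probe.

```shell
# Probe the default Data Integration Hub ports on the local host.
for port in 18000 18005 18050 18080 18095 18096 18100 18443; do
  if (exec 3<>"/dev/tcp/127.0.0.1/$port") 2>/dev/null; then
    echo "Port $port is already in use"
  fi
done
echo "Port check complete"
```

Any port reported as in use must be freed, or the corresponding Data Integration Hub port number must be changed, before you start the services.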
Pre-Installation Tasks
This chapter includes the following topics:
Note: Data Integration Hub and the PowerCenter Integration Service that Data Integration Hub uses must be installed on the same type of operating system: either both on machines that run Windows operating systems, or both on machines that run non-Windows operating systems.
The following components must reside on machines with the same locale and the same time zone:
Verify the Minimum System Requirements
Verify that your system meets the minimum requirements.
System Requirement
RAM 8 GB
The following table describes the minimum system requirements to run the installer:
System Requirement
RAM 512 MB
Disk space 1 GB
For more information about product requirements and supported platforms, see the Product Availability
Matrix on Informatica Network:
https://network.informatica.com/community/informatica-network/product-availability-matrices
The following table describes the database requirements for Data Integration Hub:
Database Description
Component
Database Type of database on which to install the repositories. You can use one of the following database
System systems:
- Amazon RDS for Oracle
- Oracle
- Microsoft SQL Server
- Microsoft Azure SQL Server
Note: You must have Oracle Enterprise Edition to use the Filter Accelerator.
Database Two database instances. Data Integration Hub uses one database for the Data Integration Hub
Instances repository and one database for the publication repository. Both database instances must be on the
same type of database system.
If you use Oracle databases, it is recommended that each database has a unique user account and a
unique schema.
If you use Microsoft Azure SQL Server, it is recommended that you create a separate Data Source
Name (DSN) for each database.
If you install the Dashboard and Reports component, an additional database instance is required for
the operational data store. The operational data store must be on the same type of database system
as the Data Integration Hub repositories.
Note: If you install the Dashboard and Reports component, your Data Integration Hub and operational
data store repositories are installed on Microsoft SQL Servers, and you use PowerCenter version 10,
you must configure the repository connections in PowerCenter Workflow Manager. For details, see
“Configuring Repository Connections on PowerCenter Version 10” on page 122 .
Disk space The Data Integration Hub repository database requires at least 512 MB of disk space for the core
application.
You also need additional space on the publication repository database based on the number of
publications and publication instances that you need to retain.
Note: Unicode data requires twice as much storage as single-byte character sets.
Database Multiple database connections for each repository must always be available.
connections The number of required connections for each repository depends on the number of publications and
subscriptions that run concurrently. Use the following formula to calculate the number of required
database connections for each repository:
NumberOfConcurrentPublicationsOrSubscriptions x 3 + 2
If you do not have enough database connections available, Data Integration Hub might fail or
encounter database deadlocks.
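As a worked example of the formula above, a site that runs 10 publications or subscriptions concurrently needs the following number of connections for each repository (the value 10 is an illustrative assumption, not a value from this guide):

```shell
# 10 concurrent publications/subscriptions -> 10 x 3 + 2 = 32 connections
concurrent=10
connections=$(( concurrent * 3 + 2 ))
echo "$connections"   # prints 32
```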
• PowerCenter. Install PowerCenter before you install Data Integration Hub. Make sure to install
PowerCenter services on a machine that is accessible to Data Integration Hub. After you install
PowerCenter, verify that the PowerCenter Web Services Hub is running.
If you do not install the PowerCenter services on the same machine that you install Data Integration Hub,
install the PowerCenter pmrep command line utility on the machine where you install Data Integration
Hub. Verify that Data Integration Hub and PowerCenter can be accessed with the same drive and file path.
Note: Verify that repository agent caching on the PowerCenter Repository Service is disabled. In
Informatica Administrator, access the advanced properties of the Repository Service and verify that the
option Enable Repagent caching is cleared.
• Java Development Kit (JDK). On IBM AIX operating systems, install the IBM JDK version 8.0.5.16 (8.0
Service Refresh 5 Fix Pack 16) and configure the INFA_JDK_HOME environment variable before you install
Data Integration Hub. Verify that the login shell can access the INFA_JDK_HOME environment variable.
For more information about Java installation, see the Java website at the following address:
https://www.ibm.com/developerworks/java/jdk/fixes/8/index.html
The software available for download at the referenced links belongs to a third party or third parties, not
Informatica LLC. The download links are subject to the possibility of errors, omissions or change.
Informatica assumes no responsibility for such links and/or such software, disclaims all warranties, either
express or implied, including but not limited to, implied warranties of merchantability, fitness for a
particular purpose, title and non-infringement, and disclaims all liability relating thereto.
For more information about product requirements and supported platforms, see the Product Availability
Matrix on Informatica Network:
https://network.informatica.com/community/informatica-network/product-availability-matrices
• Microsoft Visual C++ 2008 Redistributable Package (x86). Install this package if you use the Data
Integration Hub PowerCenter Client plug-in on a Windows Server 2008 64-bit operating system.
The software available for download at the referenced links belongs to a third party or third parties, not
Informatica LLC. The download links are subject to the possibility of errors, omissions or change.
Informatica assumes no responsibility for such links and/or such software, disclaims all warranties, either
express or implied, including but not limited to, implied warranties of merchantability, fitness for a
particular purpose, title and non-infringement, and disclaims all liability relating thereto.
• Java Cryptography Extension (JCE). Install this package if you are installing Data Integration Hub on an
IBM AIX operating system.
To download the utility, contact Informatica Shipping. The utility version must match the PowerCenter
version.
1. Extract the ZIP file on your local machine to a directory that is accessible by the Data Integration Hub
installer.
By default, the installer searches for the utility in the following directory: <LocalDrive>\Informatica\version
2. Configure the utility settings based on your operating system.
For information about the utility settings, see the Informatica Repository Guide.
To test the utility settings, run the utility from the command line and verify that no errors appear in the run
results.
Note: If you upgrade the pmrep command line utility at a later time, clean up all CNX files from the Tmp folder in your home directory.
1. Set the INFA_HOME environment variable to point to the Informatica installation directory.
2. Set the INFA_DOMAINS_FILE environment variable to the path and the file name of the domains.infa
file.
3. On Solaris and Linux, add <INFA_HOME>/server/bin to the LD_LIBRARY_PATH environment variable.
4. On AIX, add <INFA_HOME>/server/bin to the LIBPATH environment variable.
5. Verify that the pmrep utility code page matches the PowerCenter Repository Service code page. You
specify the code page in the INFA_CODEPAGENAME environment variable of the utility.
6. To reduce the length of time to wait before the pmrep utility reports an error when connecting to
PowerCenter, change the value of the INFA_CLIENT_RESILIENCE_TIMEOUT environment variable in the
utility.
The default timeout waiting time is 180 seconds.
7. On Linux, to use SQL repositories installed on Windows as the Data Integration Hub and publication
repositories, set the ODBCINST environment variable to <INFA_HOME>/ODBC7.1/odbcinst.ini.
Note: This step is relevant to systems running PowerCenter version 10 or higher.
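Taken together, the steps above amount to an environment setup like the following sketch. The installation path, code page, and timeout value are illustrative assumptions, not values from this guide; adjust them for your system, and use LIBPATH instead of LD_LIBRARY_PATH on AIX.

```shell
# Illustrative pmrep environment setup on Linux; paths are examples only.
export INFA_HOME=/opt/Informatica/10.5
export INFA_DOMAINS_FILE="$INFA_HOME/domains.infa"
export LD_LIBRARY_PATH="$INFA_HOME/server/bin:$LD_LIBRARY_PATH"   # LIBPATH on AIX
export INFA_CODEPAGENAME=UTF-8             # must match the Repository Service code page
export INFA_CLIENT_RESILIENCE_TIMEOUT=60   # default is 180 seconds
export ODBCINST="$INFA_HOME/ODBC7.1/odbcinst.ini"   # Linux with SQL repositories only
```

Set these variables in the login shell of the user that runs Data Integration Hub so that the installer and the pmrep utility can read them.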
Create data source names on the operating system as provided in the following sections:
Note: Ensure that you configure a separate Data Source Name (DSN) entry for every database on which Data
Integration Hub is installed.
1. Open ODBC Data Sources from the Windows environment and select 64-bit ODBC Data Source for
DataDirect 7.1 New SQL Server Wire Protocol.
The ODBC DataSource Administrator window is displayed.
2. Select System DSN and click Add New.
The Create New DataSource window is displayed.
3. Select the following driver: DataDirect 7.1 New SQL Server Wire Protocol.
The ODBC SQL Server Wire Protocol Driver Setup window is displayed.
4. Enter the following details in the General tab:
• DataSource Name. Name of the data source.
• Host Name. Host name of the machine where the database server is installed.
• Port Number. Port number of the database. The default port number for Microsoft Azure SQL
Database is 1433.
• Database. Name of the database instance.
5. Select Advanced > Extended Options and add WorkArounds2=2.
Note: Enabling the WorkArounds2=2 option causes the driver to ignore the column size, decimal digits,
or datetime values specified by the application and use the database defaults instead. Some
applications incorrectly specify the column size or decimal digits when binding timestamp parameters.
6. Select Security and update the following information:
• User Name. Name of the Microsoft Azure SQL Database user.
• Select any Encryption Method.
7. Click Save.
Data Source Name (DSN) details are saved.
8. Test the driver with the credentials of the user that you provided in this procedure and ensure that the connection succeeds.
Note: Ensure that you configure a separate Data Source Name (DSN) entry for every database on which Data
Integration Hub is installed. The driver should point to the DWsqls27.so driver by using an absolute path.
1. Navigate to the <pwc_install_path>/ODBC7.1 folder in the PowerCenter installation directory and edit the odbc.ini file to update the following information:
• Driver. Enter a path to the driver.
An example of the path is as follows: /data/Informatica/10.1.1/ODBC7.1/lib/DWsqls27.so.
• Description. Enter the description of the DSN entry.
An example of the description is as follows: Azure SQL DATABASE Connection for ODL
• Address. Enter the host name of the machine where the database server is installed.
• LogonID. Enter the name of the Microsoft Azure SQL Database user.
• Password. Enter a password for the Microsoft Azure SQL Database user.
• QuotedId. Select No.
• AnsiNPW. Select Yes.
• EncryptionMethod. Enter a numerical value that corresponds to the encryption method that you want
to select.
• Enter WorkArounds2=2.
Note: Enabling the WorkArounds2=2 option causes the driver to ignore the column size or decimal
digits specified by the application and use the database defaults instead. Some applications
incorrectly specify the column size or decimal digits when binding timestamp parameters.
• If the $ODBCINI environment variable pointing to the odbc.ini file is not configured, configure it as follows: $ODBCINI=<pwc_install_path>/ODBC7.1/odbc.ini.
Note: ODBC environment variables are configured before you install PowerCenter.
An example of the configuration is as follows:
Driver=<PwC_Install_Loc>/ODBC7.1/lib/DWsqls27.so
Description=Azure SQL DATABASE Connection for ODL
Address=<server_name>
Database=<db_name>
LogonID=<usn>
Password=<pwd>
QuotedId=No
AnsiNPW=Yes
EncryptionMethod=1
ValidateServerCertificate=0
WorkArounds2=2
2. If you configured the environment variable while creating the data source name on the operating system as described in this procedure, restart the PowerCenter services.
The document store directory must be accessible to Data Integration Hub, Apache Tomcat, and PowerCenter
with the same drive and file path.
Note: If you use Microsoft SQL Server 2012, you can instead set the Is Read Committed Snapshot On option to True in Microsoft SQL Server Management Studio.
1. Open an SQL query for the database server with rights to set database options.
2. Run the following SQL statement to forcefully disconnect all users from the database:
ALTER DATABASE [<database_name>] SET SINGLE_USER WITH ROLLBACK IMMEDIATE
3. Run the following SQL query:
ALTER DATABASE <database_name> SET READ_COMMITTED_SNAPSHOT ON
4. To verify that this option is set, run the following SQL query:
SELECT is_read_committed_snapshot_on FROM sys.databases WHERE name =
'<database_name>'
If the option is set, the query returns the value 1. If the option is not set, the query returns the value 0.
5. Run the following SQL statement to return the database to multi-user mode:
ALTER DATABASE [<database_name>] SET MULTI_USER
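The sequence in steps 2 through 5 can be sketched as a small helper that emits the SQL statements in order for a given database. This is a string-building sketch only, not part of the product; run the generated statements against the server with your usual SQL tooling.

```python
def read_committed_snapshot_sql(database_name: str) -> list[str]:
    """Return, in order, the SQL statements this guide uses to enable
    READ_COMMITTED_SNAPSHOT on a Microsoft SQL Server database."""
    return [
        # Disconnect other sessions so the option can be changed.
        f"ALTER DATABASE [{database_name}] SET SINGLE_USER WITH ROLLBACK IMMEDIATE",
        f"ALTER DATABASE [{database_name}] SET READ_COMMITTED_SNAPSHOT ON",
        # Returns 1 when the option is set, 0 when it is not.
        f"SELECT is_read_committed_snapshot_on FROM sys.databases "
        f"WHERE name = '{database_name}'",
        # Return the database to multi-user mode.
        f"ALTER DATABASE [{database_name}] SET MULTI_USER",
    ]

for statement in read_committed_snapshot_sql("DIH_REPO"):
    print(statement)
```

Keeping the statements in this order matters: the database must be in single-user mode before the snapshot option changes, and must be returned to multi-user mode afterward.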
Before you install, verify that your environment meets the minimum system requirements, perform the pre-
installation tasks, and verify that the PowerCenter services are running.
Note: During the installation, Data Integration Hub saves log files in the home directory of the user in the
subdirectory named DXLogs. If the installation does not complete successfully, you can view the log files in
this location.
The Install or Upgrade page appears.
5. Select the option to install Data Integration Hub, and then click Next.
The Installation Directory page appears.
Installs the Data Integration Hub plug-in for the PowerCenter services. After the installation, you
register the plug-in to the PowerCenter repository.
Selected by default.
Installs the Data Integration Hub plug-in for the PowerCenter Client. Install this component on every
machine that runs the PowerCenter Client.
Selected by default.
Enables Data Integration Hub to use Data Quality mappings for custom publications and
subscriptions.
Select this component if you want to create custom publications and subscriptions that use Data
Quality mappings in Data Integration Hub.
Cleared by default.
3. Click Next.
The Select PowerCenter Version page appears.
4. Select the PowerCenter version for which to install Data Integration Hub and then click Next.
The Metadata Repository page appears.
2. Click Next.
Type of database to use for the Data Integration Hub metadata repository. You can choose one of
the following options:
• Oracle
• Microsoft SQL Server
Database URL
Instructs Data Integration Hub to authenticate user names against the Microsoft Windows
authentication mechanism. Available when you select a Microsoft SQL Server database.
Database username
Name of the database user account for the database where you do not use Windows authentication.
Password for the database account for the database where you do not use Windows authentication.
Data Integration Hub stores the password as an encrypted string.
4. Click Next.
The Publication Repository Connection page appears.
Type of database to use for the publication repository. The database type must match the Data
Integration Hub metadata repository database type and appears in read-only mode.
Database URL
Instructs Data Integration Hub to authenticate user names against the Microsoft Windows
authentication mechanism. Available when you select a Microsoft SQL Server database.
Database username
Name of the database user account for the database where you do not use Windows authentication.
Password of the database account for the database where you do not use Windows authentication.
Data Integration Hub stores the password as an encrypted string.
6. Click Next.
If you selected the Data Integration Hub Dashboard and Reports component, the Operational Data Store
page appears. If you did not select the Dashboard and Reports component, go to “Step 5. Configure User
Authentication” on page 33.
2. Click Next.
Location of the database. If you select this option, enter the values in the following fields:
• Database host name. Host name of the machine where the database server is installed.
• Database port number. Port number for the database. The default port number for an Oracle
database is 1521. The default port number for a Microsoft SQL server or a Microsoft Azure SQL
database is 1433.
• Database SID. System identifier for the database if you select Oracle as the database. Enter
either a fully qualified ServiceName or a fully qualified SID.
Note: It is recommended that you enter a ServiceName in this field.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the Operational Data Store.
• ODBC string. ODBC connection string to the Operational Data Store. Available if you install the
PowerCenter Client plug-in. The installer cannot verify the validity of the ODBC string.
Note: If you use a named Microsoft SQL Server database instance, you cannot connect to the
database instance using the Database URL option. Use the Custom Connection String option.
For example:
jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
Instructs Data Integration Hub to authenticate user names against the Microsoft Windows
authentication mechanism. Available when you select a Microsoft SQL Server database.
Database username
Name of the operational data store user account for the database where you do not use Windows
authentication.
Password for the operational data store account for the database where you do not use Windows
authentication. Data Integration Hub stores the password as an encrypted string.
4. Click Next.
The User Authentication page appears.
The following table describes the settings that you need to configure for the Informatica Platform
Authentication page:
Gateway host
Host name of the Informatica security domain server. Data Integration Hub stores the host name in the
pwc.domain.gateway system property.
Port number for the Informatica security domain gateway. Data Integration Hub stores the port number
in the pwc.domain.gateway system property. Use the gateway HTTP port number to connect to the
domain from the PowerCenter Client. You cannot use the HTTPS port number to connect to the domain.
Username
User name to access the Administrator tool. You must create the user in the Administrator tool and
assign the manage roles/groups/user privilege to the user.
Password
Password of the Informatica security domain user.
Security domain
Security group
Optional. Security group within the Informatica security domain where Data Integration Hub users are
defined in the following format:
<security group>@<domain>
If you leave the field empty, the Informatica security domain synchronizes only the Data Integration Hub
administrator user account.
Data Integration Hub stores the security group in the dx.authentication.groups system property in the
following format:
<group name>@<security group>[;<groupname>@<security group>]
The following image shows the Informatica Domain with Kerberos Authentication page.
Service Principal Name (SPN) for the Data Integration Hub Operation Console.
Data Integration Hub stores the SPN in the dx-security-config.properties property file, in the
dx.kerberos.console.service.principal.name property.
Location of the keytab file for the Data Integration Hub Operation Console SPN.
<username>@<SECURITY_DOMAIN>
Gateway host
Security group
Optional. Security group within the Informatica security domain where Data Integration Hub users are
defined in the following format:
<security group>@<domain>
If you leave the field empty, the Informatica security domain synchronizes only the Data Integration Hub
administrator user account.
Data Integration Hub stores the security group in the dx.authentication.groups system property in the
following format:
<group name>@<security group>[;<groupname>@<security group>]
2. Click Next.
Instructs Data Integration Hub to use secure network communication when you open the Operation
Console in the browser. If you select HTTPS and HTTP, the Operation Console switches existing
HTTP connections with HTTPS connections.
Port number for the Tomcat connector to use when you open the Operation Console with HTTPS.
The default value is 18443.
Instructs the installer to generate a keystore file with an unregistered certificate. If you select this
option, ignore the security warning that you receive from the browser the first time you open the
Operation Console.
Instructs the installer to load an existing keystore file. Enter values in the following fields:
• Keystore password. Password for the keystore file.
• Keystore file. Path to the keystore file.
The keystore file must be in the Public Key Cryptography Standard (PKCS) #12 format.
Port number for the listener that controls the Tomcat server shutdown.
The default value is 18005.
URL that the PowerCenter Web Services Hub uses to process publication and subscription
workflows.
Service name
Host name of the node that runs the PowerCenter Repository Service.
Port number of the node that runs the PowerCenter Repository Service.
Username
Password
Password for the PowerCenter Repository Service user. Data Integration Hub stores the password
as an encrypted string.
Security domain
Optional. Name of the Informatica security domain in which the PowerCenter Repository Service
user is stored.
Default is Native.
3. Click Next.
Name of the Informatica domain that contains the PowerCenter Integration Service that runs Data
Integration Hub workflows.
Node name
Node in the Informatica domain on which the PowerCenter Integration Service runs.
The name of the PowerCenter Integration Service that Data Integration Hub uses to run workflows.
5. Click Next.
7. Click Next.
If you selected to install the Data Integration Hub Informatica Data Quality component, the Informatica
Data Quality Domain Settings page appears. If you did not select to install the Data Integration Hub
Informatica Data Quality component, the Pre-Installation Summary page appears.
Domain name of the node on which the Data Integration Service runs.
Host name
Host name of the node on which the Data Integration Service runs.
Port number
Port number of the node on which the Data Integration Service runs.
User name for the node on which the Data Integration Service runs.
Password for the node on which the Data Integration Service runs.
Security Domain
Optional. Name of the Informatica security domain in which the Model Repository Service user is
stored.
3. Enter the location of the infacmd command line utility or accept the default directory, and then click
Next.
The Pre-Installation Summary page appears.
During the installation process, the installer displays progress information. When the installation process
ends, the Post-Installation Actions page appears.
2. If you installed the Data Integration Hub PowerCenter server plug-in, follow the wizard instructions to
register the plug-in to the PowerCenter repository, and then click Next.
The Installation Complete page appears.
3. Click Done to close the installer.
4. To view the log files that the installer generates, navigate to the following directory:
<DIHInstallationDir>\logs.
5. Perform the required post-installation tasks.
For more information, see Chapter 5, “Post-Installation Tasks” on page 68.
Note: Perform only the tasks that are relevant for your environment.
6. Optionally, perform additional configuration tasks. For more information, see Chapter 8, “Optional Data
Integration Hub Configuration” on page 112.
Before you install, verify that your environment meets the minimum system requirements, perform the pre-
installation tasks, and verify that the PowerCenter services are running.
During the installation, Data Integration Hub saves log files in the home directory of the user, in the
subdirectory named DXLogs. If the installation does not complete successfully, you can view the log files in
this location.
You can make the Dashboard available in Data Integration Hub in one of two ways. When you install Data
Integration Hub, you can enable the dashboard that uses the operational data store. Otherwise, you can
use the dashboard that uses the metadata repository, which is available by default.
Installs the Data Integration Hub Dashboard and Reports component. You must install Data
Integration Hub to install the Dashboard and Reports component.
Cleared by default.
Installs the Data Integration Hub PowerCenter server plug-in component. After the installation,
register the plug-in to the PowerCenter repository.
Selected by default.
Connecting module between Data Integration Hub and the Hadoop cluster. Enables Data Integration
Hub to perform operations on the Hadoop publication repository.
Select this component if you want to define a Hadoop-based publication repository and manage
some of your topics on Hadoop.
Note: The Data Integration Hub Hadoop Service must be installed on an edge node in the Hadoop
cluster. Unless you are installing Data Integration Hub on a Hadoop edge node, do not select this
component during Data Integration Hub installation. In this case, after you complete the Data
Integration Hub installation, run the installation again on the edge node and select to install only the
Data Integration Hub Hadoop Service component on the node.
Enables Data Integration Hub to use Data Engineering Integration and Data Quality mappings for
custom publications and subscriptions.
Select this component if you want to create custom publications and subscriptions that use Data
Engineering Integration or Data Quality mappings in Data Integration Hub.
3. Press Enter.
The Select PowerCenter Version section appears.
4. Select the PowerCenter version for which to install Data Integration Hub or accept the default selection:
1- PowerCenter version below 10.5.
Location of the database. If you select this option, enter values in the following fields:
• Database Host Name. Host name of the machine where the database server is installed.
• Database Port Number. Port number for the database. The default port number for Oracle is
1521. The default port for Microsoft SQL Server and Microsoft Azure SQL Database is 1433.
• Database SID. System identifier for the database.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name. Name of the
database instance.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the metadata repository.
• ODBC string. ODBC connection string to the metadata repository. Applicable if you install the
PowerCenter client plug-in. The installer cannot verify the validity of the ODBC string.
Note: If you use a named Microsoft SQL Server database instance, you cannot connect to the
database instance using the Database URL option. Use the Custom Connection String option.
For example:
jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME
\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
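The named-instance connection string above follows a fixed pattern. A small helper (an illustrative sketch, not part of the installer) shows how the host, instance, and database names combine:

```python
def named_instance_jdbc_url(host: str, instance: str, database: str) -> str:
    """Build a JDBC URL for a named Microsoft SQL Server instance,
    matching the example format in this guide."""
    # The backslash separates the host from the instance name.
    return (
        f"jdbc:informatica:sqlserver://{host}"
        f"\\{instance};DatabaseName={database};"
    )

url = named_instance_jdbc_url(
    "MYSQLSERVERCOMPUTERHOSTNAME", "MYDBINSTANCENAME", "MYDATABASENAME"
)
print(url)
# jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
```

Note that the backslash before the instance name must survive any quoting in the shell or properties file you paste the string into.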
5. Enter values in the following fields:
Database username
Password for the database account. Data Integration Hub stores the password
as an encrypted string.
Location of the database. If you select this option, enter values in the following fields:
• Database host name. Host name of the machine where the database server is installed.
• Database port number. Port number for the database. The default port number for Oracle is
1521. The default port for Microsoft SQL Server or Microsoft Azure SQL Database is 1433.
• Oracle database. Database SID. System identifier for the database.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name. Name of the
database instance.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the publication repository.
• ODBC string. Applicable if you install the PowerCenter client plug-in. ODBC connection string to
the publication repository. The installer cannot verify the validity of the ODBC string.
Note: If you use a named Microsoft SQL Server database instance, you cannot connect to the
database instance using the Database URL option. Use the Custom Connection String option.
For example:
jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME
\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
8. Press Enter.
9. Enter values in the following fields:
Database username
The password for the database account. Data Integration Hub stores the password as an encrypted
string.
10. Press Enter.
If you selected to install the Data Integration Hub Dashboard and Reports component, the Operational
Data Store section appears. If you did not select to install the Dashboard and Reports component, go to
“Step 5. Configure the Web Server” on page 102.
Uses the tables and data in an existing operational data store repository.
Location of the database. If you select this option, enter values in the following fields:
• Database host name. Host name of the machine where the database server is installed.
• Database port number. Port number for the database. The default port number for Oracle is
1521. The default port for Microsoft SQL Server or Microsoft Azure SQL Database is 1433.
• Oracle database. Database SID. System identifier for the database.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name. Name of the
database instance.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the Operational Data Store.
• ODBC string. ODBC connection string to the Operational Data Store. If you install the
PowerCenter client plug-in, the installer cannot verify the validity of the ODBC string.
Note: If you use a named Microsoft SQL Server database instance, you cannot connect to the
database instance using the Database URL option. Use the Custom Connection String option.
For example:
jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME
\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
3. Enter values for the operational data store in the following fields:
Database username
Password for the database account. Data Integration Hub stores the password
as an encrypted string.
4. Press Enter.
The User Authentication section appears.
Gateway host
Host name of the Informatica security domain server. Data Integration Hub stores the host name in the
pwc.domain.gateway system property.
Gateway port
Port number for the Informatica security domain gateway. Data Integration Hub stores the port number
in the pwc.domain.gateway system property. Use the gateway HTTP port number to connect to the
domain from the PowerCenter Client. You cannot use the HTTPS port number to connect to the domain.
Username
User name to access the Administrator tool. You must create the user in the Administrator tool and
assign the manage roles/groups/user privilege to the user.
Password
Security domain
Security group
Optional. Security group within the Informatica security domain where Data Integration Hub users are
defined in the following format:
<security group>@<domain>
If you leave the field empty, the Informatica security domain synchronizes only the Data Integration Hub
administrator user account.
Data Integration Hub stores the security group in the dx.authentication.groups system property in the
following format:
<group name>@<security group>[;<groupname>@<security group>]
Service Principal Name (SPN) for the Data Integration Hub Operation Console.
Location of the keytab file for the Data Integration Hub Operation Console SPN.
If you change the property to point to a different file, you must enter the absolute path to the file using
the following format:
file://<full_path>
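If you script this property change, the keytab path can be normalized to the required file:// form. This is a sketch that assumes a Linux-style absolute path; the keytab location shown is an illustrative assumption.

```python
def keytab_property_value(path: str) -> str:
    """Format an absolute keytab path in the file://<full_path> form
    that the property requires. Assumes a Linux-style path."""
    if not path.startswith("/"):
        raise ValueError("keytab path must be absolute")
    return f"file://{path}"

print(keytab_property_value("/opt/dih/conf/console.keytab"))
# file:///opt/dih/conf/console.keytab
```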
System Administrator
<username>@<SECURITY_DOMAIN>
Gateway host
Security group
Optional. Security group within the Informatica security domain where Data Integration Hub users are
defined in the following format:
<security group>@<domain>
If you leave the field empty, the Informatica security domain synchronizes only the Data Integration Hub
administrator user account.
Data Integration Hub stores the security group in the dx.authentication.groups system property in the
following format:
<group name>@<security group>[;<groupname>@<security group>]
Instructs Data Integration Hub to use secure network communication when you open the
Operation Console in the browser.
If you select HTTPS and HTTP, the Operation Console switches existing HTTP connections with
HTTPS connections.
2- Enable HTTP
Instructs Data Integration Hub to use regular HTTP network communication when you open the
Operation Console in the browser.
b. If you selected Enable HTTPS, enter values in the following fields:
Connector port number
Port number for the Tomcat connector to use when you open the Operation Console with
HTTPS.
The default value is 18443.
Instructs the installer to generate a keystore file with an unregistered certificate. If you select
this option, ignore the security warning that you receive from the browser the first time you
open the Operation Console.
Instructs the installer to load an existing keystore file. Enter values in the following fields:
• Keystore password. Password for the keystore file.
• Keystore file. Path to the keystore file.
The keystore file must be in the Public Key Cryptography Standard (PKCS) #12 format.
c. If you selected Enable HTTP, enter values in the following fields:
HTTP connector port number
Port number for the HTTP connector. If you clear this field, your browser must connect to the
Data Integration Hub server with HTTPS when you log in to the Operation Console.
The default value is 18080.
Port number for the listener that controls the Tomcat server shutdown.
The default value is 18005.
4. Press Enter.
If you selected to install the Data Integration Hub PowerCenter server plug-in or the Data Integration Hub
PowerCenter Client plug-in components, the PowerCenter Location section appears. If you did not select
the PowerCenter server or client components, the PowerCenter Web Services Hub section appears.
Host name of the node that runs the PowerCenter Repository Service.
Port number of the node that runs the PowerCenter Repository Service.
Username
Password
Password for the PowerCenter Repository Service user. Data Integration Hub stores the password
as an encrypted string.
Security domain
Optional. Name of the Informatica security domain in which the PowerCenter Repository Service
user is stored.
Default is Native.
5. Press Enter.
If you selected to install the Data Integration Hub server plug-in for PowerCenter component, the
PowerCenter Domain Settings section appears. If you did not select the PowerCenter server component,
the PowerCenter pmrep Command Line Utility Location section appears. Go to step 9.
6. Enter values in the following fields:
Domain name
Name of the Informatica domain that contains the PowerCenter Integration Service that runs Data
Integration Hub workflows.
Node name
Node in the Informatica domain on which the PowerCenter Integration Service runs.
Password for the Informatica domain administrator. Data Integration Hub stores the password as
an encrypted string.
7. Press Enter.
8. Enter the name of the PowerCenter Integration Service that Data Integration Hub uses to run workflows,
and then press Enter.
The silent install supports fresh installation of all Data Integration Hub components on a single node.
Upgrades of Data Integration Hub and High Availability multi-node installations of Data Integration Hub using
the silent installer are not supported. Before you install, verify that your environment meets the minimum system
requirements, perform the pre-installation tasks, and verify that the PowerCenter services are running.
1. Configure the installation properties file and specify the installation options in the properties file.
2. Run the installer with the installation properties file.
The following table describes parameters that you add in the installation properties file:
Parameter Description
PC_SERVER_PLUGIN To install the Data Integration Hub server plug-in for the
PowerCenter services, set the parameter to 1. Otherwise, set the parameter to 0.
DIH_HADOOP_SERVICE To install the Data Integration Hub Hadoop service, set the
parameter to 1. Otherwise, set the parameter to 0. This parameter applies
only to Linux.
Configure the Data Integration Hub repository using the following parameters:
DB_PORT_1 Port number for the database server. The default port
number for Oracle is 1521. The default port number for
Microsoft SQL Server is 1433.
DB_USER_1 Name of the database user account for the database where
you do not use Windows authentication.
DB_PASSWORD_1 Password for the database account for the database where
you do not use Windows authentication. Data Integration
Hub stores the password as an encrypted string.
STAGING_PORT_1 Port number for the staging server. The default port number
for Oracle is 1521. The default port number for Microsoft
SQL Server is 1433.
STAGING_USER_1 Name of the staging user account for the database where
you do not use Windows authentication.
STAGING_PASSWORD_1 Password for the database account for the staging database where
you do not use Windows authentication. Data Integration
Hub stores the password as an encrypted string.
ODS_DB_PORT_1 Port number for the database. The default port number for
an Oracle database is 1521. The default port number for a
Microsoft SQL server is 1433.
ODS_DB_USER_1 Name of the database user account for the database where
you do not use Windows authentication.
ODS_DB_PASSWORD_1 Password for the database account for the database where
you do not use Windows authentication. Data Integration
Hub stores the password as an encrypted string.
Configure settings for Informatica Domain with Kerberos authentication using the following parameters:
Configure settings for Informatica domain authentication using the following parameters:
Configure Data Integration Hub native authentication using the following parameter:
Configure the Data Integration Hub document store using the following parameter:
TOMCAT_HTTPS_PORT_1 Port number for the Tomcat connector to use when you
open the Operation Console with HTTPS.
The default value is 18443.
TOMCAT_SERVER_LISTENER_PORT_1 Port number for the listener that controls the Tomcat server
shutdown.
The default value is 18005.
PC_WEB_SERVICES_URL_1 URL that the PowerCenter Web Services Hub uses when
Data Integration Hub transfers documents to PowerCenter
for processing with batch workflows.
Configure Informatica Data Quality domain settings using the following parameters:
DIS_HOST_NAME_1 Host name of the node that runs the Data Integration
Service.
DIS_PORT_NUMBER_1 Port number of the node that runs the Data Integration
Service.
Configure Informatica Data Quality Command Line Utility using the following parameters:
#Install or Upgrade
#------------------
IS_INSTALL=1
IS_UPGRADE=0
#PowerCenter Version
#--------------------
PWC_VERSION_BEFORE_10_5=0
PWC_VERSION_10_5=1
#PowerCenter Location
#--------------------
POWERCENTER_HOME_1=C:\\Informatica\\105
#Installation Directory #Mandatory
#--------------------------------
USER_INSTALL_DIR=C:\\Informatica\\DIH
#Installation Components
#-----------------------
DX_SERVER=0
DX_DASHBOARD=0
PC_SERVER_PLUGIN=0
PC_CLIENT_PLUGIN=1
DIH_HADOOP_SERVICE=0
#Metadata Repository
#-------------------
BLANK_USER=0
CONFIGURED_USER=1
#User Authentication
#-------------------
INTERNAL_AUTH=1
INTERNAL_AUTH_DEFAULT=true
ISF_AUTH=0
ISF_AUTH_DEFAULT=false
KERBEROS_AUTH=0
KERBEROS_AUTH_DEFAULT=false
#Web Server
#----------
TOMCAT_ENABLE_HTTPS_1=1
TOMCAT_HTTPS_PORT_1=18443
TOMCAT_EXISTING_KEYSTORE_FILE_1=0
TOMCAT_KEYSTORE_PASSWORD_1=
TOMCAT_KEYSTORE_FILE_PATH_1=
TOMCAT_ENABLE_HTTP_1=1
TOMCAT_PORT_1=18080
TOMCAT_SERVER_LISTENER_PORT_1=18005
#PowerCenter Location
#--------------------
POWERCENTER_HOME_1=C:\\Informatica\\105
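A pre-flight check of the properties file can catch the most common silent-install mistakes before the installer fails. The property names below come from this guide; the validation rules (install or upgrade but not both, exactly one authentication mode) are assumptions inferred from the parameter descriptions, not rules the installer documents.

```python
def parse_properties(text: str) -> dict[str, str]:
    """Parse a Java-style key=value properties file, skipping
    blank lines and # comments."""
    props: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

# Trimmed-down version of the sample file in this guide.
SAMPLE = """
#Install or Upgrade
IS_INSTALL=1
IS_UPGRADE=0
#User Authentication
INTERNAL_AUTH=1
ISF_AUTH=0
KERBEROS_AUTH=0
"""

props = parse_properties(SAMPLE)

# Assumed invariants: choose install or upgrade, and exactly one
# authentication mode.
assert props["IS_INSTALL"] != props["IS_UPGRADE"]
auth_flags = [props["INTERNAL_AUTH"], props["ISF_AUTH"], props["KERBEROS_AUTH"]]
assert auth_flags.count("1") == 1
```

Running a check like this before invoking the installer avoids a failed silent run followed by log-file archaeology.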
The silent installation fails if you incorrectly configure the properties file or if the installation directory is not
accessible. View the installation log files and correct the errors. Then run the silent installation again.
Post-Installation Tasks
This chapter includes the following topics:
1. If you installed the Data Integration Hub server plug-in for PowerCenter, register the plug-in to the
PowerCenter repository.
2. Configure PowerCenter to access Data Integration Hub.
3. If you want to use the Run Publication Subscription web service API, set up the web service.
4. If you installed the Data Integration Hub repositories on a Microsoft SQL Server and you selected to use
Windows authentication, configure credentials for Windows authentication.
5. Set up partitions on the Data Integration Hub publication repository database that stores published data
sets. Setting up partitions is highly recommended.
6. If you installed the Data Integration Hub Hadoop Service component, configure your environment for a
Hadoop-based publication repository.
7. If you installed Data Integration Hub on the SUSE Linux operating system, set the JRE_HOME
environment variable to <DIH_HOME>/DataIntegrationHub/jdk1.8/jre.
8. Start the Data Integration Hub services. For more information, see Chapter 7, “Starting and Stopping
Data Integration Hub” on page 110.
68
Note: If you installed the Data Integration Hub Hadoop Service component on a different machine than
the machine where you installed Data Integration Hub, start the services on both machines. For more
information, see “Starting and Stopping Data Integration Hub on Linux” on page 111.
9. Log in to the Data Integration Hub Operation Console.
10. Configure connections to the Data Integration Hub repositories in the Data Integration Hub Operation
Console.
11. If you installed Data Integration Hub repositories on Microsoft Azure SQL databases, configure
connections to the Microsoft Azure SQL databases.
12. If you installed Data Integration Hub with Informatica domain authentication or with Informatica domain
with Kerberos authentication, synchronize Data Integration Hub users in the Data Integration Hub
Operation Console.
13. If you installed the Dashboard and Reports component, activate the component.
14. If you want to configure subscriptions with the delivery option Insert new rows and update changed
rows, and the column values are expected to contain NULLs, set the PowerCenter
Integration Service property Treat Null In Comparison Operators As to Low.
Related Topics:
• “Overview of Starting and Stopping Data Integration Hub” on page 110
The PowerCenter Repository Service must be running in exclusive mode when you register the plug-in. After
the registration, restart the PowerCenter Repository Service in normal mode.
If the PowerCenter services and the Data Integration Hub server run on separate machines, verify that the
settings for the Data Integration Hub server are set correctly.
To use the web service, the Informatica domain must contain the following services:
1. Use the PowerCenter Repository Manager to import the following workflow file into the PowerCenter
repository: wf_m_DIH_WS_StartPublicationSubscription.xml.
2. In the Web Services Hub console, verify that the Data Integration Hub web service is correctly imported
into PowerCenter. If the import process is successful, the list of valid services includes the Data
Integration Hub web service.
3. You can use the Try-It application in the Web Services Hub console to test the Data Integration Hub web
service. On the XML Input tab, enter the data into the SOAP message and click Send. To avoid
authentication errors, do not use the Form Input page to test the Data Integration Hub web service.
After you verify that the web service is working, you can create a client application to send requests to
the web service.
In the Informatica Administrator, select the PowerCenter Integration Service that runs Data Integration Hub
workflows. Verify the following environment variable settings:
Environment Variable Value
Before you start the configuration process, verify that all Data Integration Hub Windows services are stopped
and that the Data Integration Hub Operation Console and the Data Integration Hub server are not running.
When you set up partitioning, Data Integration Hub reduces fragmentation when it deletes expired data
sets, and reduces the time that data set creation and data consumption take. You can set up partitions
on Oracle and Microsoft SQL Server databases.
If you switch between a partitioned database and a non-partitioned database, the change affects topics,
publications, and subscriptions that you create after the switch. Therefore, it is recommended that you
choose a partition state before you start to use Data Integration Hub. To switch the partition state for
existing topics, publications, or subscriptions, contact Informatica Global Customer Service.
1. On the machine where the Data Integration Hub Hadoop Service is installed, use a text editor to open the
dx-configuration.properties file from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
2. Set the properties in the following sections of the file and then save the file:
HIVE settings
Property Description
dih.hadoop.hive.password Password of the user that connects to the Apache Hive server.
dih.hadoop.service.warehouse.dir Path to the Hive warehouse directory. Required if the Apache Hive
server uses a non-default schema. If the Apache Hive server uses the
default schema, do not enter a value for this property.
For example:
dih.hadoop.hive.username=hive
dih.hadoop.hive.password=password
dih.hadoop.hive.url=jdbc:hive2://hive_host:10000/myschema
dih.hadoop.service.warehouse.dir=/user/hive/mydatawarehousedir
KERBEROS settings
If the Hadoop cluster uses Kerberos authentication, configure the following settings:
3. On the machine where the Data Integration Hub Hadoop Service is installed, use a text editor to open the
dih-hadoop-service.xml file from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/conf/Catalina/localhost
If you use Informatica platform authentication, verify that all user accounts and user passwords exist on the
authentication server.
To configure the connections, you must be logged in to the Data Integration Hub Operation Console with the
administrator user account.
• The Data Integration Hub Operation Console with the administrator user account.
• The PowerCenter Client with the administrator user account.
To configure connections to the Microsoft Azure SQL database, edit the DIH_REPO connection in the
Data Integration Hub Operation Console, enable DSN for the DIH_REPO connection in PowerCenter, and then
test the connection in Data Integration Hub. Perform the same procedure for the DIH_STAGING connection.
1. Perform the following steps in the Data Integration Hub Operation Console to edit the DIH_REPO
connection:
To synchronize users in the Informatica security domain with Data Integration Hub, the following conditions
must be true:
• The Informatica security domain is configured on the Security page of Informatica Administrator.
• At least one security group in the Informatica security domain contains the Data Integration Hub users to
synchronize.
• The Data Integration Hub system property dx.authentication.groups contains the list of groups from
the Informatica security domain to synchronize, in the following format:
<group name>@<security domain>[;<group name>@<security domain>]
• One of the groups that are defined in dx.authentication.groups contains the user that performs the
synchronization.
• The user that is defined in the Data Integration Hub system property pwc.repository.user.name has
privileges to manage users, groups, and roles.
• The Data Integration Hub user has privileges to synchronize users.
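For example, a dx.authentication.groups value that lists two groups in the format described above. The group and security domain names here are hypothetical; substitute the groups and security domain that are configured in your Informatica domain:

```properties
dx.authentication.groups=DIH_Operators@CorpAD;DIH_Admins@CorpAD
```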
1. Contact Informatica Global Customer Support to receive the Logi Info Dashboard license files.
2. Start the Data Integration Hub services.
3. Move the file _Settings.lgx from the following location:
<DIHInstallationDir>\DataIntegrationHub\tomcat\webapps\dih-dashboard\_Definitions
to the following location:
<DIHInstallationDir>\DataIntegrationHub\tomcat\shared\classes
and rename the file to dx_dashboard_configuration.xml.
4. Copy the Logi Info Dashboard license file _Settings_encrypted.lgx to the following location:
<DIHInstallationDir>\DataIntegrationHub\tomcat\webapps\dih-dashboard\_Definitions
5. Rename the file _Settings_encrypted.lgx to _Settings.lgx.
6. Restart the Data Integration Hub services.
If the IP addresses of the machine that hosts Data Integration Hub change any time after the installation, you
must update the IP addresses in the Logi Info Dashboard license file. For more information, see “Updating
the Dashboard Configuration File” on page 120.
If you use an existing workflow with the name DX_ETL, rename the existing workflow in PowerCenter
Repository Manager before you import the ODS event loader workflow, or import the workflow to a different
folder.
Note: After you import the ODS event loader workflow, do not run the workflow manually. The workflow must
start at the scheduled time. If you start the workflow manually it might fail to store aggregated events in the
Data Integration Hub ODS.
Oracle: <DIHInstallationDir>\powercenter\ETL\DX_ETL.xml
• 10.4.1
• 10.4.0
When you upgrade Data Integration Hub, the installer backs up the files of the previous version of Data
Integration Hub and installs the new version. If the document store is found under the Data Integration Hub
Installation folder, you must move the document store to its original location after the upgrade process
completes and before you start the DIH service. You can create a new repository for the new version or you
can use the existing repository. If you use the repository from the previous version, the installer upgrades the
repository to the latest version. When you upgrade the Data Integration Hub repository, you cannot change
the database server type and server location.
Before you start the upgrade process, perform the procedures that are described in “Before You Upgrade” on
page 82. Then run the Data Integration Hub installer. After the upgrade, perform the procedures that are
described in “After You Upgrade” on page 105.
Note: During the upgrade you cannot change the user authentication method that Data Integration Hub uses.
To change the user authentication method you must first upgrade the system and then switch to the required
authentication method. For information about switching between authentication methods see the Data
Integration Hub Administrator Guide.
Before You Upgrade
To prepare for the upgrade, perform the following tasks:
1. Verify that you have the user names and passwords for the required database accounts.
2. Verify that no publications or subscriptions are running and that all events are in a Final state.
Tip: You can view the state of all events in the Data Integration Hub Operation Console, on the Event List
page.
3. Ensure that all topics are in a Valid status.
Tip: You can view the status of all topics in the Data Integration Hub Operation Console, on the Topics
page.
4. Ensure that the retention management job is not running. If the retention management job is running, you
will see an incomplete System Event for the retention management job.
Note: If the publication repository is used by applications other than Data Integration Hub, you might lose
data in the tables that are not managed by Data Integration Hub when the upgrade is complete.
5. Ensure that the dx.endpoint.file.prefix property is empty.
Note: This property holds the landing zone path. Earlier releases of Data Integration Hub do not support this
property. If the property value is not empty, set it to an empty value.
6. Stop all Data Integration Hub services. The Data Integration Hub upgrade modifies the Data Integration
Hub files. The installer cannot proceed if the files are in use.
Note: If the Data Integration Hub Hadoop Service component is installed on a different machine than the
machine where you installed Data Integration Hub, stop the services on both machines. For more
information, see “Starting and Stopping Data Integration Hub on Linux” on page 111.
7. In PowerCenter, unschedule all custom workflows and verify that they do not run until the upgrade is
complete.
8. Back up the Data Integration Hub repository to be upgraded. Use the database server backup utility to
back up the repository. This ensures that you can recover from any errors that you encounter during the
upgrade.
9. Back up the Data Integration Hub publication repository to be upgraded. Use the database server backup
utility to back up the repository. This ensures that you can recover from any errors that you encounter
during the upgrade.
10. Back up the existing Data Integration Hub installation folder. Perform this action to help ensure that you
can recover from any errors encountered during the upgrade, and that, after the upgrade, you can reapply
modifications that were made to the configuration in previous versions.
11. If the PowerCenter services are not installed on the same machine where Data Integration Hub is
installed and you have upgraded the pmrep command line utility after you installed the previous version
of Data Integration Hub, clean up all CNX files from the Tmp folder on your root directory.
Before you install, verify that your environment meets the minimum system requirements, perform the pre-
installation tasks, and verify that the PowerCenter services are running.
Note: During the upgrade, Data Integration Hub saves log files in the home directory of the user in the
subdirectory named DXLogs. If the upgrade does not complete successfully, you can view the log files in this
location.
Note: You must select the same installation directory where you installed the previous Data Integration
Hub version.
2. Click Next.
The Installation Components page appears.
Installs the Data Integration Hub Dashboard and Reports component. Select this component to view
the reports in the Dashboard using operational data store. You must install Data Integration Hub to
install the Dashboard and Reports component.
Cleared by default.
Note:
• The Dashboard using metadata directory is installed by default. Select this component to view
the reports in the Dashboard using operational data store. After installation, you must set the
dx.dashboard.ods.page.show system property to true to view the reports in the Dashboard using
operational data store. For more information, see the Data Integration Hub Administrator Guide
and the Data Integration Hub Operator Guide.
• If you install the Dashboard and Reports component, you must import the operational data store
event loader after you install Data Integration Hub.
• If you install the Dashboard and Reports component, your Data Integration Hub and operational
data store repositories are installed on Microsoft SQL Servers, and you use PowerCenter version
10, you must configure the repository connections in PowerCenter Workflow Manager. For
details, see “Configuring Repository Connections on PowerCenter Version 10” on page 122.
Installs the Data Integration Hub plug-in for the PowerCenter Client. Install this component on every
machine that runs the PowerCenter Client.
Selected by default.
Enables Data Integration Hub to use Data Quality mappings for custom publications and
subscriptions.
Select this component if you want to create custom publications and subscriptions that use Data
Quality mappings in Data Integration Hub.
Cleared by default.
4. Click Next.
5. Select the PowerCenter version for which to install Data Integration Hub and click Next.
The Metadata Repository page appears.
2. Click Next.
Type of database to use for the Data Integration Hub metadata repository. You can choose one of
the following options:
• Oracle
• Microsoft SQL Server
Database URL
Instructs Data Integration Hub to authenticate user names against the Microsoft Windows
authentication mechanism. Available when you select a Microsoft SQL Server database.
Database username
Name of the database user account for the database where you do not use Windows authentication.
Password for the database account for the database where you do not use Windows authentication.
Data Integration Hub stores the password as an encrypted string.
4. Click Next.
The Publication Repository Connection page appears.
Type of database to use for the publication repository. The database type must match the Data
Integration Hub metadata repository database type and appears in read-only mode.
Database URL
Instructs Data Integration Hub to authenticate user names against the Microsoft Windows
authentication mechanism. Available when you select a Microsoft SQL Server database.
Database username
Name of the database user account for the database where you do not use Windows authentication.
Password of the database account for the database where you do not use Windows authentication.
Data Integration Hub stores the password as an encrypted string.
6. Click Next.
If you selected the Data Integration Hub Dashboard and Reports component, the Operational Data Store
page appears. If you did not select the Dashboard and Reports component, go to “Step 5. Configure the
Web Server ” on page 92.
2. Click Next.
Location of the database. If you select this option, enter the values in the following fields:
• Database host name. Host name of the machine where the database server is installed.
• Database port number. Port number for the database. The default port number for an Oracle
database is 1521. The default port number for a Microsoft SQL server or a Microsoft Azure SQL
database is 1433.
• Database SID. System identifier for the database if you select Oracle as the database. Enter
either a fully qualified ServiceName or a fully qualified SID.
Note: It is recommended that you enter a ServiceName in this field.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the Operational Data Store.
• ODBC string. ODBC connection string to the Operational Data Store. Available if you install the
PowerCenter Client plug-in. The installer cannot verify the validity of the ODBC string.
Note: If you use a named Microsoft SQL Server database instance, you cannot connect to the
database instance using the Database URL option. Use the Custom Connection String option.
For example:
jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
Instructs Data Integration Hub to authenticate user names against the Microsoft Windows
authentication mechanism. Available when you select a Microsoft SQL Server database.
Database username
Name of the operational data store user account for the database where you do not use Windows
authentication.
Password for the operational data store account for the database where you do not use Windows
authentication. Data Integration Hub stores the password as an encrypted string.
4. Click Next.
The Web Server page appears.
Instructs Data Integration Hub to use secure network communication when you open the Operation
Console in the browser. If you select both HTTPS and HTTP, the Operation Console replaces existing
HTTP connections with HTTPS connections.
Port number for the Tomcat connector to use when you open the Operation Console with HTTPS.
The default value is 18443.
Instructs the installer to generate a keystore file with an unregistered certificate. If you select this
option, ignore the security warning that you receive from the browser the first time you open the
Operation Console.
Instructs the installer to load an existing keystore file. Enter values in the following fields:
• Keystore password. Password for the keystore file.
• Keystore file. Path to the keystore file.
The keystore file must be in the Public Key Cryptography Standard (PKCS) #12 format.
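If you do not already have a keystore in the required PKCS #12 format, one way to produce a test keystore is with OpenSSL. The following is a hedged sketch only: the file names, the subject name dih.example.com, and the password changeit are placeholders invented for this example, and a self-signed certificate like this one is suitable only for testing, not for production.

```shell
# Generate a self-signed certificate and private key (test use only).
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -subj "/CN=dih.example.com" \
  -keyout dih-key.pem -out dih-cert.pem

# Package the certificate and key as a PKCS #12 keystore that the
# installer can load as an existing keystore file.
openssl pkcs12 -export -in dih-cert.pem -inkey dih-key.pem \
  -out dih-keystore.p12 -passout pass:changeit
```

When the installer prompts for the keystore, supply the path to dih-keystore.p12 and the password you chose in place of changeit.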
2. Click Next.
If you selected to install the Data Integration Hub server plug-in for PowerCenter or the Data Integration
Hub client plug-in for PowerCenter components, the PowerCenter Location page appears. If you did not
select the PowerCenter server or client components, the PowerCenter Web Services Hub page appears.
URL that the PowerCenter Web Services Hub uses to process publication and subscription
workflows.
Service name
Host name of the node that runs the PowerCenter Repository Service.
Port number of the node that runs the PowerCenter Repository Service.
Username
Password
Password for the PowerCenter Repository Service user. Data Integration Hub stores the password
as an encrypted string.
Security domain
Optional. Name of the Informatica security domain in which the PowerCenter Repository Service
user is stored.
Default is Native.
2. Click Next.
If you selected to install the Data Integration Hub server plug-in for PowerCenter component, the
PowerCenter Domain Settings page appears.
Name of the Informatica domain that contains the PowerCenter Integration Service that runs Data
Integration Hub workflows.
Node name
Node in the Informatica domain on which the PowerCenter Integration Service runs.
Domain administrator username
Password for the Informatica domain administrator. Data Integration Hub stores the password as
an encrypted string.
Integration Service name
The name of the PowerCenter Integration Service that Data Integration Hub uses to run workflows.
4. Click Next.
6. Click Next.
If you selected to install the Data Integration Hub Informatica Data Quality component, the Informatica
Data Quality Domain Settings page appears. If you did not select to install the Data Integration Hub
Informatica Data Quality component, the Pre-Installation Summary page appears.
Domain name of the node on which the Data Integration Service runs.
Host name
Host name of the node on which the Data Integration Service runs.
Port number
Port number of the node on which the Data Integration Service runs.
User name for the node on which the Data Integration Service runs.
Password for the node on which the Data Integration Service runs.
Security Domain
Optional. Name of the Informatica security domain in which the Model Repository Service user is
stored.
3. Enter the location of the infacmd command line utility or accept the default directory, and then click
Next.
The Pre-Installation Summary page appears.
During the installation process, the installer displays progress information. When the installation process
ends, the Post-Installation Actions page appears.
2. If you installed the Data Integration Hub PowerCenter server plug-in, follow the wizard instructions to
register the plug-in to the PowerCenter repository, and then click Next.
The Installation Complete page appears.
3. Click Done to close the installer.
4. To view the log files that the installer generates, navigate to the following directory:
<DIHInstallationDir>\logs.
5. Perform the required post-installation tasks.
For more information, see Chapter 5, “Post-Installation Tasks” on page 68.
Note: Perform only the tasks that are relevant for your environment.
6. Optionally, perform additional configuration tasks. For more information, see Chapter 8, “Optional Data
Integration Hub Configuration” on page 112.
Before you install, verify that your environment meets the minimum system requirements, perform the pre-
installation tasks, and verify that the PowerCenter services are running.
Note: During the upgrade, Data Integration Hub saves log files in the home directory of the user, in the
subdirectory named DXLogs. If the upgrade does not complete successfully, you can view the log files in this
location.
Installs the Data Integration Hub Dashboard and Reports component. You must install Data
Integration Hub to install the Dashboard and Reports component. Select this component to view the
reports in the Dashboard using operational data store.
Cleared by default.
Installs the Data Integration Hub PowerCenter server plug-in component. After the installation,
register the plug-in to the PowerCenter repository.
Selected by default.
Connecting module between Data Integration Hub and the Hadoop cluster. Enables Data Integration
Hub to perform operations on the Hadoop publication repository.
Select this component if you want to define a Hadoop-based publication repository and manage
some of your topics on Hadoop.
Note: The Data Integration Hub Hadoop Service must be installed on an edge node in the Hadoop
cluster. Unless you are installing Data Integration Hub on a Hadoop edge node, do not select this
component during the Data Integration Hub installation. Instead, after you complete the Data
Integration Hub installation, run the installer again on the edge node and select only the
Data Integration Hub Hadoop Service component.
Enables Data Integration Hub to use Data Engineering Integration and Data Quality mappings for
custom publications and subscriptions.
Select this component if you want to create custom publications and subscriptions that use Data
Engineering Integration or Data Quality mappings in Data Integration Hub.
3. Press Enter.
The Select PowerCenter Version section appears.
4. Select the PowerCenter version for which to install Data Integration Hub or accept the default selection:
1- PowerCenter version below 10.5.
Location of the database. If you select this option, enter values in the following fields:
• Database Host Name. Host name of the machine where the database server is installed.
• Database Port Number. Port number for the database. The default port number for Oracle is
1521. The default port for Microsoft SQL Server and Microsoft Azure SQL Database is 1433.
• Database SID. System identifier for the database.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name. Name of the
database instance.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the metadata repository.
• ODBC string. ODBC connection string to the metadata repository. Applicable if you install the
PowerCenter client plug-in. The installer cannot verify the validity of the ODBC string.
Note: If you use a named Microsoft SQL Server database instance, you cannot connect to the
database instance using the Database URL option. Use the Custom Connection String option.
For example:
jdbc:informatica:sqlserver://MYSQLSERVERCOMPUTERHOSTNAME\MYDBINSTANCENAME;DatabaseName=MYDATABASENAME;
5. Enter values in the following fields:
Database username
The password for the database account for the database. Data Integration Hub stores the password
as an encrypted string.
6. Press Enter.
7. Enter the number for the publication repository connection type or accept the default connection type.
Note: When using Microsoft SQL Server named instances, you must define a custom connection string.
1- Database URL
Location of the database. If you select this option, enter values in the following fields:
• Database host name. Host name of the machine where the database server is installed.
• Database port number. Port number for the database. The default port number for Oracle is
1521. The default port for Microsoft SQL Server or Microsoft Azure SQL Database is 1433.
• Oracle database. Database SID. System identifier for the database.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name. Name of the
database instance.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the publication repository.
The password for the database account. Data Integration Hub stores the password as an encrypted
string.
10. Press Enter.
If you selected to install the Data Integration Hub Dashboard and Reports component, the Operational
Data Store section appears. If you did not select to install the Dashboard and Reports component, go to
“Step 5. Configure the Web Server ” on page 102.
Uses the tables and data in an existing operational data store repository.
2. Enter the number for the database connection type for the operational data store or accept the default
connection type:
1- Database URL
Location of the database. If you select this option, enter values in the following fields:
• Database host name. Host name of the machine where the database server is installed.
• Database port number. Port number for the database. The default port number for Oracle is
1521. The default port for Microsoft SQL Server or Microsoft Azure SQL Database is 1433.
• Oracle database. Database SID. System identifier for the database.
• Microsoft SQL Server database or Microsoft Azure SQL Database. Database name. Name of the
database instance.
Connection string to the database. If you select this option, enter values in one of the following
fields:
• JDBC string. JDBC connection string to the Operational Data Store.
The password for the database account for the database. Data Integration Hub stores the password
as an encrypted string.
4. Press Enter.
The Web Server section appears.
Instructs Data Integration Hub to use secure network communication when you open the
Operation Console in the browser.
If you select both HTTPS and HTTP, the Operation Console replaces existing HTTP connections with
HTTPS connections.
2- Enable HTTP
Instructs Data Integration Hub to use regular HTTP network communication when you open the
Operation Console in the browser.
b. If you selected Enable HTTPS, enter values in the following fields:
Connector port number
Port number for the Tomcat connector to use when you open the Operation Console with
HTTPS.
The default value is 18443.
Instructs the installer to generate a keystore file with an unregistered certificate. If you select
this option, ignore the security warning that you receive from the browser the first time you
open the Operation Console.
Instructs the installer to load an existing keystore file. Enter values in the following fields:
• Keystore password. Password for the keystore file.
Port number for the HTTP connector. If you clear this field, your browser must connect to the
Data Integration Hub server with HTTPS when you log in to the Operation Console.
The default value is 18080.
Port number for the listener that controls the Tomcat server shutdown.
The default value is 18005.
2. Press Enter.
If you selected to install the Data Integration Hub PowerCenter server plug-in or the Data Integration Hub
PowerCenter Client plug-in components, the PowerCenter Location section appears. If you did not select
the PowerCenter server or client components, the PowerCenter Web Services Hub section appears.
Host name of the node that runs the PowerCenter Repository Service.
Port number of the node that runs the PowerCenter Repository Service.
Username
Password
Password for the PowerCenter Repository Service user. Data Integration Hub stores the password
as an encrypted string.
Security domain
Optional. Name of the Informatica security domain in which the PowerCenter Repository Service
user is stored.
Default is Native.
5. Press Enter.
If you selected to install the Data Integration Hub server plug-in for PowerCenter component, the
PowerCenter Domain Settings section appears. If you did not select the PowerCenter server component,
the PowerCenter pmrep Command Line Utility Location section appears. Go to step 9.
Name of the Informatica domain that contains the PowerCenter Integration Service that runs Data
Integration Hub workflows.
Node name
Node in the Informatica domain on which the PowerCenter Integration Service runs.
Password for the Informatica domain administrator. Data Integration Hub stores the password as
an encrypted string.
7. Press Enter.
8. Enter the name of the PowerCenter Integration Service that Data Integration Hub uses to run workflows,
and then press Enter.
9. Enter the location of the pmrep command line utility and then press Enter. The location of the utility
depends on whether or not you install Data Integration Hub on the machine where the PowerCenter
services are installed.
Note: On Linux operating systems, pmrep must be executable.
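A small helper can confirm that the pmrep binary has the executable bit set before the installer calls it. This is a hedged sketch: ensure_executable is a function name invented for this example, and the commented path is a placeholder for your actual PowerCenter installation directory.

```shell
# Add the executable bit to a file if it exists but is not yet executable.
ensure_executable() {
    if [ -f "$1" ] && [ ! -x "$1" ]; then
        chmod +x "$1"
    fi
}

# Example usage (path is a placeholder for your installation):
# ensure_executable "/opt/Informatica/PowerCenter/server/bin/pmrep"
```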
1. Reapply modifications that were made to Data Integration Hub configuration files in previous versions.
Note: The Data Integration Hub installer does not delete the previous version of Data Integration Hub. The
installer renames the folder with the suffix _Backupn.n.n, where n.n.n is the version number that you
upgraded from. To ensure that you update the configuration files correctly, see the configuration files in
the directory of the previous version of Data Integration Hub.
To perform this procedure, you must have backed up the Data Integration Hub installation folder.
1. Open the following file from the location where you backed up the Data Integration Hub installation
folder:
<BackupDir>/conf/dx-configuration.properties
2. On the machine where Data Integration Hub is installed, open the server and console copies of the dx-
configuration.properties files in a text editor from the following locations:
<DIHInstallationDir>\DataIntegrationHub\tomcat\shared\classes\
<DIHInstallationDir>\conf\
3. Copy any relevant configuration changes from the file that you backed up to both the dx-
configuration.properties files.
4. Save the dx-configuration.properties files.
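To identify which configuration changes to copy in step 3, it can help to compare the backed-up properties file with the installed one. The following is a hedged sketch: compare_props is a helper name invented for this example, and the commented paths are placeholders.

```shell
# Print the properties lines that differ between two properties files,
# ignoring line ordering, so you can see which changes to reapply.
compare_props() {
    old_sorted="$(mktemp)"
    new_sorted="$(mktemp)"
    sort "$1" > "$old_sorted"
    sort "$2" > "$new_sorted"
    diff "$old_sorted" "$new_sorted"
    rc=$?
    rm -f "$old_sorted" "$new_sorted"
    return $rc
}

# Example usage (paths are placeholders):
# compare_props "<BackupDir>/conf/dx-configuration.properties" \
#     "<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/dx-configuration.properties"
```

Lines prefixed with < come from the backed-up file and lines prefixed with > from the installed file; review each difference before copying it.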
Note: If the Dashboard and Reports component was installed in the previous version of Data Integration Hub,
you must have unscheduled the Data Integration Hub ODS loader workflow before you upgraded Data
Integration Hub.
Oracle: <DIHInstallationDir>\powercenter\ETL\DX_ETL.xml
3. If the Dashboard and Reports component was installed in the previous version of Data Integration Hub
and you are upgrading to the current version from version 9.6.1, select the PowerCenter repository folder
that contains the Data Integration Hub ODS loader workflow from the previous version as the import
folder target.
4. If the Dashboard and Reports component was installed in the previous version of Data Integration Hub,
in the Conflict Resolution Wizard, select Replace. In the Apply this resolution to list, select All Conflicts.
Click Next.
5. In the Global Copy Options area select the options Retain Sequence Generator, Normalizer, or XML key
current values and Retain Persistent Mapping Variable Values.
6. Follow the instructions in the Import Wizard to complete the import of the workflow.
Before you start the configuration process, verify that all Data Integration Hub Windows services are stopped
and that the Data Integration Hub Operation Console and the Data Integration Hub server are not running.
1. Open the following folder from the location where you backed up the Data Integration Hub installation
folder:
<BackupDir>\shared\conf\security\
1. Import the runpublicationsubscription Data Integration Hub web service workflow to PowerCenter.
2. Update your client application to use the new web service.
• Data Integration Hub system fields are mapped in the field mapping page of the subscription wizard.
• Subscription delivery strategy is to insert new rows and update rows that exist in the target.
You can view the list of invalidated subscriptions in the following log file: <Data Integration Hub
installation directory>/logs/B2B_DX_InstallLog.txt.
The following sample shows the list of invalidated subscriptions in the log file:
The invalidated subscriptions with delivery option 'Insert new rows and update changed rows'
and system fields mapped are: [employeedata, coursedescriptions, departmentdetails]
For example, start the services after you install Data Integration Hub, or stop the services before you upgrade
Data Integration Hub.
The installer creates shortcuts in the Start menu to start and stop all Data Integration Hub services.
Starting and Stopping Data Integration Hub from the Start Menu
On Windows operating systems, you can use the Start menu to start and stop all Data Integration Hub
services. You cannot start or stop a single service from the Start menu.
• Start Services. Starts all Data Integration Hub services.
• Stop Services. Stops all Data Integration Hub services.
• Operation Console. Opens the Operation Console in a Web browser.
Note: If you installed the Data Integration Hub Hadoop Service component on a different machine than the
machine where you installed Data Integration Hub, to stop or to start the Hadoop service on the machine
where the service is installed, run the startup.sh or shutdown.sh script from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/bin/
• The Data Integration Hub components send information through ports. You can change the default port
numbers based on the requirements of your network environment.
• When different components process information or encounter errors, log files contain information that
you can use to analyze and troubleshoot the installed components. You can change the location of the log
files or define custom logs.
• To increase performance and reliability, you can change the maximum memory allocation for the
embedded Data Integration Hub server broker, or the embedded Data Integration Hub console broker.
• If you change the database user credentials for the Data Integration Hub repositories or for the
operational data store, you must update the Data Integration Hub configuration files. If you are running
the Dashboard and Reports component, you must also update the relevant PowerCenter connections.
• If you use a Hadoop-based publication repository and you change the credentials for the user account of
the Data Integration Hub Hadoop Service, you must update the Data Integration Hub configuration files.
• If you use the Dashboard and Reports component, and the IP addresses of the machine that hosts Data
Integration Hub change any time after the installation, you must update the IP addresses in the dashboard
configuration file.
• During the Data Integration Hub installation or upgrade, you define a PowerCenter Integration Service that
Data Integration Hub uses to run workflows. If required, you can configure a different PowerCenter
Integration Service to access Data Integration Hub.
• If you use the Dashboard and Reports component, your Data Integration Hub and operational data store
repositories are installed on Microsoft SQL Servers, and you use PowerCenter version 10, configure the
repository connections in PowerCenter Workflow Manager.
1. On the machine where Data Integration Hub is installed, open the server and console copies of the dx-configuration.properties files in a text editor from the following locations:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
<DIHInstallationDir>/conf/
2. Enter the port number in the following property:
dx.rmi.port=
3. Save the dx-configuration.properties files.
4. In the Administrator tool, select the PowerCenter Integration Service that runs Data Integration Hub
transformations.
5. On the Processes tab of the PowerCenter Integration Service, add or edit the DX_SERVER_URL
environment variable and set the URL of the Data Integration Hub server in the following format:
rmi://<HostName>:<PortNumber>
6. Save the changes and restart the Data Integration Hub services.
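Steps 1 through 3 can be scripted. The following is a minimal sketch, assuming a POSIX shell on Linux; DIH_HOME and the port value 18095 are placeholders for your own installation directory and chosen port.

```shell
#!/bin/sh
# Sketch: set dx.rmi.port to the same value in both copies of
# dx-configuration.properties (both copies must stay identical).
DIH_HOME=/opt/DIH          # assumption: your installation directory
NEW_PORT=18095             # assumption: the port you chose
for f in "$DIH_HOME/DataIntegrationHub/tomcat/shared/classes/dx-configuration.properties" \
         "$DIH_HOME/conf/dx-configuration.properties"; do
  sed -i "s/^dx\.rmi\.port=.*/dx.rmi.port=$NEW_PORT/" "$f"
done
# The DX_SERVER_URL environment variable on the PowerCenter Integration
# Service must then use the same port, for example rmi://<HostName>:18095
```

After the edit, remember that steps 4 through 6 above still apply: the DX_SERVER_URL environment variable must match the new port, and the services must be restarted.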
Logs
The Data Integration Hub log files include information that you can use to analyze activity and troubleshoot.
• Debug logs
• RMI server logs
• Database debug logs
• Import logs
To send log messages to a different log file destination, you can create an SNMP appender to redirect the
logs to a custom destination.
Default Log Files
Data Integration Hub creates log files that record diagnostic information regarding system and user
operations. The installer also creates log files that record installation selections and configuration.
You can configure log settings in the log4j.xml file located in the Data Integration Hub configuration
directory.
When you use the Data Integration Hub Hadoop Service, the log files are located in the following
directory:
<DIHInstallationDir>/DataIntegrationHub/tomcat/logs
1. Add an SNMP appender to the log4j properties file and set the logging level. Change the sample SNMP
appender in the log4j.xml file to the appender that you want to use. You can add multiple appenders to
the log4j.xml file that send different types of log messages to different SNMP outputs.
2. Configure an SNMP manager to listen for messages. For information about configuring the SNMP
manager to handle log4j messages, see the documentation for your SNMP network management
software.
For general information about the log4j utility, see the Apache Web site:
http://logging.apache.org/log4j/1.2/manual.html
The following table describes the SNMP parameters that you can define for Data Integration Hub:

Parameter                    Description
LocalIPAddress               IP address of the local SNMP embedded agent. You do not normally need to modify this value. Default is 127.0.0.1.
ApplicationTrapOID           Identifier of the application object that sends the trap messages. You can set the value of this parameter to the name of the application object in Data Integration Hub. Default is 1.3.6.1.2.1.2.0.0.0.0.
EnterpriseOID                Identifier of the organization object sending the trap message. You can set this parameter to any value that identifies the message in Data Integration Hub. Default is 1.3.6.1.2.1.2.0.
ForwardStackTraceWithTrap    Determines whether to include the stack trace in the log message. Default is False.
SysUpTime                    Amount of time that the application is running. Set the value to 0 to calculate the system up time when a message is sent. Default is 0.
1. Back up the log4j.xml file that you want to edit from one of the following locations:
• Data Integration Hub server: <DIHInstallationDir>/conf
• Operation Console: <DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes
2. Open the file in a text editor and search for the following text:
SNMP_TRAP is a sample appender
3. To edit the sample appender with the actual values of your appender, remove the comment indicators
from the SNMP_TRAP appender and edit the appender parameters and values based on your
requirements.
Note: You can also add an appender below the sample appender instead of editing the sample appender.
4. To set the formatting of the log messages, edit the layout element.
The following example shows the layout element of the sample appender:
<layout class="org.apache.log4j.PatternLayout">
<param name="ConversionPattern" value="%d{ISO8601} %-5p [%c] {%t} %m%n"/>
</layout>
For information about the layout pattern options, see the description on the Apache Website:
http://logging.apache.org/log4j/1.2/apidocs/org/apache/log4j/PatternLayout.html
5. To activate the appender, search for the following text:
<root>
6. Add the appender name to the appender list.
The following example shows the appender list after you add the appender name:
<root>
<priority value="INFO"/>
<appender-ref ref="BROKER-LOG"/>
<appender-ref ref="CONSOLE"/>
<appender-ref ref="SNMP_TRAP"/>
</root>
7. Save the log4j.xml file.
After you add the SNMP appender, configure your SNMP manager to listen for the log messages.
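For orientation, the following sketch shows what an enabled SNMP_TRAP appender might look like after steps 3 through 6, using the parameters described in the table above. The appender class name shown here is an assumption; keep the class attribute of the sample appender that ships in your log4j.xml, and substitute your own parameter values.

```xml
<!-- Sketch of an uncommented SNMP_TRAP appender. The class name is an
     assumption; use the class from the sample appender in your log4j.xml. -->
<appender name="SNMP_TRAP" class="org.apache.log4j.ext.SNMPTrapAppender">
  <param name="LocalIPAddress" value="127.0.0.1"/>
  <param name="EnterpriseOID" value="1.3.6.1.2.1.2.0"/>
  <param name="ApplicationTrapOID" value="1.3.6.1.2.1.2.0.0.0.0"/>
  <param name="ForwardStackTraceWithTrap" value="false"/>
  <param name="SysUpTime" value="0"/>
  <layout class="org.apache.log4j.PatternLayout">
    <param name="ConversionPattern" value="%d{ISO8601} %-5p [%c] {%t} %m%n"/>
  </layout>
</appender>
```

The appender becomes active only after you also add its name to the appender list in the root element, as shown in step 6.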
Change the maximum heap size in MB in the DX_SERVER_OPTS property. The default maximum heap size is
1024 MB.
Change the maximum heap size in MB in the CATALINA_OPTS property. The default minimum heap size is
128 MB and the default maximum heap size is 2048 MB.
Enter the maximum heap size in MB with integers and without letters. The default maximum heap size is
2048 MB.
Perform the following steps if you change the credentials for a database user account after you install Data
Integration Hub. Perform only the steps that are relevant to the changes that you are making. If you are not
running the Dashboard and Reports component, skip the steps that are only relevant to this component.
1. Stop the Data Integration Hub services and close the Operation Console.
2. Verify that the PowerCenter Integration Service is not running any Data Integration Hub workflows.
3. If you are running the Dashboard and Reports component, and you are changing credentials for the Data
Integration Hub repository or for the operational data store user account, use the PowerCenter Workflow
Manager to update the credentials in the following connections:
• For the Data Integration Hub repository, update the DX_REPO connection.
• For the operational data store, update the DX_ODS connection.
4. If you are changing a password, perform the following steps:
a. Run the password encryption utility and enter the new password in the following syntax:
• On Windows operating systems: <DIHInstallationDir>\dx-tools\dxpasswd.bat -p
<NewPassword>
• On UNIX operating systems: <DIHInstallationDir>/dx-tools/dxpasswd.sh -p <NewPassword>
The password encryption utility encrypts the password and displays an encrypted string. For
example, -->ywo+o3cw8+O3iLdlhPprW2YA==<--.
b. Copy the encrypted string without the --><-- indicators to the clipboard.
5. Open both copies of the dx-configuration.properties file from the following locations in a text editor:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
<DIHInstallationDir>/conf/
6. In both copies of the dx-configuration.properties file, perform the following steps:
a. Search for the text that is relevant to the changes that you are making:
• Data Integration Hub repository:
dx.jdbc.username=<CurrentUsername>
dx.jdbc.password=<CurrentPassword>
• Publication repository:
dih.staging.jdbc.username=<CurrentUsername>
dih.staging.jdbc.password=<CurrentPassword>
• Operational data store:
dx.dashboard.jdbc.username=<CurrentUsername>
dx.dashboard.jdbc.password=<CurrentPassword>
b. Replace the relevant value with the new value. If you are replacing a password, enter the encrypted
string.
c. Save and close the files.
Note: The content in both copies of the dx-configuration.properties file must be identical.
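Steps 4 through 6 can be sketched in a shell, assuming DIH_HOME stands for the installation directory and ENCRYPTED holds the string that dxpasswd printed for you (the value below is only the example from step 4a):

```shell
#!/bin/sh
# Sketch: put the encrypted string produced by dxpasswd into both copies
# of dx-configuration.properties. DIH_HOME and the example string are
# assumptions; use the string that the utility printed for you.
DIH_HOME=/opt/DIH
ENCRYPTED='ywo+o3cw8+O3iLdlhPprW2YA=='   # copied without the --><-- indicators
for f in "$DIH_HOME/DataIntegrationHub/tomcat/shared/classes/dx-configuration.properties" \
         "$DIH_HOME/conf/dx-configuration.properties"; do
  sed -i "s|^dx\.jdbc\.password=.*|dx.jdbc.password=$ENCRYPTED|" "$f"
done
# Verify that the two copies are still identical:
diff "$DIH_HOME/DataIntegrationHub/tomcat/shared/classes/dx-configuration.properties" \
     "$DIH_HOME/conf/dx-configuration.properties"
```

Replace dx.jdbc.password with dih.staging.jdbc.password or dx.dashboard.jdbc.password as relevant to the credentials that you are changing.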
Perform the following steps if you change the credentials of the Data Integration Hub Hadoop Service at a
later date.
1. Stop the Data Integration Hub services and close the Operation Console. For more information, see “Starting and Stopping Data Integration Hub on Linux” on page 111.
2. Verify that the PowerCenter Integration Service is not running any Data Integration Hub workflows.
3. If you are changing a password, perform the following steps:
a. Run the password encryption utility and enter the new password in the following syntax:
<DIHInstallationDir>/dx-tools/dxpasswd.sh -p <NewPassword>
The password encryption utility encrypts the password and displays an encrypted string. For
example, -->ywo+o3cw8+O3iLdlhPprW2YA==<--.
b. Copy the encrypted string without the --><-- indicators to the clipboard.
4. Open both copies of the dx-configuration.properties file in a text editor from the following locations
on the machine where Data Integration Hub is installed:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
<DIHInstallationDir>/conf/
5. If the Data Integration Hub Hadoop Service component is installed on a different machine than the
machine where Data Integration Hub is installed, on the machine where the Data Integration Hub Hadoop
Service is installed, open the dx-configuration.properties file in a text editor from the following
location:
<DIHInstallationDir>/conf/
6. In all the copies of the dx-configuration.properties file, perform the following steps:
a. Search for the following text:
internal.service.username=
internal.service.password=
b. Replace the relevant value with the new value. If you are replacing a password, enter the encrypted
string.
c. Save and close the files.
Note: The content in all copies of the dx-configuration.properties file must be identical.
7. Start the Data Integration Hub Operation Console.
8. Verify that the PowerCenter Integration Service is running.
9. In the Navigator, click Hub Management > Connections.
The Connections page appears.
10. Click the Test Connection icon next to the DIH__STAGING__HADOOP connection.
Data Integration Hub tests the connection. The process might take a few moments.
11. When the message "Connection successful" appears, click OK in the Test Connection dialog box.
12. Start the Data Integration Hub Server services. For more information, see “Starting and Stopping Data Integration Hub on Linux” on page 111.
In the Java classpath for the PowerCenter Integration Service, add the path to the Data Integration Hub class
files.
1. Log in to the Administrator tool and select the PowerCenter Integration Service that runs the workflows
for Data Integration Hub.
2. On the Processes tab, edit the Java SDK ClassPath property and add the location of the Data Integration
Hub Java classes at the beginning of the ClassPath property:
<Data Integration Hub installation directory>/powercenter/lib/dx-client-powercenter-10.5.0.jar;
<Data Integration Hub installation directory>/powercenter/lib/commons-logging-1.1.3.jar;
<Data Integration Hub installation directory>/powercenter/lib/log4j-1.2.17.jar;
<Data Integration Hub installation directory>/powercenter/lib/activemq-all-5.15.9.jar
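As a sanity check, the prefix for the Java SDK ClassPath property can be assembled in a shell before pasting it in. A sketch, assuming a UNIX node (the guide shows ';' separators, which apply on Windows; ':' is the UNIX separator) and a hypothetical /opt/DIH installation directory:

```shell
#!/bin/sh
# Sketch: assemble the four Data Integration Hub client JARs into a
# classpath prefix. /opt/DIH is a placeholder for your installation directory.
DIH_HOME=/opt/DIH
LIB="$DIH_HOME/powercenter/lib"
CP="$LIB/dx-client-powercenter-10.5.0.jar"
CP="$CP:$LIB/commons-logging-1.1.3.jar"
CP="$CP:$LIB/log4j-1.2.17.jar"
CP="$CP:$LIB/activemq-all-5.15.9.jar"
echo "$CP"
```

Confirm that each of the four JAR files actually exists at the path you use before saving the property.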
If the installation log file lists invalidated subscriptions, manually run them. You can find the installation
log file in the following directory: <Data Integration Hub installation directory>/logs
During the upgrade, Data Integration Hub invalidates the automatic relational database subscriptions that have the following configurations:
• Data Integration Hub system fields are mapped in the subscription field mapping.
• Subscription delivery strategy is to insert new rows and update rows that exist in the target.
For more information about running the subscriptions manually, see the Data Integration Hub Operator Guide.
3. Add environment variables to the Data Integration Hub console and server integration services.
DX_CONSOLE_URL rmi://<HostName>:<dx.tpm.rmi.port>
DX_SERVER_URL rmi://<HostName>:<dx.rmi.port>
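The two URL values can be derived from the properties files configured earlier. A sketch, assuming DIH_HOME stands for the installation directory and that dx.rmi.port and dx.tpm.rmi.port are set in <DIHInstallationDir>/conf/dx-configuration.properties:

```shell
#!/bin/sh
# Sketch: print the environment variable values to set on the console and
# server integration services, reading the ports from dx-configuration.properties.
DIH_HOME=/opt/DIH                     # assumption: your installation directory
PROPS="$DIH_HOME/conf/dx-configuration.properties"
HOST=$(hostname)
SERVER_PORT=$(sed -n 's/^dx\.rmi\.port=//p' "$PROPS")
CONSOLE_PORT=$(sed -n 's/^dx\.tpm\.rmi\.port=//p' "$PROPS")
echo "DX_SERVER_URL=rmi://$HOST:$SERVER_PORT"
echo "DX_CONSOLE_URL=rmi://$HOST:$CONSOLE_PORT"
```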
1. In the Workflow Manager, access the DX_REPO database connection and open the Connection Object
Definition dialog box.
2. Perform the following actions and then click OK:
a. Select Use DSN.
b. In the Connect String text box enter the connection name. The name is defined in the ODBC Data
Source Administrator interface, in ODBC SQL Server Wire Protocol Setup, in the Data Source Name
field.
3. Repeat steps 1 and 2 for the DX_ODS connection.
• Installing and Configuring Data Integration Hub Accelerator for Data Archive Overview, 123
• Pre-Installation Steps, 123
• Installing the Data Integration Hub Accelerator for Data Archive, 124
• Configuring the Data Archive Source Connection, 125
After you install Data Integration Hub Accelerator for Data Archive, configure the Data Archive source
connection.
Pre-Installation Steps
Before you install the Data Integration Hub accelerator in Data Archive, make sure that your system meets
the minimum requirements and follow the pre-installation steps.
1. Make sure you have an active installation of Data Archive. A limited Data Archive version is provided with
Data Integration Hub.
2. Create and assign privileges to the Data Integration Hub repository users.
3. In the Data Archive installation directory, open the conf.properties file and set the value of the
following properties:
• Set the value of the informia.useDbaViewsInSeamlessAccess property to false.
• Set the value of the informia.proceduresToExecute.inArchiveFromHistory property to java://com.informatica.b2b.dx.ilm.MoveDXDocStoreDatabaseDAOImpl.
Note: Back up the conf.properties file before you modify the property.
4. Restart Data Archive.
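Step 3, including the backup recommended in the note, can be sketched as follows, assuming you run it from the Data Archive installation directory and edit the file in place with sed:

```shell
#!/bin/sh
# Sketch: back up conf.properties, then set the two properties from step 3.
cp conf.properties conf.properties.bak
sed -i \
  -e 's|^informia\.useDbaViewsInSeamlessAccess=.*|informia.useDbaViewsInSeamlessAccess=false|' \
  -e 's|^informia\.proceduresToExecute\.inArchiveFromHistory=.*|informia.proceduresToExecute.inArchiveFromHistory=java://com.informatica.b2b.dx.ilm.MoveDXDocStoreDatabaseDAOImpl|' \
  conf.properties
```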
Before you install the accelerator, install Data Archive and follow the pre-installation steps.
The minimum supported version of Data Archive is 6.1. If you have Data Archive version 6.1 installed, also
install EBF 11801 and EBF 11672.
1. Log in to Data Archive with administrator privileges and click Accelerators > Enterprise Data Manager.
The Enterprise Data Manager appears.
2. In the Enterprise Data Manager, click File > Import > Accelerator.
The Import Metadata Options window appears.
3. Select Continue Import through EDM and click OK.
4. Navigate to the location <DIHInstallationDir>/ILM-accelerator and select to import all the XML files
in the directory.
Note: Do not import the sub-folders in the directory.
The Enterprise Data Manager displays a progress window during the import process.
5. To verify the import process, restart the Enterprise Data Manager and make sure that the accelerator
appears in the Data Integration Hub node of the Explorer pane.
6. To add the accelerator to drop-down lists in Data Archive, log in to the database with the Data
Integration Hub repository credentials and run the SQL script on the Data Archive repository. The script
is located in one of the following locations:
Database Path
Oracle <DIHInstallationDir>/ILM-accelerator/sql/oracle_ilm_repository_update.sql
Configuring the Data Archive Source Connection
Configure the Data Archive source connection before you purge events from the database.
For information about creating source connections, see the Data Archive Administrator Guide.
1. Log in to Data Archive.
2. Click Administration > New Source Connection.
3. Enter the database-specific connection properties.
4. Set the application version of the connection to Data Integration Hub 10.5.0.
5. Set the property Source / Staging Attachment Location to the root path for the production document
store.
The path is the same as the path defined for the Data Integration Hub system property
dx.system.document.store.folder.
6. If the connection is to an SQL Server database, select the option Compile ILM Functions.
Note: The location specified in the Source / Staging Attachment Location property must be accessible to the
Data Archive purge process.
If you install the Data Integration Hub Hadoop Service, troubleshoot the installation of the service.
When I test the DIH__STAGING__HADOOP connection, the test succeeds, but when I try to run a
publication, the run fails.
The following example shows a sample error message:
Wrong FS: hdfs://10.40.40.96:8020/user/infa_user/TEST_FILTER_87003, expected: hdfs://
tavdxcdh53n1:8020
In the Data Integration Hub Operation Console, in the Connections page, edit the DIH__STAGING__HADOOP
connection so that NameNode URI is identical to the setting of the property fs.default.name or the property
fs.defaultFS in the core-site.xml file.
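To see the URI that the cluster expects, you can read it straight from core-site.xml. A sketch, assuming the common client-configuration path /etc/hadoop/conf; adjust the path for your cluster:

```shell
#!/bin/sh
# Sketch: print the NameNode URI (fs.defaultFS, or fs.default.name on
# older clusters) that the DIH__STAGING__HADOOP connection must match.
# The path to core-site.xml is an assumption; adjust for your cluster.
grep -A1 -E 'fs\.defaultFS|fs\.default\.name' /etc/hadoop/conf/core-site.xml
```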
When I test the DIH__STAGING__HADOOP connection the test fails with the following error: Data
Access Connection test failed with the following error:DXServerException: Cannot establish
connection to Apache Spark.
1. Check the Data Integration Hub Hadoop Service log file for additional details. For more information, see
“Default Log Files” on page 114.
2. Verify that Apache Spark is running.
3. On the machine where the Data Integration Hub Hadoop Service is installed, open the dx-configuration.properties file in a text editor from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
• Verify that the value of dih.hadoop.service.spark.url is correct. For more information, see
“Configuring the Environment for a Hadoop Publication Repository” on page 72.
• Verify that the value of dih.hadoop.service.spark.version is correct.
When I test the DIH__STAGING__HADOOP connection the test fails with the following error: Data
Access Connection test failed with the following error:DXServerException: Exception : org/
apache/hadoop/fs/FileSystem.
The definition of the classpath of the Data Integration Hub Hadoop Service is incorrect.
1. On the machine where the Data Integration Hub Hadoop Service is installed, open the dih-hadoop-service.xml file in a text editor from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/conf/Catalina/localhost
2. Configure the correct locations of the JAR files.
When I test the DIH__STAGING__HADOOP connection the test fails with the following error: Data
Access Connection test failed with the following error:DXServerException: Cannot establish
connection to Apache Hive.
1. Check the Data Integration Hub Hadoop Service log file for additional details. For more information, see
“Default Log Files” on page 114.
2. Verify that Apache Hive is running.
3. On the machine where the Data Integration Hub Hadoop Service is installed, open the dx-configuration.properties file in a text editor from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
Verify that the value of dih.hadoop.service.hive.url is correct. For more information, see “Configuring the
Environment for a Hadoop Publication Repository” on page 72.
4. If the message java.lang.ClassNotFoundException: org.apache.hive.jdbc.HiveDriver appears in the log
file, or if a similar message appears in the file, this is an indication that the definition of the classpath of
the Data Integration Hub Hadoop Service is incorrect. Perform the following actions:
a. On the machine where the Data Integration Hub Hadoop Service is installed, open the dih-hadoop-service.xml file in a text editor from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/conf/Catalina/localhost
b. Configure the correct locations of the JAR files.
When I test the DIH__STAGING__HADOOP connection the test fails with the following error: Data
Access Connection test failed with the following error:DXServerException: The Hadoop file
system is not available.
When I test the DIH__STAGING__HADOOP connection the test fails with the following error: Data
Access Connection test failed with the following error:DXServerException:
ResourceAccessException: The connection is not valid. The Data Integration Hub Hadoop
service is not running.
Log in to the machine where the Data Integration Hub Hadoop Service is installed and run the service. For
more information, see “Starting and Stopping Data Integration Hub on Linux” on page 111.
1. On the machine where the Data Integration Hub server is installed, open the dx-configuration.properties file in a text editor from the following location:
<DIHInstallationDir>/DataIntegrationHub/tomcat/shared/classes/
Verify that the value of dx.server.url is correct.
2. If you installed the Data Integration Hub Hadoop Service on a different machine than the machine where
you installed Data Integration Hub, open the dx-configuration.properties file in a text editor from the
same location on the machine where the Data Integration Hub Hadoop Service is installed and verify that
the value of dx.server.url is correct.
3. On the machine where the Data Integration Hub Hadoop Service is installed, ping the URL that is defined
in dx.server.url and verify that it is accessible.
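Steps 1 and 3 can be sketched as follows; DIH_HOME is an assumption for the installation directory:

```shell
#!/bin/sh
# Sketch: read dx.server.url from dx-configuration.properties, extract the
# host part of the rmi:// URL, and check that the host is reachable.
DIH_HOME=/opt/DIH
PROPS="$DIH_HOME/DataIntegrationHub/tomcat/shared/classes/dx-configuration.properties"
URL=$(sed -n 's/^dx\.server\.url=//p' "$PROPS")
HOST=$(printf '%s\n' "$URL" | sed -n 's|.*//\([^:/]*\).*|\1|p')
echo "dx.server.url host: $HOST"
ping -c 3 "$HOST"
```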
The running of publications and subscriptions fails. The Data Integration Hub Hadoop Service log
shows that the service repeatedly tries to access localhost:8020 and fails each time.
In Cloudera Manager, enable the option Bind NameNode to Wildcard Address and then restart the HDFS
service.
The installer creates an installation log file during and after the installation. You can find the installation log
file in the following directory:
<DIHInstallationDir>/logs
Uninstallation
This chapter includes the following topics:
Uninstallation Overview
Uninstall Data Integration Hub to remove the core application and additional components that you installed
on the machine.
The uninstallation process does not delete the repositories or the Data Integration Hub document store.
The uninstallation process depends on the operating system on which Data Integration Hub is installed,
Windows or UNIX.
Uninstalling Data Integration Hub from UNIX
Operating Systems
1. Stop all Data Integration Hub services.
2. Run the uninstaller from the Data Integration Hub installation directory.
The Uninstall Data Integration Hub section appears.
3. Click Next.
The Pre-Uninstall Summary section appears.
4. Click Uninstall.
The uninstaller displays the progress of the uninstallation process. When the uninstallation process
ends, the Uninstall Complete section appears.
5. Click Done to exit the uninstaller.
Index

A
after you upgrade
  description 105
  reapply configuration modifications 107
  tasks 105

C
configuration
  Java heap size 117
  SNMP logs 114
connections
  repositories 75, 77
credentials
  changing for Data Integration Hub Hadoop Service 119
  changing for repository user account 118

D
Dashboard and reports
  importing operational data store event loader workflow 79
data archive
  configure source connection 125
Data Archive
  installing the Data Integration Hub accelerator 123, 124
Data Integration Hub Hadoop Service
  changing the credentials 119
document store
  setting up 22

E
event purging
  Data Archive 123, 124

H
Hadoop
  configuration files 72

I
Installation
  additional components 10
  components 9
  uninstalling from UNIX 131
  uninstalling from Windows 130
installer requirements

M
minimum system requirements
  installer 16

O
operating system
  minimum system requirements 16
Operation Console
  logging in 75

P
partitions
  description 71
port numbers
  default 14
post-installation
  description 68
  Hadoop publication repository 72
  PowerCenter Integration Service 121
  registering PowerCenter server plug-in 69
  set up database partitions 71
  tasks 68
PowerCenter
  connections 75, 77
prerequisite
  Microsoft SQL Database 22
  pmrep 19
  software 18

R
repositories
  connections 75, 77
repository user account
  changing the credentials 118
requirements
  database 17
RMI port number
  modifying 113

S
services
  starting and stopping on Windows 110, 111
  starting on Linux 111
SNMP appender
  add to file 116
  parameters 115
SNMP logs
  configuration 114
source connection
  configure 125
system requirements
  user accounts 12

U
user accounts
  installation 12