DS 1010 GettingStartedGuide en
10.1.0
This software and documentation contain proprietary information of Informatica LLC and are provided under a license agreement containing restrictions on use and
disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any
form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC. This Software may be protected by U.S. and/or
international Patents and other Patents Pending.
Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as
provided in DFARS 227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14 (ALT III),
as applicable.
The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to
us in writing.
Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange,
PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange, Informatica
On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging,
Informatica Master Data Management, and Live Data Map are trademarks or registered trademarks of Informatica LLC in the United States and in jurisdictions
throughout the world. All other company and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights
reserved. Copyright © Sun Microsystems. All rights reserved. Copyright © RSA Security Inc. All Rights Reserved. Copyright © Ordinal Technology Corp. All rights
reserved. Copyright © Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright © Meta
Integration Technology, Inc. All rights reserved. Copyright © Intalio. All rights reserved. Copyright © Oracle. All rights reserved. Copyright © Adobe Systems Incorporated.
All rights reserved. Copyright © DataArt, Inc. All rights reserved. Copyright © ComponentSource. All rights reserved. Copyright © Microsoft Corporation. All rights
reserved. Copyright © Rogue Wave Software, Inc. All rights reserved. Copyright © Teradata Corporation. All rights reserved. Copyright © Yahoo! Inc. All rights reserved.
Copyright © Glyph & Cog, LLC. All rights reserved. Copyright © Thinkmap, Inc. All rights reserved. Copyright © Clearpace Software Limited. All rights reserved. Copyright
© Information Builders, Inc. All rights reserved. Copyright © OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved. Copyright Cleo
Communications, Inc. All rights reserved. Copyright © International Organization for Standardization 1986. All rights reserved. Copyright © ej-technologies GmbH. All
rights reserved. Copyright © Jaspersoft Corporation. All rights reserved. Copyright © International Business Machines Corporation. All rights reserved. Copyright ©
yWorks GmbH. All rights reserved. Copyright © Lucent Technologies. All rights reserved. Copyright © University of Toronto. All rights reserved. Copyright © Daniel
Veillard. All rights reserved. Copyright © Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright © MicroQuill Software Publishing, Inc. All rights reserved.
Copyright © PassMark Software Pty Ltd. All rights reserved. Copyright © LogiXML, Inc. All rights reserved. Copyright © 2003-2010 Lorenzi Davide, All rights reserved.
Copyright © Red Hat, Inc. All rights reserved. Copyright © The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright © EMC
Corporation. All rights reserved. Copyright © Flexera Software. All rights reserved. Copyright © Jinfonet Software. All rights reserved. Copyright © Apple Inc. All rights
reserved. Copyright © Telerik Inc. All rights reserved. Copyright © BEA Systems. All rights reserved. Copyright © PDFlib GmbH. All rights reserved. Copyright ©
Orientation in Objects GmbH. All rights reserved. Copyright © Tanuki Software, Ltd. All rights reserved. Copyright © Ricebridge. All rights reserved. Copyright © Sencha,
Inc. All rights reserved. Copyright © Scalable Systems, Inc. All rights reserved. Copyright © jQWidgets. All rights reserved. Copyright © Tableau Software, Inc. All rights
reserved. Copyright© MaxMind, Inc. All Rights Reserved. Copyright © TMate Software s.r.o. All rights reserved. Copyright © MapR Technologies Inc. All rights reserved.
Copyright © Amazon Corporate LLC. All rights reserved. Copyright © Highsoft. All rights reserved. Copyright © Python Software Foundation. All rights reserved.
Copyright © BeOpen.com. All rights reserved. Copyright © CNRI. All rights reserved.
This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various
versions of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or
agreed to in writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express
or implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software
copyright © 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License
Agreement, which may be found at http://www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any
kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose.
The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California,
Irvine, and Vanderbilt University, Copyright (©) 1993-2006, all rights reserved.
This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and
redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html.
This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg, <[email protected]>. All Rights Reserved. Permissions and limitations regarding this
software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or
without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
The product includes software copyright 2001-2005 (©) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.dom4j.org/license.html.
The product includes software copyright © 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to
terms available at http://dojotoolkit.org/license.
This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations
regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html.
This product includes software copyright © 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at
http://www.gnu.org/software/kawa/Software-License.html.
This product includes OSSP UUID software which is Copyright © 2002 Ralf S. Engelschall, Copyright © 2002 The OSSP Project Copyright © 2002 Cable & Wireless
Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php.
This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software
are subject to terms available at http://www.boost.org/LICENSE_1_0.txt.
This product includes software copyright © 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at
http://www.pcre.org/license.txt.
This product includes software copyright © 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php.
This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://www.stlport.org/doc/license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://httpunit.sourceforge.net/doc/license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/CreateJS/EaselJS/blob/master/src/easeljs/display/Bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE; http://jdbc.postgresql.org/license.html; http://protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/LICENSE; http://web.mit.edu/Kerberos/krb5-current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/LICENSE; https://github.com/hjiang/jsonxx/blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/LICENSE; http://one-jar.sourceforge.net/index.php?page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html; https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt; http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html; https://aws.amazon.com/asl/; https://github.com/twbs/bootstrap/blob/master/LICENSE; https://sourceforge.net/p/xmlunit/code/HEAD/tree/trunk/LICENSE.txt; https://github.com/documentcloud/underscore-contrib/blob/master/LICENSE, and https://github.com/apache/hbase/blob/master/LICENSE.txt.
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php), the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the new BSD License (http://opensource.org/licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0), and the Initial Developer's Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).
This product includes software copyright © 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this
software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab.
For further information please visit http://www.extreme.indiana.edu/.
This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject
to terms of the MIT license.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation
is subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT INFORMED OF THE POSSIBILITIES
OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT LIMITATION, BREACH OF CONTRACT, BREACH
OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
Adding the Domain and Repository. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Connecting to the Model Repository. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Step 3. Create a Project. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Step 4. Create a Folder. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Step 5. Select the Default Data Integration Service. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Informatica Developer Tips. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Step 1. Export a Mapping to PowerCenter. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Informatica Developer Tips. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Appendix A: Glossary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
Preface
The Informatica Data Services Getting Started Guide is written for data services developers. It provides a
tutorial to help first-time users learn how to use Informatica Developer for data services tasks. This guide
assumes that you have an understanding of flat file concepts, relational database concepts, web services
concepts, and the database engines in your environment.
Informatica Resources
Informatica Network
Informatica Network hosts Informatica Global Customer Support, the Informatica Knowledge Base, and other
product resources. To access Informatica Network, visit https://network.informatica.com.
To access the Knowledge Base, visit https://kb.informatica.com. If you have questions, comments, or ideas
about the Knowledge Base, contact the Informatica Knowledge Base team at
[email protected].
Informatica Documentation
To get the latest documentation for your product, browse the Informatica Knowledge Base at
https://kb.informatica.com/_layouts/ProductDocumentation/Page/ProductDocumentSearch.aspx.
If you have questions, comments, or ideas about this documentation, contact the Informatica Documentation
team through email at [email protected].
Informatica Product Availability Matrixes
Product Availability Matrixes (PAMs) indicate the versions of operating systems, databases, and other types
of data sources and targets that a product release supports. If you are an Informatica Network member, you
can access PAMs at
https://network.informatica.com/community/informatica-network/product-availability-matrices.
Informatica Velocity
Informatica Velocity is a collection of tips and best practices developed by Informatica Professional
Services. Developed from the real-world experience of hundreds of data management projects, Informatica
Velocity represents the collective knowledge of our consultants who have worked with organizations from
around the world to plan, develop, deploy, and maintain successful data management solutions.
If you are an Informatica Network member, you can access Informatica Velocity resources at
http://velocity.informatica.com.
If you have questions, comments, or ideas about Informatica Velocity, contact Informatica Professional
Services at [email protected].
Informatica Marketplace
The Informatica Marketplace is a forum where you can find solutions that augment, extend, or enhance your
Informatica implementations. By leveraging any of the hundreds of solutions from Informatica developers
and partners, you can improve your productivity and speed up time to implementation on your projects. You
can access Informatica Marketplace at https://marketplace.informatica.com.
Informatica Global Customer Support
To find your local Informatica Global Customer Support telephone number, visit the Informatica website at
the following link:
http://www.informatica.com/us/services-and-training/support-services/global-support-centers.
If you are an Informatica Network member, you can use Online Support at http://network.informatica.com.
Chapter 1
Business analysts and developers can collaborate to determine the business logic used to transform data
and make it available in a virtual database. Developers can also evaluate the quality of data, create a model
of the data to establish uniformity within the enterprise, and transform the data based on business needs
before exposing the data through a virtual database or a web service.
You can use Informatica Data Services to complete the following projects:
• Create a prototype of a data warehouse. Before you create a data warehouse, you can use Informatica
Data Services to create a virtual data warehouse. You can verify the data requirements and quickly
validate the prototype without the time-consuming task of implementing a physical data warehouse. After
you validate the prototype, you can reuse the same logic to create the physical data warehouse in
PowerCenter.
• Create a prototype of changes to a data warehouse. When a data warehouse requires changes, you can
create a prototype. You can validate the changes before you make them to a physical data warehouse.
• Create a virtual database that allows you to make more data available quickly to enable accurate
reporting.
Data Services Tasks
You can use Informatica Data Services to perform different tasks based on organization requirements.
Business analysts and developers can collaborate on the development of mapping specifications. Business
analysts can use Informatica Analyst (the Analyst tool) to create mapping specifications. After the mapping
specifications are complete, developers can use Informatica Developer (the Developer tool) to export them
as virtual tables that analysts and developers can run SQL queries against. Developers can also use the
Developer tool to export a mapping specification to a PowerCenter mapping to physically transform and
move the source data to a target.
Example
A business analyst and a developer want to collaborate on the development of a virtual table that combines
data from two data sources. The business analyst creates a mapping specification based on the two data
sources in the Analyst tool. The developer views and edits the business logic in the mapping specification in
the Developer tool. The business analyst exports the mapping specification logic as a virtual table for end
users to run SQL queries against.
Developers can use the Developer tool to create column profiles at different stages in the data integration
process. Developers can run a column profile at each stage where they want to analyze the data. They can
also create a scorecard on a column profile to periodically review data quality. A scorecard is a graphical
representation of the quality measurements in a profile.
Example
A developer wants to combine customer address information from relational tables in two data sources.
Before they combine the data, they run a column profile on the State columns in both tables to verify that the
tables use the same values for states.
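A column profile of this kind can be approximated with ordinary SQL. The following sketch shows the checks that a profile automates for a State column; the table and column names are illustrative assumptions, not the tutorial data:
-- Value frequencies for the State column (assumed names)
SELECT State, COUNT(*) AS value_count
FROM Boston_Customers
GROUP BY State
ORDER BY value_count DESC;
-- Null count and distinct-value count for the same column
SELECT COUNT(*) - COUNT(State) AS null_count,
       COUNT(DISTINCT State) AS distinct_count
FROM Boston_Customers;
Running the same queries against the second source and comparing the value lists shows whether both tables use the same state values.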
A logical data object model contains logical data objects and defines relationships between them. Logical
data objects represent business entities such as customers, orders, accounts, and products. Each logical
data object maps to data in underlying data sources.
Developers can expose a logical data object as a virtual table to allow business analysts to build reports
based on the virtual table instead of building reports for each data source. The reports can aggregate
information from multiple data sources. Developers can also make a logical data object the source of a web
service operation to allow end users to access the logical data object over the Web.
Developers can export the logic used to build a logical data object to a PowerCenter mapplet to physically
move the data from the data source to a target.
Example
A developer works for an organization that acquires another organization. To create a standardized model of
customer information, employee information, and order information between the two organizations, the
developer creates a logical data object model. The logical data object model contains logical data objects for
customers, employees, and orders. The developer runs column profiles on the logical data objects to verify
that the data is consistent across both companies. The developer exposes each logical data object as a
virtual table. A business analyst builds business intelligence reports based on the customers, employees, and
orders virtual tables.
To expose data as a virtual database, a developer can use the Developer tool to create an SQL data service.
An SQL data service is a virtual database that end users can query using third-party client tools, like business
intelligence tools. A developer can build an SQL data service based on one or more physical data sources,
logical data objects, or both. A developer builds an SQL data service based on a logical data object when they
want to pull data from multiple, heterogeneous data sources, such as relational databases and flat files, or
from data sources that reside in different locations. They can export a mapping specification to a virtual
table. They can also run column profiles on a virtual table in an SQL data service.
Example
A developer wants to add data to business intelligence reports, but some of the data does not exist in the
data warehouse. Before they create the logic to physically load the new data into the data warehouse, they
use an SQL data service to create a prototype of the changes. They create an SQL data service to combine
the additional data with the data in the data warehouse. They build a business intelligence report based on
the virtual table in the SQL data service.
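Because an SQL data service looks like a relational database to client tools, the report in this example can be driven by a standard SQL query. The following is a minimal sketch; the virtual schema, table, and column names are assumptions for illustration only:
SELECT Region, SUM(OrderAmount) AS total_orders
FROM Warehouse_Schema.Orders_Virtual
GROUP BY Region;
The business intelligence tool typically issues queries like this through an ODBC or JDBC connection to the Data Integration Service, which resolves them against the underlying physical sources.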
An external application or a Web Service Consumer transformation can connect to a web service as a web
service client.
You create a logical data object model to prototype the modified data warehouse. You build an SQL data
service on the logical data object model to enable a business user to query the data in the logical data object
model. The business user runs queries to test the changes and provides feedback. After incorporating
feedback from the business user, you use the logic defined for the logical data object model to generate a
PowerCenter mapping that integrates the flat file data into the data warehouse.
To prototype the changes to the data warehouse, you perform the following steps:
To integrate the flat file data into the data warehouse, you can export the logical data object model as a
PowerCenter mapping.
The following figure shows the Informatica Data Services components that run within the Informatica
domain:
Application Clients
A group of clients that you use to access underlying Informatica functionality. Application clients send
requests to the Service Manager or application services. The Service Manager runs the application services
and performs domain functions including authentication, authorization, and logging.
The Informatica domain includes the following application clients for Informatica Data Services:
Informatica Analyst
Informatica Analyst (the Analyst tool) is a web-based application client that analysts can use to perform
data integration and data federation tasks in an enterprise. Use the Analyst tool to collaborate with
developers on data integration and data federation solutions.
Informatica Developer
Informatica Developer (the Developer tool) is an application client that developers can use to design and
implement data integration and data federation solutions.
Informatica Administrator
Informatica Administrator (the Administrator tool) is an application client that consolidates the
administrative tasks for domain objects such as services, connections, and licenses. Administrators
manage the domain and the security of the domain through the Administrator tool.
The Informatica domain includes the following application services for Informatica Data Services:
Data Integration Service
The Data Integration Service is an application service that runs data integration jobs for the Developer
tool and external clients. Data integration jobs include previewing data and running profiles and
mappings.
Model Repository Service
The Model Repository Service is an application service that manages the Model repository.
Analyst Service
The Analyst Service is an application service that runs the Analyst tool in the Informatica domain. The
Analyst Service manages the connections between service components and the users that have access
to the Analyst tool.
The Informatica domain includes the following databases and directory for Informatica Data Services:
Model repository
The Model repository is a relational database that stores the metadata for projects. The Model
repository also stores run-time and configuration information for applications that are deployed to a
Data Integration Service.
Domain configuration repository
The domain configuration repository is a set of domain metadata tables stored in a relational database.
Each time an administrator makes a change to the domain, the Service Manager writes the change to the
domain configuration repository.
Profile warehouse
The profile warehouse is a relational database that the Data Integration Services uses to store profile
results.
Flat file cache directory
The flat file cache directory stores flat files that can be used as sources in the Analyst tool. The Analyst
Service manages the connection to the flat file cache directory.
Informatica Analyst
Informatica Analyst (the Analyst tool) is a web-based client tool that is available to multiple Informatica
products and is used by business users to collaborate on projects within an organization. For example,
business analysts can use the Analyst tool to collaborate on data integration projects in an organization.
Use the Analyst tool to define data integration logic and collaborate on projects to accelerate project delivery.
Use assets such as flat file data objects, table data objects, or mapping specifications in the Analyst tool.
You can use the Analyst tool to complete the following tasks:
Create a data object.
Import metadata to create a flat file data object or table data object. Create flat file or table data objects
for sources, lookups, and targets that you want to use in a mapping specification or profile.
Run a profile.
Run a profile to analyze the structure and content of your data, and to determine the quality of your data.
Create a mapping specification.
Use a mapping specification to define business logic that populates a target with data. You can also
share the logic or results of the mapping specification with other analysts and developers.
The Analyst tool opens on the Start workspace. The Start workspace lists the workspaces that you have the
license and privilege to use through workspace access panels.
New
Open
Notifications alert
Manage
Open temporary workspaces and Notifications. You can open the Connections, Data Domains, Job
Status, Projects, and Business Glossary Security workspaces.
User name
Set user preferences to change the password and to log out of the Analyst tool.
Help
Start
Access other workspaces that you have the license to access through the workspace access panels. If
you have the license to perform exception management, your tasks appear on the My Tasks panel of the
workspace.
Glossary
Define and describe business concepts that are important to your organization. You can create and
manage business terms, categories, glossaries, and policies.
Discovery
Analyze the quality of data and metadata in source systems. You can create and manage profiles, flat
file data objects, and table data objects.
Design
Design business logic that helps analysts and developers collaborate. You can create and manage
mapping specifications, reference tables, and rule definitions.
Scorecards
Open, edit, and run scorecards that you created from profile results. You can add metrics, drill down on
columns, add scorecard filters, and view trend charts for a scorecard.
Informatica Developer
Informatica Developer (the Developer tool) is an application client that you can use to design and implement
data integration and data federation solutions.
You can use the Developer tool to import metadata, create connections, and create logical data objects. You
can use the Developer tool to create SQL data services and web services. You can also use the Developer
tool to create and run profiles and mappings.
You can use the Developer tool to complete the following tasks:
Create a physical data object.
Import metadata to create a physical data object or manually create the physical data object. Create
physical data objects for sources, lookups, and targets that you want to use in a mapping, profile, or
mapping specification.
Create a logical view of data.
A logical view of data describes the structure and use of data in an enterprise. You can create a logical
data object model that shows the types of data your enterprise uses and how that data is structured.
Then, you can create a mapping that links objects in a logical model to data sources or targets.
Create and run a profile.
Run column profiles to discover the number of unique values and null values in each column. You can
also run column profiles to view patterns of data in each column and the frequencies with which these
values occur.
Create an SQL data service.
An SQL data service is a virtual database that end users can query. Create an SQL data service so that
end users can run SQL queries against the virtual tables through a third-party client tool.
Create a web service.
A web service provides access to data integration functionality. Create a web service so that a web
service client can connect to a web service to access, transform, or deliver data.
Develop mappings.
Informatica Developer User Interface
The Developer tool user interface consists of a workbench with multiple views that you use to create data
integration and data federation solutions.
Object Explorer view
Displays projects, folders, and the objects within the projects and folders.
Outline view
Displays objects that are dependent on an object selected in the Object Explorer view.
Cheat Sheets view
Displays the cheat sheet that you open. To open a cheat sheet, click Help > Cheat Sheets and select a
cheat sheet.
Data Viewer view
Displays source data, profile results, and previews the output of a transformation. You can also preview
web service messages or run an SQL query from the Data Viewer view.
Alerts view
The Developer tool can also display other views. You can hide views and move views to another location in
the Developer tool workbench. Click Window > Show View to select the views that you want to display.
Informatica Administrator
Informatica Administrator (the Administrator tool) is an application client that consolidates the
administrative tasks for domain objects such as services, connections, and licenses.
You manage the domain and the security of the domain through the Administrator tool.
• Domain administrative tasks. Manage logs, domain objects, user permissions, and domain reports.
Generate and upload node diagnostics. Monitor Data Integration Service jobs and applications. Domain
objects include application services, nodes, grids, folders, database connections, operating system
profiles, and licenses.
• Security administrative tasks. Manage users, groups, roles, and privileges.
Informatica Administrator User Interface
The Administrator tool is an application that you use to manage the Informatica domain and the security of
the Informatica domain. The Administrator tool interface contains tabs, header items, views, a navigator, and
a contents panel.
The tabs and views that are available in the Administrator tool differ based on your product license and user
permissions. The Navigator displays a hierarchy of objects. The types of objects in the Navigator differ based
on the tab that you select. The contents panel displays details about the object that you select in the
Navigator.
The Administrator tool has the following tabs:
• Manage. View and edit the properties of the domain and objects within the domain.
• Monitor. View the status of profile jobs, scorecard jobs, preview jobs, mapping jobs, SQL data services,
web services, and workflows for each Data Integration Service.
• Logs. View log events for the domain and services within the domain.
• Reports. Run a Web Services Report or License Management Report.
• Security. Manage users, groups, roles, and privileges.
• Cloud. View information about your Informatica Cloud® organization.
This chapter includes the following topics:
• Tutorial Overview
• Tutorial Lessons
• Tutorial Files
• Tutorial Prerequisites
Tutorial Overview
The Informatica Data Services tutorial includes multiple lessons. Each lesson contains typical, related tasks
that you perform in Informatica Developer.
Objectives
Each lesson contains objectives that list the tasks included in the lesson and describe the skill level
required for the lesson.
Prerequisites
Each lesson lists the prerequisites that you must complete before you start the lesson.
Note: The tutorial also has tutorial prerequisites that you must complete before you start any lesson.
Timing
Each lesson specifies the estimated amount of time required to complete all tasks in the lesson.
Tasks
Each lesson consists of tasks. Perform the tasks in the order specified. The tasks help you understand
the lesson concepts and learn how to use the Developer tool to accomplish the tasks.
Each task consists of steps. The steps provide detailed instructions about how to perform the task.
Data Results
Some lessons contain instructions about how to preview the data. Preview the data to verify that the
output data looks correct.
Tips
Each lesson provides tips about using the Developer tool. The tips are related to the tasks included in the
corresponding lesson.
After completing the tutorial, you should be able to perform the following high-level tasks:
Tutorial Lessons
The tutorial consists of multiple lessons. Some lessons are prerequisites for subsequent lessons.
You start and set up the Developer tool. To set up the Developer tool, you connect to the Model
repository, and create a project and folder to store your work. You also select the default Data
Integration Service to preview data and run mappings.
You import the Boston_Customers.csv and LA_Customers.csv tutorial flat files as physical data objects.
You import a logical data object model that contains the Customer and Order logical data objects. You
also create a logical data object read mapping with the Customer logical data object as the mapping
output. You run the mapping to view customer data from multiple sources.
You create an SQL data service to define a virtual database that contains customer data. You preview
the virtual data. You create an application that contains the SQL data service, and deploy the application
to the Data Integration Service.
You create a web service that provides access to customer data. You view and validate the operation
mapping. You create an application that contains the web service, and deploy the application to the Data
Integration Service.
Tutorial Files
The tutorial lessons use the following files:
• Boston_Customers.csv
• LA_Customers.csv
• Customer_Order.xsd
The following table provides the machines and directories where you can find the tutorial files:
Machine                                          Directory
Machine that runs Informatica Developer          <Informatica installation directory>\clients\DeveloperClient\Tutorials
Machine that runs the Data Integration Service   <Informatica installation directory>\server\Tutorials
The following table shows sample data from the Boston_Customers.csv tutorial file:
The following table shows sample data from the LA_Customers.csv tutorial file:
Customer_Order.xsd tutorial file
The Customer_Order.xsd file contains an XML schema definition of the Customer_Order logical data object
model. In a lesson, you import the Customer_Order.xsd file into the Developer tool to create a logical data
object model.
Tutorial Prerequisites
Before you can begin any lesson, you must complete the tutorial prerequisites.
1. Verify that the Informatica domain, Model Repository Service, and Data Integration Service are running.
2. Verify that the tutorial files are in the following directory on the machine that runs the Developer tool:
<Informatica installation directory>\clients\DeveloperClient\Tutorials
3. Verify that the tutorial files are in the following directory on the machine that runs the Data Integration
Service:
<Informatica installation directory>\server\Tutorials
Setting Up Informatica Developer Overview
To set up the Developer tool, you connect to the Model repository and create a project and folder to store
your work. You select the default Data Integration Service to preview data and run mappings.
Lesson Concepts
The Informatica domain is a collection of services that perform data integration jobs and monitoring jobs.
You manage the domain through the Administrator tool.
The Model Repository Service manages the Model repository. The Model repository is a relational database
that stores the metadata for projects and folders. A project stores objects that you create in the Developer
tool. A project can also contain folders that store related objects that are part of the same business
requirement.
The Data Integration Service performs data integration jobs for the Developer tool. Data integration jobs
include previewing data, and running profiles, mappings, and workflows.
Lesson Objectives
In this lesson, you complete the following beginner-level tasks:
• Select the default Data Integration Service to preview data and to run mappings in the Developer tool.
Lesson Prerequisites
Before you start this lesson, complete the following prerequisites:
• Ask a domain administrator to verify that the Model Repository Service and Data Integration Service are
running in the domain.
• Get the following information from a domain administrator:
- Domain name, host name, and port number to connect to a domain.
Lesson Timing
Set aside 5 to 10 minutes to complete the tasks in this lesson.
1. From the Windows Start menu, click Informatica <version> > Client > Developer Client > Launch
Informatica Developer.
The Welcome page of the Developer tool appears. If you have started the Developer tool before, the
Developer tool opens to the Workbench.
Adding the Domain and Repository
When you add the domain, use the domain name, host name, and port number that you got from the domain
administrator.
1. From the Developer tool menu, click File > Connect to Repository.
The Connect to Repository dialog box appears.
3. Click Add.
The New Domain dialog box appears.
4. Enter the domain name, host name, and port number for the domain.
The following table lists the default values for the domain:
8. Click OK.
The Connect to Repository dialog box appears.
9. Click Browse to select the Model Repository Service associated with the Model repository.
The Choose Service dialog box appears.
10. Expand the domain and select the Model Repository Service.
The following figure shows the selected Model Repository Service:
Connecting to the Model Repository
1. In the Object Explorer view, right-click the Model Repository Service associated with the Model
repository to which you want to connect.
2. Select Connect.
The Connect to Domain dialog box appears.
3. Enter the user name and password provided to you.
4. Click OK.
The Developer tool connects to the Model repository.
Step 3. Create a Project
1. From the Developer tool menu, click File > New > Project.
The New Project dialog box appears.
2. Enter "Tutorial" for the project name.
5. Click OK.
If you hide views or move views to another location in the Developer tool workbench, you can reset the
Developer tool perspective to the default values. Click Window > Reset Perspective.
Importing Physical Data Objects Overview
Lesson Concepts
A physical data object is a Model repository object that represents a flat file or relational database table. You
can import a flat file or relational database table as a physical data object to use as a source, target, or
lookup in a mapping.
Lesson Objectives
In this lesson, you complete the following beginner-level tasks:
Lesson Prerequisites
Before you start this lesson, complete the following prerequisites:
• Set up Informatica Developer. For more information, see “Setting Up Informatica Developer Overview” on
page 25.
• Verify that the Boston_Customers.csv and LA_Customers.csv tutorial files are in the following directory on
the Developer tool machine:
<Informatica installation directory>\clients\DeveloperClient\Tutorials
• Verify that the Boston_Customers.csv and LA_Customers.csv tutorial files are also in the following
directory on every machine that runs the Data Integration Service:
<Informatica installation directory>\server\Tutorials
Lesson Timing
Set aside 5 to 10 minutes to complete the tasks in this lesson.
Step 1. Import the Boston_Customers Flat File Data
Object
In this task, you import the Boston_Customers.csv flat file as a physical data object. The flat file contains
data about customers from a Boston office.
2. Right-click the Tutorial_Objects folder and select New > Data Object.
3. Select Physical Data Objects > Flat File Data Object and click Next.
9. Click Next.
10. Select Import column names from first line.
Note: The Developer tool machine must have access to the source file directory on the machine that runs
the Data Integration Service. If the Developer tool cannot access the source file directory, the Developer
tool cannot preview data in the source file or run mappings that access data in the source file. If you run
multiple Data Integration Services, there is a separate source file directory for each Data Integration
Service.
15. Click the Data Viewer view.
16. In the Data Viewer view, click Run.
Export data.
You can export the data that displays in the Data Viewer view to a tab-delimited flat file, such as a TXT or
CSV file. Export data when you want to create a local copy of the data. To export the data, right-click a
row of data in the Data Viewer view and select Export Data.
Creating a Logical View of Data Overview
Lesson Concepts
A logical view of data describes the structure and use of data in an enterprise.
To develop a single view of data, define a logical data object model. A logical data object model describes
data in an organization and the relationship between the data. You can use a data modeling tool, such as
Erwin, to create the logical data object model. Or, you can manually create the model.
A logical data object model contains logical data objects. A logical data object is an object that describes a
logical entity in an organization, such as a customer or an order. It has attributes and keys, and it describes
relationships between attributes.
The logical data object model describes the relationship between logical data objects. For example, a logical
data model defines a relationship between the Order ID attribute of the Order logical data object and the
Customer ID attribute of the Customer logical data object. The model states that each order ID must be
associated with a customer ID.
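If the Customer and Order logical data objects are later exposed as virtual tables, the relationship can be navigated with an ordinary join. This is a hypothetical sketch; the table and column names (Customers, Orders, CustomerID, OrderID) are assumptions based on the attributes described above:
SELECT c.CustomerID, o.OrderID
FROM Customers c
JOIN Orders o ON o.CustomerID = c.CustomerID;
A query of this shape returns only orders whose customer ID matches an existing customer, which is the rule that the model states.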
A logical data object mapping links a logical data object to one or more physical data objects. The mapping
can include transformation objects that define the logic to transform data. For example, you can use a logical
data object read mapping to access data from multiple sources and apply the output to a logical data object.
The following figure shows the components of a logical view of data:
Note: A logical data object mapping can also access data from one logical data object and apply the output
to another logical data object.
After you create logical data object mappings for logical data objects in the model, you can create a data
service for each logical data object mapping, and deploy the data services.
Lesson Objectives
In this lesson, you complete the following beginner-level tasks:
• Import a logical data object model that contains the Customer and Order logical data objects.
• Create a logical data object read mapping with the Customer logical data object as the mapping output.
The mapping defines a single view of the customer data from the Los Angeles and Boston offices. The
mapping also transforms the Boston customer data to conform to the format of the Los Angeles
customer data.
• Run the mapping to view the combined customer data.
Lesson Prerequisites
Before you start this lesson, complete the following prerequisites:
• Set up Informatica Developer. For more information, see “Setting Up Informatica Developer Overview” on
page 25.
• Import the physical data objects. For more information, see “Importing Physical Data Objects
Overview” on page 35.
• Verify that the Customer_Order.xsd tutorial file is in the following directory on the Developer tool machine:
<Informatica installation directory>\clients\DeveloperClient\Tutorials
Lesson Timing
Set aside 20 minutes to complete the tasks in this lesson.
7. In the Value column of the File property, click the Open button to select an XSD file.
The Open dialog box appears.
8. Navigate to Customer_Order.xsd in the following directory: <Informatica installation directory>
\clients\DeveloperClient\Tutorials
9. Click Open.
The Open dialog box closes. The New Logical Data Object Model dialog box shows the directory path
and name of the model file.
10. Click Next.
11. Click the Move all items button to add the Customer and Order logical data objects to the logical data
object model.
To create the logical data object mapping, complete the tasks in this section.
1. In the Object Explorer view, expand the Logical Data Object Models folder in the Tutorial project.
2. Double-click the Customer_Order logical data object model.
The Customer_Order logical data object model opens.
The following image shows the Customer_Order logical data object model in the editor:
3. In the Object Explorer, double-click the Customer logical data object to open it in the editor.
The logical data object editor opens. The editor contains two areas, a General area where you can add a
read mapping or a write mapping, and an Attribute area where you can edit column attributes.
4. In the General area, click Add to add a read mapping to the Customer logical data object.
The following image shows the Add button:
1. In the Object Explorer view, expand the Logical Data Object Models node and the Customer_Order logical
data object model.
The following image shows the Customer_Order logical data object model and the logical data objects
that it contains, Customer and Order:
4. In the Object Explorer view, expand the Physical Data Objects folder in the Tutorial project.
5. Drag LA_Customers into the Read Mapping editor.
The Add to Mapping dialog box appears.
6. Verify that the Data Access Type is Read, and click OK.
Read_LA_Customers appears in the editor.
7. Drag Boston_Customers into the editor.
8. Verify that the Data Access Type is Read, and click OK.
Read_Boston_Customers appears in the editor.
9. Click File > Save to save the logical data object mapping.
1. Right-click an empty area in the editor, and then select Add Transformation.
The Add Transformation dialog box appears.
2. Select the Expression transformation, and then click OK.
An Expression transformation appears in the editor.
3. To create ports in the Expression transformation, select all columns in the Read_Boston_Customers
source and drag them to the Expression transformation.
Tip: To select all columns in the source, right-click inside of the Read_Boston_Customers source in the
editor, and then click Select All.
9. In the Expression column for the FullName port, click the Open button to open the Expression editor.
10. Replace the existing expression in the Expression editor with the following expression (a sample evaluation of this expression appears after this procedure):
CONCAT(CONCAT(FIRSTNAME,' '),LASTNAME)
11. Click Validate to validate the expression.
12. Click OK.
13. Click OK to exit the Expression editor.
14. Select the Expression transformation in the editor.
15. In the Expression transformation, select the FullName port.
16. Click the Move Up button until the FullName port is directly below the CustomerTier port.
The following figure shows the FullName port below the CustomerTier port:
You move the port to match the order of the ports in Read_LA_Customers source. The port order must
match to combine data from both sources in the Union transformation.
17. Click File > Save to save the logical data object mapping.
18. Select the Expression transformation in the editor.
19. Click the Data Viewer view.
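Before you run the mapping, it can help to trace the FullName expression by hand. The following sketch evaluates the expression for a hypothetical row in which FIRSTNAME is 'John' and LASTNAME is 'Smith'; the SELECT form assumes a SQL dialect that supports a two-argument CONCAT function:
-- CONCAT(FIRSTNAME, ' ')     evaluates to 'John '
-- CONCAT('John ', LASTNAME)  evaluates to 'John Smith'
SELECT CONCAT(CONCAT('John', ' '), 'Smith') AS FullName;  -- returns 'John Smith'
The inner CONCAT appends a space to the first name, and the outer CONCAT appends the last name, which produces a single full-name value that matches the format of the LA_Customers data.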
1. Right-click an empty area in the editor, and then select Add Transformation.
The Add Transformation dialog box appears.
2. Select the Union transformation, and then click OK.
A Union transformation appears in the editor.
3. To add the columns of the Read_LA_Customers source as ports in the Union transformation, select all
columns in the Read_LA_Customers source and drag them to the Union transformation.
The ports appear in the input group and the output group of the Union transformation.
12. Select all ports in the output group of the Union transformation, except Customer_Region, and drag them
to the Customer logical data object output.
Tip: Hold down the Shift key to select multiple columns. You might need to scroll down the list of
columns to select all of them.
The Developer tool links the ports in the Union transformation to the ports in the Customer mapping
output.
13. Right-click an empty area in the editor and click Validate to validate the mapping.
The Developer tool displays a message stating whether validation errors occurred.
14. Click OK.
15. Click File > Save to save the logical data object mapping.
Tip: Right-click an empty area in the editor and click Arrange All to arrange mapping objects in the
editor.
• Right-click an empty area in the editor and click Run Data Viewer to run the mapping.
The Data Viewer view appears, and the Data Integration Service runs the mapping.
After the Data Integration Service runs the mapping, the Developer tool shows the data in the Output section
of the Data Viewer view. The Output section shows the combined data from the Read_LA_Customers source
and the Read_Boston_Customers source.
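The Union transformation behaves much like a SQL UNION ALL: it appends the rows of one source to the rows of the other without removing duplicates, which is why the port order and types must match. The following sketch is a conceptual equivalent only; the CustomerID column name is an assumption, and the actual work is performed by the mapping rather than by a query:
SELECT CustomerID, FullName, CustomerTier
FROM LA_Customers
UNION ALL
SELECT CustomerID, CONCAT(CONCAT(FIRSTNAME, ' '), LASTNAME) AS FullName, CustomerTier
FROM Boston_Customers;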
When you link ports automatically, you can link by position or by name. When you link ports
automatically by name, you can specify a prefix or suffix by which to link the ports.
To link ports automatically, select Mapping > Auto Link, select the to and from objects, and then select
whether to link the ports by name or position. If you link ports by name, you can click Show Advanced to
specify a prefix or suffix for port names.
You can convert the mapping objects to icons and align the icons in the editor. To align mapping objects
as icons, click Layout > Arrange All Iconic.
A quick outline displays objects that are dependent on a mapping object selected in the editor. You can
use the quick outline to sort dependent objects by name or by type, or to search for dependent objects.
Select an object in the quick outline to navigate to the object in the editor.
To display the quick outline, select the mapping or an object in the mapping and then click Navigate >
Quick Outline.
Use the point-and-click method to add functions and ports to a port expression.
When you create an expression, you can enter the expression manually or use the point-and-click
method. To minimize errors when you create an expression, select functions and ports from the point-
and-click interface.
To add a function to an expression, double-click the function on the Functions tab. To add a port to an
expression, double-click the port name in the Ports tab.
Add comments to describe the expression or to specify a valid URL to access business documentation
about the expression. The Data Integration Service ignores comments when processing the expression.
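For example, a minimal sketch of an expression that builds a full name, assuming the input ports are named FirstName and LastName (adjust the names to match the ports in your transformation), with comment lines that the Data Integration Service ignores:
-- Concatenate the first name, a space, and the last name.
-- Comment lines such as these are ignored when the expression is processed.
CONCAT(CONCAT(FirstName, ' '), LastName)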
Lesson Concepts
A virtual view of data is a virtual database defined by an SQL data service that you can query as if it were a
physical database.
To create a virtual database, you define an SQL data service in the Developer tool. The SQL data service must
contain at least one virtual schema and virtual table. A virtual table can have a virtual table mapping that
defines the data flow between the sources and the virtual table. You can create a virtual table manually, or
create it from a physical or logical data object.
To run an SQL data service on a Data Integration Service, you must add the SQL data service to an
application, and then deploy the application to the Data Integration Service. An application is a deployable
object that can contain data objects, mappings, SQL data services, web services, and workflows. You deploy
the application to a Data Integration Service to make the virtual database available for end users to query.
The Data Integration Service processes end-user queries on objects included in deployed applications.
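For example, after you deploy the application, an end user could query a virtual table with a standard SQL statement. The following sketch assumes the Customers virtual table that you create later in this lesson:
SELECT * FROM Customers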
The following figure shows the components of a virtual view of data:
Lesson Objectives
In this lesson, you complete the following beginner-level tasks:
• Create an SQL data service to define a virtual database that contains customer data.
• Preview the virtual data.
• Create an application that contains the SQL data service.
• Deploy the application to a Data Integration Service.
Lesson Prerequisites
Before you start this lesson, complete the following prerequisites:
• Set up Informatica Developer. For more information, see “Setting Up Informatica Developer Overview” on
page 25.
• Import the physical data objects. For more information, see “Importing Physical Data Objects
Overview” on page 35.
• Create the Customer_Order logical data object model. For more information, see “Creating a Logical View
of Data Overview” on page 44.
Lesson Timing
Set aside 15 to 20 minutes to complete the tasks in this lesson.
5. To create a virtual table, click the New button ( ).
The Developer tool adds a virtual table to the list of virtual tables.
6. Enter Customers for the virtual table name.
7. In the Data Object field for the virtual table, click the Open button ( ) to add a logical data object.
8. In the Tutorial folder, expand the Customer_Order logical data object model, and select the Customer
logical data object.
9. Click OK.
The Developer tool adds Customer as the virtual table source. It also specifies Logical Data Object as
the source type and the Tutorial project as the location.
10. Enter Customer_Schema in the Virtual Schemas column and press Enter.
1. Select the Data Viewer view to preview the data of the SQL data service.
2. In the Input section of the Data Viewer view, enter the following SQL statement: SELECT * from
customers
3. Click Run.
The Output section of the Data Viewer view displays the combined customer data from the Los Angeles
and Boston offices.
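You can also enter more selective SQL statements in the Input section. For example, the following sketch assumes that the virtual table exposes the FullName and CustomerTier columns from the Customer logical data object and that 'Diamond' is a value in the CustomerTier column; adjust the column names and value to match your data:
SELECT FullName, CustomerTier
FROM Customers
WHERE CustomerTier = 'Diamond'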
4. Click Add.
6. Click Finish.
When the deployment succeeds, the Deploy Completed dialog box appears.
7. Click OK.
View the SQL query plan to troubleshoot queries on an SQL data service.
View the SQL query plan to troubleshoot queries against a deployed SQL data service.
To run an SQL query plan, open the SQL data service in the editor, click the Data Viewer view, and then
click SQL Query Plan.
A tag is metadata that defines an object in the Model repository based on business usage. Create tags
to group objects according to their business usage.
To assign a tag to an object, create the tag, open the object in the editor, click Edit in the Tags view, and
then assign the tag to the object.
Group error messages by object or object type in the Validation Log view.
You can group error messages by object or object type in the Validation Log view.
To group error messages, in the Validation Log view, select the Menu button ( ), select Group By, and
then select Object or Object Type.
Limit the number of error messages per group that appear in the Validation Log view.
You can limit the number of error messages that appear in the Validation Log view.
To limit the number of error messages, click Window > Preferences, select Informatica > Validation in
the Preferences dialog box, and then select the Use Error Limits check box and set the number of error
messages.
Lesson Concepts
You create web services in the Developer tool. A web service can have one or more operations. Each
operation defines an action that the web service client can perform when it connects to a web service.
Operations of a web service are defined in a WSDL.
A WSDL is an XML schema that describes the protocols, formats, and signatures of the web service
operations. It contains a description of the data to be passed to the web service so that both the sender and
the receiver of the service request understand the data being exchanged.
In the Developer tool, each operation corresponds to an operation mapping. The operation mapping
processes the data that it receives in the SOAP request.
To run a web service on a Data Integration Service, you must add the web service to an application, and then
deploy the application to the Data Integration Service. An application is a deployable object that can contain
data objects, mappings, SQL data services, web services, and workflows. You deploy the application to a Data
Integration Service to make the web service available for web service clients to connect to.
The following figure shows the components of a web service:
Lesson Objectives
In this lesson, you complete the following intermediate-level tasks:
Lesson Prerequisites
Before you start this lesson, complete the following prerequisites:
• Set up Informatica Developer. For more information, see “Setting Up Informatica Developer Overview” on
page 25.
• Import the physical data objects. For more information, see “Importing Physical Data Objects
Overview” on page 35.
• Create the Customer_Order logical data object model. For more information, see “Creating a Logical View
of Data Overview” on page 44.
Lesson Timing
Set aside 15 to 20 minutes to complete this lesson.
5. Click the arrow next to the New button and click Operation > Create from Reusable Object.
The Select Reusable Object dialog box appears.
6. Expand the Tutorial project, browse to Logical Data Object Models > Customer_Order, and then select
Customer.
7. Click OK.
Operations appear in the Add Operations to Web Service dialog box.
12. To review the output of the operation mapping, select getCustomerByID_Output and then select the
Mapping Output tab.
13. Click Finish.
The Developer tool creates the Customer_Details web service and an operation mapping for the
operation.
1. Right-click an empty area in the editor, and select Run Data Viewer.
The Data Integration Service runs the operation mapping. The operation returns an error because you did
not provide the customer ID as input.
2. In the Input window, replace the question mark (?) with 10110147.
10110147 is a customer ID.
3. Click Run.
The Output window displays the SOAP response based on the customer ID you entered.
4. Click Add.
6. Click Finish.
When the deployment succeeds, the Deploy Completed dialog box appears.
7. Click OK.
After you deploy the application, you can view the WSDL URL in the Administrator tool.
To create an operation for an existing web service, open the web service, right-click the web service in
the Object Explorer view or Outline view, and then select New > Operation.
Exporting a Mapping to PowerCenter
This chapter includes the following topics:
Lesson Concepts
You can export mappings and mapplets from a Model repository to a PowerCenter repository. You export the
objects to run them in PowerCenter.
Before you use PowerCenter to build a data warehouse, you can use Data Services to build a data warehouse
prototype. In the prototype, you can build logical data objects in a logical data object model to describe and
relate enterprise entities such as customer and order. For each logical data object, you can build a logical
data object read mapping to make data in physical data objects accessible in the logical data object.
If performance or usage needs increase, you can replace the prototype with a physical data warehouse. To
populate the physical data warehouse using the transformation logic that you built in the logical data object
read mappings, export the mappings to PowerCenter. When you export the logical data object read mappings,
the Developer tool converts them into PowerCenter mapplets that you can use to load the physical data
warehouse.
Lesson Objectives
In this lesson, you complete the following beginner-level task:
Lesson Prerequisites
Before you start this lesson, complete the following prerequisites:
• Set up Informatica Developer. For more information, see “Setting Up Informatica Developer Overview” on
page 25.
• Import the physical data objects. For more information, see “Importing Physical Data Objects
Overview” on page 35.
• Create the Customer_Order logical data object model. For more information, see “Creating a Logical View
of Data Overview” on page 44.
• Verify that you can connect to the PowerCenter repository into which you want to export the Developer
tool mapping. To get the repository login information, contact a domain administrator.
Lesson Timing
Set aside 5 to 10 minutes to complete this task.
5. In the Project field, select the project from which you want to export objects.
6. In the Target Release field, select the version of the PowerCenter repository into which you want to
import the objects.
Different versions of PowerCenter store metadata differently. Select the version of PowerCenter to
ensure that the mapping metadata is imported into the PowerCenter repository correctly.
7. In the Export Selected Objects To field, select PowerCenter Repository to export the objects to a
PowerCenter repository.
8. Click Browse next to the PowerCenter Repository field to enter the connection properties for the
PowerCenter repository.
Before you export the Developer tool objects to PowerCenter, validate them against the PowerCenter version.
Before you export the Developer tool objects to PowerCenter, you can validate whether the Developer
tool objects are compatible with a particular PowerCenter version.
To enable validation, set the compatibility level to a particular PowerCenter version. To disable
validation, do not select a PowerCenter version. To set the compatibility level, click Edit > Compatibility
Level.
Glossary
application
A deployable object that can contain data objects, mappings, SQL data services, web services, and
workflows.
cost-based optimization
Optimization method that reduces the run time for mappings that perform join operations. With cost-based
optimization, the Data Integration Service creates different plans to run a mapping and calculates a cost for
each plan. The Data Integration Service runs the plan with the smallest cost. The Data Integration Service
calculates cost based on database statistics, I/O, CPU, network, and memory.
data service
A collection of reusable operations that you can run to access and transform data. A data service provides a
unified model of data you can access through a web service or run an SQL query against.
deploy
To make objects within an application accessible to end users. Depending on the types of objects in the
application, end users can then run queries against the objects, access web services, run mappings, or run
workflows.
Informatica Administrator
Informatica Administrator (the Administrator tool) is an application that consolidates the administrative
tasks for domain objects such as services, nodes, licenses, and grids. You manage the domain and the
security of the domain through the Administrator tool.
Informatica Developer
Informatica Developer (the Developer tool) is an application that you use to design data integration solutions.
The Model repository stores the objects that you create in the Developer tool.
mapping
A set of inputs and outputs linked by transformation objects that define the rules for data transformation.
mapplet
A reusable object that contains a set of transformations that you can use in multiple mappings or validate as
a rule.
Model Repository Service
An application service in the Informatica domain that runs and manages the Model repository. The Model
repository stores metadata created by Informatica products in a relational database to enable collaboration
among the products.
node
A representation of a level in the hierarchy of a web service message.
operation mapping
A mapping that performs the web service operation for the web service client. An operation mapping can
contain an Input transformation, an Output transformation, and multiple Fault transformations.
predicate expression
An expression that filters the data in a mapping. A predicate expression returns true or false.
predicate optimization
Optimization method that simplifies or rewrites the predicate expressions in a mapping. With predicate
optimization, the Data Integration Service attempts to apply predicate expressions as early as possible to
increase mapping performance.
project
The top-level container to store objects created in Informatica Analyst and Informatica Developer. Create
projects based on business goals or requirements. Projects appear in both Informatica Analyst and
Informatica Developer.
pushdown optimization
Optimization method that pushes transformation logic to a source or target database. With pushdown
optimization, the Data Integration Service translates the transformation logic into SQL queries and sends the
SQL queries to the database. The database runs the SQL queries to process the data.
semi-join optimization
Optimization method that reduces the number of rows extracted from the source. With semi-join
optimization, the Data Integration Service modifies the join operations in a mapping. The Data Integration
Service applies the semi-join optimization method to a Joiner transformation when a larger input group has
rows that do not match a smaller input group in the join condition. The Data Integration Service reads the
rows from the smaller group, finds the matching rows in the larger group, and performs the join operation.
SQL data service
A virtual database that you can query. It contains virtual objects and provides a uniform view of data from
disparate, heterogeneous data sources.
team-based development
The collaboration of team members on a development project. Collaboration includes functionality such as
versioning through checking out and checking in repository objects.
transformation
A repository object in a mapping that generates, modifies, or passes data. Each transformation performs a
different function.
virtual data
The information that you get when you query virtual tables or run stored procedures in an SQL data service.
virtual database
An SQL data service that you can query. It contains virtual objects and provides a uniform view of data from
disparate, heterogeneous data sources.
virtual schema
A schema in a virtual database that defines the database structure.
virtual table
A table in a virtual database.