DS 100 SQLDataServiceGuide en
This product includes software licensed under the terms at http://www.tcl.tk/software/tcltk/license.html, http://www.bosrup.com/web/overlib/?License, http://www.stlport.org/doc/license.html, http://asm.ow2.org/license.html, http://www.cryptix.org/LICENSE.TXT, http://hsqldb.org/web/hsqlLicense.html, http://httpunit.sourceforge.net/doc/license.html, http://jung.sourceforge.net/license.txt, http://www.gzip.org/zlib/zlib_license.html, http://www.openldap.org/software/release/license.html, http://www.libssh2.org, http://slf4j.org/license.html, http://www.sente.ch/software/OpenSourceLicense.html, http://fusesource.com/downloads/license-agreements/fuse-message-broker-v-5-3-license-agreement; http://antlr.org/license.html; http://aopalliance.sourceforge.net/; http://www.bouncycastle.org/licence.html; http://www.jgraph.com/jgraphdownload.html; http://www.jcraft.com/jsch/LICENSE.txt; http://jotm.objectweb.org/bsd_license.html; http://www.w3.org/Consortium/Legal/2002/copyright-software-20021231; http://www.slf4j.org/license.html; http://nanoxml.sourceforge.net/orig/copyright.html; http://www.json.org/license.html; http://forge.ow2.org/projects/javaservice/, http://www.postgresql.org/about/licence.html, http://www.sqlite.org/copyright.html, http://www.tcl.tk/software/tcltk/license.html, http://www.jaxen.org/faq.html, http://www.jdom.org/docs/faq.html, http://www.slf4j.org/license.html; http://www.iodbc.org/dataspace/iodbc/wiki/iODBC/License; http://www.keplerproject.org/md5/license.html; http://www.toedter.com/en/jcalendar/license.html; http://www.edankert.com/bounce/index.html; http://www.net-snmp.org/about/license.html; http://www.openmdx.org/#FAQ; http://www.php.net/license/3_01.txt; http://srp.stanford.edu/license.txt; http://www.schneier.com/blowfish.html; http://www.jmock.org/license.html; http://xsom.java.net; http://benalman.com/about/license/; https://github.com/CreateJS/EaselJS/blob/master/src/easeljs/display/Bitmap.js; http://www.h2database.com/html/license.html#summary; http://jsoncpp.sourceforge.net/LICENSE; http://jdbc.postgresql.org/license.html; http://protobuf.googlecode.com/svn/trunk/src/google/protobuf/descriptor.proto; https://github.com/rantav/hector/blob/master/LICENSE; http://web.mit.edu/Kerberos/krb5-current/doc/mitK5license.html; http://jibx.sourceforge.net/jibx-license.html; https://github.com/lyokato/libgeohash/blob/master/LICENSE; https://github.com/hjiang/jsonxx/blob/master/LICENSE; https://code.google.com/p/lz4/; https://github.com/jedisct1/libsodium/blob/master/LICENSE; http://one-jar.sourceforge.net/index.php?page=documents&file=license; https://github.com/EsotericSoftware/kryo/blob/master/license.txt; http://www.scala-lang.org/license.html; https://github.com/tinkerpop/blueprints/blob/master/LICENSE.txt; http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html; https://aws.amazon.com/asl/; https://github.com/twbs/bootstrap/blob/master/LICENSE; https://sourceforge.net/p/xmlunit/code/HEAD/tree/trunk/LICENSE.txt; https://github.com/documentcloud/underscore-contrib/blob/master/LICENSE, and https://github.com/apache/hbase/blob/master/LICENSE.txt.
This product includes software licensed under the Academic Free License (http://www.opensource.org/licenses/afl-3.0.php), the Common Development and Distribution License (http://www.opensource.org/licenses/cddl1.php), the Common Public License (http://www.opensource.org/licenses/cpl1.0.php), the Sun Binary Code License Agreement Supplemental License Terms, the BSD License (http://www.opensource.org/licenses/bsd-license.php), the new BSD License (http://opensource.org/licenses/BSD-3-Clause), the MIT License (http://www.opensource.org/licenses/mit-license.php), the Artistic License (http://www.opensource.org/licenses/artistic-license-1.0), and the Initial Developer's Public License Version 1.0 (http://www.firebirdsql.org/en/initial-developer-s-public-license-version-1-0/).
This product includes software copyright 2003-2006 Joe Walnes, 2006-2007 XStream Committers. All rights reserved. Permissions and limitations regarding this
software are subject to terms available at http://xstream.codehaus.org/license.html. This product includes software developed by the Indiana University Extreme! Lab.
For further information please visit http://www.extreme.indiana.edu/.
This product includes software Copyright (c) 2013 Frank Balluffi and Markus Moeller. All rights reserved. Permissions and limitations regarding this software are subject
to terms of the MIT license.
See patents at https://www.informatica.com/legal/patents.html.
DISCLAIMER: Informatica LLC provides this documentation "as is" without warranty of any kind, either express or implied, including, but not limited to, the implied
warranties of noninfringement, merchantability, or use for a particular purpose. Informatica LLC does not warrant that this software or documentation is error free. The
information provided in this software or documentation may include technical inaccuracies or typographical errors. The information in this software and documentation is
subject to change at any time without notice.
NOTICES
This Informatica product (the "Software") includes certain drivers (the "DataDirect Drivers") from DataDirect Technologies, an operating company of Progress Software
Corporation ("DataDirect") which are subject to the following terms and conditions:
1. THE DATADIRECT DRIVERS ARE PROVIDED "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
2. IN NO EVENT WILL DATADIRECT OR ITS THIRD PARTY SUPPLIERS BE LIABLE TO THE END-USER CUSTOMER FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, CONSEQUENTIAL OR OTHER DAMAGES ARISING OUT OF THE USE OF THE ODBC DRIVERS, WHETHER OR NOT
INFORMED OF THE POSSIBILITIES OF DAMAGES IN ADVANCE. THESE LIMITATIONS APPLY TO ALL CAUSES OF ACTION, INCLUDING, WITHOUT
LIMITATION, BREACH OF CONTRACT, BREACH OF WARRANTY, NEGLIGENCE, STRICT LIABILITY, MISREPRESENTATION AND OTHER TORTS.
This software and documentation contain proprietary information of Informatica LLC and are provided under a license agreement containing restrictions on use and
disclosure and are also protected by copyright law. Reverse engineering of the software is prohibited. No part of this document may be reproduced or transmitted in any
form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC. This Software may be protected by U.S. and/or
international Patents and other Patents Pending.
Use, duplication, or disclosure of the Software by the U.S. Government is subject to the restrictions set forth in the applicable software license agreement and as
provided in DFARS 227.7202-1(a) and 227.7202-3(a) (1995), DFARS 252.227-7013(c)(1)(ii) (OCT 1988), FAR 12.212(a) (1995), FAR 52.227-19, or FAR 52.227-14
(ALT III), as applicable.
The information in this product or documentation is subject to change without notice. If you find any problems in this product or documentation, please report them to us
in writing.
Informatica, Informatica Platform, Informatica Data Services, PowerCenter, PowerCenterRT, PowerCenter Connect, PowerCenter Data Analyzer, PowerExchange,
PowerMart, Metadata Manager, Informatica Data Quality, Informatica Data Explorer, Informatica B2B Data Transformation, Informatica B2B Data Exchange, Informatica
On Demand, Informatica Identity Resolution, Informatica Application Information Lifecycle Management, Informatica Complex Event Processing, Ultra Messaging and
Informatica Master Data Management are trademarks or registered trademarks of Informatica LLC in the United States and in jurisdictions throughout the world. All
other company and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties, including without limitation: Copyright DataDirect Technologies. All rights
reserved. Copyright Sun Microsystems. All rights reserved. Copyright RSA Security Inc. All Rights Reserved. Copyright Ordinal Technology Corp. All rights
reserved. Copyright Aandacht c.v. All rights reserved. Copyright Genivia, Inc. All rights reserved. Copyright Isomorphic Software. All rights reserved. Copyright Meta
Integration Technology, Inc. All rights reserved. Copyright Intalio. All rights reserved. Copyright Oracle. All rights reserved. Copyright Adobe Systems
Incorporated. All rights reserved. Copyright DataArt, Inc. All rights reserved. Copyright ComponentSource. All rights reserved. Copyright Microsoft Corporation. All
rights reserved. Copyright Rogue Wave Software, Inc. All rights reserved. Copyright Teradata Corporation. All rights reserved. Copyright Yahoo! Inc. All rights
reserved. Copyright Glyph & Cog, LLC. All rights reserved. Copyright Thinkmap, Inc. All rights reserved. Copyright Clearpace Software Limited. All rights
reserved. Copyright Information Builders, Inc. All rights reserved. Copyright OSS Nokalva, Inc. All rights reserved. Copyright Edifecs, Inc. All rights reserved.
Copyright Cleo Communications, Inc. All rights reserved. Copyright International Organization for Standardization 1986. All rights reserved. Copyright ej-technologies GmbH. All rights reserved. Copyright Jaspersoft Corporation. All rights reserved. Copyright International Business Machines Corporation. All rights
reserved. Copyright yWorks GmbH. All rights reserved. Copyright Lucent Technologies. All rights reserved. Copyright (c) University of Toronto. All rights reserved.
Copyright Daniel Veillard. All rights reserved. Copyright Unicode, Inc. Copyright IBM Corp. All rights reserved. Copyright MicroQuill Software Publishing, Inc. All
rights reserved. Copyright PassMark Software Pty Ltd. All rights reserved. Copyright LogiXML, Inc. All rights reserved. Copyright 2003-2010 Lorenzi Davide, All
rights reserved. Copyright Red Hat, Inc. All rights reserved. Copyright The Board of Trustees of the Leland Stanford Junior University. All rights reserved. Copyright
EMC Corporation. All rights reserved. Copyright Flexera Software. All rights reserved. Copyright Jinfonet Software. All rights reserved. Copyright Apple Inc. All
rights reserved. Copyright Telerik Inc. All rights reserved. Copyright BEA Systems. All rights reserved. Copyright PDFlib GmbH. All rights reserved. Copyright
Orientation in Objects GmbH. All rights reserved. Copyright Tanuki Software, Ltd. All rights reserved. Copyright Ricebridge. All rights reserved. Copyright Sencha,
Inc. All rights reserved. Copyright Scalable Systems, Inc. All rights reserved. Copyright jQWidgets. All rights reserved. Copyright Tableau Software, Inc. All rights
reserved. Copyright MaxMind, Inc. All Rights Reserved. Copyright TMate Software s.r.o. All rights reserved. Copyright MapR Technologies Inc. All rights reserved.
Copyright Amazon Corporate LLC. All rights reserved. Copyright Highsoft. All rights reserved. Copyright Python Software Foundation. All rights reserved.
Copyright BeOpen.com. All rights reserved. Copyright CNRI. All rights reserved.
This product includes software developed by the Apache Software Foundation (http://www.apache.org/), and/or other software which is licensed under various versions
of the Apache License (the "License"). You may obtain a copy of these Licenses at http://www.apache.org/licenses/. Unless required by applicable law or agreed to in
writing, software distributed under these Licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied. See the Licenses for the specific language governing permissions and limitations under the Licenses.
This product includes software which was developed by Mozilla (http://www.mozilla.org/), software copyright The JBoss Group, LLC, all rights reserved; software
copyright 1999-2006 by Bruno Lowagie and Paulo Soares and other software which is licensed under various versions of the GNU Lesser General Public License
Agreement, which may be found at http://www.gnu.org/licenses/lgpl.html. The materials are provided free of charge by Informatica, "as-is", without warranty of any
kind, either express or implied, including but not limited to the implied warranties of merchantability and fitness for a particular purpose.
The product includes ACE(TM) and TAO(TM) software copyrighted by Douglas C. Schmidt and his research group at Washington University, University of California,
Irvine, and Vanderbilt University, Copyright (c) 1993-2006, all rights reserved.
This product includes software developed by the OpenSSL Project for use in the OpenSSL Toolkit (copyright The OpenSSL Project. All Rights Reserved) and
redistribution of this software is subject to terms available at http://www.openssl.org and http://www.openssl.org/source/license.html.
This product includes Curl software which is Copyright 1996-2013, Daniel Stenberg, <[email protected]>. All Rights Reserved. Permissions and limitations regarding this
software are subject to terms available at http://curl.haxx.se/docs/copyright.html. Permission to use, copy, modify, and distribute this software for any purpose with or
without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.
The product includes software copyright 2001-2005 (c) MetaStuff, Ltd. All Rights Reserved. Permissions and limitations regarding this software are subject to terms available at http://www.dom4j.org/license.html.
The product includes software copyright 2004-2007, The Dojo Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to
terms available at http://dojotoolkit.org/license.
This product includes ICU software which is copyright International Business Machines Corporation and others. All rights reserved. Permissions and limitations
regarding this software are subject to terms available at http://source.icu-project.org/repos/icu/icu/trunk/license.html.
This product includes software copyright 1996-2006 Per Bothner. All rights reserved. Your right to use such materials is set forth in the license which may be found at
http://www.gnu.org/software/kawa/Software-License.html.
This product includes OSSP UUID software which is Copyright 2002 Ralf S. Engelschall, Copyright 2002 The OSSP Project, Copyright 2002 Cable & Wireless
Deutschland. Permissions and limitations regarding this software are subject to terms available at http://www.opensource.org/licenses/mit-license.php.
This product includes software developed by Boost (http://www.boost.org/) or under the Boost software license. Permissions and limitations regarding this software are
subject to terms available at http://www.boost.org/LICENSE_1_0.txt.
This product includes software copyright 1997-2007 University of Cambridge. Permissions and limitations regarding this software are subject to terms available at
http://www.pcre.org/license.txt.
This product includes software copyright 2007 The Eclipse Foundation. All Rights Reserved. Permissions and limitations regarding this software are subject to terms
available at http://www.eclipse.org/org/documents/epl-v10.php and at http://www.eclipse.org/org/documents/edl-v10.php.
Table of Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Informatica Resources. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Informatica My Support Portal. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Informatica Documentation. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Informatica Product Availability Matrixes. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Informatica Web Site. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica How-To Library. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica Knowledge Base. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica Support YouTube Channel. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica Marketplace. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica Velocity. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Informatica Global Customer Support. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
Preface
The Informatica SQL Data Service Guide discusses SQL data services, virtual data, configuration, connecting to an SQL data service with a third-party tool, and troubleshooting, and it provides instructions for performing these tasks. This guide is intended for data service developers. It assumes that you have an understanding of the flat files and relational databases in your environment.
Informatica Resources
Informatica My Support Portal
As an Informatica customer, the first step in reaching out to Informatica is through the Informatica My Support
Portal at https://mysupport.informatica.com. The My Support Portal is the largest online data integration
collaboration platform with over 100,000 Informatica customers and partners worldwide.
As a member, you can:
Search the Knowledge Base, find product documentation, access how-to documents, and watch support
videos.
Find your local Informatica User Group Network and collaborate with your peers.
Informatica Documentation
The Informatica Documentation team makes every effort to create accurate, usable documentation. If you
have questions, comments, or ideas about this documentation, contact the Informatica Documentation team
through email at [email protected]. We will use your feedback to improve our
documentation. Let us know if we can contact you regarding your comments.
The Documentation team updates documentation as needed. To get the latest documentation for your
product, navigate to Product Documentation from https://mysupport.informatica.com.
Informatica Marketplace
The Informatica Marketplace is a forum where developers and partners can share solutions that augment,
extend, or enhance data integration implementations. By leveraging any of the hundreds of solutions
available on the Marketplace, you can improve your productivity and speed up time to implementation on
your projects. You can access Informatica Marketplace at http://www.informaticamarketplace.com.
Informatica Velocity
You can access Informatica Velocity at https://mysupport.informatica.com. Developed from the real-world
experience of hundreds of data management projects, Informatica Velocity represents the collective
knowledge of our consultants who have worked with organizations from around the world to plan, develop,
deploy, and maintain successful data management solutions. If you have questions, comments, or ideas
about Informatica Velocity, contact Informatica Professional Services at [email protected].
The telephone numbers for Informatica Global Customer Support are available from the Informatica web site
at http://www.informatica.com/us/services-and-training/support-services/global-support-centers/.
CHAPTER 1
Virtual tables
Virtual table mappings that define the flow of data between sources and a virtual table
Virtual stored procedures that take optional inputs through parameters, transform the data, and optionally
return output through parameters
To make the virtual tables and virtual stored procedures available to you, a developer creates and deploys an
application that contains the SQL data service. The developer deploys the application to a Data Integration
Service and an administrator runs the application. When the application is running, you can query the virtual
tables and run the virtual stored procedures in the SQL data service.
When you query a virtual table or run a virtual stored procedure, the JDBC or ODBC driver sends the request
to the Data Integration Service. By default, the driver uses HTTP to communicate with the Data Integration
Service. If an administrator enables Transport Layer Security (TLS) for the domain, the driver uses TLS to
communicate with the Data Integration Service.
Before you can run SQL queries or virtual stored procedures for the first time, you must configure the
machine from which you want to access the SQL data service. You must also configure the client tool so that
it can connect to the SQL data service.
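Once the client machine and tool are configured, connecting follows the standard ODBC pattern. The sketch below is a minimal illustration in Python: the DSN name and user name are hypothetical, and the actual `pyodbc.connect` call (which requires the driver and a running Data Integration Service) is shown only in a comment.

```python
# Sketch: assembling an ODBC connection string for an SQL data service.
# Assumes an administrator has already installed the ODBC driver and
# defined a DSN; "sqlds_customer" and "analyst" are hypothetical names.

def build_connection_string(dsn, uid=None, pwd=None):
    """Assemble a key=value ODBC connection string from a configured DSN."""
    parts = ["DSN=" + dsn]
    if uid:
        parts.append("UID=" + uid)
    if pwd:
        parts.append("PWD=" + pwd)
    return ";".join(parts)

conn_str = build_connection_string("sqlds_customer", uid="analyst")
print(conn_str)  # DSN=sqlds_customer;UID=analyst

# With pyodbc installed and the DSN defined, a client tool (or script)
# would then connect and query the virtual tables:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
#   rows = conn.execute("SELECT * FROM customer_schema.customer").fetchall()
```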
14
CHAPTER 2
Virtual Data
This chapter includes the following topics:
Virtual Tables, 17
Define a uniform view of data that you can expose to end users.
Define the virtual flow of data between the sources and the virtual tables. Transform and standardize the
data.
Provide end users with access to the data. End users can use a JDBC or ODBC client tool to run SQL
queries against the virtual tables as if they were actual, physical database tables.
Isolate the data from changes in data structures. You can add the virtual database to a self-contained
application. If you make changes to the virtual database in the Developer tool, the virtual database in the
application does not change until you redeploy it.
To create a virtual database, you must create an SQL data service. An SQL data service contains the virtual
schemas and the virtual tables or stored procedures that define the database structure. If the virtual schema
contains virtual tables, the SQL data service also contains virtual table mappings that define the flow of data
between the sources and the virtual tables.
After you create an SQL data service, you add it to an application and deploy the application to make the
SQL data service accessible by end users.
End users can query the virtual tables or run the stored procedures in the SQL data service by entering an
SQL query in a third-party client tool. When the user enters the query, the Data Integration Service retrieves
virtual data from the sources or from cache tables, if an administrator specifies that any of the virtual tables
should be cached.
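From the client side, querying a virtual table looks the same as querying a physical table. The following sketch illustrates the idea using Python's sqlite3 as a stand-in for a third-party client tool, with an in-memory table standing in for a deployed virtual table (the table and column names are illustrative, not part of the product):

```python
import sqlite3

# Stand-in for a deployed SQL data service: an in-memory database
# with a table playing the role of a virtual table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INTEGER, last_name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [(1, "Smith"), (2, "Jones")])

# An end user issues plain SQL, exactly as against a physical table.
rows = conn.execute(
    "SELECT last_name FROM customers WHERE customer_id = 2").fetchall()
print(rows)  # [('Jones',)]
```

Whether the Data Integration Service answers the query from the sources or from cache tables is transparent to the client.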
Virtual tables. The virtual tables in the database. You can create virtual tables from physical or logical
data objects, or you can create virtual tables manually.
Virtual table mappings. Mappings that link a virtual table to source data and define the data flow between
the sources and the virtual table. If you create a virtual table from a data object, you can create a virtual
table mapping to define data flow rules between the data object and the virtual table. If you create a virtual
table manually, you must create a virtual table mapping to link the virtual table with source data and
define data flow.
Virtual stored procedures. Sets of data flow instructions that allow end users to perform calculations or
retrieve data.
2.
3.
4.
Create or update virtual table mappings to define the data flow between data objects and the virtual
tables.
5.
6.
2.
3.
4.
Click Next.
5.
6.
To create virtual tables in the SQL data service, click Next. To create an SQL data service without virtual
tables, click Finish.
If you click Next, the New SQL Data Service dialog box appears.
7.
8.
9.
10.
11.
12.
Select Read in the Data Access column to link the virtual table with the data object. Select None if you
do not want to link the virtual table with the data object.
13.
14.
Click Finish.
The Developer tool creates the SQL data service.
Virtual Tables
A virtual table is a table in a virtual database. Create a virtual table to define the structure of the data.
Create one or more virtual tables within a schema. If a schema contains multiple virtual tables, you can
define primary key-foreign key relationships between tables.
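A primary key-foreign key relationship of the kind described above can be expressed in standard SQL. The sketch below uses Python's sqlite3 as a stand-in database; the dept and emp tables are hypothetical examples, not objects defined by the product:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
# Two tables in one schema with a primary key-foreign key relationship,
# analogous to related virtual tables in a virtual schema.
conn.execute("CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, dept_name TEXT)")
conn.execute("""CREATE TABLE emp (
    emp_id INTEGER PRIMARY KEY,
    dept_id INTEGER REFERENCES dept(dept_id))""")
conn.execute("INSERT INTO dept VALUES (1, 'Marketing')")
conn.execute("INSERT INTO emp VALUES (10, 1)")
# The relationship supports joins between the related tables.
rows = conn.execute(
    "SELECT e.emp_id, d.dept_name FROM emp e "
    "INNER JOIN dept d ON e.dept_id = d.dept_id").fetchall()
print(rows)  # [(10, 'Marketing')]
```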
You can create virtual tables manually or from physical or logical data objects. Each virtual table has a data
access method. The data access method defines how the Data Integration Service retrieves data. When you
manually create a virtual table, the Developer tool creates an empty virtual table and sets the data access
method to none.
When you create a virtual table from a data object, the Developer tool creates a virtual table with the same
columns and properties as the data object. The Developer tool sets the data access method to read. If you
change columns in the data object, the Developer tool updates the virtual table with the same changes. The
Developer tool does not update the virtual table if you change the data object name or description.
To define data transformation rules for the virtual table, set the data access method to custom. The
Developer tool prompts you to create a virtual table mapping.
You can preview virtual table data when the data access method is read or custom.
Description
Read
The virtual table is linked to a physical or logical data object without data transformation. If
you add, remove, or change a column in the data object, the Developer tool makes the
same change to the virtual table. However, if you change primary key-foreign key
relationships, change the name of the data object, or change the data object description, the
Developer tool does not update the virtual table.
If you change the data access method to read, the Developer tool prompts you to choose a
data object. If the virtual table has a virtual table mapping, the Developer tool deletes the
virtual table mapping.
When an end user queries the virtual table, the Data Integration Service retrieves data from
the data object.
Custom
The virtual table is linked to a physical or logical data object through a virtual table mapping.
If you update the data object, the Developer tool does not update the virtual table.
If you change the data access method to custom, the Developer tool prompts you to create
a virtual table mapping.
When an end user queries the virtual table, the Data Integration Service applies any
transformation rule defined in the virtual table mapping to the source data. It returns the
transformed data to the end user.
2.
3.
Drag a physical or logical data object from the Object Explorer view to the editor.
The Add Data Objects to SQL Data Service dialog box appears. The Developer tool lists the data
object in the Data Object column.
4.
5.
Click Finish.
The Developer tool places the virtual table in the editor and sets the data access method to read.
2.
3.
4.
5.
Click Finish.
The virtual table appears in the Schema view.
6.
To add a column to the virtual table, right-click Columns and click New.
Verify that the virtual column names are not reserved words in the SQL standard.
7.
To make a column a primary key, click the blank space to the left of the column name.
2.
3.
Click the column you want to assign as a foreign key in one table. Drag the pointer from the foreign key
column to the primary key column in another table.
The Developer tool uses an arrow to indicate a relationship between the tables. The arrow points to the
primary key table.
2.
3.
4.
5.
6.
Click Run.
The following table shows examples of SQL statements that you can use to insert literal data and query
data into a temporary table:
Type
Description
Literal
data
Literals describe a user or system-supplied string or value that is not an identifier or keyword.
Use strings, numbers, dates, or boolean values when you insert literal data into a temporary
table. Use the following statement format to insert literal data into a temporary table:
INSERT INTO <TABLENAME> <OPTIONAL COLUMN LIST> VALUES (<VALUE
LIST>), (<VALUE LIST>)
For example, INSERT INTO temp_dept (dept_id, dept_name, location)
VALUES (2, 'Marketing', 'Los Angeles').
Query
data
You can query an SQL data service and insert data from the query into a temporary table. Use
the following statement format to insert query data into a temporary table:
INSERT INTO <TABLENAME> <OPTIONAL COLUMN LIST> <SELECT QUERY>
For example, INSERT INTO temp_dept(dept_id, dept_name, location) SELECT
dept_id, dept_name, location from dept where dept_id = 99.
You can use a set operator, such as UNION, in the SQL statement when you insert query data
into a temporary table. Use the following statement format when you use a set operator:
INSERT INTO <TABLENAME> <OPTIONAL COLUMN LIST> (<SELECT QUERY>
<SET OPERATOR> <SELECT QUERY>)
For example, INSERT INTO temp_dept select * from north_america_dept
UNION select * from asia_dept.
You can specify schema and default schema for a temporary table.
You can place the primary key, NULL, NOT NULL, and DEFAULT constraints on a temporary table.
You cannot place a foreign key or CHECK and UNIQUE constraints on a temporary table.
You cannot issue a query that contains a common table expression or a correlated subquery against a
temporary table.
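The INSERT patterns above can be tried against any SQL database. This sketch uses Python's sqlite3, with a SQLite TEMP table standing in for an SQL data service temporary table; the dept and temp_dept tables are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dept (dept_id INTEGER, dept_name TEXT, location TEXT)")
conn.execute("INSERT INTO dept VALUES (99, 'Sales', 'Boston')")

# Stand-in for an SQL data service temporary table.
conn.execute("CREATE TEMP TABLE temp_dept "
             "(dept_id INTEGER, dept_name TEXT, location TEXT)")

# Literal data: INSERT INTO <table> <column list> VALUES (<value list>)
conn.execute("INSERT INTO temp_dept (dept_id, dept_name, location) "
             "VALUES (2, 'Marketing', 'Los Angeles')")

# Query data: INSERT INTO <table> <column list> <select query>
conn.execute("INSERT INTO temp_dept (dept_id, dept_name, location) "
             "SELECT dept_id, dept_name, location FROM dept WHERE dept_id = 99")

rows = conn.execute("SELECT dept_id FROM temp_dept ORDER BY dept_id").fetchall()
print(rows)  # [(2,), (99,)]
```

The same INSERT ... SELECT form extends to set operators such as UNION, as described above.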
Sources. Physical or logical data objects that describe the characteristics of source tables or files. A
virtual table mapping must contain at least one source.
Transformations. Objects that define the rules for data transformation. Use different transformation objects
to perform different functions. Transformations are optional in a virtual table mapping.
Links. Connections between columns that define virtual data flow between sources, transformations, and
the virtual table.
Example
You want to make order information available to one of your customers.
The orders information is stored in a relational database table that contains information for several
customers. The customer is not authorized to view the orders information for other customers.
Create an SQL data service to retrieve the orders information. Create a virtual table from the orders table and
set the data access method to custom. Add a Filter transformation to the virtual table mapping to remove
orders data for the other customers.
After you create and deploy an application that contains the SQL data service, the customer can query the
virtual table that contains his orders information.
2.
3.
4.
Open the SQL data service that contains the virtual table for which you want to create a virtual table
mapping.
2.
3.
In the Tables section, change the data access method for the virtual table to Custom.
The New Virtual Table Mapping dialog box appears.
4.
5.
Click Finish.
The Developer tool creates a view for the virtual table mapping and places the virtual table in the editor.
If you created the virtual table from a data object, the Developer tool adds the data object to the mapping
as a source.
6.
To add sources to the mapping, drag data objects from the Object Explorer view into the editor.
You can add logical or physical data objects as sources.
7.
Optionally, add transformations to the mapping by dragging them from the Object Explorer view or
Transformation palette into the editor.
8.
Link columns by selecting a column in a source or transformation and dragging it to a column in another
transformation or the virtual table.
The Developer tool uses an arrow to indicate the columns are linked.
2.
3.
4.
If the Validation Log view lists errors, correct the errors and revalidate the virtual table mapping.
1.
Open the SQL data service that contains the virtual table mapping.
2.
3.
Select the object for which you want to preview output. You can select a transformation or the virtual
table.
4.
5.
Click Run.
The Developer tool displays results in the Output section.
Inputs. Objects that pass data into the virtual stored procedure. Inputs can be input parameters, Read
transformations, or physical or logical data objects. Input parameters pass data to the stored procedure.
Read transformations extract data from logical data objects. A virtual stored procedure must contain at
least one input.
Transformations. Objects that define the rules for data transformation. Use different transformation objects
to perform different functions. Transformations are optional in a virtual stored procedure.
Outputs. Objects that pass data out of a virtual stored procedure. Outputs can be output parameters,
Write transformations, or physical or logical data objects. Output parameters receive data from the stored
procedure. Write transformations write data to logical data objects. A virtual stored procedure must
contain at least one output. Virtual stored procedures do not return result sets.
Links. Connections between ports that define virtual data flow between inputs, transformations, and
outputs.
Example
An end user needs to update customer email addresses for customer records stored in multiple relational
databases.
To allow the end user to update the email addresses, first create a logical data object model to define a
unified view of the customer. Create a logical data object that represents a union of the relational tables.
Create a logical data object write mapping to write to the relational tables. Add a Router transformation to
determine which relational table contains the customer record the end user needs to update.
Next, create an SQL data service. In the SQL data service, create a virtual stored procedure that contains
input parameters for the customer ID and email address. Create a Write transformation based on the logical
data object and add it to the virtual stored procedure as output.
Finally, deploy the SQL data service. The end user can call the virtual stored procedure through a third-party
client tool. The end user passes the customer ID and updated email address to the virtual stored procedure.
The virtual stored procedure uses the Write transformation to update the logical data object. The logical data
object write mapping determines which relational table to update based on the customer ID and updates the
customer email address in the correct table.
2.
Add inputs, transformations, and outputs to the virtual stored procedure, and link the ports.
3.
4.
In the Outline view for an SQL data service, right-click the data service and select New > Virtual Stored
Procedure.
The New Virtual Stored Procedure dialog box appears.
2.
3.
4.
If the virtual stored procedure has input parameters or output parameters, select the appropriate option.
5.
Click Finish.
The Developer tool opens the virtual stored procedure in the editor. If you select input parameters or
output parameters, the Developer tool adds an Input Parameter transformation or an Output Parameter
transformation, or both, in the editor.
6.
7.
8.
Optionally, add transformations to the virtual stored procedure by dragging them from the Object
Explorer view or the Transformation palette into the editor.
9.
Link ports by selecting a port in a source or transformation and dragging it to a port in another
transformation or target.
The Developer tool uses an arrow to indicate the ports are linked.
2.
3.
If the Validation Log view lists errors, correct the errors and revalidate the virtual stored procedure.
2.
3.
If the virtual stored procedure contains input parameters, enter them in the Input section.
4.
Click Run.
The Developer tool displays results in the Output section.
Push SQL set operations such as UNION, UNION ALL, DISTINCT, INTERSECT, and MINUS to relational
data objects.
Push SQL keyword LIMIT to IBM DB2, MS SQL, and Oracle relational data objects.
You can view the original query plan and the optimized query plan from the Data Viewer view. The resulting
optimized query can contain different transformations or transformations in a different order. The Data
Integration Service can push transformations and SQL operations to the relational data object to minimize
data read from the source.
You can configure different optimizer levels in the Developer tool. Different optimizer levels produce different
queries. The query optimization depends on the optimizer level that you select and the complexity of the
query. When you run a simple query against a virtual table, different optimizer levels might produce the same
optimized query. When you run a query that contains multiple clauses and subqueries, different optimizer
levels produce different optimized queries.
The non-optimized representation displays the query plan based on the query you enter with the DISTINCT
operation.
The following figure shows the SQL query plan that appears in the Optimized tab:
The optimized representation displays the query plan as the Data Integration Service runs it. The Data
Integration Service pushes the DISTINCT operation to the source to increase performance.
Filter Transformation
You want to query the CUSTOMERS virtual table in an SQL data service to filter and order customer data.
The Data Integration Service can push transformations such as a Filter transformation to the relational data
object with the normal optimizer level.
You can enter the following query in the Data Viewer view:
select * from CUSTOMERS where CUSTOMER_ID > 150000 order by LAST_NAME
The following figure shows the SQL query plan that appears in the Non-Optimized tab:
The non-optimized representation displays the query plan based on the query you enter. The Developer tool
displays the WHERE clause as a Filter transformation and the ORDER BY clause as a Sorter transformation.
The Developer tool uses a pass-through Expression transformation to rename ports.
The following figure shows the optimized SQL query plan that appears in the Optimized tab:
The optimized representation displays the query plan as the Data Integration Service runs it. Because the
optimizer level is normal, the Data Integration Service pushes the filter condition to the source. Pushing the
filter condition improves query performance because it reduces the number of rows that the Data Integration
Service reads from the source.
As in the non-optimized query, the Developer tool displays the ORDER BY clause as a Sorter transformation.
The Data Integration Service uses pass-through Expression transformations to enforce the data types that
you configure in the logical transformations.
Open an SQL data service that contains at least one virtual table.
2.
3.
4.
Optionally, select a data viewer configuration that contains the optimizer level you want to apply to the
query.
5.
6.
CHAPTER 3
SQL Syntax
This chapter includes the following topics:
Datatypes
Operators
Functions
Queries
Reserved Words
Escape Syntax
Virtual tables
Virtual table mappings that define the flow of data between sources and a virtual table
Virtual stored procedures that take optional inputs through parameters, transform the data, and optionally
return output through parameters
To allow end users to query the virtual tables and run the virtual stored procedures in an SQL data service, a
developer creates and deploys an application that contains the SQL data service. The developer deploys the
application to a Data Integration Service and an administrator runs the application. When the application is
running, end users can make SQL queries against the virtual tables and run the virtual stored procedures in
the SQL data service.
SQL data services support ANSI SQL-92 operators, functions, statements, and keywords.
Datatypes
SQL data services support common SQL datatypes.
SQL data services support the following datatypes:
Bigint
Binary
Boolean
Char
Date
Decimal
Double
Int
Time
Timestamp
Varchar
Operators
SQL data services support common operators. Use operators to perform mathematical computations,
combine data, or compare data.
SQL data services support the following operators in an SQL query:
|| (concatenate strings)
BETWEEN
CASE
EXISTS
IN, NOT IN
Functions
You can use SQL and Informatica functions to run queries against an SQL data service.
Some functions are SQL and Informatica equivalents, such as the ABS function. Some functions are unique
to ANSI SQL or to Informatica.
Note: You cannot use filter conditions with Informatica functions in an SQL query.
The following table provides the syntax and functions that you can use to query an SQL data service:
Function
Syntax
ABS
ABS( numeric_value )
Description
Returns the absolute value of a numeric value.
Informatica and SQL function.
ADD_TO_DATE
ADD_TO_DATE( date,
format, amount )
ASCII
ASCII ( string )
AVG
AVG( numeric_value )
CASE
(Simple)
CASE
(Searched)
CEIL
CASE input_expression
WHEN
when_expression THEN
result_expression
[ ...n ]
[ ELSE
else_result_expression
]
END
CASE
WHEN
Boolean_expression
THEN
result_expression
[ ...n ]
[ ELSE
else_result_expression
]
END
CEIL( numeric_value )
CHAR_LENGTH
CHAR_LENGTH(
numeric_value )
CHR
CHR( numeric_value )
CHRCODE
CHRCODE ( string )
Function
Syntax
Description
COALESCE
COALESCE
( first_argument,
second_argument[,
third_argument, ...] )
CONCAT
CONVERT_BASE
COS
CONCAT( first_string,
second_string )
CONVERT_BASE( string,
source_base,
dest_base )
COS( numeric_value )
SQL function.
Concatenates two strings.
Informatica and SQL function.
Converts a non-negative numeric string from one
base value to another base value.
Informatica and SQL function.
Returns the cosine, expressed in radians, of a
numeric value.
Informatica and SQL function.
COSH
COSH( numeric_value )
COUNT
COUNT( value )
CRC32
CRC32( value )
CUME
CUME( numeric_value )
CURRENT_DATE
CURRENT_DATE
CURRENT_TIME
CURRENT_TIME
Function
Syntax
Description
CURRENT_TIMEST
AMP
CURRENT_TIMESTAMP
DATE_COMPARE
DATE_COMPARE( date1,
date2 )
EXP
DATE_DIFF( date1,
date2, format )
EXP( exponent )
EXTRACT
EXTRACT( YEAR|MONTH|
DAY|HOUR|MINUTE|
SECOND FROM date )
FLOOR
FLOOR( numeric_value )
FV
GET_DATE_PART
GET_DATE_PART( date,
format )
INITCAP
INITCAP( string )
Function
Syntax
Description
INSTR
INSTR( string,
search_value [,start
[,occurrence
[,comparison_type ]]]
)
IS_DATE
IS_NUMBER
IS_DATE( value
[,format] )
IS_NUMBER( value )
IS_SPACES
IS_SPACES( value )
ISNULL
ISNULL( value )
ISNUMERIC
ISNUMERIC( value )
LAST_DAY
LAST_DAY( date )
Returns the date of the last day of the month for each
date in a column.
Informatica and SQL function.
LN
LN( numeric_value )
LOCATE
LOCATE( string,
search_value )
LOG
LOWER
LOWER( string )
LPAD
LTRIM
LPAD( first_string,
length
[,second_string] )
LTRIM( string [,
trim_set] )
Function
Syntax
Description
MAKE_DATE_TIME
MAKE_DATE_TIME( year,
month, day, hour,
minute, second,
nanosecond )
MAX( value )
MAX
SQL function.
MD5
MD5( value )
METAPHONE
METAPHONE( string
[,length] )
MIN
MIN( value )
MOD
MOVINGAVG
MOVINGSUM
NPER
MOD( numeric_value,
divisor )
MOVINGAVG(
numeric_value,
rowset )
MOVINGSUM(
numeric_value,
rowset )
Informatica function.
Informatica function.
POSITION
POWER
POWER( base,
exponent )
PV
Function
Syntax
Description
RAND
RAND( seed )
RATE
REG_EXTRACT
REG_EXTRACT( subject,
'pattern',
subPatternNum )
REG_MATCH
REG_MATCH( subject,
pattern )
REG_REPLACE
REG_REPLACE( subject,
pattern, replace,
numReplacements )
REPLACECHR
ROUND (dates)
ROUND (numbers)
REPLACECHR( CaseFlag,
InputString,
OldCharSet, NewChar )
ROUND( date
[,format] )
ROUND( numeric_value
[, precision] )
RPAD
RTRIM
RPAD( first_string,
length
[,second_string] )
RTRIM( string [,
trim_set] )
SET_DATE_PART
SET_DATE_PART( date,
format, value )
Function
Syntax
Description
SIGN
SIGN( numeric_value )
SIN
SIN( numeric_value )
SINH
SINH( numeric_value )
SOUNDEX
SOUNDEX( string )
SQRT( numeric_value )
SUBSTR
SUM
SUM( numeric_value )
TAN
TAN( numeric_value )
TANH
TANH( numeric_value )
TO_BIGINT
TO_BIGINT( value [,
flag] )
TO_CHAR
TO_CHAR( value )
Function
Syntax
Description
TO_DATE
TO_DATE( string [,
format] )
TO_DECIMAL
TO_FLOAT
TO_DECIMAL( value [,
scale] )
TO_FLOAT( value )
TO_INTEGER
TO_INTEGER( value [,
flag] )
TRIM
TRIM( [operand]
string )
TRUNC( date
[,format] )
TRUNC (numbers)
UPPER
TRUNC( numeric_value
[, precision] )
UPPER( string )
FROM
GROUP BY
HAVING
ORDER BY
WHERE
ALL
CROSS JOIN
DISTINCT
EXCEPT
INNER JOIN
INTERSECT
LIMIT
MINUS
Queries
You can issue non-correlated subqueries, correlated subqueries, and parameterized queries when you query
virtual tables and run virtual stored procedures in an SQL data service.
Non-Correlated Subqueries
A non-correlated subquery is a subquery that is not dependent on the outer query. Use non-correlated
subqueries to filter or modify data when you query virtual tables in an SQL data service.
You can use non-correlated subqueries in the following places:
Expressions
BETWEEN operator
CASE operator
FROM clause
HAVING clause
SELECT statement
WHERE clause
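A minimal illustration of a non-correlated subquery in a WHERE clause, using Python's sqlite3 as a stand-in for a client querying a data service (the nation and region tables are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nation (n_nationkey INTEGER, n_regionkey INTEGER)")
conn.execute("CREATE TABLE region (r_regionkey INTEGER)")
conn.executemany("INSERT INTO nation VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])
conn.executemany("INSERT INTO region VALUES (?)", [(10,), (30,)])

# Non-correlated subquery: the inner SELECT does not reference the outer
# query, so it can be evaluated once, independently of the outer rows.
rows = conn.execute(
    "SELECT n_nationkey FROM nation "
    "WHERE n_regionkey IN (SELECT r_regionkey FROM region) "
    "ORDER BY n_nationkey").fetchall()
print(rows)  # [(1,), (3,)]
```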
Correlated Subqueries
A correlated subquery is a subquery that uses values from the outer query in its WHERE clause. The subquery
is evaluated once for each row processed by the outer query. Use correlated subqueries to filter or modify
data when you query virtual tables in an SQL data service.
You can issue a correlated subquery from an ODBC client, JDBC client, or from the query plan window in the
Developer tool.
The following table provides examples of the types of correlated subqueries that you can issue against an
SQL data service:
Type
Description
IN
A correlated subquery that uses the IN keyword within an SQL WHERE clause to select rows
from the values returned by the correlated subquery.
For example, SELECT * FROM vs.nation a WHERE a.n_regionkey IN (SELECT
distinct b.r_regionkey FROM vs.region b WHERE b.r_regionkey =
a.n_regionkey).
Quantified
comparison
A correlated subquery that contains a comparison operator within an SQL WHERE clause.
For example, SELECT n_name FROM vs.nation a WHERE 2 > (SELECT 1 FROM
vs.nation b WHERE a.n_nationkey=b.n_nationkey).
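The IN-style correlated subquery above can be run in any SQL engine. This sketch uses Python's sqlite3 with plain nation and region tables standing in for the vs.nation and vs.region virtual tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nation (n_name TEXT, n_regionkey INTEGER)")
conn.execute("CREATE TABLE region (r_regionkey INTEGER)")
conn.executemany("INSERT INTO nation VALUES (?, ?)",
                 [("FRANCE", 1), ("JAPAN", 2)])
conn.execute("INSERT INTO region VALUES (1)")

# Correlated subquery: the inner query references a.n_regionkey from the
# outer query, so it is evaluated once for each outer row.
rows = conn.execute(
    "SELECT n_name FROM nation a "
    "WHERE a.n_regionkey IN (SELECT DISTINCT b.r_regionkey FROM region b "
    "WHERE b.r_regionkey = a.n_regionkey)").fetchall()
print(rows)  # [('FRANCE',)]
```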
[Table: an example query shown in non-flattened and flattened form]
The Data Integration Service can flatten a correlated subquery into a normal join when it meets the following
requirements:
It does not contain a GROUP BY clause, aggregates in a SELECT list, or an EXIST or NOT IN logical
operator.
It generates unique results. One column in the correlated subquery is a primary key. For example, if the
r_regionkey column is a primary key for the vs.region virtual table, you can issue the following query:
SELECT * FROM vs.nation WHERE n_regionkey IN (SELECT b.r_regionkey FROM vs.region b WHERE
b.r_regionkey = n_regionkey).
If it contains a FROM list, each table in the FROM list is a virtual table in the SQL data service.
Parameterized Queries
A parameterized query uses a precompiled SQL statement with placeholders for values that change.
Parameterized queries can improve processing efficiency and protect the database from SQL injection
attacks. You can use prepared statements and call stored procedures in a parameterized query that you run
against an SQL data service.
Define parameters in the PreparedStatement or CallableStatement object in a JDBC program or in a
statement handle prepared by SQLPrepare for an ODBC program. Use the PreparedStatement object to
store a precompiled SQL statement that you can run multiple times. Use the CallableStatement object to
call stored procedures.
You can use standard method calls and set methods in the PreparedStatement object of the parameterized
query.
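The prepare-once, run-many pattern described above looks like the following sketch, which uses Python's sqlite3 and '?' placeholders in place of a JDBC PreparedStatement (the dept table is illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dept (dept_id INTEGER, dept_name TEXT)")
conn.executemany("INSERT INTO dept VALUES (?, ?)",
                 [(1, "Sales"), (2, "Marketing")])

# Parameterized query: the statement is written once with placeholders
# and executed repeatedly with different parameter values. Binding values
# separately from the SQL text also protects against SQL injection.
stmt = "SELECT dept_name FROM dept WHERE dept_id = ?"
results = [conn.execute(stmt, (dept_id,)).fetchone() for dept_id in (1, 2)]
print(results)  # [('Sales',), ('Marketing',)]
```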
An SQL data service accepts common datatypes when you configure default values for parameters in stored
procedures. The date, time, and timestamp datatypes default to the ISO format.
You cannot use the following items in a parameterized query that you run against an SQL data service:
Array datatype
Reserved Words
Some keywords are reserved for specific functions.
The following words are reserved words:
To use reserved words in an SQL query, enclose the word in double quotation marks.
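Double-quoting a reserved word works the same way in most SQL dialects; a quick illustration with Python's sqlite3 (the table and column names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# "order" is a reserved word; enclosing it in double quotation marks
# lets it be used as a column identifier.
conn.execute('CREATE TABLE t ("order" INTEGER)')
conn.execute('INSERT INTO t ("order") VALUES (7)')
rows = conn.execute('SELECT "order" FROM t').fetchall()
print(rows)  # [(7,)]
```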
Escape Syntax
SQL data services support escape clauses for functions, date formats, time formats, and timestamp formats.
An escape clause contains a keyword enclosed in curly brackets.
The following table lists the keywords you can use in an escape clause:
Category
Keyword
Syntax
Functions
fn
Date formats
{d 'value'}
The format for the date value must match the SQL data service default date
format. Therefore, if the default date format for the SQL data service is YYYY-MM-DD, the date value must include a 4-digit year.
For example:
SELECT * FROM Orders WHERE OrderDate > {d
'2005-01-01'}
Time formats
{t 'value'}
The format for the time value must match the SQL data service default time
format. Therefore, if the default time format for the SQL data service is
HH:MI:SS, the time value cannot include fractional seconds.
For example:
SELECT * FROM Orders WHERE OrderTime < {t '12:00:00'}
Timestamp
formats
{ts 'value'}
ts
The format for the timestamp value must match the SQL data service default
timestamp format. Therefore, if the default timestamp format for the SQL data
service is YYYY-MM-DD HH:MI:SS, the timestamp value cannot include
fractional seconds.
For example:
SELECT * FROM Sales WHERE TransactTime > {ts
'2010-01-15 12:00:00'}
Syntax
Description
CURTIMESTAMP
CURTIMESTAMP()
EXP
EXP( exponent )
Function
Syntax
Description
EXTRACT
EXTRACT( YEAR|MONTH|
DAY|HOUR|MINUTE|
SECOND FROM date )
FLOOR
FLOOR( numeric_value )
LCASE
LCASE( string )
LENGTH
LENGTH( string )
LOCATE
LOCATE( string,
search_value )
LOG
LOG( numeric_value )
LTRIM
LTRIM( string )
MOD
MOD( numeric_value,
divisor )
POWER
POWER( base,
exponent )
RTRIM
RTRIM( string )
SIN
SIN( numeric_value )
SINH
SINH( numeric_value )
SQRT
SQRT( numeric_value )
SUBSTRING
SUBSTRING( string,
start [,length] )
TAN
TAN( numeric_value )
TANH
TANH( numeric_value )
Function
Syntax
Description
TRIM
TRIM( [operand]
string )
UCASE( string )
The following table describes how to enable the dumpMapping parameter for each SQL data service
connection type:
Connection Type
Method
JDBC connections
ODBC connections on
Windows
Enter the following value in the Optional Parameters field in the Create a
New Data Source window: dumpMapping=true
I entered an SQL query that converts a large number to a binary value using the CONVERT_BASE function, and the result
is truncated.
Use the CAST() function when converting large numbers to binary. For example, the following
CONVERT_BASE query converts 2222 from base 10 to base 2:
CAST(CONVERT_BASE( 2222, 10, 2 ) AS VARCHAR(100))
When I use the TO_DECIMAL function to convert a string or numeric value to a decimal value, the query fails with a decimal
overflow error, or the query returns an unexpected decimal value.
Use the CAST() function to change the SQL statement when you use the TO_DECIMAL function in an
SQL query. For example, the following TO_DECIMAL query uses the CAST function to return the
decimal value 60.250:
CAST(TO_DECIMAL(60 + .25, 3) AS DECIMAL(38,3))
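The underlying principle, stating the scale of a decimal result explicitly instead of relying on a default, can be illustrated outside the SQL data service with Python's decimal module:

```python
from decimal import Decimal

# Quantizing to three decimal places plays the role of converting with an
# explicit scale of 3: the result carries exactly the stated precision.
value = Decimal(60) + Decimal("0.25")
result = value.quantize(Decimal("0.001"))
print(result)  # 60.250
```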
CHAPTER 4
JDBC Connections
ODBC Connections
2.
3.
4.
Installation DVD. Download the Informatica zip or tar file from the installation DVD to a directory on your
machine and then extract the installer files. Or, extract the installer files directly from the DVD to a
directory on your machine.
FTP download. Download the Informatica installation zip or tar file from the Informatica Electronic
Software Download site to a directory on your machine and then extract the installer files.
Required Information
Description
Data Integration
Service name
Data Integration Service that runs the application that contains the SQL data service.
Name of the SQL data service that contains the virtual tables you want to query or the
virtual stored procedures that you want to run. The run-time SQL data service name
includes the application name that contains the SQL data service and uses the
following format: <application name>.<SQL data service name>
User name
User password
Required Information
Description
Truststore file
If the Informatica domain has secure communication enabled, you must have the
location of the truststore file that contains the SSL certificate for the domain.
Authentication type
The mode of authentication used to connect to the SQL data service. You can select
one of the following authentication modes:
Native or LDAP Authentication
Uses an Informatica domain user account to connect to the SQL data service in an
Informatica domain that uses Native or LDAP authentication. The user account can
be in a native or LDAP security domain.
Kerberos with keytab
Uses the service principal name (SPN) of an Informatica domain user account to
connect to the SQL data service in an Informatica domain that uses Kerberos
authentication.
Kerberos with user name and password
Uses an Informatica domain user account to connect to the SQL data service in an
Informatica domain that uses Kerberos authentication.
Logged in user
Uses the user account logged in to the client machine to connect to the SQL data
service in an Informatica domain that uses Native, LDAP, or Kerberos
authentication.
2.
3.
Run install.bat.
4.
5.
Click Next.
The Installation Prerequisites page displays the system requirements. Verify that all installation
requirements are met before you continue the installation.
6.
Click Next.
7.
On the Installation Directory page, enter the absolute path for the installation directory.
8.
On the Pre-Installation Summary page, review the installation information, and click Install.
The installer copies the driver files to the installation directory. The Post-Installation Summary page
indicates whether the installation completed successfully.
9.
Click Done.
You can view the installation log files to get more information about the tasks performed by the installer.
Use a text editor to open and change the values of the properties in the file.
The following table describes the installation properties that you can change:
Property
Description
INSTALL_TYPE
Default is 0.
USER_INSTALL_DIR
2.
3.
4.
2.
3.
2.
3.
4.
5.
Run install.sh.
6.
7.
Option
Description
Press Enter.
The Installation Prerequisites section displays the system requirements.
8.
Verify that all installation requirements are met before you continue the installation.
9.
Press Enter.
The Installation Directory section appears.
10.
11.
Press Enter.
12.
In the Pre-Installation Summary section, review the installation information, and then press Enter.
13.
Press Enter.
For more information about the install tasks, see the installation debug log.
Use a text editor to open and change the values of the properties in the file.
The following table describes the installation properties that you can change:
Property
Description
INSTALL_TYPE
Default is 0.
USER_INSTALL_DIR
2.
3.
4.
JDBC Connections
You can connect to an SQL data service through a JDBC client tool such as the SQuirreL SQL Client.
To connect to an SQL data service through a JDBC client tool, you must configure the JDBC connection.
Value
Class
name
com.informatica.ds.sql.jdbcdrv.INFADriver
JDBC
URL
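The JDBC URL takes the following general form. This is a sketch assembled from the connection-string example shown in the UNIX odbc.ini instructions later in this guide; confirm the exact format for your driver version:

```
jdbc:informatica:sqlds/<optional security domain\><user name>/<user password>@<domain host name>:<domain HTTP port>?dis=<Data Integration Service name>&sqlds=<run-time SQL data service name>&authType=<type>
```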
The following table describes the Data Integration Service parameters that you can configure:
Parameter
Value
application
ACCESS
Enter this parameter when you query Microsoft Access virtual tables that
contain date columns. When you configure the ODBC driver with this
parameter, the Data Integration Service converts Microsoft Access date data
to the date/time data type. The parameter applies only to Microsoft Access
date data.
optimizeLevel
Sets the mapping optimization level. Enter one of the following values:
- 0. Sets the optimization level to None.
- 1. Sets the optimization level to Minimal.
- 2. Sets the optimization level to Normal.
- 3. Sets the optimization level to Full.
The default value is 1.
highPrecision
defaultDateFormat
Specifies the date and time formats. Enter one of the following values:
defaultTimeFormat
defaultTimeStampFormat
- YYYY-MM-DD HH24:MI:SS
- YYYY/MM/DD HH24:MI:SS
- YYYY/MM/DD
- MM/DD/YYYY
- MM/DD/YYYY HH24:MI:SS
- DD/MM/YY
- DD.MM.YY
- DD-MON-YY
- DD/MM/YY HH24:MI:SS
- DD.MM.YY HH24:MI:SS
dumpMapping
Creates XML files for SQL query mappings and stores them in the following
location: <Informatica installation directory>\tomcat\bin
\dslogs\sql. If a query fails, you can send these files to Informatica
Global Customer Support for analysis. Enter true or false. The default
value is false.
ResultSetCacheExpirationPeriod
Amount of time in milliseconds that a result set is available for use after it is
populated. If the value is 0, result set caching is disabled. If the value is 5, the
result set is available for 5 milliseconds after it is populated.
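For example, several of the parameters above can be combined into a single optional-parameters string, with individual parameters separated by ampersands. The values here are illustrative only:

```
optimizeLevel=3&defaultDateFormat=MM/DD/YYYY&dumpMapping=true
```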
2.
3.
2.
3.
commons-codec-1.3.jar
commons-httpclient-3.1.jar
commons-logging-1.1.jar
commons-pool-1.4.jar
FastInfoset-1.2.3.jar
log4j-1.2.12.jar
spring-2.5.jar
If the JDBC client machine includes an instance of one of the bundled third-party libraries, a conflict might
occur.
To resolve third-party library conflicts, use the infadsjdbclight.jar file that is also installed by the
Informatica JDBC/ODBC driver installation program. infadsjdbclight.jar contains the Informatica JDBC
driver and is installed in the following location: <Informatica installation directory>\jdbcdrv
\infadsjdbc. The infadsjdbc folder also contains all of the third-party libraries that are included with
infadsjdbc.jar.
To use infadsjdbclight.jar, modify the CLASSPATH environment variable with the location of
infadsjdbclight.jar and with the location of the third-party libraries that do not cause a conflict. For
example, if the JDBC client machine includes an instance of the spring-2.5.jar library, remove
infadsjdbc.jar from the CLASSPATH environment variable. Then, add the following files to the
CLASSPATH:
<Informatica installation directory>\jdbcdrv\infadsjdbc\infadsjdbclight.jar
<Informatica installation directory>\jdbcdrv\infadsjdbc\commons-codec-1.3.jar
<Informatica installation directory>\jdbcdrv\infadsjdbc\commons-httpclient-3.1.jar
<Informatica installation directory>\jdbcdrv\infadsjdbc\commons-logging-1.1.jar
<Informatica installation directory>\jdbcdrv\infadsjdbc\commons-pool-1.4.jar
<Informatica installation directory>\jdbcdrv\infadsjdbc\FastInfoset-1.2.3.jar
<Informatica installation directory>\jdbcdrv\infadsjdbc\log4j-1.2.12.jar
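On a UNIX-style client, the CLASSPATH change can be sketched in a POSIX shell as follows. The install root and the forward-slash layout are assumptions; substitute your actual <Informatica installation directory> and, on Windows, use set with semicolon-separated backslash paths instead:

```shell
# Hypothetical install root -- replace with your actual Informatica installation directory.
INFA_HOME="/opt/Informatica"
DRV_DIR="$INFA_HOME/jdbcdrv/infadsjdbc"

# Start with the light driver jar, then append only the non-conflicting
# third-party jars. spring-2.5.jar is deliberately omitted because, in this
# example, the JDBC client already ships its own copy.
CLASSPATH="$DRV_DIR/infadsjdbclight.jar"
for jar in commons-codec-1.3 commons-httpclient-3.1 commons-logging-1.1 \
           commons-pool-1.4 FastInfoset-1.2.3 log4j-1.2.12; do
  CLASSPATH="$CLASSPATH:$DRV_DIR/$jar.jar"
done
export CLASSPATH
echo "$CLASSPATH"
```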
ODBC Connections
You can connect to an SQL data service through an ODBC client tool such as IBM Cognos.
To connect to an SQL data service through an ODBC client tool, you must configure the ODBC connection.
Option
Definition
Authentication Mode
The Authentication Mode parameter can have one of the following values:
- Native or LDAP authentication
- Kerberos with keytab file
- Kerberos with username & password
- Logged in user
DSN Name
Option
Definition
Host Name
Port
Data Integration Service that runs the application that contains the SQL data service.
Name of the SQL data service that contains the virtual tables you want to query or the
virtual stored procedures that you want to run. The run-time SQL data service name
includes the application name that contains the SQL data service and uses the
following format: <application name>.<SQL data service name>
User Name
Informatica domain user name. Required if you select the Native or LDAP
Authentication or Kerberos with username & password authentication mode.
Password
Informatica domain user password. Required if you select the Native or LDAP
Authentication or Kerberos with username & password authentication mode.
Security Domain
Security domain for the Informatica domain user account. Required if the user
account is in an LDAP security domain.
Absolute path and file name for the keytab file on the client machine. Required if you
select the Kerberos with keytab file authentication mode.
Service principal name for the user account. Required if you select the Kerberos with
keytab file authentication mode.
The following table describes the Data Integration Service parameters that you can configure:
Parameter
Value
application
ACCESS
Enter this parameter when you query Microsoft Access virtual tables that
contain date columns. When you configure the ODBC driver with this
parameter, the Data Integration Service converts Microsoft Access date data
to the date/time data type. The parameter applies only to Microsoft Access
date data.
optimizeLevel
Sets the mapping optimization level. Enter one of the following values:
- 0. Sets the optimization level to None.
- 1. Sets the optimization level to Minimal.
- 2. Sets the optimization level to Normal.
- 3. Sets the optimization level to Full.
The default value is 1.
highPrecision
Parameter
Value
defaultDateFormat
Specifies the date and time formats. Enter one of the following values:
defaultTimeFormat
defaultTimeStampFormat
- YYYY-MM-DD HH24:MI:SS
- YYYY/MM/DD HH24:MI:SS
- YYYY/MM/DD
- MM/DD/YYYY
- MM/DD/YYYY HH24:MI:SS
- DD/MM/YY
- DD.MM.YY
- DD-MON-YY
- DD/MM/YY HH24:MI:SS
- DD.MM.YY HH24:MI:SS
dumpMapping
Creates XML files for SQL query mappings and stores them in the following
location: <Informatica installation directory>\tomcat\bin
\dslogs\sql. If a query fails, you can send these files to Informatica
Global Customer Support for analysis. Enter true or false. The default
value is false.
ResultSetCacheExpirationPeriod
Amount of time in milliseconds that a result set is available for use after it is
populated. If the value is 0, result set caching is disabled. If the value is 5, the
result set is available for 5 milliseconds after it is populated.
2.
3. Click Add.
4.
5. Click Finish.
The Create a New Data Service window appears.
6.
7.
8.
9. Click Test Connection to verify that the connection is valid and then click OK.
AIX
LIBPATH
HP-UX
SHLIB_PATH
Linux
LD_LIBRARY_PATH
Solaris
LD_LIBRARY_PATH
Configure the shared library environment variable to include the following directories:
The directory where the ODBC driver libraries reside. The driver libraries reside in the libinfadsodbc
directory, which is found in <Informatica installation directory>/tools/odbcdrv.
The directory where the driver manager library files reside. Use the unixODBC driver manager. For more
information about the location of the unixODBC driver manager, contact your system administrator.
Edit the odbc.ini file, or copy the odbc.ini file to your home directory and edit the copy.
This file exists in the $ODBCHOME directory.
$ cp $ODBCHOME/odbc.ini $HOME/.odbc.ini
2. Add an entry for the ODBC user under the section [<DSN>].
For example:
[<DSN>]
DataIntegrationService=<Data Integration Service name>
SQLDataService=<runtime SQL data service name>
Driver=$ODBC_DRIVER_INSTALL_LOCATION/bin/$OS/libinfadsodbc.so
HostName=<domain host name>
Port=<domain HTTP port>
Authentication Mode=<type>
Optional Parameters=defaultDateFormat=DD/MM/YYYY&defaultTimeStampFormat=DD/MM/YYYY
HH24:MI:SS
WCHARLengthInChars=true
Note: Configure WCHARLengthInChars for MicroStrategy.
3.
CHAPTER 5
BusinessObjects, 62
MicroStrategy Configuration, 65
QlikView Configuration, 67
Tableau Configuration, 70
WinSQL Configuration, 72
BusinessObjects
IBM Cognos
MicroStrategy
QlikView
Tableau
WinSQL
BusinessObjects
You can access the virtual data in an SQL data service through SAP BusinessObjects. Use the Information
Design Tool provided by SAP BusinessObjects to extract, define, and manipulate metadata for
BusinessObjects BI applications.
Use the Information Design Tool to create a project, define data source connections, and import
metadata. Create the data foundation with the required connections, and then set up a business layer. When you
have the required metadata in the form of a universe, you can publish the universe to the BusinessObjects
server. SAP BusinessObjects uses universes created by the Information Design Tool to analyze and
query the data and to generate enterprise reports.
BusinessObjects Configuration
You can access the virtual data in an SQL data service through a BusinessObjects universe. Import metadata
from the SQL data service into the universe. Use a BusinessObjects application to query the data or generate
reports.
Create the ODBC connection through a BusinessObjects application such as BusinessObjects Designer. To
create the ODBC connection, use the application New Connection wizard.
To configure BusinessObjects to access an SQL data service, complete the following tasks:
1.
2.
3. Enter a connection name and select a generic ODBC or ODBC3 data source driver.
4. Click Next.
The Login Parameters page appears.
5.
6.
Option
Description
Authentication Mode
User name
Password
7. Click OK.
After you perform this task, you can import metadata from an SQL data service into the universe and
generate reports based on the data.
Update the IBM Cognos configuration file to include the Informatica ODBC driver information.
2. Create an ODBC connection to the SQL data service, and import SQL data service metadata into a
Cognos project.
C:\Program Files\cognos\tools\c8\bin\
C:\Program Files\cognos\c8\bin\
2.
3.
2. Create a project.
3.
4.
5. Click Next.
The Select Data Source window appears.
6. Click New.
The New Data Source wizard appears.
7. In the name and description page, enter a name and optional description for the data source.
8. Click Next.
9. In the connection page, select the ODBC database type, select an isolation level, and click Next.
10. In the connection string page, enter the SQL data service ODBC data source name in the ODBC data
source and ODBC connect string fields. Enter timeouts or sign-on information, if required. Enter the
user ID and password if they are not part of the Informatica ODBC driver connect string.
11. Click Test the connection to test the connection to the Informatica ODBC driver.
12.
13. Click Next.
14. In the Select Objects page, select the objects you want to import and specify how the import handles
duplicate object names.
15.
16. Click Finish.
MicroStrategy Configuration
MicroStrategy is a business intelligence platform that allows you to analyze, distribute, and customize
business information. MicroStrategy Desktop allows you to create projects and reports. Within a project, you
can create data source connections and import data source metadata.
To configure MicroStrategy to access an SQL data service, complete the following tasks:
1.
2.
2. Create a project.
3. Select Schema > Warehouse Catalog to open the project Warehouse Catalog.
The Warehouse Database Instance dialog box appears.
4. Click New.
The Database Instance Wizard opens.
5. Click Next.
6. In the Database Instance General Information page, enter a name for the database instance and
select Generic DBMS as the database type.
7. Click Next.
8. In the ODBC Data Source Information page, select the ODBC data source name for the SQL data
service and enter the Informatica domain user name and password.
9. Click Finish.
10.
11.
12. Click Options.
13.
14. In the Warehouse Connection settings, select the database instance and click Edit.
The Database Instances dialog box opens.
15.
16. On the General tab, enter a database connection name and select the ODBC data source name for the
SQL data service.
17.
18.
19. Set the character set encoding option for Windows and UNIX drivers to Non UTF-8.
20.
21.
22. In the Read Settings, select Use standard ODBC calls to obtain the database catalog.
23. Click OK.
24. In the Warehouse Catalog, click Save and Close to save the changes.
Select Schema > SQL Generation Options to open the SQL generation options.
2. In the SQL Data Warehouses settings, select the database instance you use to connect to the SQL data
service.
3. Click VLDB Properties to edit the VLDB properties for the database instance.
4.
5. In the Drop Temp Tables Method settings, set the drop temp table method to Do nothing.
6. In the Intermediate Table Type settings, set the intermediate table type to Derived table.
7. In the Table Creation Type settings, set the table creation type to Implicit Table.
8. In the CREATE and INSERT Support settings, select the Create and insert are not supported option.
9.
2.
3. Name the file <RPDfilename>.rpd and enter the repository password twice.
4. Select the data source name created for the ODBC connection.
5.
6.
QlikView Configuration
You can access the virtual data in an SQL data service through QlikView. To read data from an SQL data
service into your QlikView document, use the Script Editor. The script that you create uses an ODBC
connection to connect to and retrieve data from the SQL data service.
1.
2.
3. In the Data view, select ODBC as the database and click Connect.
The Connect to Data Source dialog box appears.
4. Select the ODBC data source name for the SQL data service and enter the user name and password for
the Informatica domain user.
5.
6.
7.
8. In the Data view of the Edit Script dialog box, click Select to create an SQL SELECT statement that
retrieves information from the SQL data service.
9. Click OK.
10. Run the script to retrieve data from the SQL data service.
2. Create a new project and select Business Intelligence Project as the project type.
3.
4.
5. Click Next.
6.
7. Enter the data source name and select ODBC as the type.
8. Click Edit.
The Connection Properties dialog box appears.
9.
Description
Select the ODBC data source name for the SQL data service.
User ID
Password
10.
11.
Click Next.
12.
13.
14.
15.
16. Run an SQL query and verify that the data displays as expected.
17. Click OK.
18.
19.
20.
Copy the Informatica JDBC driver to the SQuirreL SQL Client library directory.
2. Create the Informatica JDBC driver and the database alias in SQuirreL SQL Client.
After you perform these tasks, you can import data from an SQL data service into SQuirreL SQL Client.
Copy the Informatica JDBC driver, infadsjdbc.jar, from the following directory:
<Informatica Installation Directory>\tools\jdbcdrv\
to the following directory:
<SQuirreL SQL Client Installation Directory>\lib\
2.
3.
Description
Name
Example URL
4.
Option
Description
Website URL
Extra Class
Path
Class Name
com.informatica.ds.sql.jdbcdrv.INFADriver
Click OK.
SQuirreL SQL Client displays a message saying that driver registration is successful.
5.
6.
7.
8.
Option
Description
Name
Alias name.
Driver
URL
User Name
Password
Click Test.
SQuirreL SQL Client displays a message saying that the connection is successful.
9.
Click OK.
Tableau Configuration
You can access the virtual data in an SQL data service through Tableau. Tableau uses the 32-bit Informatica
Data Services ODBC Driver to read source data from an SQL data service.
1. Start Tableau.
2.
3.
4. Select DSN to use an existing 32-bit ODBC connection, or select Driver to provide the credentials to
connect to the SQL data service using the Informatica SQL Data Services ODBC driver.
If you select Driver, provide the connection information to connect to an SQL data service. Tableau
saves the credentials and options in the Tableau Workbook (.twb) file when you save the report. By
default, the .twb files are located in the following directory: C:\Users\<username>\Documents\My
Tableau Repository\Workbooks.
5. Click Connect.
6.
7.
If you need to drag and drop date or numeric fields in Tableau, make the following modifications to the
Tableau Workbook file:
a.
b.
For more information about Tableau customization, see the Tableau documentation.
Related Topics:
2.
3.
4. Click Add.
5. Select the ODBC driver from the list, and click Finish.
6. Specify the configuration properties required for the database in the Windows configuration dialog box.
The following table describes the configuration properties that you can specify:
7.
Option
Description
Select the ODBC data source name for the SQL data service.
User ID
Password
8.
Option
Description
Select the data source name you added in the previous steps.
User
Password
Database
Driver
Category
Select or create a category if you want to color code Editor tabs for a specific
connection. This can help differentiate between development and production
databases. You can also set an option to color code the Object Explorer pane and
object editor windows.
Click Connect to save the connection and immediately connect to the database, or click Save to save
the connection without connecting to the database.
WinSQL Configuration
You can access the virtual data in an SQL data service through WinSQL. To read data from an SQL data
service into WinSQL, create a new connection. WinSQL imports data from the SQL data service based on the
connection information.
1. Create a query.
2.
3.
The following table describes the ODBC data source properties that you can enter:
4.
Option
Description
Select the ODBC data source name for the SQL data service.
User ID
Password
Click OK.
An error occurs when I test a new ODBC connection through the Informatica Data Services ODBC Driver:
[SQLCMN_10007] The SQL Service Module could not find an SQL data service on the server
with the name [<SQL data service name>]. Check the SQL data service name.
When you enter the SQL data service name, use the following syntax:
<application>.<SQL data service name>
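For example, using hypothetical names, an SQL data service named CustomerOrders deployed in an application named SalesApp is addressed as:

```
SalesApp.CustomerOrders
```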
CHAPTER 6
1.
2.
3.
Installation DVD. Copy the Informatica zip or tar file from the installation DVD to a directory on your
machine, and then extract the installer files. Or, extract the installer files directly from the DVD to a
directory on your machine.
FTP download. Download the Informatica installation zip or tar file from the Informatica Electronic
Software Download site to a directory on your machine and then extract the installer files.
Description
Data Integration
Service name
Data Integration Service that runs the application that contains the SQL data service.
Name of the SQL data service that contains the virtual tables you want to query or the
virtual stored procedures that you want to run. The run-time SQL data service name
includes the application name that contains the SQL data service and uses the
following format: <application name>.<SQL data service name>
User name
User password
Required Information
Description
Truststore file
If the Informatica domain has secure communication enabled, you must have the
location of the truststore file that contains the SSL certificate for the domain.
Authentication type
The mode of authentication used to connect to the SQL data service. You can select
one of the following authentication modes:
Native or LDAP Authentication
Uses an Informatica domain user account to connect to the SQL data service in an
Informatica domain that uses Native or LDAP authentication. The user account can
be in a native or LDAP security domain.
Kerberos with keytab
Uses the service principal name (SPN) of an Informatica domain user account to
connect to the SQL data service in an Informatica domain that uses Kerberos
authentication.
Kerberos with user name and password
Uses an Informatica domain user account to connect to the SQL data service in an
Informatica domain that uses Kerberos authentication.
Logged in user
Uses the user account logged in to the client machine to connect to the SQL data
service in an Informatica domain that uses Native, LDAP, or Kerberos
authentication.
2.
3. Run install.bat.
4.
5. Click Next.
The Installation Prerequisites page displays the system requirements. Verify that all installation
requirements are met before you continue the installation.
6. Click Next.
7. On the Installation Directory page, enter the absolute path for the installation directory.
8. On the Pre-Installation Summary page, review the installation information, and click Install.
The installer copies the driver files to the installation directory. The Post-Installation Summary page
indicates whether the installation completed successfully.
9. Click Done.
You can view the installation log files to get more information about the tasks performed by the installer.
Use a text editor to open and change the values of the properties in the file.
The following table describes the installation properties that you can change:
Property
Description
INSTALL_TYPE
Default is 0.
USER_INSTALL_DIR
2.
3.
4.
2.
3. Click Add.
4.
5. Click Finish.
6.
Value
DSN Name
Connect String
Location for infadsjdbc.jar
Path and file name of infadsjdbc.jar. Click Browse to select the jar file for
the driver. By default, the jar file is installed in the following directory:
<Informatica installation directory>\tools\jdbcdrv
JVM Options
Optional. JVM parameters that you can set to configure the JDBC connection.
Use the following arguments to configure the parameters:
- java -Xms<size>. Sets the initial Java heap size.
- java -Xmx<size>. Sets the maximum Java heap size.
For example, java -Xmx2048m -Xms256m starts the JVM with 256 MB of
memory, and allows the process to use up to 2048 MB of memory.
Treat Length as Characters (Deferred Parameters)
Disabled.
Multithreaded application
Enabled.
7. Click Test Connection to verify that the connection is valid and then click OK.
1.
2.
3.
2.
3.
4.
5. Run install.sh.
6.
7.
Option
Description
Press Enter.
The Installation Prerequisites section displays the system requirements.
8. Verify that all installation requirements are met before you continue the installation.
9. Press Enter.
The Installation Directory section appears.
10.
11. Press Enter.
12. In the Pre-Installation Summary section, review the installation information, and then press Enter.
13. Press Enter.
For more information about the install tasks, see the installation debug log.
Use a text editor to open and change the values of the properties in the file.
The following table describes the installation properties that you can change:
Property
Description
INSTALL_TYPE
Default is 0.
USER_INSTALL_DIR
2.
3.
4.
AIX
LIBPATH
HP-UX
SHLIB_PATH
Linux
LD_LIBRARY_PATH
Solaris
LD_LIBRARY_PATH
Configure the shared library environment variable to include the following directories:
The directory where the platform libjvm and j9vm libraries reside.
The directory where the driver manager library files reside. Use the DataDirect driver manager. The
DataDirect driver manager is found in ODBCHOME/lib.
When you use the DataDirect driver manager, create an ODBCINST environment variable that points to
the odbcinst.ini file.
Edit odbc.ini or copy odbc.ini to the home directory and edit it.
This file exists in the $ODBCHOME directory.
$ cp $ODBCHOME/odbc.ini $HOME/.odbc.ini
2. Add an entry for the ODBC user under the section [<user name>_odbc].
For example:
[<user name>_odbc]
ConnectionString=jdbc:informatica:sqlds/<optional security domain\><user name>/<user
password>@<domain host name>:<domain HTTP port>?dis=<Data Integration Service
name>&sqlds=<run-time SQL data service name>&authType=<type>
Driver=$ODBC_DRIVER_INSTALL_LOCATION/odbcdrv/libinfadsodbc.so
IsMultiThreaded=true
JDBCDriverLocation=<Informatica installation directory>/tools/jdbcdrv/infadsjdbc.jar
UseDetach=false
The Authentication Mode parameter can have one of the following values:
- native_uid
- kerberos_keytab
- kerberos_uid
- sso
3.
To increase the -Xmx value, set the environment variable INFA_ODBCJVM to -Xmx<megabytes>m. For
example, to set the -Xmx value to 64 MB, set INFA_ODBCJVM to -Xmx64m. If you set the -Xmx value to a
very large value, for example more than 500 MB, the Memory Manager might not be able to allocate the memory.
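For example, a minimal POSIX shell sketch that caps the driver's embedded JVM heap at 64 MB (the value is illustrative; tune it for your queries):

```shell
# Cap the ODBC driver's embedded JVM heap at 64 MB.
INFA_ODBCJVM="-Xmx64m"
export INFA_ODBCJVM
echo "$INFA_ODBCJVM"
```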
The Informatica Data Services ODBC/JDBC driver for PowerCenter does not support single sign-on
based authentication when the driver is used with PowerCenter clients to import virtual tables in
Informatica domains that use Kerberos authentication.
You must provide the keytab or username and password to import virtual tables and run workflows with the
ODBC/JDBC driver and PowerCenter client.
CHAPTER 7
Assign permissions. Enable SQL data service security and assign permissions on SQL data service
objects.
Configure the SQL data service. Configure read-only general properties, Data Integration Service settings,
logical data object, and caching properties.
View the SQL data service logs. View Data Integration Service logs for an SQL data service.
Monitor the SQL data service. Use the Administrator tool or the Monitoring tool to monitor SQL data
service requests.
Virtual table
When you assign permissions on an SQL data service object, the user or group inherits the same
permissions on all objects that belong to the SQL data service object. For example, you assign a user select
permission on an SQL data service. The user inherits select permission on all virtual tables in the SQL data
service.
You can deny permissions to users and groups on some SQL data service objects. When you deny
permissions, you configure exceptions to the permissions that users and groups might already have. For
example, you cannot assign permissions on a column in a virtual table, but you can deny a user the ability
to run an SQL SELECT statement that includes the column.
You can restrict access to specific columns and rows to prevent users from accessing data in an SQL data
service when they query a virtual table. Configure column level security to restrict access to specific columns
in a virtual table. Configure row level security to restrict access to specific rows of data in a virtual table.
Apply pass-through security to restrict access to data in an SQL data service based on user credentials.
Grant permission. Users can grant and revoke permissions on the SQL data service objects using the
Administrator tool or using the infacmd command line program.
Execute permission. Users can run virtual stored procedures in the SQL data service using a JDBC or
ODBC client tool.
Select permission. Users can run SQL SELECT statements on virtual tables in the SQL data service using
a JDBC or ODBC client tool.
Some permissions are not applicable for all SQL data service objects.
The following table describes the permissions for each SQL data service object:
Object
Grant Permission
Execute Permission
Select Permission
SQL data
service
Virtual table
Virtual stored
procedure
The query returns a substitute value instead of the restricted data. The substitute value replaces the
column value in each row that the query returns. If the query includes filters or joins, the substitute
value also appears in those results.
Pass-Through Security
Pass-through security is the capability to connect to an SQL data service with the client user credentials
instead of the credentials from a connection object.
Users might have access to different sets of data based on their jobs in the organization. Client systems restrict
access to databases by user name and password.
combine data from different systems to create one view of the data. However, when you define the
connection to the SQL data service, the connection has one user name and password.
If you configure pass-through security, you can restrict users from some of the data in an SQL data service
based on their user name. When a user connects to the SQL data service, the Data Integration Service
ignores the user name and the password in the connection object. The user connects with the client user
name or the LDAP user name.
Configure pass-through security for a connection in the connection properties of the Administrator tool or with
infacmd dis UpdateServiceOptions. You can set pass-through security for connections to deployed
applications. You cannot set pass-through security in the Developer tool. Only SQL data services and web
services recognize the pass-through security configuration.
For more information about configuring security for SQL data services, see the Informatica How-To Library
article "How to Configure Security for SQL Data Services":
http://communities.informatica.com/docs/DOC-4507.
Description
Allow Caching
Allows data object caching for all pass-through connections in the Data Integration
Service. Populates data object cache using the credentials from the connection
object.
Note: When you enable data object caching with pass-through security, you might
allow users access to data in the cache database that they might not have in an
uncached environment.
Virtual tables
Virtual columns
The Applications view displays read-only general properties for SQL data services and the objects contained
in the SQL data services. Properties that appear in the view depend on the object type.
The following table describes the read-only general properties for SQL data services, virtual tables, virtual
columns, and virtual stored procedures:
Property
Description
Name
Description
Short description of the selected object. Appears for all object types.
Type
Location
The location of the selected object. This includes the domain and Data Integration Service name.
Appears for all object types.
JDBC URL
JDBC connection string used to access the SQL data service. The SQL data service contains
virtual tables that you can query. It also contains virtual stored procedures that you can run.
Appears for SQL data services.
Column Type
The following table describes the configurable SQL data service properties:
Property
Description
Startup Type
Determines whether the SQL data service is enabled to run when the application starts or
when you start the SQL data service. Enter ENABLED to allow the SQL data service to run.
Enter DISABLED to prevent the SQL data service from running.
Trace Level
Level of error messages written to the log files. Choose one of the following message levels:
- OFF
- SEVERE
- WARNING
- INFO
- FINE
- FINEST
- ALL
Default is INFO.
Connection
Timeout
Maximum number of milliseconds to wait for a connection to the SQL data service. Default is
3,600,000.
Request Timeout
Maximum number of milliseconds for an SQL request to wait for an SQL data service
response. Default is 3,600,000.
Sort Order
Sort order that the Data Integration Service uses for sorting and comparing data when
running in Unicode mode. You can choose the sort order based on your code page. When the
Data Integration Service runs in ASCII mode, it ignores the sort order value and uses a binary sort
order. Default is binary.
Maximum Active
Connections
- Result Set Cache Expiration Period. The number of milliseconds that the result set cache is available for use. If set to -1, the cache never expires. If set to 0, result set caching is disabled. Changes to the expiration period do not apply to existing caches. If you want all caches to use the same expiration period, purge the result set cache after you change the expiration period. Default is 0.
- DTM Keep Alive Time. Number of milliseconds that the DTM instance stays open after it completes the last request. Identical SQL queries can reuse the open instance. Use the keep alive time to increase performance when the time required to process the SQL query is small compared to the initialization time for the DTM instance. If the query fails, the DTM instance terminates. Must be an integer. A negative integer value means that the DTM Keep Alive Time for the Data Integration Service is used. 0 means that the Data Integration Service does not keep the DTM instance in memory. Default is -1.
- Optimization Level. The optimizer level that the Data Integration Service applies to the object. Enter the numeric value that is associated with the optimizer level that you want to configure. You can enter one of the following numeric values:
  - 0. The Data Integration Service does not apply optimization.
  - 1. The Data Integration Service applies the early projection optimization method.
  - 2. The Data Integration Service applies the early projection, early selection, push-into, and predicate optimization methods.
  - 3. The Data Integration Service applies the cost-based, early projection, early selection, push-into, predicate, and semi-join optimization methods.
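The numeric optimizer levels above can be summarized as a simple mapping from level to the set of optimization methods applied, as this illustrative sketch shows:

```java
import java.util.List;

public class OptimizerLevel {
    // Maps the numeric optimizer level to the optimization methods the
    // Data Integration Service applies, per the list above.
    static List<String> methodsFor(int level) {
        switch (level) {
            case 0: return List.of();
            case 1: return List.of("early projection");
            case 2: return List.of("early projection", "early selection",
                                   "push-into", "predicate");
            case 3: return List.of("cost-based", "early projection", "early selection",
                                   "push-into", "predicate", "semi-join");
            default: throw new IllegalArgumentException("level must be 0-3: " + level);
        }
    }

    public static void main(String[] args) {
        System.out.println(methodsFor(2)); // [early projection, early selection, push-into, predicate]
    }
}
```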
SQL Properties
The following table describes the SQL properties for the Data Integration Service:
- DTM Keep Alive Time. Number of milliseconds that the DTM instance stays open after it completes the last request. Identical SQL queries can reuse the open instance. Use the keep alive time to increase performance when the time required to process the SQL query is small compared to the initialization time for the DTM instance. If the query fails, the DTM instance terminates. Must be greater than or equal to 0. 0 means that the Data Integration Service does not keep the DTM instance in memory. Default is 0. You can also set this property for each SQL data service that is deployed to the Data Integration Service. If you set this property for a deployed SQL data service, the value for the deployed SQL data service overrides the value you set for the Data Integration Service.
- Table Storage Connection. Relational database connection that stores temporary tables for SQL data services. By default, no connection is selected.
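The override behavior described above (a negative SQL data service value inherits the Data Integration Service setting) can be expressed as a small resolution function; this is an illustrative sketch, not Informatica code:

```java
public class DtmKeepAlive {
    // Resolves the effective DTM keep alive time for a deployed SQL data
    // service: a negative service-level value means "use the Data
    // Integration Service setting", per the property descriptions above.
    static long effectiveKeepAliveMillis(long dataIntegrationServiceValue,
                                         long sqlDataServiceValue) {
        return sqlDataServiceValue < 0 ? dataIntegrationServiceValue
                                       : sqlDataServiceValue;
    }

    public static void main(String[] args) {
        // The SQL data service default of -1 inherits the service value.
        System.out.println(effectiveKeepAliveMillis(5000, -1)); // 5000
        // An explicit 0 disables keeping the DTM instance in memory.
        System.out.println(effectiveKeepAliveMillis(5000, 0));  // 0
    }
}
```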
- Maximum Memory Per Request. The behavior of Maximum Memory Per Request depends on the following Data Integration Service configurations:
  - The service runs jobs in separate local or remote processes, or the service property Maximum Memory Size is 0 (default). Maximum Memory Per Request is the maximum amount of memory, in bytes, that the Data Integration Service can allocate to all transformations that use auto cache mode in a single request. The service allocates memory separately to transformations that have a specific cache size. The total memory used by the request can exceed the value of Maximum Memory Per Request.
  - The service runs jobs in the Data Integration Service process, and the service property Maximum Memory Size is greater than 0. Maximum Memory Per Request is the maximum amount of memory, in bytes, that the Data Integration Service can allocate to a single request. The total memory used by the request cannot exceed the value of Maximum Memory Per Request.
  Default is 50,000,000.
- Skip Log Files. Prevents the Data Integration Service from generating log files when the SQL data service request completes successfully and the tracing level is set to INFO or higher. Default is false.
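The two Maximum Memory Per Request behaviors above differ in whether the limit is a hard cap on the whole request. A minimal sketch of that decision, under the stated assumptions (illustrative only):

```java
public class MaxMemoryPerRequest {
    // Models the two configurations described above: the limit is a hard
    // cap on the request total only when jobs run in the Data Integration
    // Service process AND Maximum Memory Size is greater than 0. Otherwise
    // the limit applies only to auto cache mode transformations, so the
    // request total may exceed it.
    static boolean totalMayExceedLimit(boolean jobsRunInServiceProcess,
                                       long maximumMemorySize) {
        boolean hardLimit = jobsRunInServiceProcess && maximumMemorySize > 0;
        return !hardLimit;
    }

    public static void main(String[] args) {
        System.out.println(totalMayExceedLimit(false, 0));            // true
        System.out.println(totalMayExceedLimit(true, 1_000_000_000)); // false
    }
}
```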
The following table describes the SQL properties for the Data Integration Service process:
- Maximum # of Concurrent Connections. Limits the number of database connections that the Data Integration Service can make for SQL data services. Default is 100.

The following table describes the configurable virtual table properties:
- Enable Caching. Cache the virtual table in the data object cache database.
- Cache Refresh Period. Interval at which the Data Integration Service refreshes the cache.
- Cache Table Name. The name of the user-managed table from which the Data Integration Service accesses the virtual table cache. A user-managed cache table is a table in the data object cache database that you create, populate, and manually refresh when needed. If you specify a cache table name, the Data Object Cache Manager does not manage the cache for the object and ignores the cache refresh period. If you do not specify a cache table name, the Data Object Cache Manager manages the cache for the object.
The following table describes the configurable virtual column properties:
- Create Index. Enables the Data Integration Service to generate indexes for the cache table based on this column. Default is false.
- Deny With. When you use column level security, this property determines whether to substitute the restricted column value or to fail the query. If you substitute the column value, you can choose to substitute the value with NULL or with a constant value. Select one of the following options:
  - ERROR. Fails the query and returns an error when an SQL query selects a restricted column.
  - NULL. Returns a null value for a restricted column in each row.
  - VALUE. Returns a constant value for a restricted column in each row.
- Insufficient Permission Value. The constant that the Data Integration Service returns for a restricted column.
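The Deny With options can be illustrated with a small simulation of how a single restricted column value is resolved. This is a sketch of the described behavior, not Informatica code:

```java
public class DenyWith {
    enum Mode { ERROR, NULL, VALUE }

    // Simulates column level security substitution for one restricted
    // column value, per the Deny With options described above.
    static String resolveRestrictedColumn(Mode mode, String insufficientPermissionValue) {
        switch (mode) {
            case ERROR: throw new SecurityException("query selects a restricted column");
            case NULL:  return null;
            case VALUE: return insufficientPermissionValue;
            default:    throw new IllegalStateException();
        }
    }

    public static void main(String[] args) {
        System.out.println(resolveRestrictedColumn(Mode.VALUE, "XXXX")); // XXXX
        System.out.println(resolveRestrictedColumn(Mode.NULL, "XXXX"));  // null
    }
}
```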
- Result Set Cache Expiration Period. The number of milliseconds that the result set cache is available for use. If set to -1, the cache never expires. If set to 0, result set caching is disabled. Changes to the expiration period do not apply to existing caches. If you want all caches to use the same expiration period, purge the result set cache after you change the expiration period. Default is 0.
The following table describes the read-only general properties for logical data objects:
- Name. Name of the logical data object.
- Description. Short description of the logical data object.
- Type. Type of the object.
- Location. The location of the logical data object. This includes the domain and Data Integration Service name.

The following table describes the configurable logical data object properties:
- Enable Caching. Cache the logical data object in the data object cache database.
- Cache Refresh Period. Interval at which the Data Integration Service refreshes the cache.
- Cache Table Name. The name of the user-managed table from which the Data Integration Service accesses the logical data object cache. A user-managed cache table is a table in the data object cache database that you create, populate, and manually refresh when needed. If you specify a cache table name, the Data Object Cache Manager does not manage the cache for the object and ignores the cache refresh period. If you do not specify a cache table name, the Data Object Cache Manager manages the cache for the object.
The following table describes the configurable logical data object column properties:
- Create Index. Enables the Data Integration Service to generate indexes for the cache table based on this column. Default is false.

The following table describes the data object cache properties:
- Cache Removal Time. The number of milliseconds that the Data Integration Service waits before cleaning up cache storage after a refresh. Default is 3,600,000.
- Cache Connection. The database connection name for the database that stores the data object cache. Select a valid connection object name.
- Maximum Concurrent Refresh Requests. Maximum number of cache refreshes that can occur at the same time. Limit the concurrent cache refreshes to maintain system resources.
- Enable Nested LDO Cache. Indicates that the Data Integration Service can use cache data for a logical data object used as a source or a lookup in another logical data object during a cache refresh. If false, the Data Integration Service accesses the source resources even if you enabled caching for the logical data object used as a source or a lookup. Default is False.
  For example, logical data object LDO3 joins data from logical data objects LDO1 and LDO2. A developer creates a mapping that uses LDO3 as the input and includes the mapping in an application. You enable caching for LDO1, LDO2, and LDO3. If you enable nested logical data object caching, the Data Integration Service uses cache data for LDO1 and LDO2 when it refreshes the cache table for LDO3. If you do not enable nested logical data object caching, the Data Integration Service accesses the source resources for LDO1 and LDO2 when it refreshes the cache table for LDO3.
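The nested caching behavior in the LDO1/LDO2/LDO3 example above reduces to one decision: during a parent refresh, does the service read a child object's cache table or its source? An illustrative sketch (not Informatica code):

```java
public class NestedLdoCache {
    // Decides where a cache refresh for a parent logical data object reads
    // a child object's data from, per the nested caching behavior above.
    static String refreshReadsFrom(boolean nestedCachingEnabled,
                                   boolean childCachingEnabled) {
        return (nestedCachingEnabled && childCachingEnabled)
                ? "cache table" : "source resources";
    }

    public static void main(String[] args) {
        // Refreshing LDO3 with nested caching on reads LDO1 from its cache.
        System.out.println(refreshReadsFrom(true, true));  // cache table
        // With nested caching off, the refresh goes back to the sources.
        System.out.println(refreshReadsFrom(false, true)); // source resources
    }
}
```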
1. Configure the result set cache properties in the Data Integration Service process properties.
2. Configure the cache expiration period in the SQL data service properties.
The Data Integration Service purges result set caches in the following situations:
- When the result set cache expires, the Data Integration Service purges the cache.
- When you restart an application or run the infacmd dis purgeResultSetCache command, the Data Integration Service purges the result set cache for objects in the application.
- When you restart a Data Integration Service, the Data Integration Service purges the result set cache for objects in applications that run on the Data Integration Service.
- When you change the permissions for a user, the Data Integration Service purges the result set cache associated with that user.
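The cache expiration values described above have three distinct meanings. A small interpreter function makes the contract explicit (illustrative sketch only):

```java
public class ResultSetCacheExpiration {
    // Interprets the Result Set Cache Expiration Period value: -1 never
    // expires, 0 disables result set caching, any positive value is a
    // cache lifetime in milliseconds.
    static String interpret(long expirationMillis) {
        if (expirationMillis < 0) return "cache never expires";
        if (expirationMillis == 0) return "result set caching is disabled";
        return "cache entries expire after " + expirationMillis + " ms";
    }

    public static void main(String[] args) {
        System.out.println(interpret(-1)); // cache never expires
        System.out.println(interpret(0));  // result set caching is disabled
    }
}
```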
1. Configure the data object cache database connection in the cache properties for the Data Integration Service.
2. Enable caching in the properties of logical data objects or virtual tables in an application.
By default, the Data Object Cache Manager component of the Data Integration Service manages the cache
tables for logical data objects and virtual tables in the data object cache database. When the Data Object
Cache Manager manages the cache, it inserts all data into the cache tables with each refresh. If you want to
incrementally update the cache tables, you can choose to manage the cache tables yourself using a
database client or other external tool. After enabling data object caching, you can configure a logical data
object or virtual table to use a user-managed cache table.
- Configuration. Log events about system or service configuration changes and application deployment or removal.
- Data Integration Service processes. Log events about application deployment, data object cache refresh, and user requests to run mappings.
- System failures. Log events about failures that cause the Data Integration Service to be unavailable, such as Model Repository connection failures or the service failure to start.
- Monitoring tool. In the Developer tool Progress view, click Menu > Monitor Jobs. Select the Data Integration Service that runs the SQL data service and click OK. The Monitoring tool opens.
- Administrator tool. To monitor SQL data services in the Administrator tool, click the Monitor tab.
When you monitor an SQL data service, you can view summary statistics or execution statistics for the
service. The Summary Statistics view displays graphical information about SQL data service distribution
and state. The Execution Statistics view displays information about SQL data services that are deployed in
an application.
To monitor an SQL data service, expand an application in the Navigator and select the SQL Data Services
folder. A list of SQL data services appears in the contents panel. The contents panel shows properties about
each SQL data service, such as the name, description, and state.
When you select an SQL data service in the contents panel, the contents panel shows the following views:
- Properties view
- Connections view
- Requests view
- Reports view
Aborting a Connection
You can abort a connection to prevent it from sending more requests to the SQL data service.
Index
BusinessObjects
BusinessObjects Configuration 62
creating the ODBC connection 62
C
Cache Connection
property 91
Cache Removal Time
property 91
CLASSPATH
updating 55
client configuration
secure domain 49
configuration
client tool configuration 61
IBM Cognos 63
MicroStrategy 65
Oracle Business Intelligence Enterprise Edition 11g 66
Oracle Database Gateway 67
QlikView 67
SQL Server Business Intelligence Development Studio 68
SQuirreL SQL Client 69
third-party client driver configuration 47
Toad for Data Analysts 71
WinSQL 72
connecting
SQL data service 85
connections
overview 13
pass-through security 85
protocol 13
correlated subqueries
rules and guidelines for 41
SQL data service queries 41
D
data object cache
configuring 93
description 93
index cache 93
properties 91
user-managed tables 93
data object caching
with pass-through security 86
datatypes
SQL data service queries 31
F
functions
escape syntax 43
SQL data service queries 31
I
IBM Cognos
configuration 63
creating the ODBC connection 64
updating the configuration file 63
infadsjdbc.jar
configuring 55
troubleshooting 55
infadsjdbclight.jar
configuring 55
Informatica JDBC driver
infadsjdbc.jar 55
infadsjdbclight.jar 55
Informatica ODBC driver
configuring 77
J
JDBC client connections
connecting to an SQL data service 54
overview 48, 50, 75, 76
troubleshooting 55
updating the CLASSPATH 55
L
LD_LIBRARY_PATH
updating 80
LIBPATH
updating 80
logical data objects
caching in database 93
M
Maximum Active Connections
SQL data service property 86
Maximum Concurrent Connections
configuring 88
Maximum Concurrent Refresh Requests
property 91
MicroStrategy
configuration 65
configuring SQL generation options 66
creating the database instance 65
monitoring
SQL data services 93
N
non-correlated subqueries
SQL data service queries 40
O
ODBC client connections on UNIX
overview 51, 78
troubleshooting 81
updating odbc.ini 81
updating the shared library 80
ODBC client connections on Windows
configuring the driver 77
troubleshooting 81
odbc.ini
updating 81
operators
SQL data service queries 31
Oracle Business Intelligence Enterprise Edition 11g
configuration 66
Oracle Database Gateway
configuration 67
P
parameterized queries
SQL data service queries 42
troubleshooting 55
pass-through security
connecting to SQL data service 85
enabling caching 86
properties 86
web service operation mappings 85
Q
QlikView
configuration 67
queries
SQL data service queries 40
R
request timeout
SQL data services requests 86
reserved words
SQL data service queries 42
S
secure domain
client configuration 49
shared library environment variable
updating 80
sort order
SQL data services 86
special characters
SQL data service queries 45
SQL data service
permission types 84
properties 86
SQL data service connections
overview 13
SQL data service queries
correlated subqueries 41
datatypes 31
escape syntax 42
function escape sequences 43
non-correlated subqueries 40
operators 31
overview 30
parameterized queries 42
queries 40
reserved words 42
special characters 45
SQL functions 31
SQL statements and keywords 39
SQL data services
connecting from JDBC 54
creating 16
defining 16
example 14
Informatica ODBC driver
installing on Windows 48, 50, 75, 76
JDBC client connections 48, 50, 75, 76
monitoring 93
ODBC client connections on UNIX 51, 78
ODBC client connections on Windows
installing the driver 48, 50, 75, 76
overview 48, 50, 75, 76
overview 13, 16
previewing data 19
SQL data services
ODBC client connections on Windows 48, 50, 75, 76
SQL keywords
SQL data service queries 39
SQL queries
previewing data 19
SQL query plans
example 28
overview 27
viewing 29
SQL Server Business Intelligence Development Studio
configuration 68
SQL statements
SQL data service queries 39
T
Tableau
configuration 70
temporary tables
description 20
operations 20
rules and guidelines 22
third-party client tools
supported third-party tools 61
Third-party Client Tools
troubleshooting 73
timeout
SQL data service connections 86
Toad for Data Analysts
configuration 71
troubleshooting
converting large numbers to binary 45
converting negative values 45
Third-party Client Tools 73
V
virtual column properties
configuring 90
virtual data
overview 15
W
web service
logs 93
property configuration 86
WinSQL
configuration 72