Data Integration Platform Cloud Hands On Lab

This document provides instructions for a hands-on lab on using Oracle Data Integration Platform Cloud (DIPC) to execute an Oracle Data Integrator (ODI) scenario. The lab walks through creating an ODI Execution Task in DIPC to load data from an OLTP system into a target data warehouse. Users will import an existing ODI scenario, configure connections, and run the task to synchronize data between the systems. The task execution details can then be viewed in DIPC to monitor the load progress and troubleshoot if needed.

Data Integration Platform Cloud

Hands On Lab

<Name>
<Title>

<Name>
<Title>
Data Integration Platform Cloud: Hands-on Lab

Hands-on Lab - Data Integration Platform Cloud – ODI Execution

The rapid adoption of enterprise cloud-based solutions brings with it a new set of challenges, and data integration is one of the greatest challenges facing any enterprise cloud-based solution. Join this hands-on lab for firsthand experience of the power and simplicity of Oracle Data Integration Platform Cloud. See how DIPC simplifies the end-to-end creation and execution of historically arduous DI tasks: instantiating, loading, and preparing a cloud database from an on-premises database, as well as synchronizing it in real time, all in just a few clicks.

The following lessons walk through the steps needed to create a Data Integration Platform Cloud ODI Execution Task that invokes an existing ODI scenario to load data between an OLTP system and a target data warehouse.

Hands-on Lab - Data Integration Platform Cloud – ODI Execution
Overview
  Time to Complete
  Prerequisites
  Lab Environment
Task 0: Preparation Steps
Task 1: Create ODI Execution Task
Summary

Last Updated: 7-Aug-18



Overview
Time to Complete
Perform all tasks – 20 Minutes

Prerequisites
Before you begin this tutorial, you should:

• Have a general understanding of RDBMS and data integration concepts.

• Have a general understanding of ETL and data synchronization concepts.

Lab Environment
For this lab, Data Integration Platform Cloud and the client environment are contained within one environment for simplicity. Most user interactions with Data Integration Platform Cloud will be through a browser installed on your local machine (Chrome preferred; Firefox is also supported).

Ravello Setup
1. Log into Ravello, click Applications, then click ‘+Create Application’

2. Enter a Name and pick the DIPC Blueprint that was shared with you


3. Next, you should land on the Canvas for this new Application; click Publish

4. Enter the information required and click Publish

5. You will need the hostname of the Ravello Application you have started. When the VM shows as Started on the Canvas, go to Console to access the VM and write down the hostname listed at the top of the screen.

Some tasks will also be performed within the VM as the Data Integration Platform Cloud administrator.

Task 0: Preparation Steps


In these steps, you will clean up and set up the environment for this exercise.
1. Log into the VM as the DIPC user – Password is “welcome1”


If the screensaver is on, just press Enter to open the login screen.

2. On the Desktop, double-click Start Docker Containers and wait for both containers to fully start

3. Then double-click Start DIPC WLS Managed Server to start WebLogic


4. Log into Data Integration Platform Cloud


a. Use VNC to log into the VM. Use your favorite VNC client and enter <Ravello
Instance Hostname>:5901 as the address to connect to

Note: If you are unable to connect via VNC, you can start the VNC server manually
on the host using the vncserver command
b. When prompted, enter the password #DIPCR0CKS# and click OK

c. Go to Applications > Internet and click Google Chrome

d. In Chrome, open the DIPC Home bookmark or go to localhost:8001/dicloud/login.html
e. Log in with weblogic/welcome1

After a few seconds, the following page should appear:
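As an aside on the VNC address in step 4a: VNC display numbers map to TCP ports as 5900 + the display number, which is why display :1 is reached on port 5901. A small shell sketch of that arithmetic (the hostname below is a placeholder, not the lab's actual value):

```shell
# Placeholder hostname; substitute the one you noted from the Ravello Console.
RAVELLO_HOST="myapp.example.ravcloud.com"
VNC_DISPLAY=1                      # the lab's vncserver runs on display :1
VNC_PORT=$((5900 + VNC_DISPLAY))   # display :N listens on TCP port 5900+N
echo "Point your VNC client at ${RAVELLO_HOST}:${VNC_PORT}"
```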


5. If you don’t have a DIPC Agent already set up from a previous lab, please refer to
the Synchronize Data lab to install and configure one.
NOTE: MAKE SURE TO SELECT THE DATA INTEGRATOR (ODI) PLUGIN,
OTHERWISE THIS LAB WON’T WORK


6. Use the DIPC Demo Client

a. This hands-on lab uses a JDBC utility client that was built specifically for this
demo. This client is NOT part of DIPC; however, it does help visualize the
Synchronize Data and ODI Execution Job process
b. Open a Terminal

c. From the home directory, execute ./startDIPCDemoClient.sh

d. The Demo Client will open and should be populated with the following data

Note: Click Initial Load Complete if needed


Task 1: Create ODI Execution Task


1. Click on Home in Navigation bar

2. Click Create under ODI Execution (you may need to scroll right in the carousel to see it)

3. The ODI Execution Task screen appears

4. Enter:
• Name: Load Sales DW
• Description: Execute ODI Scenario to load OLTP data into DW

5. Under Connections, click Import to import a deployment archive, created in ODI Studio, that contains the Scenario we want to execute


a. Navigate to DIPC/DIPC 18.2.3 and select LD_SALES_DIPC_18.2.3.zip

b. Click Open and wait for the import operation to complete (this will take about 2
minutes)
c. Click the Scenario Name drop-down and select LD_TRG_SALES 001

This scenario joins SRC_ORDERS and SRC_ORDER_LINES, aggregates the data,
filters for orders with a status of ‘CLO’ (closed), and performs an incremental
update. As a result, only replicated rows with a status of ‘CLO’ are loaded into the
target Sales DW.
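The transformation this scenario performs can be sketched in miniature. The following is an illustrative sketch only — the column names (ORDER_ID, STATUS, AMOUNT, TOTAL) and sample rows are assumptions, not taken from the actual ODI model — using SQLite to emulate the join, the ‘CLO’ filter, the aggregation, and a merge-style incremental update:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Toy stand-ins for the source and target tables (column names assumed).
cur.executescript("""
CREATE TABLE SRC_ORDERS (ORDER_ID INTEGER PRIMARY KEY, STATUS TEXT);
CREATE TABLE SRC_ORDER_LINES (ORDER_ID INTEGER, AMOUNT REAL);
CREATE TABLE TRG_SALES (ORDER_ID INTEGER PRIMARY KEY, TOTAL REAL);
INSERT INTO SRC_ORDERS VALUES (1, 'CLO'), (2, 'NEW'), (3, 'CLO');
INSERT INTO SRC_ORDER_LINES VALUES (1, 10.0), (1, 5.0), (2, 99.0), (3, 7.5);
""")

# Join the sources, keep only closed orders, aggregate, then merge into
# the target. ON CONFLICT plays the role of ODI's incremental update:
# existing target rows are updated in place instead of duplicated.
cur.execute("""
INSERT INTO TRG_SALES (ORDER_ID, TOTAL)
SELECT o.ORDER_ID, SUM(l.AMOUNT)
FROM SRC_ORDERS o
JOIN SRC_ORDER_LINES l ON l.ORDER_ID = o.ORDER_ID
WHERE o.STATUS = 'CLO'
GROUP BY o.ORDER_ID
ON CONFLICT(ORDER_ID) DO UPDATE SET TOTAL = excluded.TOTAL
""")
con.commit()

loaded = sorted(cur.execute("SELECT ORDER_ID, TOTAL FROM TRG_SALES"))
print(loaded)  # order 2 (status 'NEW') is excluded from the target
```

Re-running the load after new closed orders replicate in would update existing target rows rather than duplicate them, which is the point of the incremental update.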

6. In the Connection table pick the following Connections and Schemas:


a. ODI_DEMO_TRG:
i. Connection: Sync Target
ii. Schema: ODI_TGT
b. ODI_DEMO_SRC:
i. Connection: Sync Target
ii. Schema: DIPC_TGT
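The table in step 6 ties each ODI logical schema used by the scenario to a DIPC connection and a physical schema. The same mapping as a plain-data sketch (the dictionary structure below is illustrative, not a real DIPC API):

```python
# Values taken from step 6 of the lab; the structure itself is hypothetical.
odi_execution_task = {
    "name": "Load Sales DW",
    "scenario": "LD_TRG_SALES 001",
    "connections": {
        "ODI_DEMO_TRG": {"connection": "Sync Target", "schema": "ODI_TGT"},
        "ODI_DEMO_SRC": {"connection": "Sync Target", "schema": "DIPC_TGT"},
    },
}

# At run time, each logical schema resolves to its physical schema:
resolved = {logical: cfg["schema"]
            for logical, cfg in odi_execution_task["connections"].items()}
print(resolved)
```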

7. Click on Save & Run to execute the Task


8. You will be redirected to the Jobs page, where you will see a notification that a new Job execution has started


9. When the Job appears in the list you can click on it to get more details

10. The Job Details page contains all the details about the scenario execution in ODI

You can click on any Step in the Job Execution to review the code generated by ODI

When the Job has completed successfully, you will see that the data has been fully loaded into the target Sales Data Warehouse using ODI, through the ODI Execution Task in the DIPC Console.


Summary
In this lab, we have seen how DIPC and standalone ODI running in DIPC can work hand in hand
to implement an end-to-end data flow.

