Data Integration Platform Cloud: Hands-on Lab
The following lessons walk you through the steps needed to create a Data Integration Platform
Cloud ODI Execution Task that invokes an existing ODI scenario to load data from an OLTP system into a target data
warehouse.
Overview
Time to Complete
Perform all tasks – 20 Minutes
Prerequisites
Before you begin this tutorial, you should have access to Ravello and the DIPC Blueprint that was shared with you. If you have not already installed a DIPC Agent with the Data Integrator (ODI) plugin, the Synchronize Data lab covers how to do so.
Lab Environment
For this lab, the Data Integration Platform Cloud and the client environment are contained
within one environment for simplicity. Most user interactions with Data Integration Platform
Cloud will be through a browser installed on your local machine (Chrome preferred; Firefox is
also supported).
Ravello Setup
1. Log into Ravello, click on Applications, then click on ‘+Create Application’
2. Enter a Name and pick the DIPC Blueprint that was shared with you
3. Next you should land in the Canvas for this new Application; click Publish
4. You will need the hostname of the Ravello Application you have started. When
the VM shows as Started on the Canvas, go to the Console to get access to the VM and
write down the hostname listed at the top of the screen
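Optionally, before moving on, you can confirm that the published VM is reachable from your machine. The sketch below is not part of the lab instructions: the hostname is a placeholder for the one you just wrote down, and port 22 is only an assumption about which ports the blueprint exposes.

import socket

RAVELLO_HOST = "your-vm.srv.ravcloud.com"  # placeholder: use the hostname you wrote down
PORT = 22                                  # assumed open port (SSH); adjust to what your blueprint exposes

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    status = "reachable" if is_reachable(RAVELLO_HOST, PORT) else "not reachable yet"
    print(f"{RAVELLO_HOST}:{PORT} is {status}")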
There are also tasks that will be performed within the VM as the Data Integration
Platform Cloud administrator.
2. On the Desktop, double-click on Start Docker Containers and wait for both of
them to fully start
3. Then double-click on Start DIPC WLS Managed Server to start WebLogic (an optional readiness check follows these steps)
Note: If you are unable to connect to VNC, you can start vncserver manually on
the host using the vncserver command
b. When prompted, enter the password: #DIPCR0CKS# and click OK
5. If you don’t have a DIPC Agent already set up from a previous lab, please refer to
the Synchronize Data lab to install and configure one
NOTE: Make sure to select the Data Integrator (ODI) plugin; otherwise this lab will not work
d. The Demo Client will open and should already be populated with data
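Starting WebLogic can take several minutes. If you would prefer to script the wait instead of repeatedly refreshing the browser, the sketch below polls the DIPC console URL until the managed server answers. It is an optional helper, not part of the lab: the hostname, port, and path in CONSOLE_URL are placeholders you would replace with the values for your own VM.

import time
import urllib.error
import urllib.request

CONSOLE_URL = "http://your-vm.srv.ravcloud.com:7003/"  # placeholder: host, port, and path are assumptions

def wait_for_console(url: str, attempts: int = 30, delay: float = 10.0) -> bool:
    """Return True once the URL answers any HTTP response; False if it never does."""
    for _ in range(attempts):
        try:
            urllib.request.urlopen(url, timeout=5)
            return True
        except urllib.error.HTTPError:
            # An HTTP error (login redirect, 401, 403, ...) still means the server is up.
            return True
        except (urllib.error.URLError, OSError):
            time.sleep(delay)  # not answering yet; wait and retry
    return False

if __name__ == "__main__":
    print("Console is up" if wait_for_console(CONSOLE_URL) else "Console still not responding")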
2. Click on Create under ODI Execution (you may need to scroll right in the carousel to see it)
4. Enter the following:
Name: Load Sales DW
Description: Execute ODI Scenario to load OLTP data into DW
b. Click Open and wait for the import operation to complete (this will take about 2
minutes)
c. Click on the Scenario Name drop-down and select LD_TRG_SALES 001
9. When the Job appears in the list, you can click on it to get more details
10. The Job Details page contains all the details about the scenario execution in ODI
You can click on any Step in the Job Execution to review the code generated by ODI.
When the Job has completed successfully, you will see that the data has been fully loaded into
the target Sales Data Warehouse using ODI, through the ODI Execution Task in the DIPC Console.
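If you would like to confirm the load from outside the DIPC Console as well, a simple row count against the target schema is enough. The sketch below uses the python-oracledb driver; the connect string, credentials, and the TRG_SALES table name are all assumptions, so substitute the actual values from your target Sales Data Warehouse.

import oracledb  # python-oracledb driver: pip install oracledb

DSN = "your-vm.srv.ravcloud.com:1521/ORCLPDB"  # placeholder connect string
USER = "dw_user"                               # placeholder target schema
PASSWORD = "your_password"                     # placeholder password

def target_row_count(table: str = "TRG_SALES") -> int:
    """Return the current number of rows in the given target table."""
    with oracledb.connect(user=USER, password=PASSWORD, dsn=DSN) as conn:
        with conn.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {table}")  # table name is an assumption
            return cur.fetchone()[0]

if __name__ == "__main__":
    print(f"Rows loaded into the target table: {target_row_count()}")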
Summary
In this lab, we have seen how DIPC and standalone ODI running in DIPC can work hand in hand
to implement an end-to-end data flow.