About Workday

The document is a self-introduction and job description detailing the author's extensive experience in ERP, specifically as a Workday Technical Consultant at Accenture. It outlines daily responsibilities, including ticket resolution, report creation, and integration development, as well as specific projects involving advanced and matrix reports, EIB integrations, and payroll interfaces. The author emphasizes their proficiency in Workday tools and their ability to manage complex integrations and reporting tasks.

Self-Introduction

Job Description

I have overall work experience in ERP; my career started as a Workday integration developer.

Currently I am working with Accenture on a Support and Enhancement project, as a Workday Technical Consultant.

 I have very good experience in creating custom reports, modifying existing reports, and building calculated fields, as well as integrations.
 I am responsible for EIB loads and EIB outbound integrations, as well as Core Connector: Worker (CCW), PICOF, and PECI.
 I have good experience in Workday Studio and have built a couple of Workday Studio integrations.
 Apart from this, I have experience in business processes and Document Transformation (DT).
 I have knowledge of creating different types of security groups, including user-based and segment-based security groups.

Day-to-day Activities and About the Project

Day-to-day responsibilities

It is a Support and Enhancement project; I work on it as a Technical Consultant.

We receive tickets from the business; we use ServiceNow as the ticketing tool.

 My manager picks up the tickets and assigns them to the team members.
 We pick up the tickets, resolve them, update the details, and then respond back to the business team.
 If a new client requirement comes in, we also build a new integration from scratch, based on the data availability.
 I take care of any technical challenges that come up in the project, on both the reporting side and the integration side.
 I support the Workday applications by providing production support to the onsite client.
 I handle requirement changes and bug fixes.
How do you work day-to-day with the USA team?

We have a daily standup call with the US team, where we give updates on the tickets we are working on. If I am stuck on anything at any level, I inform the lead.

In the meeting we discuss the difficult issues we are facing, where we currently stand, and how soon we can finish each ticket.

Coming to my real-time scenarios:

 I have created different types of reports: birthday notifications, pre-hire reports, worker headcount including terminated workers, future hire and future termination worker detail reports, and job requisition reports.
 I have created different types of matrix reports: worker headcount by company, worker details by location, and matrix reports built mainly for dashboards.
 I have created different types of dashboards: time dashboards, benefits dashboards, team performance dashboards, compensation dashboards, and recruiting dashboards.
 I have created EIB inbound and outbound integrations: loading compensation, related-employee data, termination details, hire details, and role assignments.
 I have created a Core Connector: Worker (CCW) integration that extracts employee demographic data and the changes entered on a daily basis.
 I have recently built a payroll interface (PICOF) where we extract employee demographic data and payroll-related information; it runs on a monthly basis.
 I have built some Studio integrations as well: compensation loading, employee seniority data loading, employee training details loading, and employee background check data loading.

Advanced Report

I have mostly worked on advanced reports. I have not had the chance to work on composite reports, but I have very good knowledge of them.

Recently I created an advanced report. The requirement was that our company wanted the worker headcount, along with details of terminated workers, future hires, and future terminations.

So I created an advanced report:

Data source: All Workers

Business object: Worker

Fields: Employee ID, Name, Hire Date, Termination Date

Calculated fields: Increment / Decrement

Advanced Report

Recently I created an advanced report. The requirement was that every day, the company wanted the details of employees celebrating their birthday, so that a best-wishes notification message could be sent to those employees.

 So we created an advanced report, selecting Worker as the business object.
 The data source is All Active Workers.
 I selected the fields Employee ID, Full Name, Employee Status, Designation, Location, Date of Birth, Hire Date, and a text constant.
 The calculated fields build the date comparison, and I created a configured message so that a notification or mail is sent to the worker. The report is scheduled on a daily basis.
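The daily birthday check described above can be sketched in Python. This is a minimal illustration; the worker records and field names are made up for the example, not actual Workday report output.

```python
from datetime import date

# Hypothetical worker records, mirroring the report fields above
workers = [
    {"employee_id": "E001", "full_name": "Asha Rao", "date_of_birth": date(1990, 5, 14)},
    {"employee_id": "E002", "full_name": "Ken Sato", "date_of_birth": date(1985, 12, 1)},
]

def birthday_today(workers, today=None):
    """Return workers whose birthday (month and day) matches today."""
    today = today or date.today()
    return [w for w in workers
            if (w["date_of_birth"].month, w["date_of_birth"].day) == (today.month, today.day)]

# Simulate the daily scheduled run for 14 May
hits = birthday_today(workers, today=date(2024, 5, 14))
for w in hits:
    print(f"Happy birthday, {w['full_name']}!")
```

In the real report, the same month/day comparison is done with calculated fields against the current date, and the scheduled run sends the configured message.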

What type of data source do you use regularly?

It depends on what kind of data we need to look at: Worker, Action Event, Payroll Result Lines, etc. In general terms, the data sources we commonly see are: Active and Terminated Workers, All Workers, Workers by Organization, Workers for Prompt, Business Process Transactions (Dated, Awaiting Action), Worker Business Processes, and payroll-results-related data sources.

Report: Complex Report

Recently I created an advanced report. The requirement was:

 They wanted pre-hire employee details, excluding anyone who had already been placed as a contractor or hired as an employee.
 So I created an advanced report on pre-hires.
 I selected the data source All Pre-Hires with the business object Pre-Hire, and selected fields like Pre-Hire Employee, Available for Hire, Address, Pre-Hire Date, Pre-Hire Pool (country/location), Source, Referred By, Email, and Phone Number.
 This report runs on a monthly schedule.

Advanced Report

 Recently I created an advanced report. The requirement was that they wanted pre-hire employee details, excluding anyone who had already been placed as a contractor or hired as an employee.
 So I created an advanced report on pre-hires (all pre-hires available for contract hire).
 Here I selected the data source. The data source is "All Pre-Hires" and the business object is "Pre-Hire", and we selected fields like Pre-Hire Employee, Available for Hire, Address, Pre-Hire Date, Pre-Hire Pools, Source, Referred By, Email, and Phone Number.
 This report can be scheduled on a daily basis.
 This report can be delivered as a web service: Actions  Web Service  the report's WSDL/URL.

Matrix Report

 A matrix report is designed with drillable options.
For example, we have a report where high-level management wants to see each company's headcount, with the option to view it by location, organization, or region using the View By option.
 It can be used in the mobile application; that is one of its advantages.
 A matrix report can be moved onto a dashboard.
 It cannot be called at the integration level; that is not available (for example, as a web service).
 Options configured in a matrix report:

- Row grouping
- Column grouping
- Summarize: non-configurable calculations (Count, Sum, Average, etc.)
- View By
- Detail Data option available
- Output: table, pie chart, graph, etc.

Matrix Report

Recently I created a matrix report.

The requirement is that they want the company's total headcount, location-wise. It is a global company with multiple plants around the world.

They want the employees belonging to three countries (Italy, Japan, China): location-wise employee details, demographic data, total compensation, and total employment. They want the output in pie chart and table format.

Here I took the data source "All Employees" and configured:

 Row grouping: Company
 Column grouping: (none)
 Summarize: Count, plus sum totals per company
 A filter restricting the report to the employees of those 3 countries
 The drill-down option, where I selected specific fields: Employee ID, Location, Gender, Compensation, Hire Date; under Detail Data I mentioned the same fields as well.

In the output format, the pie chart and table formats are enabled, and the View By and Detail View options are available.
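The row grouping + summarize step of this matrix report amounts to a filtered count per company. A rough Python sketch, with made-up employee rows:

```python
from collections import Counter

# Hypothetical employee rows, mirroring the matrix report's source data
employees = [
    {"employee_id": "E1", "company": "Acme Italy", "country": "Italy"},
    {"employee_id": "E2", "company": "Acme Italy", "country": "Italy"},
    {"employee_id": "E3", "company": "Acme Japan", "country": "Japan"},
    {"employee_id": "E4", "company": "Acme US",    "country": "USA"},
]

def headcount_by_company(rows, countries):
    """Filter to the requested countries, then count rows per company
    (row grouping = Company, summarize = Count)."""
    filtered = [r for r in rows if r["country"] in countries]
    return Counter(r["company"] for r in filtered)

counts = headcount_by_company(employees, {"Italy", "Japan", "China"})
print(dict(counts))  # e.g. {'Acme Italy': 2, 'Acme Japan': 1}
```

The drill-down view then corresponds to listing the filtered rows behind each count.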


EIB (Enterprise Interface Builder) – Outbound

 Recently I built an EIB outbound integration for a vendor called "Better Workers". They maintain employee demographic data: Employee ID, First Name, Last Name, Email ID, Phone Number, Hire Date, and some other information related to worker demographics.
 They were looking for the details of the workers belonging to particular locations. HR wants the workers of two locations, because we are a global company with a presence at the global level.
 HR wants these two locations, i) Missouri and ii) California, and at the same time they want to receive a full file in CSV format.
 Every day we observe the worker count in these two locations; it is not more than 5,000 workers.
 So we decided to go with EIB, because the number of fields is also minimal: only 15 fields in total.
 We built a custom report; it works as the data source, based on their requirement.
 We set up a filter on those locations only (California and Missouri).
 We selected "Enable as Web Service" in the Advanced tab.
 After that, we built the EIB outbound integration with its 3 steps:

i) Get Data

ii) Transform

iii) Deliver

 In the Get Data step, I selected the data source type Custom Report and gave my report name.
 The Transform step then shows the transformation types:

 Custom Transformation

 Custom Report Transformation

Here I selected Custom Transformation, attached the XSLT transformation, and prepared the XSLT for it.

 In the Deliver step, I gave the file name and configured the delivery method (FTP/SFTP).
 This outbound integration runs on a daily basis.
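The Transform step reshapes the report's XML into the vendor's CSV. A minimal Python sketch of that reshaping (the real step is done with XSLT inside the EIB; the element names below are illustrative, and real Workday report XML uses the wd: namespace):

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical report output; field names come from the custom report definition.
report_xml = """
<Report_Data>
  <Report_Entry>
    <Employee_ID>E100</Employee_ID><First_Name>Dana</First_Name><Location>California</Location>
  </Report_Entry>
  <Report_Entry>
    <Employee_ID>E101</Employee_ID><First_Name>Lee</First_Name><Location>Missouri</Location>
  </Report_Entry>
</Report_Data>
"""

def report_to_csv(xml_text, fields):
    """Flatten each Report_Entry into one CSV row - the same reshaping the
    EIB Transform step performs with XSLT."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(fields)  # header row
    for entry in root.findall("Report_Entry"):
        writer.writerow(entry.findtext(f, default="") for f in fields)
    return out.getvalue()

csv_text = report_to_csv(report_xml, ["Employee_ID", "First_Name", "Location"])
print(csv_text)
```

The Deliver step then drops the resulting file on the configured FTP/SFTP endpoint.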

CORE CONNECTOR

 Recently I built a Core Connector: Worker integration.
 The requirement is that they want the employee demographic data: whatever employee details change on a monthly / weekly / daily basis. So they wanted this complete integration.
 Here I used Core Connector: Worker + Document Transformation (DT).
 I selected all the fields, like Employee ID, Employee Name, Photo, First Name, Last Name, Phone Number, Email ID, Position, Status, etc., in the Integration Field Attributes section.
 I configured the version here as well.
 We need to send only the data changes entered within the system.
 It extracts all the data changes between the last successful run date and the as-of-entry moment.
 If I want only particular data, I select the relevant transaction log types, by which we can reduce the integration run time.

Launch parameters:

 As Of Entry Moment
 Effective Date
 Last Successful As Of Entry Moment
 Last Successful Effective Date

 We built the DT based on the logic, and we wrote the XSLT for it.
 In the DT we can take a sequence generator so that the output file comes out in integration run order.
 We set up the business process for this Core Connector. In this we can configure the delivery service to the SFTP location provided by them, with details like the SFTP path, directory, username, and password.
 This integration runs on a monthly / … basis.
 The output file is a CSV file.
 We transfer the output file in the vendor's requested output format, which is comma (,) separated.
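The change-detection behaviour described above (extract everything entered between the last successful run and the as-of-entry moment) can be sketched like this; the transaction log rows are made up for illustration:

```python
from datetime import datetime

# Hypothetical transaction log; CCW compares each entry moment against the
# last successful run to decide what goes into the change file.
transactions = [
    {"employee_id": "E1", "field": "Phone",    "entered": datetime(2024, 3, 1, 9, 0)},
    {"employee_id": "E2", "field": "Position", "entered": datetime(2024, 3, 2, 14, 30)},
    {"employee_id": "E1", "field": "Email",    "entered": datetime(2024, 2, 20, 8, 0)},
]

def changes_since(transactions, last_successful_run, as_of_entry_moment):
    """Pick up only transactions entered after the last successful run,
    up to the as-of-entry moment (the CCW launch-parameter window)."""
    return [t for t in transactions
            if last_successful_run < t["entered"] <= as_of_entry_moment]

window = changes_since(transactions,
                       last_successful_run=datetime(2024, 2, 28),
                       as_of_entry_moment=datetime(2024, 3, 3))
print([t["employee_id"] for t in window])  # ['E1', 'E2']
```

Filtering to particular transaction log types narrows this list further, which is why it reduces the run time.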

Payroll Interface

It extracts all payroll data, such as earnings, deductions, taxes, and pay-impacting company policies like leave of absence, compensation, and so on.

 Enable the integration services.
 Integration attributes:
 Version
 Vendor: we send the payroll data to an external vendor for calculation purposes.
Ex: ADP, SAP, etc.
 Primary Payroll Integration

Primary: it runs automatically through a scheduled process (daily / weekly / monthly).

Ad hoc: when we want to run the integration manually.

(It captures data from the last successful run date.)

 Launch parameters

Pay Group: there are different pay groups here; we can select all employees, selected employees, or a particular group, along with the pay period selection.

 Use of current pay period

Either monthly or weekly. For example:

Monthly: 1st Jan – 30th Jan

Weekly: 1st Jan – 7th Jan

 Earliest open pay period

If you want to capture last month's data changes, we can capture them here; it completes the previous month's unprocessed calculations.


PICOF (Payroll Interface Common Output File): Real-time scenario

 Recently I built a payroll integration for a payroll vendor called ADP.
 It sends employee demographic data along with payroll-related information like payment election data, allowance data, and one-time payment allowance data (tax, commission).
 Here we used the Payroll Interface connector + DT.
 The pay group requirement here is that we have to deliver a single file covering all the pay groups.
 Basically we triggered 4 pay groups here.

These 4 groups' payroll is done by the same payroll vendor (ADP).

All 4 groups also have the same pay period: monthly.

 We selected all the fields in the Integration Field Attributes section.
 We need to send the file on a daily basis, as per the changes entered into the system.

Completion of PICOF

 After that we built a DT based on the logic.
 The request here is that we need to send the file on a daily basis, as per the changes entered into the system.
 We built the DT and wrote the XSLT for it.
 Whatever transactions were entered within the pay period, i.e., between the last successful run date and the current date, those transactions are picked up (the integration extracts that data).
 We transfer the data in the vendor's requested output format, which is CSV, i.e., comma (,) separated.
 After that, we configure the delivery service with the SFTP details provided by the vendor, like the SFTP path, directory, username, and password.
 The vendor holds the private key to decrypt the file; we used their public key to encrypt the file on our side.

PICOF vs PECI

PICOF:
 It gives the most recent transaction. Ex: if a phone number changed 3 times in a day, it gives only the most recent transaction.
 If you run the integration for 5 pay groups, it gives a single output file (.xml).
 A separate ad hoc integration with launch parameters is there; we only enable the option.
 PICOF extracts the current pay period + a number of look-ahead days.

PECI:
 It gives all the changes since the last successful run. Ex: if a phone number changed 3 times in a day, it gives all the transactions (look-ahead pay period).
 It gives a different output file for each pay group.
 Multiple launch parameters are there as in-built options we can enable.
 It always gives effective-dated termination records.

 Which vendor did you work with for PICOF? – ADP.
 Which ADP system are you sending the file to?

ADP has multiple systems:

ADP Workforce Now

ADP Streamline

 I want to capture payroll-related changes; how do you get them?

It is possible through CCW as well as PICOF.

 When do you use CCW and when do you use PICOF?

CCW is used for contract employees, for compensation changes only.

PICOF is used for direct employees, because PICOF has pay groups and direct employees have a pay group; contract employees will not have a pay group.

ISU

Whenever you create a new integration, we create an ISU (Integration System User). We need to provide the proper security access to that ISU, and then attach it to our new integration.

Workday Cloud Connectors

I haven't worked on these, but I have knowledge of them.

o These are in-built connectors, so there is not much work to do; we just select and enable the service.

o These are in-built templates for the client's vendors.

o For example, Vanguard as a benefits vendor.

Why choose Studio rather than an EIB integration?

 A CSV file sent by a third party can be loaded through EIB.
 A high volume of data cannot be loaded by using EIB, so we use a Studio integration instead.
 Workday Studio integrations are used to build complex inbound and outbound integrations.

STUDIO

I recently implemented a Studio inbound integration.

We have Salesforce sales commission data to load into Workday. We receive the data on a weekly / monthly basis from Salesforce and load it into the Additional Worker Details side, at the "custom object" level, with a Studio integration.

1. Inbound Integration

The requirement is that our organization wants to give gift cards (like bonus cards) for Christmas. This is a Request One-Time Payment, so I used the Compensation web service.

 We retrieved the data through the retrieval service over SFTP and pulled the file; the file format is CSV.
 We need to convert the data from CSV to XML, so we use the "CSV to XML" sub-component in an asynchronous mediation.
 There is no way to load all the data at once, so we take a splitter. It splits the data row by row into a root/row combination.
 I took another asynchronous mediation and used the Evaluate Expression sub-component. It stores all the split rows in props variables. The syntax I used, for example:

Ex: props("Employee_ID") = parts[0].xpath("row/empID")

 I created a SOAP message; initially I tested it in the sample web service tester, then saved the SOAP message in a Write component. I used the Compensation web service.
 To load any data into Workday, we need a "Workday Out-SOAP" component. Here we declare the web service and version.
 If any errors occur while loading the data, we use a "Send Error" component to capture all the failed rows.
 I took another asynchronous mediation to collect the success and failure row information, using an Evaluate Expression. The syntax, for example:

Ex: props("p.success.log") = new java.lang.StringBuffer()

props("p.error.log") = new java.lang.StringBuffer()
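The split-then-build-SOAP pattern above can be sketched in Python: split the CSV row by row and build one request payload per row. The element names are illustrative, not the real Compensation web service schema:

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical inbound file; the real field names come from the vendor spec.
csv_text = "Employee_ID,Amount\nE100,50\nE101,75\n"

def rows_to_soap_bodies(csv_text):
    """Split the CSV row by row and build one request payload per row -
    the splitter + write-component pattern described above."""
    bodies = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        req = ET.Element("Request_One_Time_Payment")
        ET.SubElement(req, "Employee_ID").text = row["Employee_ID"]
        ET.SubElement(req, "Amount").text = row["Amount"]
        bodies.append(ET.tostring(req, encoding="unicode"))
    return bodies

for body in rows_to_soap_bodies(csv_text):
    print(body)
```

In Studio, each such body is submitted through the Workday Out-SOAP component, and failed rows go to the Send Error route.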

2. Inbound

I recently built a Studio inbound integration; we get the details from a vendor.

The requirement is to load training details into Workday. As of now, the Learning module is not enabled in our tenant, so we are using Cornerstone as the training provider.

After a training is completed, Cornerstone keeps all the worker details in a CSV file in an external SFTP location.

 My work is to retrieve the data and load it into Workday in XML format.
 First of all, retrieve the file from the SFTP location by using the retrieval service.
 We take an Evaluate component, where we wrote the condition:
 If da.size() > 0, the process starts, because there are files.
 If da.size() == 0, there are no files, so we skip the process (there is no need to continue).
 Otherwise, we use a router there.
 I used a strategy that routes each document one at a time. Once the first document passes through the router component, it goes to a synchronous mediation component.
 Then I evaluate the condition of whether the CSV file has all the required data or not, because sometimes some values are missing.
 I validate this, and if any field is missing, I send an error message (message: this worker doesn't have a file).
 If everything goes well, we use the CSV to XML component; it converts the file into the required XML format.
 After that I take a splitter (XPath splitter), because there is no way to load all the data at once; it splits the rows one by one into a root/row combination.
 After that we prepare the SOAP content; initially I test it in the web service tester, and I write the SOAP content in the Write component.
 To load any data into Workday, we need a "Workday Out-SOAP" component; here we declare the web service name (Talent) and version.
 If any errors occur at load time, we use a "Send Error" component.
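The required-field validation step above can be sketched as a pre-load check: inspect each row for missing values before it reaches the loader. The field names are illustrative:

```python
import csv
import io

REQUIRED = ["Employee_ID", "Course", "Completion_Date"]  # illustrative fields

def validate_rows(csv_text, required=REQUIRED):
    """Check each row for missing required values before loading - the
    validation step described above. Returns (good_rows, errors)."""
    good, errors = [], []
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text)), start=1):
        missing = [f for f in required if not row.get(f)]
        if missing:
            errors.append(f"row {i}: missing {', '.join(missing)}")
        else:
            good.append(row)
    return good, errors

sample = ("Employee_ID,Course,Completion_Date\n"
          "E1,Safety 101,2024-02-01\n"
          "E2,,2024-02-03\n")
good, errors = validate_rows(sample)
print(len(good), errors)  # 1 ['row 2: missing Course']
```

Only the good rows proceed to CSV-to-XML conversion; the error messages go out through the Send Error route.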
3. Inbound

Recently I built a Studio inbound integration. The requirement is that the worker seniority date is calculated within the Xerox system, so we get the worker seniority data file from the Xerox system through SFTP, and we load the data into Workday for each worker.

 First of all, we get the data file from the retrieval service, using "Workday In" to retrieve it.
 We want to check whether we have one file or multiple files in the SFTP location.
 So I take a Route component. Here I used the MVEL strategy, with the condition:
 da.size() != 1  multiple files were found, so I stop the process here.
 da.size() == 1  there is exactly one file, so I continue the process.
 After that I take one "Async Mediation" component, in which we take an Evaluate sub-component; here we wrote the condition:
 If da.size() > 0 (the retrieved file variable has data), continue, because the file has some data.
 props("empty.file") = 0  if the file is empty, we stop the process right here; we don't want to continue.
 We get the file in CSV format, so we need to convert it to XML.
 I take an Async Mediation component and use the CSV to XML sub-component in it; it converts the file into the required XML format.
 After that I take a "Splitter" component. Here I used the XML splitter, because there is no way to load all the data at once; it splits all the rows one by one into a root/row combination.
 After that I prepare a SOAP message; initially I test it in the web service tester, and I save the SOAP content in the Write component.
 To load any data into Workday, we need a Workday Out-SOAP component. Here we declare the web service name (Human Resources) and version.
 If any errors occur at that time, we use a "Send Error" component.

Splitter

1. XPath splitter

2. XML splitter

3. Stream splitter

Router

 All strategy – used when we call multiple sub-routes (for example, multiple reports).

 MVEL strategy – used when we want to write conditions comparing the input data fields.

 Decilator strategy – routes each document individually.

Router: we want to send data different ways, condition-wise. For example, someone asks for a text file to be called and sent based on a condition.

Aggregator

The aggregator aggregates the success log and error log information into a single file.

 Size batch strategy: captures all the messages and aggregates them batch by batch, based on batch size.

 Time batch strategy: captures all the information arriving within a time window.

 Content correlation strategy: groups messages whose content matches.
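A minimal sketch of the aggregator idea: collect per-row success and error messages and merge them into one summary payload (the batching strategies decide *when* the merge fires; here we simply merge everything):

```python
def aggregate(results):
    """results: list of (ok: bool, message: str) per processed row.
    Merge into a single success log and a single error log."""
    success_log, error_log = [], []
    for ok, message in results:
        (success_log if ok else error_log).append(message)
    return {"success": "\n".join(success_log), "error": "\n".join(error_log)}

summary = aggregate([(True, "E1 loaded"),
                     (False, "E2 missing course"),
                     (True, "E3 loaded")])
print(summary["error"])  # E2 missing course
```

In Studio, the equivalent logs are typically accumulated in props (string buffers) and then written out as one file at the end of the run.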

Error Handler

 Local error handling

It works at the component level; if any error occurs in a component, it puts the integration message through the local error route (local out).

 Global error handling

If any unexpected errors occur at the entire integration level, we use global error handling.

# Map

I know about #map. The Evaluate Expression stores all the values in a #map and checks conditions against it; for example, if an employee ID already exists, we execute the corresponding logic. In that case we use a #map.
iLoad

o We can load complex data by using iLoad.

o It is used if you need to load a high volume of data from an external system into Workday.

o It works by using a spreadsheet template.

Page Gets Component

 It is also used for calling web services and checking document deliverables; it has a lot of options.

 We can perform tasks condition-wise.

 We can execute a file at a particular point in time.

Ex: we want to execute the file (some calculation) every 25 rows.

Workday Out-REST

If you want to pull any report, you can use the Workday Out-REST component.

 While loading one lakh (100,000) records, we get an error after 50,000 records; what do you do at that time?

We can attach a Send Error component and capture all the error rows, routing them to another component:

Async mediation  Evaluate Expression

Ex: props("p.error.log").append(props("Employee_ID"))

 One integration can load only one lakh employee details, but the vendor wants to load two lakh records. What do we do at that time?

By using iLoad; it is also one of the web services.

Evaluate Expression

It is used to store or declare values. The syntax we used, for example:

Ex: props("Employee_ID") = parts[0].xpath("row/empID")

props("Effective_Date") = parts[0].xpath("employee/seniorityDate")

props("Record_Count") = props("Record_Count") + 1
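A rough Python analogue of the Evaluate Expression step: props acts as a property bag carried through the mediation, filled from each split row. The element names are illustrative:

```python
import xml.etree.ElementTree as ET

props = {"Record_Count": 0}

def evaluate(row_xml, props):
    """Store values from one split row into props and bump the record counter."""
    row = ET.fromstring(row_xml)
    props["Employee_ID"] = row.findtext("empID")
    props["Effective_Date"] = row.findtext("seniorityDate")
    props["Record_Count"] += 1
    return props

evaluate("<row><empID>E7</empID><seniorityDate>2019-04-01</seniorityDate></row>", props)
print(props["Employee_ID"], props["Record_Count"])  # E7 1
```

Each pass through the splitter re-runs this evaluation with the next row, so props always holds the current row's values plus the running count.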

Workday Integration: start the integration by using the Workday Integration component.

Workday Out: deliver the file by using the Workday Out component.

Router

MVEL: for comparing the input data fields and the custom report data fields.

Asynchronous Mediation

It is a collection of Write, Store, and Evaluate components, as well as a Validation component.

It is one-way.

Synchronous Mediation

It is two-way.

Variable:

What is the difference between SOAP and REST?

What are the XTT and ETV functions? They are Workday-provided XSLT extensions applied during the transformation step in the development environment.

Encrypt: only those who have the proper security credentials can get the file; that is what encryption means here.

Ex: with PGP certification, only those who have the key will get the file.

EIB Outbound

We can get the data through a custom report and web service  it works as the data source.
