About Workday
Job Description
I have several years of work experience in ERP. My career started as a Workday integration
developer.
Currently I am working with Accenture on a support and enhancement project, as a Workday
Technical Consultant.
I have very good experience with creating custom reports, modifying existing reports,
building calculated fields, and also integrations.
I am mainly responsible for EIB loads and EIB outbound integrations, as well as Core Connector:
Worker (CCW), PICOF, and PECI.
I have good experience in Workday Studio and, apart from the above, have worked on a couple of
Workday Studio integrations.
Experience in business processes and document transformation (DT).
Knowledge of creating different types of security groups, including user-based and segment-based
security groups.
We receive tickets from the business; we use ServiceNow as the ticketing tool.
My manager picks up the tickets and assigns them to the team members.
We pick up the tickets, resolve them, update the details, and then respond back to the
business team.
If a new client requirement comes in, we also build a new integration from scratch, based on
the data availability.
I can take care of any technical challenges that come up in the project, on both the reporting
side and the integration side.
I support the Workday application, providing production support to the onsite client:
requirement changes and bug fixes.
How do you work day to day with the USA team?
We have a daily standup call with the US team, where we give updates on the tickets we are
working on. If I am stuck on anything at any level, I inform the lead.
In the meeting we discuss the difficult issues we are facing, where we currently stand,
and how soon we can finish each ticket.
I have created different types of reports: birthday notifications, all pre-hire reports,
headcount of terminated workers, future-hire and future-termination worker detail
reports, and job requisition reports.
I have also created different types of matrix reports, such as worker headcount by company and
worker details by location, mainly as matrix reports for dashboard purposes.
I have created different types of dashboards as well: a time dashboard, benefits dashboards,
a team performance dashboard, a compensation dashboard, and a recruiting dashboard.
I have created EIB inbound and outbound integrations: loading compensation, related-employee
data, terminated-employee details, hire data, and role assignments.
I have created a Core Connector: Worker (CCW) integration for employee demographic data,
extracting the changes entered on a daily basis.
I recently built a payroll interface (PICOF): we extract employee demographic data and
payroll-related information, and the integration runs on a monthly basis.
I have also built some Studio integrations: compensation loading, employee seniority data
loading, employee training detail loading, and employee background check detail loading.
Advanced Report
I have mostly worked with advanced reports. I did not get the chance to work on composite
reports, but I have very good knowledge of them.
Recently I created an advanced report. The requirement: our company wanted a report of worker
headcount, how many workers were terminated, and future-hire and future-termination worker
details.
I selected the fields Employee ID, Name, Hire Date, and Termination Date.
Advanced Report
Recently I created an advanced report. The requirement: every day, our company wants the
details of the employees celebrating their birthday, so we can send a best-wishes notification
message to those employees.
I selected the fields Employee ID, Full Name, Employee Status, Designation, Location, and Date.
I built a calculated field for the birthday date and created a configurable message; we then
send a notification or email to the worker (in the notification bar). The report is scheduled
on a daily basis.
The data source depends on what kind of data we need to look at: under Worker, Action Event,
Payroll Result Lines, etc. Commonly used data sources include: Active and Terminated Workers,
All Workers, Workers by Organization, Workers for Prompt, Business Process Transactions
(Dated, Awaiting Process), and Worker.
Advanced Report
The requirement: they want pre-hire employee details, but they do not want those who have
already been placed as hires.
So I created an advanced report for pre-hires (all pre-hires for contract hire).
I selected the data source "All Pre-hires" and the business object "Pre-hire", with the fields
Pre-hire Employee, Available for Hire, Address, Pre-hire Date, and Pre-hire Pools.
Matrix Report
For example, we have a report for high-level management: they want to see each company's
headcount, with views by location, organization, and region, using the View By options
available.
A matrix report cannot be called at the integration level; that is not available (for example,
as a web service).
- Row Grouping
- Column Grouping
- View by
Matrix Report
The requirement: they want the company's total headcount by location. It is a global company.
They want the workers belonging to Italy, Japan, and China: location-wise employee details,
demographic data, total compensation, and total employment. They want the output as a pie
chart and in table format.
Column grouping: X (location).
We enabled the drill-down option, where we selected specific fields: Employee ID, Location,
Gender, Compensation, and Hire Date; I included the same fields in the detail data as well.
In the output format, I enabled both the pie chart and the table format.
Recently I built an EIB outbound integration for a vendor called "Better Workers". They
maintain employee demographic data: Employee ID, First Name, Last Name, Email ID, Phone
Number, Hire Date, and some related information.
They were looking for the details of workers belonging to particular locations.
HR wants the workers of two locations, since we are a global company: i) Missouri and
ii) California. Every day we observe the worker count in these two locations; it is not
very high.
So we decided to go with EIB, because the number of fields is also minimal: only 15 fields
in total.
We set up a filter on those two locations, California and Missouri.
i) Get Data
ii) Transform
iii) Deliver
In the Get Data step I selected the data source type as Custom Report and entered my report
name.
The Transform step then shows the transformation type.
I selected Custom Transformation and attached the XSLT transformation.
In the Deliver step I entered the file name and configured the delivery method as FTP/SFTP.
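The filter and delivery described above can be sketched in plain Python (a minimal, illustrative stand-in: the worker records and field names are made up, and in the real EIB the filtering happens in the custom report itself, not in code):

```python
import csv
import io

# Hypothetical worker records, as the custom report might return them
workers = [
    {"Employee ID": "1001", "First Name": "Ana", "Location": "California"},
    {"Employee ID": "1002", "First Name": "Raj", "Location": "Texas"},
    {"Employee ID": "1003", "First Name": "Lee", "Location": "Missouri"},
]

# Filter step: keep only the two locations HR asked for
wanted = {"California", "Missouri"}
filtered = [w for w in workers if w["Location"] in wanted]

# Deliver step: write the comma-separated file the vendor expects
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["Employee ID", "First Name", "Location"])
writer.writeheader()
writer.writerows(filtered)
print(buf.getvalue())
```

In the real integration this output file would then be pushed to the vendor's FTP/SFTP endpoint configured in the Deliver step.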
CORE CONNECTOR
The requirement: they want employee demographic data and whatever changes are entered for the
employee.
I selected the fields Employee ID, Employee Name, Photo, First Name, Last Name, Phone Number,
Email ID, Position, Status, etc. in the Integration Field Attributes section.
Here we need to send the data for the changes entered within the system.
It extracts all the data changes between the last successful run date and the as-of-entry
moment.
If I want only particular data, I select a transaction type; that way we can reduce the
integration run time.
Launch parameters
As of entry moment
Effective Date
We build the DT (document transformation) based on the logic and write the XSLT for it.
In this DT we can use a sequence generator so the integration output files come out in order.
We set up the business process in this Core Connector, where we can configure the delivery
service.
We transfer the output file in the vendor's requested output format, which is comma-separated
(CSV).
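The change window described above (last successful run date to the as-of-entry moment) can be sketched like this. This is a minimal Python illustration of the idea, not CCW internals; the change-log entries and timestamps are made up:

```python
from datetime import datetime

# Illustrative change-log entries with their entry timestamps
changes = [
    {"employee_id": "1001", "field": "Phone", "entered": datetime(2024, 5, 1, 9, 0)},
    {"employee_id": "1002", "field": "Email", "entered": datetime(2024, 5, 2, 14, 30)},
    {"employee_id": "1003", "field": "Position", "entered": datetime(2024, 5, 3, 8, 15)},
]

last_successful_run = datetime(2024, 5, 1, 23, 59)   # end of the previous run
as_of_entry_moment = datetime(2024, 5, 3, 12, 0)     # this run's cutoff

# CCW-style delta: only changes entered inside the window are extracted
delta = [c for c in changes
         if last_successful_run < c["entered"] <= as_of_entry_moment]
for c in delta:
    print(c["employee_id"], c["field"])
```

On the next run, `last_successful_run` moves up to this run's cutoff, so each change is sent exactly once.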
Payroll Interface
The payroll interface provides payroll data such as earnings, deductions, and taxes from the
pay system, along with company-policy items like leave of absence, compensation, and so on.
Integration attributes
Version
Vendor: we send the payroll data to the external vendor for calculation purposes
Primary: it runs automatically via a scheduled process (daily / weekly / monthly)
Launch parameters
Pay group: there are different pay groups; we can run for all employees, selected employees,
or a particular pay group.
We send employee demographic data along with payroll-related data: payment election data,
allowance data, and time/payment allowance data (tax, commission).
The pay group requirement here is that we have to deliver a single file covering all the
pay groups. Payroll for these four pay groups is done by the same payroll vendor (ADP).
We need to send the file on a daily basis, as per the changes entered into the system.
Completion of PICOF
The request: we need to send a file on a daily basis, as per the changes entered into the
system.
Whatever transactions were entered within the pay period, i.e. between the last successful
run date and the current date, are picked up (the integration extracts that data).
We transfer the data in the vendor's requested output format, which is CSV (comma-separated).
After that, we configure the delivery service with the SFTP details provided by the vendor:
- SFTP path
- Directory
- User name
- Password
PICOF vs PECI
PICOF:
- Gives the most recent transaction. Ex: if a phone number is changed 3 times in a day, it
gives only the most recent transaction.
- If you run the integration for 5 pay groups, it gives a single output (.xml) file.
- A separate ad hoc integration exists (launch parameter).
- Extracts the current pay period + a configured number of days.
- Used for direct employees, because PICOF has a pay group for direct employees.
PECI:
- Gives all the changes from the last successful run. Ex: if a phone number is changed 3 times
in a day, it gives all the transactions.
- Gives a different output file for each pay group.
- Multiple launch parameters are available as a built-in option.
- Always gives effective termination records.
ISU
We create an ISU (Integration System User) whenever we create a new integration. We have to
create the ISU, provide the proper security access to it, and then attach it to the new
integration.
o Those are built-in connectors, so there is not much work to do there; we just enable that
service.
A high volume of data cannot be loaded using EIB, so we use a Studio integration instead.
Workday Studio integrations are used to build complex integrations, both inbound and outbound.
STUDIO
We have sales commission data in Salesforce that is loaded into Workday. We receive the data
on a weekly/monthly basis from Salesforce and load it into Additional Data on the worker
(a custom object) using a Studio integration.
1. Inbound Integration
The requirement: our organization wants to give workers a gift card (a kind of bonus card)
for Christmas.
We retrieved the data from the retrieval service through SFTP and pulled the file; the file
format is CSV.
We need to convert the data from CSV to XML, so I used a "CSV to XML" component inside an
asynchronous mediation.
I took another asynchronous mediation and used the Evaluate Expression sub-component; it
stores all the split rows in props variables. The syntax I used:
Ex: props('Employee ID') = parts[0].xpath('row/empID')
I created a SOAP message and initially tested it in the sample web service tester.
To load any data into Workday, we need a "Workday Out SOAP" component.
If any errors occurred while loading the data, we used a "Send Error" component to capture
them all.
I used another asynchronous mediation to capture all the success and failure row information.
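The CSV-to-XML step above can be sketched in plain Python (an illustrative stand-in for the Studio "CSV to XML" component; the column names and data are made up):

```python
import csv
import io
import xml.etree.ElementTree as ET

csv_data = "Employee ID,Amount\n1001,50\n1002,75\n"

# Parse the CSV and build a root/row XML structure, mirroring the
# kind of document the "CSV to XML" component produces
root = ET.Element("root")
for record in csv.DictReader(io.StringIO(csv_data)):
    row = ET.SubElement(root, "row")
    for name, value in record.items():
        # XML tag names cannot contain spaces, so replace them
        ET.SubElement(row, name.replace(" ", "_")).text = value

print(ET.tostring(root, encoding="unicode"))
```

Each `<row>` element can then be picked apart by an XPath expression such as `row/empID`, which is what the Evaluate Expression props assignments rely on.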
2. Inbound
I recently built a Studio inbound integration; we get the details from a vendor.
The requirement is to load training details into Workday; as of now, the relevant module is
not enabled in the tenant.
After a training is completed, the vendor keeps all the worker details in a CSV file in an
external SFTP location.
My work is to retrieve the data and load it into Workday in XML format.
First of all, I retrieve the file from the SFTP location using the retrieval service.
If da.size > 0, the process starts, because there are files to process.
If da.size = 0, there are no files, so we can skip the process (there is no need to continue).
Once the first document passes through the router component, it goes to a synchronous
mediation component.
Then I evaluate the condition of whether the CSV file has all the required data or not.
I validate it; if any field is missing, I send an error message.
If everything goes well, we use the "CSV to XML" component, which converts the file into XML.
After that I used a splitter, an XPath splitter, because there is no way to load all the data
at one time.
After that I prepared the SOAP content, initially testing it in the web service tester.
Then, to load any data into Workday, we need a "Workday Out SOAP" component.
If any errors occur at load time, we use a "Send Error" component.
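The empty-file check and required-field validation above can be sketched like this (a plain Python stand-in for the router's MVEL condition and the validation step; the required column names are illustrative):

```python
import csv
import io

# Hypothetical required columns for the training file
REQUIRED = {"Employee ID", "Course", "Completion Date"}

def process(file_text: str) -> str:
    # Router condition: skip the run when there is nothing to process
    # (the Studio equivalent of checking da.size = 0)
    if len(file_text) == 0:
        return "skipped: no file content"
    reader = csv.DictReader(io.StringIO(file_text))
    # Validation: every required column must be present in the header
    missing = REQUIRED - set(reader.fieldnames or [])
    if missing:
        return "error: missing fields " + ", ".join(sorted(missing))
    return "ok: %d rows" % sum(1 for _ in reader)

print(process(""))
print(process("Employee ID,Course,Completion Date\n1001,Safety,2024-05-01\n"))
```

In Studio the "error" branch would raise the Send Error component instead of returning a string; the point is that routing happens before any load is attempted.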
3. Inbound
Recently I built a Studio inbound integration. The requirement: worker seniority dates are
calculated within the Xerox system, so we get the worker seniority data file from the Xerox
system through SFTP and load the data into Workday, worker by worker.
First of all, we get the data file from the retrieval service through the "Workday In"
retrieval component.
We want to check whether we have one file or multiple files in the SFTP location.
So I used a router component with the MVEL strategy and wrote the condition there.
After that I used an "Async Mediation" component with an Evaluate Expression sub-component:
props('Empty file') = 0; if the file is empty, we stop the process right there; there is no
need to continue.
I used an Async Mediation component with the "CSV to XML" sub-component, which converts the
file into XML.
After that I used a splitter component, an XML splitter, because there is no way to load all
the data at one time; it splits all the rows, row by row, on the root/row combination.
After that I prepared a SOAP message, initially testing it in the web service tester, and
saved it.
Then, to load any data into Workday, we need a "Workday Out SOAP" component.
If any errors occur at that time, we use a "Send Error" component.
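The root/row splitting can be sketched in Python (an illustrative stand-in for the Studio XML splitter; each resulting fragment would then be sent to Workday as its own SOAP request):

```python
import xml.etree.ElementTree as ET

# A small sample document in the root/row shape produced by CSV-to-XML
doc = """<root>
  <row><empID>1001</empID><seniorityDate>2019-03-01</seniorityDate></row>
  <row><empID>1002</empID><seniorityDate>2021-07-15</seniorityDate></row>
</root>"""

# Split on the root/row combination: one fragment per worker,
# so each worker's data can be loaded in a separate request
fragments = [ET.tostring(row, encoding="unicode")
             for row in ET.fromstring(doc).findall("row")]
for frag in fragments:
    print(frag.strip())
```

Splitting this way is what makes the row-by-row loading (and row-level success/error capture) possible.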
Splitter
1. XPath splitter
2. XML splitter
3. XTream splitter
Router
All strategy: we use it to call multiple sub-routes for multiple reports.
MVEL strategy: we use it when we want to write conditions comparing the input data fields.
Deciliter strategy.
Router: we use it when we want to send the message different ways depending on conditions,
e.g. when one consumer asks for a text file.
Aggregator
The aggregator aggregates success log and error log information into a single file.
It captures and aggregates all the success messages; batch-based, it gives all the values
per batch.
Time batch: the time-batch strategy captures all the information within a given time window.
Content Collateral:
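The success/error aggregation can be sketched like this (a plain Python stand-in; in Studio the aggregator component is configured rather than coded, and the per-row results below are made up):

```python
# Per-row load results, as the Workday-out step might report them
results = [
    ("1001", "success"),
    ("1002", "error: missing hire date"),
    ("1003", "success"),
]

# Aggregate success and error messages into one report body
success_log = [emp for emp, status in results if status == "success"]
error_log = [emp + " -> " + status for emp, status in results if status != "success"]

report = "SUCCESS (%d): %s\nERRORS (%d): %s" % (
    len(success_log), ", ".join(success_log),
    len(error_log), "; ".join(error_log))
print(report)
```

A single aggregated report like this is what gets handed back to the support team instead of one message per row.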
Error Handler
A local error handler works at the component level: if any error occurs there, it puts out an
integration message (local-out component).
If any unexpected errors occur at the entire-integration level, we use the global error
handler.
#Map
I know about #Map. Evaluate Expression stores all the values in a #Map, and we check
conditions against it; for example, if an employee ID already exists, we execute certain
logic. In that case we use a #Map.
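The #Map lookup can be sketched with an ordinary dictionary (a Python stand-in; the IDs and the two branches are illustrative of the "does this employee already exist?" check):

```python
# #Map-style store keyed by employee ID (illustrative values)
emp_map = {"1001": "Ana", "1002": "Raj"}

def route(employee_id: str) -> str:
    # If the employee ID already exists in the map, run the update
    # logic; otherwise run the create logic
    if employee_id in emp_map:
        return "update " + employee_id
    return "create " + employee_id

print(route("1001"))
print(route("1003"))
```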
I-Land
o If you need to load a high volume of data from an external system into Workday.
It is also used for calling web services and checking document deliverables; it has a lot of
options.
Ex: we want to execute a file operation or calculation every 25 rows.
If you want to pull a report, you can use the Workday Out REST component.
While loading one lakh fields, you get an error after 50,000 fields; what do you do?
We can attach a Send Error component and capture all the error rows, routing them through to
another component.
Evaluate Expression
Syntax
Ex: props('Employee ID') = parts[0].xpath('row/empID')
Router
MVEL: for comparing the input data fields with the custom report data fields
Asynchronous mediation
Synchronous mediation: it is two-way (request/response)
Variable:
Encrypt: only those who have the proper security account can get the file; that is what
encryption ensures.
We can get the data through a custom report exposed as a web service; it works like a data
source.