Bidyut Profile
OBJECTIVE:
I am highly motivated, self-directed and proficient in learning new technologies. I strongly
believe I can prove myself and be a great asset to the organization I work for. I
maintain strong professionalism and deliver my assignments on schedule without fail. I am
therefore looking for a challenging position in ETL architecture and process automation, with scope
for innovation, that will best utilize my expertise and experience and offer good career growth.
PROFILE:
9+ years of experience in data warehousing, ETL/ELT and SQL/PL/SQL/T-SQL development.
Extensive experience in the SDLC and DDLC processes and a strong understanding of OLTP and OLAP.
Expertise in analysis, design, development, implementation and troubleshooting of Data warehouse
applications.
Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts,
and Decision Support Systems (DSS) using Multidimensional and Dimensional modeling (Star and
Snowflake schema) Concepts.
Extensive data warehouse experience using Informatica PowerCenter 9.6/8.6 and SSIS/SSAS 2012/2016
(MSBI) as ETL tools against flat files, Oracle and Teradata. Good working knowledge of
SQL Server, Sybase and Teradata databases and tools.
Thorough knowledge of the data mart development life cycle. Performed all phases of
development, including Extraction, Transformation and Loading (ETL) of data from various sources into
data warehouses and data marts using PowerCenter (Repository Manager, Designer, Workflow
Manager, Workflow Monitor).
Extensive experience in developing mappings using transformations such as Source Qualifier,
Expression, Lookup, Aggregator, Router, Rank, Filter and Sequence Generator, as well as
reusable components such as mapplets and reusable transformations.
Extensive experience in developing reusable transformations, mappings and mapplets.
Experience in mapping optimization and performance tuning of ETL processes.
Knowledge of data warehousing techniques, Star / Snowflake schema, SCD types, data quality, ETL,
OLAP and Report delivery methodologies.
Exposure to the BFS, Aviation, Telecommunication and Retail business sectors with regard to
data warehouse development.
Integrated heterogeneous Data sources using ETL.
Developed transformation logic to cleanse the source data of inconsistencies during the source to
stage loading.
Extensive experience with relational database systems such as Oracle, Teradata, SQL Server and
PostgreSQL. Design and database development experience with SQL, PL/SQL blocks, T-SQL, stored
procedures, functions, triggers, cursors/ref cursors, collections, etc.
Expertise in Relational Modeling & Dimensional Modeling.
Troubleshoot production issues while transferring data from sources to target.
Extensively used database tools such as SQL*Loader, TOAD and SQL*Plus for developing and debugging SQL
and PL/SQL programs, and migrated data from Excel and flat files to Oracle using export/import utilities.
Proficient in the UNIX environment, developing shell scripts and using UNIX command-line utilities like
sed and AWK.
Experience in scheduling tools like Control-M and Cron.
Member of IEEE Communication Society and GNU India.
Understanding of CMMI maturity levels.
Experienced in Scaled Agile Framework.
I always strive to be an excellent team player and a quick learner.
EXPERIENCE:
• Working as a Software Lead in CoralHire Technology Pvt. Ltd. with client JDA Software, USA, from
May 2018 to date.
• Worked as a Senior DWH Consultant in Nexgen Technology Pvt. Ltd. with client BCBG Group, USA,
from January 2017 to May 2018.
• Worked as a Senior Analyst in Wissen Infotech India Pvt. Ltd. with client CA Technologies, USA, from
August 2016 to January 2017.
• Worked as a Senior Specialist in Mobily Infotech India Pvt. Ltd. with client Mobily, SA, from February
2014 to July 2016.
• Worked as a Senior Software Engineer in Mindtree Technologies Pvt. Ltd. with client SITA, USA, from
October 2011 to October 2013.
• Worked as a Senior Team Member in JPMorgan Chase India Pvt. Ltd. from April 2010 to July 2011.
CERTIFICATIONS:
Post Graduate Diploma in Computer Applications from EIIT in 2003, under the Ministry of H.R.D., Govt.
of India.
Hardware and Networking from EIIT in 2004, under the Ministry of H.R.D., Govt. of India.
ACHIEVEMENT:
Awarded by NIIT for 6th National I.T Aptitude Test
Awarded by NIIT for 7th National I.T Aptitude Test
EDUCATION:
• B.Tech (INC) from NMIET in 2009, BPUT, Orissa
• 12th from N.C College, CHSE, Orissa.
TECHNICAL SKILLS:
Languages : SQL, PL/SQL, Pro*C.
Scripting : Shell script, Python Script.
Database : Oracle 11g/10g/9i, Teradata, SQL Server 2008/2012/2016, Amazon Redshift.
Db Tools : SQL*Plus, SQL*Loader, SQL*Line, PL/SQL Developer, TOAD 9.6, BTEQ,
FastLoad, MultiLoad, FastExport, TPump, TPT, SQL Server Management Studio 2008.
ETL Tools : Informatica 9x/8x, SSDT (SSIS/SSAS).
Modeling Tools : Erwin Data Modeler.
Modeling Concept : ER Modeling (Relational Modeling), Dimensional Modeling.
OS : UNIX, Linux, AIX, Solaris, Mac OS X 10.7.5, Windows.
Version Control : SCME, CVS (both UNIX and Windows), SVN (both UNIX and Windows).
Tracking Tools : HP QC, BMC Remedy, Trillium, EURC and IBM RTC.
Other Tools : Core FTP, WinSCP, FileZilla, Notepad++, TextPad, Beyond Compare, PuTTY,
TeamViewer.
PROJECTS:
The Gloria Jeans in-house Allocation interface was designed by pulling data from the
assortment data warehouse. This data is used for allocation planning, inventory
management and reporting, helping the organization grow its revenue.
Contribution:
Design, Development & Deployment.
The Shoppers-Stop in-house data warehouse systems are the main source of the historical data
used to populate the EKB databases. Structured data must be loaded using JDA Integrator. Data
for base measures is imported into EKB at the lowest product, location and time level required for
each base measure: StyleColor x Store x Week for Shoppers-Stop. This data is used for
retail planning, inventory management and reporting, helping the organization grow its
revenue.
Contribution:
Design, Development & Deployment.
3. Project Name : TCP Global MFP (Retail)
Client : TCP, USA
Duration : May 2018 - August 2018
Language/Tools : Batch Script, Shell Script, SQL, PL/SQL, Toad, DataStage
Platform : Windows, UNIX
Database : Oracle 12c
The Children’s Place in-house data warehouse systems are the main source of the historical data
used to populate the EKB databases. Structured data must be loaded using JDA Integrator. Data
for base measures is imported into EKB at the lowest product, location and time level required for
each base measure: StyleColor x Store x Week for The Children’s Place.
Contribution:
Design, Development & Deployment.
BCBG Group is a global fashion house with a portfolio of more than 20 brands and a multi-billion-dollar
global retail business with warehouses worldwide. In this project, a huge data warehouse is
maintained that enables management and the sales team to analyze the business over a period of
time. Budgeting and forecasting decisions are based on the reports produced using this data
warehouse. The project deals with building a real-time view of enterprise-wide data. A decision
support system is built to compare and analyze product prices, their quantities and customer
profiles. This data warehouse is used to deliver reports and information to sales and marketing
management. Each feed has its own folders: the source file is placed in the IN folder and, after the load
to integration completes, it is moved to ARCHIVE. All file-watcher steps are configured to
generate an alert if they run for more than 5 minutes, and all the ETL jobs are configured in WCC as single
or box jobs.
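As a rough illustration only, the IN/ARCHIVE feed flow with its 5-minute watcher alert could be sketched as below; the folder layout, file name and `load_to_integration` function are illustrative assumptions, not the actual WCC job definitions.

```python
import shutil
import time
from pathlib import Path

ALERT_AFTER_SECONDS = 5 * 60  # the 5-minute alert threshold from the job design


def load_to_integration(path: Path) -> None:
    # Stand-in for the real load into the integration layer.
    print(f"loading {path.name} into the integration layer")


def watch_and_load(feed_dir: str, filename: str, poll_seconds: int = 1) -> bool:
    """Wait for a source file in the feed's IN folder, load it, then archive it."""
    in_file = Path(feed_dir) / "IN" / filename
    archive_dir = Path(feed_dir) / "ARCHIVE"
    started = time.monotonic()

    # File-watcher step: poll for the file, alerting after 5 minutes.
    while not in_file.exists():
        if time.monotonic() - started > ALERT_AFTER_SECONDS:
            print(f"ALERT: {filename} not received within 5 minutes")
            return False
        time.sleep(poll_seconds)

    load_to_integration(in_file)
    archive_dir.mkdir(exist_ok=True)
    shutil.move(str(in_file), str(archive_dir / filename))  # move to ARCHIVE after load
    return True
```

In production this logic lived in the scheduler's file-watcher and box-job configuration rather than in a script.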
Contribution:
Design, Development & Deployment.
Mentoring a team.
A decision support system is built to compare and analyze product prices, sales and inventory with
quantities and customer profiles. This data warehouse is used to deliver reports and information to
sales and marketing management. Each feed has its own folders: the source file is placed in the
IN folder and, after the load to integration completes, it is moved to ARCHIVE. All file-watcher steps
are configured to generate an alert if they run for more than 5 minutes, and all the ETL jobs are configured
in WCC as single or box jobs.
Contribution:
Developed mappings using SSIS to extract data from the flat files into the SQL Server staging area.
Developed transformation logic to cleanse the source data of inconsistencies during the source-to-stage
loading.
Developed reusable transformations and mappings conforming to the business rules.
Implemented SSAS components (cubes and dimensions) and wrote the ETL to load the SSAS components.
Mentoring a team.
The app empowers store managers and employees to create and manage effective, optimized
and employee-friendly schedules. Combining the data from the traffic and plan forecast views, the
store manager can schedule employees with a high performance history to cover those
shifts and gain maximum returns from the incoming traffic. IX Analytics is used for reporting sales,
budget, traffic, store performance and employee performance. As part of reporting, Wi-Fi analytics
and video analytics are done for individual stores.
Contribution:
Designed both the OLTP and OLAP system for the applications and reporting.
Developed transformation logic using Python scripts to load sales and traffic data into AWS Redshift through
S3 buckets. Implemented Python scripts to load the Employee_Roaster and Task_Hour files into PostgreSQL
for the Schedule application.
Developed procedures, views and triggers in PostgreSQL to incorporate US HR Meal_Rest rules and break
criteria. AutoScheduler is the higher version of the manual labour schedule, where the schedule is
created by the system in an optimized way once the Employee_Roaster and Task_Hour files are loaded into
the system.
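A minimal sketch of the S3-to-Redshift staged-load pattern mentioned above; the table, bucket and IAM role values are illustrative assumptions, and the actual project scripts are not reproduced here.

```python
def build_copy_statement(table: str, bucket: str, key: str, iam_role: str) -> str:
    """Build the Redshift COPY command run after a file lands in S3.

    All parameter values passed to this function are illustrative assumptions.
    """
    return (
        f"COPY {table} "
        f"FROM 's3://{bucket}/{key}' "
        f"IAM_ROLE '{iam_role}' "
        "FORMAT AS CSV IGNOREHEADER 1;"
    )

# Typical flow (sketch only): upload with boto3, then run the COPY via a DB driver.
# import boto3, psycopg2
# boto3.client("s3").upload_file("sales.csv", "etl-stage-bucket", "incoming/sales.csv")
# with psycopg2.connect(...) as conn:
#     conn.cursor().execute(build_copy_statement(
#         "stage.sales", "etl-stage-bucket", "incoming/sales.csv",
#         "arn:aws:iam::123456789012:role/redshift-copy"))
```

Loading through a COPY from S3, rather than row-by-row inserts, is the standard high-throughput path into Redshift.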
Contribution:
Designed both the OLTP and OLAP system for the applications and reporting.
Developed procedures, views and triggers in PostgreSQL and used AWS DynamoDB to maintain the
application-level users.
CA Technologies (USA) is one of the largest independent software corporations in the world. EIW, the
Enterprise Information Warehouse, is the center of all business data; contract details and user data
feed all reporting in SAP BO, QlikView and Power BI. EIW uses Employee/HR, SFDC Account/Product,
MDM DUNS/SITES, TOPS Contract/AR, SAP BW Contract/AR, PH and GSS as its feeds. Each feed has its
own folders: the source file is placed in the IN folder and, after the load to integration completes, it is
moved to ARCHIVE. All file-watcher steps are configured to generate an alert if they run for more than 5
minutes, and all the ETL jobs are configured in WCC as single or box jobs.
Contribution:
Design, Development & Deployment.
Mentoring a team.
Raqi III is the new postpaid package designed for VIP business customers as a replacement for the
Raqi II package. Raqi III is a postpaid voice package for IUC and CUC customers.
This project creates reports for Activations, Deactivations and Services in Operation for bundle
subscriptions. A new fact was created to store the order-related data. A new ELT pattern was developed
and used to process the data. It also maintains the loyalty program for valuable customers as per the
subscription bundle. For this, the data transformation used the BSS, CRM, MGATE and other legacy
systems such as Core Provisioning (SF), MEDIATION, BILLING and LOYALTY.
Contribution:
Design, Development & Deployment.
For all reporting and analytics, in-house services consolidate the data into a single portal. The aim of the
project was to bring the data from 3 different platforms and load it into a common data warehouse to
generate reporting on different kinds of mobile offers (voice, data).
Contribution:
Created various ETLs for loading data to various data marts and then loading them to our EDW.
Extensively used SSIS and C#.
Used third-party tools like the ETL Framework DB for ETL auditing and monitoring purposes.
Implemented SCD 1 and SCD 2.
Implemented proper indexing and database partitioning for performance.
SSIS performance tuning.
Dimensional modeling.
Created ETLs for loading data to cubes.
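The SCD 1 versus SCD 2 handling listed above can be illustrated with a small in-memory sketch; the real implementation ran inside SSIS/SQL Server, and the field names below are assumptions for illustration only.

```python
from datetime import date


def apply_scd1(dimension, key, new_attrs):
    """SCD Type 1: overwrite the current row's attributes; no history is kept."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            row.update(new_attrs)


def apply_scd2(dimension, key, new_attrs, effective=None):
    """SCD Type 2: expire the current row and insert a new current version."""
    effective = effective or date.today()
    for row in dimension:
        if row["key"] == key and row["current"]:
            row["current"] = False          # close out the old version
            row["end_date"] = effective
            dimension.append({**row, **new_attrs,
                              "current": True,
                              "start_date": effective,
                              "end_date": None})
            return
```

The trade-off is the usual one: Type 1 keeps the dimension small but loses history, while Type 2 preserves every version at the cost of extra rows and an effective-date filter in every join.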
Contribution:
Design, Development & Deployment.
Data Sync: for all reporting and analytics services, Nuance consolidates the data into a single portal. The aim
of the project was to bring the data from 3 different platforms (GEMSTAR, BeyondTXT and E5) and load
it into a common data warehouse. I joined the project in the development phase.
Contribution:
Created various ETLs for loading data to various data marts and then loading them to our EDW.
Extensively used SSIS and C#.
Used third-party tools like the ETL Framework DB for ETL auditing and monitoring purposes.
Implemented SCD 1 and SCD 2.
Implemented proper indexing and database partitioning for performance.
SSIS performance tuning.
Dimensional modeling.
Created ETLs for loading data to cubes.
HKG is one of the busiest connecting airports. Under SITA, the system provides a real-time air ticketing
solution both within an airport and across multiple airports, ensuring that the right passenger travels on
the right flight when deployed system-wide by an airline. It provides a track-and-trace system for
passenger details at single and multi-airport locations, and has been developed to reconcile passengers
as they move through and between airports. The system provides positive passenger-bag matching,
ensuring that passengers and their bags travel together on the same aircraft. BagManager sends and
receives messages in RP1745 formats, including BSM (Baggage Source Message), BUM (Baggage Unload
Message), BPM (Baggage Processed Message), etc. These messages can be sent to BagManager from an
external system such as a DCS (Departure Control System) or another BagManager deployed at another airport.
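As a rough, simplified illustration of the message handling described above; the element subset and field layout below are simplifying assumptions for illustration, not the full RP1745 specification.

```python
def parse_bsm(message: str) -> dict:
    """Parse a few elements of a simplified BSM-style baggage message.

    Real RP1745 messages carry many more elements and stricter formatting;
    the element codes and layout used here are simplifying assumptions.
    """
    parsed = {}
    for line in message.strip().splitlines():
        line = line.strip()
        if line == "BSM":
            parsed["type"] = "BSM"          # message-type header
        elif line.startswith(".F/"):
            parsed["flight"] = line[3:]     # outbound flight element
        elif line.startswith(".N/"):
            parsed["bag_tag"] = line[3:]    # bag tag licence-plate number
        elif line.startswith(".P/"):
            parsed["passenger"] = line[3:]  # passenger name element
    return parsed
```

Parsing each element into a keyed record is what lets the reconciliation system match a bag tag to its passenger and flight.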
Contribution:
Design, Development & Deployment.
IMSD CNN Money relates to different LOBs to control various access-level issues for JPMorgan
employees as well as customers. The associate performs diverse tasks related to security
administration and database access through multiple technology applications for internal JPMC
employees: monitoring processes, running reports, stored procedures, etc.
Contribution:
Design, Development & Deployment.
CSW acts as a central system for all kinds of cash, wire or FX transactions. It acts as a vehicle for processing
all instructions from Investor Services custody clients related to wire or cash transfers. It integrates built-in
intelligence to generate primary as well as secondary transactions for each wire request it receives. CSW
makes it easy for business users to put in requests for cash movement, reformatting and routing
them to the appropriate core systems for further processing. It creates standard instructions for WSS blotter
clients as per pre-defined templates, manually and electronically. These standard instructions are further
passed on as transaction records to the TITAN 3/4 system and are included in client reports.
Contribution:
Designed and developed database packages, procedures, functions and triggers on Oracle 9i/10g
databases.
Made enhancements and applied defect fixes.
Created technical design documents for major enhancements and attended technical design
reviews. Created summary designs to address fix/enhancement code changes supporting the story's
acceptance criteria.
Coded and tested software that meets the design specifications.
Created issue resolution documents and unit test plans for minor enhancements.
Personal Profile:
Date of birth : 9th May 1983
Marital Status : Married
Nationality : Indian
PAN Card number : ATEPD9095M
Passport number : H7635845(Valid till 16th February 2020)
Declaration:
I hereby declare that the above-mentioned information is correct to the best of my knowledge, and I bear
responsibility for its correctness.
Date :
Place : Bangalore BIDYUT KUMAR DAS