Prashant Sharma
Informatica/ETL Developer
SUMMARY OF QUALIFICATIONS
- 6+ years of professional experience in ETL design, development, implementation and quality assurance projects for clients across various domains.
- Experienced in performance tuning and process optimization for ETL development projects.
- Technical expertise in ETL development using Informatica PowerCenter 8.x and 9.x.
- Well versed in Oracle PL/SQL and UNIX shell scripting.
- Extensive experience with the Teradata v12 database and hands-on experience with v13.
- Strong knowledge of data warehousing, OLTP and OLAP concepts.
- Strong understanding of dimensional modeling (star and snowflake schemas).
- Sound knowledge of SQL queries and database concepts.
- Worked as a data warehousing consultant with teams providing ETL solutions.
- Hands-on ETL QA using Business Objects Data Services v3.1 and 4.0.
- Involved in regular client interaction for information gathering and knowledge transfer.
- An effective communicator with strong interpersonal, analytical and problem-solving skills.
Technical Skills:
- ETL development
- Data modeling, data marts and star schema
- PL/SQL, Transact-SQL
- ETL performance tuning and optimization
- Stored procedures/triggers/packages
- UNIX shell scripting

Tools Used:
- Informatica 8.x and 9.x
- Teradata v12
- Oracle 10g, DB2, SQL Server
- HPQC defect management
- SAS BI Suite
- BODS 3 and 4
- DataStage EE
- VB Scripting
PROFESSIONAL EXPERIENCE
Loblaw Inc. through Teradata, Mississauga, ON, September 2012 till date
Informatica/ETL Developer

The Loblaw SAP program uses a mixture of legacy, ABAP and Back Office Associates tools for executing the data conversion objects. The objects are reviewed by the respective teams for complexity, volume, etc., and each object has the most appropriate team assigned to execute the extraction, the transformation and the load. The functional requirements for data load are captured in functional design documents by the project functional teams, with one functional design for each data object identified in the planning and analysis phase. The functional designs identify the data conversion requirements for the extraction and collection of data, the data cleansing that needs to occur in legacy systems, the transformation or mapping of each data field to the SAP format, and the load requirements. Validation test scenarios are also provided in the functional design.

Responsibilities:
- Identifying and extracting data from varied sources, applying business rules and loading them to target data marts using the Informatica 9.0.1 ETL solution environment.
- Active participation in weekly calls with data modeling and analyst teams to understand and work on new requirements.
- Development of mapplets and shortcuts as per ETL standard practice requirements.
- Delivering ETL code for completed solution artifacts as per SLA.
- Implementing business solutions as per requirement specifications.
- Unit testing and documenting ETL mappings developed for a particular interface.
- Creation of a test plan document per interface for the purpose of internal QA.
- Creation of ETL test scenarios/test cases and updating of the test plan for unit testing.
- Coordinating and ensuring agreement on Data Cleanse & Migration functional specifications across the Lines of Business.
- Providing support to the QA team for various testing phases of ETL development.
- Solving issues and incorporating fixes into existing mapping code.
- Establishing, in consultation with the business, the overall data cleanse requirements and coordinating ongoing cleanse activities in both the application (manual cleansing) and technical (automated cleansing) streams of the conversion project.
- ETL code review as per standards set by the module.
- Maintaining a log of defects and the subsequent resolutions applied.
Solution Environment: Informatica PowerCenter 9.0.1 (Repository Manager, Mapping Designer, Workflow Designer and Monitor), Teradata v13, IBM RTC, SharePoint.
Tools and Languages: HPQC, PL/SQL, PuTTY, ExamDiff

Canadian Tire Corporation through Capgemini Consulting, Toronto, May 2012 till Sep 2012
Informatica/ETL Developer

Loading the core business data that matters into the EDW will increase BI user adoption, decrease IW usage and provide a consolidated view of the reference data across the organization. As a result, existing and new core business data will have been migrated and loaded into the EDW. The project covered identification, prioritization and delivery (to the Enterprise Data Warehouse) of shared data subject areas required to support the reporting and analytical needs of multiple projects within the 2012 project agenda, with a focus on delivery of data required by dependent projects and data currently residing in the IW, ODS or old EDW.

Responsibilities:
- Implementing business logic for source-to-target data loading.
- Implementing data profiling and quality activities.
- Developing ETL routines for extracting data from multiple sources, cleansing it and loading it to targets.
- Developing ETL jobs using the Informatica PowerCenter 8.6 Mapping Designer and Workflow Designer.
- Implementing business solutions as per requirement specifications.
- Understanding the various layers of the data transformation and loading process involved.
- Unit testing and documenting ETL mappings developed for a particular interface.
- Delivering ETL code for completed solution artifacts as per SLA.
- Management of defects (logging defects, reviewing defects, planning defect disposition or retesting, communication and reporting).
- Establishing, in consultation with the business, the overall data cleanse requirements and coordinating ongoing cleanse activities in both the application (manual cleansing) and technical (automated cleansing) streams of the conversion project.
- Collection of test result artifacts (test cases, test data, screenshots, defects and issues log) for the project repository.

Solution Environment: Informatica PowerCenter 8.6, Oracle 9i, DB2.
Tools and Languages: HPQC, PL/SQL, PuTTY, ExamDiff
Child Maintenance & Enforcement Commission, UK, August 2011 till May 2012
ETL Developer

The client is a new executive agency established by the UK government that maximizes the benefits and effective maintenance arrangements for children who live apart from one or both of their parents. The project is in phase 2 and involved extensive development and enhancement of three types of initial-load mappings.

Responsibilities:
- Working with business users to identify and clarify ETL requirements and business rules.
- Designing, developing, implementing and validating ETL processes.
- Creating and executing unit test plans based on system and validation requirements.
- Troubleshooting, optimizing and tuning ETL processes.
- Documenting all ETL-related work per the company's methodology, including specifications for new/existing processes and operational turnover/support.
- Maintaining existing code and fixing bugs.
- Ensuring that all timelines for loading/validating data are met.
- Ensuring smooth functioning of the development, staging and production environments.
- Delivering ETL code for completed solution artifacts as per SLA.
- Demonstrating strong analytical skills to implement solutions for business requirements and problem resolution.
- Analysis, development, testing, implementation and maintenance of an in-house developed data warehouse.

Solution Environment: Oracle 9i Database, Informatica PowerCenter 9.0.1
Tools and Languages: TOAD, PL/SQL Developer, OBIEE

Aviva General Insurance, UK, October 2010 till July 2011
ETL Developer

This is a marketing insights project that requires generating analytical reports using SAS EG and Teradata. Aviva migrated its data warehouse to Teradata for its seamless capability to produce business intelligence solutions. Ours was a new team of seven members, sent to Norwich, UK for three weeks of process training at the client's site.
Responsibilities involved generating reports for the Capability team, which is designated to run campaigns for Aviva's various new offers and products. These campaigns were usually targeted at a group or class of the population, so our reports helped the Capability team understand how a particular group responds or might respond to a particular campaign.

Responsibilities:
- Working closely with end users to understand data requirements.
- Implementing business rules as per the BRD and SRS documents.
- Identifying source systems for data extraction, cleansing and scrubbing.
- Delivering ETL code for completed solution artifacts as per SLA.
- Developing new mappings as per requirements using Informatica to load subsequent data marts.
- Writing/tuning complex queries used in the ETL process.
- Developing automated data loading scripts using Teradata BTEQ, MultiLoad and FastLoad.
- Writing procedures and creating views to automate certain modules.
- Unit testing and peer review of developed ETL jobs.
- Scheduling ETL jobs per subject area in the dev environment.
- ETL code reviews as per the standards set by the module.
Solution Environment: Informatica 8.6.1, Teradata SQL Assistant, Toad, UNIX
Tools and Languages: T-SQL, PL/SQL, shell scripting, Visual Basic scripts

CISCO, USA (EDW2B ETL Migration), July 2007 till September 2010
Informatica/ETL Developer

The first Next Generation Enterprise Data Warehouse release established a complete infrastructure and foundation for Cisco's data warehouse platform on Teradata. Cisco's existing data warehouse was a 7-terabyte Oracle database storing the historical and operational data for all measures, and Cisco decided to change the database from Oracle to Teradata. The project also implemented advanced features of Informatica and Teradata (such as pushdown optimization, FastLoad, TPump and other Teradata utilities).

Responsibilities:
- Implementing business logic for source-to-target data loading.
- Implementing business solutions as per requirement specifications.
- Understanding the various layers of the data transformation and loading process involved.
- Unit testing and documenting ETL mappings developed for a particular interface.
- Creation of a test plan document per interface for the purpose of internal QA.
- Creation of test scenarios/test cases and updating of the test plan schedule.
- Gathering requirements from the client/onsite counterpart and assigning work to the offshore team.
- Developing ETL using Teradata utilities (BTEQ, FastLoad, MultiLoad, TPump) and Informatica mappings for loading data into the target system.
- Optimization of Teradata and ETL code.
- Developing design documents, deployment plans, ETL specifications and business requirement documents.
- Preparing unit test cases for data validation.
- Working with the scheduling tool ($Universe) to schedule ETL jobs.
- Preparing code migration templates for test/production migration using Kintana.
- Reviewing the work done.
- Maintaining versioning of the code under PVCS (version control system).
- Delivering ETL code for completed solution artifacts as per SLA.
Solution Environment: Informatica PowerCenter 7.x and 8.x, Teradata SQL Assistant, Toad, UNIX
Tools and Languages: T-SQL, PL/SQL, shell scripting, PVCS
EDUCATION