

KASHISH SURI

Abu Dhabi, UAE


+971-504086797
[email protected]

EDUCATION

2010 - 2014 MAHARAJA AGRASEN INSTITUTE OF TECHNOLOGY, GGSIPU Delhi


Bachelor of Technology (Electrical and Electronics),
Scored 77.30%

2010 ST. COLUMBA’S SCHOOL New Delhi


CBSE, AISSCE – Class XII, 91.2% (Aggregate)

2008 ST. COLUMBA’S SCHOOL New Delhi


CBSE, AISSCE – Class X, 87.2% (Aggregate)

WORK EXPERIENCE

Etihad Aviation Group, UAE (Senior Data Engineer) – 1.9 Years (AIRLINES DOMAIN)

Techserve Oilfield Equipment LLC, UAE (Data Engineer) – 1.7 Years (OIL & GAS DOMAIN)

Infosys Limited – Senior Systems Engineer (BI Consultant) – 3.2 Years (BFSI DOMAIN)

PROJECT DESCRIPTION

ETIHAD AIRWAYS (Senior Data Engineer) Dec’18-Present

Etihad is the national airline of the UAE and a semi-government company headquartered in Abu Dhabi. At Etihad, I work
as part of the production support team, ensuring that BAU activities run successfully and populate correct data. There is
currently an initiative to complete all projects in-house instead of outsourcing them to vendors. As part of the team, I am
involved in preparing RFPs, discussing requirements with business users, getting projects kick-started, and working with
Big Data technologies to migrate data from legacy systems (SQL Server & Oracle) to the data lake through interfaces,
ingesting XML data and providing production support after go-live.

ROLES AND RESPONSIBILITIES

• Analysing large data sets; assimilating data from various sources; data mining, data cleansing, data extraction and
data visualization; and application support using data analytics tools.
• Knowledge of Aviation domain BI technologies with good working knowledge of flight operation data.
• Creating Business Review documents & ICD for freezing the scope and finalizing the requirements.
• Developing and optimizing data science models and deploying them to the production Dataiku Automation node.
• Identifying the correct business points of contact for requirement gathering and system study.
• Experienced in the Hadoop ecosystem, Google Analytics, cloud technologies (AWS, Azure, Google Cloud), in-memory
database systems (Impala, MPP) and other database systems – traditional RDBMS (Teradata, SQL Server,
Oracle) and NoSQL databases (Cassandra).
• Assisted with project planning, preparing RFPs, data governance, team coordination, working with the
business to understand requirements and finalizing the scope with clear deadlines for each deliverable.
• Experience working with Big Data technologies: creating data lakes and working on Spark SQL, Kafka, Pig, Sqoop
and Flume; creating Kafka topics for streaming data, performing transformations using Hive and Scala, and ingesting
XML data (see the sketch after this list).
• Building, operating, monitoring and troubleshooting Hadoop infrastructure; transferring XML files using IBM MQ.
• Developed workflows in Zaloni Bedrock (ETL) for XML and JSON data ingestion, data cleansing, data integration and
data transformation based on the XSD (IATA standard); scheduled the workflows using Oozie.
• Developed and used shell scripts (Unix), Java and Python for file manipulation, data loading and transformation.
• Working with version management tools such as GitLab for maintaining different versions of the code.
• Developed shell scripts (Unix), Macros (VBA), Python and Java to automate file manipulation and data loading.
• Utilised Power BI (Power View) and Tableau to create analytical dashboards depicting critical KPIs such as
block hours, on-time performance and network operations, with dynamic views and slicers for slicing and dicing.
• Following an Agile approach so that deliverables are completed within SLA; using Jira to lock delivery dates.
• Preparing presentations using an analysis framework to communicate insights to stakeholders.
• Coordinating with vendors to grant the development team the required access; creating CRs for
any new requirements added to an already locked RFQ.
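
Purely as an illustration of the Kafka/XML ingestion mentioned above, a minimal Spark Structured Streaming sketch in Scala; the broker, topic and path names are hypothetical, and XSD-based parsing would happen in a downstream step:

    // Illustrative only: consume XML messages from a Kafka topic and land them in the
    // raw zone of the lake; a separate job parses them against the IATA XSD.
    // Requires the spark-sql-kafka connector on the classpath; all names are placeholders.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object XmlKafkaIngest {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("xml-kafka-ingest")
          .getOrCreate()

        val raw = spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092") // placeholder
          .option("subscribe", "ops.flight.xml")             // placeholder topic
          .option("startingOffsets", "latest")
          .load()

        // Kafka delivers the payload as bytes; keep the XML as a string column.
        val xml = raw.select(
          col("value").cast("string").as("xml_payload"),
          col("timestamp"))

        // Write micro-batches to the raw zone with checkpointing for reliable file output.
        xml.writeStream
          .format("parquet")
          .option("path", "/data/lake/raw/flight_xml")              // placeholder
          .option("checkpointLocation", "/data/lake/chk/flight_xml")
          .start()
          .awaitTermination()
      }
    }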

TECHSERVE OILFIELD EQUIPMENT LLC (Data Engineer) June’17-Jan’19

TECHSERVE OILFIELD EQUIPMENT LLC is one of the fastest-growing engineering services and supplies organisations in
the UAE and the wider Middle East, dealing with the world’s best-known brands and products for various industrial
fields such as oil, petrochemicals, refineries, fertilizers, steel plants, shipbuilding, construction, hotels,
manufacturing, marine and other general industrial applications.

ROLES AND RESPONSIBILITIES

• Analysing large data sets; assimilating data from various sources; data mining, data extraction and data
visualization using data analytics tools; and performing operational/statistical analysis (SAS) on the data
extracted using reporting tools.
• In-depth knowledge of Oil & Gas domain BI technologies and a solid understanding of how data is turned into
information and knowledge, and how that knowledge supports or enables business processes.
• Creating Business Review documents & ICD for freezing the scope and finalizing the requirements.
• Created workflows in Informatica for data cleansing and extraction.
• Creating Kafka topics for streaming data, performing transformations using Hive and Scala, and ingesting XML data.
• Building, operating, monitoring and troubleshooting Hadoop infrastructure.
• Building, maintaining and spreading best practices across the organisation.
• Developed workflows in Bedrock for data integration, data cleansing, data ingestion and data transformation.
• Planning effectively to set priorities and manage projects, identifying roadblocks and working to remove them, and
understanding the importance of meeting client/internal deadlines.
• Design and develop state-of-the-art, data-driven exploratory analysis as well as predictive and decision models to
solve clients’ business problems across different domains.
• Developed and used shell scripts (Unix), Java and Scala for file manipulation, data loading and transformation.
• Experienced in the Hadoop ecosystem, cloud technologies (AWS, Azure, Google Cloud), in-memory database systems
(Impala, MPP) and other database systems – traditional RDBMS (Teradata, SQL Server, Oracle) and
NoSQL databases (Cassandra).
• Developed shell scripts (Unix), Macros (VBA) and Java to automate file manipulation and data loading.
• Experienced with the Microsoft Azure platform, managing subscriptions and infrastructure effectively.
• Building and managing data tracking models to ensure robust data capture across the catering organization.
Worked with external data sources, suppliers and other teams to incorporate information relevant to catering into
monthly performance reports and dashboards.
• Experienced working with Big Data technologies: creating data lakes and working on Hive, Spark, Kafka, Pig, Sqoop and
Flume; creating Kafka topics for streaming data, performing transformations using Hive and ingesting XML data (see the
sketch after this list).
• Created reports on the extracted data for the client in Power BI and Tableau as per the requirements, using tables,
bar graphs, pie charts, drill-downs (hierarchies), live reports, etc.
• Utilised Power BI and Tableau to create analytical dashboards depicting critical KPIs such as
billing hours, scorecards and case proceedings, with dynamic views and slicers for slicing and dicing.
• Working with version management tools such as GitLab for maintaining different versions of the code.
• Identifying the correct business points of contact for requirement gathering and system study.
• Following an Agile approach so that deliverables are completed within SLA; using Jira to lock delivery dates.
• Preparing presentations using an analysis framework to communicate insights to stakeholders.
• Coordinating with vendors to grant the development team the required access; creating CRs for
any new requirements added to an already locked RFQ.
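
As an illustration of the Hive-based cleansing and transformation work described above, a minimal Spark-on-Hive sketch in Scala; the database, table and column names are hypothetical and the curated target table is assumed to already exist:

    // Illustrative only: deduplicate and standardise raw ingested records, then
    // publish a curated Hive table for reporting in Power BI / Tableau.
    // Database, table and column names are placeholders.
    import org.apache.spark.sql.SparkSession

    object EquipmentCleanse {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("equipment-cleanse")
          .enableHiveSupport()
          .getOrCreate()

        spark.sql(
          """
            |INSERT OVERWRITE TABLE curated.equipment_orders
            |SELECT DISTINCT
            |       upper(trim(order_id))  AS order_id,
            |       to_date(order_ts)      AS order_date,
            |       coalesce(quantity, 0)  AS quantity
            |FROM   raw.equipment_orders
            |WHERE  order_id IS NOT NULL
          """.stripMargin)
      }
    }
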
INFOSYS LIMITED

1. INTERMEDIARY PROFILE SERVICES (Technical Analyst) June’15-June’17

IPS is an architecturally significant deliverable whose sole purpose is to provide an application abstraction layer between
systems both inside and outside CGC (Capital Group of Companies) for the AF line of business. The main goal is to
allow current and future systems to use IPS as the go-to system for basic information about financial intermediaries.

RESPONSIBILITIES

• Analysing large data sets; assimilating data from various sources; data mining, data extraction and data
visualization using data analytics tools; and performing operational/statistical analysis (SAS) on the data
extracted using reporting tools.
• In-depth knowledge of BFSI domain BI technologies and a solid understanding of how data is turned into
information and knowledge, and how that knowledge supports or enables business processes.
• Knowledge of advanced Excel functions such as pivot tables, charts, text to columns, V/H lookups, logical functions,
IF statements, nested IF statements, date formulas, INDEX, MATCH and string functions.
• Building and managing data tracking models to ensure robust data capture across the catering organization.
Worked with external data sources, suppliers and other teams to incorporate information relevant to catering into
monthly performance reports and dashboards.
• Validated all data sources and extracted information from suppliers to develop negotiation strategies for all
supplies ensuring best value is achieved from all spend.
• Utilised Power BI and Tableau to create analytical dashboards depicting critical KPIs such as
billing hours and scorecards, with dynamic views and slicers for slicing and dicing.
• Created reports on the extracted data for the client in Power BI as per the requirements, using tables, bar
graphs, pie charts, drill-downs (hierarchies), live reports, etc.
• Developed mappings in Informatica for Data Integration, Extraction and Data Transformation.
• Support requirements management and change management processes.
• Work with the Change Manager and the Learning and Development Manager on the design of learning
programs to support the effectiveness and adoption of the change.
• Weekly calls with the client (Capital Group of Companies) to understand their requirements and suggest solutions.
• Supported all teams, including QA, SIT and UAT, in producing analytical reports based on data captured from financial
business areas, showing recommendations; produced monthly dashboards of key performance metrics for senior
management review and a monthly balanced scorecard.
• Experienced in SQL Server, Oracle for database connectivity and data manipulation.
• Developed shell scripts (Unix), Macros (VBA), Scala and Java to automate file manipulation and data loading.
• Creating documents for unit test cases; analysing and investigating reported defects.
• Working with the QA, SIT and UAT teams to ensure proper testing before code is migrated to production.
• Working with version management tools such as GitLab for maintaining different versions of the code.

2. AML METRIC (Technical Financial Analyst) Sep’14-June’15

The AML Metrics application provides ETL solutions in the area of Anti-Money Laundering for Citi Private Bank. It is also
known as the COMRAD application. It has many interfaces which receive data from various source systems, mostly in the
form of flat files. Data is cleansed, transformed and loaded into Oracle tables, which are then used for reporting purposes.
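
The project itself implemented this flow with Ab Initio graphs; purely as an illustrative sketch of the same flat-file-to-Oracle pattern, here is a minimal Spark example in Scala with hypothetical file, column, table and connection details:

    // Illustrative only: the actual implementation used Ab Initio graphs.
    // File layout, column, table and connection details are placeholders.
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, trim}

    object AmlFlatFileLoad {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("aml-flatfile-load").getOrCreate()

        // Read one interface's pipe-delimited flat file from the landing area.
        val feed = spark.read
          .option("header", "true")
          .option("delimiter", "|")
          .csv("/landing/aml/transactions.dat")                    // placeholder path

        // Basic cleansing: trim the key and drop rows missing the account number.
        val cleansed = feed
          .withColumn("account_no", trim(col("account_no")))
          .filter(col("account_no").isNotNull)

        // Load into the Oracle reporting table over JDBC.
        cleansed.write.format("jdbc")
          .option("url", "jdbc:oracle:thin:@//db-host:1521/AMLDB") // placeholder
          .option("dbtable", "COMRAD.TXN_STAGING")                  // placeholder
          .option("user", sys.env("ORA_USER"))
          .option("password", sys.env("ORA_PASS"))
          .mode("append")
          .save()
      }
    }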

RESPONSIBILITIES

• Analysing large data sets; assimilating data from various sources; data migration and data cleansing using data
analytics tools; and performing operational/statistical analysis on the data extracted using reporting tools.
• Validated all data sources and extracted information from suppliers to develop negotiation strategies for all
supplies ensuring best value is achieved from all spend.
• Weekly calls with the client (Capital Group of Companies) to understand their requirements and suggest solutions.
• Supported all teams in producing analytical reports on Power BI and SSRS based on data captured from
financial business areas, showing recommendations; produced monthly dashboards of key performance
metrics for senior management review and a monthly balanced scorecard.
• Developed Generic graphs in Ab Initio for data cleansing, data validation and data transformation.

SUMMER INTERNSHIPS AND ACADEMIC PROJECTS

Jun’12-Jul’12 NTPC Badarpur, Thermal Power Station Delhi, India


Learned the process of power generation at a thermal power plant, where water is converted into steam
that drives a turbine connected to an alternator to produce electricity.
Jul’12-Aug’12 Project in Embedded System Design at EMTECH FOUNDATION Delhi, India
Designed a microcontroller-based radio frequency identification (RFID) door lock system.

Jun’13-Jul’13 Delhi Transco Limited (DTL) Delhi, India


Gained hands-on experience with various electrical devices.

Feb’14-Apr’14 Major Project Delhi, India


Built a security access control system using RFID and Bluetooth, a two-level security system: the first
level is the RFID (radio frequency identification) check and the second level is a password entered via
Bluetooth. The project was among the top three EEE projects selected from our college for the
university competition.

TECHNICAL SKILLS
▪ Languages: C, C++, Java, Scala
▪ ETL Tools: Ab Initio, Informatica BDM, SSIS, Zaloni Bedrock
▪ Reporting Tools: Cognos, SSRS, Tableau, Power BI, Excel
▪ Advanced Excel, Macros (VBA), SAS, Spark, Hadoop, Hive, Sqoop, Flume, Oozie, Kafka, Pig, Dataiku
▪ Databases: Oracle 10g, MS SQL Server, Access, Datalake, Cassandra
▪ Scripting: UNIX Shell

ACHIEVEMENTS

▪ Certificate of Appreciation (Shukran Award) for successfully completing OCT project at Etihad Airways.
▪ Certificate of honour for securing 100% marks in engineering drawing in class XII CBSE examination (2010).
▪ Certificate of appreciation for being the best trainee in Infosys Limited, Chandigarh, at unit level (Business
Intelligence).
▪ Certificate of appreciation for excellent work done in the IPS project (Informatica developer) in Infosys Limited,
Chandigarh.
▪ Certificate of excellence for securing first position in a non-technical event at Techsurge and Mridang’11.
▪ Certificate for voluntary services - member of The Synodical Board of Health Services, with the opportunity
of working with Ms. Karuna Roy, one of the leading social activists.
▪ Member of EDC (Entrepreneurship Development Cell). EDC provides a platform to the students to showcase
their entrepreneurship skills and for this various events are organised at the college level.
▪ Member of the Event Organising Group for Data and Analytics unit at Infosys Limited, Chandigarh. Responsible
for organising the quarterly team engagement parties and sports events at the unit level.

HOBBIES
Playing cricket and basketball, exploring new dishes, travelling to new places, listening to music and reading novels.

PERSONAL

Father’s Name: Mr. Sanjay Suri


Date of Birth: 10th January, 1992
Gender: Male
Interpersonal Skills: Self-motivated, hardworking, team player, good communication skills, strong logical and
problem-solving skills, confident and determined.
Passport No: K0438314
UAE D/L: Yes
