

Job Description & Job Code SE-DP-IN-03

Title: Software Engineer - ETL/ELT/DWH/BI - Data Practice


Location: Hybrid, flexible, periodic physical meetings/work required

At Limendo, our people & technology enable business innovation. We’ve
established ourselves as a leading employer in Bolzano and are now turning our
attention to Bengaluru. We’re looking for highly skilled software engineers to join
our data practice & technology team. Our ideal candidate will have expert
knowledge of software development processes and solid experience working
across the breadth of the data life cycle, including testing and deploying
applications in the cloud. If writing high-quality, elegant, and meticulous code is
among the talents that make you tick, we’d like to hear from you.

Objectives of this Role

• Design data storage systems; collect, process, and transform data for
storage. Be involved in data collection decisions, apply best practices in
managing data, and make choices based on the type of data for
appropriate storage, processing, and retrieval. Ensure infrastructure
availability and optimization.
• Design and build data warehouses or data lakes; manage heterogeneous
data across various sources, analyze data, uncover insights, and enable
the use of data for further processing; identify missing data and take
suitable action. Classify, clean, and process data for use in downstream
processes and applications. Ensure data quality.
• Test and troubleshoot data warehouses or data lakes, and take action to
continuously improve the data collection, processing, transformation,
and storage required for the business. Perform exploratory data analysis
as required.
• Create data visualizations for a defined audience, enable sense-making,
augment the use of data for business decision making, and identify data
monetization opportunities.
• Build, modify, and extend downstream applications that generate data;
automate application testing; and take on tasks and assignments across
the software and data life cycles.
• Contribute to the development and engineering of existing products and
deliver mission-critical software across multiple industry verticals. Work
with partners, collaborators, and cross-functional, distributed teams in an
agile setting. Champion innovation for our products and customers
through participation in ideation, prototyping, and validating product fit.

Daily and Monthly Responsibilities

• Design, develop, and modify data systems, using scientific analysis and
mathematical models to predict and measure the outcomes and
consequences of design decisions
• Develop and direct data system testing and validation procedures,
programming, and documentation; analyze information to determine,
recommend, and plan data system reports, layouts, and modifications
• Build the infrastructure required for optimal extraction, transformation,
and loading of data from a wide variety of data sources using SQL and
AWS ‘big data’ technologies
• Write complex SQL queries, sub-queries, and complex joins
• Develop ETL pipelines on Azure/AWS for data movement, with
experience mapping source-to-target rules and fields (Redshift or
Snowflake)
• Implement natural keys, surrogate keys, and slowly changing dimensions
(SCD, including SCD Type 2) in the data warehouse to support history;
set up automated triggers to take snapshots, maintain history, and take
backups and restore them in case of failures
• Translate business requirements and design jobs for data migration from
various data sources; RDBMS design, development, and performance
tuning, with experience in MS SQL Server 2014 database technologies
and ETL best practices with MS SQL Server
• Use Business Intelligence Development Studio to build SSIS packages
• Excellent troubleshooting and optimization skills: interpreting ETL logs,
performing data validation, dissecting SSIS code, and understanding the
benefits and drawbacks of parallelism; experience with change data
capture, expressions, variable scoping, commonly used transforms, event
handlers, and logging providers; ability to understand and optimize
surrogate key generation and inconsistent data type handling

Nice to have

• Visualize, design, and develop creative and innovative software.
• Contribute to the development and engineering of new and existing
products; deliver mission-critical software, across multiple industry
verticals, that is unit-tested, code-reviewed, and enables continuous
integration & deployment.

Skills and Qualifications

• Bachelor’s degree in Computer/Software Engineering or Information
Technology
• 1–4 years of experience as a software engineer with ETL/DWH,
databases, and storage technologies (e.g., Azure Data Pipeline, Apache
Kafka data streams)
• Experience with rapid development cycles in an agile/iterative setting
• Strong scripting and test automation abilities
• Experience in a cloud setting is preferred: AWS, Azure, or GCP
• Server & Databases: MS SQL Server, MySQL, PL/SQL, Oracle, Node.js
server, PostgreSQL, MongoDB, Firebird
• Business Intelligence Solutions: SSIS, SSAS, SSRS, MDX, OLAP;
experience in Hadoop technologies; Data Modeling Tools: MS Visio,
ERwin, …
• Apache Spark and/or PySpark.
• Python, Java, and/or .NET experience
• Data visualization: MS Power BI or similar tools
