
Description: Data Analytics Engineer

Background:
As part of the Banking department's technology transformation programme, we are undertaking a significant journey to leverage modern data technologies and unlock new capabilities such as real-time streaming and advanced analytics. We are looking for a data engineer to join our team and help us implement new data sets and migrate existing data sets to our on-premises data lake, which will eventually move to the cloud.

This role tackles a range of complex software and data challenges, including data management, advanced analytics and business intelligence. It is pivotal in implementing, maintaining and supporting data pipelines on shared data platforms, exploiting modern data technologies and software development practices.

Beyond the technology, this role requires continuous collaboration, not just within the team but also outward with expert business analysts, technologists, project managers, data scientists and statisticians, and sometimes with counterparts in other international organisations and central banks.

Perfect candidate:
A technically capable Data Engineer who has worked with a similar technology stack and is a collaborative team player.

General Information:
• Start date: ASAP
• Latest start date: could wait a couple of months for the right profile
• End date: 12 months
• Work location: Basel
• Workload: 100%
• Travelling: n/a
• On Call: potentially
• Team: Embedded in the Banking IT team, interacting with other stakeholders within Banking.
• Department: Banking
• Unit: Banking IT

Tasks & Responsibilities:

• Provide implementation support on a variety of data management and analytics projects using the
Bank’s approved databases (big data as well as relational databases) and analytical technologies
• Take responsibility for development and follow best practices in the big data environment
• Develop the required code for optimal extraction, transformation and loading of data from a wide
variety of data sources using SQL and big data technologies (a minimal sketch follows this list)
• Translate, load and present disparate datasets in multiple formats/sources including JSON
• Translate functional and technical requirements into detailed design
• Participate in and contribute to the overall data architecture and the design of Big Data & Advanced
Analytics solutions
• Design, develop, construct, install, test and maintain highly scalable, robust and fault-tolerant
data management and processing systems by integrating a variety of programming languages and tools
• Develop end-to-end data pipelines: defining database structures, applying business rules and
functional capabilities, security, data retention requirements, etc.
• Work with stakeholders including the business area sponsors, product owners, data architect, data
engineers, project managers and business analysts to assist them with their data-related technical
issues and support their data needs.
• Work with the big data solution architect to strive for greater standardisation across the various data pipelines
• Ensure effective design and development of system architectures through frequent product deliveries,
employing effective governance methods for transparency and communication
• Remain up-to-date with industry standards and technological advancements that will improve the
quality of your outputs
• Propose and implement ways to improve data quality, reliability & efficiency of the whole system.
• Interact with the business to identify, capture and analyse business requirements
• Develop functional specifications in a team environment, as well as derive use cases where appropriate
• Assist and support proofs of concept as Big Data technology evolves
• Ensure solutions developed adhere to security and data entitlements
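
For illustration, a minimal sketch of one such ETL step in PySpark is shown below. The paths, table name and filter rule are hypothetical placeholders, not the Bank's actual data sets; it only shows the shape of the extract-transform-load work described above.

    # Minimal PySpark ETL sketch: ingest a JSON data set, apply a business
    # rule, and persist it as a partitioned table in the data lake.
    # Paths, table names and the filter rule are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("ingest_trades_json")   # hypothetical job name
        .enableHiveSupport()             # register tables in the Hive metastore
        .getOrCreate()
    )

    # Extract: disparate source data arrives as JSON files.
    raw = spark.read.json("/landing/trades/2024-07-01/")  # hypothetical path

    # Transform: apply a business rule and normalise column types.
    trades = (
        raw.filter(F.col("status") == "SETTLED")           # hypothetical rule
           .withColumn("trade_date", F.to_date("trade_ts"))
           .select("trade_id", "counterparty", "amount", "trade_date")
    )

    # Load: write a partitioned table that downstream SQL and BI tools can query.
    (trades.write
           .mode("overwrite")
           .partitionBy("trade_date")
           .saveAsTable("lake.trades"))                    # hypothetical table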

Must haves:

• Experience in building data ingestion pipelines for data warehouse and/or data lake architecture (****)
• Hands-on development using open-source data technologies such as Hadoop, Hive, Spark, HBase,
Kafka, Impala, ELK, etc., preferably with Cloudera (****)
• Strong experience in data modelling, design patterns and building highly scalable applications (****)
• Experience with at least one major programming language: C#, Python, Java, etc. (**)
• Experience with relational SQL and NoSQL databases: SQL Server, Sybase IQ, Postgres, Cassandra, etc. (**)
• Experience with CI/CD pipelines and agile methodologies such as Scrum and Kanban (**)
• Experience in automated testing, test-driven development, debugging, troubleshooting and
optimising code (see the test sketch after this list) (**)
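
As a flavour of the test-driven development expected here, a minimal pytest sketch is shown below; the transformation function and its rule are hypothetical and stand in for real pipeline logic.

    # Minimal test-driven development sketch (pytest): the tests pin down
    # the expected behaviour of a small, hypothetical transformation
    # before it is wired into a data pipeline.
    import pytest

    def normalise_currency(amount: float, rate: float) -> float:
        """Convert an amount into the reporting currency (hypothetical rule)."""
        if rate <= 0:
            raise ValueError("exchange rate must be positive")
        return round(amount * rate, 2)

    def test_normalise_currency_converts_and_rounds():
        assert normalise_currency(100.0, 0.915) == 91.5

    def test_normalise_currency_rejects_bad_rate():
        with pytest.raises(ValueError):
            normalise_currency(100.0, 0.0)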

Interpersonal skills:

• Excellent verbal and written communication skills and ability to explain complex technical concepts
in simple terms

Nice to have:

• Experience with data pipeline and workflow management tools: Airflow, Rundeck, NiFi, etc. (see the DAG sketch after this list)
• Experience with stream-processing systems: Kafka, Spark Streaming, etc.
• Experience with cloud-based technologies such as Databricks, Snowflake, Azure Synapse
• Knowledge of microservices architecture and experience with API creation and management
technologies (REST, SOAP, etc.)
• Experience supporting and working with cross-functional teams in a dynamic environment
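
For the workflow management point above, a minimal Airflow sketch is shown below; the DAG id, schedule and task bodies are hypothetical placeholders that only illustrate how an ingest step and a quality check would be chained.

    # Minimal Airflow sketch: a daily DAG chaining an ingest step and a
    # data quality check. DAG id, schedule and task logic are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest():
        print("load JSON files into the staging area")   # placeholder step

    def check_quality():
        print("run row-count and schema checks")         # placeholder step

    with DAG(
        dag_id="daily_trades_pipeline",    # hypothetical name
        start_date=datetime(2024, 7, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
        quality_task = PythonOperator(
            task_id="check_quality", python_callable=check_quality
        )
        ingest_task >> quality_task        # quality check runs after ingest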

Next steps:
Interviews: First-round interviews would preferably start in the week of 08.07.2024
Info: Please keep in mind that this job posting will be closed on Friday 05.07.2024 at 13:30 Swiss time.
