Big Data Engineer - Python Java Bigquery PySpark Google Cloud_Infogain_JD

The job description is for a Google Cloud Data Engineer position requiring 5.5 to 11 years of experience and expertise in Python, Java, BigQuery, PySpark, and Google Cloud. Responsibilities include designing and developing features, mentoring junior engineers, and improving system efficiency and scalability. The role is hybrid and based in Bengaluru, focusing on GCP data services and ETL processes.


Job Description

Position TID: TH41224_88478


Role: Google Cloud Data Engineer
Educational Qualification: BE/BTech/MCA/MSc/Any Degree
Experience Required: 5.5 to 11 Years
Primary Skills: Python, Java, BigQuery, PySpark, Google Cloud
Work location: Bengaluru
Mode of work: Hybrid

Google Cloud Data Engineer_Infogain

Core Skills
• Extensive experience with Google Cloud Platform (GCP) data services such as BigQuery, Cloud Storage, and Dataflow.
• Expertise in ETL (Extract, Transform, Load) processes and data integration on GCP.
• Strong SQL and database query optimization skills on GCP.
• Experience with data warehousing and data architecture on GCP.
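To illustrate the kind of ETL work the role involves, here is a minimal sketch of a transform step. The schema and field names (`region`, `amount`) are hypothetical; in practice a step like this would run as a PySpark job (e.g. on Dataflow) with its output loaded into BigQuery, but the cleaning-and-aggregation logic is shown in plain Python so it is self-contained.

```python
# Hypothetical transform step: drop malformed event rows, then
# aggregate revenue per region -- the shape of a typical ETL stage
# before loading results into a warehouse table such as BigQuery.
from collections import defaultdict

def transform(rows):
    """Filter out rows with a missing region or invalid amount,
    then return total amount per region."""
    totals = defaultdict(float)
    for row in rows:
        region = row.get("region")
        amount = row.get("amount")
        # Keep only rows with a region and a non-negative numeric amount.
        if region and isinstance(amount, (int, float)) and amount >= 0:
            totals[region] += amount
    return dict(totals)

raw = [
    {"region": "EU", "amount": 10.0},
    {"region": "EU", "amount": 5.5},
    {"region": "US", "amount": 7.0},
    {"region": None, "amount": 3.0},   # dropped: missing region
    {"region": "US", "amount": -1.0},  # dropped: negative amount
]
print(transform(raw))  # {'EU': 15.5, 'US': 7.0}
```

The same filter/aggregate pattern maps directly onto PySpark (`DataFrame.filter` followed by `groupBy(...).sum(...)`) when the data volume requires a distributed runtime.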

Responsibilities
• Design, code, and develop new features; fix bugs; add enhancements.
• Analyze and improve the efficiency, scalability, and stability of various system resources.
• Lead and mentor junior engineers and drive a culture of technical excellence.
• Drive creative and innovative solutions to complex problems, exemplifying sound technical judgment.
• Drive improvements and new approaches to address potential systemic pain points and technical debt; anticipate and avoid problems.
• Take a hands-on approach in developing prototypes, independently and with others, to establish design decisions and/or technical feasibility.
• Evaluate, install, set up, maintain, and upgrade data engineering, machine learning, and CI/CD infrastructure tools hosted on cloud (GCP/AWS).
• Drive CI/CD infrastructure tooling work in collaboration with various internal teams to bring user stories, epics, and goals to closure.
• Propose, participate in, and implement architecture-level enhancements and changes strategically across Dev/Stage/Prod environments.
• Design, evangelize, and deliver comprehensive best practices and efficient usage of available tooling resources/capabilities to run high-performance systems.
• Provide innovative and strategic solutions, along with cost and risk analysis, to improve the stability, scalability, and performance of the tools' infrastructure.
• Perform troubleshooting, analysis, and resolution of environmental issues.
• Innovate and automate processes/tasks to improve operational efficiency.
• Document and maintain application setup, runbooks, administration, and troubleshooting guides.
