
Core - Data - Business Intelligence (Snowflake)

This job description is for a Core - Data - Business Intelligence role based in Hyderabad, India. The responsibilities include designing and implementing fully operational Snowflake data warehouse solutions, ingesting data from various sources, handling high volume data loads and migrations, developing distributed data processing systems, and collaborating with teams to ensure data quality. The ideal candidate has 5+ years of experience in data engineering, is proficient in Snowflake, SQL, and data warehousing concepts, and has strong problem-solving and communication skills.


Job Description of Core - Data - Business Intelligence (Snowflake) for Innova Solutions Private Limited

Title: Core - Data - Business Intelligence (Snowflake)

Location: Hyderabad, Telangana, India

Job Description:

Key Responsibilities:

• 5+ years of strong experience designing and implementing fully operational solutions on the Snowflake Data Warehouse.

• Responsible for ingesting data from files, streams, and databases.

• Should be familiar with bulk loading concepts, external tables, and other advanced Snowflake concepts.

• Responsible for designing and developing distributed, high-volume, high-velocity, multi-threaded event processing systems.

• Responsible for handling high-volume data loads and migrations.

• Responsible for handling the end-to-end lifecycle, from analysis through deployment.

• Analyse, code, and performance-tune UDFs, UDTFs, and stored procedures using SQL, Python, and/or JavaScript (see the illustrative sketch after this list).

• Excellent understanding of Snowflake internals and of integrating Snowflake with other data processing and reporting technologies.

• Hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe, with Snowflake administration, and with loading data from cloud storage (Azure, AWS) and APIs.

• Collaborate with cross-functional teams to ensure data quality, consistency, and integrity.

• Troubleshoot and resolve data-related issues in a timely manner.

• Build, monitor, and maintain data warehouses and ETL processes.

• Optimize and tune SQL queries for performance and reliability.

• Experienced in designing efficient ETL jobs for data loading and data processing.

• Responsible for guiding and leading the team.

• Responsible for guiding the team in agile methodology.

• Should have prior experience working with cloud technologies.

• Understand data pipelines and modern approaches to automating them with cloud-based implementation and testing, and clearly document requirements to produce technical and functional specs.

• Collaborate with data analysts and clients to understand data requirements and translate them into efficient data processing solutions.

• Experience using scheduling tools (Control-M).
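
As a rough, illustrative sketch of the Snowflake work described in the UDF and Snowpipe bullets above, the snippet below defines a simple SQL UDF and a Snowpipe for continuous bulk loading. All object names (normalize_amount, orders_raw, ingest_orders_pipe, @raw_stage) are hypothetical, and the pipe assumes an external stage with cloud event notifications already configured.

-- Hypothetical SQL UDF used inside transformation queries; names are illustrative only.
CREATE OR REPLACE FUNCTION normalize_amount(amount NUMBER, fx_rate FLOAT)
  RETURNS FLOAT
  AS 'amount * fx_rate';

-- Hypothetical Snowpipe that continuously bulk-loads staged CSV files into a raw table.
-- Assumes the external stage @raw_stage exists and event notifications are set up.
CREATE OR REPLACE PIPE ingest_orders_pipe
  AUTO_INGEST = TRUE
  AS
  COPY INTO orders_raw
  FROM @raw_stage/orders/
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);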


Qualifications:

• Bachelor’s degree in computer science, data engineering, or a related field.

• Minimum of 5 years of professional experience in data engineering.

• Proficiency in Snowflake, Informatica, and SQL.

• Strong understanding of data warehousing concepts and best practices.

• Experience in optimizing data pipelines for performance and scalability.

• Strong problem-solving and troubleshooting skills.

• Excellent communication and collaboration skills.

• Ability to work independently and as part of a team (mid-level role).

• Should have prior experience working in an agile environment.

• Key Skills: Data Processing, Data Migration, Data Warehousing, Snowflake, Performance Tuning

• Good to have: Snowflake certification

Experience: 5-8 years
