
About The Company, Inc.

The Company, Inc. provides on-demand retail and delivery of products. The Company offers
everything from college essentials, party supplies, and accessories to snacks, frozen foods, and
household essentials. The Company serves customers in the State of Pennsylvania.

About the role

The Company's engineering team is building solutions to dramatically change the way people
purchase their daily goods. We provide the modern-day solution to meet customers' immediate
everyday needs with products ranging from snacks and ice cream to household goods and beer at
the click of a button.

As an Analytics Engineer at The Company, you'll be responsible for building and maintaining the
bridge between our data and the rest of our organization. You'll help ensure the integrity,
reliability, and usability of production-scale data for all teams as they make critical data-driven
decisions. This role sits on the Analytics Engineering team within our Engineering
organization. You'll have the opportunity to make a huge impact, as the future success of The
Company will largely depend on our ability to execute and scale operationally.

Our stack currently consists of Stitch and various data engineering pipelines for extraction and
loading, plus Snowflake, dbt, and Looker. If you have expertise in these tools or an appetite to
learn more about them, let's chat!
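To give a concrete flavor of working against this stack, below is a minimal Python sketch that runs a query on Snowflake using the snowflake-connector-python package. The connection values and the fct_orders table are placeholders for illustration, not details from this posting.

    # Minimal sketch: query Snowflake from Python.
    # All connection values and the fct_orders table are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account",      # placeholder
        user="your_user",            # placeholder
        password="your_password",    # placeholder; use a secrets manager in practice
        warehouse="ANALYTICS_WH",    # placeholder warehouse
        database="ANALYTICS",        # placeholder database
        schema="MARTS",              # placeholder schema
    )
    cur = conn.cursor()
    try:
        cur.execute(
            "select order_id, ordered_at, total_amount "
            "from fct_orders order by ordered_at desc limit 10"
        )
        for row in cur.fetchall():
            print(row)
    finally:
        cur.close()
        conn.close()

In a stack like this, most transformation logic would typically live in dbt models rather than in ad hoc scripts; the snippet is only meant to show the kind of warehouse access the role involves.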

Responsibilities

• Be our go-to data expert and have a deep understanding of our data warehouse and data
processing layers.
• Partner with data analytics, product managers, engineers, data scientists, and business
stakeholders to obtain a holistic understanding of each team's data needs and translate it
into infrastructure and tooling that enables a data-driven culture.
• Own data integrity, availability, transformation logic, and efficient data access to support
the growing needs of the organization.
• Identify gaps in existing data, create data product specs, and work with Engineering
teams to implement data tracking.
• Incorporate automation wherever possible to improve data pipelines and analyses.
• Create testing and monitoring systems to ensure data quality and observability (see the
sketch after this list).
• Build and maintain documentation to ensure data accessibility to all stakeholders.
• Mentor Analytics Engineering teammates on best practices and ensure all data products
are held to the highest standards.
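As one illustration of the testing and monitoring bullet above, here is a minimal Python sketch of data quality checks written as queries that return rows only when something is wrong, mirroring the idea behind dbt's not_null and unique tests. The connection values and the fct_orders table are hypothetical placeholders.

    # Minimal data quality sketch: each query returns a row only when its check fails.
    # Connection values and the fct_orders table are hypothetical placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="your_account",
        user="your_user",
        password="your_password",
        warehouse="ANALYTICS_WH",
        database="ANALYTICS",
        schema="MARTS",
    )

    CHECKS = {
        "fct_orders.order_id is never null":
            "select 1 from fct_orders where order_id is null limit 1",
        "fct_orders.order_id is unique":
            "select order_id from fct_orders group by order_id "
            "having count(*) > 1 limit 1",
    }

    cur = conn.cursor()
    try:
        for name, sql in CHECKS.items():
            cur.execute(sql)
            failed = cur.fetchone() is not None
            print(("FAIL" if failed else "PASS") + ": " + name)
    finally:
        cur.close()
        conn.close()

In practice, checks like these would usually be expressed as dbt tests and scheduled with the rest of the pipeline rather than run as a standalone script.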

Must Have

• Bachelor's in Engineering, Business, Information Systems, or other quantitative discipline
• 5+ years of professional experience architecting and building data models that integrate
complex and disparate data sources using Snowflake
• Expert in SQL and database table design - able to write structured and efficient queries on
large data sets
• Experience and strong knowledge of data warehousing concepts, big data technologies,
and analytics platforms. Redshift, Azure, and/or Snowflake experience strongly preferred
• Experience working with Git-based workflows
• Excellent communication skills to work with stakeholders to translate business needs and
ideas into tractable work items.
• Top-notch organizational skills and ability to manage multiple projects in a fast-paced
environment
• Move fast, be a team player, always be learning and give back
• At least 8 hours of working-hour overlap with EST

Nice to Have

• Master's degree in Engineering, Business, Information Systems, or other quantitative discipline
• Experience with ETL/ELT tools (bonus points for dbt!)
• JavaScript or Python

