SR Data Engineer - Oracle
We received an updated and detailed JD for the Data Engineer - Oracle role; please see all the
details below. We even got the technology stack details this time, so we understand the role better.
There are 3 positions, C2C is allowed, and we will pay $80-85/hr based on candidate experience. We
need to close these roles soon. Even though the JD below says 5+ years of experience, we need
senior profiles.
We have an open role for an Oracle Data Engineer. We are looking for candidates who are
onsite/onshore (or in a similar EST time zone). The job description with basic requirements follows,
along with the tech stack showing the general technologies used by the team.
Job Summary:
Using the latest Oracle technologies, the Sr. Data Engineer will drive multiple data initiatives applying
innovative architecture that will eventually scale to the cloud. You will work to develop the
Enterprise Data Warehouse to support BI/Analytics for the Linear Ad Sales Unified Platform. You will
work across the different stages of the data pipeline, including acquisition, integration, ODS, and
real-time data marts. You will leverage CDC (change data capture) and data integration tools such
as SAP Data Services and Oracle GoldenGate to deliver real-time data for the Ad Sales BI initiative.
You will partner with architects, engineers, QA, and the scrum team to deliver data through an Agile
philosophy.
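For context on the CDC pattern described above, here is a minimal sketch of how change rows
landed by a tool like GoldenGate might be applied to a real-time data mart with an Oracle MERGE.
All schema, table, and column names (mart.ad_sales_orders, stage.ad_sales_orders_cdc, op_type)
are hypothetical, not taken from this JD.

    -- Sketch only: apply CDC change rows from a staging table into a
    -- real-time data mart table. Object and column names are hypothetical.
    MERGE INTO mart.ad_sales_orders tgt
    USING (
        SELECT order_id, advertiser_id, gross_amount, op_type
        FROM   stage.ad_sales_orders_cdc
    ) src
    ON (tgt.order_id = src.order_id)
    WHEN MATCHED THEN
        UPDATE SET tgt.advertiser_id = src.advertiser_id,
                   tgt.gross_amount  = src.gross_amount
        DELETE WHERE src.op_type = 'D'   -- drop rows deleted at the source
    WHEN NOT MATCHED THEN
        INSERT (order_id, advertiser_id, gross_amount)
        VALUES (src.order_id, src.advertiser_id, src.gross_amount);

In practice the apply step would usually be driven by a GoldenGate replicat or a scheduled
micro-batch; the MERGE only illustrates the apply semantics.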
We are looking for a creative and talented individual who loves to design scalable platforms that
operate at petabyte scale and extract value from both structured and unstructured real-time data.
Specifically, we are looking for a technology expert to build a highly scalable and extensible data
platform that enables collection, storage, modeling, and analysis of massive data sets from
numerous channels. You must be self-driven to continuously evaluate new technologies, innovate,
and deliver solutions for business-critical applications with little to no oversight from the
management team.
Basic Requirements:
· The Sr. Data Engineer will work directly on the Oracle enterprise data warehouse (EDW) to
deliver batch and real-time data for analytics and reporting capabilities to the Linear Ad Sales
business unit.
· Work across the different stages of the EDW data pipeline, using tools such as Oracle Exadata
(ExaCC), SAP Data Services, Oracle GoldenGate, Oracle Big Data Extension, and Kalido DIW.
· Think of new ways to make our data platform more scalable, resilient, and reliable, and then
work across our team to put your ideas into action.
· Ensure performance is optimized for real-time data by implementing and refining robust data
processing across the landing, integration, and data mart layers (a tuning sketch follows this list).
· Help us stay ahead of the curve by working closely with product, data modelers, API developers,
the DevOps team, and analysts to design systems that can scale elastically.
· Mentor other software engineers by developing reusable frameworks. Review design and code
produced by other engineers.
· Provide expert-level advice to data scientists, data engineers, and operations to deliver
high-quality analytics through machine learning and deep learning, served via data pipelines and
APIs.
· Embrace the DevOps mentality to build, deploy, and support applications in the cloud with
minimal help from other teams.
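To illustrate the kind of performance work named in the tuning bullet above, a minimal sketch using
standard Oracle tooling (DBMS_STATS, EXPLAIN PLAN, DBMS_XPLAN); the MART schema and
AD_SALES_ORDERS table are hypothetical examples, not from this JD.

    BEGIN
        -- Refresh optimizer statistics so the cost-based optimizer
        -- sees current data volumes (cascade includes index stats)
        DBMS_STATS.GATHER_TABLE_STATS(
            ownname => 'MART',
            tabname => 'AD_SALES_ORDERS',
            cascade => TRUE);
    END;
    /

    -- Inspect the execution plan for a representative mart query
    EXPLAIN PLAN FOR
    SELECT advertiser_id, SUM(gross_amount)
    FROM   mart.ad_sales_orders
    WHERE  load_ts >= SYSDATE - 1
    GROUP  BY advertiser_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);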
Required
· Bachelor’s degree or better in Computer Science or a related technical field, or equivalent job
experience.
· 5+ years of experience working as an Oracle / Snowflake database developer (Oracle 11g or
greater) or with similar technologies
· 5+ years of development experience in a data warehousing, data mart, or big data environment
· 5+ years of data warehouse / data mart design experience
· Experience with PL/SQL, SQL, and database performance tuning and optimization.
· Experience with ETL and other data integration tools such as SAP Data Services, Oracle
GoldenGate, and the Tidal Enterprise Scheduler.
· Ability to develop, implement, and maintain standards established by the architecture and
development teams.
· Experience implementing code quality tools (e.g., PEP 8/Pylint) or any other code quality
tooling.
Preferred
· Experience with AWS, Kafka, Snowflake, Airflow, or other cloud technologies