
MONEET POOSARLA

Data Engineer/Business Analyst| [email protected] | C: 720-819-7445 | www.linkedin.com/in/moneet-poosarla

PROFESSIONAL SUMMARY:
 Data Engineer experienced in building and maintaining a data warehouse model for high-volume data transactions at a BCBS subsidiary firm.
 Brings healthcare and insurance domain expertise to drive data-driven insights, enhance operational efficiency, and enable evidence-based decision-making in complex healthcare and insurance landscapes.
 Well-versed in industry standards and regulations, including HL7, FHIR, and HIPAA, ensuring compliance and data privacy while enabling seamless data exchange between disparate healthcare systems.
 Professional experience in data analysis, data modeling, data integration, data conversion, and data migration in the IT industry.
 Proficient in Power BI and data warehousing technologies such as Google BigQuery and Azure SQL Database, enabling seamless integration and interoperability across diverse data sources for comprehensive analytics solutions.
 Worked on HIPAA Transactions and Code Sets Standards test scenarios, including 270/271, 276/277, and 837/835 transactions.
 Assisted in implementing comprehensive data governance structures within Collibra, including communities, domains,
subdomains, and assets, aligned with organizational needs.
 Knowledge of Collibra Data Intelligence Cloud architecture and components.
 Data catalog management: maintained the data catalog, including data lineage, data dictionaries, business glossaries, and metadata.
 Extensive expertise in Informatica Intelligent Cloud Services (IICS), driving seamless data integration across diverse platforms and applications.
 Strong proficiency in Python, SQL, and NoSQL, with experience in big data technologies such as Apache Spark and Hive; experienced with cloud platforms such as AWS for scalable, cost-effective data processing and storage, and with data integration tools such as Informatica and Talend for seamless data flow across heterogeneous systems.
 Proficient in writing complex SQL queries against OLTP databases, stored procedures, normalization, database design, and creating indexes, functions, triggers, and subqueries.
 Used Google Analytics 4 (GA4) to track and analyze user behavior across the organization's digital platforms; implemented advanced event tracking and custom reports to gain insight into user interactions, optimize user experience, and drive data-driven decisions for marketing and product development strategies.
 Hosted a website on AWS using Bash scripting and implemented CI/CD with Jenkins and Kubernetes.

TECHNICAL SKILLS:
Languages SQL/T-SQL, PL/SQL, Python, C++, Bash/Shell scripting
Cloud AWS Glue, Redshift, RDS, EFS, S3, Lambda, Elastic Beanstalk, ElastiCache, Amazon MQ,
CloudFront, CloudWatch, Route 53, IAM, ACM; Azure; GCP
Databases Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, MySQL 5.1/4.1, DB2 9.1/8.1/7.2, Teradata,
AWS Aurora, AWS RDS, AWS Redshift
Platforms Windows, UNIX, Linux
Methodologies Agile, Scrum, Waterfall
Tools & Middleware Git, JIRA, Jenkins, Informatica, Salesforce, Kubernetes, SSMS, SSIS,
Enterprise BPA, Configuration Manager, Sentry, MS Visual Studio
PROFESSIONAL EXPERIENCE:

Elevance Health BCBS, Richmond, VA | Mar’2021 – Current | Senior Data Engineer/Business Analyst

Responsibilities:
 Served as the primary liaison to stakeholders, gathering data requirements in line with business standards while developing the requested data models and assisting onshore and offshore developer teams.
 Developed data pipelines and ETL processes, scheduled with Apache Airflow, to measure campaign effectiveness across 50+ metrics using Apache Spark.
 Leveraged advanced analytics and machine learning on AWS, using services such as AML, Redshift, and SageMaker, to detect fraudulent claims, improve operational efficiency, and enhance customer experience.
 Developed and managed HL7 interfaces using Mirth Connect to facilitate seamless data exchange between disparate
healthcare systems. Ensured interoperability by configuring and maintaining multiple HL7 message types, including
ADT, ORU, and ORM, leading to a 20% improvement in data accuracy and communication efficiency.
 Designed and implemented real-time ETL processes using Informatica PowerCenter to capture and transform data
from online transaction systems and third-party APIs.
 Wrote SQL/T-SQL queries, stored procedures, functions, and triggers; scheduled jobs to automate database activities including backups, database health monitoring, disk-space checks, and backup verification.
 Rescaled a production-deployed data pipeline for a customer, integrating data from diverse sources such as claims systems, member enrollment databases, and external healthcare providers; ensured data accuracy, integrity, and compliance with regulatory standards, enabling efficient claims processing and accurate reporting.
 Partnered with Strategic Marketing Directors to develop and implement new strategies, inform key business decisions, and build visualizations for executive-level KPI reporting, collaborating with Data Science and Machine Learning teams to drive marketing effectiveness.
 Spearheaded the development of dynamic Power BI reports and interactive dashboards, enabling stakeholders to
make data-driven decisions swiftly and effectively.
 Leveraged Power BI in conjunction with Azure SQL Database to perform complex data manipulations and analysis, driving insights that informed strategic business initiatives and operational improvements.
 Supported marketing by providing insights through three automated reports and updating key metrics in Tableau dashboards.
 Introduced Jenkins for CI/CD and established version-control best practices by introducing Git.
 Completed a data warehouse migration ahead of schedule, enabling a seamless transition that enhanced data accessibility and improved reporting capabilities.
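As a rough illustration of the campaign-metrics ETL described above, the aggregation step might look like the following sketch. Plain Python stands in for the Spark job, and the event fields, metric choice, and function name are hypothetical, not taken from the actual pipeline:

```python
from collections import defaultdict

def campaign_metrics(events):
    """Aggregate raw campaign events into a per-campaign metric.

    Each event is a dict like {"campaign": "...", "action": "click" | "impression"}.
    Returns click-through rate per campaign (one of many possible metrics).
    """
    counts = defaultdict(lambda: {"impressions": 0, "clicks": 0})
    for event in events:
        campaign = counts[event["campaign"]]
        if event["action"] == "impression":
            campaign["impressions"] += 1
        elif event["action"] == "click":
            campaign["clicks"] += 1
    # Guard against division by zero for campaigns with no impressions.
    return {
        name: round(c["clicks"] / c["impressions"], 4) if c["impressions"] else 0.0
        for name, c in counts.items()
    }
```

In the real pipeline, an equivalent aggregation would run as a Spark job triggered on a schedule by an Airflow DAG, with one such computation per metric.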

Anthem BCBS, Richmond, VA | Oct’2018 – Mar’2021 | ETL Developer

Responsibilities:
 Implemented FHIR (Fast Healthcare Interoperability Resources) standards to facilitate seamless data exchange
between healthcare systems, enhancing the accuracy and accessibility of patient information.
 Exported data to Snowflake and analyzed it for visualization and report generation for the BI team. Used Amazon Elastic Compute Cloud (EC2) infrastructure for computational tasks and Simple Storage Service (S3) for storage.
 Built data objects using ETL and programmed scripts to efficiently automate business data-requirement processes.
 Leveraged Informatica Intelligent Cloud Services to build, test, and deploy automated data jobs, integrated them with the production and UAT process-automation tools, and supported troubleshooting and debugging.
 Constructed clear, precise queries enabling business users to access and validate processed data, and documented the entire lifecycle of events for effective knowledge transfer.
 Leveraged the Collibra Lineage Harvester to harvest and re-harvest mapping and transformation logic from source code (DML/DDL scripts, SQL, and ETL scripts).
 Configured and managed data quality rules, scorecards, and dashboards within the Collibra Data Quality platform.
 Harvested lineage and loaded data catalogs from Databricks, including Excel-based manual documentation of lineage for the Bronze (L1), Silver (L2), and Consumption (L3) layers.
 Worked across multiple data environments, including AWS and in-house server systems.
 Maintained the real-time ETL pipeline, ensuring high availability and minimal data latency, while proactively
resolving data integration issues within minutes to prevent business disruptions.
 Developed Informatica workflows to perform data profiling, cleansing, and deduplication on patient records,
resulting in a 25% reduction in data errors and a 20% increase in compliance with healthcare data standards.
 Implemented data cleaning strategies, performing data deduplication and standardization processes on member
demographic data. Developed algorithms and rule-based approaches to identify and merge duplicate records,
resulting in a 30% reduction in duplicate member entries and improved data accuracy for claims processing and
member analytics.
 Generated Tableau reports to evaluate KPIs and provide actionable insights for business decisions.
 Conducted Enterprise-Wide Data Migration by seamlessly transferring terabytes of data from on-premises systems
to the cloud using IICS. Achieved zero data loss and minimized downtime, ensuring business continuity.
 Spearheaded a data quality improvement initiative for a healthcare provider, integrating data cleansing and
validation processes within IICS. This resulted in enhanced data accuracy, compliance with regulatory standards, and
improved patient care outcomes.
 Managed Cross-Platform Analytics integration utilizing Informatica IICS by Integrating diverse analytics platforms
(e.g., Tableau, Power BI) for a business intelligence-focused organization. The unified analytics framework facilitated
cross-platform data sharing and reporting, empowering decision-makers with a consolidated and comprehensive
view of business performance.
 Collaborated with business stakeholders to understand their reporting requirements and translated them into
interactive visualizations, utilizing Tableau's features such as filters, parameters, and calculated fields to enable
dynamic data exploration and analysis. Integrated various data sources, including claims data, member
demographics, and provider information, ensuring data accuracy and consistency. Implemented performance
optimization techniques, such as data extracts and data source filters, to enhance dashboard loading and query
response times.
 Consolidated data from various systems, including policy management, billing, and customer relationship management; implemented robust ETL processes and optimized data modeling techniques, improving data accessibility and analytics capabilities for claims analysis and risk assessment.
 Tracked the progress of user stories and tasks in JIRA on a Kanban board for ongoing projects.
 Built a process from the ground up to pull data from third-party websites into a MySQL database and AWS S3 using Informatica IICS.
 Conducted testing and implementation of API integrations across multiple processes using Altair and Postman.
 Planned and deployed the entire process chain for a marketing data cluster business requirement and automated it in Enterprise BPA and Control-M environments.
 Handled the migration of legacy commercial data from On-prem servers to AWS while transforming the data to
serve the business users on Salesforce CRM.
 Implemented robust data quality assurance measures, including automated data validation rules and data profiling
techniques, for a large-scale insurance claims processing system. Achieved a 20% reduction in data inconsistencies
and improved data accuracy, leading to more accurate claims assessments and fraud detection.
 Performed unit testing at various levels of the ETL code and stored procedures, and actively participated in team code reviews.
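The rule-based deduplication of member demographic records described above can be sketched roughly as follows. This is a minimal Python stand-in for the Informatica logic; the field names, normalization rules, and merge policy are illustrative assumptions, not the actual production rules:

```python
def normalize(record):
    """Build a match key from standardized fields (hypothetical schema)."""
    return (
        record["first_name"].strip().lower(),
        record["last_name"].strip().lower(),
        record["dob"],  # assumed already in ISO "YYYY-MM-DD" form
    )

def dedupe_members(records):
    """Merge records sharing a normalized (first, last, dob) key.

    The first occurrence wins; later duplicates only fill in fields the
    surviving record is missing or has left empty -- a simple stand-in
    for the rule-based merge logic described above.
    """
    merged = {}
    for rec in records:
        key = normalize(rec)
        if key not in merged:
            merged[key] = dict(rec)
        else:
            survivor = merged[key]
            for field, value in rec.items():
                if not survivor.get(field) and value:
                    survivor[field] = value
    return list(merged.values())
```

In practice the production rules also handled fuzzy matches (nicknames, transposed dates), which a key-based approach like this cannot catch on its own.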

TCS/SAP, Bangalore, India | Aug’2015 – Jul’2016 | Assistant Systems Engineer

Responsibilities:
 Headed the Quality Assurance team responsible for identifying, analyzing, and documenting solutions to technical issues across systems, based on various client requirements.
 Designed a framework to implement the constructed protocols, thereby significantly improving quality and
efficiency by up to 45%.
 Built the source control repository that integrates and controls the version updates for over 100 brownfield systems.
 Handled the distribution and migration of terabytes of data on both local and global client servers.
 Performed Unit Tests, Peer reviews, Integration Tests and System Tests and documented the results.
 Prepared detail-oriented functional specification documents for medium and large enhancement projects.
 Worked on production migration activities, supporting pre- and post-production.
 Involved in building the process for inbound data feed for transactional as well as configuration data.

Mylan Pharma, Vizag, India | May’2014 – Jul’2015 | Junior Developer

Responsibilities:
 Understood all business requirements gathered from the analyst and team lead.
 Reconstructed the parameters and tasks in the automation tool responsible for the monitoring, processing, and
execution of over 100 batches.
 Worked on scheduling of jobs through UNIX shell scripting.

EDUCATION:

MASTER OF SCIENCE, Electrical Engineering


University of Colorado | Denver, CO, USA | May 2018

BACHELOR OF TECHNOLOGY, Electronics & Instrumentation Engineering


Gandhi Institute of Technology and Management | Visakhapatnam, AP, India | April 2015
