
Anil Kondla

A data enthusiast with more than 6 years of professional experience in the design, analysis, implementation and development of decision-driven Business Intelligence solutions. Strong business acumen with a proven ability to build intuitive, actionable dashboards that drive business decisions. Self-motivated to achieve results, maintain high quality standards and meet aggressive business objectives. I believe in data-driven decisions, turning data into actionable insights.

[email protected] 9700552359 Hyderabad, Telangana, India linkedin.com/in/anilkumarkondla

WORK EXPERIENCE

Senior Software Engineer


OpenText
03/2022 - Present (Hyderabad)
Experience Summary:
● Driving the data strategy for a next-generation, multi-tenant, serverless product called Predictive Case Strategy™, built around proprietary advanced analytics, machine learning and automation.
● Wrote complex functions and views for API consumption; also worked on text mining to extract useful information from legal complaint documents using NLP.
● Worked closely with a group of Cloud Data Engineers to identify gaps in the data requirements and propose efficient solutions to transform the data according to the product requirements.

Senior Systems Analyst


Legato Health Technologies (An Anthem Company)
09/2020 - 02/2022 (Hyderabad)
Experience Summary:
● Worked with stakeholders to gather requirements and track the development progress of various analytics products.
● Also worked as an individual contributor, developing highly intuitive, interactive and robust dashboards using Tableau.
● Wrote complex SQL queries to drive the Tableau reports.
● Coordinated work intake, routing requests to the concerned team after analyzing the systems impacted.

Software Engineer
Legato Health Technologies (An Anthem Company)
02/2019 - 08/2020 (Hyderabad)
Experience Summary:
● Worked on Enterprise Analytics and Insights, featuring in the development of a reporting system that surfaces anomalies prior to payment and assists claim adjudication, yielding Total Program Savings on the requested claims.
● The application, PING (Program Integrity Next Generation), is robust and determines which claims are to be finalized, pended or corrected based on various algorithms and filters.
● It provides an efficient way to generate savings on the requested claim amount; the efficiency of each algorithm is measured by the savings produced and the number of hits it records.
● Also worked on GRRPS (Guarantee Reconciliation, Reporting and Payment System), which builds modern analytic products using Tableau and MongoDB.
● Worked on pilot development in Pathfinder, part of the Medicare Revenue Data Mart, to demonstrate the team's competency while setting up an offshore team; created a road map and built a strategy for enhancing its capabilities.

Associate Consultant
Teklink International Inc
07/2016 - 01/2019 (Hyderabad)
Experience Summary:
● Worked on data analysis, transformation, processing and integration across different data platforms, and on data migration to heterogeneous platforms, both SAP and non-SAP.
● Developed complex stored procedures, functions and views.
● Worked on various case studies on SAP HANA, with expertise in modeling information models including attribute views, analytical views and calculation views. Created calculation views to address complex business requirements, and built advanced models using hierarchies, calculated and restricted columns, variables, input parameters and currency conversions.
● Extensively worked with data warehousing concepts such as star schema and snowflake schema.
● Worked on database query optimization; created stored procedures to Extract, Transform, and Load (ETL) data into reportable data structures and views for end-user reporting.
● Also handled complex reporting scenarios with great accuracy and data integrity through thorough analysis and testing.
● Worked on performance optimization of views and stored procedures.

SKILLS

Data Integration/Data Migration: SAP Data Services, Informatica, Apache NiFi, PySpark
Databases: MySQL, Postgres, MongoDB
Data warehouses: SAP HANA, MSSQL Server, Oracle, Teradata, Snowflake, Redshift, MemSQL, Hive
Programming and Scripting: SQL, Core Java, Python, JavaScript, Node.js
Data Visualization and Reporting tools: Matplotlib, Seaborn, Tableau, Power BI, SAP BO, Looker, Amazon QuickSight
Integrated Development Environments: VS Code, Spyder, Jupyter Notebook, Zeppelin Notebook, Toad, MySQL Workbench, SQL Developer, SQL Assistant, DBeaver, pgAdmin,
SQL Server Management Studio
Amazon Web Services: IAM, Cognito, EC2, S3, Lambda, ECS, EMR, CloudWatch, CloudFront, RDS, Glue, Redshift, SageMaker
DevOps: Git, GitHub, GitLab, Docker
Office tools: Microsoft Office, LibreOffice, OpenOffice
ITSM, Issue tracking and Documentation: ServiceNow, Jira, Confluence

CERTIFICATES

● MemSQL 6 Certified Developer.


● Microsoft Certified Developer.
ACHIEVEMENTS

● Outstanding Newcomer 2016 at TekLink.


● Pat on the back for going above and beyond at Legato.
● Spot Award for Appreciation and Recognition for distinguished performance, dedication and hard work at Legato.

PROJECTS

Predictive Case Strategy – Opentext (03/2022 - Present)


Description:
OpenText™ Predictive Case Strategy™ is a flexible and powerful end-to-end investigations platform built around proprietary advanced analytics, machine learning and automation. PCS delivers best-in-class investigative capabilities in a fully integrated, intuitive review interface that helps legal teams get to the facts that matter sooner and inform case strategy.
Role:
Database Developer/Data Engineer.
Responsibilities:
● Driving the data strategy and working on data-driven initiatives in product development.
● Wrote complex functions and views for API consumption; also worked on text mining to extract useful information from legal complaint documents using NLP.
● Worked closely with a group of Cloud Data Engineers to identify gaps in the data requirements and proposed efficient solutions to transform the data according to the product requirements.
Technologies:
● Angular, AWS Lambda, Python, AWS RDS (Postgres), AWS Fargate, SOS, CloudWatch, NLP, Machine Learning, Git, Jira, Confluence.

Pathfinder - Anthem (09/2021 - 02/2022)


Description:
Pathfinder is a Tableau-based reporting tool that displays Medicaid and Medicare Risk Adjustment data to support Anthem's government lines of business. Its purpose is to provide a self-service reporting portal enabling users to select a population of interest (region, contract, provider) and view statistics and details about the risk status of that population or member.
Role:
Tableau Developer.
Responsibilities:
● Built new reports in Tableau, besides working on enhancements to existing reports.
● Wrote initial and custom SQL for Tableau reporting needs.
● Coordinated with onshore counterparts to understand requirements and carried out development based on their inputs.
Technologies:
● Tableau, Teradata, Jira, Confluence.

Provider Data Format - Anthem (09/2020 - 08/2021)


Description:
Provider Data Format creates a Tableau reporting tool based on the reconciliation and profiling of the provider data to be submitted to the BCBSA.
Role:
Tableau Developer, Business analyst.
Responsibilities:
● Worked with stakeholders and Product owners in gathering the requirements for the Tableau reports.
● Developed a Tableau tracking tool to track the variances of the Providers at different levels for different markets.
● Worked on writing complex SQL for the Tableau Reports.
● Worked on preparing the High-Level Technical Design Document for the Tableau Reporting tool.
Technologies:
● Tableau, Oracle, Jira, Confluence.

Guarantee Reconciliation Reporting & Payment system - IngenioRx (06/2019 - 08/2020)


Description:
The Guarantee Reconciliation, Reporting and Payment System project builds modern analytic products using Tableau and MongoDB to enhance internal reporting capabilities: serving client-level P&L with comparison to quote estimates, booking accruals and actuals to GL, interfacing with the billing system to pay clients, and supporting ad hoc reporting functionality.
Role:
Tableau Developer.
Responsibilities:
● Designed and developed interactive, intuitive reports and dashboards that empower decision makers.
● Communicated the results of analysis to product and leadership teams to influence the overall product strategy.
● Wrote custom SQL based on the business requirements.
● Optimized the performance of the reports.
Technologies:
● Tableau, MongoDB, MongoSQL, Confluence, Jira.

Program Integrity Next Generation (CSBD) - Anthem (03/2019 - 05/2019)


Description:
Worked on Enterprise Analytics and Insights, featuring in the development of an application that surfaces anomalies prior to payment, yielding Total Program Savings on the requested claims. The application, PING (Program Integrity Next Generation), is robust and determines which claims are to be finalized, pended or corrected based on various rules. It provides an efficient way to generate savings on the requested claim amount; the efficiency of each algorithm is measured by the savings produced and the number of hits it records.
Role:
Tableau Developer.
Responsibilities:
● Worked on Analytical Queries based on the Business Requirement for Reporting in Tableau.
● Built Interactive, Intuitive and Robust Dashboards using Tableau based on the Requirements.
● Worked with stakeholders to gather business requirements, analyzed problem statements, addressed them with acuity, diligence and analytical skill, and created visualizations in Tableau.
Technologies:
● Hive, Spark, Flume, Tableau, Splunk, Confluence, Jira.

Global Keystone - Kellogg Company (01/2017 - 01/2019)

Description:
The Project was built for Global Analytics for Kellogg Company by leveraging the insights using SAP Data Services, Microsoft SQL Server, MemSQL and Tableau.
Role:
Data Engineer.
Responsibilities:
● Provided extensive support by analyzing problems and implementing solutions to overcome them.
● Worked on database query optimization, created Stored Procedures to Extract, Transform, and Load (ETL) data to reportable data structures and Views for End User Reporting.
● Worked on performance optimization of the Jobs.
● Worked on a proof of concept: a Tableau dashboard reporting real-time insights into the daily data loads of various subject areas.
● Translated complex business requirements into reportable format by writing complex SQL views.
Technologies:
● Tableau, MSSQL Server, MemSQL, SAP Data Services, ServiceNow.

Leveraging In-Memory - Kellogg Company (06/2017 - 12/2017)


Description:
Starting with an initiative to speed access to customer logistics data, Kellogg turned to MemSQL, reducing its 24-hour ETL process to an average of 43 minutes. By integrating Tableau with MemSQL, Kellogg was able to run visualizations directly on top of MemSQL rather than running an extraction, realizing a 20x improvement in analytics performance. This allows hundreds of business users to concurrently access up-to-date data and increases the profitability of customer logistics data.
Role:
MemSQL Developer.
Responsibilities:
● Extensively worked on data ingestion through SAP Data Services (ETL) from external sources and Apache Kafka pipelines.
● Analyzed and monitored the real-time performance and latency of the nodes.
● Wrote analytical queries based on the business requirements for reporting in Tableau.
● Collaborated with the team on database schema development and sharding strategies for fast data retrieval.
Technologies:
● MemSQL, Kafka, SAP Data Services.

KLA Waste Management - Kellogg Company (10/2016 - 12/2016)


Description:
This project is intended to reduce the adverse effects of all kinds of waste, whether generated during the extraction of raw materials or during their processing into intermediate and final products. Its primary objective is to produce the variance of wastage quantities, with their respective values, for specific plants in Kellogg's Latin America region.
Role:
SAP Data Services Developer.
Responsibilities:
● Coordinated with the onsite coordinator to understand the existing system and validated the requirements for development.
● Created ETL jobs to move data from SAP ECC source systems to a SQL Server database.
● Implemented full-load and delta-load mechanisms for historical and transactional data.
● Translated business requirements into reporting format using SQL views and built stored procedures for data transformation.
Technologies:
● SAP Data Services, MSSQL Server.

EDUCATION

B. Tech (CSE) - Jawaharlal Nehru Technological University, Hyderabad


08/2012 - 05/2016
Percentage: 72%

LANGUAGES

English: Native or Bilingual Proficiency
Hindi: Full Professional Proficiency
Telugu: Full Professional Proficiency

INTERESTS:

Photography Cinematography Speed Cubing Traveling

Cycling Table tennis Music Exercise
