Vibhore Goel
Contact: 9650590468
Email: [email protected]
Address: B-817 Sector 49, Sainik Colony, Faridabad (Haryana) - 121001
Professional Summary
18+ years of IT experience developing and maintaining enterprise software applications across industries including Healthcare, Real Estate Mortgage Risk Analysis, Data Analytics, and ERP.
5+ years as a Java architect, responsible for communicating with stakeholders, analyzing problem areas and providing solutions, recommending and finalizing the technology stack, setting up the team structure, designing the architecture, and developing key components of the applications.
Experience with the microservices stack (configuration management, service discovery, fault tolerance, centralized logging, centralized metrics, distributed tracing, scheduling, deployment, auto scaling).
Experience with cloud technologies such as GCP, AWS, Kubernetes, and Docker, including microservices implementation and deployment.
Good understanding of microservices, REST APIs, JPA, Hibernate, Lombok, Bootstrap, Spring Boot, and Core Java.
Relational database management systems (RDBMS) such as SQL Server, MySQL, and PostgreSQL, using queries, stored procedures, functions, and views.
Messaging tools like Apache Kafka.
Development tools like Eclipse, IntelliJ, MySQL Workbench, SQL Developer, and Postman.
Build tools like Maven.
Proficient in integrating CI/CD tooling such as Jenkins.
Worked with version control systems like SVN, Git, and GitHub, and with Jira for bug tracking.
Sysadmin experience on UNIX and Linux platforms; developed shell scripts and configured cron jobs.
Skilled in the Hadoop ecosystem, Spark, ETL, and DWH, with experience in Hive SQL, HDFS, Sqoop, YARN, AWS Glue, Athena, RDS, Step Functions, and Terraform scripts.
Strong understanding of data warehouse concepts such as star schema, snowflake schema, and dimension and fact table schemas.
Experienced in leading teams and in sprint planning in Jira.
Experienced in working in the Agile model.
Led the development of custom software and business intelligence solutions that streamline client business processes and improve the bottom line.
Finalist in Opera Open Analytical Challenge
2nd prize in Zennovate Analytics Innovation Challenge
Received multiple performance awards in the organizations served.
Qualifications
Bachelor of Computer Application (BCA) from Maharshi Dayanand University, Rohtak
Advanced Computer Diploma from Indira Gandhi Open University
Post Graduate Diploma in Computer Application (PGDCA) from Indira Gandhi Open University
Master of Computer Application (MCA) from Indira Gandhi Open University
PROJECT PROFILE
Client: VISA
Company: Zenon Analytics
Description & Project Scope:
Data access, sharing, and usage terms are embedded in tens of thousands of contracts across VISA, requiring thousands of Legal/Privacy hours to review, which is not scalable. The project provides a solution for data governance and applies the data policies and rules to downstream physical databases.
Extraction of contract metadata, policies, and clauses from DOCX/PDF affiliate contracts.
Creation of a contract metadata repository in a MySQL database.
For the Legal team's review, push of the contract data policies and rules into the Collibra Stewardship & Governance tool using Collibra APIs.
Push of the policies and rules into Apache Ranger, which restricts data access on downstream physical databases.
The solution can process multiple affiliate contracts in a single batch in an automated fashion.
Systematic handling of any affiliate data access request from data users.
Removal of manual, repetitive, and error-prone human intervention.
Role: Software Architect / Lead / Microservices Developer
Software: Java, Spring Boot, Python, Apache Ranger, MySQL, Maven, IntelliJ, Collibra Data Governance & Stewardship tool
Responsibilities:
Designed the overall technical architecture of the application; handled communication across multiple layers with VISA internal systems and VISA external services.
Worked on the overall security assessment of the application per the VISA standard.
Designed the data model and synced the contract policy records with the Collibra platform object IDs.
Developed the integration-layer REST services in a Spring Boot application to read data from the data model and push the data policies and rules into the Collibra Stewardship platform using Collibra APIs.
Using Python and the Apache Ranger SDK, pushed the data policies and rules into Apache Ranger, which acts as a wrapper for Hadoop ecosystem data restriction at the user or group level (see the sketch after this list).
Led a team of 5 developers, responsible for task allocation, mentoring, best code practices, issue resolution, pair programming, and architecture design.
Used Git as the code repository, Maven as the build tool, and Bitbucket for CI/CD.
Implemented the Dev/QA/Production pipeline to release the application.
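As an illustration of the Ranger integration above, here is a minimal Python sketch that pushes an access policy through Ranger's public v2 REST API. The host, credentials, service name, database, table, and group are placeholders, and the actual work used the Apache Ranger SDK rather than raw HTTP calls.

```python
import requests

RANGER_URL = "http://ranger-host:6080"  # placeholder Ranger admin host
AUTH = ("admin", "admin")               # placeholder credentials

# A Hive policy restricting a contract table to a named group;
# field names follow Ranger's public v2 policy model.
policy = {
    "service": "hive_service",          # placeholder Ranger service name
    "name": "contract_metadata_read",
    "isEnabled": True,
    "resources": {
        "database": {"values": ["contracts_db"]},
        "table": {"values": ["affiliate_contracts"]},
        "column": {"values": ["*"]},
    },
    "policyItems": [
        {
            "groups": ["legal_review"],
            "accesses": [{"type": "select", "isAllowed": True}],
        }
    ],
}

resp = requests.post(
    f"{RANGER_URL}/service/public/v2/api/policy",
    json=policy,
    auth=AUTH,
)
resp.raise_for_status()
print("Created policy id:", resp.json()["id"])
```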
Project Title: AWS Glue ETL Pipeline Using Spark (Python)
Client: Augeo
Company: Zenon Analytics
Description & Project Scope:
A day-to-day transaction management system for employee activities, with sessions and events data generated from web activity and data for comments, posts, and reactions.
Using the client's raw data, created a staging layer in Athena.
Processed the data using AWS Glue Python and Spark scripts and created the data partitions.
Stored the data in Athena as processed layers.
Using the AWS RDS service, pushed the processed data into a MySQL DB for BI reporting purposes.
Automated the process using Step Functions and Terraform scripts to load the full and incremental data.
Software: S3 buckets for the data lake/staging/processed layers, AWS Lambda, Python, Spark, Athena, RDS, SNS, Step Functions, Terraform scripts for deployment
Responsibilities:
Designed the overall technical architecture of the data processing pipeline, divided into multiple layers: data sources, data ingestion, data storage, data processing, and data consumption.
Implemented the AWS Lambda function for quality checks of the raw data.
Created parameterized AWS Glue PySpark jobs to process the data (a sketch follows this list).
Implemented CloudWatch logs to monitor the daily incremental load activities.
Created Step Functions and Terraform scripts to automate the infrastructure and the daily load execution.
Led a team of 5 resources, responsible for creating the Python scripts, Step Functions, and deployment scripts, and for validating the data in the QA environment before production signoff.
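A minimal sketch of a parameterized Glue PySpark job of the kind described above; the catalog database, job parameters, and S3 output path are placeholder assumptions, not the project's actual names.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Runtime job parameters (names are placeholders).
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_table", "load_date"])

glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw staging table registered in the Glue Data Catalog.
raw = glue_context.create_dynamic_frame.from_catalog(
    database="staging_db",                  # placeholder catalog database
    table_name=args["source_table"],
)

# Keep the day's incremental slice and write it partitioned by load date.
df = raw.toDF().filter(f"load_date = '{args['load_date']}'")
df.write.mode("append").partitionBy("load_date").parquet(
    "s3://example-bucket/processed/"        # placeholder output path
)

job.commit()
```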
Client: CISCO
Company: Zenon Analytics
Description & Project Scope:
CISCO has multiple vendors for documentation; the documentation can be quarterly audit reports, financial reports, or billing reports. Initially, all the documentation went over email, which made it difficult to maintain the audit trail. Developed a React-based UI application so that CISCO and their vendors work on a central UI:
1. Open a new SOP and assign a team member to start work on the report.
2. The team member starts work on the report and updates the status on a daily basis.
3. The team lead can review the document.
4. After the report is completed, two types of approval are required: one from the vendor side and another from CISCO.
5. After approval, close the SOP.
Software: Microservices, Spring Boot, Java, React, Docker, Kubernetes, MySQL, AWS services, Kafka
Responsibilities:
Designed the overall technical architecture of the application.
Created the dashboard to showcase the overall progress of the reports.
Generated the audit reports.
Developed the microservices (see the Kafka sketch after this list).
Containerization and orchestration for modern application deployment.
Worked in agile development environments with sprint planning.
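Kafka appears in this project's stack; as an illustration only, a status-update event producer might look like the following Python sketch. The broker address, topic name, and event fields are assumptions, not the project's actual design.

```python
import json

from confluent_kafka import Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})  # placeholder broker

def delivery_report(err, msg):
    # Confirms delivery of each message or surfaces the error.
    if err is not None:
        print(f"Delivery failed: {err}")
    else:
        print(f"Delivered to {msg.topic()} [{msg.partition()}]")

# Hypothetical SOP status-update event; field names are assumptions.
event = {"sop_id": "SOP-1042", "status": "IN_REVIEW", "updated_by": "team.member"}

producer.produce(
    "sop-status-updates",               # placeholder topic name
    key=event["sop_id"],
    value=json.dumps(event),
    callback=delivery_report,
)
producer.flush()
```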
Project Title: Signal Hub
Description & Project Scope:
It provides a high-scale big data analytics solution over the Hadoop ecosystem and Spark. It also provides ML-based descriptive and predictive signals from data. The system automatically generates basic data statistics and an EDD to validate the quality of the data.
Signal Hub is a low-code/no-code tool for creating ML-based data pipelines. The tool has 3 modules:
Big data solution pipeline monitoring (STUDIO M)
Workbench IDE for solution pipeline development (STUDIO W)
Final data results as a Knowledge Hub (STUDIO K)
Role: Sr. Lead Developer
Software: HDFS, YARN, Java, Spark, Hive SQL, ML modeling, YAML for scripting
Responsibilities:
In this tool, the creation of a solution is based on an SVN/Git repository and is assigned to multiple developers for data pipeline development.
Implemented the full code-commit life cycle for Git/SVN-based solutions.
Implemented multiple Spark-based transformations such as read collection, read view, auto-discovery of schema, de-duplicating the data, joining the data, and tagging the data (see the sketch after this list).
Trained and scored the data using multiple ML models.
Created descriptive and predictive signals.
Created job logs to validate the errors and warnings.
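A minimal PySpark sketch of the de-dup/join/tag style of transformation described above; the column names and sample rows are illustrative placeholders, not Signal Hub's actual pipeline steps.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transform-sketch").getOrCreate()

# Placeholder inputs standing in for two pipeline collections.
events = spark.createDataFrame(
    [(1, "login"), (1, "login"), (2, "purchase")],
    ["user_id", "event"],
)
users = spark.createDataFrame(
    [(1, "alice"), (2, "bob")],
    ["user_id", "name"],
)

deduped = events.dropDuplicates()             # de-dup the data
joined = deduped.join(users, on="user_id")    # join the data
tagged = joined.withColumn(                   # tag the data
    "is_purchase", F.col("event") == "purchase"
)

tagged.show()
```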
Project Title: Contract AI (GCP cloud-based client contract document processing system)
Company: ElectrifAi
Software: Java, microservices, Kafka, Google Cloud Platform, AWS, Docker, Kubernetes, GitLab, Python, Swagger, Spark
Responsibilities:
Created 20+ ML model images for different use cases and published them in the AWS Marketplace, and exposed REST API endpoints for the same models in GCP (a sketch follows).
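As an illustration of exposing a model behind a REST endpoint, here is a minimal Flask sketch; the model artifact, route, and input shape are placeholder assumptions, not the actual Contract AI service.

```python
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Placeholder: load a pre-trained model artifact baked into the image
# (assumed here to be a scikit-learn-style model with a predict method).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [[...], [...]]}.
    payload = request.get_json()
    preds = model.predict(payload["features"])
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)
```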
Project Title: Mobiuss CRE (Commercial Real Estate Analysis) / RMBS (Residential Mortgage-Backed Security)
Description & Project Scope:
Created a US-based commercial and residential mortgage analysis and forecasting tool based on mortgage market evaluation parameters. Validated the backing security of the loan mortgages.
Role: Lead Engineer, Delivery Operations
Description:
A web-based financial application to maintain a money merge account system for agents/clients in the US and Canada. The software maintains all the money information about clients as well as agents. An agent can have multiple clients, each of whom has the right to view all the information regarding himself. Modules:
1. Applicant Information: general information about the agent/client and their dependent co-applicants.
2. Property: the multiple properties of the agent/client.
3. Mortgage: the multiple mortgage notes of the application relative to the agent/client.
4. Income: all the income information of the agent/client.
5. Creditor: all the credit information of the application.
6. Bank: the different bank account information of the agent/client.
7. Analysis: generates the MMA (Money Merge Account) analysis report based on the properties, mortgage notes, incomes, creditors, and bank information of the agent/client.
Responsibilities:
Studied the functional requirement specifications, conceptualized the design, developed the modules based on the specifications, and reviewed and tested the modules.
Project Title: PTOS (physical therapy healthcare financial support software)
Description:
A web-based application that is among the most widely used software in the rehab market and is supported by the largest support team in the rehab industry. The application is a prospective payment system application. Modules in this application:
1. Patient Billing
2. Documentation
3. Electronic Medical Records (EMR)
4. Insurance Claim Processing
5. Letters to Physicians
6. Extensive Management Reporting
Responsibilities:
Studied the functional requirement specifications, conceptualized the design, developed the modules based on the specifications, and reviewed and tested the modules.
The PTOS 2.0 version of the software was initially developed in FoxPro 6.0 and was later rebuilt as a .NET application with a SQL Server 2005 database, requiring data migration from VFP to MS SQL 2005 and vice versa. I handled the migration of data from the FoxPro-based PTOS application to the .NET application (SQL Server 2005 database). With this facility, users can migrate all masters from FoxPro to SQL Server 2005. The utility also allows working with both applications simultaneously: if a user enters a transaction in the FoxPro application, it is shown in the .NET application, and vice versa.
Project Title: ANSI-837 EOB Reader (decodes encoded electronic US insurance payment claim files and imports them into the PTOS software)
Company: Ampere Software
Description:
The application imports insurance payments from encoded electronic ANSI-837 claim files into the software, avoiding manual entry of the payment transactions. When the user imports the encoded file, the application shows all the decoded data in a grid (readable mode). The grid contains all the information related to the insurance claim: name of the claimed person, date of service when the claim was applied, claimed date, charge amount, write-off amount by the insurance company, patient-paid amount, insurance-paid amount, adjustments, whether the claim was paid by primary or secondary insurance, CPT code, and procedure code.
Responsibilities:
For this tool, I studied the ANSI-837 US insurance claim code system and then decoded the received electronic insurance payment claim files (a parsing sketch follows).
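As an illustration of reading an X12-style file of the kind this tool handled, here is a minimal Python sketch that splits segments and elements using the standard X12 default delimiters; the sample data, segment choice (CLP/SVC from the X12 remittance family), and field positions are illustrative, not the tool's actual parsing rules.

```python
# Minimal X12-style parsing sketch: segments end with "~" and
# elements are separated by "*" (standard X12 defaults).
sample = "CLP*CLAIM001*1*150.00*120.00~SVC*HC:97110*150.00*120.00~"

segments = [s for s in sample.split("~") if s]
for seg in segments:
    elements = seg.split("*")
    seg_id, fields = elements[0], elements[1:]
    if seg_id == "CLP":
        # Claim-level payment info: claim id, status, charge, paid.
        print("Claim:", fields[0], "charged:", fields[2], "paid:", fields[3])
    elif seg_id == "SVC":
        # Service line: procedure (CPT) code, charge, paid.
        print("Service:", fields[0], "charged:", fields[1], "paid:", fields[2])
```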
(Vibhore Goel)