
Vibhore Goel

Contact: 9650590468
Email: [email protected]
Address: B-817 Sector 49, Sainik Colony, Faridabad (Haryana) - 121001

Professional Summary
 18+ years of IT experience developing and maintaining enterprise software applications across industries such as healthcare, real estate mortgage risk analysis, data analytics, and ERP.
 5+ years as a Java architect, responsible for communicating with stakeholders, analyzing problem areas and providing solutions, recommending and finalizing the technology stack, setting up the team structure, designing the architecture, and developing key application components.
 Experience with the microservices stack: configuration management, service discovery, fault tolerance, centralized logging, centralized metrics, distributed tracing, scheduling, deployment, and auto scaling.
 Experience with cloud technologies such as GCP, AWS, Kubernetes, and Docker, including microservices implementation and deployment.
 Good understanding of microservices, REST APIs, JPA, Hibernate, Lombok, Bootstrap, Spring Boot, and Core Java.
 Experience with relational database management systems (RDBMS) such as SQL Server, MySQL, and PostgreSQL, using queries, stored procedures, functions, and views.
 Messaging tools such as Apache Kafka.
 Development tools such as Eclipse, IntelliJ, MySQL Workbench, SQL Developer, and Postman.
 Project management tools such as Maven.
 Proficient in integrating CI/CD tooling such as Jenkins.
 Worked with version control systems such as SVN, Git, and GitHub, and with Jira for bug tracking.
 Sysadmin experience on UNIX/Linux platforms; developed shell scripts and configured CRON jobs.
 Skilled in the Hadoop ecosystem, Spark, ETL, and DWH, with experience in Hive SQL, HDFS, Sqoop, YARN, AWS Glue, Athena, RDS, Step Functions, and Terraform scripts.
 Strong understanding of data warehouse concepts such as star schema, snowflake schema, and dimension and fact table schemas.
 Experienced in leading teams and in sprint planning in Jira.
 Experienced in working in the Agile model.
 Led the development of custom software and business intelligence solutions that streamline client operations and improve the bottom line.
 Finalist in the Opera Open Analytical Challenge.
 2nd prize in the Zennovate Analytics Innovation Challenge.
 Received multiple Cool Performance awards in the organizations served.

Technical Skill Set


 Back End Technologies: Java 8, J2EE, JDBC, JPA, JMS, Spring Boot, Spring Security, Unix bash scripting, Python
 Messaging: Apache Kafka, ActiveMQ
 Cloud Services: AWS, GCP, Azure
 Databases: Hive, HBASE, MySQL, MS-SQL, SQLite, H2
 Versioning: GIT, SVN
 DevOps: Jenkins, Maven, Docker, Cluster maintenance, release
deployment, server maintenance, software installation & configurations,
Kubernetes orchestration setup
 Big Data technologies: Hadoop Eco-system, Spark, ETL, Data Analysis,
DWH and experience working in Hive SQL, HDFS, Sqoop, Yarn, AWS GLUE
 Coding technologies: Java, Scala
 Scripting Language: Linux Shell Scripting, YAML, Python
 OS, UI & Others: Linux, Windows, HTML, JavaScript, React JS

Professional Experience
 Working as Technical Architect for Zenon Analytics, Noida, from June 2021 to present.
 Worked as Sr. Lead Software Engineer for ElectrifAi (formerly Opera Solutions), Noida, from Feb 2010 to June 2021.
 Worked as Sr. Software Product Engineer for Ampere Software, Noida, from Feb 2006 to Jan 2010.
 Worked as Software Engineer for GSPL, New Delhi, from June 2003 to Jan 2006.

Qualification
Bachelor of Computer Application (BCA) from Maharshi Dayanand University, Rohtak
Advanced Computer Diploma from Indira Gandhi Open University
Post Graduate Diploma in Computer Application (PGDCA) from Indira Gandhi Open University
Master of Computer Application (MCA) from Indira Gandhi Open University

PROJECT PROFILE

Project Title: Contracts AI – Data Governance Solution

Client: VISA
Company: Zenon Analytics
Description & Project Scope: Data access, sharing, and usage terms are embedded in tens of thousands of contracts across VISA, requiring thousands of Legal/Privacy hours to review, which is not scalable. The solution provides data governance and applies data policies and rules to downstream physical databases.
 Extraction of contract metadata, policies, and clauses from Docx/PDF affiliate contracts.
 Creation of a contract metadata repository in a MySQL database.
 Push of contract data policies and rules into the Collibra Stewardship & Governance tool, using Collibra APIs, for legal team review.
 Push of policies and rules into Apache Ranger, which restricts data access on downstream physical databases.
 Processing of multiple affiliate contracts in a single automated batch.
 Systematic handling of affiliate data access requests from data users.
 Removal of manual, repetitive, and error-prone human intervention.
Role: Software Architect / Lead / Microservices Developer
Software: Java, Spring Boot, Python, Apache Ranger, MySQL, Maven, IntelliJ, Collibra Data Governance & Stewardship tool
Responsibilities:
 Designed the overall technical architecture of the application, including communication across multiple layers with VISA internal systems and external services.
 Worked on the overall security assessment of the application per VISA standards.
 Designed the data model and synced contract policy records with Collibra platform object IDs.
 Developed the integration-layer REST services as a Spring Boot application to read data from the data model and push the data policies and rules into the Collibra Stewardship platform using Collibra APIs.
 Using Python and the Apache Ranger SDK, pushed the data policies and rules into Apache Ranger, which acts as a wrapper for Hadoop ecosystem data restrictions at the user or group level.
 Led a team of 5 developers, responsible for task allocation, mentoring, best code practices, issue resolution, pair programming, and architecture design.
 Used Git as the code repository, Maven as the build tool, and Bitbucket for CI/CD.
 Implemented the Dev/QA/Production pipeline to release the application.

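As a minimal, hedged sketch of the Ranger integration step described above: the helper below assembles a Hive access policy payload and posts it to Ranger's public REST API. The Ranger host, credentials, service name ("hive_prod"), and all field values are illustrative assumptions, not the project's actual configuration.

```python
# Hypothetical sketch of pushing a contract-derived data policy into Apache
# Ranger via its public REST API (POST /service/public/v2/api/policy).
# RANGER_URL, the service name, and credentials are placeholders.
import json

RANGER_URL = "http://ranger-host:6080"  # placeholder Ranger Admin endpoint

def build_hive_policy(service, database, table, users, accesses=("select",)):
    """Assemble a Ranger policy dict restricting a Hive table to given users."""
    return {
        "service": service,
        "name": f"contract-policy-{database}-{table}",
        "resources": {
            "database": {"values": [database]},
            "table": {"values": [table]},
            "column": {"values": ["*"]},
        },
        "policyItems": [
            {
                "users": list(users),
                "accesses": [{"type": a, "isAllowed": True} for a in accesses],
            }
        ],
    }

def push_policy(policy, auth=("admin", "secret")):
    """POST the policy to the Ranger Admin v2 public endpoint."""
    import requests  # third-party HTTP client, assumed available
    resp = requests.post(
        f"{RANGER_URL}/service/public/v2/api/policy",
        data=json.dumps(policy),
        headers={"Content-Type": "application/json"},
        auth=auth,
    )
    resp.raise_for_status()
    return resp.json()
```

In practice such a payload would be generated per extracted contract clause, so one batch run can create many user- or group-level restrictions at once.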
Project Title: AWS Glue ETL Pipeline Using Spark Python

Client: Augeo
Company: Zenon Analytics
Description & Project Scope: Day-to-day transaction management system for employee activities, covering sessions and events data generated from web activity and data for comments, posts, and reactions.
 Using the client raw data, created a staging layer in Athena DB.
 Processed the data using AWS Glue Python and Spark scripts and created the data partitions.
 Stored the data in Athena DB as processed layers.
 Using the AWS RDS service, pushed the processed data into a MySQL DB for BI reporting.
 Automated the process using Step Functions and Terraform scripts to load the full and incremental data.

Role: Data Architect

Software: S3 buckets for the data lake/staging/processed layers, AWS Lambda, Python, Spark, Athena DB, RDS, SNS, Step Functions, Terraform scripts for deployment

Responsibilities:
 Designed the overall technical architecture of the data processing pipeline, divided into multiple layers: data sources, data ingestion, data storage, data processing, and data consumption.
 Implemented an AWS Lambda function for quality checks of raw data.
 Created parameterized AWS Glue PySpark jobs to process the data.
 Implemented CloudWatch logs to monitor the daily incremental load activities.
 Created Step Functions and Terraform scripts to automate the infrastructure and daily load execution.
 Led a team of 5, responsible for creating the Python scripts, Step Functions, and deployment scripts, and for validating the data in the QA environment before production signoff.
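The partitioning step such a Glue job performs can be sketched in plain Python so it runs standalone (the event field name "event_ts" and the ISO-8601 timestamp format are assumptions, not the client's actual schema):

```python
# Illustrative sketch of the partitioning logic a Glue PySpark job applies
# before a partitioned write: derive year/month/day keys from each raw event
# timestamp and bucket records under Hive-style partition paths.
from datetime import datetime

def partition_key(event):
    """Derive year/month/day partition values from a raw event record."""
    ts = datetime.fromisoformat(event["event_ts"])
    return {"year": ts.year, "month": ts.month, "day": ts.day}

def group_by_partition(events):
    """Bucket events by partition path, mimicking a partitioned S3 write."""
    buckets = {}
    for e in events:
        k = partition_key(e)
        path = f"processed/year={k['year']}/month={k['month']:02d}/day={k['day']:02d}"
        buckets.setdefault(path, []).append(e)
    return buckets
```

In the real job the same keys would be passed to Spark's partitioned writer, so Athena can prune partitions when querying the processed layer.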

Project Title: Docnimo (Document Management System – React-based UI)

Client: CISCO
Company: Zenon Analytics
Description & Project Scope: CISCO has multiple vendors for documentation, which can include quarterly audit reports, financial reports, and billing reports. Initially, all documentation was exchanged over email, making it difficult to maintain an audit trail.

Developed a React-based UI application so that CISCO and their vendors work in a central UI:
1. Open a new SOP and assign a team member to start work on the report.
2. The team member works on the report and updates the status daily.
3. The team lead reviews the document.
4. The report is completed.
5. Two types of approval: one from the vendor side and another from CISCO.
6. After approval, the SOP is closed.

Role: Software Architect

Software: Microservices, Spring Boot, Java, React, Docker, Kubernetes, MySQL, AWS services, Kafka
Responsibilities:
 Designed the overall technical architecture of the application.
 Created the dashboard to showcase the overall progress of the reports.
 Generated the audit reports.
 Developed the microservices.
 Containerization and orchestration for modern application deployment.
 Worked in agile development environments, including sprint planning.

Project Title: Signal Hub (Custom ML/AI tool)

Company: Opera Solutions (ElectrifAi)

Description & Project Scope: Provides a high-scale big data analytics solution over the Hadoop ecosystem and Spark, along with ML-based descriptive and predictive signals from data. The system automatically generates basic data statistics and an EDD to validate data quality.
Signal Hub is a low-code/no-code tool for creating ML-based data pipelines. The tool has 3 modules:
 Big data solution pipeline monitoring (STUDIO M)
 Workbench IDE for solution pipeline development (STUDIO W)
 Final data results as a knowledge hub (STUDIO K)
Role: Sr. Lead Developer

Software: HDFS, YARN, Java, Spark, Hive SQL, ML modeling, YAML for scripting

Responsibilities: In this tool, solutions are created from an SVN/Git repository and assigned to multiple developers for data pipeline development.
 Implemented the full code-commit life cycle for Git/SVN-based solutions.
 Implemented multiple Spark-based transformations, such as reading collections, reading views, auto-discovering schemas, deduplicating data, joining data, and tagging data.
 Trained and scored the data using multiple ML models.
 Created descriptive and predictive signals.
 Created job logs to validate errors and warnings.
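The deduplication transformation mentioned above can be sketched in plain Python rather than Spark, so it is self-contained; the business-key fields ("account_id", "event_date") are assumptions for illustration:

```python
# Plain-Python sketch of a "de-dup the data" transformation: keep the first
# record seen for each business key, dropping later duplicates. In Spark this
# would be dropDuplicates() over the same key columns.
def dedup(records, key_fields=("account_id", "event_date")):
    """Drop records whose key fields duplicate an earlier record."""
    seen, out = set(), []
    for rec in records:
        key = tuple(rec[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            out.append(rec)
    return out
```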

Project Title: Contract AI (GCP cloud-based client contract document processing system)

Company: Opera Solutions (ElectrifAi)

Description & Project Scope: GCP cloud-based project to extract the contents from contracts using OCR/NLP. Implemented a Kafka connector to push the data into the DB.

Role: Sr. Lead Engineer, Delivery Operations

Software: Java, microservices, Kafka, Google Cloud Platform, Docker, Kubernetes, GitLab, Jenkins X, Spring Boot

Responsibilities: Major development responsibility was to set up the cloud infrastructure of the product.
 GCP Cloud & Container Management
• Creation of custom Docker images.
• Storing Docker images in GCP Container Registry.
• Creating containers using Cloud Build.
• Inspecting the cluster and Pods.
• Viewing a Pod's console output.
• Signing into a Pod interactively.
• Creating and using Deployments.
• Creating and running Jobs and CronJobs.
• Scaling clusters manually and automatically.
• Configuring Node and Pod affinity.
• Using Secrets to isolate security credentials.
• Using ConfigMaps to isolate configuration artifacts.
• Pushing out and rolling back updates to Secrets and ConfigMaps.
• Configuring persistent storage volumes for Kubernetes Pods.
• Using StatefulSets to ensure that claims on persistent storage volumes persist across restarts.
 Network & Security
o Defined IAM roles and policies for Kubernetes Engine cluster administration.
o Located and inspected Kubernetes logs.
o Enabled applications running in GKE to access GCP storage services.
o Created Services to expose applications running within Pods.
o Used load balancers to expose Services to external clients.
o Leveraged container-native load balancing to improve Pod load balancing.
o Understood the structure of GCP IAM.
 Wrote the scripts for deploying the microservices in GCP.

Project Title: Cloud Model Publishing in AWS Marketplace

Company: ElectrifAi

Description & Project Scope: Dockerized microservices of the ML models, published in AWS Marketplace, with Swagger-based model APIs exposed in GCP.

Role: Sr. Lead Engineer

Software: Java, microservices, Kafka, Google Cloud Platform, AWS, Docker, Kubernetes, GitLab, Python, Swagger, Spark

Responsibilities: Created 20+ ML model images for different use cases and published them in AWS Marketplace; the same models' REST API endpoints were exposed in GCP.

Project Title: Mobiuss CRE (Commercial Real Estate Analysis) / RMBS (Residential Mortgage-Backed Security)

Company: Opera Solutions

Description & Project Scope: Created a US-based commercial and residential mortgage analysis and forecasting tool based on mortgage market evaluation parameters. Validated the backing security of the loan mortgages.

Role: Lead Engineer, Delivery Operations

Software: Adobe Flex Builder 4.6, SQL Server 2008, Java, Eclipse, Tomcat Server 6, JIRA, SVN, Hudson

Responsibilities:
 Prepared the software requirement specification per business needs.
 UI design, coding (Java), managing the database, testing, implementation, and deployment of the application.
 ETL implementation.
 Deployed the end-to-end solution from staging to production environments.

Project Title: MMA (Money Merge Account Analysis System)

Company: Ampere Software

Client: United First Financial, USA (http://www.moneymergeaccount.com/)

Role: Sr. Software Engineer

Software: Java, MySQL, Flex UI, Jenkins, Maven, SVN

Description: A web-based financial application to maintain a money merge account system for agents/clients in the US and Canada. The software maintains all money information for clients as well as agents. An agent can have multiple clients and has the right to view all information regarding them.
1. Applicant Information: general information about the agent/client and their dependent co-applicant.
2. Property: the multiple properties of the agent/client.
3. Mortgage: information on the multiple mortgage notes of the application relative to the agent/client.
4. Income: all the income information of the agent/client.
5. Creditor: all the credit information of the application.
6. Bank: the different bank account information of the agent/client.
7. Analysis: generates the MMA (Money Merge Account) analysis report based on the properties, mortgage notes, incomes, creditors, and bank information of the agent/client.
Responsibilities: Studied the functional requirement specifications, conceptualized the design, developed the modules based on those specifications, and reviewed and tested the modules.
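As a rough illustration of the kind of payoff analysis an MMA-style report performs, the sketch below is a standard amortization calculation estimating how many months a fixed extra payment shaves off a mortgage. The figures and the simple monthly-compounding model are assumptions, not the product's actual method:

```python
# Illustrative amortization sketch: count payoff months for a fixed-payment
# loan, then compare against the same loan with an extra amount added to each
# payment. All numbers and the model itself are assumptions.
def months_to_payoff(principal, annual_rate, payment):
    """Count months until the balance reaches zero with fixed payments."""
    monthly_rate = annual_rate / 12.0
    balance, months = principal, 0
    while balance > 0:
        interest = balance * monthly_rate
        if payment <= interest:
            raise ValueError("payment does not cover interest; loan never amortizes")
        balance += interest - payment
        months += 1
    return months

def months_saved(principal, annual_rate, payment, extra):
    """Months saved by adding a fixed extra amount to every payment."""
    return (months_to_payoff(principal, annual_rate, payment)
            - months_to_payoff(principal, annual_rate, payment + extra))
```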

Project Title: PTOS (Physical Therapist health care financial support software)

Company: Ampere Software

Client: PTOS Software (http://www.rehabpub.com/buyers-guide/listing/ptos-software/)

Role: Sr. Software Engineer

Software: ASP.NET with C#, MS-SQL, TFS, VS 2005

Description: A web-based application, among the most widely used software in the rehab market and supported by the largest support team in the rehab industry. The application is a prospective payment system. Modules in this application:
1. Patient Billing
2. Documentation
3. Electronic Medical Records (EMR)
4. Insurance Claim Processing
5. Letters of Physicians
6. Extensive Management Reporting
Responsibilities: Studied the functional requirement specifications, conceptualized the design, developed the modules based on those specifications, and reviewed and tested the modules.

The initial PTOS 2.0 version of the software was developed in FoxPro 6.0. I handled the migration of data from the PTOS FoxPro-based application to the .NET application with its SQL 2005 database, covering migration from VFP to MS-SQL 2005 and vice versa. With this facility, users can migrate all master data from FoxPro to SQL 2005. The utility also allows working with both applications simultaneously: if a user enters a transaction in the FoxPro application, it is shown in the .NET application, and vice versa.

Project Title: ANSI-837 EOB Reader (decrypts encrypted electronic US insurance payment claim files and imports them into the PTOS software)

Company: Ampere Software

Client: PTOS Software (http://www.rehabpub.com/buyers-guide/listing/ptos-software/)

Role: Sr. Software Engineer

Software: ASP.NET with C#, MS-SQL, TFS, VS 2005

Description: This application imports insurance payments from encrypted electronic ANSI-837 claim files into the software, avoiding manually punching the payment transactions in. When the user imports the encrypted file, the application shows all the decrypted data in a grid (readable mode). The grid contains all information related to the insurance claim: name of the claimed person, date of service when the claim applied, claim date, charge amount, amount written off by the insurance company, patient-paid amount, insurance-paid amount, adjustments, whether the claim was paid by primary or secondary insurance, CPT code, and procedure code.

Responsibilities: Studied the ANSI-837 US insurance claim code system, then decrypted the received electronic insurance payment claim files.

(Vibhore Goel)
