Solution Architecture - IRDA Business Analytics Project
2010
Solution Architecture Document - IRDA Business Analytics Project
Table of Contents
List of Abbreviations Used with Their Definition
11.5 Recovery Point and Time Objectives for the Business Analytics Solution
13. Sizing and Performance Considerations for IRDA Business Analytics Program
16.8 Physical and Analog Data Conversion tools and techniques
Appendix
B. Indicative List of Dimensions with their values and attributes
C. Data Sizing Estimate for the IRDA BAP Solution
G. Data Archiving Procedures and Guidelines for IRDA Business Analytics Solution
Abbreviations Description
ACL An access control list (ACL) is a list of permissions attached to an object. An ACL specifies which users, or system processes, are granted access to objects, as well as what operations are allowed to be performed on given objects.
ADS Active Directory Server (ADS) is a technology created by Microsoft that provides a
variety of network services
ANSI The American National Standards Institute (ANSI) is a private non-profit organization
that oversees the development of voluntary consensus standards for products,
services, processes, systems, and personnel in the United States
API An application programming interface (API) is an interface implemented by a software
program to enable interaction with other software
ATI Agent Training Institutes
B2B Business-to-Business
BCP Business continuity planning (BCP) is the creation and validation of a practiced
logistical plan for how an organization will recover and restore partially or completely
interrupted critical (urgent) functions within a predetermined time after a disaster or
extended disruption. The logistical plan is called a business continuity plan.
CBAC Context-based access control (CBAC) intelligently filters TCP and UDP packets based on
application layer protocol session information and can be used for intranets, extranets
and internets.
CDC/DC Centralized Data Center or Data Center is a facility used to house computer systems
and associated components, such as telecommunications and storage systems.
COM A COM (communication) port is a serial port interface on IBM PC-compatible computers running Microsoft Windows or MS-DOS
DD Deputy Director
DMZ The Demilitarized Zone (DMZ) is a critical part of a firewall: it is a network that is neither part of the untrusted network, nor part of the trusted network
DNS The Domain Name System (DNS) is a hierarchical naming system for computers,
services, or any resource connected to the Internet or a private network
DRC A Disaster Recovery Center (DRC) is a backup site: a location to which an organization can easily relocate following a disaster, such as fire, flood, terrorist threat or other disruptive event.
DRM Disaster Recovery Management (DRM) is the process, policies and procedures related
to preparing for recovery or continuation of technology infrastructure critical to an
organization after a natural or human-induced disaster.
DTLS The Datagram Transport Layer Security (DTLS) protocol provides communications
privacy for datagram protocols.
DW A data warehouse (DW) is a repository of an organization's electronically stored data.
Data warehouses are designed to facilitate reporting and analysis
EAI Enterprise Application Integration (EAI) is defined as the use of software and computer
systems architectural principles to integrate a set of enterprise computer applications.
ED Executive Director
ESB An enterprise service bus (ESB) consists of a software architecture construct which
provides fundamental services for complex architectures via an event-driven and
standards-based messaging-engine (the bus).
ETL Extract, transform, and load (ETL) is a process in database usage and especially in data
warehousing
F&A Finance and Accounts
GUI A graphical user interface (GUI) is a type of user interface that allows people to interact with programs through graphical elements rather than typing alone, on devices such as computers, hand-held devices etc.
HIDS A host-based intrusion detection system (HIDS) is an intrusion detection system that
monitors and analyses the internals of a computing system rather than the network
packets on its external interfaces
HRMS Human Resource Management System
HSRP Hot Standby Router Protocol (HSRP) is a Cisco proprietary redundancy protocol for
establishing a fault-tolerant default gateway
IDM Identity management (IDM) is a broad administrative area that deals with identifying
individuals in a system (such as a country, a network or an organization) and
controlling the access to the resources in that system by placing restrictions on the
established identities
JDBC Java Database Connectivity (JDBC) is an API for the Java programming language that
defines how a client may access a database. It provides methods for querying and
updating data in a database
LAN A local area network (LAN) is a computer network covering a small physical area, like a
home, office, or small group of buildings
LDAP The Lightweight Directory Access Protocol (LDAP) is an application protocol for
querying and modifying directory services running over TCP/IP
MAC Mandatory access control (MAC) refers to a type of access control by which the
operating system constrains the ability of a subject or initiator to access or generally
perform some sort of operation on an object or target
MDM Master Data Management (MDM) comprises a set of processes and tools that
consistently defines and manages the non-transactional data entities of an
organization
NIC Network Information Center (NIC) is the part of the Domain Name System (DNS) of the Internet that keeps the database of domain names and generates the zone files which convert domain names to IP addresses
ODBC Open Database Connectivity (ODBC) provides a standard software API method for
using database management systems (DBMS)
ODS An operational data store (ODS) is a database designed to integrate data from multiple
sources to make analysis and reporting easier
OLAP Online analytical processing (OLAP) is an approach to quickly answer multi-
dimensional analytical queries
RBAC Role-based access control (RBAC) is an approach to restricting system access to
authorized users.
RPC Remote procedure call (RPC) is an Inter-process communication technology that
allows a computer program to cause a subroutine or procedure to execute in another
address space (commonly on another computer on a shared network) without the
programmer explicitly coding the details for this remote interaction.
RPO Recovery Point Objective (RPO) is the point in time to which you must recover data as
defined by your organization. This is what an organization determines is an
"acceptable loss" in a disaster situation.
RTO Recovery Time Objective (RTO) is the duration of time and a service level within which
a business process must be restored after a disaster (or disruption) in order to avoid
unacceptable consequences associated with a break in business continuity.
SAML Security Assertion Markup Language (SAML) is an XML-based standard for exchanging
authentication and authorization data between security domains, that is, between an
identity provider (a producer of assertions) and a service provider (a consumer of
assertions)
SAN A storage area network (SAN) is an architecture to attach remote computer storage devices to servers in such a way that the devices appear as locally attached to the operating system
SIP The Session Initiation Protocol (SIP) is a signaling protocol, widely used for controlling
multimedia communication sessions such as voice and video calls over Internet
Protocol (IP)
SLA Service Level Agreement (SLA) is a part of a service contract where the level of service
is formally defined. SLA is sometimes used to refer to the contracted delivery time (of
the service) or performance.
SOAP SOAP, originally defined as Simple Object Access Protocol, is a protocol specification
for exchanging structured information in the implementation of Web Services in
computer networks.
SQL SQL (Structured Query Language) is a database computer language designed for
managing data in relational database management systems (RDBMS)
SSL Secure Sockets Layer (SSL) is a cryptographic protocol that provides security for
communications over networks such as the Internet
SSO Single sign-on (SSO) is a property of access control of multiple, related, but
independent software systems
TAT Turn Around Time
TCP The Transmission Control Protocol (TCP) is one of the core protocols of the Internet
Protocol Suite.
TLS Transport Layer Security (TLS) is a cryptographic protocol that provides security for
communications over networks such as the Internet
TPA Third Party Administrators
UDP The User Datagram Protocol (UDP) is one of the core members of the Internet
Protocol Suite, the set of network protocols used for the Internet.
UML Unified Modeling Language (UML) is a standardized general-purpose modeling
language in the field of software engineering.
VoIP Voice over Internet Protocol (VoIP) is a general term for a family of transmission
technologies for delivery of voice communications over IP networks such as the
Internet or other packet-switched networks.
VPN A virtual private network (VPN) encapsulates data transfers between two or
more networked devices not on the same private network so as to keep the
transferred data private from other devices on one or more intervening local or wide
area networks.
WAN A wide area network (WAN) is a computer network that covers a broad area (i.e., any network whose communications links cross metropolitan, regional, or national boundaries)
XML XML (Extensible Markup Language) is a set of rules for encoding documents
electronically
Terms Description
Application Server An application server is a software framework dedicated to the
efficient execution of procedures (programs, routines, scripts) for
supporting the construction of applications.
Audit logging Audit log is a chronological sequence of audit records, each of which
contains evidence directly pertaining to and resulting from the
execution of a business process or system function.
Conceptual data model A conceptual data model is a map of concepts and their relationships.
This describes the semantics of an organization and represents a series
of assertions about its nature.
Content Management Content management, or CM, is the set of processes and technologies
that support the collection, managing, and publishing of information in
any form or medium. In recent times this information is typically
referred to as content or, to be precise, digital content.
Context based access Context-based access control intelligently filters TCP and UDP packets
based on application layer protocol session information and can be
used for intranets, extranets and internets
Data Encryption Data encryption is the process of transforming information (referred to
as plaintext) using an algorithm (called cipher) to make it unreadable to
anyone except those possessing special knowledge, usually referred to
as a key.
Data Integrity Data integrity refers to data that has a complete or whole structure. All
characteristics of the data, including business rules, rules for how pieces
of data relate, dates, definitions and lineage, must be correct for data to
be considered complete.
Data Profiling Data profiling is the process of examining the data available in an
existing data source and collecting statistics and information about that
data.
Database Index A database index is a data structure that improves the speed of data
retrieval operations on a database table at the cost of slower writes
and increased storage space.
FTP File Transfer Protocol (FTP) is a standard network protocol used to copy
a file from one host to another over a TCP/IP-based network, such as
the Internet.
Knowledge Management Knowledge management (KM) comprises a range of strategies and
practices used in an organization to identify, create, represent,
distribute, and enable adoption of insights and experiences. Such
insights and experiences comprise knowledge, either embodied in
individuals or embedded in organizational processes or practice
Load Balancing Load balancing is a technique to distribute workload evenly across two
or more computers, network links, CPUs, hard drives, or other
resources, in order to get optimal resource utilization, maximize
throughput, minimize response time, and avoid overload.
Operational Data Store An operational data store (or "ODS") is a database designed to
integrate data from multiple sources to make analysis and reporting
easier.
Physical Data Model A physical data model (database design) is a representation of a data
design which takes into account the facilities and constraints of a given
database management system.
Portlets Portlets are pluggable user interface software components that are
managed and displayed in a web portal. Portlets produce fragments of
markup code that are aggregated into a portal page.
Role based access Role-based access control is an approach to restricting system access to
authorized users
Storage Area Network A storage area network (SAN) is an architecture to attach remote
computer storage devices (such as disk arrays, tape libraries,
and optical jukeboxes) to servers in such a way that the devices appear
as locally attached to the operating system.
System Integrity The state that exists when there is complete assurance that, under all
conditions, an IT system relies on the logical correctness and reliability
of the operating system and on the logical completeness of the
hardware and software that implement the protection mechanisms.
Universal Serial Bus Universal Serial Bus (USB) is a specification to establish communication
between devices and a host controller (usually personal computers).
Virtual Token Virtual tokens are a new concept in multi-factor authentication which
reduce the costs normally associated with implementation and
maintenance of multi-factor solutions by utilizing the user's existing
internet device as the "something the user has" factor.
Web Server A web server is a computer program that delivers (serves) content,
such as web pages, using the Hypertext Transfer Protocol (HTTP), over
the World Wide Web.
Web Services Web services are typically application programming interfaces (API)
or web APIs that are accessed via Hypertext Transfer Protocol and
executed on a remote system hosting the requested services.
1. Executive Summary
1.1 Introduction
Following the As-Is study, the requirement gathering activity was carried out across all eight
departments under consideration in this project. Based on the requirements study, and keeping in mind
the different functionalities expected of the solution, the next stage was to propose and design a technical
platform which would support all such functionalities. The purpose of this document is to present the
design of the architecture of the envisaged business analytics solution based on the functional
requirements.
The overarching objective of the Business Analytics solution is to provide the necessary data and
information for analyzing the insurance companies and supporting regulatory decision making.
For designing the solution architecture, the system requirements have been considered as one of the
key inputs. Based on both functional and system requirements, different views of the solution have been
represented to describe the entire solution in detail.
IRDA Portal: The portal will provide a platform for the extended enterprise to be
managed. It will therefore expose both enterprise applications and a number of
functional applications to the extended enterprise
Business Applications and Services
Business Applications: These will be offered as a platform to IRDA. This platform will be
run on the internal environment and will be accessible, to differing extents, through the
channels of information dissemination through defined integration touch points.
Internal Applications: Hosted enterprise application suite is expected to be in place to
manage the internal operations of IRDA in various departments.
Application Design/Integration Services
Following are the layers / components of the Information view through which information
passes:
Data Acquisition Layer
Data Aggregation and Storage Layer
Business Access Layer
Information Delivery Layer
Information Consumer Layer
o Application View - Elaborates the applications to fit the functional requirements of the
system and to support the information flow in the system. The application view assumes all
the different applications spread across departments would be accessible from the portal
using a central integration layer.
o Infrastructure View - Elaborates the infrastructure needs of IRDA to support the applications
that have been proposed in the previous view. The infrastructure view comprises the
following components:
Central Data Centre (CDC) and Disaster Recovery (DR) site specifications
Business Continuity Plan
Hardware Infrastructure
Network Configuration
Scalability Plan
o Security Framework and Architecture - Elaborates the security needs of IRDA to safeguard
the data, information, other content and applications from various internal and external
security threats. This section describes the different methods, processes and mechanisms
such as access control, authentication and encryption mechanisms through advanced
solutions like biometric devices and digital signatures.
Minimize the complications and time needed to capture data from insurers for the purpose of
offsite inspection
Centralize data storage and share it across the departments within the Authority
Enhanced data analysis capability to support better regulation and monitoring market growth
Evolving need of information and analysis
Effective Information dissemination through Enhanced Functionality
4. Solution Themes
Based on the objectives, the following themes are identified for the envisaged solution. These themes
are the guiding factors in designing the envisaged solution.
www.irda.gov.in:
o Content Management - Used for storing, controlling, versioning, and publishing industry-specific
documentation such as news articles, operators' manuals, technical manuals, sales guides, and
marketing brochures
o Brokers Online Filing - Used for filling out online forms for registering brokers with IRDA. This
facility is available online in this portal.
o Grievance Management - The grievance management system tracks new complaints and
forwarded complaints, updates the status of complaints, reminds the insurers and generates
customized reports on the basis of the complaints.
o New Business Statistics - This module is designed to capture new business data for both the life and
non-life departments
o Advertisement - This module is designed to track the details of advertisements for each insurer.
The module also tracks those advertisements that are released by intermediaries.
o Third Party Administrator - This module is designed to track the details of the third party
administrators
o Surveyor Licensing System - Surveyor Licensing Module is developed to track the information on
the surveyor. It consists of the pre-defined HTML forms which are robust and secure.
www.irdaindia.org:
o Agency Licensing Portal - This is a system developed for online licensing of the agents and
corporate agents
www.irdaonline.org:
Presently there are no components hosted in this website. This website is used for information
gathering purpose only.
The non web based solutions at IRDA at present are the following:
ATI Database - Capturing new application details of ATIs (both Online / Off-line)
Receipt and Inward System - Non web based, finished, in place, stand-alone mode
Please refer to Appendix H of the appendix section for details of all the above-mentioned systems, gathered from the inputs of the IT department and other supporting documents.
In the proposed system, both statistical and regulatory data coming from external entities will be
stored in a centralized data store, where it will be transformed, cleaned and structured to make it
available in the form of reports and analysis to both IRDA internal stakeholders and external entities. The
diagram below displays the To-Be scenario with respect to data capture and management.
For designing the solution architecture, the system requirements have been considered as one of the key
inputs. These requirements have been divided into the functional requirements and system
requirements of the solution. Based on both functional and system requirements, different views of the
solution have been represented to describe the entire solution in detail.
Reference Architecture
Functional Architecture
Delivery Channel Architecture (Information View)
Application Architecture
The following subsections elaborate the abovementioned components of the solution architecture.
The reference architecture gives an overview of the entire solution containing the key components of
the solution. The Business Analytics Solution has three broad components that interact with each other.
A front-end portal which acts as a window to all IRDA applications e.g. Insurer data
capture and approval.
A centralized database for having a single version of truth, with a view that facilitates
regulatory monitoring and understanding of the market development activities of the
insurance companies.
An analytical platform, gathering input from the operational data warehouse,
providing analytics and reporting across various dimensions.
The diagram below depicts the key components of the solution.
This view of the architecture elaborates various functional components of the envisaged solution. The
functional components have been identified based on the functional requirements specified by the
business users across departments and across levels during the requirement gathering activities.
The overall envisaged technology platform of the IRDA system will comprise a set of applications and services
that are expected to be rendered through a typical n-tier architecture configuration. A number of
services will be hosted for internal consumption, typically to manage the business processes and
functions of IRDA as an organization and also its extended enterprise of estates, vendors and other
partners.
A host of external services through content, data and application level integration will also be rendered
through this platform to the insurers, IRDA customers, employees and management team. The following
sections describe the conceptual view of the overall services platform.
The following diagram depicts the conceptual view of the overall services platform:
The components of the solution architecture, as depicted in the above diagram, will render various services as given in the table below:
Service Description
Platform Management Services - Platform Management Services comprise two broad sets of service
suites:
IRDA Portal: The portal will provide a platform for the extended enterprise to be managed. It will
therefore expose both enterprise applications and a number of functional applications to the extended
enterprise. The portal should therefore operate through an appropriate parser that is able to render a
broad number of services while connected to the portal. Each request and response in the portal should
be well integrated with the parsing application. The parser application and infrastructure are expected to
form a key component of the solution design.
All the customer-facing services will be available through the portal since it is expected to be the primary
gateway for insurance companies, insurance customers, peer regulatory bodies etc.
Business Applications and Services - Business Applications: These will be offered as a platform to IRDA.
This platform will run on the internal environment and will be accessible, to differing extents, through
the channels of information dissemination via defined integration touch points.
All the customer-facing services will need to have a tightly coupled integration with the IRDA service
delivery platform. It is critical to capture the data and transaction footprints for these services. Internal
applications will also need to have data and application level footprints. Data and application logic will
be completely resident on the respective environments. Specific data level integration requirements will
be defined during the analysis phase of the project.
The applications envisaged are:
1. Content Authoring/Publishing Service - A service which allows users to upload/download and
access document repositories in the IRDA portal
2. Knowledge Management - The KM portal will contain various documents and information about
various activities of IRDA and on the insurance sector as a whole. This will have two basic sub-parts:
a. Repository of information and documents available to external stakeholders
b. Information and documents only available to IRDA employees / users
3. Collaboration Services - Services rendered by the portal; however, these are not necessarily
owned by the portal. Instead these services may be borrowed from other portals or applications
4. Search - Allows users to search for a specific service in the portal, instead of trying to navigate to
the link for that particular service
5. Personalization - Allows the user to personalize the web pages, look & feel, save favorite links etc.
Design/Integration Services - Design and Maintenance Platform Services: Comprises an integrated
development environment providing an interface to maintain services based on the relevant
development framework. The interface will be used to modify and manage the changes in the services
description and definition. A typical environment would also contain a workflow and rules management
engine to provide the ability to design configurable services.
Data Integration: Data will be resident in multiple repositories of the platform. Some of the data assets
might also reside outside the platform in the database of a specific application provider, e.g. the agency
portal system. A concerted data management strategy will need to be designed. A typical online and
offline data integration model should be considered as part of the overall technology solution stack.
Data Stores: The overall application layer will have multiple data stores in two different variations.
The table below elaborates the various layers of the information view and the services each renders.
Data Aggregation & Storage Layer - This layer stores and maintains the transformed data received from the Data Acquisition Layer.
Data Warehouse - Takes in data from various sources and stores it for reporting and analysis purposes as per the different subject areas. (Example: data required for analytical reporting is stored in the data warehouse.)
Business Activity Monitoring - Used for monitoring real-time business transactions and taking necessary actions on defined rule violations. (Example: automated email reminders asking for compliance data from insurers.)
Content Management - Takes care of content lifecycle management and workflow; provides the features of full-text search and indexing and various library services for the stored content. (Example: through this layer, different documents like company information, research papers etc. will be available to IRDA employees and internal stakeholders.)
Information Delivery - This layer enables delivery of various services through the different channels (web portal, email etc.) using various communication methods such as a message bus and file interchange.
The application view assumes all the different applications spread across departments would be
accessible from the portal using a central integration layer. This layer would act as a hub for the overall
architecture both from a data and application level communication perspective. This layer will facilitate
communication between the portal, different internal applications, back end systems, access
management services and all other channels. The diagram below shows a model of the IRDA Business
Analytics solution application view.
[Diagram: IRDA Business Analytics solution application view, showing intermediaries, IRDA employees and client systems connecting through authentication/authorization services, a validation/rules engine, caching services (data, web pages, predictive caching), an insurer data processing service and a directory.]
Application Components
Following application components are envisaged as main building blocks:
Access Management
The Access Management component will serve as a gateway for all requests that are routed through a web
browser. It will use the IRDA employee directory along with an SSO/IDM infrastructure to authenticate
users. All external users would be validated against a user credential database. The Access Management
module should be designed in an open manner, with the ability to accommodate additional applications and
security management solutions for new applications.
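As an illustration of the kind of directory-backed authentication described above, the sketch below validates a user against an LDAP directory (such as the employee directory mentioned) using the third-party ldap3 library. The server address, base DN and attribute names are assumptions made for the example, not values taken from this document.

```python
# Illustrative directory authentication check for the Access Management component.
# Requires the third-party "ldap3" package; host and DN values are placeholders.
from ldap3 import Server, Connection, ALL

LDAP_HOST = "ldap://directory.irda.example"      # hypothetical directory server
BASE_DN = "ou=employees,dc=irda,dc=example"      # hypothetical search base

def authenticate(user_id: str, password: str) -> bool:
    """Bind to the directory with the user's DN and password; True on success."""
    server = Server(LDAP_HOST, get_info=ALL)
    user_dn = f"uid={user_id},{BASE_DN}"
    conn = Connection(server, user=user_dn, password=password)
    try:
        return conn.bind()          # a successful bind means valid credentials
    finally:
        conn.unbind()

if __name__ == "__main__":
    print("authenticated" if authenticate("jdoe", "secret") else "rejected")
```

In a full SSO/IDM setup this check would typically sit behind the identity provider rather than in application code; the sketch only shows the underlying directory interaction.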
Application Portal
The application portal will serve as a gateway for requests that may be routed through a browser. The portal
should be developed using a standard portal server. The portal server may have the following illustrative
components:
Knowledge Management Services: Portal will provide for rich knowledge management capabilities
across different departments. A role based view of this system would be given to internal and
external stakeholders.
Analytics and Reporting Services: Portal will provide reporting and data analysis features in the form of
reports and dashboards.
Work Flow Services: A configurable workflow service would be running across all the business
applications.
Notifications and Monitoring Services: A configurable notifications service would be running in
association with the work flow service
Transaction Processing
Security
Administration
Systems Features
Industry-standard Scalability
Industry-standard Performance
Integration Hub
The integration hub will serve three core functions:
Manage the interface between various applications through application to application and
application to data integration. Each of the applications may have their dedicated data stores. At a
physical level, this may result in separate schema or separate databases. This managed layer will
handle the transient and persistent storage of this information.
Manage the interface between the Business Applications/Services Layer and the Presentation Interface:
application/data level integration for internal services and applications will be managed by this layer.
In addition to managing this interface through a message queue, enterprise service bus, channels or
direct RPC calls (a simple messaging sketch follows this list), this layer will also be responsible for hosting
capabilities like a workflow engine, master data management components, unified notifications and content publishing services.
Application/services level connectivity with the applications/databases that are hosted in the IRDA
environment will be managed through this integration hub. This layer will ensure loose or tight
coupling for the different applications.
In addition to these services, a set of common services should also be provisioned for using this
integration hub. This would include:
Back End systems: These are mostly existing systems with which other business applications need to
interact in a uni- or bidirectional way with respect to information integration.
Central Content Management System
Core Infrastructure Services: This would include all operational databases containing data pertaining
to different business applications, users, portal metadata etc. It would also include communication
components such as Email.
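As a simple illustration of the message-based integration the hub could provide, the sketch below publishes an event to a queue using the third-party pika client for RabbitMQ. The broker host, queue name and message payload are assumptions made for the example; the document does not prescribe a specific messaging product.

```python
# Illustrative publish of an integration event through a message queue.
# Requires the third-party "pika" client; broker host, queue and payload are placeholders.
import json
import pika

BROKER_HOST = "mq.irda.example"          # hypothetical message broker
QUEUE_NAME = "insurer.data.submitted"    # hypothetical event queue

def publish_event(event: dict) -> None:
    """Send one JSON event to the queue; consumers (workflow, MDM, notifications) pick it up."""
    connection = pika.BlockingConnection(pika.ConnectionParameters(host=BROKER_HOST))
    channel = connection.channel()
    channel.queue_declare(queue=QUEUE_NAME, durable=True)   # survive broker restart
    channel.basic_publish(
        exchange="",                 # default exchange routes by queue name
        routing_key=QUEUE_NAME,
        body=json.dumps(event).encode("utf-8"),
    )
    connection.close()

if __name__ == "__main__":
    publish_event({"insurer_id": "INS001", "form": "NL-1", "period": "2010-03"})
```

An enterprise service bus, direct RPC or file interchange could equally satisfy the interface described above; the queue is shown only because it is the simplest loosely coupled variant to sketch.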
Following are the key considerations for IRDA's envisaged Business Analytics solution:
1. The hardware sized for all the applications should be redundant and scalable. All the components
within the server should be hot swappable and should incur no downtime due to component failure.
2. All the servers suggested should have dual power supplies. The power input to the power supplies
will be from separate Uninterrupted Power Supplies which will be fed from two different power
sources. In case of failure of one power supply, the second power supply should be able to take the
full load without causing any interruption in services.
3. All servers should have, at a minimum, dual 1000 Mbps network interface cards (NICs) installed in
different slots. Each NIC will be cabled from a different module on the switch using gigabit-speed
cabling.
4. The system should be platform independent and should not only be deployable on multiple
platforms such as HP UNIX, IBM AIX, IBM i, Sun Solaris, Microsoft Windows, Linux etc., but should
also allow integration with other software deployed across heterogeneous operating system
platforms.
5. The system should have the capability to use Service Oriented Architecture best practices and
should use industry standards for integration to achieve universal use.
6. The system should be database independent and should allow deployment on multiple RDBMSs such
as DB2, Oracle, Microsoft SQL Server etc. The system should allow integration with other heterogeneous
databases irrespective of the choice of database for the enterprise system. The database language
should be ANSI SQL and should avoid using any vendor-specific proprietary extensions to ANSI SQL
(e.g. PL/SQL).
7. Ability to be browser independent. The system should be compatible with the following browsers:
8. The system should have a modular structure providing the flexibility to deploy selected module,
product and line-of-business combinations as per IRDA's convenience.
9. The system should provide fast and steady response times (Quality of Service). The speed and
efficiency of the system should not be affected by growing volumes, especially during search
operations, data warehousing, reporting, MIS, online processes and batch processes.
10. The system should be operational with good response times using low bandwidth, in the region of
about 15 Kb per user, especially for WAN and internet users.
11.1 Support multi-tier architecture (the application should at least have the following tiers
within its architecture) for all modules within the application, with well-defined
interfaces between the layers
11.2 Capability to integrate with external / third party components like rules engines,
functional modules, general ledger etc., which should not be point-to-point integration,
but should use well-defined interfaces for data integration based on an enterprise data model
11.4 Multiple similar hardware and mix of multiple hardware in a horizontal setup.
11.5 Scalability for external components (External components should not restrict scalability)
- Provide performance benchmarks for similar functions required in IRDA for Solution
scalability
11.7 Addition of CPU, Memory, Hard disk capacity without causing downtime
11.8 Support the deployment of additional modules at a later point in time with minimal
downtime and loss of productivity.
Interoperability is essential for the IRDA business analytics solution. In applying interoperability to
the BAP solution, the following challenges may arise:
Technical interoperability
Technical interoperability covers the technical issues of computer systems. It also includes issues on
platforms and frameworks. Frameworks for the solution might become complex and can introduce
conceptual differences in working approaches. In addition, frameworks are at times duplicative
and contradictory across multiple levels. Hence, a thorough review and utmost care should be taken while
deciding on the frameworks and platforms for the solution. Some of the specific platform and
framework related considerations for the business analytics solution are:
Other considerations which are dependent on the platform and frameworks are:
Portlets built for one portal platform would not interoperate with other portal platforms
Developers would need to build the same portlet many times to support multiple portal
vendors.
A limited number of portlets will be available from a particular portal vendor for page designers.
Deployments of portlets may need to be managed on certain systems but consumed on other
systems.
Organizational interoperability
Organizational interoperability is concerned with organizational processes and cooperation of agencies.
Some of the processes may not be flexible and adaptive enough to be integrated and made interoperable.
The IRDA top level management will need to play a vital role in such a context. Leadership and strategic
direction of management are cited as the most important factors for corporate adoption of Web
technology.
Semantic Interoperability
Interoperability or integration efforts are about making information from one system syntactically and
semantically accessible to another system. Syntax problems involve format and structure. Semantics,
though an important technical issue, is one that is almost invisible outside technical circles. What it boils
down to is that the meaning of apparently identical terms can differ in significant ways between
systems. Such differences normally make it more difficult to make systems work together. The
differences can be minimized if systems are designed using agreed data formats. Semantics relate to the
understanding and integrity of the information.
There are various technologies that help in achieving the objectives of the business analytics solution by
solving the problem of interoperability. Key technologies are discussed below:
SOA is not just architecture of services seen from a technology perspective, but the policies,
practices, and frameworks by which the right services are provided and consumed.
With SOA it is critical to implement processes that ensure that there are at least two different
and separate processes, for provider and consumer.
Rather than leaving developers to discover individual services and put them into context, the
Business Service Bus is instead their starting point that guides them to a coherent set that has
been assembled for their domain.
A web service supports direct interactions with other software agents using XML-based messages
exchanged via Internet-based protocols. The Semantic Web infrastructure of ontology services,
metadata annotators, reasoning engines and so on will be delivered as Web services. In turn Web
services need semantic-driven descriptions for discovery, negotiation and composition.
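As a minimal illustration of the XML-message exchange described above, the sketch below posts a SOAP-style request to a web service endpoint over HTTP using only the Python standard library. The endpoint URL, XML namespace and operation name are hypothetical placeholders, not services defined by this document.

```python
# Minimal SOAP-style web service call using only the standard library.
# The endpoint URL, XML namespace and operation name are hypothetical placeholders.
import urllib.request

ENDPOINT = "http://services.irda.example/insurerService"   # placeholder endpoint

SOAP_BODY = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:ins="http://example.org/insurer">
  <soapenv:Body>
    <ins:getInsurerProfile>
      <ins:insurerId>INS001</ins:insurerId>
    </ins:getInsurerProfile>
  </soapenv:Body>
</soapenv:Envelope>"""

def call_service() -> str:
    # POST the XML envelope and return the raw XML response for the caller to parse.
    request = urllib.request.Request(
        ENDPOINT,
        data=SOAP_BODY.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "getInsurerProfile"},
    )
    with urllib.request.urlopen(request, timeout=30) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    print(call_service())
```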
10.1 Overview
A conceptual data model shows relationships among various data entities to represent high-level
dependencies among the data required by business functions/departments. In other words, it shows
relationship and interactions among conceptual data entities. It also zooms in on the area of the
organization that is the subject of analysis for the project and provides a high-level view representing
the business under study for that organization. Entities here refer to the different dimensions /
parameters across which different analyses are performed and the facts or measures which are being
analyzed across these dimensions. The data model comprises entities that were defined during the
data collection and requirements gathering phases of the project, includes all entities necessary to
support the client's business analytics platform, and is developed at a departmental level after analysis
of the metrics and reports of the following departments:
1. Life
2. Non Life
a. General
b. Reinsurance
4. Actuarial
5. Intermediaries1
a. Brokers
b. Corporate Agents
c. Surveyors
Leaving aside a few forms, owing to the flat dimensional structure for F&A and Investment, no dimensional
model is suggested for these departments.
Each department is further split into subject areas having similar data elements and analytical context.
Similar forms are clubbed together to form a subject area. Each department and the underlying subject
areas have a specific set of dimensions or parameters depending upon the analytical requirements.
Some of them are common across departments, for example time and insurer, which are common across
departments. Some of them are unique to a particular department, such as broker, which is relevant only
for intermediaries.
1 Agents' and ATIs' data are already captured in the agency licensing portal and hence are not a part of the dimensional design.
Dimensions have attributes, which are the characteristic metadata elements that further detail a
dimension. A subject area is represented by a fact table, which represents the metrics or measures in that
particular subject area. Dimensions are linked to a fact table using a unique identifier (a key / ID), which
shows the kind of relationship it has with the fact and also the level of granularity at which the
metric is analyzed. For example, if the month ID is connected with a fact table, it means the metrics in the
fact table for that particular subject area are analyzed monthly with respect to the time dimension. This
can be represented in the diagram below:
[Diagram: a Time dimension (1. Year ID, 2. Half Year ID, 3. Quarter ID, 4. Month ID, 5. Day) and an Insurer dimension (1. Insurer ID, 2. Insurer Name, 3. Insurer Type, 4. Date of Registration) linked through their keys to an "ABC Metrics" fact table; the key at which a dimension joins the fact table indicates the relationship and the granularity of analysis.]
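To make the dimension/fact relationship above concrete, the sketch below creates a simplified star schema (a Time dimension, an Insurer dimension and a fact table keyed on month and insurer) in SQLite. The table, column and measure names follow the example in the text but are otherwise illustrative assumptions; the actual model is detailed in Appendices A and B.

```python
# Illustrative star schema for one subject area: two dimensions joined to a fact
# table at monthly granularity. Names are based on the example in the text only.
import sqlite3

DDL = """
CREATE TABLE dim_time (
    month_id        INTEGER PRIMARY KEY,   -- grain at which the fact is analysed
    quarter_id      INTEGER,
    half_year_id    INTEGER,
    year_id         INTEGER
);
CREATE TABLE dim_insurer (
    insurer_id           INTEGER PRIMARY KEY,
    insurer_name         TEXT,
    insurer_type         TEXT,              -- e.g. life / non-life
    date_of_registration TEXT
);
CREATE TABLE fact_abc_metrics (
    month_id    INTEGER REFERENCES dim_time(month_id),
    insurer_id  INTEGER REFERENCES dim_insurer(insurer_id),
    premium     REAL,                       -- illustrative measure
    policies    INTEGER,                    -- illustrative measure
    PRIMARY KEY (month_id, insurer_id)
);
"""

if __name__ == "__main__":
    with sqlite3.connect(":memory:") as conn:
        conn.executescript(DDL)
        # A monthly metric row: analysed by time (month) and insurer.
        conn.execute("INSERT INTO dim_time VALUES (201003, 20101, 20101, 2010)")
        conn.execute("INSERT INTO dim_insurer VALUES (1, 'ABC Insurance', 'Life', '2001-04-01')")
        conn.execute("INSERT INTO fact_abc_metrics VALUES (201003, 1, 125000.0, 42)")
        total = conn.execute("SELECT SUM(premium) FROM fact_abc_metrics").fetchone()[0]
        print("Total premium:", total)
```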
A de-duplication matrix has been prepared as part of the data capturing process. The outcome of this
process is a summarized, visual snapshot of the commonality of the forms in terms of data capture
within and across departments. The structure of this matrix is represented as shown below:
Dimensions (columns): Insurer, Insurer Group, Product, Product Type, LoB, Division, Channel, Channel Type, Geography, Location, Premium, Time
Sample row - Dept.: ABC, Form Name: XYZ, Insurer: X, Product: P, Channel: X, Time: M (all other dimension cells blank)
This represents that form XYZ of ABC department is capturing data across Insurer, product (at the lowest
product level granularity) and channel wise. The frequency of the data capture is monthly.
For detailed data models and the de-duplication matrix for the different departments, please refer to
Appendix A.
For the detailed structure of the dimensions please refer to Appendix B in the appendix section.
This section elaborates the infrastructure needs of IRDA to support the applications that have been
proposed in the previous section.
A fully web based application architecture is envisaged, which will ensure that all application access is
through the web. Specific descriptions of the different infrastructure solution components, along with the
rationale, are presented in the respective sections:
This assessment focuses on the technological aspects of disaster recovery, not the business
operations
Current system business continuity plans do not cover the new requirements that will be
needed for IRDA infrastructure
Disaster recovery infrastructure capacity must be able to operate at the same performance level
as the primary site
Does not include overall restoration priority with other systems, such as IGMS.
Any recommended infrastructure needs are specific to IRDA and thus independent of any other
application
Development environments will be recovered via appropriate backups as time/resource permit
upon completion of DR plan execution
In the event of a disaster, the recovery facility and services should provide the capability to
maintain operations of the in-scope business analytics solution components and related
applications for an undetermined amount of time
The proposed Data Centre can be referred to as the Central Data Center (CDC). Functional and
technical resources can be located at the CDC, and an IT help desk for issue resolution and solution
enhancement should be established at the Data Center.
CDC should host various applications, as detailed out in the Application View of the solution
architecture.
The Central Data Centre should have full capacity for hosting and running the network/server
infrastructure of IRDA. This should be installed in hot standby mode or backed up in a cold state. The CDC
should act as the Primary Data Center, with a Business Continuity Planning (BCP) and Disaster Recovery
(DR) site providing backup of the CDC.
In a normal situation, the CDC should be able to provide services to the different categories of IT users. An
automatic load balancer should be installed to divert traffic to the least occupied server in the CDC. In case of
failure of CDC, the DR site should automatically take over the task of failed CDC. DR process should
replicate the data from CDC to DR site as online replication. In case of failure of the CDC, this DR should
be in a position to take over the entire load automatically.
It is recommended that there should be one Centralized Data Centre (CDC) at the IRDA Headquarters
in Hyderabad. The CDC should be connected to a Disaster Recovery (DR) site established, if possible, in a
different seismic zone from that of the CDC. Also, if the recovery needs are higher, a near site is
proposed which will be an exact replica of the CDC in terms of storage. Further, the computing
power of the CDC can be replicated to the near site as well, for almost near-real-time recovery.
Data Replication - SAN replication between the Data Centers for critical databases, needed to meet accelerated RPO requirements (up to 2 hours or less of lost data)
Dedicated DB Servers - Deployment of Recovery Data Center database server capacity to support the critical databases
Tech Services Tools - Make monitoring and support tools resilient or quickly available at the recovery data center
Technical specifications for the DRC for the IRDA business analytics program
The following with respect to Disaster Recovery is recommended for the IRDA business analytics
program:
Technical Considerations
Network connectivity and sufficient bandwidth will be needed between DC and DRC; burstable
bandwidth provisioning should be negotiated with WAN provider(s).
System software should be used to synchronize platforms at production and recovery locations
Dedicated equipment is required at the DRC, but it could be used to provide testing or
development during normal operations
Automated provisioning/repurposing of test and development equipment for
production/recovery purposes is a recommended capability
Boot-from-SAN, Ignite or similar process should be used to reduce recovery time
Regular, full-scale testing of the disaster recovery solution should be performed
A distinct DR site should be created in the next seismic zone, designed as the backup (mirror)
site to the main site. The DR site should deploy the entire application solution (current and
latest version of the application builds, and all solution components).
The DR site should be invoked automatically when the production site fails to provide its
services and it should ensure that it supports a degraded performance of at least 80 per cent of
that prescribed for the primary site.
It should be ensured that data is replicated at the DR site at regular intervals
Routine tests should be simulated to ensure that in case of an emergency, rollover to the DR site
happens automatically without any service downtime.
IRDA should run all services and transactions from the DR Site, at least once in a month, on a
non-peak day to check its performance in case of an exigency and service provider (s) should
perform DR drills monthly.
In terms of storage requirements for the DRC, IRDA needs to implement some type of
Information Lifecycle Management (ILM) approach. Data needs to be classified and placed on
the appropriate class of storage. IRDA needs to implement a synchronous or asynchronous
replication approach for critical data (e.g. SRDF, TrueCopy, PPRC, SnapMirror, etc.).
In addition to IRDA's disaster recovery initiatives, other options include investigating the
use of third party vendors to provide offsite data storage. Offsite data storage services from a
third party provider provide a secured means to store critical business and application data in
the event of a disaster. Many of these vendors also provide disaster recovery services, which
may include the ability to use vendor hardware to run IRDA business applications in the event of
a disaster to the IRDA operations center.
The DR (Disaster Recovery) site should be in a different seismic zone from that of the CDC. To ensure an
undisturbed connection between the CDC and the DR site, the connectivity needs to be from at least two
individual ISPs, one for the main network connection and the other as a fallback option. Similarly, at the DR
site, internet connectivity should be the same as at the CDC. It is recommended to have online replication
between the CDC and the DR site.
CDC and DR Specifications have been provided in Appendix D. CDC and DR Specifications.
Business survival necessitates planning for every type of business disruption including, but by no means
limited to, the above-mentioned categories of disasters, with results ranging from insured losses of
replaceable tangibles to uninsurable capital losses, customer dissatisfaction and possible desertion, and
complete insolvency.
A business continuity plan is insurance against such disasters and ensures that key (if not all)
business functions continue. A business continuity strategy, then, is a high-value as well as a high-
maintenance proposition. In this context, IRDA is no exception and is as vulnerable as any other
enterprise globally.
The key challenge of business continuity preparation is not technology, but the internal business
aspects that begin at the foundation level of any project and continue throughout its life cycle: such as
justification, executive buy-in, broad organizational support, governance and politics.
Perhaps the most important point to make about business continuity support technologies is that their
effectiveness depends entirely upon the organization's top-down commitment to the entire project,
including updating and testing IT systems and infrastructure and recommending suitable maintenance
policies, so as to remain ever geared up for an unexpected turn of events.
11.5 Recovery Point and Time Objectives for the Business Analytics
Solution
The business requirements for IT disaster recovery services were captured by reviewing each of the key
business processes within the functional areas of IRDA, identifying for each:
How quickly following a disaster a particular process needs to be operational (the RTO,
Recovery Time Objective)
The amount of data that can be lost as a result of a disaster (the RPO, Recovery Point
Objective)
In addition, various design attributes relating to the process, such as whether there is a workaround that can
be put in place and whether it is necessary to be able to perform the process out of the office, were identified.
Functional Areas Reviewed
Online submission of data and electronic communication management are considered the
highest priority functional area for restoration in the event of a disaster.
Reporting and analytics, tracking and monitoring and workflow are considered to be of medium
priority.
Document and Knowledge Management and other functions like collaborations, search etc.
were considered low priority since generally their component processes are not time critical and
workarounds are possible providing that the firm can communicate with its clients and access
documents.
Business Processes Considered
o Restoration priorities (RTO) align with being able to communicate with the external
stakeholders, being able to complete disclosure transactions, and being able to
access and update documents.
o Zero data loss (RPO) is required for all the critical functionalities.
Required RTO
Functional Area | Between 5 Hours and 24 Hours | Between 2 Hours and 4 Hours | < 2 Hours
Online submission of data
Electronic Communications
Reporting and analytics
Tracking and Monitoring
Workflow Management
Document Management (2)
Knowledge Management
Support Services and Others
Legends
2 Document management can be of medium or high priority at times; its priority can be seasonal, e.g. in case of inspection activities or an enquiry of an insurer/intermediary.
Prior to the capture of business requirements, three approaches to DR were defined providing
different levels of recovery capability.
Following the capture of the business requirements and technology constraints, three solution
options were developed broadly in line with the three approaches.
Key Features
within 2-4 hours, although additional testing prior to services coming online and potential
synchronisation errors may cause the recovery time to be extended.
Performance in case of disaster: Slightly reduced relative to normal operation for critical
applications, reduced for non-critical applications.
Benefits: Meets requirements (with some risk on the recovery time objective).
Weaknesses: Failback to normal operation requires a planned day of outage.
Option 3 - No replication of data; recovery relies instead on restoration of data from tape. Low priority
systems are not maintained, but called off in a DR scenario.
RPO: up to an hour of data loss.
RTO: Recovery of critical applications likely to take 2-4 hours.
Performance in case of disaster: Slightly reduced relative to normal operation for critical
applications, reduced for non-critical applications.
Benefits: DR tests can be handled entirely offline and without interruption to normal operations.
Weaknesses: Failback to normal operation likely to require a planned weekend of outage. DR tests
are time consuming.
Comparison of Options
Option 3 provides some improvements over the current state, primarily the certainty that a
facility and systems will be available to recover the services. However the RTO objectives for
priority areas would not be met and recovery relies on relatively unreliable tapes. Option 3 is not
a recommended approach given the scale of incremental investment required relative to the
benefit delivered.
Option 2 meets most of the business recovery requirements and will cost less to implement
than Option 1. However, the cost difference is not as large as anticipated because the
asynchronous replication software costs more than the synchronous equivalent and is licensed
by terabyte of data replicated. This is likely to result in recurring charges increasing more
sharply in this option, than in Option 1 (where the software is licensed per device).
Option 1 meets the business recovery requirements. Assessing these options against selection
criteria including IT operational risk, cost and match to business recovery requirements, Option
1 can be the preferred solution. The relatively small incremental cost brings:
o minimised data loss (zero for saved data in critical processes),
o faster recovery of services,
o relative ease of failback, negating any prolonged period of outage,
o ease of maintainability and
o a potential platform for further improving continuity in the future through stretching
primary services between the DC and DR data centre. Although this would be at extra
cost, it is not possible with other options.
LEVEL 0
Failure: Failure of a component in a machine (machine can be network infrastructure devices, servers and high-end user systems)
Solution: Each machine should have 100 per cent fault tolerance capability. The fault tolerance feature should be restored immediately. The faulty component should be replaced with a new component and should be sent for repair/warranty replacement etc. at the earliest.
LEVEL 1
Failure: Failure where the machine comes to a halt (machine refers to network infrastructure devices, servers and high-end user systems)
Solution: Each machine should have a backup counterpart. During a problem/breakdown of a machine, the backup machine should automatically take over the job of the primary machine. The faulty machine should be replaced with a new machine, and the faulty one should be sent for repair/warranty replacement etc.
LEVEL 2
Failure: Failure which causes the complete site to halt
Solution: Two similar sites (one CDC and one DR site) have been proposed which should always be up and running for normal operation. In case of higher recovery needs, a near site is also proposed which will be replicated in almost near real time. In case of a problem/breakdown of a site, the DRC should automatically take over the entire load and should start functioning as the primary site. The status of the faulty site shall be restored to normal as soon as possible.
LEVEL 3
Failure Solution
Failure which causes the Options such as hiring data center services from ISP vendors, etc.
entire site to halt, including should be explored and latest backup should be restored to start
DR site the operation.
Efforts should be made to restore the normal status of main sites
and DR sites at the earliest
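As an illustration of the Level 1 behaviour described above, the sketch below shows how a monitoring routine might detect a halted primary machine and promote its backup counterpart. This is a minimal sketch only; the host names, the check_alive probe and the promote_backup action are hypothetical placeholders, not part of the proposed toolset.

    # Minimal failover sketch for the Level 1 scenario: if the primary machine
    # stops responding, its backup counterpart automatically takes over.
    import time

    def check_alive(host: str) -> bool:
        # Placeholder probe; in practice this could be a ping/port/service check.
        return host != "primary-db"          # simulate a failed primary

    def promote_backup(primary: str, backup: str) -> str:
        print(f"{primary} is down; promoting {backup} to primary role")
        return backup

    def monitor(primary: str, backup: str) -> str:
        active = primary
        for _ in range(3):                   # bounded loop for the example
            if not check_alive(active):
                active = promote_backup(active, backup)
            time.sleep(0)                    # use a real polling interval in practice
        return active

    print("active node:", monitor("primary-db", "backup-db"))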
Services and their descriptions:
Development and Implementation - This service delivers the development and delivery of the application as per the functional requirements specifications and technical specifications.
Application Support - This service captures bug fixing, fixing problems in the application and responding to calls with respect to the application.
Service Monitoring - This service captures events (IT alarms, alerts and notifications) in the IT infrastructure and forwards them to appropriate personnel or systems for further action.
Release and Deployment Support - This service plans, manages and coordinates activities to deploy service solutions as set out in release packages.
Service Testing - This service ensures that new or changed IT services and their supporting infrastructure are adequately tested and accepted for successful delivery and operation.
Capacity Management - This service ensures that all current and future capacity and performance aspects of the application are provided to meet business and service requirements at acceptable cost.
Availability Management - This service optimizes the capability of the services and supporting organization to deliver sustained levels of service availability that meet business requirements at acceptable costs.
Solution Design Services - This service plans and designs solutions to support business services, processes or functions.
This section illustrates the methodologies to be followed for data and hardware (server) sizing for the data storage of the IRDA business analytics project. The exact calculations can be performed during the design stage, when all the table structures will have been designed and will be ready for development.
Data Sizing
The template below will capture the sizing estimates for the databases excluding indexes.
Database Name | Table Name | Table Type (Fact/Master/Others) | Estimated Length of the Row (B) | Number of Rows (C) | Size (D = B x C) | No. of Years of History Data | Total (E = D x No. of Years of History Data)
Total
Inputs Required:
No. of Rows: For master tables, the number of members in that particular dimension, along with their attributes, determines the number of rows in the table. For a fact table it will be the combination of all the members of the different master tables taken together. For example, if there are 3 master tables, M-1, M-2 and M-3, having 3, 4 and 5 members respectively, the total rows in the fact table will be 3 x 4 x 5 = 60 rows. For the other transactional tables, an empirical or estimated number of rows can be captured, which is typically a percentage of the total number of rows in the fact table.
Row Length: The data types determine the storage required for a row; for example, a VarChar field can occupy up to 8 KB while a SmallInt occupies 2 bytes. This number, when multiplied by the number of rows, gives the estimate of the data size for the entire table of a database. This estimated number, when multiplied by the number of years of history data, represents the total storage size in KB.
No. of Years of History Data: This will be typically the years of history data needed to capture
for tracking, monitoring, reporting and archiving.
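A minimal sketch of the sizing arithmetic described above, using illustrative figures only (not actual IRDA tables): table size (D) is the estimated row length (B) multiplied by the number of rows (C), and the total (E) is D multiplied by the years of history data. The per-type byte sizes are assumptions for the example.

    # Data sizing sketch: D = row length (B) * number of rows (C);
    # E = D * number of years of history data.
    TYPE_BYTES = {"SmallInt": 2, "Int": 4, "Date": 3, "VarChar(50)": 50}  # assumed sizes

    def row_length(columns):                      # B, in bytes
        return sum(TYPE_BYTES[c] for c in columns)

    def table_size_kb(columns, rows, years):      # E, in KB
        return row_length(columns) * rows * years / 1024.0

    # Fact table rows = product of member counts of the master tables (3 x 4 x 5 = 60).
    fact_rows = 3 * 4 * 5
    print("Fact rows:", fact_rows)
    print("Fact size over 5 years: %.2f KB"
          % table_size_kb(["Int", "Int", "Int", "SmallInt", "VarChar(50)"], fact_rows, 5))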
The template below captures the sizing estimates for the databases including indexes. This is the
minimum real memory requirement for the different database tables.
DB Name | DB Size (KB) | Indexes (KB) | Total (KB) | Total (GB)
Total
Inputs Required
Sizing of the database for incremental increase in data on a year on year basis
The template below captures the server sizing for the data stored in the databases based on incremental growth of the data on a year-on-year basis. The incremental increase in data needs to be estimated for this case.
Indexed
Not Indexed
Inputs Required
Load Estimation
Theoretical calculations can be carried out on the queries to estimate the load on the network connections. The usage assumptions (given below) can be used to complete this analysis. Also, data from the sizing estimates of the data stored can be used in calculating the average length of rows and the total number of rows.
The theoretical estimated query response is captured in the table shown below:
Medium
Users
Heavy Users
Medium
Users
Heavy Users
Inputs Required
The following assumptions associated with corporate query workload are defined in estimating the
required computing resources for the IRDA BAP solution data storage:
Light users create approximately 1,000 bytes each transaction, with 15 transactions an hour for
an 8-hour business day
Medium analytical processing/relational online analytical processing (OLAP/ROLAP) users create
approximately 4,000 bytes each transaction, with 20 transactions an hour for an 8-hour business
day
Heavy users create approximately 75,000 bytes each transaction, with 10 transactions an hour for an 8-hour business day
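These workload assumptions translate directly into a daily data volume per user class (bytes per transaction x transactions per hour x 8 hours). The sketch below shows this arithmetic; the head counts per class are placeholder assumptions for illustration only.

    # Daily load per user class = bytes/transaction * transactions/hour * 8 hours.
    PROFILES = {
        # class: (bytes per transaction, transactions per hour)
        "light":  (1_000, 15),
        "medium": (4_000, 20),
        "heavy":  (75_000, 10),
    }
    HOURS_PER_DAY = 8
    USERS = {"light": 100, "medium": 40, "heavy": 10}   # assumed head counts

    total_mb = 0.0
    for cls, (tx_bytes, tx_per_hr) in PROFILES.items():
        per_user = tx_bytes * tx_per_hr * HOURS_PER_DAY
        class_mb = per_user * USERS[cls] / (1024 ** 2)
        total_mb += class_mb
        print(f"{cls:>6}: {per_user / 1024:.1f} KB/user/day, "
              f"{class_mb:.1f} MB/day for {USERS[cls]} users")
    print(f" total: {total_mb:.1f} MB/day")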
Workload Inputs
The following assumptions associated with corporate user demographics are defined to assist in
computing resources required for the data storage for IRDA business analytics:
__________% is heavy-users conducting exhaustive analysis with query tools supporting
multidimensional analysis
__________% is medium users requiring online analytical processing of large volumes of data
__________% perform simple to medium complexity queries against the data warehouse
Applications Specifications
Application specifications will enable determining data load to and from the BAP solution with respect
to the queries in question:
The following factors need to be considered when setting the requirements:
Complicated Calculations
Number of Users
Hardware Specifications
Data Size
The template below can be used to capture definitions of simple, medium and complex queries:
Simple
Medium
Complex
From the template above, the application usage for the different applications can be derived.
The number-of-users information can be obtained from the list of users having access to the different applications.
Network Sizing
The purpose of this section is to determine the network (LAN/WAN) usage requirements and identify
the loads that may be placed on the network by the system.
The assumptions and inputs required to perform network sizing for the IRDA Business Analytics solution
are:
These assumptions/inputs will be captured for both the scenarios of maximum load and normal load.
Data from this analysis can be captured in the template shown below:
Scenario A:
Maximum Load
Scenario B:
Typical Load
From the template above, describe the network usage between the different servers:
For detailed data sizing of IRDA BAP for the different departments, please refer to Appendix C
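The network-sizing inputs above can also be turned into a rough peak-bandwidth figure between the front-end and database servers, as in the sketch below. The concurrency figures, response window and overhead multiplier are assumptions for illustration, not measured values.

    # Rough peak-bandwidth sketch: concurrent queries * average payload,
    # spread over a nominal response window, with protocol overhead added.
    def peak_bandwidth_mbps(concurrent_users, avg_payload_bytes,
                            response_window_s=5.0, overhead=1.3):
        bits = concurrent_users * avg_payload_bytes * 8 * overhead
        return bits / response_window_s / 1_000_000

    # Scenario A (maximum load) vs Scenario B (typical load) - assumed inputs.
    print("maximum load: %.2f Mbps" % peak_bandwidth_mbps(200, 75_000))
    print("typical load: %.2f Mbps" % peak_bandwidth_mbps(50, 4_000))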
Load balancing - This component would provide load balancing capabilities for incoming requests, thereby allowing the portal user traffic to be uniformly serviced by a number of front-end servers. Application Server Web Caching would be used to provide this functionality.
Caching - Caching services would provide a universal view of caching by caching the following:
o Web content
o Data
o User information
This would be made possible by retaining versions of rendered web pages, common web page areas (headers, footers and navigation panels), frequently accessed data, and rarely changing data from external applications/services. Caching would result in greater performance, responsiveness and scalability by reducing repeated database access for the same data, execution of web page generation logic, cross-system calls and business logic. The caching layer would have built-in intelligence to hold data in memory based on frequency of access, change frequency and user behaviour. Techniques to invalidate the cache when newer data is available would also be applied.
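A minimal sketch of the caching behaviour described above: entries are held in memory with a time-to-live and can be explicitly invalidated when newer data arrives. This is illustrative only and does not represent the Application Server Web Caching product itself.

    # Simple in-memory cache with TTL expiry and explicit invalidation.
    import time

    class TtlCache:
        def __init__(self, ttl_seconds: float = 300.0):
            self.ttl = ttl_seconds
            self._store = {}                       # key -> (value, stored_at)

        def get(self, key, loader):
            entry = self._store.get(key)
            if entry and time.time() - entry[1] < self.ttl:
                return entry[0]                    # cache hit
            value = loader()                       # cache miss: fetch and store
            self._store[key] = (value, time.time())
            return value

        def invalidate(self, key):
            self._store.pop(key, None)             # drop stale entry when data changes

    cache = TtlCache(ttl_seconds=60)
    page = cache.get("home_header", lambda: "<header>IRDA BAP</header>")
    cache.invalidate("home_header")                # e.g. after content is republished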
Database - The database design, development and operational plans would be prepared using proven best practices for data-centric applications to ensure maximum possible scalability.
Operations Angle
From an Operations perspective, the following best practices would be followed to ensure scalability.
Periodically measure the system performance counters of the server using System Management
Console to ensure that the hardware is scaling up
Disable unnecessary heavy performance logging in the system. (e.g. Windows PerfMon)
This security strategy will be implemented in a manner that provides for as efficient an administrative process as possible after go-live.
Basic application security will be used to prevent access to key configuration and administrative
applications. Action, processing option and other security methods will be implemented to further
secure allowed objects.
Access to applications, reports, and tools will be granted on an individual role basis. Each user will belong to a security role to which security will be applied. It will be possible for users to belong to more than one security role; in general, however, the security roles will be designed such that each user has only one security role per environment. Security will be applied at the role level, not the user level.
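A minimal sketch of the role-based model described above: permissions are attached to roles, users are assigned a role per environment, and an access check resolves through the role rather than the individual user. The role names, user IDs and permissions below are illustrative assumptions, not the actual IRDA role design.

    # Role-based access check: security is applied at the role level, not per user.
    ROLE_PERMISSIONS = {
        "ProductionSupport": {"run_reports", "view_dashboards"},
        "Administrator":     {"run_reports", "view_dashboards", "configure_security"},
    }
    USER_ROLE_BY_ENV = {                 # one role per user per environment
        ("asharma", "PD"): "ProductionSupport",
        ("asharma", "DV"): "Administrator",
    }

    def is_allowed(user: str, env: str, action: str) -> bool:
        role = USER_ROLE_BY_ENV.get((user, env))
        return role is not None and action in ROLE_PERMISSIONS.get(role, set())

    print(is_allowed("asharma", "PD", "configure_security"))   # False
    print(is_allowed("asharma", "DV", "configure_security"))   # True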
It is recommended that the user list be audited periodically after go-live, to help ensure that users have been assigned appropriate environments and to identify unauthorized access to the Production environment.
Environment Purpose
Environment | Description
The following table shows a list of access by environment for the development and project teams as well
as end users.
PD TR QA DV CV
Key Users N N Y N N
Administrators Y Y Y Y Y
Project Team N N N Y Y
Training N Y N N N
Production Support Y N N N N
1. Application Level Security plays an important role in the security strategy and approach for mitigating risks in the to-be environment after go-live. Application security will be used to block all users except administrators from running and installing components on top of the solution. This will, in effect, restrict users from running application or batch objects by default. To simplify security configuration, the following variations of action security will be configured:
VIEW ONLY N N N N N
CHANGE ONLY Y N N N N
ADD ONLY N Y Y N N
NO DELETE Y Y Y N N
FULL ACCESS Y Y Y Y N
In interactive applications with drill-down options, the security options must also be configured for the user or role to be able to drill down.
3. Row Security will be used to restrict access, through Business Analytics Solution, to the data in
tables. The use of row security will be limited to situations where the other security types prove
to be insufficient, due to the system performance implications.
4. Column Security will be used to restrict access to the data in the Business Analytics Solution data
tables and in applications and forms. As this security type does not have any significant impact
on performance, it is expected that this security will be used in the IRDA BAP security design, in
compliance with business requirements, in order to protect key data fields.
5. Form Security will be used to restrict end users from opening unauthorized and restricted data
input forms for data entry.
6. Report Security will be used to restrict access, through the Business Analytics Solution, to unauthorized static and canned reports.
7. Analytical Security will be used to allow end users access to searching/selecting tables or
business views (i.e., creating ad-hoc queries).
8. Design Security will be used on a need-only basis in the IRDA security design. This security
restricts or allows the role the ability to make changes to the web
pages/tabs/forms/reports/RDBMS and other components within the entire solution. Here
typically the access will be given to System Administrators.
IRDA will develop separate user roles for the following groups of users:
For the detailed security setting for the IRDA business analytics project, please refer to Appendix F
Application, System and Network.
The abovementioned security tiers will cover the various security aspects as follows:
Migrate active transactional data from source systems to target solution with no deterioration
in data quality;
Provide read only access to inactive transactional data either through solution or alternate
reporting mechanisms; and
Dispose of any redundant data.
There may also be a requirement for migration routines to be developed to populate reference data
within the new application or sub modules.
Data Sources
The data sources that needs to be considered in data migration activities are
Hardcopies
Excel Spreadsheets
Text Files
MS Word Documents
Existing legacy systems databases
Ideally, the Regulatory Programme data migration approach would adhere to the business archiving strategies and standards and utilise existing archiving approaches and solutions. From a migration perspective, three options are available as a stop-gap arrangement in the absence of a formal archiving strategy:
3. Existing legacy databases are taken offline and access to that data is via reports (canned and ad-hoc) across the database. This restricts the volume of data that requires migration and restricts the volume of multiply migrated data records within the new applications. However, once new application archiving solutions are implemented, the offline databases may require migration into the archiving solution.
While implementing the data migration architecture, the data migration team has to take a number of considerations into account. A few of these considerations are:
Data volume analysis
Source system and target system processing power
Complexity of data mapping rules and business rules
If, during transformation, several records are normalized into separate database records, resulting in a significant increase in the overall data volume, extract the data from the source system(s) as-is, move it to a staging area in the target system, and then apply cleansing and transformations locally.
Highlights:
Reduced network round trip
Local transformation means that, by the time transformations are applied, the data has already reached the target server and the actual data movement is complete
Leverage processing power of target server
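A minimal sketch of the staged approach described above: source rows are moved as-is into a staging table on the target, and the normalisation and cleansing are then applied locally on the target side. The in-memory SQLite databases, table names and cleansing rule below are placeholders for illustration only.

    # Extract as-is into a staging area on the target, then transform locally.
    import sqlite3

    src = sqlite3.connect(":memory:")        # stands in for the source system
    tgt = sqlite3.connect(":memory:")        # stands in for the target system

    src.execute("CREATE TABLE policies(id INTEGER, holder TEXT)")
    src.execute("INSERT INTO policies VALUES (1, ' Asha Rao '), (2, 'r. mehta')")

    # 1) Move raw rows to the target staging table (one pass over the network).
    tgt.execute("CREATE TABLE stg_policies(id INTEGER, holder TEXT)")
    tgt.executemany("INSERT INTO stg_policies VALUES (?, ?)",
                    src.execute("SELECT id, holder FROM policies"))

    # 2) Cleanse/transform locally, using the target server's processing power.
    tgt.execute("CREATE TABLE dim_policyholder(id INTEGER, holder TEXT)")
    tgt.execute("""INSERT INTO dim_policyholder
                   SELECT id, UPPER(TRIM(holder)) FROM stg_policies""")
    print(list(tgt.execute("SELECT * FROM dim_policyholder")))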
The source data may exist as paper data, Excel-based data or RDBMS storage, and its structure may be described at the conceptual data model level or the physical data model level. Depending upon the state of the data, both logical and physical, appropriate courses of action need to be taken.
Key activities:
Identify data source and their details
Run system extracts and queries
Conduct user interviews and awareness programs on data migration process
Review migration scope and validation strategy
Create work plan and milestone dates
Identify data cleansing needs and expectations
Create data prep worksheets
Clean up source data in current system
Format unstructured data in other systems
Run extracts and queries to determine data quality
Create metrics to capture data volume, peak hours and off-peak hours
Deliverables/Outputs
Migration scope document
Migration validation strategy document
Work plan with milestone dates
Data Cleansing
One of the main activities in this stage will be to clean the data to minimize data quality issues in the
data sources.
A data quality issue can be categorised into two types:
1. A semantic data quality issue, in which data does not comply with as-is business rules; or
2. A physical data quality issue, in which data does not comply with application rules, data definitions or validations in either the existing source systems or the new applications.
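A minimal sketch distinguishing the two categories above: a physical check (the value violates a data definition) and a semantic check (the value violates an as-is business rule). The field names and the example business rule are illustrative assumptions only.

    # Physical vs semantic data quality checks on a sample record.
    from datetime import date

    record = {"insurer_id": "", "premium": -500.0, "policy_start": date(2011, 4, 1)}

    def physical_issues(rec):
        issues = []
        if not rec["insurer_id"]:                      # violates a NOT NULL definition
            issues.append("insurer_id is missing")
        if not isinstance(rec["premium"], float):      # violates the declared data type
            issues.append("premium is not numeric")
        return issues

    def semantic_issues(rec):
        issues = []
        if rec["premium"] is not None and rec["premium"] < 0:   # business rule: premium >= 0
            issues.append("premium cannot be negative")
        return issues

    print("physical:", physical_issues(record))
    print("semantic:", semantic_issues(record))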
While there are synergies between migration and cleansing, a number of challenges may be faced, including the following:
Identification of semantic data quality issues requires in-depth knowledge of the business domain, which will require the involvement of IRDA business users.
Resolution of such quality issues requires subjective judgement which draws on business expertise.
To allow the business the greatest possible lead time, the IRDA Business Analytics Program data migration team will conduct analysis across the dataset requiring migration to identify, as early as possible in the life cycle, as many of the data quality issues that may impact the data migration process as possible.
As part of the migration process data from the source systems may be transformed to make it
compatible with the target application. The data transformation logic may be able to fix simple physical
data quality issues such as:
Key Activities:
Create/verify data element mappings
Run data extracts from current system(s)
Create tables, scripts, jobs to automate the extraction
Address additional data clean-up issues
Execute application specific customizations
Run mock migrations
Conduct internal data validation checks including business rules and referential integrity checks
Perform data validation
Prepare migration validation reports and data movement metrics
Review migration validation reports and metrics
Record count verifications on the new system
Reconcile or resolve any exceptions or unexpected variations
Sign off on migration validation
Deliverables/Outputs
Extracts from source system
Data migration modules, jobs, scripts
New application(s) loaded with converted data
Exceptions, alerts and error handling control points
Exception reports, cross-reference files/manuals
Signed-off data migration validation document
Key Activities
Complete data migration reports and cross-reference files/manuals
Data Archiving Strategy
Deliverables / Outputs
Data archiving strategy document
The executives or managers should form this group, which is responsible for the data content. Responsibilities include, but are not limited to, the following:
Data User
This group consists of users who use the computerized data for various official purposes. These users
may be internal or external to the organization. Responsibilities include, but are not limited to the
following:
Follow standards of acceptable use and compliance with the owner's controls
Maintain confidentiality of the data and report unauthorized activity
Data Custodian
Implementation of data storage safeguards and ensuring availability of the data is the primary
responsibility of this group. This group should provide support to the business user(s).
Responsibilities include, but are not limited to the following:
For details of marking, transmission, storage, restoration, and destruction procedures / guidelines for
information assets, please refer to Appendix G
There are several tools and techniques available for converting physical analog data to electronic data.
All the hardcopies of the document submitted as an attachment by the insurers and other users of the
system in IRDA can be converted into electronic formats using these techniques. In the context of IRDA,
two techniques are suggested:
Manual data entry - This is the conventional and most prevalent technique. It is applied by employing data entry operators who, on a regular basis for a defined period in the implementation schedule, will manually enter data from the hardcopies and physical files in IRDA in a format prescribed during the design stage; the data will finally be stored in the system databases.
Data digitization and scanning - Digitization is a method of converting the analog form of data to digital and electronic formats using advanced tools and technologies without the need for manual intervention. Scanning is a method of converting physical formats of data into electronic formats such as images, PDFs, etc. Data digitization and scanning in IRDA will involve the following processes:
o Identification of the items for the collection - Identify which data need to be converted into digital formats; for example, the actuarial report and abstract submitted by the insurer to the actuarial department.
o Choice of formats - Deciding upon the formats to which these data are going to be converted, for example *.doc, *.html, *.rtf, etc.
o Choice of hardware - Hardware infrastructure needed to convert this data to electronic format, for example scanners, OCRs (Optical Character Readers), computers, etc.
o Choice of software - Selection of the software supporting digitization, such as OCR and ICR.
o Storage and archiving - Calculating the storage space required for the digitized data.
o Management - Managing the activities so as to ensure optimal results. There are several factors to be kept in mind for achieving the desired digitization quality, such as the condition and colouring of the source documents, the font types and the persons handling the digitization.
The workflow of the manual data conversion activities to be carried out in IRDA (shown as a diagram in the source) is as follows:
1. Start: physical documents submitted by insurers and others are accumulated for digitalization and scanning.
2. Batches are identified.
3. The documents are digitalized/scanned.
4. The captured data is checked against the desired results. If it contains errors, the documents are re-scanned/re-digitized for better accuracy and the data undergoes thorough cleansing and editing; if it is error free, only minor cleansing and editing of the data is carried out to increase accuracy.
5. If the document was scanned, it is archived in the document management system, a back-up is taken, and the process ends. If the document was digitized, metadata is created, a back-up of the database is taken, and the process ends.
The success of the data migration project lies in seamless data movement, yet the migration effort always remains in the shadow of implementing the new system.3
Failure to treat data migration as a project unto itself. Data migration is a complex undertaking that should not be regarded as merely a peripheral effort to the main development project. The data migration effort should be treated as a complete sub-project with a defined process, a thoughtfully derived time and cost estimate, and a series of phases that can be tracked and managed.
Underestimating the time and cost of data migration. It is important to perform a reasonably diligent survey of the source systems in order to determine the quality of those source systems' documentation and source data. If the source system does not have up-to-date data documentation in the form of a data model and data dictionary, the task of determining the structure and data types of the desired source data and its mapping to the target data will increase in time and cost. If the source system has less stringent data quality requirements than the target system, or if the data quality of the source system has been allowed to lapse over time, the actual act of performing the migration will take longer due to the need to perform post-migration data clean-up.
Lack of end-state data quality. If the migration effort does not formally specify the level of end-state
data quality and the set of quality control tests that will be used to verify that data quality, the target
domain may wind up with poor data quality. This will negatively impact the perceived outcome of the
development effort.
Failure to secure organizational support. When the complexity and importance of data migration are not adequately appreciated, it may be difficult to gain organizational support for the data migration, especially in terms of funding and resources. It may be even more difficult to garner a positive level of support in separate organizations that have primary responsibility for the source data. This can happen when the organization supporting the source data feels threatened by the new system, or simply because the migration effort is not a top priority for that organization.
Lack of appreciation for the complexities of data mapping. The central effort of data migration is
understanding the source data and developing the mapping that allows the data in the source domain to
be accurately transformed and moved to the target domain. There are many factors affecting mapping that are easily overlooked:
Ensuring that the semantic sense of a given attribute is correctly mapped: the same datum may carry a different name in the source domain than in the target domain; conversely, the source and target domains may carry the same name for what is conceptually a different datum.
3 For details of risks related to the Business Analytics Project, please refer to the Risk Mitigation Strategy section of the implementation plan document.
These issues and subsequent impacts may manifest themselves in both quantitative and qualitative
ways:
The most important factors in mitigating the risks of data migration are to treat the data migration as a
project and to use a sound methodical process having the following KPIs:
Data Profiling - Gain a complete understanding of the content, structure, quality, and integrity of
the data of the source system.
Data Mapping - Develop an accurate set of data mapping specifications from the source system to
the target system.
Migration Approach and Architectural Considerations - Whether point-to-point migration or hub-
and-spoke migration, this needs to be evaluated and carefully articulated.
Development - Selecting a tool to automate the migration process and make it more scalable should be a high-priority item.
Quality Assurance - Conduct mock migrations, pilot migrations before the final migration run; this
will ensure that the migration process is robust and trusted.
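A minimal data profiling sketch in the spirit of the first KPI above: for each column of a source extract it reports the row count, the number of nulls and the number of distinct values, which is usually enough to flag candidate quality and mapping issues early. The sample rows are illustrative only.

    # Column-level profile of a source extract: counts, nulls and distinct values.
    from collections import Counter

    rows = [
        {"insurer_id": "L001", "state": "MH", "premium": 1200},
        {"insurer_id": "L001", "state": None, "premium": 800},
        {"insurer_id": "L002", "state": "KA", "premium": None},
    ]

    def profile(rows):
        report = {}
        for col in rows[0].keys():
            values = [r[col] for r in rows]
            non_null = [v for v in values if v is not None]
            report[col] = {
                "rows": len(values),
                "nulls": len(values) - len(non_null),
                "distinct": len(Counter(non_null)),
            }
        return report

    for col, stats in profile(rows).items():
        print(col, stats)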
1. Life Department
New Business (NB Fact) star schema. Dimensions: Time (Year, Half Year, Quarter, Month, Day), Insurer (Insurer ID, Name, Type, Date of Registration), Division (Division ID, Name), Premium Type (Premium Type ID, Premium Type), Channel Type (Channel ID, Channel Name), Product (Product Category, Category ID, Target Group, Product Name, UIN, Date of Approval, Average Term), Group (Group ID, Group Name), Geography (Classification, Country ID, Country Name, State ID, State Name, City Name, Pin Code), Line of Business (LOB ID, LOB Name, LOB Type) and Location (Location ID, Location Name); measures: New Business Metrics.
Input forms:
INPUT_LIFE_9 - New Business Data - Channel wise: captures the new business data against all channels at an overall level (Quarterly).
INPUT_LIFE_9.1 - New Business Data - Product and Channel Wise: captures the new business data for each channel and product (Yearly).
INPUT_LIFE_14 - Details of New Business Data for MI - Channel wise: captures data on new business against all the micro insurance channels along with the no. of total existing MI products (Quarterly).
INPUT_LIFE_14.1 - Details of New Business Data for MI - Product and Channel wise: captures data on new business for each MI product and channel type (Yearly).
INPUT_FnA_LIFE_NBS - New Business Data for Life Department: captures the new business data for life insurance products (Monthly).
Each data mart in this appendix is also accompanied by a form-versus-dimension matrix that marks, for each form, the applicable dimensions and the reporting frequency (Q = quarterly, Y = yearly, M = monthly, D = as and when).
Renewal Business (RNB Fact) star schema. Dimensions: Time, Insurer, Division, Premium Type, Channel Type, Product and Line of Business.
Input forms:
INPUT_LIFE_9.3 - Renewal Business Data - Product and Channel Wise: captures the renewal business data for each channel and product (Yearly).
INPUT_LIFE_9.4 - Renewal Business Data - Channel wise: captures the business data against all channels and sub-segments (rural, urban, semi-urban and metro) (Quarterly).
Life - Claims Fact star schema. Dimensions: Time, Insurer, Division, Premium Type, Channel Type, Product, Claim Details (Claim ID, Claim Type, Claim Description, Claimant Name), Geography and Line of Business; measures: Claims Metrics.
Input forms:
INPUT_LIFE_11(a) - Claims Data on Social Security Schemes: collects the state-wise claims information on social security schemes (Quarterly).
INPUT_LIFE_17(a) - Death Claims (Individual): captures the data on death claims for individual business (Quarterly).
INPUT_LIFE_17(b) - Death Claims (Group): captures the data on death claims for group business (Quarterly).
INPUT_LIFE_17 - Rider Claims Data: captures the data on rider claims for individual business (Quarterly).
INPUT_LIFE_18 - State wise Death Claims Movement Form: captures the claims data for each state for both individual and group business (Quarterly).
INPUT_LIFE_18.2 - Details of Claims Handled through TPAs: captures the movement of claims handled through TPAs (Quarterly).
INPUT_LIFE_19(a) - MI Claims Movement Form - Maturity: captures the claims data for MI business, maturity claims only (Quarterly).
INPUT_LIFE_19(b) - MI Claims Movement Form - Death Claims: captures the claims data for MI business, death claims only (Quarterly).
INPUT_LIFE_20 - Penal Interest Paid - Claims and Benefits Data: captures the data on penal interest paid to policyholders (Quarterly).
INPUT_LIFE_23 - Repudiated Claims Data: captures claim-wise details for each instance of repudiation (Quarterly).
INPUT_NONLIFE_Pending_Claims - Claims pending for more than six months and repudiated claims: captures claim-wise details for each instance of repudiation and pendency for more than 6 months (Quarterly).
Agency Statistics Fact star schema. Dimensions: Time, Insurer, Channel Type, Geography and Policy Slab (Slab ID, Slab Name).
Input forms:
INPUT_LIFE_1.3 - Agency Statistics Data - Slab Wise: captures the detailed breakup of the agents based on different policy slabs (Quarterly).
INPUT_LIFE_1(a) - Agency Statistics Data - State wise: captures the data on the no. of agents at the end of the quarter for each state (Quarterly).
INPUT_LIFE_1(c) - Micro Insurance Agency Statistics Data - State wise: collects the data on the no. of MI agents at the end of the quarter for each state (Quarterly).
Life - Office Fact star schema. Dimensions: Time, Insurer, Geography, Office Details and Location.
Input forms:
INPUT_LIFE_2 - New Office Application Form (Online): each insurer who seeks to open new offices has to fill up this form for approval by IRDA (As and when).
INPUT_NON_LIFE_Office_1 - OFFICE DETAILS (Branch): collects the information on the office details in each state for each insurer (Quarterly).
Other report mapped in the dimension matrix: Details of foreign offices (Non Life, Quarterly).
Life - Others Fact star schema (free look, cheque dishonour, advertisement, persistency and similar data). Dimensions: Time, Insurer, Division, Premium Type, Channel Type, Product, Line of Business, Case Details (Case Location, Case Source, Case ID) and Advertisement Details (URN, Date of Launch, Ad Type, Co-Insurer Name, Medium); measures: Other Facts.
Input form:
INPUT_LIFE_22 - Free Look and Cheque Dishonour Data: collects data on free look and cheque dishonour during the quarter (Quarterly).
Other reports mapped in the dimension matrix: Premium Awaited Policies (For the quarter) (Quarterly); Persistency Data (Quarterly).
Non Life - Business Data Facts star schema. Dimensions: Time, Insurer, Channel Type, Location, Geography and Line of Business.
Input forms:
INPUT_NON_LIFE_PREMIUM_1 - Segment wise Gross Premium data across all Channels - For the quarter: collects segment-wise and state-wise information on Gross Premium, No. of Policies and Total Sum Assured across channels of non-life general business (Quarterly).
INPUT_NON_LIFE_PREMIUM_1.1 - Segment wise Gross Premium data across all Channels - Upto the quarter: as above, cumulative up to the quarter (Quarterly).
INPUT_NON_LIFE_PREMIUM_2 - Segment wise direct business - Upto the quarter: collects segment-wise and state-wise information on Gross Premium, No. of Policies and Total Sum Assured for direct business (Quarterly).
INPUT_NON_LIFE_PREMIUM_2.1 - Segment wise direct business - For the quarter: as above, for the quarter (Quarterly).
INPUT_FnA_NONLIFE_NBS - New Business Data for Non Life Department: captures the new business data for non-life insurance products (Monthly).
Non Life - Product Performance Metrics star schema. Dimensions: Time, Insurer, Channel Type, Product and Line of Business.
Report mapped in the dimension matrix: Details of product performance for products with 1 year contract (Yearly).
Non Life - Claims Fact star schema. Dimensions: Time, Insurer, Channel Type, Geography, Claim Details and Line of Business.
Input forms:
INPUT_NON_LIFE_CLAIMS_1 - State wise and channel wise claims reported: collects information on the claims reported in each state during the quarter (Quarterly).
INPUT_NONLIFE_Pending_Claims - Claims pending for more than six months and repudiated claims: captures the details for each instance of a claim pending for more than six months or repudiated (Quarterly).
Non Life Micro Insurance Fact star schema. Dimensions: Time, Insurer, Geography and Cover Type (Cover Type ID, Cover Type Name).
Reports mapped in the dimension matrix: MICROINSURANCE STATISTICS - For the quarter (Quarterly); MICROINSURANCE STATISTICS - Upto the quarter (Quarterly).
Non Life - Office Data star schema. Dimensions: Time, Insurer, Geography, Office Details (Office Type, Office ID, Office Name, Office Address) and Location.
Reports mapped in the dimension matrix: OFFICE DETAILS (Quarterly); Details of foreign offices (Quarterly).
Non Life Reinsurance - treaty data star schema. Dimensions: Time, Insurer, Re-Insurer (Reinsurer ID, Name, Type, Date of Registration), Product, Treaty Details and Line of Business.
Input forms:
INPUT_NL_REINSURANCE_3 - Particulars of Excess of Loss Cover Treaty For the Year: captures the details of the excess of loss cover treaties filed by the insurer (Yearly).
INPUT_NL_REINSURANCE_3.1 - Performance of excess of loss cover treaty - To be furnished by Insurers: collects the performance data on the excess of loss cover treaty (Yearly).
Report mapped in the dimension matrix: List of reinsurance treaties during the year (Yearly).
Reinsurance Program Fact star schema. Dimensions: Time, Insurer, Re-Insurer, Premium Type, Product, Line of Business, Treaty Details (Treaty ID, Treaty Type, Treaty Name, Nature of Treaty, Basis of Treaty), Claim Details and Sub Class Details (Sub Class ID, Sub Class Name); measures: Reinsurance Program Metrics.
Input form:
INPUT_NL_REINSURANCE_11 - Detailed report on reinsurance program: shows the details of the reinsurance program (Yearly).
Reinsurance Business Data Fact star schema. Dimensions: Time, Insurer, Line of Business and Treaty Details; measures: Reinsurance Business Data Metrics.
Reports mapped in the dimension matrix: Reinsurance Statistics under Reg 3(12) - Business Within India (to be furnished by insurers) (Yearly); Reinsurance Statistics under Reg 3(12) - Foreign Business (to be furnished by insurers) (Yearly).
Reinsurance Recoveries Fact star schema. Dimensions: Time, Insurer, Re-Insurer, Treaty Details and Line of Business.
Reports mapped in the dimension matrix: Details of Outstanding Recoveries - to be furnished by the insurer (Yearly); Aging data of reinsurance recoverables (Quarterly).
Reinsurance Claims Fact star schema. Dimensions: Time, Insurer, Geography and Line of Business; measures: Reinsurance Claims Metrics.
Input forms:
INPUT_NL_REINSURANCE_12 - Claims data for reinsurance: captures the claims data related to reinsurance (Quarterly).
INPUT_NL_REINSURANCE_13 - Premium and Claims Data: captures the data for premiums and claims with detailed break-ups, finally calculating the claims ratio (Quarterly).
Reinsurance Others Fact star schema. Dimensions: Time, Insurer, Re-Insurer and Rating Agency (Rating Agency ID, Rating Agency Type); measures: Reinsurance Others Metrics.
Report mapped in the dimension matrix: Reinsurance Concentration (Yearly).
Health - new and renewal business star schema. Dimensions: Time, Insurer, Channel Type, Geography and Line of Business.
Input form:
INPUT_HEALTH_4.1 - Details of new business and renewal business - Statewise: captures the state-wise new business and renewal business activities for each insurer (Yearly).
Health - product performance and scheme data star schema. Dimensions: Channel Type, Line of Business, TPA (TPA ID, TPA Name, Date of Registration), Product and Claim Details.
Input form:
INPUT_HEALTH_6.4 - Performance of Universal Health Insurance Scheme (UHIS)/RSBY: captures the details of the performance of UHIS/RSBY for an insurer (Quarterly).
Report mapped in the dimension matrix: Details of product performance - products with 1 year or less than 1 year term (to be furnished by all insurers having health products) (Yearly).
Health - Claims Fact star schema. Dimensions: Time, Insurer, Division, TPA and Claim Details; measures: Claims Facts.
Input forms:
INPUT_HEALTH_6.3 - Details of Claims for an Insurer - Statewise: collects the information on the claims for an insurer (Yearly).
INPUT_HEALTH_12 - Claims Data for TPAs (to be furnished by TPAs): captures the claims data for TPAs, covering claims from policyholders, claims from hospitals and claims at an aggregate level (Monthly).
Report mapped in the dimension matrix: Details of Claims Handled directly - to be submitted by insurers having health business (Individual) (Monthly).
TPA Analysis Metrics star schema. Dimensions: Time, Insurer and TPA (TPA ID, TPA Name, Date of Registration).
Input form:
INPUT_HEALTH_8 - TPA Contract Details (to be furnished by TPAs): captures the data on the contracts of TPAs with insurers and hospitals/doctors, along with data on claims processed during the previous year (Yearly).
Financial Analysis of TPAs Facts star schema. Dimensions: Time and TPA.
Input forms:
INPUT_HEALTH_9 - Profit & Loss Statement for TPAs (to be furnished by TPAs): contains the profit & loss statement of the TPAs; the format is standard across all TPAs for easy consolidation of the financials (Yearly).
INPUT_HEALTH_10 - Profit & Loss Appropriation Format for TPAs (to be furnished by TPAs): captures the profit & loss appropriation for each TPA (Yearly).
INPUT_HEALTH_11 - Balance Sheet for TPAs (to be furnished by TPAs): represents the balance sheet items of the TPAs (Yearly).
Actuarial - IBNR star schema. Dimensions: Time, Insurer, Line of Business and Actuary (Actuary ID, Actuary Name).
Reports mapped in the dimension matrix: IBNR-A: Statement of IBNR Provision (Quarterly); IBNR-B1(b): Cumulative Statement of Incurred Claims Development (By Amount) (Quarterly).
Actuarial - business data star schema. Dimensions: Time, Insurer, Division, Geography, Premium Type, Product, Group and Line of Business.
Input form:
INPUT_ACTUARIAL_1 - Form DD - Form for New Business and Total In-Force Business Data / Funds Maintained by the insurer (Linked Business): captures the data on new business in the year and the total in-force business during the year (Yearly).
Other returns mapped in the dimension matrix: Form NLB 1 - Details of in-force policies at product level (Non-Linked Policies); Form KT-Q and Form KT-3 - Statement of available solvency margin and solvency ratio; Form IA; Form H; Valuation Bases; Return on Assets; Details of foreign operation; Components of global reserves (quarterly or yearly, as marked).
Actuarial Reinsurance Fact star schema. Dimensions: Time, Insurer, Re-Insurer, Product, Geography and Treaty Details; measures: Actuarial Reinsurance Metrics.
Report mapped in the dimension matrix: Form LR-1: List of reinsurance treaties for the year (Yearly).
Brokers - Business Data and Financial Analysis star schema. Dimensions: Time, Broker (Broker ID, Name, Category, Type, License No., Sub Class), Insurer, Product, Line of Business, Premium Type, Client Details (Client ID, Client Name) and Bank Details.
Input forms:
INPUT_BROKER_10 - Business Data for brokers: captures the new business data for a broker, insurer-wise and client-wise (Yearly).
INPUT_BROKER_10.1 - Business Data for brokers (Life Insurers): captures the new business data for brokers for life insurers (Quarterly).
INPUT_BROKER_10.2 - Business Data for brokers (Non Life Insurers): captures the new business data for brokers for non-life insurers (Quarterly).
Brokers - Claims Fact star schema. Dimensions: Time, Broker and Claim Details (Claim ID, Claim Type, Claimant Name); measures: Claims Metrics.
Input form:
INPUT_BROKER_12 - Claims Data: captures the details of the claims for a broker (Quarterly).
Brokers - Office Data star schema. Dimensions: Time, Broker, Geography and Office Details (Office Type, Office ID, Office Name, Office Address); measures: Office Data Metrics.
Report mapped in the dimension matrix: Particulars of branch and registered offices (Yearly).
Broker Financial Analysis Fact star schema. Dimensions: Time, Broker, Shareholder and Bank Details.
Input forms:
INPUT_BROKER_1 - Capital Structure and shareholders details for a broker: captures the details of the capital structure of a broker (Quarterly).
INPUT_BROKER_3 - Financial Statement for each broker - Profit and Loss Statement: captures the profit and loss statement details for a broker (Yearly).
INPUT_BROKER_4 - Balance Sheet of Brokers: captures balance sheet details for a broker (Yearly).
INPUT_BROKER_5 - Financial data for brokers: captures the financial data for a broker (Half Yearly).
INPUT_BROKER_11 - Insurance Bank Accounts of brokers: captures the details of the insurance bank accounts (Yearly).
INPUT_BROKER_15 - Fixed Deposit Details: captures the fixed deposit details for a broker (Yearly).
INPUT_BROKER_19 - Annual Fees Data: captures the details of annual fees paid, with details such as demand draft details, date of payment and payment status (Yearly).
Broker Org Struct Fact star schema. Dimensions: Time and Broker (including Name of Director/Person(s), Profession and Address).
Input form:
INPUT_BROKER_6 - Board of Directors and management details: captures the details of the persons on the board of directors and the management details (Yearly).
Broker Inspection Data Fact star schema. Dimensions: Time, Broker, Inspection Details and Bank Details.
Report mapped in the dimension matrix: Brokers Inspection Data (Monthly).
Broker Legal Data Fact star schema. Dimensions: Time, Broker, Case Details (Case ID, Case Status, Case Date) and Bank Details.
Report mapped in the dimension matrix: Brokers Legal Data (Quarterly).
Brokers - Others star schema (audit, licensing, reinsurance and security screening data). Dimensions: Time, Broker, Auditor Details, Reinsurance Details and Security Screening; measures: Other Metrics.
Input forms:
INPUT_BROKER_7 - Audit arrangements for a broker: captures the details of the audit arrangement for a broker (Yearly).
INPUT_BROKER_21 - Application for Grant of Licence/Renewal of Licence: each broker willing to register with IRDA for broking business is liable to fill up this form (As and when).
INPUT_BROKER_22 - Grant of License to the Brokers: this form is used for granting a license to a broker (As and when).
INPUT_BROKER_23 - Application for Duplicate Licence: this form is used for application for a duplicate license by the brokers (As and when).
Corporate Agents (CA) - Business Data and Financial Analysis star schema. Dimensions: Time, CA (CA ID, Name, Category, License No., Primary Profession), Insurer, Product, Line of Business, Premium Type, Client Details (Client ID, Client Name, Client Type) and Bank Details.
Input forms:
INPUT_CA_14 - New Business Data for corporate agents: captures the new business data for corporate agents, insurer-wise (Yearly).
INPUT_CA_14.1 - New Business Data for corporate agents (Insurer wise): captures the new business data for corporate agents, insurer-wise (Quarterly).
CA - Claims Fact star schema. Dimensions: Time, CA and Claim Details (Claim ID, Claim Type, Claimant Name); measures: Claims Metrics.
Input form:
INPUT_CA_13 - Claims Data: captures the details of the claims for corporate agents (Quarterly).
CA - Office Data star schema. Dimensions: Time, CA, Geography and Office Details (Office Type, Office ID, Office Name, Office Address); measures: Office Data Metrics.
Report mapped in the dimension matrix: Particulars of offices of corporate agents (Yearly).
Time
CA
1. Year ID
2. Half Year ID
s
3. Quarter ID
4. Month ID
1. CA ID 5. Day
2. CA Name CA Financial Analysis
3. CA Category Fact
4. License No.
5. Primary Profession 1/
2/3 1. CA ID
2. Quarter ID
3. Shareholder/Promotre/
Associate ID
CA Financial Analysis
Metrics
Share Holder/Promoter/
Associate
1. Shareholder/Promoter/
Associate ID
2. Shareholder/Promoter/
Associate Name
3. Shareholder/Promoter/
Associate Address
4. Shareholder/Promoter/
Associate Profession
Form No. | Form Name | Description | Frequency
Input_CA_10.0 | Income Statement | To capture financial data for inflow and outflow with respect to a corporate agent | Yearly
Input_CA_11.0 | Capital structure and Shareholder's Details of a corporate agent | To capture the details of the capital structure and shareholders for a corporate agent | Quarterly
Input_CA_11.1 | Income Data | To capture the income data for the corporate agents | Yearly
[Form-to-dimension mapping for the corporate agent financial forms above, covering the CA, Shareholder/Promoter and Time dimensions.]
[Data model - CA Organization Structure: CA Org Struct Fact (keys: CA ID, Quarter ID) and dimensions CA (CA ID, CA Name, CA Category, License No., Primary Profession 1/2/3, Name of Director/Person(s), Profession, Address) and Time (Year ID, Half Year ID, Quarter ID, Month ID, Day).]
Form No. | Form Name | Description | Frequency
Input_CA_7.0 | Board of Directors and management details | To capture the details of the board of directors / partners for a corporate agent and management details | Yearly
Input_CA_9.0 | Group Companies for a corporate agent | To capture the list of all group companies attached with a corporate agent | Yearly
[Form-to-dimension mapping for the corporate agent organization structure forms above, covering the CA and Time dimensions.]
[Data model - CA Licensing Data Capture: CA Licensing Data Capture Fact (keys include CA ID and Day) with licensing data capture metrics, and dimensions CA (CA ID, CA Name, CA Category, License No., Primary Profession 1/2/3) and Time (Year ID, Half Year ID, Quarter ID, Month ID, Day).]
Form No. | Form Name | Description | Frequency
INPUT_CA_1 | Application for a new license to act as a Corporate Agent | This form is used for application for a new corporate agent license | As and when required
INPUT_CA_2 | Application for a License / Renewal of License to act as a Corporate Agent | This form is used for application for a license/renewal of license to act as a corporate agent | As and when required
INPUT_CA_3 | License to Act as a specified person under the Insurance Act, 1938 (IV OF 1938) | This form is used for application from a firm or company for a certificate/renewal of certificate to act as a specified person | As and when required
INPUT_CA_4 | License to Act as a Corporate Agent under the Insurance Act, 1938 (IV OF 1938) | This form is used for issuing a license to act as a corporate agent | As and when required
INPUT_CA_5 | Data Capture for Corporate Agents | To capture details like name, address, contact no., photograph, date of commencement of employment, date of leaving the service (if any), and salary specified for a corporate agent | As and when required
INPUT_CA_6 | Application for Duplicate License | The form will be used for application of a duplicate license in case the original license gets destroyed, lost or mutilated | As and when required
[Form-to-dimension mapping: "Application for a new license to act as a Corporate Agent" against the CA and Time dimensions (frequency mark D).]
[Data model - CA Legal Data: fact keys CA ID/License No., Quarter ID, Case ID, with Other Metrics, and dimensions CA (CA ID, CA Name, CA Category, License No., Primary Profession 1/2/3), Time (Year ID, Half Year ID, Quarter ID, Month ID, Day), Others and Case Details (Case ID, Case Status, Case Date, Case Location, Case Source).]
Form No. | Form Name | Description | Frequency
Input_CA_16.0 | Legal Data for Corporate Agents | To capture the details of the no. of lawsuits against each corporate agent | Quarterly
[Form-to-dimension mapping: Legal Data for Corporate Agents (quarterly) against the CA, Case/Claim Details and Time dimensions.]
[Data model - Surveyors Data: Surveyor Fact (keys: Insurer ID, Day ID, License No., Business ID, City ID, Claim ID, UIN) with surveyor metrics, and dimensions Surveyor (License No., Surveyor Name, Surveyor Category, Qualification, Age, Date of Registration, Name of Persons/Directors), Insurer (Insurer ID, Insurer Name, Insurer Type, Date of Registration), Time (Year ID, Half Year ID, Quarter ID, Month ID, Day), Product (Product Category, Category ID, Target Group, Product Name, UIN, Date of approval, Average Term) and Geography.]
Form No. | Form Name | Description | Frequency
Form 1-AF | APPLICATION FOR A LICENCE TO ACT AS SURVEYOR AND LOSS ASSESSOR | To capture details of an applicant in the application process | As and when
FORM - IRDA 3A AF | Details of partners / directors for a surveying firm | To capture the details of partners / directors for a surveying firm | Yearly
Form - IRDA-2-AF | APPLICATION FROM A FIRM OR COMPANY FOR A LICENCE TO ACT AS A SURVEYOR AND LOSS ASSESSOR | To capture details of an applicant in the application process from a firm or a company | As and when
Form - IRDA-5-AF | APPLICATION FOR RENEWAL OF A LICENCE TO ACT AS SURVEYOR AND LOSS ASSESSOR | To capture the details of the surveyor for renewal of license | As and when
Form - 6A-AF | APPLICATION FROM A FIRM OR COMPANY FOR RENEWAL OF A LICENCE TO ACT AS A SURVEYOR AND LOSS ASSESSOR | To capture the details of the surveyor for renewal of license (in case of a firm or a company) | As and when
Form 3-AF | APPLICATION FROM A FIRM OR COMPANY FOR A LICENCE TO ACT AS A SURVEYOR AND LOSS ASSESSOR | To capture the details of a fresh application for a firm or a company | As and when
FORM - IRDA 9 | Application for Duplicate License | To capture application details for a duplicate license | As and when
FORM - IRDA 13 | Data capture format for capturing 3 years of data | To capture state wise information for each surveyor for claims and inspections data | Yearly
Form III | PRESCRIBED FORMAT FOR ENROLLMENT OF TRAINEE SURVEYORS & LOSS ASSESSORS FOR TRAINING | To capture the enrollment details for a trainee surveyor | As and when
Form IV | FORMAT FOR DAILY DIARY | To capture the quarterly data for a trainee surveyor having details of the training activities | Quarterly
[Form-to-dimension mapping for the surveyor forms above (dimensions include Surveyor, Claim Details, LoB, Type and Time): the licence applications, renewals, duplicate licence and trainee enrollment forms are marked D; the partner/director details form and the 3-year data capture format are marked Y (yearly); the daily diary format is marked Q (quarterly).]
Dimension | Attribute | Description
Premium Type | Premium Type ID | Unique ID for each of the premium types
Channel Type | Channel ID | Unique ID for identifying each of the different channel types
Channel Type | Channel Name | The name of the channel used for acquiring business
Office Details | Office ID | Unique ID for identifying each of the different offices
Office Details | Office Type | The type of the office
Ad Type
Co Insurer
Medium
Cover Type | Cover Type ID | Unique ID for identifying each of the different cover types
Cover Type | Cover Type Name | The name of the cover type used for acquiring business
Name of Persons/Directors
Generic Assumptions:
No. of divisions: 2
No. of groups: 2
No. of states: 30
No. of channels: 8
No. of Insurers=30
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
NB | 2880000.00 | 8.00 | 10.99 | 0.00 | 0.00 | 0.78 | 34.28 | 0.22 | 2.42 | 36.69
RNB | 48000.00 | 8.00 | 0.18 | 0.00 | 0.00 | 1.00 | 0.73 | 0.00 | 0.00 | 0.73
Claims Data | 1152000.00 | 20.00 | 10.99 | 0.00 | 0.00 | 1.00 | 43.95 | 0.00 | 0.00 | 43.95
Agency Stats | 36000.00 | 10.00 | 0.17 | 0.00 | 0.00 | 1.00 | 0.69 | 0.00 | 0.00 | 0.69
Office Data | 300000.00 | 10.00 | 1.43 | 1.00 | 17.17 | 0.00 | 0.00 | 0.00 | 0.00 | 17.17
Advertisement Data | 7200.00 | 4.00 | 0.01 | 1.00 | 0.16 | 0.00 | 0.00 | 0.00 | 0.00 | 0.16
Others | 3840.00 | 20.00 | 0.04 | 0.00 | 0.00 | 1.00 | 0.15 | 0.00 | 0.00 | 0.15
No. of Insurers = 30
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
NB Data | 288000.00 | 8.00 | 1.10 | 0.00 | 0.00 | 1.00 | 4.39 | 0.00 | 0.00 | 4.39
Product Performance Data | 60000.00 | 40.00 | 1.14 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 1.14 | 1.14
Claims Data | 48000.00 | 20.00 | 0.46 | 0.00 | 0.00 | 1.00 | 1.83 | 0.00 | 0.00 | 1.83
Non Life MI | 6000.00 | 10.00 | 0.03 | 0.00 | 0.00 | 1.00 | 0.11 | 0.00 | 0.00 | 0.11
Office Data | 36000.00 | 10.00 | 0.17 | 0.00 | 0.00 | 1.00 | 0.69 | 0.00 | 0.00 | 0.69
No. of Insurers = 30
No. of Reinsurers = 15
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
Treaty Wise Data | 11250000.00 | 10.00 | 53.64 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 53.64 | 53.64
Program Details | 2250000.00 | 5.00 | 5.36 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 5.36 | 5.36
Business Data | 15000.00 | 6.00 | 0.04 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 0.04 | 0.04
Recoveries Data | 225000.00 | 10.00 | 1.07 | 0.00 | 0.00 | 0.50 | 2.15 | 0.50 | 0.54 | 2.68
Claims Data | 9000.00 | 10.00 | 0.04 | 0.00 | 0.00 | 1.00 | 0.17 | 0.00 | 0.00 | 0.17
Others | 4500.00 | 10.00 | 0.02 | 0.00 | 0.00 | 1.00 | 0.09 | 0.00 | 0.00 | 0.09
No. of TPAs = 30
No. of Insurers = 30
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
Business Data | 72000.00 | 6.00 | 0.21 | 0.00 | 0.00 | 1.00 | 0.82 | 0.00 | 0.00 | 0.82
Product Performance Data | 1152000.00 | 10.00 | 5.49 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 5.49 | 5.49
Claims Data | 9000.00 | 10.00 | 0.04 | 0.60 | 0.31 | 0.00 | 0.00 | 0.40 | 0.02 | 0.33
TPA Data | 4500.00 | 10.00 | 0.02 | 0.33 | 0.08 | 0.00 | 0.00 | 0.67 | 0.01 | 0.10
TPA Financial Data | 27000.00 | 50.00 | 0.64 | 0.00 | 0.00 | 0.25 | 0.64 | 0.75 | 0.48 | 1.13
No. of Insurers = 30
No. of Reinsurers = 15
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
IBNR | 6000.00 | 10.00 | 0.03 | 0.00 | 0.00 | 1.00 | 0.11 | 0.00 | 0.00 | 0.11
Valuations | 1800000.00 | 10.00 | 8.58 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 8.58 | 8.58
Reinsurance | 2250000.00 | 10.00 | 10.73 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 10.73 | 10.73
No. of Brokers = 300
No. of Insurers = 30
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
Business Data | 4320000.00 | 8.00 | 16.48 | 0.00 | 0.00 | 0.33 | 21.75 | 0.67 | 11.04 | 32.79
Claims Data | 300000.00 | 4.00 | 0.57 | 0.00 | 0.00 | 1.00 | 2.29 | 0.00 | 0.00 | 2.29
Office Data | 150000.00 | 4.00 | 0.29 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 0.29 | 0.29
Financial Data | 300000.00 | 10.00 | 1.43 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 1.43 | 1.43
Organization Structure Data | 300.00 | 10.00 | 0.00 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 0.00 | 0.00
Inspection Data | 300.00 | 10.00 | 0.00 | 1.00 | 0.02 | 0.00 | 0.00 | 0.00 | 0.00 | 0.02
Legal Data | 15000.00 | 10.00 | 0.07 | 0.00 | 0.00 | 1.00 | 0.29 | 0.00 | 0.00 | 0.29
Others | 15000.00 | 10.00 | 0.07 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 0.07 | 0.07
No. of Insurers = 30
Subject Area | Master Data Combinations (Possible) | Approximate No. of Data Point Inputs | Total Row Size (GB) | % Monthly data | Monthly Data (GB) | % of Quarterly data | Quarterly Data (GB) | % of Yearly data | Yearly Data (GB) | Total Data (GB)
Financial Data | 6000.00 | 15.00 | 0.04 | 0.00 | 0.00 | 0.00 | 0.00 | 1.00 | 0.04 | 0.04
Shareholder's Data | 15000.00 | 10.00 | 0.07 | 0.00 | 0.00 | 1.00 | 0.29 | 0.00 | 0.00 | 0.29
NB Data (Life) | 3840.00 | 6.00 | 0.01 | 1.00 | 0.13 | 0.00 | 0.00 | 0.00 | 0.00 | 0.13
NB Data (Non Life) | 1200.00 | 6.00 | 0.00 | 1.00 | 0.04 | 0.00 | 0.00 | 0.00 | 0.00 | 0.04
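The row-size figures in the sizing tables above appear consistent with an assumption of roughly 512 bytes per data point and annualisation factors of 12, 4 and 1 for monthly, quarterly and yearly feeds; both the per-point size and the factors are inferred from the published numbers rather than stated explicitly in the document. A minimal sketch of the arithmetic under that assumption:

```python
# Sketch of the apparent data-sizing arithmetic behind the tables above.
# Assumptions (inferred, not stated in the document): ~512 bytes per data
# point, and annualisation factors of 12, 4 and 1 for monthly, quarterly
# and yearly feeds respectively.

BYTES_PER_DATA_POINT = 512          # inferred from the published row sizes
GB = 1024 ** 3

def annual_volume_gb(combinations, data_points,
                     pct_monthly=0.0, pct_quarterly=0.0, pct_yearly=0.0):
    """Return (row_size_gb, total_gb) for one subject area."""
    row_size_gb = combinations * data_points * BYTES_PER_DATA_POINT / GB
    total_gb = row_size_gb * (pct_monthly * 12 + pct_quarterly * 4 + pct_yearly * 1)
    return row_size_gb, total_gb

# Example: the life "NB" row (2,880,000 combinations, 8 data points, 78%
# quarterly and 22% yearly) yields roughly a 10.99 GB row size and about
# 36.7 GB per year, in line with the table.
print(annual_volume_gb(2_880_000, 8, pct_quarterly=0.78, pct_yearly=0.22))
```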
Approximate data size for the current year (including all the departments and functions) = 317 GB
Total data size (considering 8 years of history data at 8% growth) = 317*8/(1.08^7) = 1.48 TB
Indexing Factor = 8%
Assuming that an additional physical space of 70% of the total data size is consumed by logs, views, stored procedures, tables etc., the year-on-year estimation of the total space (in TB) is given below:
Estimate | Year 1 | Year 2 | Year 3 | Year 4 | Year 5 | Year 6 | Year 7 | Year 8
Aggressive Estimate (12% growth in data per year) | 2.7 | 3.0 | 3.4 | 3.8 | 4.2 | 4.7 | 5.3 | 5.9
Average Estimate (10% growth in data per year) | 2.7 | 3.0 | 3.3 | 3.6 | 3.9 | 4.3 | 4.8 | 5.2
Conservative Estimate (8% growth in data per year) | 2.7 | 2.9 | 3.1 | 3.4 | 3.7 | 4.0 | 4.3 | 4.6
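The projection above can be approximately reproduced with a short sketch; the growth rates come from the estimate rows, while the composition of the roughly 2.7 TB starting point (1.48 TB of history plus the 8% indexing factor and the 70% overhead) is an inference from the figures quoted above, and small differences arise from rounding in the source table.

```python
# Sketch of the year-on-year storage projection. The base figure is derived
# from the 1.48 TB history estimate, the 8% indexing factor and the assumed
# 70% overhead for logs, views, stored procedures and tables.

def storage_projection(base_tb: float, growth: float, years: int = 8):
    """Return projected storage sizes in TB, one value per year."""
    return [round(base_tb * (1 + growth) ** year, 1) for year in range(years)]

base = round(1.48 * 1.08 * 1.70, 1)   # ~2.7 TB in year one
for label, rate in (("Aggressive", 0.12), ("Average", 0.10), ("Conservative", 0.08)):
    print(label, storage_projection(base, rate))
```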
Assumptions:
Additional bandwidth requirements for email and other data-intensive operations will be another 1 Mbps.
An additional 0.5 Mbps should be kept as a reserve buffer to handle any surge of requests as well as special data-intensive processing in a one-off mode.
Hardware Specifications
It is recommended to design a hardware infrastructure that will not only address the present requirements of the IRDA application but will also take into account the future high-bandwidth and high-availability needs of the application stack. The hardware infrastructure should have:
High Availability: Application, web and database servers have been designed in failover and farm mode with the ability to ensure fool-proof operations
Separate Storage: Additional space requirements for IRDA in future will be ensured through a separate Storage Area Network (SAN) driven disk. The disk will also be mirrored to ensure data protection and integrity
Redundancy: Adequate processing and capacity redundancy has been built into the system to ensure zero to minimal disruption in the overall operations
The table below describes the various hardware components envisaged for the solution in the main data centre:
Component | Specification | Remarks
Firewall/Proxy Gateway | 2 processors with dual core each, or higher | Existing firewall and gateway may be used
All components of the proposed infrastructure should be configured in a failover mode. This will ensure no single point of failure in the system and high availability of the application for its end users. To optimize performance, load balancing should be implemented using a group of web servers. No antivirus software is envisaged within the facility, as protection will be handled through high-end firewall configurations; desktops will be protected using standard antivirus software.
Networking will be enabled by two virtualized ports configured in a failover mode. Dedicated VPN links
will be established between Development environment and Hosted Services environment for
development purposes.
[Deployment diagram: external users and the IRDA intranet reach the hosted environment over the Internet (online access) and a dedicated VPN, passing through a router, firewall and load balancers into the DMZ (public zone); the IRDA database cluster resides in the private zone.]
Application design will ensure that all data-intensive processing is restricted to back-end server processes within the data centre environment. The data centre will operate on a dedicated 10/100 Mbps LAN, ensuring processing integrity. All master data management facilities, application-intensive processing as well as transactions will be carried out over standard internet traffic from the field.
Civil Interiors
Around 250 square feet of facilities should be made available at a designated area at IRDA. The room should have four walls and a ceiling with appropriate clearance for putting up server racks. The following sections detail an indicative approach to the different facets of the data centre design, security and maintenance operations.
A 9-inch thick brick wall will be laid to ensure safety and compliance standards
All walls will be plastered
A ramp will be provided at the entrance. Laying and fabrication work of the ramp will be carried out according to the required slope angle specification.
Penetrations on all doors and seals will be treated with appropriate fire-stop material to ensure integrity
Anti-termite treatment will be provided initially as well as on a periodic basis
Joinery
The main fire door will be a double-winged, fire-rated 1.5 x 2.4 m double door with a 300 x 900 mm fire-rated glass vision panel
A 100 mm x 50 mm marandi/equivalent wooden frame will be provided with fire-check material at the top and bottom, along with minimum 6 mm thick, 1-hour fire-rated glass with beading, finished in approved colours, 1.5 mm thick
A full-height calcium silicate (fire rated minimum 1 hour) board partition will be provided for fire control. The calcium silicate board partition (non-asbestos), faced on both sides, will be 12 mm thick. Skirting, door junction and material transition sections are to be in hardwood frame 50 x, to provide 10 x 10 grooves at all ends. If any hardwood sections are provided, these are to be coated all over with 2 or more coats of viper anti-termite/fire-retardant paint. Inside voids are to be filled with 50 mm thick panels of glass wool/rock wool. All board edges are to be protected with aluminium angles
Access Flooring and Insulation Work
Access floor: adjustable floor panels of 450 +/- 20 mm height will be provided as under-floor construction
Self-adhesive 13 mm thick XLPE foam on the under-deck and floor, including metalized foil, complete with proper jointing, will be provided as insulation
False Ceiling
An Armstrong or equivalent make false ceiling will be provided, complete with openings for electrical light fittings, fire alarm detectors and nozzles
Painting and Epoxy Works
Plastic/acrylic light-textured emulsion paint will be applied on the server room's internal walls. Epoxy coating will be applied to the floor, and Duco paint will be provided for the fire door
Additionally, appropriate fire and security signage will be provided as well
The Data Centre will have an access control system with proximity card readers and appropriate security control procedures established. Standard features for such a system are outlined below:
1 Door Control Unit for access control, microprocessor based, with a tamper-protected wall-mount case
Low power 12 V DC power supply with internal battery backup for 4 hours of operation
Proximity card readers with 3-inch read range, capable of reading the facility code and unique card number from the card; the reader shall only read the card data and pass it on to the door controller for validation
ISO-thickness proximity cards (blank faced) with the option of printing directly on the card; each card has a facility code and a unique card number
Electromagnetic locks (600 lbs) with magnetic contact, UL listed for single leaf doors
Emergency door release (break glass type)
1:N authentication biometric fingerprint reader with inbuilt proximity reader
Access management software complete with graphical user interface, time & attendance software transactions, and anti-passback features, complete as required
Fire Detection System
Air Conditioning
A 3 TR capacity PeX135FA-100 precision AC, floor discharge type with R-407C refrigerant, with Ethernet connectivity and a sequential controller, will be provided with appropriate climate control features
Power Back-up
Liebert make 7400M 30/40 kVA UPS system with 3-phase input and 3-phase output, with accessories
Sealed maintenance-free battery to support approximately 30 minutes of back-up
Additionally, a standard public address system as well as a rodent and insect repellent system should also be installed in the data centre.
Since most of the services to be provided by the Data Center (DC) are highly critical to IRDA, the efficiency of DC operations is of high importance, and service levels should be defined to ensure it. The proposed technical architecture requires high availability; this should be achieved through a highly available design at the various IT infrastructure layers.
Along with the design, there is a need for strong IT infrastructure management processes and a comprehensive maintenance plan involving maintenance engineers, spares and backend support from OEMs for spare replenishment.
The key considerations for ensuring high efficiency of Data Center (DC) operations are discussed in the table below:
Computing, Storage and Application Environment: From an operational perspective, the system should provide enough availability to give comfort to applicants in terms of reliability and efficiency of the system. The service level for ensuring uptime should be 99.9 per cent, and the architecture should have no single point of failure.
Communication Network: The MPLS backbone and network should have assured uptime; therefore two separate links from two individual service providers should be used. The service level for ensuring uptime should be 99.9 per cent.
Information Security: Information security at the various layers, prohibiting possible security threats, should be ensured. Security should be ensured for the application data, network and physical infrastructure being set up under this project. Security threats from unknown networks integrating with IRDA, such as insurers, need special attention, and the design should handle such users in a different manner.
Maintainability: Adequate spares at the sites for all elements with a single point of failure, sufficient on-site manpower for failure resolution on site, and a back-to-back spare replenishment plan with minimum spares turnaround time from the product OEMs should be ensured.
Manageability: The latest tools for incident management (including help desk), problem management and asset management should be used, and appropriate processes should be defined based on the ITIL framework.
Application Tier
This is the innermost security tier, covering the security aspects of the applications running in the IRDA setup. This application tier should have the following security aspects:
Two-factor, or multi-factor, authentication is exactly what it sounds like. Instead of using only one type of authentication factor, such as only things a user KNOWS (login IDs, passwords, secret images, shared secrets, solicited personal information, etc.), two-factor authentication requires the addition of a second factor: something the user HAS or something the user IS. For IRDA there is a need for such multi-factor authentication, especially for sensitive data like product pricing.
Two-factor authentication is a commonly used concept. For example, two-factor authentication is used every time a bank customer visits their local ATM. One authentication factor is the physical ATM card the customer slides into the machine. The second factor is the PIN they enter.
Without both, authentication cannot take place. This scenario illustrates the basic parts of most multi-
factor authentication systems; the "something you have" + "something you know" concept.
Tokens
One form of 'something you have' is the smart card or USB token. Differences between the smart card and the USB token are diminishing; both technologies include a microcontroller, an OS, a security application, and a secured storage area.
Virtual Tokens
Virtual tokens are a new concept in multi-factor authentication first introduced in 2005 by security
company, Sestus. Virtual tokens reduce the costs normally associated with implementation and
maintenance of multi-factor solutions by utilizing the user's existing internet device as the "something
the user has" factor. Also, since the user's internet device is communicating directly with the
authenticating website, the solution does not suffer from man-in-the-middle attacks and other forms of
online fraud.
Biometrics
Biometric authentication also satisfies the regulatory definition of true multi-factor authentication.
Users may biometrically authenticate via their fingerprint, voiceprint, or iris scan using provided
hardware and then enter a PIN or password in order to open the credential vault. However, while this type of authentication is suitable in limited applications, this solution may become unacceptably slow and comparatively expensive when a large number of users is involved.
For many biometric identifiers, the actual biometric information is rendered into string or mathematic
information. The device scans the physical characteristic, extracts critical information, and then stores
the result as a string of data. Comparison is therefore made between two data strings, and if there is
sufficient commonality a pass is achieved. It may be appreciated that choice of how much data to
match, and to what degree of accuracy, governs the accuracy/speed ratio of the biometric device. All
biometric devices, therefore, do not provide unambiguous guarantees of identity, but rather
probabilities and all may provide false positive and negative outputs. If a biometric system is applied to a
large number of users - perhaps all of the customers of a bank, the error rate may make the system
impractical to use.
Biometric information can be mechanically copied and cannot easily be changed. This is perceived as a key disadvantage since, if discovered, the compromised data cannot be changed: a user can easily change his or her password, but a user cannot change their fingerprint. A bio-identifier can also be faked. For example, fingerprints can be captured on sticky tape and false gelatine copies made, or simple photos of eye retinas can be presented. More expensive biometric sensors should be capable of distinguishing between a live original and a dead replica, but such devices are not practical for mass distribution. It is likely that, as biometric identifiers become widespread, more sophisticated compromise techniques will also be developed.
Historically, fingerprints have been used as the most authoritative method of authentication. Other
biometric methods such as retinal scans are promising, but have shown themselves to be easily
spoofable in practice. Hybrid or two-tiered authentication methods offer a compelling solution, such as
private keys encrypted by fingerprint inside of a USB device.
SMS One time password uses information sent as an SMS to the user as part of the login process. One
scenario is where a user either registers (or updates) their contact information on a website. During this
time the user is also asked to enter his or her regularly used telephone numbers (home, mobile, work,
etc). The next time the user logs in to the website, they must enter their username and password; if they
enter the correct information, the user then chooses the phone number at which they can be contacted
immediately from their previously registered phone numbers. The user will be instantly called or receive
an SMS text message with a unique, temporary PIN code. The user then enters this code into the
website to prove their identity, and if the PIN code entered is correct, the user will be granted access to
their account. This process provides an extra layer of online security beyond merely a username and
password. These solutions can be used with any telephone, not just mobile devices.
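The flow above can be sketched as follows (illustrative only; the SMS gateway call, the PIN length and the validity window are assumptions for the example, not requirements stated here):

```python
# Minimal sketch of the SMS one-time-password flow described above. The
# send_sms gateway call is hypothetical; only the PIN generation and
# verification logic is shown.
import hmac
import secrets
import time

def issue_otp():
    """Generate a short-lived numeric PIN to be sent to the registered phone."""
    pin = f"{secrets.randbelow(1_000_000):06d}"   # 6-digit temporary PIN
    return pin, time.time() + 300                  # assumed 5-minute validity

def verify_otp(submitted, issued_pin, expires_at):
    """Constant-time comparison of the submitted PIN, with an expiry check."""
    return time.time() <= expires_at and hmac.compare_digest(submitted, issued_pin)

pin, expiry = issue_otp()
# send_sms(registered_number, pin)   # hypothetical SMS gateway call
print(verify_otp(pin, pin, expiry))  # True when the user echoes the PIN in time
```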
A USB token has a different form factor; it cannot fit in a wallet, but can easily be attached to a key ring. A
USB port is standard equipment on today's computers, and USB tokens generally have a much larger
storage capacity for logon credentials than smart cards. As with smart cards, magnetic card readers, and
mobile signature methods, they are costly to deploy and support, are vulnerable to numerous forms of
theft and fraud, and have been resisted by consumers.
Digital Certificates
Digital Client certificates are PKI solutions for enabling the enhanced user identification and access
controls needed to protect sensitive online information. Digital certificates can also be stored and
transported on smart cards or USB tokens for use when travelling. Each certificate can only be used to authenticate one particular user, because only that user's computer has the corresponding unique private key needed to complete the authentication process. Client certificates are delivered
electronically; however, deployment and support of digital certificates have proven problematic. In a
2008 study published by the Credit Union Journal, digital certificates were noted as averaging very high
support costs and very low rates of user acceptance due to difficult technical implementation
requirements.
Converting the paper-based data capture process to online data submission will require advanced data authentication mechanisms like the Digital Signature. Quite a few returns submitted by insurers require physical sign-off by various stakeholders like the CFO, the Auditor, the Appointed Actuary etc. This section focuses on various aspects of the Digital Signature for utilizing it appropriately at IRDA.
The Information Technology Act, 2000 provides for the use of Digital Signatures on documents submitted in electronic form in order to ensure the security and authenticity of the documents filed electronically. This is the only secure and authentic way in which a document can be submitted electronically.
A Digital Signature is used in electronic filing (e-filing), which is a method of filing signed documents in an electronic format rather than a traditional paper format. Parties convert their documents into the file format designated by the authority and file their documents via email or over the Internet.
A scenario comparison between the Digital Signature and traditional authentication is given below:
Getting a Private and Public Key: In order to electronically sign documents with standard digital signatures, the sender needs to obtain a Private and a Public Key, a one-time setup/operation. The Private Key, as the name implies, is not shared and is used only by the signer to sign documents. The Public Key is openly available and used by those that need to validate the signer's digital signature.
Initiate the signing process: Depending on the software used, the sender needs to initiate the signing process (e.g. clicking a Sign button on the software's toolbar).
Create a digital signature: A unique digital fingerprint of the document (sometimes called a Message Digest or Document Hash) is created using a mathematical algorithm (such as SHA-1). Even the slightest difference between two documents would create a different digital fingerprint of the document.
Append the signature to the document: The hash result and the user's digital certificate (which includes his Public Key) are combined into a digital signature (by using the user's Private Key to encrypt the document hash), and the signature is appended to the document.
Based on the specific data confidentiality requirement at IRDA, the following process flow is proposed
for acquiring Digital Signature:
Based on the specific data confidentiality requirement at IRDA, the following process flow is proposed
for validating Digital Signature:
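The sign-and-verify steps described above can be illustrated with a small sketch using the third-party Python cryptography package (an assumption made here for illustration only; the document does not prescribe a library, and SHA-256 is used in place of the SHA-1 example above):

```python
# Illustrative sketch of the sign/verify steps described above, using the
# third-party 'cryptography' package. IRDA's actual PKI, certificate issuance
# and return formats are not shown here.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# One-time setup: the signer obtains a private/public key pair.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"Quarterly return submitted by the insurer"   # placeholder content

# Create the digital signature: the document hash is computed and signed
# with the signer's private key (SHA-256 here instead of SHA-1).
signature = private_key.sign(document, padding.PKCS1v15(), hashes.SHA256())

# Validation: anyone holding the public key (e.g. taken from the signer's
# digital certificate) can verify that the document has not been altered.
public_key.verify(signature, document, padding.PKCS1v15(), hashes.SHA256())
print("Signature verified")   # verify() raises InvalidSignature on tampering
```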
Within IRDA, roles are created for various job functions. The permissions to perform certain operations
are assigned to specific roles. Members of staff (or other system users) are assigned particular roles, and
through those role assignments acquire the permissions to perform particular system functions. Unlike
context-based access control (CBAC), RBAC does not look at the message context (such as a connection's
source).
Since users are not assigned permissions directly, but only acquire them through their role (or roles),
management of individual user rights becomes a matter of simply assigning appropriate roles to the
user; this simplifies common operations, such as adding a user, or changing a user's department.
RBAC differs from the access control lists (ACLs) used in traditional discretionary access control systems in that it assigns permissions to specific operations with meaning in the organization, rather than to low-level data objects. For example, an access control list could be used to grant or deny write access to a particular system file, but it would not dictate how that file could be changed. In an RBAC-based system, an operation might be to create a 'credit account' transaction in a financial application or to populate a 'blood sugar level test' record in a medical application. The assignment of permission to perform a particular operation is meaningful, because the operations are granular and carry meaning within the organization.
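As a minimal illustration of such role-based checks (role, user and permission names below are invented for the example and do not reflect IRDA's actual role model):

```python
# Minimal RBAC sketch: permissions are attached to roles, and users acquire
# permissions only through their roles. Names are invented for illustration.
ROLE_PERMISSIONS = {
    "nodal_officer": {"approve_broker_registration", "modify_broker_details"},
    "grievance_officer": {"register_complaint", "update_complaint_status"},
    "analyst": {"view_reports"},
}

USER_ROLES = {
    "alice": {"nodal_officer", "analyst"},
    "bob": {"analyst"},
}

def is_authorized(user: str, operation: str) -> bool:
    """A user may perform an operation only if one of their roles grants it."""
    return any(operation in ROLE_PERMISSIONS.get(role, set())
               for role in USER_ROLES.get(user, set()))

print(is_authorized("alice", "approve_broker_registration"))  # True
print(is_authorized("bob", "approve_broker_registration"))    # False
```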
An audit log is a chronological sequence of audit records, each of which contains evidence directly pertaining to and resulting from the execution of a business process or system function.
Audit trails provide track, trace and reporting capabilities for any changes associated with any data in the system, and for any data in third-party or customized software resident in the solution.
Transport Layer Security (TLS) and its predecessor, Secure Sockets Layer (SSL), are cryptographic
protocols that provide security for communications over networks such as the Internet. TLS and SSL
encrypt the segments of network connections at the Transport Layer end-to-end.
In applications design, TLS is usually implemented on top of any of the Transport Layer protocols,
encapsulating the application specific protocols such as HTTP, FTP, SMTP, NNTP, and XMPP. Historically
it has been used primarily with reliable transport protocols such as the Transmission Control Protocol
(TCP). However, it has also been implemented with datagram-oriented transport protocols, such as the
User Datagram Protocol (UDP) and the Datagram Congestion Control Protocol (DCCP), usage which has
been standardized independently using the term Datagram Transport Layer Security (DTLS).
A prominent use of TLS is for securing World Wide Web traffic carried by HTTP to form HTTPS. Notable
applications are electronic commerce and asset management. Increasingly, the Simple Mail Transfer
Protocol (SMTP) is also protected by TLS (RFC 3207). These applications use public key certificates to
verify the identity of endpoints.
An increasing number of client and server products support TLS natively, but many still lack support. As
an alternative, users may wish to use standalone TLS products like Stunnel. Wrappers such as Stunnel
rely on being able to obtain a TLS connection immediately, by simply connecting to a separate port
reserved for the purpose.
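A generic sketch of TLS layered over a TCP connection, using Python's standard ssl module (the host name below is a placeholder, not an IRDA endpoint):

```python
# Generic sketch: wrapping a TCP connection in TLS with Python's standard
# ssl module. The host name is a placeholder, not an actual IRDA endpoint.
import socket
import ssl

context = ssl.create_default_context()   # default CA bundle, certificate checks on

with socket.create_connection(("example.org", 443)) as raw_sock:
    # TLS sits on top of the reliable TCP transport and verifies the
    # server's public key certificate against the trusted CAs.
    with context.wrap_socket(raw_sock, server_hostname="example.org") as tls_sock:
        print(tls_sock.version())              # e.g. 'TLSv1.3'
        print(tls_sock.getpeercert()["subject"])
```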
TLS can also be used to tunnel an entire network stack to create a VPN, as is the case with OpenVPN.
Many vendors now marry TLS's encryption and authentication capabilities with authorization. When
compared against traditional IPsec VPN technologies, TLS has some inherent advantages in firewall and
NAT traversal that make it easier to administer for large remote-access populations.
It is a branch of technology known as information security as applied to computers and networks. The
objective of system security includes protection of information and property from theft, corruption, or
natural disaster, while allowing the information and property to remain accessible and productive to its
intended users. The term system security means the collective processes and mechanisms by which
sensitive and valuable information and services are protected from publication, tampering or collapse by
unauthorized activities or untrustworthy individuals and unplanned events.
That condition of a system wherein its mandated operational and technical parameters are
within the prescribed limits.
The quality of an Automated Information System when it performs its intended function in an
unimpaired manner, free from deliberate or inadvertent unauthorized manipulation of the
system.
The state that exists when there is complete assurance that under all conditions an IT system is
based on the logical correctness and reliability of the operating system, the logical completeness
of the hardware and software that implement the protection mechanisms, and data integrity.
Data integrity is a term used in computer science and telecommunications that can mean ensuring data
is "whole" or complete, the condition in which data is identically maintained during any operation (such
as transfer, storage or retrieval), the preservation of data for their intended use, or, relative to specified
operations, the a priori expectation of data quality. Put simply, data integrity is the assurance that data
is consistent and correct.
In terms of a database, data integrity refers to the process of ensuring that the database remains an accurate reflection of the universe of discourse it is modelling or representing. In other words, there is a close correspondence between the facts stored in the database and the real world it models.
Identification is the process by which the identity of a user is established, and authentication is the
process by which a service confirms the claim of a user to use a specific identity by the use of credentials
(usually a password or a certificate).
Identification is an assertion of who someone is or what something is. If a person makes the statement
"Hello, my name is John Doe." they are making a claim of who they are. However, their claim may or
may not be true. Before John Doe can be granted access to protected information it will be necessary to
verify that the person claiming to be John Doe really is John Doe.
Authentication is the act of verifying a claim of identity. When John Doe goes into a bank to make a withdrawal, he tells the bank teller he is John Doe (a claim of identity). The bank teller asks to see a photo ID, so he hands the teller his driver's license. The bank teller checks the license to make sure it has John Doe printed on it and compares the photograph on the license against the person claiming to be John Doe.
There are three different types of information that can be used for authentication: something you know,
something you have, or something you are. Examples of something you know include such things as a
PIN, a password, or your mother's maiden name. Examples of something you have include a driver's
license or a magnetic swipe card. Something you are refers to biometrics. Examples of biometrics
include palm prints, finger prints, voice prints and retina (eye) scans.
Using strong passwords lowers overall risk of a security breach, but strong passwords do not replace the
need for other effective security controls. The effectiveness of a password of a given strength is strongly
determined by the design and implementation of the authentication system software, particularly how
frequently password guesses can be tested by an attacker and how securely information on user
passwords is stored and transmitted.
Access to protected information must be restricted to people who are authorized to access the
information. The computer programs, and in many cases the computers that process the information,
must also be authorized. This requires that mechanisms be in place to control the access to protected
information. The sophistication of the access control mechanisms should be in parity with the value of
the information being protected - the more sensitive or valuable the information the stronger the
control mechanisms need to be. The foundation on which access control mechanisms are built starts with identification and authentication.
Role-based access control (RBAC) is an approach to restricting system access to authorized users. It is a
newer alternative approach to mandatory access control (MAC) and discretionary access control (DAC).
RBAC is sometimes referred to as role-based security.
RBAC is a policy neutral and flexible access control technology sufficiently powerful to simulate DAC and
MAC.
With the concepts of role hierarchy and constraints, one can control RBAC to create or simulate lattice-
based access control (LBAC). Thus RBAC can be considered a superset of LBAC.
Audit trail or audit log is a chronological sequence of audit records, each of which contains evidence
directly pertaining to and resulting from the execution of a business process or system function.
Audit records typically result from activities such as transactions or communications by individual
people, systems, accounts or other entities. The process that creates the audit trail should always run in a privileged mode, so that it can access and supervise all actions from all users, and a normal user cannot stop or change it. Furthermore, for the same reason, the trail file or database table holding the trail should not be accessible to normal users.
Accountability uses such system components as audit trails (records) and logs to associate a subject with its actions. The information recorded should be sufficient to map the subject to a controlling user. Audit trails and logs are important for detecting security violations and for re-creating security incidents.
Many systems can generate automated reports based on certain predefined criteria or thresholds, known as clipping levels. For example, a clipping level may be set to generate a report when a user exceeds a defined number of failed logon attempts.
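A small sketch of such a clipping-level check over the audit trail (the threshold and the record format are invented for the example):

```python
# Illustrative clipping-level check: report users whose failed logon attempts
# exceed a predefined threshold. Threshold and log format are invented here.
from collections import Counter

CLIPPING_LEVEL = 3   # maximum tolerated failed logons per user per period

def exceeded_clipping_level(audit_records):
    """audit_records: iterable of (user, event) tuples taken from the audit trail."""
    failures = Counter(user for user, event in audit_records if event == "LOGON_FAILED")
    return {user: count for user, count in failures.items() if count > CLIPPING_LEVEL}

trail = [("alice", "LOGON_FAILED")] * 5 + [("bob", "LOGON_FAILED")]
print(exceeded_clipping_level(trail))   # {'alice': 5}
```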
When organizations think of backup and recovery, it is usually associated with protecting information residing on a server. It is important in IRDA's context to remember, however, that this constitutes both data protection and system recovery.
When a server operating system fails, it can take eight or more hours (days, in some instances) to
rebuild and restore the server. This process includes reinstalling the OS, applications, patches,
configuring settings, etc. Moreover, there are no guarantees that the server will be in the exact same
state as before the failure took place.
There is also the matter of having to replace the server hardware. Few organizations can afford the
luxury of maintaining extra server hardware in case they need to replace an existing system. This
introduces the issue of restoring a system to a new and dissimilar piece of hardware, while trying to
preserve the integrity of the system state and the availability of the data. Organizations must ensure
that their system backup/recovery solutions provide hardware-independent restoration.
By deploying both data protection and system recovery solutions, organizations of any size can realize
the benefits of shorter backup times, faster system recovery, and reduced data loss.
For this purpose it is recommended to use a Disaster Recovery Site for IRDA to protect the data and applications, considering the sensitivity and criticality of the data that IRDA deals with.
Network Tier
Network security consists of the provisions made in an underlying computer network infrastructure,
policies adopted by the network administrator to protect the network and the network-accessible
resources from unauthorized access, and consistent and continuous monitoring and measurement of its
effectiveness (or lack) combined together.
Network security starts from authenticating the user, commonly with a username and a password. Since
this requires just one thing besides the user name, i.e. the password which is something you 'know', this
is sometimes termed one factor authentication. With two factor authentication something you 'have' is
also used (e.g. a security token or 'dongle', an ATM card, or your mobile phone), or with three factor
authentication something you 'are' is also used (e.g. a fingerprint or retinal scan).
Once authenticated, a firewall enforces access policies such as what services are allowed to be accessed
by the network users. Though effective to prevent unauthorized access, this component may fail to
check potentially harmful content such as computer worms or Trojans being transmitted over the
network. Anti-virus software or an intrusion prevention system (IPS) helps to detect and inhibit the
action of such malware. An anomaly-based intrusion detection system may also monitor the network
and traffic for unexpected (i.e. suspicious) content or behaviour and other anomalies to protect
resources, e.g. from denial of service attacks or an employee accessing files at strange times. Individual
events occurring on the network may be logged for audit purposes and for later high level analysis.
It is recommended to have the following network security measures for the IRDA system.
Firewalls
In order to provide some level of separation between IRDA's intranet and the Internet, firewalls would have to be employed. A firewall is simply a group of components that collectively form a barrier between two networks.
Bastion host
A general-purpose computer used to control access between the internal (private) network (intranet)
and the Internet (or any other un-trusted network). Typically, these are hosts running a flavour of the
Unix operating system that has been customized in order to reduce its functionality to only what is
necessary in order to support its functions. Many of the general-purpose features have been turned off,
and in many cases, completely removed, in order to improve the security of the machine.
Router
A special purpose computer for connecting networks together. Routers also handle certain functions,
such as routing, or managing the traffic on the networks they connect.
Many routers now have the ability to selectively perform their duties, based on a number of facts about a packet that comes to them. This includes things like the origination address, the destination address, the destination service port, and so on.
The DMZ is a critical part of a firewall: it is a network that is neither part of the untrusted network, nor
part of the trusted network. But, this is a network that connects the untrusted to the trusted. The
importance of a DMZ is tremendous: someone who breaks into your network from the Internet should
have to get through several layers in order to successfully do so. Those layers are provided by various
components within the DMZ.
Proxy
This is the process of having one host act on behalf of another. A host that has the ability to fetch documents from the Internet might be configured as a proxy server, and hosts on the intranet might be configured to be proxy clients. In this situation, when a host on the intranet wishes to fetch IRDA's web page, for example, the browser will make a connection to the proxy server and request the given URL. The proxy server will fetch the document and return the result to the client. In this way, all hosts on the intranet are able to access resources on the Internet without having the ability to talk directly to the Internet.
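A sketch of such a proxy client configuration, using Python's standard urllib (the proxy address and the URL are placeholders invented for the example):

```python
# Sketch of a proxy client: intranet hosts fetch Internet resources only via
# the proxy server. The proxy address and URL below are placeholders.
import urllib.request

proxy = urllib.request.ProxyHandler({
    "http": "http://proxy.intranet.local:3128",    # placeholder proxy address
    "https": "http://proxy.intranet.local:3128",
})
opener = urllib.request.build_opener(proxy)

# The request never talks to the Internet directly; the proxy server fetches
# the document and returns the result to the client.
with opener.open("http://example.org/") as response:
    print(response.status)
```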
The following section outlines the various application components that are used to administer security.
This strategy includes the decisions IRDA has made regarding the processing options settings for the
security objects.
1) Sign-On Security
In addition, the following global password settings will be configured, in accordance with IRDA's requirements.
Handling guidelines for Internal Information (recovered from the classification matrix):
Marking: prominently printed on the page or displayed on the screen, form or presentation media; in the case of electronic documents, use footers for marking.
Sharing: Internal Information is shared with third parties under a contractual agreement. Electronic information: authenticate the recipient prior to transmission, using at least a password (in the case of FTP). Non-electronic information: use appropriate packaging to conceal the contents and seal it.
Backup: Internal Information is backed up to facilitate recovery as per approved backup procedures / guidelines. Backup media should be kept in a fireproof safe.
Modification: for electronic information, modification is restricted to the information owner or a party authorized by the information owner.
Disposal: for electronic information, delete the file from the storage media using typical system delete commands; if on the hard disk, delete the files before using the disk for a different purpose, or degauss the hard disk; if on CD / floppy, ...
Objective:
The Content Management System (CMS) is frequently used for storing, controlling, versioning, and publishing industry-specific documentation such as news articles, operators' manuals, technical manuals, sales guides, and marketing brochures. The content managed may include computer files, image media, audio files, video files, electronic documents, and Web content. These concepts represent integrated and interdependent layers.
Functionality of CMS:
User credential verification - This is the security-based access method, ensuring that only users with proper access rights can access the appropriate documents and that data security is maintained.
Adding of page content - This functionality is used for adding new pages to the portal. Once a user logs into the system, he can add new content based on his credentials.
Modifying page content - The modifying functionality applies to the existing content in the portal. If any user wants to modify the content, he logs in to the system using his user ID and password and, depending upon his credentials, he can modify the content.
Deleting page content - This functionality is used for deleting content already present in the portal. The user logs in to the system using his user ID and password. The system verifies his credentials and, depending upon his access rights, he can delete the content from the portal. In case the user does not have deletion rights, the system displays a message.
Search engine - This functionality is used for searching the content using keywords.
Objective:
The main objective of Broker Online Filing is to provide an online facility for registering new brokers, modifying existing brokers' information and filing of returns by brokers.
Registration of brokers with IRDA - New would-be brokers are required to fill up certain forms to register themselves with IRDA. This facility is available online in this portal, where the form required for registration is available. A would-be broker will fill up the form, and the data will be automatically stored in the database. This data will be visible to the IRDA nodal officer, who will approve or disapprove the registration on the basis of his offline findings about the data.
Modification and removal of broker details - This functionality is available to the IRDA nodal officer, who can modify or remove the information on brokers whenever necessary. Once the officer receives any communication from a broker regarding modification of details, he can go to this portal and change the information as needed. He can also remove the information about a broker in case he finds reason to do so.
Filing of returns - Brokers file their returns on a monthly basis, and this functionality of the portal helps them file the returns online. The returns can be submitted using a pre-defined template. The data submitted will be captured and stored in a database and used for generating consolidated reports for audit and internal consumption.
Once the returns are submitted, the broker can also view the summary status report under the following categories:
Resubmission request
Returns Filed
The broker can drill down to the reports falling in each category.
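The capture of a filed return into the database can be sketched as follows (illustrative only; the template columns, table name and file name are invented for the example, since the actual return template is defined by IRDA):

```python
# Illustrative sketch of loading a broker's monthly return, submitted via a
# pre-defined CSV template, into a database. Column and table names invented.
import csv
import sqlite3

conn = sqlite3.connect("broker_returns.db")
conn.execute("""CREATE TABLE IF NOT EXISTS broker_return
                (license_no TEXT, period TEXT, premium REAL, status TEXT)""")

def file_return(csv_path: str) -> None:
    """Parse the submitted template and store each row with a filing status."""
    with open(csv_path, newline="") as f:
        rows = [(r["license_no"], r["period"], float(r["premium"]), "Returns Filed")
                for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO broker_return VALUES (?, ?, ?, ?)", rows)
    conn.commit()

# file_return("broker_123_return_apr2010.csv")   # hypothetical submission
```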
Objective:
The objective of Grievance Management is to monitor policyholders' grievances and track them for speedy resolution. The grievance management system tracks new complaints and forwarded complaints, updates the status of complaints, reminds the insurers, and generates customized reports on the basis of the complaints.
Registering new complaints - Once IRDA receives any complaint from a policyholder, it registers the complaint after some verification. The newly registered complaint is given a grievance number and a status. The grievance number is unique and is used to track the complaint for future reference.
Tracking new complaints - IRDA tracks the new complaints along with some basic information such as the date of receipt and the status of the complaint. The portal is capable of showing the complaints for each insurer as well as across all insurers.
Tracking forwarded complaints - This functionality of the portal helps the Grievance Department track forwarded complaints, so that if there is any forwarded complaint which has not been taken care of by the insurer for long, they can send a reminder to the insurer.
Reminder - The system is capable of tracking the complaints which have not been taken care of by the insurers and for which a reminder has been sent to them.
Update status - The status of a complaint is changed as per the follow-up. This status helps to track the current scenario of the complaint.
Customized reports - On the basis of the complaints registered, the system is capable of generating customized reports for each insurer and on an overall basis. Such reports are generated annually and reviewed by IRDA for assessing the complaints status for each insurer and for the Indian insurance scenario.
Customized letter formats - There are three types of letter formats available:
Acknowledgement letter
Closed letter
Objective:
This module is designed to track the details of advertisements for each insurer. The module also tracks those advertisements that are released by intermediaries. The insurers maintain a separate register for tracking advertisements, so the information captured in the portal helps IRDA to inspect the data maintained by the insurers.
Functionality of Advertisement:
Registering a new advertisement - Once there is a new advertisement for any insurer, the portal generates a unique insurer-wise advertisement reference number to track the advertisement, and it also sends an automatic e-mail to the insurer and the Officer-in-Charge (OIC).
Tracking advertisements - The insurers have to send the details of the advertisements released to IRDA in hardcopy format. On receipt of the details, IRDA enters the following details about the advertisement:
Date of receipt
Observations
Whether there is any advertisement which has been launched by the insurer but for which details have not been provided
If there is any non-compliance in the details of the advertisement, then a notification is sent to the insurer.
Objective:
This module is designed to track the details of the Third Party Administrators (TPAs). Third Party Administrators are engaged by the insurers for a fee / remuneration.
Functionality:
Modification and removal of details of surveyors - Once a registered surveyor communicates with the Officer-in-Charge (OIC) at IRDA about some changes in the details, the Officer-in-Charge can change the details of the TPA as required and update the data in the system. If there is anything which dictates the removal of a TPA, then the OIC can also remove the TPA using the removal functionality.
Querying of data - The portal is capable of querying the data on the basis of the requirements. Once any user runs a particular query, the portal will fetch the relevant data from the database and display it in a structured, predefined format. This feature of the portal helps to generate dynamic reports on the basis of the data on surveyors.
Objective:
This module is designed to track the statistics on new business for life & non-life business. The compliance officer of an insurer submits the value of the new business statistics for every month to the Officer-in-Charge (OIC) at IRDA.
Online submission of statistics - The insurers submit the new business statistics values on a monthly basis to the Officer-in-Charge at IRDA. Once they submit the values of the new business statistics, the data is stored in the database so that the OIC can access the data and validate it on the basis of the returns submitted by the insurers.
The returns are classified into individual, group, urban, rural and social sector. The portal also captures information on riders.
Security-enabled access feature - Each insurer is assigned a user ID and a password so that they can submit their new business statistics values but cannot view the details of other insurers. This function helps to maintain data security.
Customized reporting - The customized reporting capability helps IRDA to consolidate the returns submitted by the insurers and generate reports for each insurer and also for the Indian insurance industry. The reports are published on the website and in IRDA journals after obtaining the requisite approval.
Auto-mailing feature - The auto-mailing feature sends automated e-mails to the insurers in case there is any delay in the submission of the new business statistics values.
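A sketch of such an automated reminder (the addresses, SMTP host and the list of overdue insurers are placeholders invented for the example):

```python
# Illustrative sketch of the auto-mailing feature: send a reminder e-mail to
# an insurer whose monthly new business statistics are overdue. Addresses and
# the SMTP host are placeholders for the example.
import smtplib
from email.message import EmailMessage

def send_reminder(insurer_email: str, period: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = f"Reminder: new business statistics pending for {period}"
    msg["From"] = "oic@irda.example"              # placeholder sender address
    msg["To"] = insurer_email
    msg.set_content("Your monthly new business statistics have not been received. "
                    "Please submit them through the portal at the earliest.")
    with smtplib.SMTP("mail.irda.example") as smtp:   # placeholder SMTP host
        smtp.send_message(msg)

# for insurer in overdue_insurers:                # hypothetical overdue list
#     send_reminder(insurer.email, "April 2010")
```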
The following data was gathered for the different systems, based on a questionnaire session organized to capture the details of the various systems.
Parameters | Details
System / Application objective and functionality (optional): Grievances Management for both Life and Non-Life
Pain points and drawbacks in the system currently: Some attributes / classifications not available

Parameters | Details
Database size: 5 - 10 GB
Pain points and drawbacks in the system currently: Slow response time; reports take a lot of time

Parameters | Details
System / Application objective and functionality (optional): Tracking of inward mails / office notes

5. MIS
Parameters | Details
System / Application objective and functionality (optional): Collecting regulatory returns from insurers and generating analysis
Technology used: VB 6
How is the report generated out of this particular source system: Crystal Reports / SQL Reporting Services
Pain points and drawbacks in the system currently: Reports do not tally with the actual reports submitted by insurers

6. ATI Database
Parameters | Details
System / Application objective and functionality (optional): Capturing new application details of ATIs (Online / Off-line)