Telecommun Syst (2010) 45: 139–152
DOI 10.1007/s11235-009-9248-8

Information system security compliance to FISMA standard: a quantitative measure

Elaine Hulitt · Rayford B. Vaughn

Published online: 30 December 2009
© Springer Science+Business Media, LLC 2009
Abstract  To ensure that safeguards are implemented to protect against a majority of known threats, industry leaders are requiring information processing systems to comply with security standards. The National Institute of Standards and Technology Federal Information Risk Management Framework (RMF) and the associated suite of guidance documents describe the minimum security requirements (controls) for non-national-security federal information systems mandated by the Federal Information Security Management Act (FISMA), enacted into law on December 17, 2002, as Title III of the E-Government Act of 2002. The subjective compliance assessment approach described in the RMF guidance, though thorough and repeatable, lacks the clarity of a standard quantitative metric to describe for an information system the level of compliance with the FISMA-required standard. Given subjective RMF assessment data, this article suggests the use of Pathfinder networks to generate a quantitative metric suitable to measure, manage, and track the status of information system compliance with FISMA.

Keywords  Risk analysis · Secure architecture modeling · Standards compliance modeling · Pathfinder networks · RMF · FISMA

E. Hulitt
U.S. Army Engineer Research and Development Center, CEERD-IS, 3909 Halls Ferry Road, Vicksburg, MS 39180-6199, USA
e-mail: [email protected]

R.B. Vaughn
Department of Computer Science and Engineering, Center for Computer Security Research, Mississippi State University, P.O. Box 9637, Starkville, MS 39762, USA
e-mail: [email protected]

1 Introduction

Continuously changing information system configurations and attack methods make information system risk management using traditional methods a formidable task. Traditional qualitative approaches usually lack sufficient measurable detail upon which to base confident, cost-effective mitigation decisions. Quantitative approaches are burdened by the requirements to collect an abundance of detailed asset value and historical incident data and to apply complex calculations to measure the data precisely with limited people resources.

The managed risk approach is recognized as the best way to achieve good information security. Risk is the likelihood and cost of a threat exploiting an existing vulnerability. A threat is the circumstance (thing) that has the potential to cause loss or harm to assets. A vulnerability is a weakness in security that may allow a threat to cause loss or harm to assets.

To ensure that safeguards are implemented to protect against¹ a majority of known threats, industry leaders are requiring that information processing systems comply with specific security standards. The Federal Information Security Management Act (FISMA), enacted into law on December 17, 2002, as Title III of the E-Government Act of 2002 [28], defined three security objectives for federal government information systems: (1) Confidentiality, to preserve authorized restrictions on access and disclosure, with means for protecting personal privacy and proprietary information; (2) Integrity, to guard against improper information modification or destruction while ensuring information nonrepudiation and authenticity; and (3) Availability, to ensure timely and reliable access to and use of information [30].

¹ Approved for public release; distribution is unlimited.

Fig. 1 Risk management framework (from [18])

To achieve these security objectives, FISMA tasked the National Institute of Standards and Technology (NIST) to develop a set of standards and guidelines, the Federal Information Risk Management Framework (RMF) (Fig. 1), that (1) describe categories for information systems according to risk levels (low, moderate, high), (2) identify types of information systems to be included in each category, and (3) describe a minimum set of security requirements (controls) that must be applied to systems in each category to achieve adequate security [16, 20, 28]. Adequate security is defined by Office of Management and Budget (OMB) Circular A-130 as security commensurate with the risk and magnitude of harm resulting from the loss, misuse, or unauthorized access to or modification of information [29]. FISMA also requires an annual assessment of information system compliance with the required standard [15]. With approximately 100 security controls in the low-impact category to over 300 security controls in the high-impact category, the subjective compliance assessment approach described in the RMF guidance, though thorough and repeatable, lacks the clarity of a standard quantitative metric to describe for an information system the level of compliance with the standard. Given the review process outlined by NIST RMF documents, the challenge is to provide a quantitative risk analysis metric adequate to (1) clearly describe the status of compliance with the FISMA-required standard, (2) track progress toward compliance with the FISMA-required standard, (3) direct the allocation of resources required to meet FISMA minimum requirements, and (4) simplify annual report preparation. The authors propose generating a quantitative risk analysis metric at the information system level, using Pathfinder networks (PFNETs), to measure, manage, and track the status of system security compliance with the FISMA-required standard.

2 Quantitative risk management

Fig. 2 Calculate annualized loss expectancy

Annualized Loss Expectancy (ALE) is the value traditionally used to express monetary loss for assets associated with an identified risk occurring within a 1-year period [1, 19, 22]. ALE is calculated as shown in Fig. 2, where SLE, Single Loss Expectancy, is the estimated amount of money that would be lost should the identified risk occur, and ARO, Annualized Rate of Occurrence, is the frequency at which the risk is expected to occur within 1 year [1, 19, 22]. Calculation of SLE requires determining a dollar value for tangible and intangible assets associated with an identified risk [19, 22]. For tangible assets, the cost to repair, purchase, install, configure, replace, or recover should be considered [19, 22]. The calculation of SLE also includes determining the value of the following intangible information assets:

Availability: For an e-commerce site, how much money would be lost if the site is unavailable to customers for a given period of time? For a critical business system, how much money would be lost paying employees who are unable to work because of system unavailability [19, 22]?

Integrity: Depending on the nature of the failure and the recovery options available, the accuracy of recovered data may be in question or the recovery of data in its entirety may not be possible. What would be the monetary impact of inaccurate or incomplete data on business [19]?

Confidentiality: What would be the monetary impact of the revelation of confidential or secret data on a public or government entity [19]?

Organizations are reluctant to report security failures. Therefore, it is not likely that there is sufficient historical incident data to accurately determine ARO. There is also the virtually impossible task of determining the precise value of intangible information assets for calculating SLE [6, 19].
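For a concrete reading of the Fig. 2 calculation, the short sketch below multiplies a single loss expectancy by an annualized rate of occurrence; the dollar figure and frequency are hypothetical illustrations, not data from the article.

```python
# Sketch of the traditional ALE calculation shown in Fig. 2; the SLE and ARO
# values below are hypothetical examples, not figures from the article.
def annualized_loss_expectancy(sle: float, aro: float) -> float:
    """ALE = SLE * ARO: single loss expectancy times annualized rate of occurrence."""
    return sle * aro

# e.g., an incident estimated to cost $40,000 per occurrence, expected twice a year
print(annualized_loss_expectancy(sle=40_000.0, aro=2.0))   # 80000.0
```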
Fig. 3 Risk assessment matrix (from GAO [27])

3 Qualitative risk management

Qualitative risk analysis approaches are characterized by the application of ordinal risk measures in a matrix format to describe risk severity or extent of asset loss given the occurrence of an identified threat scenario [7, 19, 27]. The matrix rankings are based on data gathered via interview and/or questionnaire from knowledgeable stakeholders [7, 19, 27]. Figure 3, from the United States General Accounting Office (GAO) [27], ranks asset loss severity and probability of risk occurrence for identified scenarios.

Reference [27] defines the scenario severity levels in Fig. 3 as follows:

Category I: Death, loss of critical proprietary information, system disruption, or severe environmental damage
Category II: Severe injury, loss of proprietary information, severe occupational illness, or major system or environmental damage
Category III: Minor injury, minor occupational illness, or minor system or environmental damage
Category IV: Less than minor injury, occupational illness, or less than minor system or environmental damage

Reference [27] defines the scenario probability levels in Fig. 3 as follows:

Category A, Frequent – Possibility of repeated incidents
Category B, Probable – Possibility of isolated incidents
Category C, Occasional – Possibility of occurring sometime
Category D, Remote – Not likely to occur
Category E, Improbable – Practically impossible

Risks 1 through 4 in Fig. 3 are predetermined categories describing an organization's policy regarding which risks are unacceptable and require corrective action and which risks are acceptable.

Typically, the strategy defined when this approach is taken is that the highest risk exposures (darkest shaded areas) require prompt attention, the moderate risk exposures (lightly shaded areas) require plans for corrective action, and the lowest risk exposures (unshaded areas) can be accepted [19]. For risk identified as requiring prompt attention, the assessment team may use some combination of qualitative and quantitative methods to prepare cost estimates to provide evidence as to what course of action should be taken [7, 27]. Qualitative risk analysis approaches tend to be less complex than the traditional quantitative approach [19]. However, the quality of the results will be more dependent on the experience, expertise, and judgment of the team identifying and evaluating the threat scenarios [27].

4 The RMF

4.1 Purpose

The RMF, shown in Fig. 1, describes the steps and related standards and guidelines for implementing the minimum set of controls required to provide adequate security for an information system and the associated information stored, processed, and transmitted by that system. The framework includes guidance for assuring that controls are properly implemented and operating as intended to provide the expected security benefit. The RMF emphasizes the idea that risk management is a continuous process [6, 20].
4.2 Federal information processing standard 199

Federal Information Processing Standard (FIPS) 199 [16] addresses the first two FISMA mandates, the definition of information system categories according to risk level and the identification of system types to include in each category. FIPS 199 defines three categories for information systems considering the potential impact to organizations and individuals should a breach of confidentiality, integrity, or availability occur: (1) Low, limited adverse effect, (2) Moderate, serious adverse effect, and (3) High, severe or catastrophic adverse effect. FIPS 199 applies to all federal information systems except those designated as national security as defined in 44 United States Code Section 3542(b)(2).

4.3 FIPS 200

FIPS 200 [17] addresses the third FISMA mandate, to develop minimum information security requirements (controls) for information systems in each category as defined by FIPS 199. FIPS 200 went into effect when published, March 2006. Federal agencies are required to be in compliance with the standard no later than 1 year from its effective date. There is no provision under FISMA for waivers to FIPS 200.

4.4 FISMA-required system controls

As required by FIPS 200, NIST Special Publication (SP) 800-53, Recommended Security Controls for Federal Information Systems [18], defines the security controls and provides guidelines for selecting the appropriate set to satisfy the minimum requirement for adequate security given a system category of low, moderate, or high impact. The control sets described in FIPS 200 cover 17 security-related areas (families). As illustrated in Table 1, the 17 security control families are organized into three classes – management, operational, and technical – to facilitate the selection and specification of controls when evaluating an information system. Two-character identifiers are assigned to each control family. A number is appended to the family identifier to uniquely identify controls within each family. Appendix D of SP 800-53 identifies three minimum sets (baselines) of security controls that correspond to the low-, moderate-, and high-impact information system categories defined in FIPS 199. Appendix F of SP 800-53 provides a detailed description of each security control and numbered enhancements for each control where applicable. As illustrated in Table 2, controls in the Access Control family not used in a particular baseline are marked Not Selected. The numbers in parentheses following the control identifiers indicate the control enhancement that applies. The baselines are intended to be broadly applicable starting points and may require modification to achieve adequate risk mitigation for a given system [18]. Given the repeatable review process outlined by the NIST RMF documents, the challenge is to provide a quantitative risk analysis metric adequate to (1) clearly describe the status of compliance with the FISMA-required standard, (2) track progress toward compliance with the FISMA-required standard, (3) direct the allocation of resources required to meet FISMA minimum requirements, and (4) simplify annual report preparation. The authors propose generating a quantitative risk analysis metric at the information system level, using PFNETs, to measure, manage, and track the status of system security compliance with the FISMA-required standard.

Table 1 Security control classes, families, and identifiers (from [17])

ID  Family                                                   Class
AC  Access control                                           Technical
AT  Awareness and training                                   Operational
AU  Audit and accountability                                  Technical
CA  Certification, accreditation, and security assessments    Management
CM  Configuration management                                  Operational
CP  Contingency planning                                      Operational
IA  Identification and authentication                         Technical
IR  Incident response                                         Operational
MA  Maintenance                                               Operational
MP  Media protection                                          Operational
PE  Physical and environmental protection                     Operational
PL  Planning                                                  Management
PS  Personnel security                                        Operational
RA  Risk assessment                                           Management
SA  System and services acquisition                           Management
SC  System and communications protection                      Technical
SI  System and information integrity                          Operational
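As a small illustration of how the baseline excerpt in Table 2 (below) might be consumed programmatically, the following sketch encodes four Access Control rows and filters them by impact level. It is a hypothetical representation for illustration only, not part of SP 800-53 or any RMF tooling; the row values are copied from Table 2, and "Not selected" entries are stored as None.

```python
# Hypothetical sketch of selecting baseline controls by impact level.
# The rows below are copied from Table 2; "Not selected" entries are None.
BASELINE_EXCERPT = {
    "AC-2":  {"low": "AC-2", "mod": "AC-2(1)(2)(3)(4)", "high": "AC-2(1)(2)(3)(4)"},
    "AC-4":  {"low": None,   "mod": "AC-4",             "high": "AC-4"},
    "AC-5":  {"low": None,   "mod": "AC-5",             "high": "AC-5"},
    "AC-10": {"low": None,   "mod": None,               "high": "AC-10"},
}

def selected_controls(impact):
    """Return the control specifications selected for one impact level,
    skipping rows marked 'Not selected'."""
    return [row[impact] for row in BASELINE_EXCERPT.values() if row[impact]]

print(selected_controls("mod"))    # ['AC-2(1)(2)(3)(4)', 'AC-4', 'AC-5']
```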

Table 2 Excerpt from security control baselines (from [18])

CNTL No.  Control name                          Control baselines: Low / Mod / High

AC-1 Access control policy and procedures AC-1 AC-1 AC-1


AC-2 Account management AC-2 AC-2(1)(2)(3)(4) AC-2(1)(2)(3)(4)
AC-3 Access enforcement AC-3 AC-3(1) AC-3(1)
AC-4 Information flow enforcement Not selected AC-4 AC-4
AC-5 Separation of duties Not selected AC-5 AC-5
AC-6 Least privilege Not selected AC-6 AC-6
AC-7 Unsuccessful login attempts AC-7 AC-7 AC-7
AC-8 System use notification AC-8 AC-8 AC-8
AC-9 Previous logon notification Not selected Not selected Not selected
AC-10 Concurrent session control Not selected Not selected AC-10
AC-11 Session lock Not selected AC-11 AC-11
AC-12 Session termination Not selected AC-12 AC-12(1)
AC-13 Supervision and review – access control AC-13 AC-13(1) AC-13(1)
AC-14 Permitted actions without identification or authentication AC-14 AC-14(1) AC-14(1)
AC-15 Automated marking Not selected Not selected AC-15
AC-16 Automated labeling Not selected Not selected Not selected
AC-17 Remote access AC-17 AC-17(1)(2)(3)(4) AC-17(1)(2)(3)(4)
AC-18 Wireless access restrictions AC-18 AC-18(1) AC-18(1)(2)
AC-19 Access control for portable and mobile devices Not selected AC-19 AC-19
AC-20 Use of external information systems AC-20 AC-20(1) AC-20(1)

5 Compliance measurement using PFNETs

PFNETs are the result of an effort by Dearholt and Schvaneveldt [5] to develop network models for proximity data [24]. Proximity refers to the measure of relationship (similarity, relatedness, dissimilarity, distance, etc.) between two entities [5]. In networks, proximity measures are represented by distance, with small values representing similarity or a high level of relatedness, and large values representing dissimilarity or a low level of relatedness [5]. Given a dissimilarity matrix resulting from the subjective categorization (mapping) of entities as defined by Dearholt and Schvaneveldt [5], application of the Pathfinder algorithm generates a unique quantitative network representation of the proximity data. Any change in the subjective categorization of entities – in the case of risk analysis, vulnerabilities to threats – changes the resulting network. Our research indicates that the Pathfinder technique may be suitable for generating quantitative network models of information security standard controls – more accurately, the lack thereof – and information system security controls for comparison using a correlation coefficient (cc) formula to determine the status of information system compliance with a specified standard (%compliant). Successful applications of PFNETs include the discovery of salient links between documents to facilitate three-dimensional virtual reality modeling of document relationships [3], author co-citation analysis to reveal salient linkages between groups of related authors to produce interactive author maps in real time [14], and the requirements phase of software development projects to determine stakeholder (users, sponsors, project managers, and developers) understanding/misunderstanding of specified requirements [11]. Following is an example of the steps necessary to build and compare PFNETs [11]:

1. Correlate entities (e.g., vulnerabilities to threats) in an n × n matrix. Stakeholder A's perception of the relationship between entities E1, E2, E3, and E4 is represented in Table 3. Stakeholder B's perception of the relationship between the same entities is represented in Table 4.

   Table 3 Stakeholder A entity correlation

        E1  E2  E3  E4
   E1       X   X
   E2   X       X   X
   E3   X   X       X
   E4       X   X

   Table 4 Stakeholder B entity correlation

        E1  E2  E3  E4
   E1           X   X
   E2           X   X
   E3   X   X       X
   E4   X   X   X

2. Build entity co-occurrence groups from entity correlations. Assuming symmetric relationships, the following entity correlation groups result:

   Stakeholder A: (E1, E2, E3), (E2, E3, E4), (E3, E4)
   Stakeholder B: (E1, E3, E4), (E2, E3, E4), (E3, E4)

3. Build similarity matrix from co-occurrence groups. Tables 5 and 6 are the similarity matrices for the entity correlations of Stakeholder A and Stakeholder B, respectively. The number at entity intersections in the similarity matrices for Stakeholder A and Stakeholder B indicates the number of times the entities co-occur as grouped by each stakeholder [11, 12].

   Table 5 Stakeholder A similarity matrix

        E1  E2  E3  E4
   E1       1   1   0
   E2   1       2   1
   E3   1   2       2
   E4   0   1   2

   Table 6 Stakeholder B similarity matrix

        E1  E2  E3  E4
   E1       0   1   1
   E2   0       1   1
   E3   1   1       3
   E4   1   1   3

4. Build dissimilarity matrix from similarity matrix. Tables 7 and 8 are the dissimilarity matrices for Stakeholder A and Stakeholder B, respectively. For a given similarity matrix, dissimilarity matrix entries are generated by subtracting each similarity matrix co-occurrence count entry from the maximum co-occurrence count entry plus one (to avoid zero dissimilarity matrix entries) [11, 12]. The maximum co-occurrence count is 2 for Stakeholder A and 3 for Stakeholder B.

   Table 7 Stakeholder A dissimilarity matrix

        E1  E2  E3  E4
   E1       2   2   3
   E2   2       1   2
   E3   2   1       1
   E4   3   2   1

   Table 8 Stakeholder B dissimilarity matrix

        E1  E2  E3  E4
   E1       4   3   3
   E2   4       3   3
   E3   3   3       1
   E4   3   3   1

5. Apply Pathfinder algorithm to dissimilarity matrix to build PFNET. The Pathfinder procedure shown in Fig. 4, representing each entity as a node, uses dissimilarity matrix entity link weights to generate a network [5, 11, 12]. A path will exist between node pair (i, j) in PFNET(r, q) if and only if there is no shorter alternate path between (i, j), where r is the Minkowski r-metric calculation of path weight, for paths with number of links ≤ q. The distance between two nodes not directly linked is computed using the Minkowski r-metric. For path P with weights w_1, w_2, ..., w_k, the Minkowski distance is [5, 11]

      w(P) = ( Σ_{i=1}^{k} w_i^r )^{1/r},  where r ≥ 1 and w_i ≥ 0 for all i.   (1)

   When r = 1, path weight is calculated by summing the link weights along the path [5, 11]. Calculating path weight this way assumes ratio-scale data where each weight value is presumed to be within a multiplicative constant of the correct value [5]. When link values are obtained from empirical data, computing path weight this way may not be justifiable [23]. For generating PFNETs, where only the ordinal relationships between link weights and path weights are important, r should be set to ∞ [5]. When r = ∞, the path weight is the same as the maximum weight associated with any link along the path [5, 11]. Weight matrices W^1, W^2, ..., W^q contain shortest paths of length 1, 2, ..., q, respectively, calculated using the Minkowski formula. D^1 contains shortest paths of length 1. D^2 contains shortest paths of length 1 or 2. D^q contains shortest paths of length 1, 2, ..., q. A PFNET is generated for Stakeholder A (Fig. 5) and another for Stakeholder B (Fig. 6). The PFNET(r = ∞, q = n - 1) generated by the Pathfinder procedures is guaranteed to be connected, to retain the shortest paths having q or fewer links without violation of the triangle inequality, and to always have the minimum number of edges [5]. Given a symmetric weight matrix (dissimilarity matrix) and values for r and q, the Pathfinder procedures generate a unique network [5, 11].

6. Build minimum distance matrix from PFNET. The minimum distance matrices contain the shortest path distances between any two nodes of the PFNET. These path distances are calculated the traditional way, by adding link weights along paths between nodes. Tables 9 and 10 are the minimum distance matrices for the PFNETs in Figs. 5 and 6, respectively.

Fig. 4 Pathfinder algorithm (from [5])

Fig. 5 Stakeholder A PFNET

Fig. 6 Stakeholder B PFNET

   Table 9 Stakeholder A minimum distance matrix

        E1  E2  E3  E4
   E1       2   2   3
   E2   2       1   2
   E3   2   1       1
   E4   3   2   1

   Table 10 Stakeholder B minimum distance matrix

        E1  E2  E3  E4
   E1       6   3   3
   E2   6       3   3
   E3   3   3       1
   E4   3   3   1

7. Use a cc formula to determine the degree of covariance (similarity) between the two models. Quantitatively measure the similarity between two perceptions of the relationship between the same set of data entities. The minimum distance vector for Stakeholder A is (2 2 3 1 2 1). The minimum distance vector for Stakeholder B is (6 3 3 3 3 1). To compare Stakeholder A's perception with Stakeholder B's perception of how E1, E2, E3, and E4 are related, the minimum distance vectors are used in the computation of a cc value for the two PFNETs using this formula [11, 12, 21]:

      cc = Σ(a - ā)(b - b̄) / √( Σ(a - ā)² · Σ(b - b̄)² )   (2)

   where a is the value of an element in the minimum distance vector for Stakeholder A, ā is the mean of all the elements in the minimum distance vector (upper or lower triangular values), b is the value of a corresponding element in the distance vector for Stakeholder B, and b̄ is the mean of all elements in the Stakeholder B distance vector. The overall cc for the PFNETs for Stakeholder A and Stakeholder B in this example is approximately 0.36. This may indicate low agreement between Stakeholder A and Stakeholder B as to how entities E1, E2, E3, and E4 relate. It is possible to compare Stakeholder A's and Stakeholder B's perceptions of how a single entity in each PFNET relates to all others by using the distance vectors for a single entity in the cc formula [11]. Table 11 shows cc values for each entity. Stakeholders A and B appear to have a significant difference of opinion as to how E1 relates to E2, E3, and E4 (cc = -0.50). They appear to be in much better agreement as to how E4 relates to E1, E2, and E3 (cc = 0.87). Calculating the individual cc values should provide insight into what underlying differences contribute most to the difference in overall perception between stakeholders.

Table 11 cc values for Stakeholder A and Stakeholder B

Stakeholder A        Entity   Detailed cc   Stakeholder B
distance vectors                            distance vectors
2 2 3                E1       -0.50         6 3 3
2 1 2                E2        0.50         6 3 3
2 1 1                E3        0.50         3 3 1
3 2 1                E4        0.87         3 3 1
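The worked example above can be reproduced end to end. The sketch below is an illustration rather than the tools cited later in this section ([9, 10, 13]): it builds the similarity and dissimilarity matrices from the step 2 co-occurrence groups, generates the PFNET for the r = ∞, q = n - 1 case by retaining only links whose weight is not beaten by any alternate path's maximum link weight, computes the minimum distance matrices with Floyd–Warshall, and applies formula (2), reproducing Tables 5–10 and the overall cc of approximately 0.36.

```python
# Minimal sketch (not the authors' tooling) of the stakeholder comparison:
# co-occurrence groups -> similarity/dissimilarity matrices (Tables 5-8)
# -> PFNET(r = inf, q = n - 1) -> minimum distance matrices (Tables 9-10)
# -> overall cc via formula (2).
import numpy as np
from itertools import combinations

ENTITIES = ["E1", "E2", "E3", "E4"]
GROUPS_A = [("E1", "E2", "E3"), ("E2", "E3", "E4"), ("E3", "E4")]
GROUPS_B = [("E1", "E3", "E4"), ("E2", "E3", "E4"), ("E3", "E4")]

def similarity(groups, entities):
    """Count pairwise co-occurrences of entities within the groups."""
    idx = {e: i for i, e in enumerate(entities)}
    sim = np.zeros((len(entities), len(entities)), dtype=int)
    for g in groups:
        for a, b in combinations(g, 2):
            sim[idx[a], idx[b]] += 1
            sim[idx[b], idx[a]] += 1
    return sim

def dissimilarity(sim):
    """Subtract each count from (max count + 1) so no off-diagonal entry is zero."""
    dis = (sim.max() + 1 - sim).astype(float)
    np.fill_diagonal(dis, 0.0)
    return dis

def pfnet_min_distance(dis):
    """PFNET(r=inf, q=n-1): keep a direct link only if no alternate path has a
    smaller maximum link weight, then return additive shortest path distances
    over the retained links (Floyd-Warshall in both passes)."""
    n = len(dis)
    minimax = dis.copy()
    for k in range(n):
        for i in range(n):
            for j in range(n):
                minimax[i, j] = min(minimax[i, j], max(minimax[i, k], minimax[k, j]))
    pf = np.where(np.isclose(dis, minimax), dis, np.inf)   # retained PFNET links
    np.fill_diagonal(pf, 0.0)
    dist = pf.copy()
    for k in range(n):
        for i in range(n):
            for j in range(n):
                dist[i, j] = min(dist[i, j], dist[i, k] + dist[k, j])
    return dist

def overall_cc(dist_a, dist_b):
    """Formula (2) applied to the upper-triangular minimum distance vectors."""
    iu = np.triu_indices(len(dist_a), k=1)
    a, b = dist_a[iu], dist_b[iu]
    da, db = a - a.mean(), b - b.mean()
    return float((da * db).sum() / np.sqrt((da ** 2).sum() * (db ** 2).sum()))

dist_a = pfnet_min_distance(dissimilarity(similarity(GROUPS_A, ENTITIES)))
dist_b = pfnet_min_distance(dissimilarity(similarity(GROUPS_B, ENTITIES)))
print(dist_a)                                  # matches Table 9
print(dist_b)                                  # matches Table 10
print(round(overall_cc(dist_a, dist_b), 2))    # ~0.36, as reported above
```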
As illustrated in Fig. 7, to generate the proposed %compliant metric, the researcher must

- Define a representative threat set where the threat level of detail is dependent on the stakeholder (e.g., system security analyst or FISMA security certifier) requirements.
- Build an open-risk PFNET model of the FISMA-required standard security controls. Controls, when negated, become vulnerabilities. Map all vulnerabilities to the threat set. Complete the Pathfinder procedure.
- Build a current-risk PFNET model of the information system being evaluated. Map system current vulnerabilities to the threat set – the mapping defined by the open-risk model (the standard). Complete the Pathfinder procedure.
- Generate current- and open-risk minimum-distance matrices from the PFNETs generated. Compare the minimum-distance matrices using a cc formula to generate overall %similar measures for the models as well as detailed %similar measures for each entity within the models.
- Subtract the overall cc %similar-to-open-risk measure from 1 to generate the %compliant to closed-risk (no vulnerabilities) measure.

Assuming we are evaluating a Financial Management System (FMS) that is web-enabled, intranet accessible, and categorized as moderate impact using the NIST criteria, an example using the Pathfinder technique follows.

5.1 Define representative threat set

Table 12 is a sample list of threats associated with operating the FMS application. The threat categories are taken directly or derived from Ozier [19], Bishop [2], and the Federal Information System Controls Audit Manual (FISCAM) [26].

Table 12 Threat categories

ID  Threat category name
T1  Introduction of unapproved software
T2  Software version implementation errors
T3  Sabotage of software
T4  Theft of software
T5  Sabotage of data/information
T6  Theft of data/information/goods
T7  Destruction of data/information
T8  Disruption of service
T9  Accountability data loss

5.2 Build open-risk PFNET model

Table 13 contains a subset of the FISMA-required baseline controls for a moderate-impact system. In Table 14, the controls from Table 13 are negated to create the vulnerability set for this example. To build the open-risk model (open standard) for evaluating the FMS system, we assume all 20 vulnerabilities (low-level categories) exist by mapping/relating them to the 9 threats (high-level categories) identified in Table 12. Vulnerabilities may be mapped to more than one threat. This exercise may be done manually, but could become very tedious as the number of vulnerabilities and threats increases. For this example, a web-based categorization tool written by Kudikyala [10] was used to relate the vulnerabilities to threats, resulting in the co-occurrence groups shown in Table 15.

The categorization tool [10] automatically builds an n × n similarity matrix of distinct entities categorized. For this example, n is the sum of 9 threats and 20 vulnerabilities, resulting in a 29 × 29 similarity matrix for the open-risk co-occurrence groups.

Fig. 7 Compliance measurement using Pathfinder

Similarity matrix entries reflect the number of times grouped entities co-occur. For the standard open-risk co-occurrence groups, shown in Table 15, V8 and V4 co-occur 4 times. In the open-risk similarity matrix, the co-occurrence count at entries (V8, V4) and (V4, V8) would be 4. Higher co-occurrence counts indicate greater similarity. The categorization tool [10] automatically builds a dissimilarity matrix from the similarity matrix of categorized entities. The vulnerability-to-threat relationships in this example are symmetric. Therefore an open-risk dissimilarity matrix (upper triangular portion only) is generated from the open-risk similarity matrix by subtracting each co-occurrence count entry from the maximum co-occurrence count entry plus one to prevent 0-value dissimilarity matrix entries. Lower dissimilarity values indicate greater similarity.

A Unix-based PFNET generation tool, written by Kurup [13] applying the Dearholt and Schvaneveldt algorithm, was used to generate the open-risk PFNET from the dissimilarity matrix. The tool requires as input the number of nodes (n = 29), the upper triangular portion of the dissimilarity matrix, an r-metric (∞, input as -1), and a q parameter (n - 1 = 28).

The PFNET generated from the open-risk dissimilarity matrix is a mathematical model of standard open risk.

5.3 Build current-risk PFNET model

Assume these vulnerabilities exist in the FMS system: V4, V6, V7, V8, V9, V10, V11, V12, V13, V14, V15, V16, V17, V18, V19, and V20 (see Table 14). To build the current-risk PFNET model, map the FMS vulnerabilities to threats as dictated by the vulnerability mappings in the open-risk standard model to generate the co-occurrence groups shown in Table 16 under FMS System Current Risk. (Note: the Standard Open Risk and FMS System Current Risk co-occurrence groups in Table 16 are the initial entries in Table 17.)

Using the procedure described in Sect. 5.2:

- A similarity matrix is generated from the FMS system current-risk co-occurrence groups.
- A dissimilarity matrix is generated from the similarity matrix.
- The PFNET algorithm is applied to the dissimilarity matrix to generate the current-risk PFNET model.

The PFNET generated from the current-risk dissimilarity matrix is a mathematical model of the FMS system current risk.
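The mapping step above can be sketched in a few lines: given the open-risk mapping of Table 15 and the set of vulnerabilities that still exist in the system, the current-risk co-occurrence groups of Table 16 follow by intersection. The snippet below is an illustration, not the Kudikyala categorization tool [10]; only three of the nine threat groups are encoded, with values copied from Tables 15 and 16.

```python
# Sketch of deriving current-risk co-occurrence groups (Table 16) from the
# open-risk mapping (Table 15); only three threat groups are shown here.
OPEN_RISK = {
    "T5": ["V18", "V17", "V10", "V8", "V4", "V3", "V1"],
    "T8": ["V15", "V14", "V13", "V12", "V3", "V1"],
    "T9": ["V16", "V11", "V9", "V8", "V7", "V6", "V5", "V4"],
}
# Vulnerabilities assumed to exist in the FMS system (Sect. 5.3, model 1)
EXISTING = {"V4", "V6", "V7", "V8", "V9", "V10", "V11", "V12", "V13", "V14",
            "V15", "V16", "V17", "V18", "V19", "V20"}

def current_risk_groups(open_risk, existing):
    """Keep only the vulnerabilities that still exist, preserving the standard mapping."""
    return {t: [v for v in vs if v in existing] for t, vs in open_risk.items()}

print(current_risk_groups(OPEN_RISK, EXISTING))
# {'T5': ['V18', 'V17', 'V10', 'V8', 'V4'],
#  'T8': ['V15', 'V14', 'V13', 'V12'],
#  'T9': ['V16', 'V11', 'V9', 'V8', 'V7', 'V6', 'V4']}
```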

Table 13 FISMA standard control subset (from [18])

ID     FISMA control name
AC-1   Access control policy and procedures
AC-2   Account management
AC-3   Access enforcement
AC-5   Separation of duties
AC-7   Unsuccessful login attempts
AC-8   System use notification
AC-13  Supervision and review – access control
AU-2   Auditable events [access]
AU-6   Audit monitoring, analysis, and reporting
CM-1   Configuration management policy and procedures
CM-5   Access restrictions for change
CP-4   Contingency plan testing and exercises
CP-9   Information system backup
CP-10  Information system recovery and reconstitution
IA-2   User identification and authentication
PS-4   Personnel termination
SA-5   Information system documentation [operations]
SC-2   Application partitioning
SC-8   Transmission integrity
SI-9   Information input restrictions

Table 14 Vulnerability categories

Control ID  Vulnerability ID  Vulnerability category name
CM-1        V1                Inadequate configuration management policy and procedures
CM-5        V2                Inadequate access restrictions for change
AC-3        V3                Inadequate access enforcement
IA-2        V4                Inadequate user identification and authentication
AC-2        V5                Inadequate account management
AC-8        V6                No system use notification
AC-7        V7                No termination after maximum unsuccessful login attempts
AC-1        V8                Inadequate access control policy and procedures
AC-13       V9                Inadequate supervision and review – access control
PS-4        V10               Inadequate execution of personnel termination procedure
AU-2        V11               Inadequate access monitoring
SA-5        V12               No information system operations manual
CP-9        V13               Insufficient system backups
CP-10       V14               Inadequate recovery mechanisms
CP-4        V15               No contingency plan testing and exercises
AU-6        V16               Inadequate audit monitoring, analysis, and reporting
SC-8        V17               Integrity of transmitted data not protected
AC-5        V18               Inadequate separation of duties
SC-2        V19               Inadequate application partitioning
SI-9        V20               Inadequate information input restrictions

Table 15 Standard open-risk co-occurrence groups

(T1, V1)
(T2, V1)
(T3, V2)
(T4, V2)
(T5, V18, V17, V10, V8, V4, V3, V1)
(T6, V18, V10, V8, V4, V3, V1)
(T7, V20, V19, V10, V8, V4, V3, V1)
(T8, V15, V14, V13, V12, V3, V1)
(T9, V16, V11, V9, V8, V7, V6, V5, V4)

5.4 Compare minimum distance matrices

A Unix-based PFNET correlation tool, written by Kudikyala [9], was used to generate minimum distance matrices from the standard open-risk and FMS system current-risk PFNETs using Floyd's algorithm for shortest path [4]. Path distances for the minimum distance matrices are calculated the traditional way, by adding link weights along paths between nodes. The correlation tool was also used to compare the open- and current-risk minimum distance matrices using the cc formula (2). Normally the cc range is [-1, +1], where -1 represents no similarity and +1 represents perfect similarity between models [11, 12]. Because of the approach taken in this research to compare current system state to a standard perception of adequate security, the cc range is narrowed from [-1, +1] to [0, +1] – no comparison beyond a perfect match.

5.5 Generate %compliant measure

The correlation tool [9] generates an overall cc value that indicates the degree of covariance (similarity) between the standard open-risk model and the system current-risk model – similarity to unacceptable risk; all vulnerabilities exist. The goal for the FMS system is a cc of 0, i.e., no similarity to the open-risk model. Subtracting the overall cc value from 1 yields a value (%compliant) that indicates how close the FMS system is to standard compliance as defined by the closed-risk model – no vulnerabilities exist. Comparing the FMS system current-risk model 1 to the open-risk model results in a cc of 0.45 (see Table 18, Overall Path Distance cc for FMS 1). The FMS 1 current-risk model in this example exhibits 45 percent similarity to the open-risk model. Subtracting 0.45 from 1.0 (open risk) yields a value that indicates the FMS system is 55 percent compliant to closed risk (see Table 18, %compliant for FMS 1).

Table 16 Co-occurrence groups

Standard open risk                        FMS system current risk

(T1, V1)
(T2, V1)
(T3, V2)
(T4, V2)
(T5, V18, V17, V10, V8, V4, V3, V1) (T5, V18, V17, V10, V8, V4)
(T6, V18, V10, V8, V4, V3, V1) (T6, V18, V10, V8, V4)
(T7, V20, V19, V10, V8, V4, V3, V1) (T7, V20, V19, V10, V8, V4)
(T8, V15, V14, V13, V12, V3, V1) (T8, V15, V14, V13, V12)
(T9, V16, V11, V9, V8, V7, V6, V5, V4) (T9, V16, V11, V9, V8, V7, V6, V4)

Table 17 Risk model co-occurrence groups

Open risk          (T1, V1) (T2, V1)
(see Table 16)     (T3, V2) (T4, V2)
(T5, V18, V17, V10, V8, V4, V3, V1) (T6, V18, V10, V8, V4, V3, V1)
(T7, V20, V19, V10, V8, V4, V3, V1) (T8, V15, V14, V13, V12, V3, V1)
(T9, V16, V11, V9, V8, V7, V6, V5, V4)
FMS Model 1 (T5, V18, V17, V10, V8, V4) (T6, V18, V10, V8, V4)
See Table 16 (T7, V20, V19, V10, V8, V4) (T8, V15, V14, V13, V12)
(T9, V16, V11, V9, V8, V7, V6, V4)
FMS Model 2 (T5, V18, V10, V8) (T6, V18, V10, V8)
(T7, V20, V19, V10, V8) (T8, V15, V14, V13, V12)
(T9, V16, V11, V9, V8, V7, V6)
FMS Model 3 (T5, V10, V8) (T6, V10, V8)
(T7, V10, V8) (T8, V15, V14, V13, V12)
(T9, V16, V11, V9, V8, V7, V6)
FMS Model 4 (T5, V8) (T6, V8)
(T7, V8) (T8, V15, V14, V13, V12)
(T9, V16, V11, V9, V8, V7, V6)
FMS Model 5 (T5, V8) (T6, V8)
(T7, V8) (T8, V15, V14, V13, V12)
(T9, V9, V8, V6)
FMS Model 6 (T5, V8) (T6, V8)
(T7, V8) (T8, V15, V14, V13)
(T9, V8, V6)
FMS Model 7 (T5, V8) (T6, V8)
(T7, V8) (T9, V8, V6)
FMS Model 8 (T9, V6)
Closed Risk (No Vulnerabilities)

Note: Vulnerabilities in bold type assumed corrected in following model

The more existing vulnerabilities identified in the FMS system, the closer the resulting cc value will be to 1.0 (open risk). As vulnerabilities are removed, the cc value moves closer to 0.0 (closed risk). Table 17 shows sample FMS risk model co-occurrence groups. The vulnerabilities in bold type are removed in each successive FMS model. For each FMS model, the Pathfinder procedure was applied to generate a minimum distance matrix for comparison with the open-risk model minimum distance matrix. Table 18 shows the overall path distance cc, node path distance (detailed) cc, and %compliant values for the FMS models as vulnerabilities are removed and the FMS models are compared with the open-risk model.

Table 18 Risk model comparisons

Open FMS FMS FMS FMS FMS FMS FMS FMS Closed
risk 1 2 3 4 5 6 7 8 risk

%compliant 0.00 0.55 0.59 0.66 0.69 0.77 0.82 0.87 0.95 1.00
Overall path
distance cc 1.00 0.45 0.41 0.34 0.31 0.23 0.18 0.13 0.05 0.00
Node path
distance cc
V1 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V2 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V3 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V4 1.00 0.70 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V5 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V6 1.00 0.43 0.46 0.45 0.62 0.41 0.33 0.33 0.23 0.00
V7 1.00 0.43 0.46 0.45 0.62 0.00 0.00 0.00 0.00 0.00
V8 1.00 0.70 0.43 0.35 0.13 0.10 0.09 0.09 0.00 0.00
V9 1.00 0.43 0.46 0.45 0.63 0.41 0.00 0.00 0.00 0.00
V10 1.00 0.75 0.54 0.43 0.00 0.00 0.00 0.00 0.00 0.00
V11 1.00 0.43 0.46 0.45 0.63 0.00 0.00 0.00 0.00 0.00
V12 1.00 0.56 0.56 0.56 0.56 0.56 0.00 0.00 0.00 0.00
V13 1.00 0.56 0.56 0.56 0.56 0.56 0.48 0.00 0.00 0.00
V14 1.00 0.56 0.56 0.56 0.56 0.56 0.48 0.00 0.00 0.00
V15 1.00 0.56 0.56 0.56 0.56 0.56 0.48 0.00 0.00 0.00
V16 1.00 0.43 0.46 0.45 0.62 0.00 0.00 0.00 0.00 0.00
V17 1.00 0.66 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V18 1.00 0.74 0.69 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V19 1.00 0.65 0.62 0.00 0.00 0.00 0.00 0.00 0.00 0.00
V20 1.00 0.65 0.62 0.00 0.00 0.00 0.00 0.00 0.00 0.00
T1 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
T2 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
T3 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
T4 1.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00 0.00
T5 1.00 0.66 0.64 0.54 0.29 0.29 0.29 0.29 0.00 0.00
T6 1.00 0.64 0.61 0.51 0.31 0.31 0.31 0.31 0.00 0.00
T7 1.00 0.64 0.62 0.55 0.29 0.29 0.29 0.29 0.00 0.00
T8 1.00 0.56 0.56 0.56 0.56 0.56 0.48 0.00 0.00 0.00
T9 1.00 0.43 0.46 0.45 0.62 0.41 0.33 0.33 0.23 0.00

Using the distance vectors for each entity in the minimum distance matrices for the open- and current-risk models, detailed cc values are generated that indicate how a single entity in each model relates to all others – how a single entity contributes to the similarity between models. An analysis of the detailed cc values for models compared should provide some insight with regard to choosing an efficient mitigation path to reaching compliance with the standard.
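As a sketch of this per-entity comparison (using the four-node stakeholder example of Tables 9–11 rather than the 29-node FMS model), the detailed cc for one entity can be computed from that entity's row of each minimum distance matrix, excluding the diagonal; with the E4 rows from Tables 9 and 10 this reproduces the 0.87 value in Table 11.

```python
# Minimal sketch of a per-entity (detailed) cc using formula (2) on a single
# entity's distance vectors; the E4 vectors below are taken from Tables 9 and 10.
import math

def entity_cc(vec_a, vec_b):
    """Correlation coefficient between one entity's distance vectors in two models."""
    mean_a = sum(vec_a) / len(vec_a)
    mean_b = sum(vec_b) / len(vec_b)
    da = [x - mean_a for x in vec_a]
    db = [y - mean_b for y in vec_b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den

print(round(entity_cc([3, 2, 1], [3, 3, 1]), 2))   # 0.87 for E4, as in Table 11
```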
6 Conclusion

Technical Topic Area 3 (TTA 3), Cyber Security Metrics, of the Department of Homeland Security Broad Agency Announcement (BAA), Cyber Security Research and Development (BAA07-09) [25], describes security metrics as a difficult, long-standing problem. TTA 3 cites the fact that the security metrics problem is listed on the INFOSEC Research Council (IRC) Hard Problems List [8] as evidence of the importance of research in this area. Good security metrics are required to direct the allocation of security resources to improve the security status of government information systems, to demonstrate compliance with FISMA-required security control standards, and to simplify the annual FISMA reporting requirement. TTA 3 advises that the lack of sound and practical security metrics is severely hampering progress both in research and engineering of secure systems [25].

The proposed approach is unique in that it offers a %compliant metric at the information system level. This approach avoids the requirement to gather an abundance of historical incident data – usually not available – and to value all tangible and intangible assets as is required with the traditional quantitative approach to risk management. Qualitative risk management approaches generally produce insufficient measurable detail and depend heavily on the experience and expertise of the assessment team. An assessment repeated with a different team may produce different results. The proposed approach in combination with NIST RMF guidance provides for producing consistent quantitative results. Detailed cc values should indicate vulnerability groups where targeted cost-benefit analysis may be applied to determine an effective approach for eliminating vulnerabilities contributing most to the noncompliant state of the system being evaluated. The quantitative %compliant metric should allow for the discussion of system compliance with FISMA-required standards in terms easily understood by participants at various levels of an organization without requiring all to have detailed knowledge of the internals of the security standard or the system being evaluated.

7 Future work

Using the method proposed in this paper, experiments with live data are planned where open-risk PFNET models of the FISMA baseline controls for each impact category – low, moderate, high – will be built. Based on prior subjective FISMA reviews, PFNET models of application system risk will be built. Application models will be compared to the appropriate standard model to generate a %compliant measure. Analysis points will include the following:

- Comparison of the experimental overall system certification ratings.
- Discovery of a way to use detailed cc values to chart an efficient mitigation path to FISMA compliance.
- The comparison of the mitigation path suggested by the quantitative detailed cc values with the mitigation path recommended by the subjective evaluation.
- Comparison of the experimental overall system certification ratings between years.

The proposed approach for generating a %compliant to standard measure at the application system level described in this article is peculiar to FISMA, but the approach should be applicable to any standard where controls may be negated to generate vulnerabilities.

Acknowledgement  E. Hulitt and R.B. Vaughn thank Marsha Gay for her expert editing of this article.

References

1. Bell, M. Z. (2005). Risky thinking – on threat analysis and business risk management. Albion Research Ltd., Dunrobin, Ontario, Canada. http://www.riskythinking.com/glossary/annualized_loss_expentancy.php, May 2005.
2. Bishop, M. (2003). Computer security: art and science. Boston: Addison-Wesley.
3. Chen, C. M. (1998). Bridging the gap: the use of pathfinder networks in visual navigation. Journal of Visual Languages and Computing, 9(3), 267–286.
4. Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2001). Introduction to algorithms (2nd ed.). Cambridge: MIT Press.
5. Dearholt, D. W., & Schvaneveldt, R. W. (1990). Properties of pathfinder networks. In Schvaneveldt, R. W. (Ed.), Pathfinder associative networks: studies in knowledge organization (pp. 1–30). Norwood: Ablex.
6. Gerber, M., & von Solms, R. (2005). Management of risk in the information age. Computers & Security, 24(1), 16–30.
7. Henry, K. (2004). Risk management and analysis. In Tipton, H. F., & Krause, M. (Eds.), Information security management handbook (5th ed., pp. 751–758). Boca Raton: Auerbach Publications.
8. INFOSEC Research Council (IRC) (1999). National scale INFOSEC research hard problems list, draft 21. http://www.infosec-research.org/documents, September 1999.
9. Kudikyala, U. K. (2003). PFNET comparison tool (correlations.java) (Technical Report). Department of Computer Science, Mississippi State University, Starkville, MS, February 2003.
10. Kudikyala, U. K. (2003). Requirements categorization tool (Technical Report). Department of Computer Science, Mississippi State University, Starkville, MS, February 2003.
11. Kudikyala, U. K. (2003). Reducing misunderstanding of software requirements by conceptualization of mental models using Pathfinder networks. PhD thesis, Department of Computer Science, Mississippi State University, Starkville, MS.
12. Kudikyala, U. K., & Vaughn, R. B. (2004). Understanding software requirements using Pathfinder networks. CrossTalk: The Journal of Defense Software Engineering, 17(5), 16–25.
13. Kurup, G. (1989). PFNET generation tool (geom_pfn) (Technical Report). Department of Computer Science, Mississippi State University, Starkville, MS, August 1989.
14. Lin, X., Buzydlowski, J., & White, H. D. (2003). Real-time author co-citation mapping for online searching. Information Processing and Management, 39(5), 689–706.
15. National Institute of Standards and Technology (2003). Security metrics guide for information technology systems, SP 800-55. Computer Security Division. http://csrc.nist.gov/publications/nistpubs/800-55/sp800-55.pdf, Gaithersburg, MD.
16. National Institute of Standards and Technology (2004). Standards for security categorization of information systems, FIPS PUB 199. Computer Security Division. http://csrc.nist.gov/publications/fips/fips199/FIPS-PUB-199-final.pdf, Gaithersburg, MD.
17. National Institute of Standards and Technology (2006). Minimum security requirements for federal information and information systems, FIPS PUB 200. Computer Security Division. http://csrc.nist.gov/publications/fips/fips200/FIPS-PUB-200-final-march.pdf, Gaithersburg, MD.
18. National Institute of Standards and Technology (2006). Recommended security controls for federal information systems, SP 800-53 Rev. 1. Computer Security Division. http://csrc.nist.gov/publications/nistpubs/800-53-Rev1/800-53-rev1-final-clean-sz.pdf, Gaithersburg, MD.

19. Ozier, W. (2004). Risk analysis and assessment. In Tipton, H. F., & Krause, M. (Eds.), Information security management handbook (5th ed., pp. 795–820). Boca Raton: Auerbach Publications.
20. Ross, R., Katzke, S., & Toth, P. (2005). The new FISMA standards and guidelines changing the dynamic of information security for the federal government. In 2005 IEEE military communications conference, Atlantic City, NJ, October 17–21, 2005 (Vol. 2, pp. 864–870). New York: IEEE Press.
21. Rummel, R. J. (1976). Understanding correlation. Department of Political Science, University of Hawaii, Honolulu, HI. http://www.mega.nu/ampp/rummel/uc.htm.
22. Shimonski, R. J. (2004). Risk assessment and threat identification. TechGenix, Ltd., St. Julians, Malta. http://www.windowsecurity.com/articles/Risk_Assessment_and_Threat_Identification.html, October 2004.
23. Schvaneveldt, R. W. (1990). Graph theory and pathfinder primer. In Schvaneveldt, R. W. (Ed.), Pathfinder associative networks: studies in knowledge organization (pp. 297–299). Norwood: Ablex.
24. Schvaneveldt, R. W. (1990). Preface. In Schvaneveldt, R. W. (Ed.), Pathfinder associative networks: studies in knowledge organization (p. ix). Norwood: Ablex.
25. United States Department of Homeland Security (2007). Cyber security research and development. Broad Agency Announcement BAA07-09. http://www.hsarpabaa.com/Solicitations/BAA07-09_CyberSecurityRD_Posted_05162007.pdf.
26. United States General Accounting Office (1999). Federal information system controls audit manual (FISCAM), Volume I: financial statement audits (GAO/AIMD-12.19.6). http://www.gao.gov/special.pubs/ai12.19.6.pdf.
27. United States General Accounting Office (1999). Information security risk assessment: practices of leading organizations, a supplement to GAO's May 1998 executive guide on information security management (GAO/AIMD-00-33). http://www.gao.gov/special.pubs/ai00033.pdf.
28. United States General Accounting Office (2004). Information security: agencies need to implement consistent processes in authorizing systems for operations (Technical Report). Report to Congressional Requesters (GAO-04-376). http://www.gao.gov/cgi-bin/getrpt?GAO-04-376.
29. United States Office of Management and Budget (OMB) (1996). Security of federal automated information resources, Appendix III to OMB Circular No. A-130, Management of Federal Information Resources. http://www.whitehouse.gov/omb/circulars/a130/a130.html, February 1996.
30. United States Public Law 107-347, Dec. 17, 2002, 116 Stat. 2899 (2002). Federal Information Security Management Act (FISMA). Title III of the E-Government Act of 2002. http://frwebgate.access.gpo.gov/cgi-bin/getdoc.cgi?dbname=107_cong_public_laws&docid=f:publ347.107.pdf.

Elaine Hulitt has worked as a Software Systems Engineer (7 years), a Database Administrator (20 years), and an Operations and Maintenance Manager (3 years) for companies in the Chicago, IL area, including Kraft Retail and Foodservice and GTE, and for the last 15 years with the U.S. Army Engineer Research and Development Center, Vicksburg, MS. Hulitt earned a BME from Jackson State University (Major, piano; Minor, voice), Jackson, MS, an MS degree in Computer Science from DePaul University, Chicago, IL, and a Ph.D. in Computer Science from Mississippi State University, Starkville, MS.

Rayford B. Vaughn received his Ph.D. from Kansas State University in 1988 and is currently the Billy J. Ball Professor of Computer Science and Engineering at Mississippi State University. He teaches and conducts research in the areas of Software Engineering and Information Security. Prior to joining the University, he completed a twenty-six year career in the Army where he commanded the Army's largest software development organization and created the Pentagon agency that today centrally manages all Pentagon IT support. While on active duty with the Army, he served a three-year assignment with the National Security Agency's National Computer Security Center where he authored national level computer security guidance and conducted computer security research. Dr. Vaughn has over 100 publications to his credit and is an active contributor to software engineering and information security conferences and journals. He is actively engaged in high performance computing intrusion detection system research at Mississippi State University and established the MSU Center of Computer Security Research in 2001. In 2004, Dr. Vaughn was named a Mississippi State University Eminent Scholar, and in 2005 he was given the Most Outstanding Academic Award by the National Colloquium on Information Systems Security Education. Today, Dr. Vaughn is the elected representative of all the principal investigators on the NSF Scholarship for Service program and a member of the Interagency Coordinating Committee overseeing the SFS program. He maintains an active relationship with NSA as a part of the DOD Information Assurance Scholarship Program that MSU has been funded by since 2001.
