
Lecture - 12 - Evaluation Standards

The document outlines the evaluation standards for computer security, including the Trusted Computer System Evaluation Criteria (TCSEC), FIPS 140, and Common Criteria. It emphasizes the importance of independent assessments to ensure systems meet specific security requirements and describes the evaluation process and methodologies used to determine a system's trustworthiness. Additionally, it discusses the evolution of these standards and their impact on the commercial sector's approach to computer security.

Uploaded by

reach.hifza

IS 820: Computer Security

Evaluation Standards
Overview
• Why evaluate?
• Evaluation criteria
– Trusted Computer System Evaluation Criteria: TCSEC
(aka Orange Book)
– Federal Information Processing Standards: FIPS 140
– Common Criteria
– System Security Engineering Capability Maturity
Model: SSE-CMM
Background
• Goal of Evaluation: Show that a system meets
specific security requirements under specific
conditions
– Called a trusted system
– Based on specific assurance evidence
• Formal evaluation methodology
– Technique used to provide measurements of trust
based on specific security requirements and
evidence of assurance
Features of an Evaluation Methodology
• Provides set of requirements defining security functionality
for system
• Provides set of assurance requirements that specify
evidence of required assurance
• Provides methodology for determining that system meets
functional requirements based on analysis of assurance
evidence collected
• Provides measure of result indicating how trustworthy
system is with respect to security functional requirements
– Called level of trust
Why Evaluate?
• Provides an independent assessment, and measure
of assurance, by experts
– Includes assessment of requirements to see if they are
consistent, complete, technically sound, sufficient to
counter threats
– Includes assessment of administrative, user, installation,
other documentation that provides information on proper
configuration, administration, use of system
• Importance of Independence
• Tradeoffs between security and cost for evaluation
– Evaluation charges
– Time to market and number of features
– Staffing costs
Bit of History
• Government, military drove early evaluation
processes
– Their desire to use commercial products led to businesses
developing methodologies for evaluating security,
trustworthiness of systems
• Methodologies provide combination of
– Functional requirements
– Assurance requirements
– Levels of trust
Some Definitions
• Reference Monitor
– An access control concept of an abstract machine that
defines a set of design requirements to mediate all accesses
to objects by subjects
• Reference Validation Mechanism (RVM)
– An implementation of the reference monitor concept
– RVM must
• be tamperproof
• always be invoked
• never be bypassed (in conjunction with the above req)
• be small enough to be analyzed and tested, the
completeness of which can be assured
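The mediation idea behind the RVM can be sketched in a few lines of Python (an illustrative fragment, not from TCSEC; the names `check_access` and `access` are hypothetical):

```python
# Hypothetical sketch of RVM-style mediation: every access request is
# routed through a single, small check_access() point ("always invoked,
# never bypassed, small enough to be analyzed and tested").

ACL = {
    ("alice", "payroll.db"): {"read"},
    ("bob", "payroll.db"): {"read", "write"},
}

def check_access(subject: str, obj: str, right: str) -> bool:
    """The sole mediation point; kept tiny so it can be fully analyzed."""
    return right in ACL.get((subject, obj), set())

def access(subject: str, obj: str, right: str) -> str:
    # All object accesses go through this path; denial raises an error.
    if not check_access(subject, obj, right):
        raise PermissionError(f"{subject} may not {right} {obj}")
    return f"{subject} performed {right} on {obj}"
```

In a real system the mediation point sits below untrusted code (in hardware or the kernel) so that it is tamperproof and cannot be bypassed.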
Some Definitions
• Trusted Computing Base (TCB)
– Consists of all protection mechanisms within a computer
system including hardware, firmware and software, that are
responsible for enforcing a security policy
• Trusted Path
– A communication path guaranteed to be between the user
and the TCB
• Functional vs Assurance Requirements
– Functional requirements describe what a system should do
– Assurance requirements describe how functional
requirements should be implemented and tested.
TCSEC: 1983–1999
• Trusted Computer System
Evaluation Criteria
– Also known as the Orange
Book
– Developed by National
Computer Security Center,
US Dept. of Defense
– Series that expanded on
Orange Book in specific
areas was called Rainbow
Series
Other "Rainbow Book" Standards
• Red Book
– Trusted Network Interpretation
• Yellow Book
– Methodology for Security Risk Assessment
• Lavender Book
– Database Security Evaluation
TCSEC: 1983–1999
• Heavily influenced by Bell-LaPadula model and
reference monitor concept
• Emphasizes confidentiality
• Original spec Aug 83, revised Dec 85
Evaluation Criteria Classes (Ratings)

Class Description
A1 Verified Design
B3 Security Domains
B2 Structured Protection
B1 Labelled Security Protection
C2 Controlled Access Protection
C1 Discretionary Security Protection
D Minimal Protection
Functional Requirements
• Discretionary access control requirements
– Control sharing of named objects by named
individuals/groups
– Address propagation of access rights, ACLs, granularity of
controls
• Object reuse requirements
– Hinder attacker gathering information from disk or
memory that has been deleted
– Address overwriting data, revoking access rights, and
assignment of resources when data in resource from
previous use is present
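The object reuse requirement amounts to clearing residual data before a resource is reassigned; a minimal sketch in Python (the helper name is hypothetical):

```python
def scrub_before_reuse(buffer: bytearray) -> bytearray:
    # Object reuse requirement (sketch): overwrite residual data so a
    # newly assigned subject cannot recover the previous contents.
    for i in range(len(buffer)):
        buffer[i] = 0
    return buffer
```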
Functional Requirements
• Mandatory access control requirements (B1 up)
– Embody simple security condition, *-property
– Description of hierarchy of labels attached to subjects and
objects, represent authorizations and protection
respectively
• Label requirements (B1 up)
– Used to enforce MAC
– Address representation of classifications, clearances,
exporting labeled information, human-readable output
• Identification, authentication requirements
– Address granularity (per group or user) of authentication
data, protecting that data, associating identity with
auditable actions
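The simple security condition ("no read up") and *-property ("no write down") that MAC embodies can be sketched as follows (illustrative Python; the linear levels are hypothetical, and real labels also carry compartments):

```python
# Bell-LaPadula checks in miniature (hypothetical linear levels only).
LEVELS = {"unclassified": 0, "confidential": 1, "secret": 2, "top secret": 3}

def can_read(subject_level: str, object_level: str) -> bool:
    """Simple security condition: read only at or below your own level."""
    return LEVELS[subject_level] >= LEVELS[object_level]

def can_write(subject_level: str, object_level: str) -> bool:
    """*-property: write only at or above your own level (no write down)."""
    return LEVELS[subject_level] <= LEVELS[object_level]
```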
Functional Requirements
• Audit requirements
– Define what audit records contain, events to be recorded;
set increases as other requirements increase
• Trusted path requirements (B2 up)
– Communications path guaranteed between user, TCB
• System architecture requirements
– Tamperproof reference validation mechanism
– Process isolation
– Enforcement of principle of least privilege
– Well-defined user interfaces
Operational Assurance/Functional Requirements
• Trusted facility management (B2 up)
– Separation of operator, administrator roles
• Trusted recovery (A1)
– Securely recover after failure or discontinuity
• System integrity requirement
– Hardware diagnostics to validate on-site
hardware, firmware elements of TCB
Assurance Requirements
• Configuration management requirements (B2 up)
– Identify configuration items, consistent mappings among
documentation and code
• Trusted distribution requirement (A1)
– Address integrity of mapping between masters and on-site
versions
– Address acceptance procedures for the customer
• System architecture requirements
– Modularity, minimize complexity and techniques to keep
TCB small and simple (C1 – B2).
– TCB must be a full reference validation mechanism at B3
Assurance Requirements
• Design, specification, verification requirements
– B1: informal security policy model shown to be consistent
with its axioms
– B2: formal security policy model proven to be consistent
with its axioms, have a descriptive top-level specification
(DTLS)
– B3: DTLS shown to be consistent with security policy
model
– A1: formal top-level specification (FTLS) shown consistent
with security policy model using approved formal
methods; mapping between FTLS and source code
Assurance Requirements
• Testing requirements
– Address conformance with claims, resistance to
penetration, correction of flaws followed by retesting
– Requires searching for covert channels for higher classes
(B2 up) using formal methods
• Product documentation requirements
– Security Features User’s Guide describes uses, interactions
of protection mechanisms
– Trusted Facility Manual describes requirements for running
system securely
• Other documentation: test, design docs
Evaluation Classes C and D
D Did not meet requirements of any other class
C1 Discretionary protection; minimal functional,
assurance requirements only for I&A and DAC
Briefly used, no product evaluated after 1986
C2 Controlled access protection; object reuse and
auditing requirements, more stringent security
testing requirements
Most products were evaluated & certified at this
class
Evaluation Classes A and B
B1 Labeled security protection; informal security policy
model; MAC for some objects; labeling; more stringent
security testing
B2 Structured protection; formal security policy model; MAC
for all objects, labeling; trusted path only for login; least
privilege; covert channel analysis, configuration
management
B3 Security domains; implements full RVM; increases trusted
path requirements, constrains code development; DTLS
requirements; documentation
A1 Verified protection; significant use of formal methods;
trusted distribution; code - FTLS correspondence
Evaluation Process
• Run by government, no fee to vendor
• 3 stages
– Application: request for evaluation
• May be denied if gov’t didn’t need product
– Preliminary technical review
• Essentially a readiness review
• Discussion of evaluation process, schedules,
development process, technical content, etc.
• Determined schedule for evaluation
– Evaluation phase
Evaluation Phase
• 3 parts; results of each presented to technical
review board composed of senior evaluators not on
evaluating team; must approve one part before
moving on to next part
– Design analysis: review design based on documentation
provided; developed initial product assessment report
• Source code not reviewed
– Test analysis: vendor-supplied tests
– Final evaluation report
• Once approved, all items closed, rating given
RAMP
• Ratings Maintenance Program goal: maintain
assurance for new version of evaluated product
• Vendor would update assurance evidence
• Technical review board reviewed vendor’s report
and, on approval, assigned evaluation rating to new
version of product
• Note: major changes (structural, addition of some
new functions) could be rejected here and a full
new evaluation required
Impact
• New approach to evaluating security
– Based on analyzing design, implementation,
documentation, procedures
– Introduced evaluation classes, assurance requirements,
assurance-based evaluation
– High technical standards for evaluation
– Technical depth in evaluation procedures
• Some problems
– Evaluation process difficult, lacking in resources
– Mixed assurance, functionality together
– Evaluations only recognized in US
Scope Limitations
• Written for operating systems
– National Computer Security Center (NCSC) introduced
“interpretations” for other things such as networks
(Trusted Network Interpretation, the Red Book), databases
(Trusted Database Interpretation, the Purple or Lavender
Book)
• Focuses on needs of US government
• Does not address integrity or availability
– Critical to commercial firms
Process Limitations
• Criteria creep (expansion of requirements defining
classes)
– Criteria interpreted for specific product types
– Good for community (learned more about security), but
inconsistent over time
• Length of time of evaluation
– Misunderstanding depth of evaluation
– Management practices of evaluation
– Because evaluation was free, participants sometimes lacked motivation
Contributions
• Heightened awareness in commercial sector to
computer security needs
• Commercial firms could not use it for their products
– Did not cover networks, applications
– Led to a wave of new approaches to evaluation
– Some commercial firms began offering certifications
• Basis for several other schemes, such as Federal
Criteria, Common Criteria
Classification - Examples
• Credit Suisse implemented a B1-level secure OS to
provide global Internet services
• UNIX systems such as Sun's: B1 level
• Computer systems rated above B2 cannot be
exported, under US law
• Digital Equipment Corporation's VAX/VMS 4.3: C1
level
• Control Data Corporation's Network Operating
System: C1 level
FIPS 140: 1994–Present
• Evaluation standard for cryptographic modules
(implementing cryptographic logic or processes)
only for federal computer systems
– Established by US government agencies and the
Canadian Communications Security Establishment
• Updated in 2001 to address changes in process and
technology
– Officially, FIPS 140-2
• FIPS 140-3 (effective Sep 2019, testing commenced
Sep 2020). Overlap with FIPS 140-2 for 2 years
• Evaluates only crypto modules
– If software, processor executing it also included, as is
operating system
FIPS Standards
• Each standard defines four increasing levels of
security
• FIPS 140-1 covers basic design, documentation,
cryptographic key management, testing, physical
security (from electromagnetic interference), etc.
• FIPS 140-2 covers specification, secure design and
implementation, ports & interfaces; roles, services;
finite state model; physical security; mitigation of
other attacks; etc. (Expiry: September 22, 2026)
FIPS Standards
• FIPS 140-3 (effective: September 22, 2020) adds
new security features that reflect recent advances
in technology and security methods. Software and
firmware requirements are addressed in a new area
dedicated to software and firmware security.
Security Level 1 (Lowest)
• Encryption algorithm must be FIPS-approved
algorithm
• Software, firmware components may be
executed on general-purpose system using
unevaluated OS
• No physical security beyond use of
production-grade equipment required
• Example: PC doing encryption
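As an illustration (not part of the standard): SHA-256, specified in FIPS 180-4, is an example of a FIPS-approved algorithm, available through Python's hashlib:

```python
import hashlib

# SHA-256 is FIPS-approved (FIPS 180-4). Note that whether a given build
# is FIPS-validated depends on the underlying crypto module, not Python.
digest = hashlib.sha256(b"audit record").hexdigest()
```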
Security Level 2
• More physical security
– Tamper-evident coatings or seals, or pick-resistant locks
• Role-based authentication
– Module must authenticate that operator is authorized to
assume specific role and perform specific services
• Software, firmware components may be executed
on multiuser system with OS evaluated at EAL2 or
better under Common Criteria
– Must use one of specified set of protection profiles
Security Level 3
• Enhanced physical security
– Enough to prevent intruders from accessing critical
security parameters within module
• Identity-based authentication
• Strong requirements for reading, altering critical
security parameters
• Software, firmware components require OS to have
EAL3 evaluation, trusted path, informal security
policy model
– Can use equivalent evaluated trusted OS instead
Security Level 4
• “Envelope of protection” around module that
detects, responds to all unauthorized attempts at
physical access
– Includes protection against environmental conditions or
fluctuations outside module’s range of voltage,
temperatures
• Software, firmware components require OS meet
functional requirements for Security Level 3, and
assurance requirements for EAL4
– Equivalent trusted operating system may be used
Impact
• By 2002, 164 modules, 332 algorithms tested
– About 50% of modules had security flaws
– More than 95% of modules had documentation errors
– About 25% of algorithms had security flaws
– More than 65% had documentation errors
• Up to Dec 16, 2024: 4,917 certificates issued to
vendors
• Program greatly improved quality, security of
cryptographic modules
https://csrc.nist.gov/projects/cryptographic-module-validation-program
http://csrc.nist.gov/groups/STM/cmvp/documents/140-1/140val-all.htm
