[Directives and Handbooks]

NASA Handbook

NHB 2410.9A
Effective Date: June 1, 1993
Expiration Date:


Responsible Office: AO / Chief Information Officer

NASA AUTOMATED INFORMATION SECURITY HANDBOOK

PREFACE

Public law and national policy require Federal agencies to
establish Automated Information Security (AIS) programs to assure
effective management control and adequate levels of protection for
all agency automated information resources.

Automated information system security is becoming an increasingly
important issue for all NASA managers.  Rapid advancements in
computer and network technologies and the demanding nature of space
exploration and space research have made NASA increasingly
dependent on automated systems to store, process, and transmit vast
amounts of mission support information.  In many cases, automated
processes are integrated functions that directly contribute to the
success of NASA missions.  In today's electronically based society,
the practice of effective AIS management principles is an inherent
function of good business and good professional practice.

The AIS management process covered by this Handbook exemplifies
NASA's efforts to assure that scientific missions and business
functions are carried out in an accurate, safe, accountable, and
efficient manner.  This Handbook, in addition to NMI 2410.7,
"Assuring the Security and Integrity of NASA Automated Information
Resources," provides consistent policies, procedures, and guidance
to assure that an aggressive and effective AIS program is
developed, implemented, and sustained.  The provisions of this
Handbook apply to all NASA organizations and NASA support
contractors.  Generally excluded are contractor research facility
automated information resources not under direct NASA management
control.

This Handbook is intended primarily for use by Program Office AIS
Managers at Headquarters (PO-AISM's) and Center AIS Managers
(C-AISM's) at Field Installations; however, it has been structured
to allow anyone from senior management to technical support
personnel to quickly understand the overall concepts and their
personal relationship to the AIS Program.  The intention of
providing implementation flexibility in the guidance portions is to
encourage the exercise of sound judgment by those closest to a
problem.  PO-AISM's and C-AISM's are expected to apply common sense
in determining appropriate variations and exceptions that may
become necessary in specific computing environments.

This Handbook is issued in loose-leaf form and will be revised by
page changes.  Comments and suggestions concerning this Handbook
should be addressed to the NASA AIS Program Manager, Code JIS, NASA
Headquarters, Washington, DC 20546.

This Handbook cancels NHB 2410.9 dated September 18, 1990.

                              Jeffrey E. Sutton
                              Director, Security, Logistics
                              and Industrial Relations Division

DISTRIBUTION:
SDL 1(SIQ)



                        Table of Contents

Paragraph                                                    Page

CHAPTER 1.  PROGRAM OVERVIEW . . . . . . . . . . . . . . . . .1-1

100  INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . .1-1
     a.   Management Issue . . . . . . . . . . . . . . . . . .1-1
     b.   Value of Information and Computing Resources . . . .1-1
     c.   Life Cycle Phases. . . . . . . . . . . . . . . . . .1-1
     d.   History. . . . . . . . . . . . . . . . . . . . . . .1-1
     e.   References . . . . . . . . . . . . . . . . . . . . .1-2
     f.   Terminology. . . . . . . . . . . . . . . . . . . . .1-2

101  PURPOSE . . . . . . . . . . . . . . . . . . . . . . . . .1-3

102  ORGANIZATIONAL SCOPE. . . . . . . . . . . . . . . . . . .1-3

103  SYSTEMS COVERED . . . . . . . . . . . . . . . . . . . . .1-3

104  EXCEPTIONS. . . . . . . . . . . . . . . . . . . . . . . .1-4

105  NASA COMPUTER SYSTEMS ENVIRONMENT . . . . . . . . . . . .1-4

106  IMPORTANCE OF AN EFFECTIVE COMPUTER SECURITY PROGRAM. . .1-4
     a.   Management Priority. . . . . . . . . . . . . . . . .1-4
     b.   Public Image . . . . . . . . . . . . . . . . . . . .1-4
     c.   Increasing Incidents . . . . . . . . . . . . . . . .1-5

107  NASA AIS PROGRAM BACKGROUND . . . . . . . . . . . . . . .1-5
     a.   Initial Policy . . . . . . . . . . . . . . . . . . .1-5
     b.   Initial Handbook . . . . . . . . . . . . . . . . . .1-5
     c.   Summary of Other Milestones. . . . . . . . . . . . .1-5

108  ORIGIN OF NATIONAL POLICY . . . . . . . . . . . . . . . .1-6
     a.   National Organizations . . . . . . . . . . . . . . .1-6
     b.   National Documents . . . . . . . . . . . . . . . . .1-6

CHAPTER 2.  PROGRAM ORGANIZATION AND MANAGEMENT. . . . . . . .2-1

200  INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . .2-1

201  MANAGEMENT PHILOSOPHIES . . . . . . . . . . . . . . . . .2-1
     a.   Integration. . . . . . . . . . . . . . . . . . . . .2-1
     b.   Decentralization . . . . . . . . . . . . . . . . . .2-1
     c.   Perfection . . . . . . . . . . . . . . . . . . . . .2-1

202  NASA AIS PROGRAM GOAL AND OBJECTIVES. . . . . . . . . . .2-2
     a.   Goal . . . . . . . . . . . . . . . . . . . . . . . .2-2
     b.   Objectives . . . . . . . . . . . . . . . . . . . . .2-2

203  PROGRAM ELEMENTS. . . . . . . . . . . . . . . . . . . . .2-2
     a.   Basic Elements . . . . . . . . . . . . . . . . . . .2-2
     b.   Sustaining Program Effectiveness . . . . . . . . . .2-5

204  NASA AIS POLICY . . . . . . . . . . . . . . . . . . . . .2-5

205  HEADQUARTERS ROLES AND RESPONSIBILITIES . . . . . . . . .2-6
     a.   Overview . . . . . . . . . . . . . . . . . . . . . .2-6
     b.   Multidisciplinary Coordination . . . . . . . . . . .2-6
     c.   NASA AIS Program Manager . . . . . . . . . . . . . .2-6
     d.   Headquarters Program Offices . . . . . . . . . . . .2-9
     e.   Other Headquarters Offices . . . . . . . . . . . . 2-10

206  INDIVIDUAL RESPONSIBILITIES FOR AIS . . . . . . . . . . 2-10

207  PROGRAM ORGANIZATIONAL STRUCTURE. . . . . . . . . . . . 2-12
     a.   Overview . . . . . . . . . . . . . . . . . . . . . 2-12
     b.   Headquarters Level . . . . . . . . . . . . . . . . 2-12
     c.   Center Level . . . . . . . . . . . . . . . . . . . 2-12

208  MANAGEMENT REVIEW AND COMPLIANCE ASSURANCE PROCESS. . . 2-14
     a.   Headquarters Reviews . . . . . . . . . . . . . . . 2-14
     b.   Center Reviews . . . . . . . . . . . . . . . . . . 2-14
     c.   DPI Reviews. . . . . . . . . . . . . . . . . . . . 2-14

CHAPTER 3.  CENTER AND DPI REQUIREMENTS. . . . . . . . . . . .3-1

300  CENTER REQUIREMENTS . . . . . . . . . . . . . . . . . . .3-1
     a.   Designation of Authorities . . . . . . . . . . . . .3-1
     b.   C-AISM Responsibilities. . . . . . . . . . . . . . .3-1
     c.   Identifying DPI's. . . . . . . . . . . . . . . . . .3-2
     d.   Identifying Additional Entities. . . . . . . . . . .3-4

301  DPI REQUIREMENTS. . . . . . . . . . . . . . . . . . . . .3-4
     a.   Designation of Authorities . . . . . . . . . . . . .3-4
     b.   DPI-AISO Responsibilities. . . . . . . . . . . . . .3-4

302  MANAGEMENT PROCESS. . . . . . . . . . . . . . . . . . . .3-5
     a.   Risk Assessments . . . . . . . . . . . . . . . . . .3-5
     b.   Certifying Requirements. . . . . . . . . . . . . . .3-6
     c.   Personnel Screening. . . . . . . . . . . . . . . . .3-6
     d.   Access Protection and Accountability . . . . . . . .3-6
     e.   Compliance Assurance . . . . . . . . . . . . . . . .3-6
     f.   Contingency and Disaster Recovery Plans. . . . . . .3-6
     g.   Approval of Methodologies. . . . . . . . . . . . . .3-7

303  RISK ASSESSMENT PROCESS . . . . . . . . . . . . . . . . .3-7
     a.   Purpose. . . . . . . . . . . . . . . . . . . . . . .3-7
     b.   Scope. . . . . . . . . . . . . . . . . . . . . . . .3-7
     c.   Minimum Requirements . . . . . . . . . . . . . . . .3-7
     d.   Security Testing . . . . . . . . . . . . . . . . . .3-9

304  PROTECTIVE MEASURES TO PREVENT MISUSE AND ABUSE . . . . 3-11

305  CERTIFICATION PROCESS . . . . . . . . . . . . . . . . . 3-11
     a.   New or Modified Applications . . . . . . . . . . . 3-12
     b.   Recertifications . . . . . . . . . . . . . . . . . 3-13

306  MINIMUM LEVEL OF SCREENING FOR NON-FEDERAL ADP
     PERSONNEL . . . . . . . . . . . . . . . . . . . . . . . 3-13

307  CONTROLLING ACCESS BY FOREIGN NATIONALS . . . . . . . . 3-14
     a.   Introduction . . . . . . . . . . . . . . . . . . . 3-14
     b.   Purpose. . . . . . . . . . . . . . . . . . . . . . 3-14
     c.   Categories . . . . . . . . . . . . . . . . . . . . 3-14
     d.   Sponsors . . . . . . . . . . . . . . . . . . . . . 3-14
     e.   Submission/Approval of Requests. . . . . . . . . . 3-14
     f.   Exceptions . . . . . . . . . . . . . . . . . . . . 3-14

308  CONTINGENCY AND DISASTER RECOVERY PLANNING. . . . . . . 3-16
     a.   Definitions. . . . . . . . . . . . . . . . . . . . 3-16
     b.   Plan Content . . . . . . . . . . . . . . . . . . . 3-16

309  NETWORK AND COMPUTER SECURITY INCIDENT RESPONSE (CSIR)
     CAPABILITY. . . . . . . . . . . . . . . . . . . . . . . 3-17
     a.   Responsibilities . . . . . . . . . . . . . . . . . 3-17
     b.   Objectives . . . . . . . . . . . . . . . . . . . . 3-18
     c.   Procedure Elements . . . . . . . . . . . . . . . . 3-18
     d.   Non-duty Hours Considerations. . . . . . . . . . . 3-21

310  CSAT. . . . . . . . . . . . . . . . . . . . . . . . . . 3-21
     a.   Continuous CSAT. . . . . . . . . . . . . . . . . . 3-21
     b.   Multifaceted Approach. . . . . . . . . . . . . . . 3-21

311  PROCUREMENT OF PRODUCTS AND SERVICES. . . . . . . . . . 3-23
     a.   Introduction . . . . . . . . . . . . . . . . . . . 3-23
     b.   NASA Contracting Environment . . . . . . . . . . . 3-23
     c.   Project Manager Responsibilities . . . . . . . . . 3-23
     d.   Sponsoring Organization Responsibilities . . . . . 3-23
     e.   Contracting Officer Responsibilities . . . . . . . 3-23
     f.   Evaluating Security Capabilities . . . . . . . . . 3-24
     g.   Contract Administration. . . . . . . . . . . . . . 3-24
     h.   Requirements for Contractor-Operated DPI's . . . . 3-24

CHAPTER 4.  AUTOMATED INFORMATION CATEGORIES AND
SENSITIVITY/CRITICALITY LEVELS . . . . . . . . . . . . . . . .4-1

400  INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . .4-1
     a.   Information Categories . . . . . . . . . . . . . . .4-1
     b.   Sensitivity/Criticality Levels . . . . . . . . . . .4-1

401  INFORMATION CATEGORIES. . . . . . . . . . . . . . . . . .4-4
     a.   Statutes . . . . . . . . . . . . . . . . . . . . . .4-4
     b.   Derivations. . . . . . . . . . . . . . . . . . . . .4-4

402  SENSITIVITY/CRITICALITY LEVELS. . . . . . . . . . . . . .4-8
     a.   Introduction . . . . . . . . . . . . . . . . . . . .4-8
     b.   Automated Information and Applications . . . . . . .4-8
     c.   Computer Systems . . . . . . . . . . . . . . . . . .4-8

403  PROTECTIVE MEASURE BASELINE CONSIDERATIONS. . . . . . . .4-8
     a.   Sensitivity/Criticality Level 0. . . . . . . . . . 4-10
     b.   Sensitivity/Criticality Level 1. . . . . . . . . . 4-10
     c.   Sensitivity/Criticality Level 2. . . . . . . . . . 4-11
     d.   Sensitivity/Criticality Level 3. . . . . . . . . . 4-12

CHAPTER 5.  AIS PLANNING . . . . . . . . . . . . . . . . . . .5-1

500  INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . .5-1

501  HEADQUARTERS AIS PLANNING . . . . . . . . . . . . . . . .5-1
     a.   NASA AIS Program Plan. . . . . . . . . . . . . . . .5-1
     b.   Headquarters Program Office/Institutional Program Office
          AIS Plans (PO/IPO-AISPs) . . . . . . . . . . . . . .5-2

502  CENTER AIS PLANNING . . . . . . . . . . . . . . . . . . .5-3
     a.   Center AIS Plan (C-AISP) . . . . . . . . . . . . . .5-3
     b.   Content and Format . . . . . . . . . . . . . . . . .5-5

503  DPI AIS PLANNING. . . . . . . . . . . . . . . . . . . . .5-6
     a.   Purpose. . . . . . . . . . . . . . . . . . . . . . .5-6
     b.   Content. . . . . . . . . . . . . . . . . . . . . . .5-7

504  DPI CONTINGENCY AND DISASTER RECOVERY PLANS . . . . . . .5-8

505  EXTERNAL REQUESTS FOR REPORTS ON AIS PLANNING ACTIVITY. .5-8

CHAPTER 6.  SPECIAL CONSIDERATIONS FOR MICROCOMPUTERS. . . . .6-1

600  INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . .6-1
     a.   Security Principles. . . . . . . . . . . . . . . . .6-1
     b.   Security Implications. . . . . . . . . . . . . . . .6-1

602  SPECIAL PROTECTIVE MEASURES FOR MICROCOMPUTERS. . . . . .6-1
     a.   Technical Protective Measures. . . . . . . . . . . .6-1
     b.   Administrative Protective Measures . . . . . . . . .6-2
     c.   Physical Protective Measures . . . . . . . . . . . .6-2
     d.   Personnel Protective Measures. . . . . . . . . . . .6-2

603  AIS SOFTWARE MANAGEMENT ISSUES. . . . . . . . . . . . . .6-2
     a.   Imported Software. . . . . . . . . . . . . . . . . .6-2
     b.   Centers of Excellence. . . . . . . . . . . . . . . .6-2

CHAPTER 7.  PROCESSING NATIONAL SECURITY INFORMATION . . . . .7-1

700  INTRODUCTION. . . . . . . . . . . . . . . . . . . . . . .7-1
     a.   Background . . . . . . . . . . . . . . . . . . . . .7-1
     b.   Scope. . . . . . . . . . . . . . . . . . . . . . . .7-1
     c.   Preceding Chapters . . . . . . . . . . . . . . . . .7-1
     d.   References . . . . . . . . . . . . . . . . . . . . .7-1
     e.   Format . . . . . . . . . . . . . . . . . . . . . . .7-1

701  PROGRAM OVERVIEW. . . . . . . . . . . . . . . . . . . . .7-2
     a.   Issues, Purpose, and Scope . . . . . . . . . . . . .7-2
     b.   Systems Covered. . . . . . . . . . . . . . . . . . .7-2
     c.   Exceptions . . . . . . . . . . . . . . . . . . . . .7-2

702  PROGRAM ORGANIZATION AND MANAGEMENT . . . . . . . . . . .7-2
     a.   Management Philosophies. . . . . . . . . . . . . . .7-2
     b.   NASA AIS Program Goal and Objectives . . . . . . . .7-2
     c.   Program Elements . . . . . . . . . . . . . . . . . .7-3
     d.   NASA AIS Policy. . . . . . . . . . . . . . . . . . .7-3
     e.   HQ Responsibilities and Structures . . . . . . . . .7-3

703  CENTER AND DPI REQUIREMENTS . . . . . . . . . . . . . . .7-3
     a.   Center Requirements. . . . . . . . . . . . . . . . .7-3
     b.   DPI Requirements . . . . . . . . . . . . . . . . . .7-4
     c.   Management Process . . . . . . . . . . . . . . . . .7-5
     d.   Security Risk Assessments. . . . . . . . . . . . . .7-5
     e.   Misuse and Abuse . . . . . . . . . . . . . . . . . .7-5
     f.   Security Certification . . . . . . . . . . . . . . .7-5
     g.   Screening Non-Federal Personnel. . . . . . . . . . .7-7
     h.   Access by Foreign Nationals. . . . . . . . . . . . .7-7
     i.   Contingency and Disaster Recovery Planning . . . . .7-8
     j.   CSIR Capability. . . . . . . . . . . . . . . . . . .7-8
     k.   CSAT . . . . . . . . . . . . . . . . . . . . . . . .7-8
     l.   Procurement of Products and Services . . . . . . . .7-8

704  AUTOMATED INFORMATION CATEGORIES AND SENSITIVITY/CRITICALITY
     LEVELS. . . . . . . . . . . . . . . . . . . . . . . . . .7-8
     a.   Information Categories . . . . . . . . . . . . . . .7-8
     b.   Sensitivity/Criticality Levels . . . . . . . . . . .7-8
     c.   Protective Measure Baseline. . . . . . . . . . . . .7-8

705  AIS PLANNING. . . . . . . . . . . . . . . . . . . . . . 7-10
     a.   Headquarters AIS Planning. . . . . . . . . . . . . 7-10
     b.   Center AIS Planning. . . . . . . . . . . . . . . . 7-10
     c.   DPI AIS Planning . . . . . . . . . . . . . . . . . 7-10
     d.   External Requests for Reports on Planning Activity 7-10

706  SPECIAL CONSIDERATIONS FOR MICROCOMPUTER PLATFORMS. . . 7-10

707  OTHER SPECIAL CONSIDERATIONS FOR CLASSIFIED PROCESSING. 7-10
     a.   Security Reviews, Tests, and Reporting . . . . . . 7-10
     b.   Maintenance Personnel. . . . . . . . . . . . . . . 7-10
     c.   Visitors . . . . . . . . . . . . . . . . . . . . . 7-11
     d.   Physical Security. . . . . . . . . . . . . . . . . 7-11
     e.   Access Control/Password Management System. . . . . 7-13
     f.   Audit Trails/Logs. . . . . . . . . . . . . . . . . 7-14
     g.   Hardware and Software Security . . . . . . . . . . 7-14
     h.   Declassifying Memory, Media and Equipment. . . . . 7-15
     i.   Clearing Memory, Media and Equipment . . . . . . . 7-16
     j.   Upgrading. . . . . . . . . . . . . . . . . . . . . 7-16
     k.   Downgrading. . . . . . . . . . . . . . . . . . . . 7-17
     l.   Communications and Network Security. . . . . . . . 7-18
     m.   Maintenance Procedures . . . . . . . . . . . . . . 7-18
     n.   Modes of Operation . . . . . . . . . . . . . . . . 7-19
     o.   System High Requirements . . . . . . . . . . . . . 7-19
     p.   Multi-Level Requirements . . . . . . . . . . . . . 7-20

Appendix A,  REFERENCES. . . . . . . . . . . . . . . . . . . .A-1

Appendix B,  ABBREVIATIONS . . . . . . . . . . . . . . . . . .B-1

Appendix C,  TERMINOLOGY . . . . . . . . . . . . . . . . . . .C-1

Appendix D,  DECLASSIFYING AND DESTROYING MEMORY, MEDIA, AND
             EQUIPMENT . . . . . . . . . . . . . . . . . . . .D-1

Appendix E,  PLAN FORMAT FOR NASA CLASSIFIED SYSTEMS . . . . .E-1

Appendix F,  INDEX . . . . . . . . . . . . . . . . . . . . . .F-1

Appendix G,  FEDERAL AIS REQUIREMENTS. . . . . . . . . . . . .G-1



                  CHAPTER 1.  PROGRAM OVERVIEW

100  INTRODUCTION

     a.   Management Issue.  Automated Information Security (AIS)
is an increasingly important issue for all NASA managers.  Modern
technology and the demands of space research/exploration have made
NASA more and more dependent on computers to store and process vast
amounts of information that support sensitive and mission-critical
functions.  NASA's computer and information assets have such great
value that they must be managed to the same extent as the more
traditional organizational assets (i.e., people, money, equipment,
natural resources, and time).

     b.   Value of Information and Computing Resources.  The value
of NASA's information and computing resources and the importance of
NASA missions create a need for these resources to be adequately
protected to assure ready availability, high integrity, and
confidentiality, as appropriate.  The appropriate protection of
automated information must be motivated and supported by the
managers who own or use that information.

     c.   Life Cycle Phases.  Some automated systems are acquired
"off the shelf" and can be used immediately.  Others must be
specially designed, developed, and implemented over months or
years.  Once an automated system is fully operational, the options
available to provide adequate security are somewhat limited. 
However, if security is designed into an automated system, the
safeguard options are vastly increased and the safeguard costs over
the life of the system are substantially reduced.  This is true for
computer hardware, system software, and application software. 
Therefore, it is important for NASA managers to ensure that
security is appropriately integrated into all phases of the life
cycle development methodology for automated systems, especially in
the early planning stages.
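
     The following sketch is illustrative only and is not part of
NASA policy.  It shows, in Python, one hypothetical way a project
might record the security activities expected in each life cycle
phase and check which remain incomplete; the phase names and
activities are assumptions for illustration, not NASA-mandated
terms.

     # Illustrative sketch only; phase names and activities are
     # hypothetical examples, not NASA-mandated terms.
     LIFE_CYCLE_SECURITY = {
         "planning":    ["define functional security requirements"],
         "design":      ["derive technical security specifications"],
         "development": ["implement controls and validation routines"],
         "test":        ["verify controls against specifications"],
         "operations":  ["monitor, audit, and recertify periodically"],
     }

     def missing_security_steps(completed):
         """Return phases whose security activities are incomplete."""
         return {phase: acts
                 for phase, acts in LIFE_CYCLE_SECURITY.items()
                 if not all(act in completed for act in acts)}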

     d.   History.  In the past, NASA computer security guidance
was provided through the following:

          (1)  NASA Handbook (NHB) 2410.1, "Information Processing
               Resources Management," April 1985.

          (2)  Assorted NASA policy letters, such as:

               (a)  "Interim Standard for Identification of NASA
                    Sensitive Automated Information and
                    Applications," NASA Headquarters (HQ) Code NT
                    letter, November 1987.

               (b)  "Responding to and Reporting Automated
                    Information Security Incidents," NASA HQ Code
                    NT letter, January 1988.

          (3)  Assorted NASA guidelines, such as:

               (a)  "Guidelines for Certification of Existing
                    Sensitive Systems," July 1982.

               (b)  "Guidelines for Development of NASA Computer
                    Security Training Programs," May 1983.

               (c)  "Guidelines for Developing NASA ADP Security
                    Risk Management Plans," August 1983.

               (d)  "Guidelines for Developing NASA ADP Security
                    Risk Reduction Decision Studies," January
                    1984.

               (e)  "NASA ADP Risk Analysis Guidelines," July
                    1984.

               (f)  "NASA Guidelines for Assuring the Adequacy and
                    Appropriateness of Security Safeguards in
                    Sensitive Applications," September 1984.

               (g)  "NASA Guidelines for Meeting DOD Accreditation
                    Requirements for Processing Classified Data,"
                    March 1985.

               (h)  "Guidelines for Contingency Planning,"
                    November 1982. 

               (i)  "Guidelines for Selection of Backup
                    Strategies," November 1982.

     e.   References.  Appendix A lists the references used in this
Handbook, which expands on NMI 2410.7, "Assuring the Security and
Integrity of NASA Automated Information Resources," and replaces
the following:

          (1)  NHB 2410.1, Chapter 3.

          (2)  All prior computer security policy letters.

          (3)  All of the documents listed in subparagraph d(3).

     f.   Terminology.  Appendix B is a list of abbreviations. 
Appendix C provides definitions for most of the AIS-related terms
used in this Handbook.  Given the number of terms unique to the
computer and/or security disciplines, readers should familiarize
themselves with the definitions in Appendix C before going on to
Chapter 2.

101  PURPOSE

     The purpose of this Handbook is to present more specific
guidance on the general computer security management philosophies,
policies, and requirements outlined in NMI 2410.7.  This Handbook
is intended to be used by the Center Automated Information Security
Managers (C-AISM's) and HQ Program Office Automated Information
Security Managers (PO-AISM's).  This Handbook is not intended to be
site-specific.  Headquarters Program Offices and Centers are
encouraged to supplement this Handbook with procedures, duties, and
titles in order to tailor guidance to their unique organizational
structure and automated system environments.  

102  ORGANIZATIONAL SCOPE

     a.   The provisions of this Handbook apply to all NASA and
support contractor organizations as provided by law and/or contract
and as implemented by the appropriate contracting officer. 
Generally excluded are contractor or research facility computing
and information resources not under direct NASA management
cognizance or that are merely incidental to a contract (e.g., a
contractor's payroll and personnel system).  The managing
organization (i.e., HQ Program Office or Center) may, through the
appropriate contracting officer, elect to include any automated
information resources excluded by this Handbook.

     b.   Within reason, the provisions of this Handbook should be
applied in university environments (where NASA is supported through
formal agreements such as grants, cooperative agreements,
contracts, and purchase orders).  NASA managers/sponsors of such
activities should take a reasonable approach that will not impose
unnecessary constraints on the open university environment.  The
extent of compliance with this Handbook in university environments
needs to be evaluated on a case-by-case basis and may range from
minimal compliance (i.e., for one-time research activities in which
there is no clear indication that NASA is the information owner) to
more stringent compliance (i.e., for universities processing
NASA-owned information on a long-term basis).  A risk assessment
should be conducted to identify acceptable risk exposures and
determine how unacceptable risk exposures can reasonably be reduced
to more acceptable levels.
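
     The following Python sketch is illustrative only.  It
expresses the kind of annualized-loss-expectancy arithmetic
commonly used in Federal risk analyses of this period; all dollar
figures and incident frequencies are hypothetical.

     def annualized_loss_expectancy(incidents_per_year,
                                    loss_per_incident):
         # ALE = expected incident frequency x expected loss
         # per incident.
         return incidents_per_year * loss_per_incident

     ale_before = annualized_loss_expectancy(0.5, 200000)  # $100,000/yr
     ale_after = annualized_loss_expectancy(0.1, 200000)   # $20,000/yr
     safeguard_cost_per_year = 30000

     # A safeguard is reasonable when the risk reduction it buys
     # exceeds its annual cost.
     assert ale_before - ale_after > safeguard_cost_per_year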

103  SYSTEMS COVERED

     This Handbook covers the protection of all NASA automated
information systems, including the information they store, process,
and transmit. It also provides for the continuity of operations of
automated systems and applications.

104  EXCEPTIONS

     In certain situations, other protective measures may already
be in place to meet the general requirements contained in this
Handbook.  Exceptions from implementing the specifics of this
Handbook may be granted by the managing organization overseeing the
Data Processing Installation's (DPI's) activities.  Delegation of
this exception authority shall be no lower than the C-AISM. 
PO-AISM's have exception authority for systems under their purview. 
NASA Centers and Program Offices cannot grant exceptions to
national level (Office of Management and Budget [OMB], Office of
Personnel Management [OPM], Occupational Safety and Health
Administration [OSHA], National Fire Protection Association [NFPA],
etc.) requirements.  Additionally, exceptions cannot be granted
locally for classified systems, as noted in paragraph 701c.
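
     The exception rules above can be expressed as a simple check.
The following Python sketch is illustrative only; the role and
argument names are hypothetical.

     # Illustrative sketch of the paragraph 104 exception rules.
     EXCEPTION_AUTHORITIES = {"C-AISM", "PO-AISM"}  # no lower than C-AISM

     def may_grant_exception(grantor_role, requirement_source,
                             classified):
         if classified:
             return False  # no local exceptions for classified systems
         if requirement_source == "national":
             return False  # OMB, OPM, OSHA, NFPA, etc., cannot be waived
         return grantor_role in EXCEPTION_AUTHORITIES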

105  NASA COMPUTER SYSTEMS ENVIRONMENT

     NASA represents one of the larger, more complex, and diverse
computing environments in the Federal Government.  NASA has an
annual information technology resource budget in the 4 billion
dollar range that supports nine NASA Centers and the Jet Propulsion
Laboratory (JPL).  It is recognized that while JPL is not viewed as
a NASA Center, it is a facility performing research and development
for NASA under contract by the California Institute of Technology
(Caltech) and thus NASA policy is applicable to JPL to the extent
provided for in the NASA/Caltech contract.  These Centers manage
automated information resources on a decentralized basis at a large
number of DPI's, many of which are operated under contract.  The
computer system configurations range from the largest mainframe and
supercomputers in the world to minicomputers, microcomputers, and
intelligent/engineering work stations.  Computing and network
operations support earth and space mission functions for a full
array of processing environments ranging from administrative
computing in office settings to scientific and engineering
computing in academic, research center, production plant, and space
vehicle environments.  Providing appropriate protection in such a
diverse environment involves a continuing management process of
balancing user needs for unrestricted access to information with
the sometimes conflicting requirements to control access for
preserving high integrity, ready availability, and confidentiality.

106  IMPORTANCE OF AN EFFECTIVE COMPUTER SECURITY PROGRAM
  
     a.   Management Priority.  Past NASA Administrators have
underscored the importance of an effective computer security
program in policy letters to all NASA employees.  These letters
expressed personal expectations for "full
support...cooperation...and an aggressive program ...." 

     b.   Public Image.  NASA has high public visibility due to the
nature of its operations.  Human safety during manned space flight
and the success of research and missions in space are highly
dependent on the reliability of supporting computer (including
network) resources and the integrity of automated information. 
Public and Congressional confidence in the Space Program is
directly keyed to the clarity of NASA's commitment to excellence in
all areas.

     c.   Increasing Incidents.  In recent years all Federal
agencies have experienced an increase in international electronic
intrusions and electronic worm/virus penetrations.  These problems
are expected to become more technically complex and more widespread
with advancements in computer and telecommunication technologies. 
Therefore, it has become increasingly important to develop Computer
(and Network) Security Incident Response (CSIR) capabilities to
minimize the effects of such incidents.  See paragraph 309 for
details of such response capabilities.

107  NASA AIS PROGRAM BACKGROUND

     a.   Initial Policy.  In 1979 NASA formally implemented its
AIS Program by defining and promulgating Agencywide policies
regarding the security and integrity of Agency automated systems. 
The main focus was on maintaining continuity of operations and
minimizing the potential for improper use of computing resources. 
These policies were issued in accordance with the Office of
Management and Budget (OMB) Circular A-71, Transmittal Memorandum
No. 1, July 27, 1978, "Security of Federal Automated Information
Systems."  This memorandum required each Federal agency to
establish an automated information security program.

     b.   Initial Handbook.  NASA's basic automated information
security policy was augmented in 1980 with the publication of
extensive guidelines for implementing automated information
security requirements within the Agency.  These guidelines were
published in NHB 2410.1, "Information Processing Resources
Management."  NHB 2410.1 was updated in 1982 and again in 1985. 
NASA operated under its basic policy (circa 1979) until 1988, when
it published NMI 2410.7, "Assuring the Security and Integrity of
NASA Automated Information Resources."  NASA then began
restructuring its AIS Program to bring the Agency into compliance
with the Computer Security Act of 1987 and to keep pace with
technological advances in computing and telecommunication systems.

     c.   Summary of Other Milestones.  The Agency has had an
established AIS Program since 1979; a full-time AIS Program Manager
since 1985; Computer (and Network) Security Awareness and Training
(CSAT) since 1983; and management evaluations of Agency automated
information security activities since 1979.

108  ORIGIN OF NATIONAL POLICY

     a.   National Organizations.  As presented in Exhibit 1-1, the
NASA AIS Program is based on public laws promulgated by Congress. 
The following organizations then issued national policies,
standards, and guidelines:

          (1)  The Department of Commerce (DOC).

          (2)  The National Institute of Standards and Technology
               (NIST).

          (3)  The Office of Management and Budget (OMB).

          (4)  The Office of Personnel Management (OPM).

          (5)  The National Security Agency (NSA).

          (6)  The Department of Defense (DOD).

          (7)  The General Services Administration (GSA).

          (8)  Various Presidential committees on computer and
               telecommunications systems security.

     b.   National Documents.  National policy and guidance
documents include:

          (1)  Computer Security Act of 1987 (PL 100-235).

          (2)  Executive Order 12356.

          (3)  OMB Circular A-130.

          (4)  NIST Federal Information Processing Standards
               (FIPS) Publications.

          (5)  DOD guidance on protecting classified information.

          (6)  NSA guidance on trusted computer systems.

          (7)  OPM Personnel Letter 732.

          (8)  GSA Federal Information Resource Management
               Regulation (FIRMR).

          (9)  GSA Federal Information Processing Management
               Regulation.

          (10) GSA Federal Acquisition Regulation (FAR).

   (EXHIBIT 1-1.  National Policy and Guidance--See hardcopy)



         CHAPTER 2.  PROGRAM ORGANIZATION AND MANAGEMENT

200  INTRODUCTION

     This Chapter covers the NASA AIS Program goal, objectives,
organizational structure, and management.

201  MANAGEMENT PHILOSOPHIES

     a.   Integration.  The NASA AIS Program is designed to provide
appropriate, cost-effective protection for sensitive, classified,
mission critical, life support, and high-dollar-value information
and computer/network resources.  In this regard, NASA has an
extensive AIS Program that is highly integrated into its management
functions through management points-of-contact, intra-agency
working groups, councils, and committees.  These management and
coordinating bodies range from a senior management Security
Coordinating Committee (SCC) and Information Resources Management
(IRM) Council to PO-AISM's, C-AISM's, local Data Processing
Installation AIS Officials (DPI-AISO's), and Computer Security
Coordinators (CSC's) at the system level.  A concept of total
systems engineering, Total Quality Management (TQM), and total AIS
integration is applied.

     b.   Decentralization.  Because NASA manages a large number of
diverse computer and network environments nationwide on a highly
decentralized basis, it has likewise taken a decentralized approach
to managing automated information security.  NASA HQ interprets
national policy and guidance and issues general policy and guidance
appropriate for the NASA computer/network environment.  Each
Program Office (PO) is responsible for establishing an automated
information security management function which will ensure the
security, integrity, and continuity of operations for automated
information resources directly related to their program missions. 
Each Center is responsible for establishing and sustaining an
automated information security program that assures each DPI under
its management cognizance complies with automated information
security requirements that are consistent with the DPI's unique
computer/network environment.  Specific protective decisions (e.g.,
cost-effective approaches, benefits to be derived) are made by
management at the IPO, Center, and DPI levels based on risk
assessment activities.  Functional security requirements and
technical security specifications are integrated into appropriate
system life cycle phases and appropriate security-related
responsibilities are included in job descriptions and performance
evaluation criteria.  Compliance is assured through multiple levels
of top-down management and compliance review activities.  

     c.   Perfection.  A state of absolute protection is neither
practical nor desirable in most cases.  The reasons include the
following:

          (1)  Absolute protection would make the Agency's systems
virtually unusable by the research community; under the Space Act
of 1958, the Agency's mandate is to provide the most useful
information to the widest possible audience.

          (2)  Some vulnerabilities may not be known, as in the
technically complex cases where vendor-supplied operating systems
contain security flaws.

          (3)  Computer and network technology is constantly
advancing at a rapid pace.  While these advances create new
opportunities for our scientists and engineers, they also offer new
opportunities for those who wish to commit malicious acts.

          (4)  Protection must be applied in a cost-effective
manner in order to meet Agency responsibilities in its expenditures
of public funds.

202  NASA AIS PROGRAM GOAL AND OBJECTIVES

     a.   Goal.  The goal of the NASA AIS Program is to provide
cost-effective protection that assures high integrity, ready
availability, and confidentiality of NASA automated information
resources.  Thus, the main focus in scientific and engineering
environments is to provide appropriate cost-effective protection
and management emphasis that assures appropriate levels of
information integrity and computing resource availability without
unnecessarily impacting innovative productivity or the advancement
of technology.  In these environments and in the administrative
environments, where the sensitive or classified nature of
information calls for mandatory or discretionary protection from
unauthorized disclosure, additional consideration must be given for
providing cost-effective protection that assures information
confidentiality.

     b.   Objectives.  The objectives of the NASA AIS Program are
to:

          (1)  Protect against deliberate or accidental corruption
of NASA automated information.

          (2)  Protect against deliberate or accidental actions
that cause NASA automated information resources to be unavailable
to users when needed.

          (3)  Ensure that there is no deliberate or accidental
disclosure of NASA sensitive or classified automated information.

203  PROGRAM ELEMENTS

     a.   Basic Elements.  The basic elements of the NASA AIS
Program are illustrated in Exhibit 2-1.  They are to be employed in
appropriate combinations, and at appropriate times throughout a
system's life cycle, to adequately protect sensitive, critical,
valuable, and important NASA automated information resources at
acceptable levels of risk.  The NASA AIS Program covers both
classified and unclassified automated information resources.

        (EXHIBIT 2-1.  NASA Automated Information Security 
                    Program Logo--See hardcopy)

          (1)  AIS Policy/Guidance.  Automated information security
policies and guides are needed to define the overall framework
(including lines of authority, main points-of-contact, range of
responsibilities, requirements, procedures, and management
processes) for implementing and sustaining an efficient and
cost-effective NASA AIS Program.

          (2)  AIS Planning.  Automated information security
planning must provide a consistent and specific approach for
determining short- and long-range management objectives,
documenting accomplishments, developing security enhancement
proposals, mapping proposals to budget requests, and assuring the
implementation of appropriate cost-effective protective measures.

          (3)  Sensitivity and Criticality Identification.  The
automated information resources used to support NASA missions have
various levels of sensitivity and criticality.  These levels need
to be determined, since they are critical to deciding which
protective measures are most appropriate.

          (4)  Risk Management.  NASA managers need to continually
identify and analyze potential threats to NASA's computer/network
environments and reduce risk exposures to acceptable levels.  This
process is called risk management.

          (5)  Protective Measure Baseline.  There are numerous
combinations of technical, physical, administrative, and personnel
protective measures available to NASA managers.  A baseline of
these protective measures needs to be defined/suggested to
facilitate development of acceptable levels of protection for
computing and information resources managed by NASA or
operated/processed in support of NASA missions.

          (6)  Certifications/Recertifications.   Certifications
and recertifications of automated applications document that
current risk levels are acceptable.  They also document the
accountability for the acceptance of residual risks and complete
the evaluation process for protective measures (controls and
validation routines) programmed into automated applications.

          (7)  Multilevel Compliance Assurance Mechanism.
Management and compliance reviews should be periodically conducted
to sustain optimal security levels at all Centers and DPI's.

          (8)  Incident Response.  It is necessary to develop
specific and appropriate responses to the various security
incidents that may occur.  It is also necessary to provide feedback
information to senior management on significant incident
situations.  This information also supports the tracking of
agencywide trends.

          (9)  Continuous CSAT.  Continuous CSAT is necessary to
elevate and sustain management and personnel awareness and provide
specific guidance to personnel who design, implement, use, or
maintain automated information resources.

     b.   Sustaining Program Effectiveness.  After policies and
procedures have been established and initial security management
tasks have been accomplished at the IPO's, Centers, and DPI's, the
ongoing aspects of risk assessment, recertification, computer
security awareness and training, and compliance review activities
must continually refresh local automated information security
programs and keep them alive.  The ongoing aspects of significant
incident reporting and annual submission of automated information
security plans must provide managers at the Center and HQ levels
with sufficient information to continually reassess current program
status and determine future management direction.

204  NASA AIS POLICY

     It is NASA policy that:

     a.   Technical,
     b.   Personnel,
     c.   Administrative,
     d.   Environmental, and
     e.   Access

protective measures be used, alone or in combination, to
cost-effectively provide an appropriate level of protection for
NASA automated information resources, and especially for automated
information.  The rigor of controls must be commensurate with the
sensitivity/criticality level of the information resources to be
protected.  Selection of protective measures for a specific
automated information system environment should be based on an
assessment of risks and the existence of reasonable ratios between
the costs/benefits of proposed protective measures and the
sensitivity, criticality, and/or value of the assets requiring
protection.  Appropriate emphasis should be placed on:

     f.   Automated information,
     g.   Computer hardware, and
     h.   Computer software

to assure that they are appropriately protected from threats that
include unauthorized:

     i.   Access,
     j.   Alteration,
     k.   Destruction,
     l.   Removal (e.g., theft),
     m.   Disclosure,
     n.   Use/abuse, and
     o.   Delays

as a result of improper actions or adverse events.
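
     The policy that control rigor be commensurate with
sensitivity/criticality level can be pictured as a mapping from
level to baseline protective measure categories.  The following
Python sketch is illustrative only; the specific pairings are
assumptions for illustration, not the baseline defined in
paragraph 403.

     # Hypothetical mapping; see paragraph 403 for the actual
     # baseline considerations.
     BASELINE_MEASURES = {
         0: ["administrative"],
         1: ["administrative", "access"],
         2: ["administrative", "access", "technical", "personnel"],
         3: ["administrative", "access", "technical", "personnel",
             "environmental"],
     }

     def required_measures(level):
         """Rigor commensurate with sensitivity/criticality level."""
         return BASELINE_MEASURES[level]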

205  HEADQUARTERS ROLES AND RESPONSIBILITIES

     a.   Overview.  There are many organizations that have roles
and responsibilities related to implementing and managing the NASA
AIS Program, although the Head of each Federal agency has ultimate
responsibility.

     b.   Multidisciplinary Coordination.

          (1)  Management Disciplines.  All traditional management
disciplines and functions must be employed in a coordinated fashion
to effectively manage automated information security.  The reason
for this multidisciplinary situation is that, over the years, NASA
has become more electronically based and dependent on automation
technologies to support all aspects of its operations and missions.

          (2)  Security Disciplines.  As shown in Exhibit 2-2,
there are many security-related disciplines, each with its own set
of policies and procedures.  Each security discipline is almost
always an entirely separate career field throughout the Federal
Government.  Only when all such disciplines are working together,
in a highly coordinated fashion, can the entire security process
function properly and efficiently.  Thus, it is important for
automated information security managers at all levels to regularly
coordinate with other security-related disciplines.

     c.   NASA AIS Program Manager.  As shown in Exhibit 2-3,
NASA's primary authority for managing an Agencywide AIS Program has
been delegated through the Associate Administrator for Management
Systems and Facilities to the Director, Security, Logistics and
Industrial Relations Division.  In addition to the general
requirements of NMI 2410.7:

          (1)  The Director, Security, Logistics and Industrial
Relations Division, shall:

               (a)  Designate a management official knowledgeable
in both computing and automated information security principles and
practices to be the NASA AIS Program Manager; and 

                (b)  Apprise Center Directors, through appropriate
Program Associate Administrators, of program management reviews
conducted in response to the requirements of NMI 2410.7 and this
Handbook and make recommendations for improvements, as appropriate.

  (EXHIBIT 2-2.  Automated Information Security's Relationship 
         with Other Security Disciplines--See hardcopy)

    (EXHIBIT 2-3.  Headquarters Roles and Responsibilities--
                          See hardcopy)

          (2)  The NASA AIS Program Manager shall:

               (a)  Serve as an Agency focal point of coordination
among NASA senior management, HQ Program Offices, Centers, and
external organizations on automated information security matters;
 
                (b)  Develop, and coordinate the implementation
of, Agency plans, policies, procedures, and guidelines related to
the requirements of NMI 2410.7 and this Handbook;

               (c)  Conduct program management reviews of Centers
to assess the sustained effectiveness of Center management
oversight processes that have been implemented at DPI's under
Center management cognizance and make recommendations to the
Director, Security, Logistics and Industrial Relations Division,
for improvement, as appropriate.

               (d)  Coordinate the review and dissemination of
information identifying emerging trends to keep NASA management
informed.

     d.   Headquarters Program Offices.  In addition to the general
requirements of NMI 2410.7, Program Associate Administrators shall:

          (1)  Promulgate Program Office specific policies,
procedures, and guidelines related to the general requirements of
NMI 2410.7 and this Handbook, as deemed appropriate.

          (2)  Designate a management official knowledgeable in
both computing and computer/network security methods and practices
to be the PO-AISM.  The PO-AISM should serve as a focal point to
coordinate Agencywide activities required in NMI 2410.7 and this
Handbook between the HQ AIS Program Manager and cognizant Program
Office organizations.  In cases where multiple organizational
levels or program area applications exist, assistant PO-AISM's
and/or CSC's may be designated to accomplish specific automated
information security responsibilities.

          (3)  Implement and coordinate an appropriate management
oversight process that ensures awareness and compliance with
applicable portions of NMI 2410.7 and this Handbook in cognizant
organizations.

          (4)  Assure that Program Office AIS Coordinators (i.e.,
PO-AISC's) are designated at appropriate Centers.

          (5)  Assure that all NASA and appropriate NASA contractor
computing and telecommunications resources processing NASA
information are identified and included under the management of a
DPI.

          (6)  Assure that, through the contracting officer, all
appropriate contractors are required to comply with applicable
provisions of NMI 2410.7 and this Handbook.

          (7)  Review and concur on exceptions from implementing
specific requirements of this Handbook.

     e.   Other Headquarters Offices.  Other HQ Offices that play
an integral role include:

          (1)  Inspector General, which has independent audit and
               criminal investigation responsibilities.

          (2)  NASA Security Office, which has traditional
               security responsibilities in the areas of personnel
               security, physical security, and national
               (including defense-related) security document and
               operations control.

          (3)  Office of Space Communications, which has
               Agencywide responsibility for telecommunications
               management.

          (4)  Office of Procurement, which is responsible for
               ensuring that appropriate security requirements are
               included in NASA procurement policies, regulations,
               and procedures for acquiring Federal Information
               Processing (FIP) resources.

          (5)  Management Operations Division, which is
               responsible for the NASA Internal Controls Program.

          (6)  Office of Safety and Mission Quality, which has
               responsibilities relating to software engineering
               and automated information resources supporting
               manned space flight operations. 

206  INDIVIDUAL RESPONSIBILITIES FOR AIS

     As illustrated in Exhibit 2-4, the situation discussed in
paragraph 205 dictates that virtually everyone in the organization
who manages, designs, programs, operates, or uses NASA automated
information resources has personal job-related responsibilities
that contribute toward meeting the goal and objectives of the NASA
AIS Program.  The practice of effective automated information
security management principles typically becomes an integral
function of good business and professional practice when it can be
demonstrated that positive benefits can be derived.  For example:

     a.   Appropriately restricting unauthorized access can greatly
          contribute to ensuring information/system integrity,
          availability, and confidentiality.

   (EXHIBIT 2-4.  Who Is Responsible for Automated Information
                    Security?--See hardcopy)

     b.   Systems that are well planned and have passed through a
          quality assurance/certification process are typically
          more efficient and have fewer and less costly maintenance
          problems in operational use.

     c.   Technology that is used in a controlled environment can
          be expected to have greater reliability.

207  PROGRAM ORGANIZATIONAL STRUCTURE

     a.   Overview.  In order to effectively manage the day-to-day
aspects of an automated information security program in a large and
diverse organization like NASA, a network of designated AIS managers
must be established at all levels throughout the organization. 
Exhibit 2-5 illustrates the relationship between NASA-designated
AIS managers and officials at the HQ, Center, and DPI levels.

     b.   Headquarters Level.

          (1)  The NASA AIS Program Manager has a direct working
relationship and communications link with HQ PO-AISM's and C-AISM's
to focus on resolving Agencywide AIS issues.

          (2)  Each PO-AISM has a direct working relationship and
communication link with their program management personnel at NASA
Centers to focus on resolving AIS issues with respect to their
program missions.

          (3)  Three PO-AISM's also have institutional
responsibility for a specific grouping of NASA Centers.  They are
called Institutional Program Office AIS Managers (IPO-AISM's). 
Each IPO-AISM has a direct working relationship and communication
link with the C-AISM's at their assigned Centers to focus on
resolving AIS issues that impact multiple Centers within a specific
grouping.
 
     c.   Center Level.  Each C-AISM has a direct working
relationship and communication link with DPI-AISO's to focus on
resolving AIS issues that impact multiple DPI's.

   (EXHIBIT 2-5.  NASA Automated Information Security Program
                Structure, Part 1--See hardcopy)

208  MANAGEMENT REVIEW AND COMPLIANCE ASSURANCE PROCESS

     Since automated information security compliance levels have an
inherent tendency to degrade with time, management reviews are
necessary to retain a high level of compliance.  Therefore, the
NASA AIS Program will require periodic management reviews at all
levels.

     a.   Headquarters Reviews.  NASA HQ will conduct periodic
management reviews of Centers to evaluate their management and
coordination of AIS programs at DPI's under their management
cognizance.

     b.   Center Reviews.  Centers will conduct periodic
self-assessments and compliance reviews at DPI's under their
management cognizance at least every 1 to 3 years.  Review
activities should be commensurate with the value and
sensitivity/criticality of automated information resources.  Review
activities should also focus on the following four areas:

          (1)  Tracking Systems and Random Checks.  Tracking
systems are needed to monitor implementation of recommendations
from prior review activities (e.g., audits, compliance reviews,
recertifications, risk assessments).  Random checks and tests
ensure that appropriate procedures are actually implemented and
that protective measures do, in fact, reduce identified risk
exposures to acceptable levels.  (An illustrative tracking-record
sketch follows at the end of paragraph 208.)

          (2)  Security Incidents.  Reported security incidents
should be tracked to determine trends, to identify general problem
areas and security needs, and to ensure implementation of
appropriate procedures and protective measures.  The result should
be fewer incidents in the future.

          (3)  Contingency Planning.  Contingency and disaster
recovery plans provide overall protection when other safeguarding
features may have failed.  Such plans should be in place and
periodically tested.  For the most sensitive and critical systems,
contingency and disaster plan testing must be conducted annually.
  
          (4)  CSAT.  Managers should ensure that continuous CSAT
is conducted at DPI's, as appropriate. 

     c.   DPI Reviews.  Each DPI will conduct ongoing
self-assessment review activities to include CSAT, risk
assessments, and recertification reviews.
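
     The following Python sketch is illustrative only.  It shows
one hypothetical form a tracking record (see subparagraph b(1))
might take for monitoring recommendations from prior review
activities; all field names and status values are assumptions.

     from dataclasses import dataclass

     @dataclass
     class Recommendation:
         source: str       # e.g., "audit", "compliance review"
         dpi: str          # DPI identifier
         description: str
         due_date: str     # ISO date, e.g., "1993-06-01"
         status: str = "open"  # "open", "implemented", "verified"

     def overdue(recommendations, today):
         """Recommendations past due and not verified by checks."""
         return [r for r in recommendations
                 if r.status != "verified" and r.due_date < today]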



             CHAPTER 3.  CENTER AND DPI REQUIREMENTS

300  CENTER REQUIREMENTS

     a.   Designation of Authorities.  In addition to the general
requirements of NMI 2410.7, Center Directors shall:

          (1)  Promulgate specific Center policies, procedures, and
guidelines related to the general requirements of NMI 2410.7 and
this Handbook, as deemed appropriate; and

          (2)  Designate, in writing, a management official
knowledgeable in both computing and automated information security
methods and practices to be the C-AISM.

     b.   C-AISM Responsibilities.  The C-AISM shall serve as a
focal point to manage a program that is responsive to the Center
Director and coordinate activities required in NMI 2410.7 and this
Handbook between the HQ AIS Program Manager and cognizant DPI's. 
In cases where multiple DPI's exist, Assistant C-AISM's may be
designated to accomplish specific Center automated information
security responsibilities.  The C-AISM responsibilities include:

          (1)  Implementing and coordinating an appropriate,
Center-wide management oversight process that ensures awareness and
compliance with the Center automated information security program.

          (2)  Assuring that each NASA, and appropriate NASA
contractor, DPI under Center management cognizance develops,
implements, and sustains an effective automated information
security program that ensures awareness and compliance at the DPI
level.

          (3)  Scheduling and conducting periodic compliance
reviews at cognizant DPI's to assess the adequacy of security
plans and the sustained effectiveness of each DPI's automated
information security procedures and program, and to make
recommendations for improvement, as appropriate.  Compliance
reviews should be conducted every 1 to 3 years based on the
reviewing management's judgment.  Factors to be considered in
making this decision
include the reviewing management's perception of the sensitivity,
criticality, and/or value of the computing and information assets
to be protected at each DPI.

          (4)  Assuring that procedures are implemented for
identifying automated information security incidents that occur at
DPI's under their management cognizance.  These procedures shall
ensure that significant automated information security incidents
are reported to their IPO-AISM and the HQ AIS Program Manager
immediately following detection and that significant incident
information received from HQ is disseminated to cognizant DPI's. 
(See paragraph 309 for a description of this procedure.)
 
          (5)  Assuring that, through the contracting officer, all
appropriate contractors comply with applicable provisions of NMI
2410.7, this Handbook, and Center automated information security
directives.

          (6)  Coordinating all functional security requirements
with organizations/individuals having procurement, training, or
security-related responsibilities (e.g., those having
responsibilities in personnel security, physical security, national
(including defense-related) security, telecommunications security,
information security, internal management control, auditing,
quality assurance/control, administrative security, emissions
security, and operations security).

          (7)  Implementing procedures to ensure that all imported
software is reasonably safe for its intended environment and that
all software has been properly registered and/or licensed prior to
implementation.  This includes software developed for microcomputer
platforms, mainframe computers, communications systems, networks,
or other automated information systems.

     c.   Identifying DPI's.  As illustrated in Exhibits 2-4 and
3-1, the focus of implementing technical AIS requirements begins at
the Center and DPI levels.  Center management is to ensure that all
NASA and appropriate NASA contractor computing and
telecommunication resources processing NASA information are
identified and included under the management of a DPI.   A DPI is
established by drawing an imaginary boundary around a logical
grouping of information, computing, and telecommunications
resources for the purpose of managing those resources as an
identifiable entity.  C-AISM's are responsible for assuring that
DPI's have been identified.  This is accomplished by negotiating
with organizations under the cognizance of Center management to
determine the most logical approach.  For example, DPI's might
represent logical groupings of information, computing, and
telecommunications resources within the boundaries of:

          (1)  A physical structure at a geographic location
               (e.g., an entire building or a central computing
               facility).

          (2)  An organizational structure (e.g., HQ or Center
organizational codes).

          (3)  A combination of these approaches.

           (EXHIBIT 3-1.  NASA Automated Information 
        Security Program Structure, Part 2--See hardcopy)

The grouping of automated information resources for automated
information security requirements should be consistent, if
possible, with that used for DPI's as defined in the Information
Technology System Plan (ITSP) submitted annually by each NASA
Center.

     d.   Identifying Additional Entities.  DPI-AISO's are
responsible for determining if identification of additional
entities under the DPI is needed to more effectively manage and
coordinate aspects of the DPI automated information security
program.  These entities could represent logical groupings of
information, computing, and telecommunications resources associated
with sub-elements of the DPI organization, major hardware or
software configurations, or clusters of microcomputers.

301  DPI REQUIREMENTS

     a.   Designation of Authorities.  Each NASA (or appropriate
NASA contractor) manager in charge of a NASA Center shall assure
that a management official, knowledgeable in both computing and
automated information security management methods and practices, is
designated as the DPI-AISO.  Day-to-day security responsibilities
may be delegated to technical support personnel.  In cases where
multiple computer/network systems or program area applications
exist, DPI-AISC's may be designated to accomplish specific
automated information security responsibilities.

     b.   DPI-AISO Responsibilities.  (See Exhibit 3-2.)  The
DPI-AISO in coordination with the appropriate C-AISM shall:

                   (EXHIBIT 3-2--See hardcopy)

          (1)  Implement and administer a management process
appropriate to the DPI environment to ensure that sensitivity
and/or criticality of information is determined by the application
sponsors/owners and that appropriate administrative, technical,
physical, and personnel protective measures are incorporated into
all new and existing computer systems and applications processing
sensitive or mission-critical information to achieve and sustain an
acceptable level of security.  (See paragraph 302 for a description
of this management process.)

          (2)  Formulate, continually update, and annually review
a DPI automated information security plan that will allow the
appropriate approving (i.e., DPI) or reviewing (e.g., Center and HQ
Program Office) authorities to judge the comprehensiveness and
effectiveness of the DPI automated information security program. 
In cases where multiple DPI's, computer systems, or program area
applications exist, multiple plans may be appropriate.  The
planning process may also be integrated into Center-level planning
activities as deemed appropriate by the C-AISM.  (See paragraph 503
for a description of the required contents of a DPI automated
information security plan.)

          (3)  Develop and implement protective measures designed
to prevent misuse and abuse of computing resources.  (See paragraph
304 for a description of these protective measures.)

          (4)  Develop and implement a process, as appropriate, for
providing contingency planning and reasonable continuity of
operations for computer systems and computer applications
supporting mission-critical functions.  (See paragraph 308 for a
description of this process.)

          (5)  Develop and implement procedures in coordination
with the C-AISM for identifying automated information security
incidents and reporting significant automated information security
incidents, as described in paragraph 309.

          (6)  Assure that plans are developed and implemented for
conducting continuous CSAT to ensure that NASA and appropriate NASA
contractor personnel involved in managing, designing, developing,
or maintaining computer applications or who use computer systems,
are aware of their security responsibilities and know how to
fulfill them.  This includes being kept aware of vulnerabilities
and being trained in techniques to enhance security.  (See
paragraph 310.)

          (7)  Coordinate all functional security requirements with
organizations/individuals having procurement, training, or
security-related responsibilities, e.g., those having
responsibilities in personnel security, physical security, national
(including defense-related) security, telecommunications security,
information security, internal management control, auditing,
quality assurance, administrative security, emissions security, and
operations security.

302  MANAGEMENT PROCESS

     The management process must ensure that the following, as a
minimum, are carried out (see Exhibit 3-3):

                   (EXHIBIT 3-3--See hardcopy)

     a.   Risk Assessments.  Periodic risk assessments must be
conducted for new and existing DPI's to assure that appropriate,
cost-effective protective measures are incorporated and are
commensurate with the sensitivity, criticality, and value of
associated computer systems, computer applications, and information
processed.  (See paragraph 303 for a description of the risk
assessment process.)

     b.   Certifying Requirements.  Procedures must be established
for defining functional security requirements, developing technical
security specifications, conducting security design reviews and
system tests, certifying and recertifying computer applications at
appropriate phases of the system life cycle, and approving
technical security specifications for the acquisition of
computer/network resources or related services.  (See paragraph 305
for minimum functional security requirements and certification
procedures.)

     c.   Personnel Screening.  Personnel who participate in
managing, designing, developing, operating, or maintaining computer
applications processing sensitive or mission-critical information,
or who access automated sensitive or mission-critical information,
must be appropriately screened to a level commensurate with the
sensitivity, criticality, or value of the information to be
accessed or handled and the risk and magnitude of loss or harm that
could be caused by the individual.  Federal personnel are to be
screened in accordance with the Federal Personnel Manual, Section
732 and NHB 1620.3,  "NASA Security Handbook."  Guidance on
screening non-Federal personnel is presented in paragraph 306.

     d.   Access Protection and Accountability.  Appropriate
protective measures should be established, to the extent
economically and technically feasible, for maintaining personal
accountability of individual users granted access to sensitive or
mission-critical information on multi-user systems and for ensuring
that they have access to no more information than they are
authorized to access.  Access to public automated library systems
must be:

          (1)  Physically or logically isolated from users with
higher security (i.e., integrity, availability, or confidentiality)
requirements;

          (2)  Protected by read-only access; and

          (3)  Protected by current backup copies of all files.
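
     For illustration only, the following Python sketch shows one
way to enforce the read-only requirement of subparagraph (2) at the
file-system level.  The directory path is hypothetical, and the
isolation and backup requirements of subparagraphs (1) and (3)
require separate measures.

        import os
        import stat

        LIBRARY_DIR = "/library/public"   # hypothetical library area

        def make_read_only(directory):
            """Mark every file in the public library area
            world-readable and writable by no one."""
            for root, _dirs, files in os.walk(directory):
                for name in files:
                    path = os.path.join(root, name)
                    os.chmod(path,
                             stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)

        # make_read_only(LIBRARY_DIR)   # run against an existing path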

     e.   Compliance Assurance.  Followup procedures must be in
place to assure implementation of protective measures in accordance
with recommendations from compliance review and certification and
recertification activities.

     f.   Contingency and Disaster Recovery Plans.  Appropriate
disaster recovery plans and contingency plans must be established
and maintained to prevent loss of information, minimize
interruption, and provide reasonable continuity of computer and
network services should adverse events occur that would prevent
normal operations.

     g.   Approval of Methodologies.  Automated information
security plans, risk assessments, and security certification
methodologies shall be approved by appropriate management officials
in sponsoring organizations.

303  RISK ASSESSMENT PROCESS

     a.   Purpose.  NASA recognizes the importance of conducting
risk assessments as a basis for making informed management
decisions related to accepting identified risk exposures or
implementing appropriate cost-effective protective measures to
reduce risk exposures to acceptable levels.  When used
appropriately, risk assessment is a very effective management tool
that provides a systematic approach for:

          (1)  Determining the relative value, sensitivity, and
               criticality of DPI information and computing
               resources.

          (2)  Assessing potential threats and perceived risk
               exposure levels.

          (3)  Identifying existing protective measures.

          (4)  Identifying and assessing additional protective
               alternatives.

          (5)  Determining acceptability of identified risk
               levels.

          (6)  Documenting the assessment process and resulting
               management decisions.

     b.   Scope.  Risk assessments may vary from an informal review
of a small-scale microcomputer installation to a formal, fully
documented analysis (i.e., risk analysis) of a large-scale computer
installation.  Since risk assessments can involve many disciplines
and organizations, a team approach is recommended, regardless of
the size of the systems being analyzed.  A tremendous amount of
time and effort can be saved by bringing together the right people
to review concerns and make subjective judgements based on
professional knowledge and experience.  In addition to standard
risk assessments, Security Testing may be appropriate for some
systems.  Security Testing is discussed in paragraph 303d.

     c.   Minimum Requirements.  DPI's will continue to be given
flexibility for selecting methodologies and implementing risk
assessment programs that are most appropriate for their
computer/network environments. However, the risk assessment process
must ensure, at a minimum, the following (see Exhibit 3-4):

                   (EXHIBIT 3-4--See hardcopy)

          (1)  A risk assessment methodology is selected (i.e.,
qualitative, quantitative, or a combination of both) that includes
the following elements and logical steps, as appropriate:
               
               (a)  Determination of Risk Assessment Scope.  For
example, a risk assessment may consider an entire DPI, including
all hardware, software, and telecommunication aspects, or may be
limited to an assessment of an individual mainframe or
microcomputer system.  Regardless of the approach, the scope of the
risk assessment should be planned and maintained within manageable
limits and the level of effort should be commensurate with the
nature of the DPI being assessed.  For example, a risk assessment
of a stand-alone microcomputer installation could be done
informally by the owner of the information.

               (b)  Asset Identification and Value.  Identification
of major DPI assets and general approximations of their current
replacement value in order to establish a basis for making informed
decisions on protective measures as described in subparagraph (g). 
For example, if it is known that the approximate value of computing
resources within the scope of the risk assessment is about $1
million, it may make sense to spend several hundred dollars or
several thousand dollars to enhance protective measures.

               (c)  Determination of Potential Impacts.  General
determination of collective sensitivity, criticality, and/or value
of information processed or stored at the DPI and potential impacts
if information is misused, altered, destroyed, or disclosed.  This
determination should be based on an analysis of individual
functional security requirements (which are prepared by
sponsors/owners) of computer applications processed.

               (d)  Identification of existing protective measures
(i.e., those already in place).  Documentation of specific
vulnerabilities and protection measures should be marked "FOR
OFFICIAL USE ONLY" and be adequately protected.  (See NHB 1620.3,
"NASA Security Handbook," Section 1208.)

               (e)  Identification of existing and potential
threats and hazards and qualitative estimates of risk exposure
and/or quantitative calculations; for example, Annual Loss
Expectancy (ALE) associated with potential adverse events.  (An
illustrative ALE calculation is sketched following this list.)

               (f)  Determination of acceptable risk exposures,
and/or determination of alternative protective measures, associated
benefits, and associated costs needed to reduce identified risk
exposures and/or ALE to acceptable levels.

               (g)  Recommendations for accepting risk exposures
and/or ALE's, or recommendations for additional appropriate
protective measures that are needed to improve security (reducing
risk exposure and/or ALE) based on an analysis of the ratio between
the estimated cost/benefit of proposed protective measures and the
value/sensitivity of information/computing resources requiring
protection.  The cost of protective measures should not normally
exceed a reasonable percentage of the value of assets requiring
protection (as identified in subparagraphs (b) and (c)).

               (h)  Documentation of actions taken or planned as a
result of the risk assessment findings and recommendations.

               (i)  Followup procedures to assure that all actions
planned have been carried out.
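
     For illustration only, the following Python sketch shows the
quantitative arithmetic referred to in subparagraphs (e) through
(g), using the conventional Annual Loss Expectancy formula (ALE =
asset value x exposure factor x expected occurrences per year).
All asset values, factors, and rates shown are hypothetical.

        def annual_loss_expectancy(asset_value, exposure_factor,
                                   annual_rate):
            """Single loss expectancy (asset value x exposure
            factor) times the expected occurrences per year."""
            return asset_value * exposure_factor * annual_rate

        # Hypothetical DPI asset: a $1,000,000 computing facility.
        ale_fire = annual_loss_expectancy(1000000, 0.50, 1.0 / 100)

        # A protective measure is attractive when its annualized
        # cost is less than the reduction in ALE it produces.
        measure_cost = 2000   # hypothetical cost of fire suppression
        ale_fire_after = annual_loss_expectancy(1000000, 0.10,
                                                1.0 / 100)
        if measure_cost < ale_fire - ale_fire_after:
            print("Recommend measure: net annual benefit = $%d"
                  % (ale_fire - ale_fire_after - measure_cost))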

          (2)  Risk assessments are performed:

               (a)  Prior to construction or operational use of a
new DPI.

               (b)  Whenever there is a significant change to the
existing DPI.

               (c)  At periodic intervals, established by the
DPI-AISO, that are commensurate with the sensitivity or criticality
of the information processed by the DPI, but not to exceed 5 years
if no risk assessment has been performed during that time.
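
     For illustration only, the following Python sketch combines
the three triggers of subparagraphs (2)(a) through (2)(c) into a
single check.  The 5-year ceiling is taken from this Handbook; the
shorter DPI-specific interval shown is hypothetical.

        from datetime import date, timedelta

        MAX_INTERVAL = timedelta(days=5 * 365)   # not to exceed 5 years

        def assessment_due(last_assessed, dpi_interval,
                           new_dpi, significant_change):
            if new_dpi or significant_change:    # triggers (a), (b)
                return True
            if last_assessed is None:            # never assessed
                return True
            elapsed = date.today() - last_assessed
            return elapsed >= min(dpi_interval, MAX_INTERVAL)  # (c)

        # Hypothetical DPI assessed 1990-06-01 on a 3-year cycle.
        print(assessment_due(date(1990, 6, 1),
                             timedelta(days=3 * 365),
                             new_dpi=False, significant_change=False))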

          (3)  The selected risk assessment methodologies and
results are approved by appropriate management officials at the
Center and DPI levels and taken into consideration when certifying
or recertifying computer applications.

          (4)  Risk assessment results are available for
consideration during the evaluation of internal controls, conducted
in accordance with NMI 1200.7, "NASA's Internal Management Control
System," that apply to DPI's or computer applications processing
sensitive or mission-critical information.

     d.   Security Testing.  In some instances, additional Security
Testing may be warranted. 

          (1)  Purpose.  The purpose of Security Testing is to
identify AIS weaknesses and, when appropriate, to recommend actions
to correct identified weaknesses.

          (2)  Procedure.  Sensitive/critical automated information
systems that are under development should be reviewed to ensure
that risk assessments and other security tests are scheduled and
conducted within the proposed general support environment.  System
tests are required before sensitive/critical automated information
systems are certified and placed into operational use.

          (3)  Minimum Security Testing Requirements.  Security
Testing must ensure that the following minimum requirements are
met (an illustrative check is sketched following this list):

               (a)  System vulnerabilities and flaws are identified
and evaluated for all systems within the scope of the Security
Tests.

               (b)  Operating level Security Testing for selected
systems should cover flaws known to be in the hardware, operating
system software, and, when practical, application software.

               (c)  Vendor supplied user identifiers and passwords
are changed from the initial installation defaults.

               (d)  Privileged user access and password files are
adequately protected.

               (e)  Field service/maintenance user identifiers are
kept inactive when not in use.

               (f)  Remote diagnostic facility connections are
secured when not in use.

               (g)  Network router source and destination
authorizations and restrictions are functioning as expected.
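
     For illustration only, the following Python sketch shows a
minimal check, during Security Testing, for vendor-supplied
identifiers still set to installation defaults (subparagraph (c)).
The account names and default passwords shown are hypothetical and
are not an authoritative list.

        # Hypothetical vendor-supplied identifiers and their known
        # installation-default passwords.
        KNOWN_DEFAULTS = {"field": "service", "system": "manager",
                          "guest": "guest"}

        def check_defaults(accounts):
            """accounts maps user identifiers to current passwords,
            as recovered by the test team; returns offenders."""
            return [user for user, password in accounts.items()
                    if KNOWN_DEFAULTS.get(user) == password]

        for user in check_defaults({"field": "service",
                                    "jsmith": "x7kq"}):
            print("Installation default still in place for:", user)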

          (4)  Detection of Weaknesses.  If weaknesses or flaws in
the operating system or other components are detected:

               (a)  Corrective action should be taken as soon as
possible.

               (b)  System AIS plans and risk assessment
documentation should be updated.   Updates should indicate
corrective actions taken and/or scheduled.

          (5)  No Guarantee.  It is important to understand that
Security Testing or other system level evaluations do not guarantee
that AIS problems will not be encountered in the future.  Such
tests can only give management and users a greater level of
confidence in a particular automated information system.  It is
vital that these evaluations be conducted in an open and positive
manner that keeps DPI managers and systems developers informed of
all planned Security Testing activities.  

304  PROTECTIVE MEASURES TO PREVENT MISUSE AND ABUSE

     In addition to other appropriate protective measures (such as
those covered in Chapter 4), protective measures to prevent misuse
and abuse of computing resources should include the following (see
Exhibit 3-5):

                   (EXHIBIT 3-5--See hardcopy)

     a.   Developing and implementing a procedure, where
technically and economically feasible, to maintain automated
computer system logs of access to multi-user computer systems to
determine whether unauthorized accesses are being attempted.  In
addition, more detailed system and network security monitoring may
be appropriate provided the legal requirements (such as, warning
banners upon login) have been met.
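
     For illustration only, the following Python sketch scans a
simple access log for repeated failed logins, one way of
determining whether unauthorized accesses are being attempted.  The
log format (one "user status" pair per line) and the threshold are
hypothetical.

        from collections import Counter

        THRESHOLD = 3   # hypothetical: flag 3 or more failures

        def suspicious_users(log_lines):
            failures = Counter()
            for line in log_lines:
                user, status = line.split()
                if status == "FAILED":
                    failures[user] += 1
            return [user for user, n in failures.items()
                    if n >= THRESHOLD]

        sample = ["jdoe FAILED", "jdoe FAILED", "jdoe FAILED",
                  "asmith OK"]
        print(suspicious_users(sample))   # ['jdoe']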

     b.   When there is "probable cause" for concern to NASA
management and prior coordination with appropriate legal and
criminal investigative officials has taken place, reviewing the
contents of computer system files at unannounced intervals by means
of random sampling.

     c.   Developing and implementing awareness procedures
requiring all personnel who access computer and network systems to
have a working knowledge of automated information security ethics,
responsibilities, policies, and procedures.

     d.   Ensuring that all actions constituting suspected or
confirmed significant automated information security incidents are
brought to the immediate attention of the appropriate DPI-AISO and
C-AISM; that the extent and cause of any incident is determined;
and that reasonable steps are taken to minimize the probability of
further incidents including additional training, counseling,
disciplinary action, and/or notifying criminal investigative and
law enforcement authorities, as appropriate.

305  CERTIFICATION PROCESS

     Certification is required to provide reasonable assurance that
a proposed or significantly changed computer application meets all
applicable requirements and the original design specifications and
that installed protective measures are adequate and functioning
properly prior to operational use.  The certification process
should involve all those individuals who have participated (or will
participate) in the sponsoring, planning, designing, programming,
operation, and/or use of the application.  Since this process could
involve many organizations and individuals, a team approach is
recommended.  A representative from each functional area should be
responsible for witnessing the system test and signing off on his
or her area of responsibility.  The primary responsibility for
accomplishing certification tasks in coordination with the DPI-AISO
should reside within the sponsoring/data owner organization.  The
DPI-AISO of the Installation in which the application will be
processed should assure that the following process has taken place
prior to operational use:

     a.   New or Modified Applications.  For new or significantly
changed computer applications that process level 2 or level 3 (as
defined in Chapter 4) sensitive or mission-critical information,
assure that (see Exhibit 3-6):

                   (EXHIBIT 3-6--See hardcopy)

          (1)  Functional security requirements are defined by the
system and/or information sponsors/owners based on established
installation procedures that include the following (a structured
summary of these determinations is sketched after this list):

               (a)  Identifying and determining the nature of the
sensitivity and/or criticality of information to be processed as
discussed in Chapter 4 of this Handbook, and determining how the
information may be vulnerable to potential threats (e.g., misuse,
alteration, destruction, or disclosure);

               (b)  Determining primary and secondary system
security concerns (i.e., integrity, availability, confidentiality);

               (c)  Determining potential impacts if sensitive or
mission-critical information is misused, altered, destroyed, or
disclosed (e.g., embarrassment, legal liability, missed research
opportunity, and lost dollars);

               (d)  Determining when an application that supports
a mission-critical function must be back in operation after an
interruption to avoid adversely affecting the mission of the user
or the sponsoring/owner organization; and

               (e)  Determining general approximation of
replacement values associated with the application/information;
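
     For illustration only, the following Python sketch records the
determinations (a) through (e) above as a single structured record;
the field names and example values are hypothetical.

        from dataclasses import dataclass

        @dataclass
        class FunctionalSecurityRequirements:
            sensitivity_level: int     # per Chapter 4 (levels 0-3)
            primary_concern: str       # (b) integrity/availability/
            secondary_concern: str     #     confidentiality
            potential_impact: str      # (c) consequence of misuse, etc.
            max_outage_hours: float    # (d) tolerable interruption
            replacement_value: float   # (e) approximate dollars

        example = FunctionalSecurityRequirements(
            sensitivity_level=2, primary_concern="integrity",
            secondary_concern="availability",
            potential_impact="missed research opportunity",
            max_outage_hours=48.0, replacement_value=250000.0)
        print(example)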

          (2)  System designers develop technical security
specifications that detail functional security requirements and
describe how specific protective techniques will be employed. 
These specifications should be described in technical terms that
system developers and programmers can implement;

          (3)  Functional security requirements and technical
security specifications are reviewed and approved prior to
acquiring or starting formal development;

          (4)  Results of risk assessments performed at the DPI in
which the computer application will be processed are taken into
consideration when defining and approving technical security
specifications for computer applications;

          (5)  Security design reviews and system tests are
conducted and approved prior to operational use of computer
applications; and

          (6)  Upon successful completion of the system test, the
computer application is certified prior to operational use as
meeting requirements of documented and approved functional security
requirements, technical security specifications, and related DPI
procedures, and that results of the system test demonstrate that
the application, computer system, and DPI protective measures are
adequate and functioning properly.

     b.   Recertifications.  For operational applications, assure
that:

          (1)  Periodic reviews are conducted and recertifications
are made of the continued adequacy and proper functioning of
protective measures;

          (2)  The recertification process takes into consideration
all available information (e.g., other reviews and audits that may
have been conducted subsequent to the last certification); and

          (3)  Recertifications are conducted at least every 3
years, as appropriate.  Time intervals should be commensurate with
the sensitivity/criticality of the information processed.  If no
significant change has taken place and no deficiencies have been
identified in other review activities, the recertification process
may be less stringent than the initial certification process.

306  MINIMUM LEVEL OF SCREENING FOR NON-FEDERAL ADP PERSONNEL

     Individuals participating in the management, design,
development, operation, or maintenance of automated information
assets and associated system protective measures shall receive, at
a minimum, a National Agency Check.  NHB 1620.3, "NASA Security
Handbook," will provide procedures and information on additional
personnel screening actions that may be required in moderate or
high risk situations for level 2 and level 3 automated systems.

307  CONTROLLING ACCESS BY FOREIGN NATIONALS 

     a.   Introduction.  The NASA Office of Policy Coordination and
International Relations has primary responsibilities for
establishing NASA policy on controlling access to NASA facilities
by foreign nationals (i.e., all people who are not citizens or
nationals of the United States).  Refer to NMI 1371.3,
"Coordination and Authorization of Foreign Visits to NASA
Facilities," for additional information.  

     b.   Purpose.  The following policy and procedure has been
developed from the perspective of the NASA AIS Program by the NASA
Security Office in coordination with the Office of Policy
Coordination and International Relations.  It sets forth NASA
guidelines on controlling electronic access by foreign nationals to
NASA computer systems that process sensitive or mission-critical
information.

     c.   Categories.  There are two basic categories of foreign
nationals that seek access to NASA automated information systems:
(1) those hired by contractors to perform work tasks in the normal
course of business; and (2) those who seek access pursuant to
international partnership agreements to conduct work on major
multinational projects (e.g., Space Station).  Foreign nationals,
under category (1) above, that are hired by contractors in the
normal course of business, need to be investigated and managed much
like other contractor employees.  Foreign nationals, under category
(2) above, need to be investigated and managed in a manner
consistent with requirements that are negotiated into international
partnership agreements.

     d.   Sponsors.  Requests for foreign national access to NASA
computer systems must be sponsored by NASA, another appropriate
U.S. Government agency, or a contractor organization.

     e.   Submission/Approval of Requests.  C-AISM's shall ensure
that appropriate procedures are in place to evaluate requests for
foreign national access.  Requests for foreign national access to
NASA computer systems are to be submitted through the appropriate
DPI-AISO (i.e., at the Installation where primary access will
occur) to the Center Security Office for appropriate investigation
and approval.  Requests for access by foreign nationals from
designated areas will be reviewed on a case-by-case basis.  Refer
to Exhibit 3-7 for additional guidance.

     f.   Exceptions.  Requests for foreign national access which
present unusual concerns for a Center's Security Office should be
coordinated with the NASA Security Office, appropriate HQ Program
Office, and the International Relations Division (Code IP) for
further analysis and concurrence.  

    (EXHIBIT 3-7.  Minimum Information for Foreign National 
                 Access Requests--See hardcopy)

308  CONTINGENCY AND DISASTER RECOVERY PLANNING

     a.   Definitions.  Disaster recovery plans for DPI's and
contingency plans for computer applications shall provide for
minimizing interruptions and reasonable continuity of services if
adverse events occur that prevent normal operations.  These
planning activities may be integrated with each other or other
planning activities at the discretion of the C-AISM.

          (1)  Disaster Recovery Plan.  Disaster recovery plans are
documents containing procedures for emergency response, extended
backup operations, and post-disaster recovery should a DPI
experience a partial or total loss of computer and network
resources and physical facilities.  The primary objectives of these
plans, in conjunction with computer application contingency plans,
are to provide a reasonable assurance that a DPI can recover from
such incidents, continue to process mission-critical applications
in a degraded mode (i.e., as a minimum, process computer
applications previously identified as most critical), and return to
a normal mode of operation within a reasonable time.  Such plans
are a protective measure generally applied based on assessments of
other protective measures already in place, potential risk
exposures, cost and benefits to be derived, and feasibility of
implementation.

          (2)  Contingency Plans.  Contingency plans describe
procedures and identify personnel necessary to respond to abnormal
situations, and ensure that computer application sponsors/owners
can continue to process important applications in the event that
computer support at the primary DPI is interrupted (e.g.,
appropriate automated and/or manual backup capabilities should be
considered).  These plans are developed in conjunction with
computer application or data sponsors/owners and maintained at the
primary and backup data processing installation.

     b.   Plan Content.  Contingency and disaster recovery plans
for a DPI should include: 

          (1)  Identifying which applications support
mission-critical functions.  This information should be derived
from functional security requirements developed by owners/sponsors.

          (2)  Potential impacts to the DPI should unnecessary
processing delays occur.

          (3)  When applications must be back in operation after an
interruption to avoid adversely affecting the critical missions of
the users or the sponsoring/owner organizations.  This information
should be derived from functional security requirements developed
by owners/sponsors.

          (4)  The relative criticality of each application to the
overall mission of the local organization, the Center, HQ Program
Office, or the Federal agency, and establishing priorities to
restore processing support in a logical fashion after an
interruption.  The relative ranking of applications should be based
on recommendations from sponsor/owner organizations and approved by
DPI management.  (A ranking sketch follows this list.)

          (5)  The appropriate amount of documentation.  The amount
of documentation detailed in these plans should be commensurate
with the nature of the DPI (e.g., documented in more detail for
large complex DPI's supporting multi-user computer systems and
documented in less detail for small DPI's supporting single-user
computer systems).

          (6)  Test intervals and providing reasonable assurance
that recovery requirements can be met.  

               (a)  Contingency plans for new applications should
be operationally tested at the supporting DPI during initial system
tests and at time intervals commensurate with the associated risk
of harm or loss that could be experienced.  It is the sponsor/owner
organization's responsibility to ensure that a DPI can meet
specified functional security requirements.  This includes
identifying and considering alternative processing DPI's or
providing additional funding to enhance protective measures at the
supporting DPI. 
 
               (b)  Disaster recovery plans should be tested at
least annually using a cost-effective and reasonable approach.  For
example, a limited test based on sample test data from the most
critical applications normally provides meaningful results.

               (c)  Formal written agreements should be established
to ensure that sufficient processing capacity and time will be
available to meet the recovery requirements of computer
applications when backup processing at an alternate DPI is
considered necessary.

          (7)  Identifying key individuals and developing proper
emergency notification and response procedures. 
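
     For illustration only, the following Python sketch ranks
applications for restoration after an interruption, as described in
subparagraph b(4); the criticality scores and recovery deadlines
are hypothetical sponsor/owner inputs.

        applications = [
            # (name, criticality 1=highest, recovery time, hours)
            ("telemetry-processing", 1, 4),
            ("payroll", 2, 72),
            ("library-catalog", 3, 240),
        ]

        # Restore the most critical, most time-constrained work first.
        for name, rank, hours in sorted(applications,
                                        key=lambda a: (a[1], a[2])):
            print("restore %s (criticality %d) within %d hours"
                  % (name, rank, hours))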

309  NETWORK AND COMPUTER SECURITY INCIDENT RESPONSE (CSIR)
     CAPABILITY

     a.   Responsibilities.  

          (1)  Each Center is responsible for establishing a
Computer (and Network) Security Incident Response (CSIR)
capability, which is integrated with the Center's Technical Help
Desk facility to provide coverage for local computer systems and
local area networks.  The C-AISM will serve as the primary
management point-of-contact and further designate technical support
individuals to serve as technical points-of-contact.  C-AISM's will
maintain a listing of all management and technical
points-of-contact at DPI's under their management cognizance.

          (2)  HQ Program Offices are responsible for establishing
a network security incident response capability for each major
network or communications backbone they manage.  The Network
Security Manager (NSM) shall:

               (a)  Serve as the primary management point of
                    contact,

               (b)  Further designate technical support points of
                    contact, and

               (c)  Maintain a listing of all contacts.
 
     b.   Objectives.  This procedure has been developed as a
method for timely reporting of significant computer and network
security incidents, for determining the type of information to be
reported, and for appropriate follow-on activities after initial
notification of an incident.  Reports of significant computer and
network security incidents will be used to alert NASA and
appropriate NASA contractor DPI's to computer system
vulnerabilities, unauthorized access to computer systems, and other
problems which could adversely affect any NASA or appropriate NASA
contractor site.  Sharing incident information can result in new
vulnerabilities being identified, computer security awareness being
elevated, and risk exposures being minimized.  The timely reporting
of significant computer security incidents also serves to alert
NASA management to situations that might affect flight readiness or
receive adverse public attention.

     c.   Procedure Elements.  This procedure provides necessary
steps for reporting significant computer and network security
incidents at DPI's that have implemented (or are in the process of
implementing) the NASA AIS Program.  Use of this procedure should
be compatible with incident and emergency response and reporting
procedures that may already be in place.

          (1)  Immediately after detection of a significant
computer or network security incident (i.e., an incident that could
affect other DPI's under the cognizance of the same Center), the
DPI-AISO must notify the appropriate C-AISM.  If it is determined
that the incident could affect other NASA or NASA contractor
installations under the cognizance of the Center, an immediate
notification must be sent to all appropriate technical
points-of-contact.  The ultimate objective of this initial notice
is to alert other DPI's to potential problems that may impact them. 
The initial notice should provide the following (a minimal
structured form is sketched after this list):

               (a)  A general description of what has occurred.

               (b)  If appropriate, characterization of the
                    perpetrator(s) thought to be involved (i.e.,
                    insider, outsider).

               (c)  What corrective actions are recommended, have
                    been taken, or are planned.
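
     For illustration only, the following Python sketch gives a
minimal structured form for the initial notice items (a) through
(c); all field names and example values are hypothetical.

        from dataclasses import dataclass, field

        @dataclass
        class InitialIncidentNotice:
            description: str                # (a) what has occurred
            perpetrator: str = "unknown"    # (b) insider/outsider
            corrective_actions: list = field(default_factory=list)

        notice = InitialIncidentNotice(
            description="repeated privileged-login failures on "
                        "one host",
            perpetrator="outsider",
            corrective_actions=["remote diagnostic port secured",
                                "technical contacts notified"])
        print(notice)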

          (2)  If the C-AISM determines that the incident is
significant at the Center level (i.e., that it could represent
significant loss, affect mission readiness, affect other Centers,
and/or attract public attention), the C-AISM must: 

               (a)  Immediately notify the NASA AIS Program
                    Manager and the appropriate IPO-AISM.

               (b)  Coordinate with all appropriate technical
                    points-of-contact who support the affected
                    constituencies (e.g., UNIX, VAX, MS-DOS,
                    Macintosh, NSI, PSCN, NASCOM, etc.).

               (c)  Immediately notify other appropriate Center
                    C-AISM's, who must coordinate with their
                    affected constituencies.

          (3)  The C-AISM, in consultation with the Center Security
Office, the Office of Inspector General, the DPI-AISO, and NSM's as
appropriate, must determine what type of support (e.g., technical,
Inspector General, local law enforcement, FBI, legal, physical or
personnel security, classification, and/or public relations) is
required.  Names and telephone numbers of persons contacted in
these organizations must be maintained and included in follow-on
reports.  Should a classification review determine that an incident
affects a classified environment (and therefore, is itself
classified), all communications between the DPI, Center, Center
Security Office, NASA Security Office, and NASA AIS Program Manager
must be through secure channels.  (See Chapter 7.)

          (4)  After all applicable information has been obtained,
a written follow-on report must be forwarded, through the same NASA
channels, to the NASA AIS Program Manager.  This follow-on report
must contain the information shown in Exhibit 3-8.

          (5)  A copy of these significant automated information
security incident reports must be retained by the C-AISM and
DPI-AISO.  The retention period for these records will be
determined by the C-AISM.  Factors to be considered in determining
this retention period include the need for availability of this
information during periodic security reviews, risk assessments, and
trend analysis activities.

          (6)  The NASA AIS Program Manager will serve as the main
point-of-coordination among NASA Center automated information
security management, HQ Program Office automated information
security management, NASA senior management, and external
organizations.

          (7)  The Center closest to the occurrence of a
significant automated information security incident must assume a
lead role in developing accurate reports of related facts and
coordinating public releases of information with the local Public
Affairs Office, the NASA AIS Program Manager, and the Office of
Public Affairs, Code P.

        (EXHIBIT 3-8.  Follow-On Incident Report Content
                         --See hardcopy)

     d.   Non-duty Hours Considerations.  Current listings must be
maintained of the designated NASA CSIR management officials to be
notified in emergency situations.  The listing must be complete
with after-hours phone numbers and designated alternates for each
official.  These procedures should be integrated, as appropriate,
with local procedures for after-hours incident response.  Emergency
situations after hours that require immediate HQ involvement should
be directed to the Goddard Space Flight Center (GSFC) Emergency
Console.  Calls will then be forwarded to the responsible
management official.

310  CSAT

     a.   Continuous CSAT.  Continuous Computer (and Network)
Security Awareness and Training (CSAT) is required at centers and
DPI's to sustain the effectiveness of the NASA AIS Program. 
Employees who understand their responsibilities, the need for
security, and what they must do to promote it are one of the best
safeguards against automated information security incidents. 
Therefore, training should be provided on an ongoing basis to
employees and contractors, as appropriate.  New employees should
receive awareness training during their initial orientation. 
Refresher training should be offered at least annually.  Additional
training will be required whenever there are major changes in a
computing environment or the protective measures baseline.

     b.   Multifaceted Approach.  NASA has a multifaceted approach
(e.g., top-down and bottom-up, internal and external sources, etc.)
to CSAT.  It is believed that an effective CSAT program must offer
more to personnel than just an hour of classroom training once a
year or a limited selection from NASA-sponsored training
activities.  The training should incorporate a variety of
instructional approaches and be appropriate for the target
audience.  To this end, NASA sponsors many internal security
conferences, seminars, workshops, and meetings which are considered
part of the overall NASA CSAT Program.  NASA encourages personnel
to seek both internal and external CSAT sources to meet specific
job-related needs.  Sources available include:

          (1)  Annual NASA-Specific Conference.  The NASA Security
Office sponsors an annual automated information security conference
that is specific to NASA computer/network environments.  This
conference provides a forum for and promotes interaction among
(NASA employee and NASA contractor) automated information security
representatives from NASA HQ, centers, and DPI's.  It also
facilitates the exchange of technical and management information
related to protecting computer systems and automated information
throughout the NASA community.

          (2)  Periodic C-AISM Working Group Sessions.  The NASA
Security Office sponsors at least two C-AISM Working Group Meetings
per year for the purpose of bringing all C-AISM's together to share
current information, solve common problems, and plan future NASA
AIS Program management strategies.

          (3)  NASA Electronic AIS Bulletin Boards.  The Security
Office sponsors NASA bulletin board services on NASAMAIL to share
information related to AIS.

          (4)  Periodic AIS Articles.  Articles on automated
information security are made available on NASAMAIL bulletin
boards.  Centers are encouraged to disseminate this information to
cognizant DPI's or extract information to enhance their own
periodic publications that are disseminated to raise security
awareness of current issues.

          (5)  Ongoing DPI AIS Training.  CSAT is required to
sustain the effectiveness of the NASA AIS Program.  Flexibility is
given to allow this training to be conducted in a manner that is
cost-effective and appropriate for a DPI.  Some options include:

               (a)  Formal Classroom Training.
               (b)  Self-Instruction Courses.
               (c)  Computer-Assisted Instruction (CAI).
               (d)  Movies (16 mm).
               (e)  Videotapes.
               (f)  Newsletters and Bulletins.

          (6)  Ongoing Training from External Sources.  NASA and
NASA contractor personnel are encouraged to evaluate their specific
CSAT needs and seek additional generic and specialized training
from external sources (e.g., OPM, GSA, Department of Agriculture
Graduate School, DOD Computer Institute, National Computer Security
Center, and commercial vendors).

          (7)  Significant Incident Reporting (Feedback Loop). 
Reports of significant incidents are used to alert other Centers
and DPI's to potential threats and other problems that could
adversely affect their operations.  Through the sharing of incident
information NASA management can be kept informed, national trends
can be determined, computer security awareness can be elevated, and
potential risk exposures minimized.

          (8)  Sharing of AIS Tools and Techniques.  The NASA AIS
Program has established a network of automated information security
contacts at all organizational levels.  Individuals at all levels
are encouraged to establish professional working relationships with
their counterparts for the purpose of improving communications and
sharing effective automated information security tools and
techniques.  Such relationships are vital to reduce burden, solve
common automated information security problems, and provide
effective response during significant incident situations.  The
NASA Security Office encourages all Centers to continually submit
effective management tools and technical techniques they have
developed for dissemination to other Centers for evaluation and
implementation consideration.

311  PROCUREMENT OF PRODUCTS AND SERVICES

     The Office of Procurement (HQ Code H) has primary
responsibility for developing policy and guidance related to NASA
procurements.  The following guidance has been developed by the
NASA Security Office from the NASA AIS Program perspective in
coordination with the Office of Procurement.    

     a.   Introduction.  Functional security requirements must be
developed by sponsors/owners to integrate appropriate security
protective measures into hardware, software, telecommunications, or
supporting contractor services.  Also, detailed technical security
specifications must be developed by designers.  These requirements
(e.g., risk assessment, technical hardware/software measures,
design reviews, system tests, certification prior to operational
use, personnel screening, CSAT) must be included in technical
specifications and solicitations/contracts.   

     b.   NASA Contracting Environment.  Due to the nature of NASA
operations, NASA has virtually every type of contractual situation
for the acquisition of computer/network products and related
support services.  Because of the diverse range of procurement and
contractual situations and the degree of management control that
may exist between NASA and any given contractor, a reasonable
approach must be taken.  Procurement and contractual situations
must be evaluated on a case-by-case basis in order to avoid
imposing unnecessary constraints on contractors that are not under
the direct management of NASA.   

     c.   Project Manager Responsibilities.  The DPI management
official (e.g., the Project Manager or the Contracting Officer's
Technical Representative (COTR)) is responsible for assuring that
appropriate functional security requirements, technical security
specifications, and methods for evaluating security adequacy are
included in solicitation documents.

     d.   Sponsoring Organization Responsibilities.  Functional
security requirements and technical security specifications shall
be developed and approved by sponsors of the acquisition and
reviewed by the designated DPI-AISO.  General guidance for some
types of functional security requirements is included in GSA's
FIRMR.  Other DPI-specific requirements will have to be developed
based on the protective techniques selected as the result of a risk
assessment and further guidance, which may be provided by NASA
procurement offices, GSA, and NIST.  To the extent feasible,
functional security requirements should be stated in functional
terms (i.e., "what" is needed) relative to security objectives. 
This will permit the DPI to benefit from new technology or an
innovative application of existing technology. 

     e.   Contracting Officer Responsibilities.  For the
procurement of computing resources or related support services,
contracting officers shall:

          (1)  Ensure that no action is taken on a request for
proposal or procurement for computing resources or related support
services unless appropriate functional security requirements and
specifications are included in accordance with established DPI
procedures.

          (2)  Ensure that NASA technical proposal instructions
include a statement requiring a detailed outline and demonstration
of the offeror's automated information security capabilities that
comply with the functional security requirements of the
solicitation and contract.

          (3)  Include a clause in solicitations and contracts
requiring the contractor to comply with the functional security
requirements set forth in applicable parts of NMI 2410.7 and this
Handbook.
   
     f.   Evaluating Security Capabilities.  Proposal evaluators
shall review the offeror's proposed approach and witness live test
demonstrations, as appropriate, to evaluate the adequacy of
protective measures and the capability of the offeror to meet the
functional security requirements and technical security
specifications contained in the solicitation and contract.
Exceptions to live test demonstrations will be considered in cases
where such demonstrations are determined to be cost prohibitive.
evaluators shall then certify, if appropriate, the adequacy of the
offeror's compliance.  This certification shall be obtained by the
contracting officer before proceeding with the procurement.

     g.   Contract Administration.  C-AISM's and DPI-AISO's shall
conduct, in coordination with their Contracting Officers, COTR's,
and Project Managers, periodic reviews of contracts in progress to
ensure continued compliance with functional security requirements. 
All instances of noncompliance shall be reported to the contracting
officer or designated representative.

     h.   Requirements for Contractor-Operated DPI's.  As indicated
in paragraph 102, the provisions of this Handbook apply to support
contractor organizations as provided by law and/or contract and as
implemented by the appropriate contracting officer.  The Center and
DPI management processes should assure that, in contracts for
equipment, software, the operation of DPI's, or related services:

          (1)  Appropriate functional security requirements and
specifications are included in procurement specifications and/or
statements of work.

          (2)  Functional security requirements and technical
security specifications are reasonably sufficient for the intended
application; that they comply with established DPI procedures; and
that protective provisions at the acquired DPI are adequate and
functioning properly prior to operational use.

          (3)  Resource-sharing service agreements provide for
compliance with applicable provisions of NMI 2410.7 and this
Handbook by responsible management officials at the acquired
processing DPI.



          CHAPTER 4.  AUTOMATED INFORMATION CATEGORIES
               AND SENSITIVITY/CRITICALITY LEVELS

400  INTRODUCTION

     There are two important concepts covered in this Chapter.  The
first concept is that there are reasonably definable "categories"
of information, each with its own unique management and security
concerns.  The second is that once automated information has been
categorized, it is necessary to determine a relative sensitivity
and/or criticality level for that information, so appropriate
protective measures can be considered and a protective measures
baseline established for supporting software, hardware, and
telecommunication systems.  The technical depth of a risk analysis,
type and frequency of security awareness training, and the
requirement for incident reporting are all examples of areas where
increasing sensitivity level should cause increased emphasis and
resource expenditures.

     a.   Information Categories.  Information categories are
simply logical groupings of information that are based on a legal
requirement, a policy requirement, or a management concern to treat
a category of information in a particular way.  An understanding of
these categories is the first step in determining the nature of the
sensitivity and/or criticality of automated information and the
types of protective measures that may be appropriate when the
information is processed by a computer system or transmitted over
a telecommunications network.  In order to assist application
sponsors and information owners with the sometimes subjective task
of identifying the nature of sensitive automated information and
identifying automated information that supports mission-critical
functions, NASA has developed a method for categorizing automated
information (as illustrated in Exhibit 4-1).  All NASA automated
information falls into one or more of these categories.

     b.   Sensitivity/Criticality Levels.  NASA has established
four hierarchical "levels" of sensitivity/criticality to assist
application sponsors, information owners, system designers, and
system developers (see Exhibit 4-2).  All NASA automated
information falls into one of these four sensitivity/criticality
levels, in which each level has a generic set of protective measure
considerations (as illustrated in Exhibit 4-3).  The following
paragraphs describe the 13 automated information categories, the
four sensitivity/criticality levels, and the protective measure
considerations for each sensitivity or criticality level.

      (EXHIBIT 4-1.  NASA Automated Information Categories
                         --See hardcopy)

     (EXHIBIT 4-2.  NASA Unclassified Automated Information
          Sensitivity/Criticality Levels--See hardcopy)

401  INFORMATION CATEGORIES

     a.   Statutes.  As shown in Exhibit 4-1, NASA has defined 13
categories of information to facilitate managing automated
information.  The predominant statutory bases for these categories
are:

          (1)  Federal Managers Financial Integrity Act of 1982
               (Public Law 97-255; 31 U.S.C. 66a).

          (2)  Paperwork Reduction Act of 1980 (Public Law 96-511;
               44 U.S.C. 3501).

          (3)  Freedom of Information Act of 1974 (Public Law
               93-502; 5 U.S.C. 552).

          (4)  Privacy Act of 1974 (Public Law 93-579; 5 U.S.C.
               552a).

     b.   Derivations.  Categories 1-5 are derived from the
statutes indicated above in subparagraph a and apply to all Federal
agencies.  Category 6 is derived from guidance in National Security
Decision Directives.  Categories 7-12 are derived from assorted
Federal directives.  Category 13 is derived from Presidential
Executive Order (EO) 12356.  The categories are defined in the
following paragraphs.

          (1)  Information About Persons (Category 1).  This
category includes information related to personnel, medical, and
similar information.  All information covered by the Privacy Act of
1974 falls into this category.

          (2)  Financial, Commercial, and Trade Secret Information
(Category 2).  Category 2 includes information from applications
such as financial, procurement, inventory, and decision-making.  It
also includes commercial information received in confidence, trade
secrets, and proprietary information.

          (3)  NASA Internal Operations (Category 3).  This includes
information related to the internal operations of NASA.  This
category includes certain personnel rules, bargaining positions,
advance information concerning procurement actions, etc.

          (4)  Investigatory, Intelligence-Related, and Security 
Information (Category 4).  This category includes information
related to police intelligence and/or law enforcement
investigations or informants.  It also includes some automated
information security information (such as detailed security plans
for the protection of systems and specific automation
vulnerabilities).  Note that this category does not include general
plans, policies, or requirements, nor does it include Federal or
national security intelligence information.

        (EXHIBIT 4-3.  Protective Measure Considerations 
                  (Parts 1 & 2)--See hardcopy)

          (5)  Other Federal Agency Information (Category 5). 
Other Federal agency information includes information whose
gathering and/or maintenance is required by statute or another
Federal agency.  Information in this category is not the primary
responsibility of NASA.  For example, this category would include
DOD or Department of Energy (DOE) information processed in NASA
computers for DOD or DOE, respectively.

          (6)  Unclassified National Security-Related Information
(Category 6).  This category includes national defense and Federal
intelligence-related information subject to the policy, procedural,
and protective requirements established by the National
Telecommunications and Information Systems Security Committee
(NTISSC).  This information may require protection in addition to
that required under NASA guidance.  This information is not
classifiable under EO 12356, but requires protection in accordance
with NTISSC policy.

          (7)  National Resource System Information (Category 7). 
This is information related to the protection of a national
resource (such as the Space Shuttle or the Space Station Freedom).

          (8)  Mission-Critical Information (Category 8). This is
information that has been designated as critical to the NASA
mission.

          (9)  Operational Information (Category 9).  This is
information that requires protection during operations.  It is
usually time-critical information.

          (10) Life Critical Information (Category 10).  This is
information critical to life support systems (i.e., information
whose inaccuracy, loss or alteration reasonably could be expected
to result in loss of life).

          (11) High or New Technology Information (Category 11). 
This is information relating to high or new technology that is
prohibited from disclosure to certain foreign governments or that
may require an export license from the Department of State and/or
the Department of Commerce.

          (12) Other Unclassified Information (Category 12).  This
is any information that does not logically fall into any of the
preceding 11 categories and that is not classified for national
security purposes.  Use of this category should be very rare.

          (13) Classified National Security-Related Information
(Category 13).  This is information classified for national
security purposes (i.e., under EO 12356).  All automated
information security actions related to this category are covered
in Chapter 7 of this Handbook.

402  SENSITIVITY/CRITICALITY LEVELS

     a.   Introduction.  The sensitivity levels defined in Exhibit
4-2 are based on the amount of harm or loss that could be
experienced from an adverse event that affects the availability,
integrity, or confidentiality of NASA computing or information
resources.  A hypothetical relationship between the automated
information categories and sensitivity/criticality levels is
presented in Exhibit 4-4 for general guidance only.  Detailed
analyses should be conducted by information sponsors and owners, on
a case-by-case basis, and should be reviewed by a DPI-AISO before
making any final sensitivity and/or criticality determinations. 
The sensitivity/criticality of automated information should also be
periodically re-evaluated, as the influencing factors change.

     b.   Automated Information and Applications.  The sensitivity
and/or criticality of automated information is determined by
applicable sponsors and information owners.  A sensitivity and/or
criticality level should also be assigned to each automated
application, based on the sensitivity and/or criticality of the
automated information the application will process.  The
sensitivity/criticality level of an automated application is at
least as high as the most sensitive/critical automated information
that will be processed by that application.  The internal formulas
or the information editing criteria in the application source code
could raise the sensitivity/criticality level of the application
even higher than the information it handles.

     c.   Computer Systems.  DPI-AISO's should assign each NASA
computer a sensitivity/criticality level, based on the
sensitivity/criticality level of the applications processed on each
computer system.  Each computer system should have a
sensitivity/criticality level that is at least as high as that of
the most sensitive/critical application processed.  However,
significant replacement costs, unusually large numbers of
applications supported, and/or an unusually large volume of
information processed can raise the sensitivity/criticality level
of a computer system even higher than the sensitivity/criticality
level of the most sensitive/critical application processed on that
system.

403  PROTECTIVE MEASURE BASELINE CONSIDERATIONS
 
     Protective measure considerations for each sensitivity and/or
criticality level are presented as general guidance in the
following paragraphs.  The selection or omission of protective
measures should be justified and based on the results of a risk 
assessment.  DPI's may elect to increase their protection, add
protective measures, or establish a mandatory minimum protective
measures baseline, as they deem appropriate.  (See
Exhibit 4-3.  Also, see paragraph 304.)

   (EXHIBIT 4-4.  Categories and Sensitivities--See hardcopy)

     a.   Sensitivity/Criticality Level 0.   Sensitivity level 0
systems should provide for adequate protection of information
through the following protective measures:

          (1)  Access Protection.  Whenever a single computer
system is used by more than one person (whether or not that use is
concurrent), physical, procedural, and technical protective
measures should be provided that allow for identification and
authentication of individual users and that prevent access by
unauthorized persons.
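
          For illustration only, the following Python sketch shows
the concept of identifying and authenticating individual users
before granting access.  The names and parameters are hypothetical,
not NASA-prescribed; operational systems should rely on the
platform's own access control facilities.

# Illustrative sketch of user identification and authentication.
import hashlib
import hmac
import os

_users = {}  # userid -> (salt, derived key)

def register(userid, password):
    salt = os.urandom(16)
    key = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100000)
    _users[userid] = (salt, key)

def authenticate(userid, password):
    """Grant access only to a known userid with the correct password."""
    if userid not in _users:        # unauthorized persons are refused
        return False
    salt, key = _users[userid]
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt,
                                  100000)
    return hmac.compare_digest(attempt, key)

register("jsmith", "correct horse battery staple")
assert authenticate("jsmith", "correct horse battery staple")
assert not authenticate("intruder", "guess")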

          (2)  Configuration Management.  All files should be
cataloged and there should be licenses for all software used.  All
licensed software should be registered as requested or required by
the author/vendor prior to operational use.  Furthermore, software
should be tested and determined to be reasonably safe (free of
malicious design/code) for use in its intended environment. 

          (3)  Back-up Copies of Software.  At least one generation
of backup software should be maintained.  Backups of changed
information files should also be maintained.

          (4)  Physical Access.  Physical security (such as door
locks and cable locks) should be required when the automated
information resources are unattended.

          (5)  Personnel Security.  All users of computer systems
should be trained in the automated applications they use, proper
software handling procedures, and basic automated information
security practices.

          (6)  Environmental Measures.  Proper environmental
measures (to minimize the impacts of dust, water, temperature,
humidity, and ventilation) should be required.  Also, power surge
protection should be required for hardware.

          (7)  Storage Media.   Proper storage bins or containers
should be required for information storage media (e.g., disks and
tapes).

          (8)  Communications.  The communications links connecting
a computer system to other systems, networks, workstations, or
terminals must be approved by responsible management prior to the
implementation of the connection.

     b.   Sensitivity/Criticality Level 1.  Sensitivity level 1
systems should provide all Sensitivity level 0 protective measures
as well as:
 
          (1)  Access Protection.  Physical, procedural, or
technical protective measures should be provided that allow
physical and/or logical management of authorization and access to
the system and processing resources.
 
          (2)  Configuration Management.  A configuration
management process should be developed and maintained that monitors
changes to any security-related and sensitive software, hardware,
or procedure for the system.
 
          (3)  Back-Up Copies of Software.  At least two
generations of back-ups should be maintained, with the oldest
generation being stored at a location other than the immediate
vicinity of the system.
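
          For illustration only, the Python sketch below (with
hypothetical paths) captures this rotation: the newest back-up
generation is kept locally, and the prior generation is moved away
from the immediate vicinity of the system.

# Illustrative two-generation back-up rotation; paths are
# hypothetical and the copy mechanism is deliberately simple.
import shutil
from pathlib import Path

LOCAL = Path("/backup/local")      # immediate vicinity of the system
OFFSITE = Path("/backup/offsite")  # a separate storage location

def rotate_backup(source: Path):
    """Keep the newest generation locally and the older one off-site."""
    LOCAL.mkdir(parents=True, exist_ok=True)
    OFFSITE.mkdir(parents=True, exist_ok=True)
    prior = LOCAL / source.name
    if prior.exists():
        older = OFFSITE / source.name
        if older.exists():
            older.unlink()                   # drop the oldest generation
        shutil.move(str(prior), str(older))  # demote the prior generation
    shutil.copy2(source, prior)              # newest generation stays local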

          (4)  Physical Access.  Systems should be physically
protected to prevent unauthorized access, theft, or destruction. 
Physical key locks should be used on microcomputer fixed/hard
disks.  Separate physical locks should also be used to prevent
hardware theft.

          (5)  Network Access.  Passwords should be required for
access to or from any network.  Use of software that provides error
checking and some error correction capability should be required
when performing file transfers using networks.
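
          A minimal sketch of the error-checking idea follows,
assuming only that a checksum can be sent alongside the file; the
receiver recomputes it and requests retransmission on a mismatch.

# Illustrative error check for a network file transfer: compare a
# checksum computed before sending with one computed after receipt.
import hashlib

def checksum(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def transfer_ok(received_path, digest_sent_with_file):
    """True when the received file matches the sender's checksum."""
    return checksum(received_path) == digest_sent_with_file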

          (6)  Contingency and Disaster Recovery Plans. 
Contingency Plans for applications and Disaster Recovery Plans for
computer installations should be developed to provide for minimal
interruptions and reasonable continuity of services.  These plans
should be developed in accordance with Paragraph 308.
 
     c.   Sensitivity/Criticality Level 2.  Sensitivity level 2
systems should implement the protective measures required for
sensitivity levels 0 and 1, in addition to the following:
 
          (1)  Access Protective Measures.  Physical, procedural,
and technical protective measures should be provided that allow
for:

               (a)  Restriction of the functional capabilities of
                    individual users.

               (b)  Ability for individual users to manage access
                    (e.g., read, modify, or delete) by other
                    individual users to their information and
                    applications (see the sketch after this list).

               (c)  Consideration of encryption of stored data.
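
          The sketch below illustrates item (b) with a simple
in-memory access table; the users, files, and permissions shown are
hypothetical.

# Illustrative sketch: an owner grants or revokes read/modify/delete
# access to a file for other individual users.
_acl = {}  # (owner, filename) -> {userid: set of permissions}

def grant(owner, filename, userid, *perms):
    entry = _acl.setdefault((owner, filename), {})
    entry.setdefault(userid, set()).update(perms)

def revoke(owner, filename, userid, *perms):
    entry = _acl.get((owner, filename), {})
    entry.get(userid, set()).difference_update(perms)

def allowed(owner, filename, userid, perm):
    return perm in _acl.get((owner, filename), {}).get(userid, set())

grant("jones", "results.dat", "smith", "read")
assert allowed("jones", "results.dat", "smith", "read")
assert not allowed("jones", "results.dat", "smith", "modify")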

          (2)  Audit Trails.  The system should provide for the
generation of journals, or audit logs, of accesses to the system
and to information and applications at the individual user level. 
Access to journals and audit logs should be restricted to a
well-defined group of users authorized by DPI management.
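
          As a sketch only, such a journal can be as simple as an
append-only record of who accessed what and when; the field layout
and file name below are assumptions.

# Illustrative append-only audit journal of accesses at the
# individual user level.  Access to the journal itself should be
# restricted to the group authorized by DPI management.
from datetime import datetime, timezone

AUDIT_LOG = "audit.log"            # hypothetical journal location

def journal(userid, resource, action, success):
    stamp = datetime.now(timezone.utc).isoformat()
    outcome = "OK" if success else "DENIED"
    with open(AUDIT_LOG, "a") as log:
        log.write(f"{stamp}\t{userid}\t{resource}\t{action}\t{outcome}\n")

journal("jsmith", "payroll.db", "read", True)
journal("intruder", "payroll.db", "modify", False)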
 
          (3)  Communications.  All communication paths for the
system should be described, and a well-defined path should exist
for the initial user identification and authentication processes. 
Encryption of data to be transmitted should be evaluated by
sponsors/owners.

          (4)  Network Access.  Written consent, identifying other
network nodes authorized to access the system node, should be
obtained from responsible DPI management prior to enabling any
network connection or interconnection.
 
          (5)  Logoff/Time Out Features.  The system should log off
work stations that have not been in communication with the CPU for
a period of time determined by DPI management.
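
          One way to realize such a time-out is sketched below; the
15-minute limit is an assumed value standing in for the period set
by DPI management.

# Illustrative logoff/time-out sweep.
import time

TIMEOUT_SECONDS = 15 * 60      # assumed; actual period set by DPI
_last_activity = {}            # session id -> time of last interaction

def touch(session_id):
    """Record that the work station communicated with the CPU."""
    _last_activity[session_id] = time.monotonic()

def sweep_idle_sessions(logoff):
    """Call periodically; logs off sessions idle past the limit."""
    now = time.monotonic()
    for session, last in list(_last_activity.items()):
        if now - last > TIMEOUT_SECONDS:
            logoff(session)
            del _last_activity[session]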
 
          (6)  Data Base Management Systems (DBMS's).  DBMS's
should provide for the integrity, confidentiality, and availability
of all information resident in the data base.  Individuals
responsible for data base administration are responsible for
ensuring that information owners are informed of and concur in all
defined access privileges to and uses of their information.

     d.   Sensitivity/Criticality Level 3.   Sensitivity level 3
systems must implement the minimum protective measures required for
sensitivity levels 0, 1, and 2, in addition to the following:
 
          (1)  Access Protective Measures.  Access protective
measures must be provided that can at all times restrict and log
individual user access by system resource, application, and
information files. Authorization to access system resources,
applications, and information files must be confirmed by the
information owners and shall be reconfirmed periodically but at
least every 6 months.

          (2)  Information and Application Labels.  Sensitivity
level indicators must be associated with computing resources,
applications, and information while Level 3 sensitive information
is being processed.

          (3)  Configuration Management.  Protective measures must
be in place to allow appropriate data bases to be stored off-line.

          (4)  Communications.  There shall be no uncontrolled
dial-up access or unauthorized connections to external networks. 
Management decisions not to use data encryption should be
justified.  Threats and risks associated with connections to
wide-area and/or internationally linked networks shall be evaluated
by sponsors/owners, and the resulting management decisions should
be justified.



                    CHAPTER 5.  AIS PLANNING

500  INTRODUCTION

     This Chapter presents details of the required automated
information security planning activities.  Specifically, it covers
the NASA AIS Program Plan, HQ Program Office Automated Information
Security Plans (PO-AISPs), HQ Institutional Program Office
Automated Information Security Plans (IPO-AISPs), Center Automated
Information Security Plans (C-AISPs), DPI AIS Plans (DPI-AISPs),
and Contingency Plans.  The Computer Security Act of 1987 requires
AIS planning at the system level.  This chapter prescribes AIS
plans in addition to those required by the Act.  A discussion of
how NASA complies with external requests for planning information
is found in paragraph 505.

501  HEADQUARTERS AIS PLANNING

     a.   NASA AIS Program Plan.

          (1)  Scope.  The NASA AIS Program Plan is Agencywide in
scope.  It does not provide detailed planning information for any
particular PO, IPO, Center, computer system, or automated
application.  

          (2)  Purpose.  The purpose of the NASA AIS Program Plan
is to document the overall NASA AIS Program goal, objectives,
directions, and strategies for the NASA AIS Program Manager.  

          (3)  Manager.  The NASA AIS Program Manager is
responsible for developing, implementing, monitoring, and
evaluating this plan.

          (4)  Publication/Distribution.  The NASA AIS Program Plan
shall be revised annually and integrated into the Agency 5-year IRM
Plan.

          (5)  Period Covered.  The NASA AIS Program Plan addresses
automated information security accomplishments for the past fiscal
year, AIS activities for the current budget year, and planned
strategies for the next budget year.  Thus, it covers a total of 3
years.

          (6)  Relationship with Other Plans.

               (a)  The NASA AIS Program Plan provides a basis for
responding to requests for NASA AIS planning information from
national level organizations such as OMB and GSA.  PO/IPO-AISP's,
C-AISP's, and DPI-AISP's must operate within the NASA AIS Program
goal and objectives, as defined in the NASA AIS Program Plan.

               (b)  Major security activities reported in the PO,
IPO, and Center automated information security plans will be
reflected in the NASA AIS Program Plan.

          (7)  Content and Format.  The content and format for the
NASA AIS Program Plan is shown in Exhibit 5-1.

     b.  Headquarters Program Office/Institutional Program Office
AIS Plans (PO/IPO-AISP's).

          (1)   Scope.  A PO-AISP covers a major Agency-wide
program activity which uses automated computer/network systems
and/or computer applications at one or more NASA Centers.  An
IPO-AISP covers a specific grouping of NASA Centers for which a
designated IPO has institutional management responsibilities. 
Where organizationally feasible, PO AIS planning requirements may
be integrated into and covered by other appropriate IPO-AISP's.

          (2)  Purpose.  The purpose of the PO-AISP is to summarize
overall direction and status of strategies/objectives,
accomplishments, and on-going activities for projects managed by
the PO.  The purpose of the IPO-AISP is to summarize overall
direction and status of strategies/objectives, accomplishments, and
on-going activities for a particular IPO AIS program.  Information
contained in the PO/IPO-AISP's may address Centers, DPI's,
automated computer/network systems, and/or computer applications
under the management cognizance of a PO/IPO.  The intent is not to
repeat details already documented in Center level AIS planning
processes, but to integrate elements of these plans into one
consolidated document enabling PO/IPO-AISM's to plan for major
PO/IPO AIS activities, resolve inter-PO/IPO conflicts and
inconsistencies, and provide overall AIS management oversight
and coordination from a PO/IPO perspective.

          (3)  Manager.  PO/IPO-AISM's are responsible for
coordinating with other appropriate PO-AISM's, IPO-AISM's, and
C-AISM's to develop/integrate, monitor, evaluate, and maintain
their PO/IPO-AISP's in an accurate and consistent manner.

          (4)  Publication/Distribution.  PO/IPO-AISP's shall be
updated in an ongoing manner as significant changes occur.  An
updated copy shall be forwarded annually to the NASA AIS Program
Manager by December 1; however, the primary function of these
consolidated AIS plans is for PO/IPO management use.  In cases
where PO AIS planning requirements are covered by other IPO-AISP's,
responsible PO-AISM's shall conduct reviews at least annually of
automated system resources under their management cognizance and
provide a letter of assurance (i.e., assurance of adequate
coverage) to the NASA AIS Program Manager by December 1.

          (5)  Period Covered.  PO/IPO-AISP's shall cover a 3-year
period (prior year through current budget year plus one).

          (6)  Relationship with Other Plans.  PO/IPO-AISP's
provide input for the NASA AIS Program Plan and may cross-reference
more detailed information found in other PO/IPO, Center, or DPI AIS
plans.  To the extent feasible, AIS planning activities should be
closely coordinated with and integrated into the more traditional
automated systems planning activities at the Center, DPI, and
system levels.

          (7)  Content and Format.  The level of detail in a
PO/IPO-AISP is determined by the complexity and scope of a
PO's/IPO's automated information resources.  Content and format
should be consistent with Center level AIS planning processes (see
Exhibit 5-2).

502  CENTER AIS PLANNING

     a.   Center AIS Plan (C-AISP).

          (1)  Scope.  C-AISP's are Centerwide in scope.  An
automated information security plan is required for each NASA
Center.

          (2)  Purpose.  The purpose of a C-AISP is to summarize
the overall status and direction of AIS strategies/objectives,
accomplishments, and ongoing activities throughout a Center and at
DPI's under the management cognizance of the Center.  The intent is
not to repeat details already documented in each DPI-AISP, but to
integrate these plans into one consolidated document enabling
C-AISM's to plan for major automated information security
activities, to resolve inter-DPI conflicts and inconsistencies, and
to provide overall automated information security oversight and
coordination at the Center level.

          (3)  Manager.  The C-AISM is responsible for the
development, monitoring, evaluation, and maintenance of an
automated information security plan covering all automated
information resources under the management or oversight of his or
her Center.

          (4)  Publication/Distribution.  The C-AISP shall be
updated in an ongoing manner as significant changes occur,
published annually by October 1, and a copy sent to IPO-AISM's by
October 15; however, it is primarily for Center management use.

      (EXHIBIT 5-2.  CENTER AUTOMATED INFORMATION SECURITY 
                       PLAN--See hardcopy)

          (5)  Period Covered.  Automated information security
plans for NASA Centers shall comprehensively cover automated
information security activities over a 3-year period (prior year
through current budget year plus one).

          (6)  Relationship with Other Plans.  Center emergency
response plans should reflect the security activities in the
C-AISP.  The C-AISP should incorporate, in summary form,
information from all DPI-AISP's under the management oversight of
the Center.  In case of differences between a PO/IPO-AISP and a
C-AISP, the C-AISP shall take precedence.  If a PO/IPO-AISP
requires more stringent security than provided by the C-AISP, the
sponsoring PO/IPO shall be responsible for providing additional
funding to meet such requirements.  However, PO/IPO-AISP's must at
least meet minimum baseline security requirements of a C-AISP.

     b.   Content and Format.  The level of detail in the C-AISP is
determined by the complexity and scope of the Center's automated
information resources.  A sample format is shown in Exhibit 5-2.

          (1)  Introduction and Overview.  The "Introduction and
Overview" paragraph should define the Center AIS Program
environmental context.  This might include:

               (a)  Identifying those constraints that may impact
                    the implementation of the overall Center AIS
                    Program.

               (b)  Describing the management structures,
                    relationships, and personnel (DPI-AISO's,
                    information owners and users, application
                    sponsors, working groups, committees, etc.)
                    that have responsibilities for the
                    implementation and maintenance of the Center
                    AIS Program.

               (c)  Describing the management processes
                    established to ensure that DPI activities are
                    carried out in a timely and complete fashion.

          (2)  Center AIS Program Goal and Objectives.  The C-AISP
should have a paragraph clearly defining the Center AIS Program
goal and objectives.  Goals and objectives should include avoiding
any identified material weakness conditions.

          (3)  Significant AIS Activities.

               (a)  The "Significant AIS Activities" paragraph
should describe all significant automated information security
activities over a 3-year period.  Each activity should be related
to the security goals and objectives defined earlier.  For each
year in the 3-year period, Centers could cover specific activities
with actual or anticipated completion dates.

               (b)  These activities should be consistent with
those described in applicable DPI-AISP's.  However, they would
appear here in summary form only.  Specific activities might
include, but are not limited to:

                    (1)  Risk Management.
                    (2)  Application Certification/Accreditation.
                    (3)  Self Assessments and Compliance Reviews.
                    (4)  AIS Training.
                    (5)  Personnel Screening.
                    (6)  Network Security.
                    (7)  Contractor Security.
                    (8)  AIS Working Groups/Committees.
                    (9)  Reporting Problems.
                    (10) Sensitivity/Criticality Identification.
                    (11) Security Testing.
                    (12) AIS Planning/Budgeting. 
                    (13) Functional Automated Information
                         Requirements/ADP Acquisition.
                    (14) Computer/Network System Monitoring.
                    (15) Contingency Plan Testing.
                    (16) Incident Response, Coordination, and
                         Resolutions. 
                    (17) Audit Coordination and Resolution.
                    (18) AIS/Internal Controls Coordination.
                    (19) Research/Development of Tools/Techniques.
                    (20) Revised Policy, Procedures, and Guidance.

               (c)  Where possible, Centers should identify the
financial, personnel (Federal and contractor), and training
resources needed (i.e., current needs and future projections) to
adequately support automated information security activities.  When
funding is integral to the development, operation, or maintenance of
automated systems or computer applications, Centers should estimate
the percentage of that funding that is (or will be) directed toward
providing adequate automated information security controls.

503  DPI AIS PLANNING

     a.   Purpose.  The purpose of the DPI-AISP is to provide a
document that serves as the management summary of more detailed
information that may be associated with the basic elements of the
DPI's AIS Program.  It should serve as a basis for informing
management of security posture and needs, performing security
self-assessments, performing management and compliance reviews, and
facilitating the extraction of summary information in response to
Center, HQ, or other Federal agency requests for AIS planning
information.  The extent to which this planning activity is
integrated into the Center planning activities is left to the
discretion of each C-AISM; however, a DPI must comply with the
Center's AISP.

     b.   Content.  The DPI-AISP must be kept current and should
include elements that are relative to the coverage of the plan and
to the computing environment of the DPI, as follows (see Exhibit
5-3):

                   (EXHIBIT 5-3--See hardcopy)

          (1)  Summary of the management process describing the
general administrative, technical, physical, and personnel
protective measures employed at the DPI.  If special provisions
apply to selected computer systems or applications, this
information should be included.

          (2)  Reference to list(s) that uniquely identify computer
applications that process sensitive or mission-critical
information, the sponsors and/or owners of such applications, and
the computer systems that provide processing support.

          (3)  Reference to contingency and disaster recovery
plans.

          (4)  Reference to schedules indicating planned and
completed risk assessments, certifications/recertifications,
compliance reviews, and CSAT sessions.  Schedules should, at a
minimum, indicate the fiscal year planned for completing such
tasks.

          (5)  Reference to documents containing the results of the
latest self-assessment, compliance reviews, risk assessments,
security design reviews, certifications/recertifications, system
tests, and followup actions on previous recommendations from these
review activities.

          (6)  Reference to a plan for continually providing CSAT
to personnel who manage, design, develop, operate, maintain, or use
automated systems.  Plans for off-site users may be less specific
and describe approaches for disseminating security awareness and
training information (e.g., online tutorials and security
bulletins).

          (7)  Identification of software tools used to enhance
security.

          (8)  Reference to the procedures for identifying
automated information security incidents and reporting significant
incidents.

          (9)  Reference to lists of key personnel and how they can
be contacted during emergencies.  Key personnel may include but are
not limited to:

               (a)  The DPI-AISO.
               (b)  Assistant DPI-AISO's.
               (c)  DPI-AISC's.
               (d)  Automated information security incident
                    response personnel.

                    (1)  DPI management.
                    (2)  Operations.
                    (3)  Technical support.
                    (4)  Information sponsors/owners/users.
                    (5)  Local NASA public relations officer.

               (e)  Technical help desk facility.
               (f)  Local IG criminal investigator.
               (g)  Physical emergency response personnel.
     
                    (1)  Building maintenance.
                    (2)  Building protective services.
                    (3)  Fire department.
                    (4)  Local guard force.
                    (5)  Local law enforcement.

504  DPI CONTINGENCY AND DISASTER RECOVERY PLANS

     Contingency and Disaster Recovery Plans are covered in
paragraph 308.

505  EXTERNAL REQUESTS FOR REPORTS ON AIS PLANNING ACTIVITY

     a.   NASA is subject to ongoing requests for reports of
automated information security planning activity from such external
agencies as OMB, NIST, NSA, GSA, and GAO.  To reduce management
burden and administrative paperwork, detailed documentation
(related to the basic elements of automated information security
planning) should be maintained at the lowest organizational levels. 
This information should be stored in ways that make it easily
located, extracted, analyzed, and formatted.

     b.   Exhibit 5-4 illustrates the relationship of all NASA
automated information security planning activities.  The flow of
planning information (from general requirements to specific
security information) provides a master directory and
cross-reference system for locating detailed documentation on any
specific aspect of the agencywide AISP.  This systematic approach
eliminates the need to retain multiple copies of documents at DPI,
Center, and Headquarters levels.  However, this structure requires
that:

          (1)  The NASA AIS Program Plan reference more detailed
information contained in HQ Program Office and Center automated
information security plans;

          (2)  The Program Office and Center plans reference more
detailed information contained in DPI-AISP's; and

    (Exhibit 5-4.  Relationship of NASA Automated Information
           Security Planning Activities--See hardcopy)

          (3)  The DPI-AISP references more detailed information in
automated information system level documentation.

     c.   An example of an external request is documented in OMB
Bulletin 90-08, which required computer systems that processed
sensitive information to be identified by drawing logical
boundaries around major application support systems and/or general
hardware support systems, based on the similarities among
functional security requirements and options.  NASA is able to
comply with such requests using existing organizational structures
(i.e., HQ Program Offices, Centers, and DPI's) and consolidating
management summary information contained in existing documentation
(at all levels).



      CHAPTER 6.  SPECIAL CONSIDERATIONS FOR MICROCOMPUTERS

600  INTRODUCTION

     a.   Security Principles.  All automated information security
policies, standards, responsibilities, guidelines, principles, and
techniques covered in Chapters 1 through 5 apply to microcomputers. 
However, the application of those policies, standards,
responsibilities, guidelines, principles, and techniques may need
to be "scaled down" and can vary dramatically.

     b.   Security Implications.  Microcomputer characteristics
result in security implications not normally found on mainframes.

          (1)  The increased use of microcomputers on networks has
exposed them to external threats.

          (2)  Microcomputer operating systems have few, if any,
security features designed into them.

          (3)  Most users do not understand the protective measures
available for microcomputers.

          (4)  Microcomputers normally operate in office areas and
are, therefore, accessible to most employees.

          (5)  Microcomputers are easy to move and therefore can be
easily stolen.  In addition, moving a microcomputer can damage it
internally.

          (6)  Most users do not understand that protective
measures are necessary to safeguard the valuable data on a
microcomputer, not just the computer hardware itself.

602  SPECIAL PROTECTIVE MEASURES FOR MICROCOMPUTERS

     The following paragraphs identify security implications of
microcomputers that require special protective measures.

     a.   Technical Protective Measures.  The following technical
protective measures for microcomputers are available as software or
hardware add-ons (a sketch of one such measure follows the list):

          (1)  Sensitive information encryption.
          (2)  Computer access control using passwords.
          (3)  Erased disk area overwrite.
          (4)  Disk backup.
          (5)  Deleted file retrieval.
          (6)  Authorized program execution control.
          (7)  Virus scanning/detection software.
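
     As an illustration of measure (3), the sketch below zero-fills
a file before deleting it.  A single pass is shown; actual
overwrite requirements, and their effectiveness on a given storage
device, vary.

# Illustrative single-pass overwrite of a file's disk area before
# deletion.  This is a concept sketch, not an approved sanitization
# procedure.
import os

def overwrite_and_delete(path):
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(b"\x00" * size)     # overwrite the allocated area
        f.flush()
        os.fsync(f.fileno())        # push the overwrite to the disk
    os.remove(path)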

     b.   Administrative Protective Measures.  The following
administrative protective measures are available for
microcomputers:

          (1)  Virus-free microcomputer input verification.
          (2)  Licenses for all software purchased for use.

     c.   Physical Protective Measures.  The following physical
protective measures are available for microcomputers:

          (1)  Keyboard lock.
          (2)  Computer tie-down device.
          (3)  Diskette storage cabinet.
          (4)  Removable hard disk.
          (5)  Lockable office or enclosure.

     d.   Personnel Protective Measures.  Personnel security for
microcomputers must include:

          (1)  User Security Awareness and Training.
          (2)  Employee Screening and Monitoring.

603  AIS SOFTWARE MANAGEMENT ISSUES

     a.   Imported Software.  When freeware, public domain
software, shareware, or other imported software is used, it might
contain code that initiates unauthorized actions such as destroying
data.  As noted in paragraph 403a(2), procedures must be
implemented that provide reasonable assurance that software does
what it is intended to do, and nothing more.  The threat from
imported software is particularly significant with microcomputers
and networks, because they so frequently use imported software. 
Evaluations of risk in specific situations should consider the
software pedigree (who wrote the software and whether the author is
also a user of it), the complexity of the code, the criticality
of the application, and the potential for causing harm to other
users on a network or through movement of files on removable media
such as diskettes.
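
     For illustration, the factors above can be combined into a
rough screening score, as in the hypothetical sketch below; the
weights are illustrative only, not NASA policy.

# Hypothetical screening score for imported-software risk; the
# weights are illustrative and should come from a real assessment.
def imported_software_risk(pedigree_known, author_is_user,
                           complex_code, critical_application,
                           network_connected):
    score = 0
    score += 0 if pedigree_known else 2        # unknown pedigree
    score += 0 if author_is_user else 1        # author does not use it
    score += 2 if complex_code else 0          # hard-to-review code
    score += 3 if critical_application else 0  # critical application
    score += 2 if network_connected else 0     # can harm other users
    return score  # higher scores warrant more testing before use

print(imported_software_risk(False, False, True, True, True))  # 10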

     b.   Centers of Excellence.  NASA supports development of
Centers of Excellence that allow specific Field Installations to
take leadership roles in searching for the best methods of reducing
the security risks from imported software in microcomputers and
networks.



      CHAPTER 7.  PROCESSING NATIONAL SECURITY INFORMATION

700  INTRODUCTION

     a.   Background.  "National security information" is
information that could reasonably be expected to cause damage to
national security upon unauthorized disclosure.  This "classified"
information falls into one of three classification levels:
CONFIDENTIAL, SECRET, or TOP SECRET.  Executive Order (EO) 12356
tasks the National Security Council with overall policy direction
for classified information.  The Secretary of Defense is the Executive
Agent and the Director, National Security Agency (NSA), is the
National Manager for the handling of classified information in
computers.  The authority to declare information classified has
been granted to selected Federal agencies, including NASA.

     b.   Scope.

          (1)  The policies and procedures for handling classified
information outside of computers and networks are in the "NASA
Security Handbook."

          (2)  When classified information is entered into
computers or networks, it must be afforded the special security
considerations in this Chapter.

          (3)  Although the requirements in this Chapter are
generally consistent with those of other agencies, computers or
networks that contain information classified by another Federal
agency (such as the Department of Defense, the Department of
Energy, or the Department of State) are bound by the policies and
procedures specified by that agency.  

     c.   Preceding Chapters.  The preceding six chapters of this
Handbook apply to all automated systems, whether or not they
process classified information.  This chapter supplements the
preceding chapters by providing those additional policies unique to
computers that process classified information.

     d.   References.  The National Security Council has issued
many National Security Decisions (such as NSD 42) and National
Security Decision Directives.  In addition, the National Security
Telecommunications and Information Systems Security Committee
(NSTISSC) has issued various national security policies (such as
NTISSP 200).  The "NASA Security Handbook" covers the manual
handling of classified information in NASA.  The provisions in this
chapter are consistent with all such references in Appendix A.

     e.   Format.  This chapter follows the format of the overall
Handbook.  It includes the following seven major paragraphs:
program overview, program organization and management, Center and
DPI requirements, automated information categories and
sensitivity/criticality levels, AIS planning, special
considerations for microcomputer platforms, and other special
considerations for classified processing.

701  PROGRAM OVERVIEW

     a.   Issues, Purpose, and Scope.  The provisions of paragraphs
100 through 102 apply.

     b.   Systems Covered.  As noted in paragraph 103, this
Handbook applies both to NASA organizations and to support
contractors.  Therefore, the provisions of this chapter on
classified systems apply to all automated systems that store,
process, or communicate information for which NASA is the
classification authority.

          (1)  This chapter supplements, but does not replace, the
requirements in other NASA management instructions and handbooks on
security.

          (2)  This chapter covers all automated systems that
process classified information for NASA-sponsored activities.  When
NASA computers are used to process the Department of Defense's
(DOD's), some other Federal agency's, or some other country's
classified information, the security requirements of the
information owner apply.

          (3)  NASA may be required to support SCI or Special
Access programs from time to time.  These programs are outside the
scope of this Handbook and may require additional security
measures.  In these cases, mutual cooperation between Center
Designated Approving Authorities (DAA's) and the responsible agency
or department is essential.

     c.   Exceptions.  Exemptions to the safeguards mandated by
this chapter for classified systems cannot be granted by a DPI's
managing organization, as stated in paragraph 104 for unclassified
automated systems.  Waivers to specific requirements in this
chapter will be rare and must be requested in writing through the
appropriate C-CAISM, DAA, and PO-AISM to the Director, Security,
Logistics and Industrial Relations Division, at NASA Headquarters. 
Since most requirements in this chapter are mandated by other
Federal agencies, each waiver request must be examined to determine
which Federal policies apply and what appropriate actions to take
with those Federal agencies.

702  PROGRAM ORGANIZATION AND MANAGEMENT

     a.   Management Philosophies.  Although most safeguards for
classified environments are mandatory, reasonableness,
cost-effectiveness, and a total systems engineering approach remain
the driving philosophies in NASA (see paragraph 201).

     b.   NASA AIS Program Goal and Objectives.

          (1)  The goal in paragraph 202a applies to classified
environments, but the safeguard priorities and implementation
strategies are different.  Most safeguards for classified
environments are mandatory.  By the time all mandatory safeguards
are implemented, the funds available for discretionary safeguards
based on risk assessments are typically limited.  Nevertheless, the
goal of cost-effectiveness remains sound for both unclassified and
classified environments in NASA.

          (2)  This chapter focuses on achieving an effective level
of "confidentiality" in classified environments; however, the other
AIS objectives (i.e., integrity and availability) covered in
paragraph 202b apply as well.

     c.   Program Elements.  The program elements and the need for
sustaining effectiveness, as described in paragraph 203, apply.

     d.   NASA AIS Policy.  The policy in paragraph 204 applies. 
In addition:

          (1)  All elements and components of automated systems
that process NASA classified information shall function in a
cohesive, identifiable, predictable, and reliable manner such that
malfunctions can be detected and reported.  (See par. 704c(2).)

          (2)  NASA classified information handled and produced by
automated systems shall be safeguarded and labeled as appropriate
for the classification level of the information.  Automated systems
shall provide each user access to all information to which the user
is entitled, but no more.  Each file shall have an identifiable
origin and use.

          (3)  Communications links shall be secured in accordance
with the NASA Communications Security (COMSEC) Handbook
(operational draft).  Only Protected Distribution Systems (PDS's)
or cryptographic devices approved by NSA will be used.  (Also, see
par. 707l.)

          (4)  There shall be sufficient separation of duties and
responsibilities to ensure that no one person can compromise
classified information in automated systems without detection.  See
NCSC-TG-015.

     e.   HQ Responsibilities and Structures.  The responsibilities
and program structures in paragraphs 205 through 208 apply.

703  CENTER AND DPI REQUIREMENTS

     a.   Center Requirements.  The policies in paragraph 300
apply.  Also, see NCSC-TG-027.  In addition, the following apply
for classified automated system environments:

          (1)  A DAA must be appointed at each NASA Center.  The
DAA is the Center Director or a senior management official
designated by the Center Director, unless otherwise designated in
an MOU, who is in a position of authority capable of accepting or
reducing identified risk exposures and is therefore accountable
for making decisions on cost/risk tradeoffs relating to adequate
safeguarding alternatives.  The DAA authority may be delegated to
an equivalent sponsoring management authority in a major program
area after such individuals have received proper training on
performing DAA responsibilities.

     b.   DPI Requirements.  The policies in paragraph 301 apply. 
In addition,

          (1)  The DPI-AISO shall:

               (a)  Ensure appropriate classified automated system
security planning.  (See par. 705c.)

               (b)  Develop and document, in coordination with the
C-CAISM and DAA, a procedure for validating user identifiers
(USERID's) and passwords, and for prompt notification when a USERID
is no longer needed.  (See par. 707e.)

               (c)  Coordinate the processing of classified
information with other DPI functional areas that may impact, or may
be impacted by, the classified processing.

          (2)  A Classified System Security Manager (CSSM) shall be
appointed in writing for each automated system (whether a computer,
a communication system, or a network) that processes classified
information.  See NCSC-TG-026.  The CSSM shall:

               (a)  Ensure system security is being managed and
operated in accordance with the requirements outlined in this
Handbook and assure DPI compliance with COMSEC, PDS, TEMPEST,
physical, personnel, administrative, and information security
requirements, as they apply to classified processing.

               (b)  Ensure that each classified system has been
implemented as described in its system security plan and that the
specified protective measures are in place and operating.  (See
paragraphs 704c and 705c.)

               (c)  Ensure that a CSAT program for users of systems
processing classified information is developed, presented, and
documented.  (See par. 703h.)

               (d)  Assure that the established classified
automated information security incident reporting system is
maintained for the DPI.  (See par. 703g.)

               (e)  Recommend, develop, and implement, in
conjunction with applicable NASA and Federal requirements, the
selection, acquisition, distribution, and implementation of
security measures, as appropriate.  (See par. 703i.)

               (f)  Develop and maintain a system security plan. 
(See par. 705c.)

               (g)  Provide security auditing for automated system
and support activities.  (See par. 707f.)

               (h)  Report any known or potential security problems
to the DPI-AISO immediately.

               (i)  Assure that appropriate contingency plans are
developed, documented, tested, and maintained.  (See par. 703f.)

               (j)  Assure the proper marking, handling,
destruction, and removal of classified automated system media and
equipment.  (See par. 707h.)

               (k)  Develop and publish procedures for monitoring
and responding to system security threat situations (including
system-generated warning messages).

               (l)  Partially or completely suspend automated
system operations, when deemed appropriate, to resolve security
incident situations.

     c.   Management Process.  The requirements in paragraph 302
apply.

     d.   Security Risk Assessments.  The requirements in paragraph
303 apply.

     e.   Misuse and Abuse.  The requirements in paragraph 304
apply.

     f.   Security Certification.  The requirements in paragraph
305 apply.  In addition, classified systems must be "accredited" by
a Designated Approving Authority (DAA).

          (1)  Accreditation.  The decision to accredit takes into
account the system certification, the system security plan, and the
results of system security tests.  See NCSC-TG-007 and NCSC-TG-008. 
As noted in Exhibit 7-1, the DPI-AISO provides such information
through the C-CAISM and C-AISM to the DAA.  Without accreditation,
classified processing cannot occur.  The C-CAISM and C-AISM review
the information and forward an approval/disapproval recommendation
to the DAA.  If acceptable, the DAA issues the formal written
accreditation that specifies the authorized security mode, as
defined in paragraph 707n.  However, only NSA can approve the
operation of a Database Management System (DBMS) in a "multi-level"
security mode, as defined in paragraph 707n.  See NCSC-TG-021 for
more information on DBMS's.

               (a)  Security tests may need to be conducted by a
group independent from the user/developer and the DPI-AISO.  Should
any vulnerabilities be revealed during these tests, the DPI-AISO
shall ensure that the necessary steps are taken to eliminate or
minimize their impact.  Any modifications, changes, or additions to
the system security measures shall be included in a revised system
security plan, and the revised plan must be approved.

               (b)  Once approved, security testing is performed as
required by the DPI-AISO and C-CAISM.  The DPI-AISO evaluates the
implementation of the security features for the automated system
(including network connections) and verifies that the automated
system operates in accordance with the approved system security
plan.  The security test results and the certification are
forwarded through the C-CAISM and the C-AISM to the DAA.

     (Exhibit 7-1.  Accreditation Process for Systems
     Processing Classified Information--See hardcopy)

               (c)  The DAA reviews the certification and test
results and formally issues a written accreditation accepting the
risk of operating the automated system and authorizing its use as
documented in the system security plan.  A written accreditation is
forwarded through the accreditation chain to the DPI-AISO.

               (d)  The DAA may grant interim accreditations (such
as for 30 or 90 days).  Such interim accreditations might also be
limited to processing lower levels of classified information.

          (2)  Reaccreditation.  A revised system security plan and
a reaccreditation are required whenever the security environment
changes, the security requirements change, or 3 years have elapsed
since the prior accreditation, whichever is sooner.

          (3)  Network Accreditation.

               (a)  A classified communications link/network shall
be formally accredited prior to its operational use.  The DAA
shall, on an annual basis, review the communications link/network
implementation to verify that no changes have been made to the
communications link/network that might degrade overall security. 
The DPI-AISO responsible for the communications link/network and
the DAA shall be advised of any alterations to the communications
link/network design that might impact its security.  See
NCSC-TG-005, NCSC-TG-006, and NCSC-TG-011.

               (b)  The DPI-AISO should coordinate with the
appropriate NASA offices to determine the compatibility of the
overall network security plan and individual system security plans
of each communications link/network component.  Reaccreditation is 
required prior to the activation of a network whose components or
automated systems have undergone security-relevant changes.

     g.   Screening Non-Federal Personnel.  Paragraph 306 does not
apply, since the screening requirements for classified environments
are more stringent than those for unclassified environments. 
Access to classified information requires both a clearance and a
need-to-know, as defined in the proposed "NASA Security Handbook"
Section 1308.

     h.   Access by Foreign Nationals.  Paragraph 307 applies,
providing all other requirements have been satisfied for confirming
appropriate clearance, access, and need-to-know, as described in
NHB 1620.3, "NASA Security Handbook" Section 1308.

     i.   Contingency and Disaster Recovery Planning.  Paragraph
308 applies.  Also, see NCSC-TG-022.

     j.   CSIR Capability.  Paragraph 309 applies.  In addition,
any incident involving the potential unauthorized disclosure of
classified information is considered "significant."  When CSIR
reports potentially disclose vulnerabilities in classified
automated systems, they are to be classified at the level of the
affected systems.  Classified reports must be marked and handled
according to NHB 1620.3, "NASA Security Handbook."  Managers
without proper clearance and need-to-know cannot see such reports;
the flow of classified reports must bypass uncleared managers.

     k.   CSAT.  Paragraph 310 applies.  In addition, DPI-AISO's
should:

          (1)  Ensure that all users and support personnel involved
in classified processing sign an acknowledgment confirming that
CSAT training has been completed and is understood, prior to being
granted systems access, and yearly thereafter.

          (2)  Provide training in handling classified materials,
security modes of operation, reporting malfunctions/incidents,
unauthorized system use, current threats, and user access
requirements.

          (3)  Provide annual refresher training to system users. 
System support personnel shall also be trained in appropriate
operational security procedures for the particular automated system
and facility before they assume their duties.

     l.   Procurement of Products and Services.  The requirements
in paragraphs 311 and 312 apply.  In addition, NSA policies and
guidelines apply to the acquisition of TEMPEST and other special
features when the target environment will include classified
information.  (See par. 704c.)  Also, Centers shall actively
involve the DAA in the finalization of security requirements for
acquiring automated systems that will process classified
information.  Also, see NCSC-TG-009, NCSC-TG-013, and NCSC-TG-014.

704  AUTOMATED INFORMATION CATEGORIES AND SENSITIVITY/CRITICALITY
     LEVELS

     a.   Information Categories.  Paragraphs 400 and 401 apply. 
Classified systems fall into category 13.

     b.   Sensitivity/Criticality Levels.  Paragraph 402 applies. 
Sensitivity/criticality level 3 applies to automated systems that
process classified information.

     c.   Protective Measure Baseline.  The "protective
considerations" for sensitivity/criticality level 3 systems
(see paragraph 403) are all mandatory for classified environments. 
In addition, the provisions below and in paragraph 707 apply:

          (1)  Emissions Security.  Emissions Security, frequently
referred to as TEMPEST, is the study and control of compromising
radio frequency (RF) emanations that originate from automated and
communications systems.  In order to minimize the threat
of exploitation of these signals, countermeasures shall be
implemented.  Countermeasure determinations shall be accomplished
using the guidelines established in NTISSI 7000.  This document
outlines the requirements based on the classification level of the
information being processed, the amount of classified information
being processed, and the physical surroundings of the facility.

          (2)  Trusted Criteria.  Classified systems must be
designed and built to ensure a reasonable level of trust in their
ability to safeguard classified information.  This means at least
a C2 level for all classified systems.  The trusted computer
evaluation criteria, as defined in DoD 5200.28, "Department of
Defense Trusted Computer System Evaluation Criteria," apply to
existing and new systems.  See CSC-STD-003-85, CSC-STD-004-85, and
NCSC-TG-019.  The level of trust is based on the specific security
features and the degree of assurance that these features are
appropriately implemented.  The criteria for determining the
level of trust are divided into four divisions: D, C, B, and A,
ordered in a hierarchical manner with the highest division (A)
being reserved for systems providing the most comprehensive
security.  Within these four divisions are sub-categories, such
as 1, 2, or 3.  These sub-categories are also arranged
hierarchically, with 1 being the least restrictive.  The level of
trust required for a particular system depends on the level of
classified information processed and the minimum security clearance
level of the system users, as shown in Exhibit 7-2.

      (EXHIBIT 7-2.  Minimum Levels of Trust--See hardcopy)

          (3)  Communications Access.  Classified systems cannot
allow any dial-up access, no matter how controlled that access
might be.  Similarly, there can be no connections to external
networks or maintenance offices.

          (4)  Terminal Access.  User terminals connected to
classified systems must automatically log off users who are
inactive for 5 minutes or longer.  The system must also deactivate
a user after three incorrect authorization attempts (such as
entering incorrect passwords).  Reactivation of such users must
require manual validation of the user's identity.
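
          A minimal sketch of these rules follows; it assumes an
authenticate(userid, password) function is available, and the names
and structure are illustrative only.

# Illustrative three-attempt deactivation with manual reactivation.
_failures = {}
_deactivated = set()

def attempt_login(userid, password, authenticate):
    if userid in _deactivated:
        return False                # inactive until manually revalidated
    if authenticate(userid, password):
        _failures[userid] = 0
        return True
    _failures[userid] = _failures.get(userid, 0) + 1
    if _failures[userid] >= 3:      # three incorrect attempts
        _deactivated.add(userid)
    return False

def reactivate(userid):
    """Run only after manual validation of the user's identity."""
    _deactivated.discard(userid)
    _failures[userid] = 0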


705  AIS PLANNING

     a.   Headquarters AIS Planning.  Paragraphs 500 and 501 apply. 
These plans will be classified, if appropriate, in accordance with
"NASA Security Handbook" Chapter 12.

     b.   Center AIS Planning.  Paragraph 502 applies.  These plans
will be classified, if appropriate, in accordance with NHB 1620.3,
"NASA Security Handbook" Chapter 12.

     c.   DPI AIS Planning.  Paragraph 503 applies, except that the
plan format and content shall be as shown in Appendix E.  These
plans will be classified, if appropriate, in accordance with the
proposed "NASA Security Handbook" Chapter 12.  Each plan will flow
from system security manager, through the DPI-AISO, through the
C-CAISM, to the C-AISM.

     d.   External Requests for Reports on Planning Activity.  The
requirements in paragraph 505 apply.

706  SPECIAL CONSIDERATIONS FOR MICROCOMPUTER PLATFORMS

     a.   The requirements in Chapter 6 apply.  Note that these
requirements are in addition to those in paragraph 704c above. 
Also, see NCSC-WA-002 and NCSC-C1 Technical Report 001.

     b.   For the most part, microcomputer platforms and
stand-alone work stations will be operated in the "dedicated"
security mode.  When this is not the case, appropriate security
precautions must be taken in order to fulfill the intent of the
requirements outlined in this chapter.  All microcomputer platforms
and stand-alone work stations that process classified information
must be configured with an automated password management system and
audit trail capability.

707  OTHER SPECIAL CONSIDERATIONS FOR CLASSIFIED PROCESSING

     a.   Security Reviews, Tests, and Reporting.  Security reviews
and tests will be performed at least annually to determine the
effectiveness of implemented protective measures and to support
certification of systems processing classified information.  The
scope of the reviews and tests should be appropriate for the level
of classified information and applications processed and the
operating environment.  These reviews and tests shall be conducted
jointly by the CSSM and the DPI-AISO, and a report summarizing the
results shall be sent through the C-CAISM and the C-AISM to the
DAA within 2 weeks.

     b.   Maintenance Personnel.  All persons involved in hardware
maintenance or repairs requiring entry to the automated system
facility, or access to any parts or components of the automated
system, where: (i) the complexity of the automated system; (ii) the
nature of maintenance/repairs to be performed; (iii) the frequency
of visits; (iv) the amount and classification level of information
processed; or (v) the availability of knowledgeable escorts makes
escorting impractical, shall have a personnel security clearance at
least to the classification level that the automated system is
approved to process.  Maintenance personnel who do not have the
appropriate clearance must be accompanied by an authorized,
technically knowledgeable escort.  Escorts must be cleared at
least to the classification level the system is approved to
process and must take all reasonable protective measures to ensure
that the activities of the individual being escorted do not alter
the integrity of the automated system.

     c.   Visitors.  Persons visiting the area on an official
one-time or infrequent basis, and who will not have access to
classified information or to the system hardware or software, may
be admitted to the area when accompanied by an authorized escort
who will manage visitor access and be responsible for visitor
activities while in the area.  For other persons requiring entry
into the automated system facility or remote terminal areas who
will have access to classified information or to the automated
system hardware or software, follow applicable NASA procedures (see
NHB 1620.3, "NASA Security Handbook").

     d.   Physical Security.  Each DPI-AISO must, in cooperation
with the C-CAISM and the Physical Security Manager, ensure that
facilities are protected to the degree required by the highest
classification of their system(s).  Physical security safeguards
instituted must be in compliance with applicable Federal and NASA
guidelines (e.g., NHB 1620.3, "NASA Security Handbook").

          (1)  Each automated system, including remote terminals,
printers or other output devices, communications paths, memory, and
other interconnected devices, shall be afforded physical security
commensurate with the highest level and most restricted category of
classified information for which the system is accredited. 
Protective measures to safeguard the physical equipment apply not
only to the automated system equipment and its peripheral
equipment, but also to all removable media such as magnetic tapes,
magnetic disk packs, and spare or replacement parts once they are
associated with a system processing classified information.

               (a)  Where two or more automated systems of equal
classification are located in the same area, and the equipment
comprising each automated system is located and protected so that
direct physical access is effectively limited to that system, the
area limited to each system may be considered that automated
system's facility.  The measures and techniques for so "isolating"
that system shall be reflected in the automated system's system
security plan.

               (b)  Where two or more automated systems of
differing classification are located in the same area and the
personnel using these automated systems are not all cleared to the
appropriate level, strict physical security measures must be
implemented to eliminate the possibility of inadvertent disclosure. 
"Periods processing" may also be required.

               (c)  During "periods processing," remote terminals
not needed for classified processing shall be physically
disconnected at the automated system facility.

          (2)  Physical security safeguards and access protection
must be established and continuously maintained for automated
systems approved for the processing of classified information.

          (3)  When the automated system is used to process
classified information in an unattended mode, or when classified
information is left in a system, or elsewhere, unsecured within the
automated system facility or remote terminal areas, "closed areas"
must be established in compliance with Chapter 26 of NHB 1620.3,
"NASA Security Handbook."  Similarly, the storage of automated
system hardware and associated media should be in compliance with
NHB 1620.3, "NASA Security Handbook."

          (4)  When classified information has been removed from
the system and properly secured, continuous physical protection
areas must be established and maintained for the automated system
facility and areas housing remote terminals.  A protected area is
continuously protected by a collective level of physical security
safeguards and personnel access protection keyed to the prevention
or detection of unauthorized modification of the automated system
hardware.  The specific security measures established will vary
depending on:

               (a)  The overall physical security protection
already in effect at the facility;

               (b)  The environment in which the automated system
is employed, the relative potential for unauthorized access, and
the effectiveness of safeguards in reducing the risks of identified
threats;

               (c)  The classification level and volume of the
information to be processed; and

               (d)  The consistency, reliability, and auditing
capabilities of the safeguards to be employed.

Appropriate physical safeguards may include:

               (e)  Locks on doors to buildings and rooms that
provide reasonable protection against unauthorized entry into the
area;

               (f)  Intrusion Detection Systems (IDS's); and

               (g)  The use of equipment covers, enclosures, seals,
or locks to prevent or detect unauthorized access to the inside of
the equipment.

When unauthorized entry and/or modification of system hardware is
suspected, a thorough inspection/investigation of the protected
area and equipment must be conducted prior to classified
processing.  Such incidents must be recorded in the audit trail
system.

     e.   Access Control/Password Management System.  User access
to a system used to process classified information must be
controlled at a minimum through the use of a USERID and a unique
system log-on password.  See CSC-STD-002-85, NCSC-TG-017, and
NCSC-TG-020A.

          (1)  USERID Requirements.

               (a)  Classification.  USERID's will be unclassified.

               (b)  Deletion.  The assigned USERID will be disabled
immediately, when a system user no longer requires access to the
system, and deleted from the system within 30 days from the date it
is disabled.

               (c)  Entry Failures.  The number of consecutive
failed log-on attempts allowed for each USERID shall be limited to
three.  Failure by any user to successfully access the system
within that limit must automatically deactivate the user's ability
to access the system, create an appropriate audit trail record, and
require the DPI-AISO, or alternate, to reactivate the user's
capability.  (An illustrative sketch of these controls appears at
the end of paragraph 707e.)

          (2)  Password Requirements.

               (a)  Selection/Generation.  In all cases, the
procedure used by the DPI-AISO to select log-on passwords must be
approved by the C-CAISM.  System log-on passwords will be randomly
selected pronounceable words no shorter than six characters in
length.  Easily guessed passwords, such as names and birth dates,
are not permitted.  Passwords containing alphanumeric combinations
and special characters (e.g., $ and @) should be encouraged.  It is
also good practice to make some passwords, especially those
granting system administrator privileges, longer than the
six-character minimum.

               (b)  Validation.  Passwords must be validated by the
system each time the user accesses the system.

               (c)  Display.  System log-on passwords must not be
displayed at any terminal or printed at any printer.

               (d)  Change.  Passwords must be changed at least
every 6 months.  Compromised passwords must be deleted from the
system immediately.  The system manager will override and reassign
a new interim password whenever original passwords are lost or
forgotten.  The user will then be required to log on with the new
password and change it immediately.

               (e)  System Storage.  Passwords in computers must be
stored in encrypted form.

               (f)  Sharing.  Passwords shall not be shared or
copied by users.

               (g)  Classification.  Passwords shall be classified
at the level of the information to which they allow access.

          (3)  Password Access.  Access to files containing
(encrypted) passwords will be limited to the DPI-AISO and C-CAISM,
or alternates.

     f.   Audit Trails/Logs.  Audit trails provide a chronological
record of the use of the computer and system support activities
related to classified processing.  Approved audit trails will
provide the information required by the specific mode of operation. 
(See NCSC-TG-001.)  In addition to these requirements, the
following applies:

          (1)  Review.  Audit records must be reviewed at least
weekly.  Records should normally be reviewed each Monday for the
previous week, or daily if system usage is considered heavy.

          (2)  Retention.  Audit records should be retained for a
minimum of one year for "dedicated" and "system high" mode and two
years for systems operating in the "multi-level" mode.

          (3)  Manual System Logs.  For systems that are incapable
of providing an automated audit trail (such as some microcomputer
platforms), manual logs are required.  These logs must require
users to sign in, recording their names and start/stop times.

          (4)  Non-System Logs.  In addition to auditing basic
system usage, manual logs are required for all maintenance and
repairs, hardware/software configuration changes, and all anomalies
that might affect the security posture of the system.  See
NCSC-TG-006.
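
NOTE: As an illustration only, the following Python sketch shows
the kind of chronological, append-only record an audit trail must
support.  The field names are hypothetical; an approved audit trail
must capture whatever the accredited mode of operation requires
(see NCSC-TG-001).

     import json
     import time

     def audit_record(userid, event, resource, outcome):
         """Build one chronological audit-trail entry."""
         return {
             "timestamp": time.strftime("%Y-%m-%d %H:%M:%S"),
             "userid":    userid,    # individual accountability
             "event":     event,     # e.g., LOGON, FILE_OPEN, OUTPUT
             "resource":  resource,  # named object acted upon
             "outcome":   outcome,   # e.g., SUCCESS or FAILURE
         }

     def append_audit(path, record):
         # Append-only storage; protection of the trail itself from
         # modification or destruction is assumed to be enforced by
         # the system (see paragraph 707o(6)).
         with open(path, "a") as log:
             log.write(json.dumps(record) + "\n")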

     g.   Hardware and Software Security.  The establishment and
maintenance of hardware and software integrity is essential to
ensuring continued safeguarding of classified information.  See
NCSC-C Technical Report 79-91 and NCSC-C Technical Report 111-91.

          (1)  System Hardware.  Integrity of the hardware is
attained through the physical security provisions in paragraph 707d
and through configuration management.  See NCSC-TG-006 and
NCSC-TG-007.

          (2)  System Software.  System software must be classified
and controlled at the highest level of information processed on
that system.  When copies of the system software are not in the
system, they must still be safeguarded at the highest level of
information processed.  System software may be retained in the
automated system on non-removable storage media when the automated
system is not in use, provided that the entire system has
continuous physical protection as outlined in paragraph 707d. 
Also, see NCSC-TG-006, NCSC-TG-007, and NCSC-TG-008.

          (3)  Classified Application Software.  Application
software that in itself contains classified data or comments, or
that implements classified processes or algorithms, shall be
classified.  These provisions are applicable to both human and
machine readable versions of the software, as well as to supporting
and related documentation.  Changes to classified software shall be
made only by appropriately cleared and officially authorized
personnel.

          (4)  Unclassified Application Software and Data.  Steps
must be taken that preclude the inadvertent removal of classified
information and/or the introduction of malicious code by
unclassified applications.  Prior to introduction into a classified
environment, unclassified software shall be write-protected,
assigned an unclassified control number, and be reviewed, approved,
and authorized for use by appropriately cleared and knowledgeable
personnel who understand the security implications.  During the
control process, a duplicate copy should be made for everyday use
on the classified system.  Once unclassified media have been used
on a classified system, they must be treated as classified until a
stringent classification review process ensures that no classified
material has been written to the unclassified media.

          (5)  Storage Media.  Protection requirements for storage
media on which classified software and data reside will be attained
by marking, recording (i.e., an accountability record), and storing
the media in accordance with the provisions of this Handbook and
the proposed "NASA Security Handbook."  Protective measures shall
be commensurate with the requirements for the highest
classification level of software or data contained thereon.

     h.   Declassifying Memory, Media and Equipment. 
Declassification is the conversion from classified to unclassified. 
It is sometimes called "sanitizing."  All magnetic storage media
and equipment must eventually be declassified or destroyed.  The
record and marking requirements for declassification are covered by
NHB 1620.3, "NASA Security Handbook."  However, automated system
media (such as semiconductor memory and magnetic tapes) and
equipment use recording technologies that require special
declassification considerations.

          (1)  Most, but not all, internal memories can be
declassified using memory overwriting routines that meet NSA
standards.  See Appendix D for details.

          (2)  Most magnetic storage media can be declassified by
the use of degaussing equipment and devices approved by NSA. 
C-CAISM's should be aware of currently approved devices. 
Procedures shall be established to ensure strict compliance with
the manufacturer's instructions for operating the degaussing
equipment and to ensure continuing effectiveness of the equipment.

          (3)  Where feasible, each declassification action must be
verified (at least randomly) to ensure that all classified
information contained on the media has been destroyed.  (An
illustrative sketch of such verification follows subparagraph (4)
below.)
     
          (4)  Specific guidance for declassification and
verification of particular types of memory, storage media, and
equipment is available from C-CAISM's.  Authorized procedures for
declassification of the most commonly used memory types and storage
media are discussed in Appendix D.
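
NOTE: The following Python sketch suggests one way the random
verification called for in paragraph 707h(3) might be automated for
overwritten media.  It is illustrative only; authorized
declassification and verification procedures are those approved
under Appendix D and applicable NSA standards.

     import os
     import random

     def verify_overwrite(path, pattern=b"\x00", samples=64,
                          block=4096):
         """Randomly sample blocks of an overwritten file or device
         and confirm each contains only the final overwrite
         pattern."""
         size = os.path.getsize(path)
         with open(path, "rb") as dev:
             for _ in range(samples):
                 dev.seek(random.randrange(0, max(1, size - block)))
                 chunk = dev.read(block)
                 if chunk.strip(pattern):  # non-pattern byte remains
                     return False
         return True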

     i.   Clearing Memory, Media and Equipment.  Clearing does not
declassify.  Clearing removes classified information from memory or
storage media sufficiently to justify a lower (downgraded) or
higher (upgraded) classification level, but it cannot be used to
declassify memory, media, or equipment.  C-CAISM's and DPI-AISO's
must be aware of authorized procedures for clearing and verifying
specific memory types, storage media, devices, and equipment.  (See
Appendix D.)  Note that the procedures here are not intended to be
all-inclusive.

          (1)  Internal Memory.  Clearing internal memory, such as
random access memory (RAM), is similar to declassification.
DPI-AISO's and CSSM's must ensure that internal memory is cleared
prior to system downgrade.  This can be accomplished by overwriting
all of memory at least once.  Volatile memory can be cleared by
powering the automated system down.

          (2)  Magnetic Media.  Magnetic media such as magnetic
tapes, disks, and drums may be cleared by using an approved
overwrite routine that writes over every addressable bit with a
sequence of 1's, then a sequence of 0's, then 1's again, or a
similar procedure.  (An illustrative sketch follows subparagraph
(3) below.)  After verification of media contents, the media may be
used for recording information with an equal or lower
classification level.

          (3)  Equipment.  Any equipment or device physically
handling classified material shall be visually examined as part of
the process of clearing the equipment.  An examination of the
equipment must include a search of the locations in which
classified media or material may have become lodged.
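
NOTE: The following Python sketch outlines an overwrite routine of
the kind described in paragraph 707i(2).  It is illustrative only;
only routines approved under Appendix D may actually be used, and a
verification pass (such as the sketch following paragraph 707h)
should follow each clearing action.

     import os

     def clear_media(path, passes=(b"\xff", b"\x00", b"\xff"),
                     block=65536):
         """Overwrite every addressable location with 1's, then 0's,
         then 1's again (one single-byte pattern per pass)."""
         size = os.path.getsize(path)
         with open(path, "r+b") as dev:
             for pattern in passes:
                 dev.seek(0)
                 remaining = size
                 while remaining > 0:
                     n = min(block, remaining)
                     dev.write(pattern * n)
                     remaining -= n
                 dev.flush()
                 os.fsync(dev.fileno())  # force the pass to media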

     j.   Upgrading.  The following procedures shall be implemented
to adjust an automated system to a higher security level for:
initiating a classified processing period after an automated system
has not been in use; changing from an unclassified to a classified
processing period; or processing a higher level or more restrictive
type of classified information:

          (1)  If an automated system has not been in use or has
been processing unclassified information unattended, and continuous
physical protection has been provided in accordance with the
provisions of paragraph 707d, the immediate area shall be inspected
for signs of unauthorized entry and the equipment shall be
inspected for signs of unauthorized access to the interior (e.g.,
loose covers, broken seals, missing screws, pry marks, or
scratches).  If signs are found of potential unauthorized entry
into the area or access inside the equipment, established DPI
procedures should be followed for notifying the proper authorities
and reporting the incident.  Before using the automated system for
classified processing, the automated system hardware shall be
inspected and tested to reveal any hardware modifications. 
Appropriate software diagnostic routines may be used in this
process.

          (2)  All remote terminals that are not secured to the
higher level of classified processing to be accomplished shall be
disconnected.

          (3)  All storage media of a lower classification level
shall be disabled, disconnected, or dismounted, and either removed
from the area or otherwise segregated within the area.

          (4)  All internal memory, buffer storage, and other
reusable storage devices not disabled, disconnected, or dismounted
shall be appropriately cleared.  (See paragraph 707i.)

          (5)  Higher level protective measures for the computer
facility and connected remote terminal areas, including the
implementation of access protection, personnel clearance, and
physical security requirements, shall be imposed.

          (6)  A dedicated and previously protected copy of the
system software shall be loaded into the computer.  However, if the
system software has been retained in a computer on non-removable
storage media, the system software need not be reloaded if: (i) the
computer has not been used since the last classified processing
period; (ii) the security level of the last processing period was
not lower than the security level of the processing period being
established; (iii) continuous physical protection of the computer
has been maintained in accordance with the provisions of paragraph
707d, and there are no signs of unauthorized entry into the area
and/or access to the computer.

          (7)  A final security check shall be made of the
foregoing prior to initiation of the higher level processing. 
Audit trail records will be maintained indicating what actions were
taken, when, and by whom.

     k.   Downgrading.  The following procedures shall be
implemented to adjust a computer to a lower security level for:
processing a lower level or less restrictive type of classified
information; changing from a classified to an unclassified
processing period; or establishing a level of continuous physical
protection when the computer is not to be used (in accordance with
the provisions of paragraph 707d): 

          (1)  All removable storage media (including that
containing the safeguarded higher level software), listings,
ribbons, cards, classified waste, etc., associated with the higher
level of classified processing, shall be dismounted, collected,
marked, removed, and appropriately secured.  Where area protective
measures for the automated system facility or remote terminal areas
are continuously maintained at the higher classification level, the
removable storage media may be segregated within, rather than
removed from, the area, provided the higher level classified
information is not on line or otherwise accessible through the
automated system.

          (2)  All internal memory, buffer storage, and other
reusable storage devices remaining on the computer shall be cleared
in accordance with the procedures in paragraph 707i.

          (3)  A security check shall be made of the foregoing and
audit trail records maintained indicating what actions were taken,
when, and by whom.

     l.   Communications and Network Security.  Each communications
link that supports a classified system shall be protected according
to the highest level and most restrictive category of information
carried by that link.  

          (1)  Communications Security.  Only PDS's or NSA-approved
cryptographic devices shall be used to protect classified
information on communications lines that pass outside the security
area of an automated system.  The specific security of an automated
system facility, including the cryptographic devices or PDS to be
used, shall be defined in the system security plan.  See NASA
Communications Security (COMSEC) Handbook (Operational Draft) for
details.

          (2)  Network Security.  All requirements outlined in this
Handbook apply to systems of all configurations, including local
and wide area networks.  Each Installation shall be responsible for
ensuring networks originating at their Center, or under their
management cognizance, are compliant with these standards.  See
NCSC-TG-005 and NCSC-TG-011.

     m.   Maintenance Procedures.  System maintenance is a common,
necessary function that is normally performed by vendor-supplied
technicians.  In order to ensure the integrity of the classified
information processed by, and stored within, these systems, the
following procedures apply.

          (1)  Uncleared Maintenance Personnel.  Whether these
technicians are NASA employees, contractors, or vendor supplied,
uncleared maintenance personnel must be: U.S. citizens; logged on-
and off-site as visitors; escorted at all times by a technically
knowledgeable, cleared, escort; and approved by the DPI-AISO. 
Prior to maintenance visits by uncleared personnel, all classified
information and non-volatile media shall be placed in secure
storage, all volatile components cleared, and an "UNCLASSIFIED -
For Maintenance Only" copy of the operating system installed.

          (2)  Cleared Maintenance Personnel.  Cleared maintenance
personnel shall be allowed unescorted access to the system, but to
ensure that only information required by the technician is made
available: (i) non-volatile media must be stored in secure
containers or areas, (ii) volatile memory cleared, and (iii) access
to fixed disk limited to only those portions required for
maintenance.  Although cleared technicians have the clearances
required to enter a DPI unescorted, they do not have the
need-to-know to allow them unlimited access to the information
resident within NASA systems.

          (3)  Components.  Malfunctioning components will be
repaired within the facility using either a factory-fresh component
or a factory-repaired component obtained through a random selection
process (i.e., one not previously, or intentionally, assigned to
NASA).  Components containing only volatile memory may be released
after preparation of a locally prepared Equipment/Media
Declassification and Release Record.  This record must be filled
out by the technician, signed by the CSSM, and maintained for one
year.  Components with non-volatile memory should be destroyed.
If, however, it becomes necessary to release, or swap, a component
containing non-volatile memory (e.g., EEPROM), the component may be
released only after all non-volatile areas are declassified.  A
locally prepared Equipment/Media Declassification and Release
Record must be completed prior to release.  This record must be
filled out by the technician, signed by the CSSM and the DPI-AISO,
and maintained for 2 years.  (An illustrative sketch of such a
record follows below.)
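
NOTE: For illustration only, a locally prepared Equipment/Media
Declassification and Release Record of the kind described in
paragraph 707m(3) might be captured as in the following Python
sketch; every field name shown is hypothetical, and local formats
govern.

     from dataclasses import dataclass, field
     import time

     @dataclass
     class ReleaseRecord:
         """Hypothetical Equipment/Media Declassification and
         Release Record (paragraph 707m(3))."""
         component: str                # identification of the part
         volatile_only: bool           # True: retain 1 year; else 2
         technician: str               # filled out by the technician
         cssm_signature: str           # signed by the CSSM
         dpi_aiso_signature: str = ""  # required for non-volatile
         date: str = field(
             default_factory=lambda: time.strftime("%Y-%m-%d"))

         def retention_years(self):
             return 1 if self.volatile_only else 2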

     n.   Modes of Operation.  System security modes are authorized
variations in security environments and methods of operating
automated systems that handle classified information.  The modes
are primarily defined by the manner in which personnel security
clearance and need-to-know requirements are implemented.  NASA may
accordingly process, store, use, and produce classified information
in an automated system that is operating in one of the following
modes:

          (1)  Dedicated.  An automated system is operating in the
"dedicated" security mode when each user has both a security
clearance and a need-to-know for all information on the system.

          (2)  System High.  An automated system is operating in
the "system high" security mode when each user has a security
clearance for all information on the system, but a need-to-know for
only some of the information.

          (3)  Multi-Level.  An automated system is operating in
the "multi-level" security mode when some users have neither a
need-to-know nor a security clearance for one or more categories of
information on the system.
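
NOTE: The distinctions among the three modes can be summarized in
the following Python sketch, which derives the minimum mode
consistent with paragraph 707n from per-user clearances and
need-to-know.  It is explanatory only; the operating mode is fixed
by the accreditation, not computed at run time.

     def required_mode(users, information):
         """users: {name: (cleared_for, needs)} where each value is
         a set of information categories; information: the set of
         categories on the system."""
         cleared_all = all(information <= cleared
                           for cleared, _ in users.values())
         need_all = all(information <= needs
                        for _, needs in users.values())
         if cleared_all and need_all:
             return "dedicated"    # clearance and need-to-know, all
         if cleared_all:
             return "system high"  # clearance for all, partial need
         return "multi-level"      # some users lack some clearances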

     o.   System High Requirements.  In addition to the security
requirements covered earlier in this Handbook, systems processing
classified information in the "system high" security mode will:

          (1)  Insofar as is technically feasible, restrict the
security features of the system to a protected domain separate from
both the operating system and application program code.  This
domain should operate in a privileged state allowing no direct
access by non-privileged software or users, including operating
system software and system engineer/programmer users.

          (2)  Rigorously control and account for all automated
system hardware and software components, including system parameter
file references and all input/output devices.

          (3)  Limit the use of functions/utilities requiring
system privileges to only those persons and programs that have an
established need to use them.

          (4)  Define and control access between system users and
named objects (e.g., files and programs).  The enforcement
mechanism must be approved by the C-CAISM and DAA and must allow
system users to specify and control the sharing of those objects by
named individuals or explicitly defined groups of individuals, or
by both.  The access control mechanism must, either by explicit
user action or by default, provide that all objects are protected
from unauthorized access (discretionary access control).  Access
permission to an object by users not already possessing access
permission shall only be assigned by authorized users of the
object.  (An illustrative sketch follows this list.)

          (5)  No information, including encrypted representations
of information, produced by a prior subject's actions is to be
available to any subject that obtains access to an object that has
been released back to the system.

          (6)  The system shall be able to create, maintain, and
protect from modification or unauthorized access or destruction, an
audit trail that provides the capability to record, in addition to
other requirements, the use of identification mechanisms,
introduction of objects into a user's address space (e.g., file open
and program initiation), and deletion of objects.  The audit trail
capability must also record all output requests, any actions taken
by automated system operator personnel, all actions taken by system
administrators and system security personnel, and any override of
human-readable output markings.

          (7)  The system shall be able to enforce individual
accountability by providing the capability to uniquely identify
each individual automated system user and associating this identity
with all auditable actions taken by that individual.  See
NCSC-TG-017.
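
NOTE: The following Python sketch illustrates the discretionary
access control called for in item (4) above: objects are protected
by default, and only authorized users of an object may grant access
to named users or explicitly defined groups.  The interfaces are
hypothetical, and the actual enforcement mechanism must be approved
by the C-CAISM and DAA.

     class DacObject:
         """Sketch of a named object under discretionary access
         control (paragraph 707o(4))."""

         def __init__(self, owner):
             self.owner = owner
             # default protection: only the owner has any access
             self.acl = {owner: {"read", "write", "grant"}}

         def grant(self, grantor, grantee, modes):
             if "grant" not in self.acl.get(grantor, set()):
                 raise PermissionError(
                     "grantor lacks authority over this object")
             self.acl.setdefault(grantee, set()).update(modes)

         def check(self, user, mode):
             # default deny: absence from the ACL means no access
             return mode in self.acl.get(user, set())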

     p.   Multi-Level Requirements.  In addition to meeting all
security requirements for the "dedicated" and "system high" modes,
"multi-level" systems must:

          (1)  Support a trusted communications path between itself
and each system user for initial log-on and verification. 
Communications via this path shall be initiated exclusively by the
user.  The system shall enforce a mandatory access control policy
over all resources that are directly or indirectly accessible by
subjects external to the Trusted Computer Base (TCB).

          (2)  Assure restriction of users' access to those
portion(s) of the system for which they have been approved.

          (3)  Enforce compartmentation between clearance levels
(labels) and data type, and provide rigorous need-to-know controls
within each compartment and subcompartment.  Sensitivity labels
associated with the system shall be maintained by the operating
system and used as the basis for mandatory access control
decisions.  Sensitivity labels shall accurately represent security
levels of the specific subjects or objects with which they are
associated.

          (4)  Enforce, under system control, a system-generated
security classification banner at the top and bottom of each
physical page of system output.

          (5)  Enforce an upgrade/downgrade principle where all
users have a system-maintained classification, no data is read by
a process/user with a lower classification, and no data is written
by a process/user unless its security classification level is equal
to that of the process/user.  This rule may be overridden only by
trusted software approved by the C-CAISM and DAA.  (An illustrative
sketch follows this list.)

          (6)  Prevent the availability of data/image residue from
previous processes/users in central memory, disk storage, cache
memory, or peripheral processor memory.

          (7)  Provide an audit trail that records the routing of
all system jobs and output, classification upgrading and
downgrading, and the importing and exporting of labeled
information.

          (8)  Contain a mechanism that is able to monitor the
occurrence or accumulation of security auditable events that may
indicate an imminent violation of security policy.  This mechanism
shall be able to immediately notify the security administrator when
thresholds are exceeded and, if the occurrence or accumulation of
these security relevant events continues, the system shall take the
least disruptive action to terminate the event.

          (9)  Be capable of supporting two or more security
levels.

          (10) Meet the requirements of a B2 system as defined in
DOD 5200.28-STD, "Department of Defense Trusted Computer System
Evaluation Criteria," also known as the "Orange Book."
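
NOTE: The upgrade/downgrade principle of item (5) above can be
stated compactly in the following Python sketch over the
classification hierarchy of EO 12356.  It is illustrative only; in
a real "multi-level" system these checks are mandatory access
control decisions made by the TCB on the basis of sensitivity
labels.

     LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1,
               "SECRET": 2, "TOP SECRET": 3}

     def may_read(subject_level, object_level):
         # no data is read by a subject of lower classification
         return LEVELS[subject_level] >= LEVELS[object_level]

     def may_write(subject_level, object_level):
         # data is written only at the subject's own level; any
         # exception requires trusted software approved by the
         # C-CAISM and DAA
         return LEVELS[subject_level] == LEVELS[object_level]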



                           Appendix A,
                           REFERENCES

A.   LAWS

1.   Computer Fraud and Abuse Act of 1986 (PL 99-474).

2.   Computer Security Act of 1987 (PL 100-235).

3.   Counterfeit Access Device and Computer Fraud and Abuse Act of
     1984 (PL 98-473).

4.   Debt Collection Act of 1982 (PL 97-365).

5.   Federal Managers' Financial Integrity Act of 1982 (PL 97-255).

6.   Foreign Corrupt Practices Act of 1977 (PL 95-213).

7.   Freedom of Information Act, as amended (PL 93-502).

8.   Freedom of Information Reform Act of 1986 (PL 99-570).

9.   Paperwork Reduction Act of 1980 (PL 96-511).

10.  Privacy Act of 1974 (PL 93-579).

11.  Semiconductor Chip Protection Act of 1984 (PL 98-620).

12.  Small Business Computer Security and Education Act of 1984 (PL
     98-362).

13.  Title VIII - Paperwork Reduction Reauthorization Act of 1986
     (PL 99-500).

B.   FEDERAL DOCUMENTS

1.   OMB Circular A-123 (Revised), "Internal Control Systems,"
     August 16, 1983.

2.   OMB Circular A-127, "Financial Management Systems," December
     19, 1984.

3.   OMB Circular A-130, "Management of Federal Information
     Resources," December 1985.

4.   OMB Bulletin 90-08, "Guidance for Preparation of Security
     Plans for Federal Computer Systems that Contain Sensitive
     Information," July 9, 1990.

5.   OMB Bulletin 91-10, "Information Resources Management (IRM)
     Plans Bulletin," March 28, 1991.

6.   OMB Bulletin 92-05, "Information Resources Management (IRM)
     Plans Bulletin," March 6, 1992.

7.   OPM Regulation, "Training Requirement for the Computer
     Security Act," July 1988.

8.   Title 41, Code of Federal Regulations, Chapter 201, Parts
     201-6 and 201-7, entitled "Protection of Personal Privacy" and
     "Security of Information Resource Systems," respectively.

9.   FIPS PUB 31, "Guidelines for ADP Physical Security and Risk
     Management," June 1974.

10.  FIPS PUB 39, "Glossary for Computer Systems Security,"
     February 1976.

11.  FIPS PUB 41, "Computer Security Guidelines for Implementing
     the Privacy Act of 1974," May 1975.

12.  FIPS PUB 46-1, "Data Encryption Standard," January 1988
     (Reaffirmed until 1993).

13.  FIPS PUB 48, "Guidelines on Evaluation of Techniques for
     Automated Personal Identification," April 1977.

14.  FIPS PUB 65, "Guideline for Automatic Data Processing Risk
     Analysis," August 1, 1979.

15.  FIPS PUB 73, "Guidelines for Security of Computer
     Applications," June 1980.

16.  FIPS PUB 74, "Guidelines for Implementing and Using the NBS
     Data Encryption Standard," April 1981.

17.  FIPS PUB 81, "DES Modes of Operation," December 1980.

18.  FIPS PUB 83, "Guideline on User Authentication Techniques for
     Computer Network Access Control," September 1980.

19.  FIPS PUB 87, "Guidelines for ADP Contingency Planning," March
     27, 1981.

20.  FIPS PUB 88, "Guideline on Integrity Assurance and Control in
     Database Administration," August 1981.

21.  FIPS PUB 94, "Guideline on Electrical Power for ADP
     Installations," September 1983.

22.  FIPS PUB 102, "Guideline for Computer Security Certification
     and Accreditation," September 1983.

23.  FIPS PUB 112, "Standard on Password Usage," May 1985.

24.  FIPS PUB 113, "Standard on Computer Data Authentication," May
     1985.

25.  FIPS PUB 139, "Interoperability and Security Requirements for
     Use of the Data Encryption Standard in the Physical Layer of
     Data Communications," August 1983.

26.  FIPS PUB 140, "General Security Requirements for Equipment
     Using the Data Encryption Standard," April 1982.

27.  FIPS PUB 141, "Interoperability and Security Requirements for
     Use of the Data Encryption Standard with CCITT Group 3
     Facsimile Equipment," April 1985.

28.  SPEC PUB 500-54, "A Key Notarization System for Computer
     Networks," October 1979.

29.  SPEC PUB 500-61, "Maintenance Testing for the Data Encryption
     Standard," August 1980.

30.  SPEC PUB 500-85, "Executive Guide to ADP Contingency
     Planning," July 1981.

31.  SPEC PUB 500-109, "Overview of Computer Security Certification
     and Accreditation," April 1984.

32.  SPEC PUB 500-120, "Security of Personal Computer Systems - A
     Management Guide," January 1985.

33.  SPEC PUB 500-133, "Technology Assessment: Methods for
     Measuring the Level of Computer Security," October 1985.

34.  SPEC PUB 500-134, "Guide On Selecting ADP Backup Process
     Alternatives," November 1985.

35.  SPEC PUB 500-137, "Security for Dial-Up Lines," May 1986.

36.  SPEC PUB 500-153, "Guide to Auditing for Controls and
     Security: A System Development Life Cycle Approach," April
     1988.

37.  SPEC PUB 500-156, "Message Authentication Code (MAC)
     Validation System: Requirements and procedures," May 1988.

38.  SPEC PUB 500-157, "Smart Card Technology: New Methods for
     Computer Access Control," September 1988.

39.  SPEC PUB 500-158, "Accuracy, Integrity, and Security in
     Computerized Vote-Tallying," August 1988.

40.  SPEC PUB 500-160, "Report of the Invitational Workshop on
     Integrity Policy in Computer Information Systems (WIPCIS),"
     January 1989.

41.  SPEC PUB 500-166, "Computer Viruses and Related Threats: A
     Management Guide," August 1989.

42.  SPEC PUB 500-168, "Report of the Invitational Workshop on Data
     Integrity," September 1989.

43.  SPEC PUB 500-169, "Executive Guide to the Protection of
     Information Resources," October 1989.

44.  SPEC PUB 500-170, "Management Guide to the Protection of
     Information Resources," October 1989.

45.  SPEC PUB 500-171, "Computer User's Guide to the Protection of
     Information Resources," December 1989.

46.  SPEC PUB 500-172, "Computer Security Training Guidelines,"
     November 1989.

47.  SPEC PUB 500-174, "Guide for Selecting Automated Risk Analysis
     Tools," October 1989.

48.  SPEC PUB 500-189, "Security in ISDN," September 1991.

49.  SPEC PUB 800-1, "Computer Security in the 1980's: Selected
     Bibliography," December 1990.

50.  SPEC PUB 800-2, "Public-Key Cryptography," April 1991.

51.  SPEC PUB 800-3, "Establishing a Computer Security Incident
     Response Capability (CSIRC)," November 1991.

52.  SPEC PUB 800-4, "Computer Security Considerations in Federal
     Procurements: A Guide for Procurement Initiators, Contracting
     Officers, and Computer Security Officials," March 1992.

C.   NASA REQUIREMENTS

1.   NMI 1200.7, "NASA's Internal Management Control System."

2.   NMI 1371.3, "Coordination and Authorization of Foreign Visits
     to NASA Facilities."

3.   NMI 1600.2, "NASA Security Program."

4.   NMI 2410.7, "Assuring the Security and Integrity of NASA
     Automated Information Resources."

5.   NMI 2410.10, "NASA Software Management, Assurance, and
     Engineering Policy."

6.   NMI 2410.11, "Information Resources Management."

7.   NMI 9950.1, "The NASA Investigations Program."

8.   NHB 2410.1, "NASA Information Resources Management Handbook."

9.   NHB 1620.3, "NASA Security Handbook."

D.   NASA GUIDELINES

1.   "NASA Automated Information Security Awareness Briefing,"
     August 1988.

2.   "NASA Computer Security Awareness Course," September 30, 1989.

3.   "NASA Computer Security Awareness and Training (CSAT) Guide,"
     April 1990.

4.   "NASA Computer Security for Applications Development," August
     15, 1990.

5.   "NASA Microcomputer Security Course," September 30, 1989.

6.   "Report on Available AIS Software Tools," January 1992.

7.   "Security Risk Management Guideline," Guide AIS-2, January
     1992.

8.   "Software Assurance Guidebook," SMAP-GB-A201, September 1989.

E.   SUPPLEMENTAL POLICIES AND GUIDELINES FOR CLASSIFIED SYSTEMS

                     Executive Branch Policy

1.   "Advisory Memorandum on Office Automation Security Guideline,"
     NTISSAM COMPUSEC/1-87, January 16, 1987.

2.   "National Policy for the Security of National Security
     Telecommunication and Information Systems (U)," NSD 45,
     CONFIDENTIAL, July 5, 1990.

3.   "National Policy on Controlled Access Protection," NTISSP 200,
     July 15, 1987.

4.   "National Security Information," Executive Order 12356, April
     2, 1982.

5.   "TEMPEST Countermeasures for Facilities (U)," NTISSI 7000,
     SECRET, October 17, 1988.

                  Department of Defense Policy

6.   "Trusted Computer System Evaluation Criteria," DOD
     5200.28-STD, December 1985.

                National Computer Security Center
                    Standards and Guidelines

7.   CSC-STD-002-85, "Department of Defense Password Management
     Guideline," April 12, 1985.

8.   CSC-STD-003-85, "Computer Security Requirements - Guidance for
     Applying the Department of Defense Trusted Computer System
     Evaluation Criteria in Specific Environments," June 25, 1985.

9.   CSC-STD-004-85, "Technical Rationale Behind CSC-STD-003-85:
     Computer Security Requirements - Guidance for Applying the
     Department of Defense Trusted Computer System Evaluation
     Criteria in Specific Environments," June 25, 1985.

10.  NCSC-C Technical Report 79-91, "Integrity in Automated
     Information Systems," September 1991.

11.  NCSC-C Technical Report 111-91, "Integrity-Oriented Control
     Objectives: Proposed Revisions to the Trusted Computer System
     Evaluation Criteria (TCSEC), DOD 5200.28-STD," October 1991.

12.  NCSC-C1 Technical Report 001, "Computer Viruses: Prevention,
     Detection, and Treatment," March 12, 1990.

13.  NCSC-TG-001 (Version-2), "A Guide to Understanding Audit in
     Trusted Systems," June 1, 1988.

14.  NCSC-TG-002 (Version-1), "Trusted Product Evaluations: A Guide
     for Vendors," June 22, 1990.

15.  NCSC-TG-003 (Version-1), "A Guide to Understanding
     Discretionary Access Control in Trusted Systems," September
     30, 1987.

16.  NCSC-TG-004 (Version-1), "Glossary of Computer Security
     Terms," October 21, 1988.

17.  NCSC-TG-005 (Version-1), "Trusted Network Interpretation,"
     July 31, 1987.

18.  NCSC-TG-006 (Version-1), "A Guide to Understanding
     Configuration Management in Trusted Systems," March 28, 1988.

19.  NCSC-TG-007 (Version-1), "A Guide to Understanding Design
     Documentation in Trusted Systems," October 2, 1988.

20.  NCSC-TG-008 (Version-1), "A Guide to Understanding Trusted
     Distribution in Trusted Systems," December 15, 1988.

21.  NCSC-TG-009 (Version-1), "Computer Security Subsystem
     Interpretation of the Trusted Computer System Evaluation
     Criteria," September 16, 1988.

22.  NCSC-TG-011 (Version-1), "Trusted Network Interpretation
     Environments Guideline," August 1, 1990.

23.  NCSC-TG-013 (Version-1), "Rating Maintenance Phase Program
     Document," June 23, 1989.

24.  NCSC-TG-014 (Version-1), "Guidelines for Formal Verification
     Systems," April 1, 1989.

25.  NCSC-TG-015 (Version 1), "A Guide to Understanding Trusted
     Facility Management," October 18, 1989.

26.  NCSC-TG-017 (Version 1), "A Guide to Understanding
     Identification and Authentication in Trusted Systems,"
     September 1991.

27.  NCSC-TG-019 (Version-2), "Trusted Product Evaluation
     Questionnaire," May 2, 1992.

28.  NCSC-TG-020A (Version-1), "Trusted Unix (TRUSIX) Working Group
     Rationale for Selecting Access Control List Features for the
     Unix System,"  August 18, 1989.

29.  NCSC-TG-021 (Version-1), "Trusted Database Management System
     Interpretation," April 1991.

30.  NCSC-TG-022 (Version-1), "A Guide to Understanding Trusted
     Recovery in Trusted Systems," December 30, 1991.

31.  NCSC-TG-025 (Version-1), "A Guide to Understanding Data
     Remanence in Automated Information Systems," September 1991.

32.  NCSC-TG-026 (Version-1), "A Guide to Writing the Security
     Features User's Guide for Trusted Systems," September 1991.

33.  NCSC-TG-027 (Version-1), "A Guide to Understanding Information
     System Security Officer Responsibilities for Automated
     Information Systems," May 1992.

34.  NCSC-WA-002-85, "Personal Computer Security Considerations,"
     December 1985.

35.  NCSC "Final Evaluation Reports" for about 50 specific
     automated systems.

36.  NSA Information System Security Catalogue (Quarterly) that
     includes:

         Cryptographic Products List
         Endorsed Data Encryption Standard (DES) Products List
         Protected Services List
         Evaluated Products List
         U.S. Government Preferred Products List
         Degausser Products List

37.  "Language for RFP Specifications and Statements of Work - An
     Aid to Procurement Officers," July 1, 1992 (DRAFT)

                  NASA Policies and Guidelines

38.  NMI 1610.3, "Suspension, Revocation, and Denial of NASA
     Personnel Security Clearances."

39.  NMI 2520.1, "NASA Communications System Acquisition and
     Management."

F.   SOURCES OF REFERENCES:

1.   Local NASA Publications Offices

2.   Superintendent of Documents
     Government Printing Office
     Washington, DC 20402

3.   National Technical Information Service (NTIS)
     Springfield, VA 22161

4.   U.S. Department of Commerce
     National Institute of Standards and Technology (NIST)
     Gaithersburg, MD 20899

5.   National Security Agency (NSA)
     National Computer Security Center (NCSC)
     9800 Savage Road
     Fort George G. Meade, MD 20755-6000

6.   NASA Institutional Program Office AIS Manager
     (Through the Center AIS Manager)



                           Appendix B,
                          ABBREVIATIONS

A1 - An evaluation class requiring B3 plus a verified design

AC - Alternating Current

ACL - Access Control List

ADP - Automatic Data Processing

AIS - Automated Information Security  (NOTE: Although the "S" in 
AIS often stands for "System" in computer circles, the "S"  in AIS
always means "Security" in this Handbook.)

AISM - AIS Manager (NASA)

AISP - Automated Information Security Plan (NASA)

AOSS - Automated Office Support Systems (DOE)

ARC - Ames Research Center

B1 - An evaluation class requiring C2 plus labeled security
protection

B2 - An evaluation class requiring B1 plus structured security
protection

B3 - An evaluation class requiring B2 plus security domains

C1 - An evaluation class requiring discretionary access protection

C2 - An evaluation class requiring C1 plus controlled access
protection

CAI - Computer Assisted Instruction

C-AISM - Center AIS Manager (NASA)

C-AISP - Center AIS Plan (NASA)

C-CAISM - Center Classified AIS Manager (NASA)

CCITT - International Telephone and Telegraph Consultative
Committee

CAISP - Center Automated Information Security Plan (NASA)

CERT - Computer Emergency Response Team (DOE)

CD-ROM - Compact Disc Read Only Memory

CM - Configuration Management

CO - Contracting Officer

COMPSEC - Computer Security (DOD)

COMPUSEC - Computer Security (DOD)

COMSEC - Communications Security

COTR - Contracting Officer's Technical Representative

COTS - Commercial-Off-The-Shelf

CPU - Central Processing Unit

CRT - Cathode Ray Tube

CSAT - Computer (and Network) Security Awareness and Training

CSC - Computer Security Coordinator; Computer Security Center
(Replaced by the NCSC)

CSIR - Computer Security Incident Response

CSIRC - Computer Security Incident Response Capability

CSSM - Classified System Security Manager (DOD)

CSTVRP - Computer Security Technical Vulnerability Reporting
Program (DOD)

DAA - Designated Approving Authority; Designated Accreditation
Authority

DAC - Discretionary Access Control

db - Decibels

DBMS - Data Base Management System

DCID - Director of Central Intelligence Directive

DES - Data Encryption Standard

DOC - Department Of Commerce

DOD - Department of Defense

DOE - Department of Energy

DOJ - Department Of Justice

DPI - Data Processing Installation

DPI-AISC - DPI AIS Coordinator (NASA)

DPI-AISO - DPI AIS Official (NASA)

DPI-AISP - DPI AIS Plan (NASA)

DPI-CSO - DPI Computer Security Official (DoD)

DTLS - Descriptive Top Level Specification (NSA)

EEPROM - Electrically Erasable Programmable Read-Only Memory

EO - Executive Order

EPL - Evaluated Products List (NSA)

ETL - Endorsed Tools List (NSA)

FBI - Federal Bureau of Investigation

FIP - Federal Information Processing

FIPS PUB - Federal Information Processing Standards Publication
(NIST)

FIRMR - Federal Information Resources Management Regulation (GSA)

FIRST - Forum of Incident Response and Security Teams

FOCI - Foreign Owned, Controlled, or Influenced

FOIA - Freedom Of Information Act

FTLS - Formal Top Level Specification (NSA)

FY - Fiscal Year

GSA - General Services Administration

GSFC - Goddard Space Flight Center

HQ - Headquarters

I&A - Identification and Authentication

IDS - Intrusion Detection System

IG - Inspector General

IPO - Institutional Program Office (NASA HQ)

IPO-AISM - IPO AIS Manager (NASA HQ)

IPO-AISP - IPO AIS Plan (NASA HQ)

IRM - Information Resources Management

ISDN - Integrated Services Digital Network

ISSM - Information Systems Security Manager (DoD)

ITSP - Information Technology Systems Plan (NASA)

JPL - Jet Propulsion Laboratory

JSC - Lyndon B. Johnson Space Center

KSC - John F. Kennedy Space Center

LCM - Life Cycle Management

LaRC - Langley Research Center

LeRC - Lewis Research Center

MAC - Mandatory Access Control (NSA); Message Authentication Code
(NIST)

MCSC - Microcomputer Security Coordinator
 
MLS - Multi-Level Security (NSA)

MSFC - George C. Marshall Space Flight Center

NACSI - National Communications Security Instruction (NSA)

NASA - National Aeronautics and Space Administration

NASCOM - NASA Communications (Mission-Supporting Network)

NBS - National Bureau of Standards (now called NIST)

NCSC - National Computer Security Center (NSA); National
Communications Security Committee (subsumed by the NTISSC)

NCSL - National Computer Systems Laboratory (NIST)

NHB - NASA HandBook

NIST - National Institute of Standards and Technology

NISTIR - NIST Interagency Report

NMI - NASA Management Instruction

NSA - National Security Agency

NSD - National Security Directive (NSA)

NSDD - National Security Decision Directive (NSA)

NSI - NASA Science Internet

NSIR - Network Security Incident Response (NASA)

NSM - Network Security Manager

NSTISSAM - National Security Telecommunications and Information
Systems Security Advisory Memorandum (NSA)

NSTISSC - National Security Telecommunications and Information
Systems Security Committee (NSA)

NSTISSD - National Security Telecommunications and Information
Systems Security Directive (NSA)

NSTISSI - National Security Telecommunications and Information
Systems Security Instruction (NSA)

NSTISSP - National Security Telecommunications and Information
Systems Security Policy (NSA)

NTCB - Network Trusted Computing Base (NSA)

NTISSAM - National Telecommunications and Information Systems
Security Advisory Memorandum (NSA)

NTISSC - National Telecommunications and Information Systems
Security Committee (NSA)

NTISSD - National Telecommunications and Information Systems
Security Directive (NSA)

NTISSI - National Telecommunications and Information Systems
Security Instruction (NSA)

NTISSP - National Telecommunications and Information Systems
Security Policy (NSA)

OMB - Office of Management and Budget

OPM - Office of Personnel Management

PC - Personal Computer

PCL - Personnel (Security) Clearance

PDS - Protected Distribution System

PO-AISC - Program Office AIS Coordinator (NASA HQ)

PO-AISM - Program Office AIS Manager (NASA HQ)

PO-AISP - Program Office Automated Information Security Plan (NASA
HQ)

PSCN - Program Support Communications Network (NASA)

PTR - Preliminary Technical Review (NSA)

RAMP - RAting Maintenance Phase (NSA)

RF - Radio Frequency

RFP - Request for Proposal

RVM - Reference Validation Mechanism (NSA)

SAISS - Subcommittee on Automated Information Systems Security
(NTISSC)

SCC - Security Coordinating Committee (NASA)

SCI - Sensitive Compartmented Information

SFUG - Security Features User's Guide (NSA)

SMAP - Software Management and Assurance Program (NASA)

SPEC PUB - Special Publication (NIST)

SSC - John C. Stennis Space Center

ST&E - Security Test and Evaluation

STD - Standard

STS - Subcommittee on Telecommunications Security (NTISSC)

TCB - Trusted Computing Base (NSA)

TCSEC - Trusted Computer System Evaluation Criteria (NSA)

TEMPEST - (not an abbreviation)

TFM - Trusted Facilities Manual (NSA)

TLS - Top Level Specification (NSA)

TNI - Trusted Network Interpretation (NSA)

TNIEG - Trusted Network Interpretation Environments Guideline (NSA)

TQM - Total Quality Management

TSCM - Technical Security Countermeasures (NSA)

U - Unclassified

U.S.C. - United States Code

USERID - User Identifier

WORM - Write Once Read Many



                           Appendix C,
                           TERMINOLOGY

ACCEPTABLE RISK - A level of risk at which there is reasonable
assurance of management acceptance.  In practice, acceptability of
risk is a judgement call, based on local details (e.g., security
specifications, systems testing results, appropriateness and
completeness of the requirements definitions, perceptions of the
threat environments, and compliance with applicable policies).

ACCESS - Ability to obtain or change information.  Within a
computer, "access" is the interaction between a subject (e.g.,
person, process, or device) and an object (e.g., record, file,
program, or device) that results in the flow of information from
one to the other.  The nature or type of access can be read, write,
execute, append, modify, delete, or create.

ACCESS CONTROL - The process of limiting the access subjects have
to objects.  These limits can be enforced using discretionary or
mandatory access controls.  Discretionary access controls allow a
subject (e.g., an authorized user) with a certain access permission
(e.g., authorization to access file "XYZ001") the ability to pass
that permission on to another subject.  Such passing abilities are
typically controlled using access control lists (ACL's),
capabilities or tickets, user profiles, protection bits, and/or
passwords.  (See NCSC-TG-003.)  Mandatory access controls prohibit
a subject with a certain access permission from passing that
permission on to another subject.  Mandatory access controls are
implemented in computer hardware, firmware, and/or software.

ACCESS CONTROL MEASURES - Hardware and software features, physical
controls, operating procedures, management procedures, and various
combinations of these designed to detect or prevent unauthorized
access to a computer and to enforce access control.

ACCESS PORT - A logical or physical path for input/output data
streams.  Computers normally use unique identifiers to distinguish
among their many access ports.

ACCOUNTABILITY - The property that enables activities to be traced
to individuals who may then be held responsible for their actions.

ACCOUNTABILITY INFORMATION - A set of records, often referred to as
an audit trail, that collectively provide documentary evidence of
the processing, or other actions related to the security of a
computer.

ACCREDITATION - The written authorization from a DAA to activate or
operate a particularly sensitive (such as a classified) computer:
(i) in a particular security mode; (ii) with a prescribed set of
technical and nontechnical security safeguards; (iii) against a
defined threat; (iv) in a given operational environment; (v) under
a stated operational concept; (vi) with stated connections to other
automatic information systems/networks; and (vii) at an acceptable
level of risk for which the accrediting official has formally
assumed responsibility.  The accreditation statement affixes
security responsibility with the accrediting official and shows
that due care has been taken for security.

ADMINISTRATIVE SECURITY - The management procedures and
constraints, operational procedures, accountability procedures, and
supplemental controls established to provide an acceptable level of
protection for information.

APPLICATION -  A set of computer commands, instructions, and
procedures used to cause a computer to process information.  To
avoid confusion in this Handbook, the phrase "automated
application" is normally used.  Application software does not
include operating systems, generic utilities, etc. that are
normally referred to as "system software."

APPLICATION INTERNAL CONTROLS - Security controls in the
application software.  The objectives of these controls include
information validation, user identity verification, user service
authorization verification, journalling, variance detection, and
encryption.

APPLICATION OWNER - The highest level manager whose responsibility
is limited to the primary function supported by that application. 
By definition, every computer application has an owner.

APPROPRIATE - Actions, policies, procedures, or events that are
reasonably defensible, based on local environments, risk
assessments, and generally accepted practices.  Since neither the
NASA AIS Program Manager nor the PO-AISM's can prescribe sufficient
and reasonably detailed procedures for all situations NASAwide,
local managers must make many decisions concerning the
appropriateness of local actions, policies, procedures, and events. 
These decisions will be discussed during regular management reviews
and audits.

ASSETS - See "automated information resources."

ASSURANCE TESTING - A process used to determine that the security
features of a system are implemented as designed, and that they are
adequate for the proposed environment.  This process may include
hands-on functional testing, penetration testing and/or
verification.

AUDIT TRAIL - A record of computer activities.  Normally audit
trails are chronological.  They should be sufficient to enable the
reconstruction and examination of a sequence of events,
environments, activities, procedures, or operations from inception
to final result.

AUTHENTICATION - The act of validating the claimed identity of a
"subject" (e.g., individual, station, or originator).  Normally,
authentication follows the process of "identification."  This term
is also used to describe the act of verifying the integrity of
information (that has been stored, transmitted, or otherwise
exposed to possible unauthorized modification).

AUTHORIZATION - The privilege granted to a "subject" (e.g.,
individual, program, or process) by a designated official to do
something, such as access information based on the individual's
clearance and need-to-know.

AUTOMATED INFORMATION  - All recorded information regardless of its
media form (e.g., audible tone; paper; magnetic core, tape, or
disk; microform; electronic signal; and visual/screen displays)
that is processed by or stored for the purpose of being processed
by a computer.  The terms "automated information," "automated
data," "information," and "data" are considered synonymous and used
interchangeably in this Handbook.

AUTOMATED INFORMATION ASSETS - See "automated information
resources."

AUTOMATED INFORMATION RESOURCES - Data and information; computers,
ancillary equipment, software, firmware, and similar procedures;
facilities that house such resources; services, including support
services; and related resources used for the acquisition, storage,
manipulation, management, movement, control, display, switching,
interchange, transmission, or reception of data.  This includes
communications and network systems.

AUTOMATED INFORMATION SECURITY - The technical, personnel,
administrative, environmental, and access safeguards needed to
provide an appropriate level of protection for automation assets. 
This term is synonymous with "computer security" and includes
protecting computers, communications systems, networks, data,
information, and all related assets.

AUTOMATED INFORMATION SECURITY INCIDENT - An adverse event or
situation associated with a computer that results in: (i) a failure
to comply with security regulations or directives; (ii) an
attempted, suspected, or actual compromise of information; (iii)
the waste, fraud, abuse, loss, or damage of government property or
information; or (iv) the discovery of a vulnerability.

AUTOMATED INFORMATION SYSTEM - See "computer."

AUTOMATED OFFICE SUPPORT SYSTEMS - Stand-alone microprocessors,
word processors, memory typewriters, and terminals connected to
mainframe computers.

AUTOMATION ASSETS - Information stored in computers;
computer/communications hardware, firmware, and software; physical
facilities used to house computer/communications systems; and the
people and procedures needed to operate computer/communications
systems.

AVAILABILITY - The state that exists when data can be obtained
within an acceptable period of time. 

BELL-LA PADULA MODEL - A formal state transition model of computer
security policy that describes a set of access control rules for
hierarchical (e.g., classified) information structures.  In this
model the entities in a computer are divided into abstract sets of
subjects and objects.  The notion of a secure state is defined, and
it is proven that each state transition preserves security by
moving from secure state to secure state, thereby inductively
proving that a system is secure.  A system state is defined to be
"secure" if the only permitted access modes of subjects to objects
are in accordance with a specific security policy.  To determine if
a specific access mode is allowed, the clearance of a subject is
compared to the classification of the object.  See "simple security
property" and "star property."
  
CATEGORY - A grouping of information for which specific labelling
and/or handling must be used and that should influence the choice
of automated information security controls.  This Handbook lists 13
of these information categories.

CENTER - One of the major NASA sites (including the laboratories
and the HQ), as defined in NMI 1120.1, "Policy and Requirements for
Organization Changes."  For purposes of this Handbook, there are
ten centers: (i) Ames Research Center (ARC), (ii) George C.
Marshall Space Flight Center (MSFC), (iii) Goddard Space Flight
Center (GSFC), (iv) Jet Propulsion Laboratory (JPL), (v) John C.
Stennis Space Center (SSC), (vi) John F. Kennedy Space Center
(KSC), (vii) Langley Research Center (LaRC), (viii) Lewis Research
Center (LeRC), (ix) Lyndon B. Johnson Space Center (JSC), and (x)
NASA Headquarters (HQ).

CERTIFICATION - A written acknowledgement (by a NASA management
official) that there is reasonable assurance that an automated
application and its automated environment (i) meet all applicable
NASA and other Federal policies, regulations, and standards
covering security; and (ii) have been tested and technically
evaluated thoroughly enough to demonstrate that the installed
security controls are adequate.

CLASSIFIED - TOP SECRET, SECRET, or CONFIDENTIAL, in accordance
with EO 12356, in the interest of national security.

CLASSIFIED COMPUTER SECURITY PROGRAM - All of the technological
safeguards and managerial procedures established and applied to
computer facilities and computers (including hardware, software,
and information) in order to ensure the protection of classified
information.

CLASSIFIED INFORMATION - Data or information that requires
safeguarding in the interest of national security.  This
information is TOP SECRET, SECRET, or CONFIDENTIAL, in accordance
with EO 12356.  In this Handbook, "classified information" refers
specifically to information for which NASA is the classification
authority.

CLEARANCE - Authorization to access classified information at a
specified level (e.g., TOP SECRET).  A clearance does not in and of
itself authorize access to any classified information.  Access to
classified information requires both the appropriate clearance and
a need-to-know.  The need-to-know is based on a determination by
the information's owner or authorized custodian.

CLEARING - The overwriting of information on magnetic memory/media
such that the information cannot be reconstructed using normal
system capabilities (such as through the keyboard, or with advanced
diagnostic utilities).  Clearing is sufficient to allow reuse, but
not sufficient for declassification or sanitization.  Clearing is
less thorough than "purging."

CLOSED AREA - An area where security measures are primarily taken
to safeguard classified material and in which entry to the area
alone provides visible or audible access to classified information. 
A subcategory of "security area."  (See NHB 1620.3, "NASA Security
Handbook.")

COERCIVITY - The strength of a magnetic field, measured in
oersteds, based on the amount of applied magnetic field of opposite
polarity needed to reduce the magnetic induction to zero.

COLLATERAL CLASSIFIED - All national security information
classified under the provisions of an Executive Order for which
special intelligence community systems of compartmentation (i.e.,
sensitive compartmented information) are not formally established.

COMMUNICATIONS SECURITY (COMSEC) - The protection resulting from all
measures designed to deny unauthorized persons information of value
that might be derived from the possession and study of
telecommunications, or to mislead unauthorized persons in their
interpretation of the results of such possession and study.  This
includes ensuring the authenticity of such telecommunications. 
COMSEC includes cryptosecurity, transmission security, emission
security, and physical security of communications security material
and information.  (See NHB 1620.3, "NASA Security Handbook.")

COMPARTMENTATION - Separation between unclassified, CONFIDENTIAL,
SECRET, and TOP SECRET information/processes in computers.  (NOTE:
This is not the same as "sensitive compartmented information,"
which is a separate category of sensitive information beyond the
scope of this Handbook.)

COMPLIANCE REVIEW - A review and examination of records,
procedures, and review activities at a DPI in order to assess the
automated information security posture and ensure compliance with
this Handbook.    

COMPROMISE - The disclosure of private (especially classified) data
to persons who are not authorized to receive such data.

COMPROMISING EMANATIONS - Unintentional data-related or
intelligence-bearing signals that, if intercepted and analyzed,
could disclose the information being transmitted, received,
handled, or otherwise processed by any information processing
equipment.  The study and control of compromising emanations is
called TEMPEST.  (See NHB 1620.3, "NASA Security Handbook.")

COMPUTER - Any equipment or interconnected system or subsystems of
equipment used in the automatic acquisition, storage, manipulation,
management, movement, control, display, switching, interchange,
transmission, or reception of data or information.  This includes
mainframe computers, minicomputers, microcomputers, scientific
workstations, word processors, microprocessors, automated office
support systems, communications systems, and networks.

COMPUTER FACILITY - One or more rooms, generally contiguous,
containing the elements of a computer.

COMPUTER RESOURCES - See "automated information resources."

COMPUTER SECURITY - See "automated information security."

CONFIDENTIALITY - The state that exists when data is held in
confidence and is protected from unauthorized disclosure.
   
CONFIGURATION MANAGEMENT - Control of changes made to a computer's
hardware, software, and/or documentation (including an inventory of
the system elements) throughout the development and operational
life of the system.

CONFINEMENT PROPERTY - When the security features ensure that a
"subject" can WRITE to an "object" only if the security level of
the object is equal to or greater than that of the subject (i.e.,
"no write down").  This is also known as the "star property" or
"* property" in the Bell-La Padula security model.

CONTINGENCY PLAN - A document, developed in conjunction with
application owners and maintained at the primary and backup
computer installations, that describes procedures and identifies the
personnel necessary to respond to abnormal situations (including
disasters).  Contingency plans help managers ensure that computer
application owners continue to process (with or without computers)
mission-critical applications in the event that computer support is
interrupted.

CONTROL ZONE - The space, typically expressed in feet of radius,
surrounding equipment and in which there is sufficient physical and
technical control to preclude any unauthorized entry or compromise.
Also, see "security area."

CONTROLLED ACCESS - See "access control."

CONTROLS - See "safeguards."

COUNTERMEASURES - See "safeguards."

COVERT CHANNEL - A communications channel that allows a process to
transfer information in a manner that violates the system's
security policy.

COVERT TIMING CHANNEL - A covert channel in which one process
signals information to another by modulating its own use of system
resources (e.g., CPU time) in such a way that this manipulation
affects the real response time observed by the second process.
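
NOTE:  As a toy illustration only (Python; a single-process
simulation with hypothetical timing thresholds), one process can
leak a bit by varying how long it occupies the CPU while another
infers the bit from the delay it observes:

    import time

    def busy(seconds):
        # Occupy the CPU for roughly the requested time.
        end = time.perf_counter() + seconds
        while time.perf_counter() < end:
            pass

    def sender(bit):
        # Modulate CPU use: long burst for 1, short burst for 0.
        busy(0.20 if bit else 0.05)

    def receiver(observe):
        # Infer the bit from the observed response time.
        start = time.perf_counter()
        observe()                # contend for the shared resource
        return time.perf_counter() - start > 0.10

    secret_bit = 1
    assert receiver(lambda: sender(secret_bit)) == bool(secret_bit)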

CRITICAL RESOURCES -  Those physical and information assets
required for the performance of the installation's mission.

CRITICALITY RATING - An importance-related and/or time-related
designation assigned to a computer application that indicates when
it must be back in operation to avoid mission impacts after a
disaster or interruption in computer support services at a
multi-user installation.  To facilitate prioritized recovery
procedures and for operating at offsite backup facilities in a
degraded mode, computer applications should be assigned criticality
ratings of varying importance (e.g., most critical, critical,
important, deferrable). Applications with the same criticality
rating should be additionally ranked (e.g., numerically) according
to installation-determined processing priorities and perceptions of
importance.
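
NOTE:  For illustration (Python; the ratings, priorities, and
application names below are hypothetical), a recovery queue can be
ordered first by criticality rating and then by the
installation-assigned priority within each rating:

    RATING_ORDER = {"most critical": 0, "critical": 1,
                    "important": 2, "deferrable": 3}

    # (application, criticality rating, priority within rating)
    applications = [
        ("payroll",          "important",     2),
        ("telemetry intake", "most critical", 1),
        ("email gateway",    "deferrable",    1),
        ("mission planning", "critical",      1),
    ]

    recovery_queue = sorted(
        applications,
        key=lambda app: (RATING_ORDER[app[1]], app[2]))

    for name, rating, priority in recovery_queue:
        print(f"{rating}/{priority}: {name}")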

CRYPTOGRAPHY - The principles, means, and methods for rendering
information unintelligible, and for restoring encrypted information
to intelligible form.

DATA - Raw facts, figures, and numbers that, once properly
processed, can produce information.  However, in this Handbook
"data" is synonymous with "information."

DATA ENCRYPTION STANDARD - A cryptographic algorithm for the
protection of unclassified data, as defined in FIPS PUB 46.

DATA OWNER - See "information owner."

DATA PROCESSING INSTALLATION - A logical grouping of one or more
computers with common environmental and/or security characteristics
for automated information security management purposes.

DECLASSIFICATION - An administrative decision or procedure to
remove or reduce the security classification of a system, device,
or collection of information.  (See NHB 1620.3, "NASA Security
Handbook.")

DEDICATED SYSTEM - A computer or network that incorporates the
security mode of operation wherein all system users have
clearances, access approvals, and a need-to-know for all
information in the system.

DEGAUSSING - Reducing the magnetic flux density to zero by applying
a reverse magnetizing field.

DESTRUCTION - The physical alteration of computer media or
components such that they can no longer be used for storage or
retrieval of information.

DISASTER RECOVERY PLANS - Documents containing procedures for
emergency response, extended backup operations, and post-disaster
recovery should a computer installation experience a partial or
total loss of computer resources.  The primary objectives of these
plans, in conjunction with contingency plans, are to provide
reasonable assurance that a computer installation can recover from
such incidents, continue to process mission-critical applications
in a degraded mode, and return to a normal mode of operation within
a reasonable time.  Such plans are a protective measure generally
applied based on assessments of risk, cost, benefit, and
feasibility and an evaluation of other protective measures in
place.

DISCLOSURE - The (authorized or unauthorized) release of
information to someone.

DOMAIN - The unique context (e.g., access control parameters) in
which a program is operating.  In effect, the "domain" is the set
of objects that a subject has the ability to access.

EMANATIONS - See "compromising emanations."

EMISSION SECURITY - The protection resulting from all measures
taken to deny unauthorized persons information of value that might
be derived from the intercept and analysis of compromising
emanations from electronic/electrical systems.

ENCRYPTION - See "cryptography."

EXCLUSION AREA - A security area for the protection of classified
materials where mere access to the area would result in access to
those classified materials.  See "security area."

FEDERAL INFORMATION PROCESSING (FIP) RESOURCES - See "automated
information resources."  The term FIP is used throughout the
Federal IRM regulation (FIRMR).

FORMAL SECURITY POLICY MODEL - A mathematically precise statement
of a security policy.

FREEWARE - Software developed by computer hobbyists for personal
use by other computer hobbyists free of charge.

IDENTIFICATION - The process that enables recognition of users or
resources as identical to those previously described to a computer,
generally by the use of unique machine-readable names.

INFORMATION - Any communication or reception of knowledge, such as
facts, data, or opinions, including numerical, graphic, or
narrative forms, whether oral or maintained in any medium,
including computerized data bases, paper, microfilm, tapes, disks,
memory chips, Random Access Memory (RAM), Read Only Memory (ROM),
microfiche, communication lines, or display terminals.  The terms
"data," "information," "material," "documents," and "matter" are
used interchangeably in this Handbook.  

INFORMATION OWNER - The highest level manager whose responsibility
is limited to the primary functional area supported by that
information.  By definition, every piece of information has an
owner.

INFORMATION RESOURCES MANAGEMENT (IRM) - The planning, budgeting,
organizing, directing, training, and control of information itself
and of the related resources (such as, personnel, equipment, funds,
and technology).  (See NMI 2410.11, "Information Resources
Management.")

INSTALLATION CHIEF OF SECURITY - The Chief of Security at a NASA
Center who is responsible for the center's personnel, information,
and physical security programs.  (See NHB 1620.3, "NASA Security
Handbook.")   

INSTITUTIONAL MANAGEMENT - In NASA, the Headquarters level
management of Center level resources.

INSTITUTIONAL PROGRAM OFFICE (IPO) - One of the program offices at
NASA HQ that has been assigned overall facilities management
responsibility for one or more NASA Centers.

INTEGRITY - The state that exists when computerized data is the
same as that in the source documents or has been correctly computed
from source data and has not been exposed to accidental or
malicious alteration or destruction.  Thus, the data meets an a
priori expectation of quality, soundness, or perfection.

INTELLIGENCE INFORMATION - Classified information defined as
intelligence information by DCID 1/19.

ISOLATION - The containment of users and resources in a computer
such that they are separate from one another as well as from the
protection controls of the operating system.

LABEL - The marking of an item of information to reflect its
information category and/or security classification.  An internal
label reflects the classification and sensitivity of the
information within the confines of the medium containing the
information.  An external label is a visible and readable marking
on the outside of the medium (or the cover of the medium) that
reflects the category and/or classification of the information
within the medium.

LEAST PRIVILEGE - A principle that requires each subject be granted
the most restrictive set of privileges (including the lowest
clearance) needed for the performance of authorized tasks.  This
limits the damage from accidents, errors, and unauthorized use.

LICENSED SOFTWARE - Software sold under a license that requires
payment of a fee for each copy prior to use.

LIMITED ACCESS - See "access control."

LIMITED AREA -  A security area for the protection of classified
matter where guards, security inspectors, or other internal
controls can prevent access.  This is a subcategory of "security
area."

LOG-ON - The identification and authentication sequence that
establishes the user/process as authorized to access the computer. 
Conversely, "log-off" is the sequence that terminates user/process
access to the system.

LOGIC BOMB - A resident computer program that triggers the
perpetration of an unauthorized act when particular states of the
system are realized.

LONG-RANGE PLAN - A written description of the strategy for
implementing a program covering the next 5 years.

MANAGEMENT REVIEW - A review and examination of records,
activities, policies, and procedures established by Field Centers
to manage and coordinate automated information security programs
under their cognizance.  This review is normally conducted by
Headquarters personnel with NASA-wide automated information
security program management responsibilities.

MICROCOMPUTER PLATFORMS - All microcomputers.  This includes, but
is not limited to, all systems running PC/DOS, MS/DOS, Macintosh,
and UNIX operating systems.

MISSION-CRITICAL INFORMATION - Plain text or machine-encoded data
that, as determined by competent authority (e.g., information
owners), has high importance related to accomplishing a NASA
mission and requires special protection because unnecessary delays
in processing could adversely affect the ability of NASA, an owner
organization, or a NASA Center to accomplish such missions.

MODES OF OPERATION - See "security modes of operation."

MULTILEVEL SYSTEM - A computer or network that incorporates the
security mode of operation that allows two or more classification
levels (including unclassified) of information to be processed
simultaneously within the same system when some users are not
cleared for all levels of information present.

NASA CLASSIFIED INFORMATION - Information classified by NASA for
national security reasons as authorized in EO 12356.  Information
classified by other agencies, such as DOD as authorized in EO
12356, must be handled according to the policies and procedures
prescribed by the classifying agency.  (See NHB 1620.3, "NASA
Security Handbook.")

NASA COMPUTER SYSTEMS - Computers that process information on
behalf of NASA to accomplish a NASA function, whether operated by
NASA, NASA contractors, or other organizations.  (This definition
is based on the definition of "Federal computer system" in the
Computer Security Act of 1987.)

NATIONAL SECURITY INFORMATION - CONFIDENTIAL, SECRET, or TOP SECRET
information that could reasonably be expected to cause damage to
national security upon unauthorized disclosure.  This category of
sensitive information was established by a White House Executive
Order.  (See NHB 1620.3, "NASA Security Handbook.")

NEED-TO-KNOW - The necessity for access to, knowledge of, or
possession of specific information required to carry out assigned
duties.  (See NHB 1620.3, "NASA Security Handbook.")

NETWORK -  A communications medium and all components attached to
that medium that are responsible for the transfer of information. 
Such components may include computers, packet switches,
telecommunications controllers, key distribution centers, technical
control devices, and other networks.

NETWORK SECURITY PLAN -  The security plan that covers a network,
whether it is a local area network, a wide area network, or a
global network.

NON-FEDERAL PERSONNEL -  Non-civil servant employees.  The use of
the term non-Federal personnel in paragraph 306 of this Handbook
does not include foreign nationals.

OBJECT - A passive entity that contains or receives information. 
Access to an object potentially implies access to the information
it contains.  Examples of objects include:  records, blocks, pages,
segments, files, directories, programs, bits, bytes, fields, words,
processors, displays, keyboards, clocks, printers, and network
nodes.

OBJECT REUSE - The reassignment and reuse of a storage medium
(e.g., page frame, disk sector, magnetic tape, floppy disk) that
once contained one or more objects.  To be securely reassigned and
reused by a new subject, storage media must contain no residual
data (magnetic remanence) from the object(s) previously contained
in the media.
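
NOTE:  A minimal sketch of the idea (Python; illustrative only): a
storage manager clears residual data before a freed block is
reassigned to a new subject.

    def reassign_block(block: bytearray) -> bytearray:
        # Remove residual data before reuse by a new subject.
        for i in range(len(block)):
            block[i] = 0
        return block

    freed = bytearray(b"OLD SECRET PAYLOAD")
    reused = reassign_block(freed)
    assert all(byte == 0 for byte in reused)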

OPERATIONS SECURITY (OPSEC) - An analytical process by which one
can deny
potential adversaries information about capabilities and intentions
by identifying, controlling, and protecting evidence of the
planning and execution of sensitive activities and operations. 
OPSEC includes the protection of unclassified information that
might shed light on classified activities.  (See NHB 1620.3, "NASA
Security Handbook.")

ORANGE BOOK - A common nickname for DoD 5200.28-STD, "DoD Trusted
Computer System Evaluation Criteria."

PASSWORD - A protected word, phrase, or a string of symbols that is
used to authenticate the identity of a user.  Knowledge of the
password associated with a particular USERID is considered proof of
authorization to use the capabilities associated with that USERID.

PASSWORD SPACE - The total number of possible passwords that can be
created by a given password generation scheme.
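
NOTE:  For example, a scheme that draws 8-character passwords from
a 62-symbol alphabet yields a space of 62 to the 8th power, about
2.2 x 10^14 possibilities.  A one-line sketch (Python,
illustrative):

    # Password space = (alphabet size) ** (password length).
    alphabet_size, length = 62, 8      # e.g., A-Z, a-z, 0-9
    print(alphabet_size ** length)     # 218340105584896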

PENETRATION TESTING - Testing the feasibility of and confirming the
methods for defeating the security controls of a system.  This is
a specific subcategory of "security testing."

PERIODS PROCESSING - The processing of various levels of sensitive
information (e.g., SECRET and CONFIDENTIAL or classified and
unclassified) at distinctly different times on the same system. 
The system must be purged of all information from one processing
period before transitioning to the next.

PERSONNEL SCREENING - A protective measure applied to determine
that an individual's access to sensitive information is permissible.
The need for and extent of a screening process is normally based on
an assessment of risk, cost, benefit, and feasibility as well as
other protective measures in place.  Effective screening processes
are applied in such a way as to allow a range of implementation,
from minimal procedures to more stringent procedures commensurate
with the sensitivity of the data to be accessed and the magnitude
of harm or loss that could be caused by the individual.  (See NHB
1620.3, "NASA Security Handbook.")

PERSONNEL SECURITY - The procedures established to ensure that all
personnel who have access to any classified information have the
required authorizations, as well as the appropriate clearances. 
(See NHB 1620.3, "NASA Security Handbook.")

PHYSICAL SECURITY - The use of physical barriers (e.g., locks,
guards, badges, alarms, and similar measures, alone or in
combination) against threats to resources and sensitive
information.  (See NHB 1620.3, "NASA Security Handbook.")

PREFERRED PRODUCTS LIST - A list of commercial equipment that meets
TEMPEST and other NSA requirements.

PROCEDURAL SECURITY - See "administrative security."

PROCESS - A program in execution.

PROGRAM OFFICE - A focal point within NASA HQ (such as, Codes M, O,
R, or S) with management control responsibility for research and
development activity within its functional area at NASA Field
Centers.  Some Program Offices also have institutional
responsibility for several NASA Centers.  They are called
Institutional Program Offices (IPO's).

PROPERTY PROTECTION AREA - See "security area."

PROTECT AS RESTRICTED DATA (PARD) - A handling method for
computer-generated information that is not readily recognized as
classified or unclassified because of the high volume of output and
low density of potentially classified data.  Information is
designated as PARD because it has not had a sensitivity
(classification) review and must be protected under a different set
of security rules.

PROTECTED DISTRIBUTION SYSTEM (PDS) - A telecommunications system
to which acoustical, electrical, electromagnetic, and physical
safeguards have been applied to permit its use for secure
electrical or optical transmission of unencrypted classified
information or sensitive unclassified information.

PROTECTION INDEX - A measure of perceived risk determined from the
combination of the clearance level of users and the classification
of the data on the classified computer.
 
PROTECTIVE MEASURES - Physical, administrative, personnel, and
technical security features and actions that, when applied
separately or in combination, are designed to reduce the
probability of harm to, loss of, or compromise of computer
resources.

PUBLIC DOMAIN SOFTWARE - Software developed for use by anyone free
of charge.

PURGE - The removal of sensitive data such that the data cannot be
reconstructed using open-ended laboratory techniques.  The specific
procedures used vary, depending on the sensitivity of the data and
the characteristics of the memory/media used.  When combined with
appropriate administrative procedures, purging is a sufficient
basis for declassification.

QUALITATIVE RISK ASSESSMENT - A risk assessment that uses labels
(such as, high, medium, and low), rather than actual numbers, to
characterize anticipated likelihood and extent of harm to automated
information resources.

QUANTITATIVE RISK ASSESSMENT - A risk assessment that requires the
use of actual numbers, calculations of annualized loss expectancy
(ALE), and mathematical probabilities to characterize the
anticipated likelihood and extent of harm to automated information
resources.
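
NOTE:  As a minimal sketch (Python), assuming the common
formulation in which ALE is the loss per occurrence multiplied by
the annualized rate of occurrence; the events, dollar figures, and
rates below are hypothetical:

    # ALE = loss per occurrence x annualized rate of occurrence.
    events = [
        # (event, loss per occurrence ($), occurrences per year)
        ("disk failure",        25_000, 0.50),
        ("facility power loss", 80_000, 0.10),
        ("data entry fraud",    12_000, 0.25),
    ]

    total_ale = sum(loss * rate for _, loss, rate in events)
    print(f"Annualized loss expectancy: ${total_ale:,.0f}")  # $23,500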

RECERTIFICATION - An ongoing reassurance that a previously
certified system has been periodically reviewed, that compliance
with established protection policies and procedures remains in
effect, and that security risks remain at an acceptable level.

REFERENCE MONITOR CONCEPT - An access-control concept wherein an
abstract machine mediates all accesses to objects by subjects.

RISK - The probability of harm or loss.  The term "at risk" means
that harm or loss is reasonably likely.

RISK ASSESSMENT - The determination of security risk exposure. 
This usually requires a list of automated information resources and
a list of potential security impacts on those resources from
adverse events to determine the anticipated level of harm or loss
that, in turn, is used by management to determine whether the AIS
risk is acceptable.  Risk assessment methodologies may be
qualitative, quantitative, or a combination of both.  Often called
security "risk analysis."

RISK MANAGEMENT - The avoidance of excessive security risk through
a process of assessing, reducing, and monitoring uncertain events
that might affect a system.

RISK MONITORING - The daily activity to ensure the effectiveness of
AIS protective measures.  This includes determining when sufficient
change has occurred to the AIS risk exposure to require another
risk assessment.

RISK REDUCTION - The lessening of security risk exposure to an
acceptable level.  This requires the identification, analysis,
selection, approval, and implementation of cost-effective AIS
protective measures.  Risk reduction is sometimes called "safeguard
implementation."

SAFEGUARDS - The protective measures and controls used to improve
security (i.e., reduce risk).  Also called "countermeasures,"
"security controls," or "security features."

SANITIZATION - The elimination of classified information from a
computer or media associated with a computer to permit the reuse of
the computer or media at a lower classification level or to permit
the release to uncleared personnel or personnel without the proper
information access authorizations.

SECURITY AREA - A physically defined space to provide protection
for property or classified material.  See "control zone,"
"restricted area," "limited area," and "closed area."  (Reference
Chapter 26 of NHB 1620.3, "NASA Security Handbook.")

SECURITY DESIGN REVIEW - A review to ascertain whether implemented
protective measures meet the
original system design and approved computer application security
requirements.  The security design review may be a separate
activity or an integral function of the overall application system
design review activity.  

SECURITY KERNEL - The hardware, firmware, and software elements of
a TCB that implement the reference monitor concept.  A security
kernel should mediate all accesses, be protected from modification,
and be verifiable as correct.

SECURITY MODES OF OPERATION - A description of the conditions under
which a computer functions, based on the classification level of
the data processed and the clearance/authorization levels of the
system's users.  Three modes apply in NASA - dedicated, system
high, and multilevel.  (See paragraph 707n.)

SECURITY TEST AND EVALUATION - An examination and analysis of the
security safeguards of a system as they have been applied in an
operational environment to determine the security posture of the
system.

SECURITY TESTING - A process used to determine if the security
features of a system are implemented as designed.  This includes
hands-on functional testing, penetration testing, and verification.

SENSITIVE ADP POSITION - A personnel position that cannot be
occupied until completion of an employee background check.  The
comprehensiveness of the background check is determined by the
sensitivity of the information to which access will be allowed.

SENSITIVE INFORMATION - Unclassified information that requires
protection due to the risk and magnitude of loss or harm that could
result from inadvertent or deliberate disclosure, alteration, or
destruction of the information.  This includes information whose
improper use or disclosure could adversely affect the ability of an
agency to accomplish its mission, proprietary information, records
about individuals requiring protection under the Privacy Act, and
information not releasable under the Freedom of Information Act. 
(Reference OMB Circular A-130.)  This is not the same as the NSA
term "sensitive, but unclassified information."

SENSITIVITY AND/OR CRITICALITY LEVELS -  Four NASA hierarchical
groupings (labeled 0 through 3) used to help determine which
automated information security controls are needed.

SHAREWARE - Software developed for general use and distributed for
evaluation free of charge.  However, shareware authors usually
request that users pay a registration fee and/or pay for a
certified copy of the software if used after evaluation.

SHORT-RANGE PLAN - A 1-year (i.e., tactical) plan.

SIGNIFICANT AIS INCIDENT - An event that would be of concern to
senior NASA management due to potential for public interest or
embarrassment to the organization, or potential for occurrence at
other NASA sites.  These events would include such things as
unauthorized access, theft, an interruption to computer service or
protective controls, an incident involving damage, a disaster, or
discovery of a vulnerability.

SIGNIFICANT CHANGE - A change in a computer installation that could
impact overall processing requirements and conditions or
installation security requirements (e.g., adding a local area
network; changing from batch to online processing; adding dial-up
capability; carrying out major hardware configuration upgrades;
changing the operating system; making major changes to the physical
installation; or changing the installation location).

SIMPLE SECURITY PROPERTY - When the security features ensure that
a "subject" can READ an "object" only if the security level of the
"subject" is equal to or greater than that of the "object."

SOFTWARE - The computer code or "program" that executes or "runs"
on a computer.  The term "software" also includes the supporting
documentation.  Also, see "freeware," "public domain software,"
"shareware," and "licensed software."

SPONSOR/OWNER - The sponsor/owner of an automated application is
the local management individual with overall responsibility for the
functional area supported by the automated application in question. 
The sponsor/owner is the person responsible for development of
functional security requirements.

SPOOFING - An attempt to gain access to a system by posing as an
authorized user.  Synonymous with "impersonating," "masquerading,"
or "mimicking."

STAR PROPERTY - See "confinement property."

SUBJECT - An active entity (or process/domain pair) that causes
information to flow among objects or changes the system's state. 
Examples include people, processes, and devices.

SYSTEM - See "computer."

SYSTEM DESIGNER - The system designer is the person who interprets
the functional security requirements (developed by the application
sponsor/owner) and designs the technical security specifications.

SYSTEM DEVELOPER - The system developer incorporates the technical
security specifications into an operational system.

SYSTEM HIGH SYSTEM - A computer or network that incorporates the
security mode of operation wherein all system users have clearances
and access permission for all information in the system, but have
a need-to-know for only some of the information in the system.

SYSTEM SECURITY PLAN - An organized, logical, documented approach
for protecting a particular computer.

SYSTEM SUPPORT PERSONNEL - Those who administer and operate a
computer.  This includes all persons in the immediate vicinity of
the system attending to the operation, protection, and functioning
of the system, as well as those persons who manage, design,
program, modify, test, or install system software.

SYSTEM USERS - Those authorized persons who make use of
application programs via over-the-counter or remote means.
Persons who have the ability and means to create, destroy, change,
or retrieve data or program instructions in the system are also
considered to be users.

TECHNICAL FEASIBILITY ANALYSIS - A study to determine the
alternative controls for reducing identified risks.

TELECOMMUNICATIONS SECURITY - That domain of automated information
security that is concerned with protecting the point-to-point
communication (e.g., input device to computer, computer to
computer, etc.) of information with appropriate cost-effective
measures (e.g., data encryption and protected distribution
systems).  Such communications generally occur via data
communication systems, links, and devices such as wide area
networks, local area networks, telephone/wire lines, fiber optics,
radio waves/microwaves, and integrated circuits.

TEMPEST - A code name referring to the investigation and study of
compromising emanations.  It is sometimes used synonymously for the
term "compromising emanations" (e.g., TEMPEST tests and TEMPEST
inspections).

THREAT - Any circumstance or event with the potential to cause
harm.

TRANQUILITY - A security model rule stating that the security level
of an object cannot change while the object is being processed.

TRAP DOOR - A hidden software or hardware mechanism that can be
triggered to permit system protection mechanisms to be
circumvented.  Synonymous with "back door."

TROJAN HORSE - A computer program with an apparently (or actually)
useful function that contains additional (hidden) functions that
surreptitiously exploit the legitimate authorizations of the
invoking process.

TRUSTED PROCESS - A process whose incorrect or malicious execution
is capable of violating system security policy.

TRUSTED SYSTEM - A computer that employs sufficient hardware and
software integrity measures to allow its use for simultaneously
processing a range of classified information.

TRUSTED COMPUTER SYSTEM EVALUATION CRITERIA - The trusted computer
system evaluation criteria are defined in DoD 5200.28-STD,
"Department of Defense Trusted Computer System Evaluation
Criteria."  These criteria include requirements for specific
security features and assurance that those features are properly
implemented.  The criteria are divided into four divisions: D, C,
B, and A, ordered in a hierarchical manner with the highest
division (A) being reserved for systems providing the most
comprehensive security.  Within these divisions are classes, some
of which range from 1 to 3.  These classes are also arranged
hierarchically, with 1 being the least restrictive.  Thus, the
seven classes are:  D, C1, C2, B1, B2, B3, and A1.

TRUSTED COMPUTING BASE - The totality of protection mechanisms
within a computer (i.e., hardware, firmware, and software)
responsible for enforcing security.  The ability of a TCB to
correctly enforce security depends solely on the mechanisms within
the TCB and on the correct input of parameters (e.g., a user's
clearance) by system personnel.

USER - Any (authorized or unauthorized) individual or process that
operates the computer, accesses the computer, inputs commands to
the computer, or receives output from the computer.

USER IDENTIFIER - A unique symbol or character string used in a
computer to identify a user.  USERID's are not normally protected
as private/privileged information, but they are normally unique.

VERIFIABLE IDENTIFICATION FORWARDING - An identification method
used in networks where the sending host can verify that an
authorized user on its system is attempting a connection to another
host.  The sending host transmits the required user authentication
information to the receiving host.  The receiving host can then
verify that the user is validated for access to its system.  This
operation may be transparent to the user.
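
NOTE:  A minimal sketch of the message flow (Python; the host
names, record format, and validation rules are hypothetical):

    LOCAL_USERS = {"jsmith"}                # validated on host A
    REMOTE_ACCESS = {("jsmith", "host-b")}  # validated on host B

    def forward_identification(userid):
        # Sending host verifies its own user before forwarding.
        if userid not in LOCAL_USERS:
            raise PermissionError("not authorized on sending host")
        return {"userid": userid, "verified_by": "host-a"}

    def accept_connection(record, local_host="host-b"):
        # Receiving host verifies the forwarded user is valid here.
        return (record["userid"], local_host) in REMOTE_ACCESS

    assert accept_connection(forward_identification("jsmith"))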

VIRUS - Self-propagating software that parasitically attaches
itself to authorized software.  Viruses normally have three
functional components - mission, trigger, and self-propagation. 
See NCSC-C1 Technical Report 001.  Viruses are capable of doing
anything that software can do, both good and bad, once they are
activated in a system.

VULNERABILITY -  A weakness in security that could be exploited to
harm automation assets.

WORM - Software that surreptitiously burrows its way into systems
using unauthorized techniques.  Unlike viruses, worms need no host
software to which they attach; they propagate as self-contained
programs.  Like viruses, worms are capable of doing anything
software can do, both good and bad, once they are inside a system.



                           Appendix D,
                  DECLASSIFYING AND DESTROYING
                  MEMORY, MEDIA, AND EQUIPMENT

1.   INTRODUCTION.  All internal memory, removable media, and
computer equipment must eventually be declassified or destroyed. 
The record and marking requirements for declassification are
covered by NHB 1600.XX, "NASA Security Handbook."  But, computers
use recording technologies that require special declassification
considerations.  Also, see NCSC-TG-025.

2.   INTERNAL MEMORY.  Most internal memories can be declassified
using memory overwrite routines that meet NSA standards.  Once
declassified, they can be destroyed as unclassified waste. 
However, in some cases, destruction by burning, shredding, or acid
bath is the only approved method.  Authorized procedures for
declassification of the most commonly used memory types are
discussed in the following subparagraphs:

     a.   Volatile Semiconductor Memory.  This type of memory can
be declassified by overwriting each memory location with any
character pattern, or by the removal of main and any backup power
from the system.

     b.   Non-Volatile Semiconductor Memory.  Normally this type of
memory holds only unclassified information, such as addresses and
instructional commands; but, if required, declassification shall be
in accordance with vendor- and NASA-suggested methods (e.g.,
manually setting bit patterns to 0's or erasing memory with
ultraviolet light).

     c.   Magnetic Core Memory.  This type of memory, which is used
in the processing of information classified no higher than SECRET,
can be declassified by overwriting each addressable memory location
alternately with any pattern of bits and then with its
complementary or opposite bit pattern (1's then 0's) for 100
changes of state.  The same procedure applies for declassification
of core memory used for processing TOP SECRET information, except
that the
memory must be overwritten for 1,000 changes of state.
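
NOTE:  The alternating-pattern overwrite described above can be
sketched as follows (Python; an illustration of the
pattern/complement cycle only, not an approved declassification
tool):

    def overwrite(memory: bytearray, changes_of_state: int,
                  pattern: int = 0b10101010) -> None:
        # Alternately write a bit pattern and its complement to
        # every addressable location: 100 changes of state for
        # SECRET core memory, 1,000 for TOP SECRET.
        for cycle in range(changes_of_state):
            value = pattern if cycle % 2 == 0 else pattern ^ 0xFF
            for i in range(len(memory)):
                memory[i] = value

    core = bytearray(b"classified contents")
    overwrite(core, changes_of_state=100)
    assert set(core) == {0b01010101}  # last pass wrote the complement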

3.   REMOVABLE MEDIA.  Most removable magnetic storage media can be
declassified using electrical degaussing equipment approved by NSA. 
Once declassified, they can be disposed of as unclassified media.

     a.   Magnetic Tape Reels and Cartridges.  These tapes can be
declassified by the use of an approved degausser.

          (1)  Normally, the residual signal level after erasure
with AC electrical degaussing equipment must measure 90 decibels
(dB) below the recorded signal level.  When degaussing with a
hand-held permanent magnet or with DC-powered electrical equipment,
the equipment must saturate the media so that the noise level is
raised to mask the signal level.  Attention must be paid to the
inter-record gap areas to ensure all addressable spaces are
declassified.

          (2)  Tape degaussers can purge Type I and Type II tapes
(i.e., those with coercivities of 0-350 oersteds and 350-750
oersteds, respectively), but are not authorized for purging
"high-energy" magnetic tapes with a coercivity greater than 750
oersteds (called Type III tapes).  Type III tapes cannot be
declassified and shall have a distinguishing label applied to the
reel that identifies the tape as "not declassifiable."

     b.   Flexible (Floppy) Disks and Diskettes.  Floppy disks can
be declassified by use of an approved floppy disk degausser, an
approved electrical tape degausser with the appropriate adapter, or
by exposing the recording surfaces to an approved hand-held
permanent magnet.

     c.   Hard Disk Platters, Drums and Similar Devices.  These
types of media can be declassified by exposing the recording
surfaces to an approved hand-held permanent magnet or using an
approved overwrite procedure such as those previously mentioned. 
Attention must be paid to the bad sectors of these types of media
to ensure all addressable spaces are declassified.

     d.   Magnetic Cards.  Magnetic cards can be declassified by
using an approved electrical tape degausser with the appropriate
adapter.

4.   EQUIPMENT.  Computer equipment can be declassified by
declassifying all classified components.  Once declassified, the
system can be disposed of as an unclassified system.

     a.   Cathode Ray Tube.  Cathode Ray Tube (CRT) screen surfaces
shall be inspected and/or tested to detect evidence of burned-in
information.  If the inspection reveals classified information
etched into the phosphor, the CRT device shall be retained within
the appropriate security environment, or the screen itself shall be
destroyed.  In the absence of burned-in classified information, the
CRT may be handled as unclassified.

     b.   Laser/Optical Disks.  Laser disks (such as, CD-ROM and
WORM) shall be declassified by use of techniques approved by NSA. 
There are currently no procedures that adequately purge such media. 
Therefore, when no longer needed, laser disks must be destroyed in
accordance with NSA-approved procedures.



                          Appendix E, 
             PLAN FORMAT FOR NASA CLASSIFIED SYSTEMS

A classified system security plan is prepared as the basic system
security document and as evidence that the proposed classified
computer system, or update to an existing classified computer
system, meets the appropriate automated information security
program requirements for classified processing.  The plan is used
throughout the certification and accreditation process and serves
for the lifetime of the system as the formal record of the system
and its environment as approved for operation.  It also serves as
the basis for inspections of classified systems.  The C-CAISM
shall maintain a current copy of all approved system security
plans for the Installation.  The DAA shall maintain current
accreditation documentation of systems for which they are the
designated approving authority (i.e., accrediting official).

Note:  A classified system security plan may contain classified
information and shall be marked and protected according to NASA's
established policies and procedures for classified handling.

     1.   ATTACHED DOCUMENTS.  Where sections of the following
information are common to several classified computer systems at
an Installation, the information may be contained in a separate
document and that document attached to or referenced in each
system security plan.

     2.   CLASSIFIED SYSTEM SECURITY PLAN CONTENTS.  The plan
formally documents the operation of a classified system and the
mechanisms that are used to control access and protect the system
and its information.  To make appropriate accreditation
decisions, the DAA needs to understand the complete system
environment.  Therefore, at a minimum, each plan shall contain
the following information:
     
     a.   The identification and location of the computer system.

     b.   The name, location, and phone number of the responsible
C-CAISM or DPI-AISO.

     c.   A narrative description of the classified computer
system and the rules for permitting and denying access to the
information that is processed, stored, transferred, or accessed
by the system.  These rules must describe how access will be
controlled based on the classification of information processed,
and the clearance level and need-to-know of users.

     d.   A description of the system's computing environment
that includes at least:

          (1)  Determination of the protection requirements for
the system.

          (2)  Description of the methods used to meet the above
protection requirements, including a description of
security-related software.
 
          (3)  The level and amount of classified information to
be processed, stored, transferred, or accessed in the system.

          (4)  The architecture of the system, including all
hardware components, showing the organization, interconnections,
and interfaces of these components. (A schematic drawing may be
used to satisfy this requirement.)

          (5)  A detailed inventory of the classified system
components including software and hardware.

          (6)  Description of the control mechanisms to be used
for review and approval of modifications to the classified
system.

     e.   The evidence, or basis for certification, that each of
the requirements of this Handbook has been met.  This
description shall specifically address the requirements of at
least the following areas:

          (1)  Personnel Security.
          (2)  Physical Security.
          (3)  Telecommunications Security.
          (4)  Hardware and Software Security.
          (5)  Administrative Security.

     f.   A description of the management controls established to
prevent waste, fraud, and abuse.

     g.   A risk assessment that provides a measure of the
relative vulnerabilities and threats.

     h.   A description of the security training required for the
personnel associated with the classified computing system.

     i.   The procedures to be used by the personnel associated
with the classified system for reporting any automated information
security incidents to appropriate management.   These procedures
shall include the actions to be taken to secure the classified
system during a security-related incident.

     j.   The contingency plan and recovery procedures for the
classified system, including the designation of persons
responsible for carrying out particular procedures, and the plan
for testing the operations of the contingency plan.

     k.   A description of the process used to protect the
current backup copies of critical software, information, and
documentation.

     l.   Escort procedures, including procedures unique to this
classified system.

     m.   A description of the controls for access to the
classified system.  If passwords are used for access control,
describe how they are selected, their length, the size of the
password space, etc.

     n.   The procedures for operating the system in an interim
period during updates or changes to the system.

     o.   If remote diagnostic services are to be used, specify
the methods of connection and disconnection and related security
measures.



                           Appendix F,
                              INDEX

A1 . . . . . . . . . . . . . . . . . . . . . . . . . . .B-1, C-19
AC . . . . . . . . . . . . . . . . . . . . . . . . . . . B-1, D-2
acceptable risk. . . . . . . . . . . . . . . . . . .1-3, 3-9, C-1
access control . .6-1, 7-13, 7-21, 7-22, A-2, A-4, A-7, A-8, B-1,
                          B-2, B-5, C-1, C-4, C-7, C-8, C-10, E-3
access port. . . . . . . . . . . . . . . . . . . . . . . . . .C-1
accountability . . . . . . . . . . 2-3, 3-6, 7-16, 7-21, C-1, C-2
accreditation. . 1-2, 5-5, 7-5, 7-6, 7-7, A-3, B-2, C-1, C-2, E-1
ACL. . . . . . . . . . . . . . . . . . . . . . . . . . . B-1, C-1
administrative security. . . . . . . . . 3-2, 3-5, C-2, C-13, E-2
ADP. . . . . . . . . . . . . .1-2, 3-14, 5-5, A-2, A-3, B-1, C-16
AISM . . . . . . . . .i, 1-3, 1-4, 2-1, 2-8, 2-11, 3-1, 3-2, 3-4,
                   3-5, 3-11, 3-14, 3-15, 3-18, 3-19, 3-20, 3-22,
              3-24, 5-1, 5-2, 5-3, 5-6, 7-2, 7-5, 7-6, 7-7, 7-10,
                                   7-11, 7-15, B-1, B-4, B-7, C-2
application. . . . . . . .1-1, 3-4, 3-10, 3-11, 3-12, 3-13, 3-15,
            3-17, 3-24, 3-25, 4-1, 4-8, 4-12, 5-1, 5-5, 5-8, 6-1,
                 6-2, 7-16, 7-21, C-2, C-4, C-7, C-15, C-17, C-18
application owner. . . . . . . . . . . . . . . . . . . . . . .C-2
application sponsor. . . . . . . . . . . . . . . . . . . . . C-17
ARC. . . . . . . . . . . . . . . . . . . . . . . . . . . B-1, C-4
assurance testing. . . . . . . . . . . . . . . . . . . . . . .C-2
audit trail. . . . . . .7-11, 7-13, 7-14, 7-15, 7-18, 7-19, 7-21,
                                                   7-22, C-1, C-2
authentication . . . . .4-10, 4-12, A-2, A-3, A-4, A-7, B-4, B-5,
                                                  C-3, C-10, C-19
authorization. . . . .3-14, 4-10, 4-12, 7-10, A-5, C-1, C-2, C-3,
                                                  C-5, C-12, C-15
automated information assets . . . . . . . . . . . . . .3-14, C-3
automated information resources. . . . . . i, 1-2, 1-3, 1-4, 1-5,
                                  2-2, 2-3, 2-5, 2-10, 2-14, 3-4,
                    4-10, 5-2, 5-3, A-5, C-2, C-3, C-6, C-9, C-14
automation assets. . . . . . . . . . . . . . . . . C-3, C-4, C-19
availability . . . . . . . . . . .1-1, 1-4, 2-2, 2-11, 3-6, 3-12,
                            3-20, 4-8, 4-12, 7-3, 7-11, 7-22, C-4
B1 . . . . . . . . . . . . . . . . . . . . . . . . . . .B-1, C-19
B2 . . . . . . . . . . . . . . . . . . . . . . . .7-23, B-1, C-19
B3 . . . . . . . . . . . . . . . . . . . . . . . . . . .B-1, C-19
backup . 1-2, 3-6, 3-15, 3-17, 4-10, 6-2, A-3, C-7, C-8, D-1, E-3
C-AISM . . . . .i, 1-3, 1-4, 2-1, 2-11, 3-1, 3-2, 3-4, 3-5, 3-11,
                   3-14, 3-15, 3-18, 3-19, 3-20, 3-22, 3-24, 5-1,
                    5-2, 5-3, 5-6, 7-5, 7-6, 7-7, 7-10, 7-11, B-1
C-CAISM. . . . . . . . 7-2, 7-4, 7-5, 7-6, 7-7, 7-10, 7-11, 7-12,
                           7-14, 7-16, 7-17, 7-21, 7-22, B-1, E-1
C1 . . . . . . . . . . . . . . . . . . . . . 7-11, A-7, B-1, C-19
C2 . . . . . . . . . . . . . . . . . . . . . . . . 7-9, B-1, C-19
CAI. . . . . . . . . . . . . . . . . . . . . . . . 3-22, B-1, B-2
category . . . . . . . 3-14, 4-1, 4-4, 4-7, 7-9, 7-12, 7-19, C-4,
                                                  C-6, C-10, C-11
CCITT. . . . . . . . . . . . . . . . . . . . . . . . . . A-3, B-1
CD-ROM . . . . . . . . . . . . . . . . . . . . . . . . . B-2, D-2
CERT . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-2
certification. . . . 1-2, 2-11, 3-6, 3-7, 3-11, 3-12, 3-13, 3-23,
                    3-24, 5-5, 7-5, 7-7, 7-11, A-3, C-4, E-1, E-2
classification . . . 3-19, 7-2, 7-3, 7-9, 7-11, 7-12, 7-13, 7-14,
                               7-16, 7-17, 7-18, 7-19, 7-22, C-4,
                            C-5, C-8, C-10, C-11, C-14, C-15, E-1
classified information . . . . 1-6, 7-1, 7-2, 7-3, 7-4, 7-6, 7-7,
                          7-8, 7-9, 7-10, 7-11, 7-12, 7-13, 7-15,
             7-16, 7-17, 7-18, 7-19, 7-20, C-5, C-10, C-11, C-13,
                                  C-14, C-15, C-18, D-2, E-1, E-2
clearance. . . . . . 7-8, 7-10, 7-11, 7-18, 7-20, 7-22, B-6, C-3,
                            C-4, C-5, C-10, C-14, C-15, C-19, E-1
clearing . . . . . . . . . . . . . . . . . . . .2, 7-17, C-5, D-1
closed area. . . . . . . . . . . . . . . . . . . . . . .C-5, C-15
CM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-2
coercivity . . . . . . . . . . . . . . . . . . . . . . . C-5, D-2
communications security. . . . . . . . . 7-3, 7-19, B-2, B-5, C-5
compartmentation . . . . . . . . . . . . . . . . . 7-22, C-5, C-6
compartmented. . . . . . . . . . . . . . . . . . . .B-7, C-5, C-6
compliance review. . . . . . . . . . . . . . . 2-1, 2-5, 3-6, C-6
compromise . . . . . . . . . . . . . . . 7-3, C-3, C-6, C-7, C-14
compromising emanations. . . . . . . . . . . . . . C-6, C-8, C-18
COMPUSEC . . . . . . . . . . . . . . . . . . . . . . . . A-6, B-2
computer facility. . . . . . . . . . . . . . . . . . . .7-18, C-6
computer resources . . . . . . . . . . . . . . . . C-6, C-8, C-14
computer security. .1-1, 1-2, 1-3, 1-4, 1-5, 1-6, 2-1, 2-5, 3-18,
                   3-22, 3-23, 5-1, A-1, A-2, A-3, A-4, A-5, A-6,
                A-7, A-8, A-9, B-2, B-3, B-5, C-3, C-4, C-6, C-11
computer security incident . . . . . . . . . . . . 3-18, A-4, B-2
COMSEC . . . . . . . . . . . . . . . . . 7-3, 7-4, 7-19, B-2, C-5
confidentiality. . . . . . . 1-1, 1-4, 2-2, 2-11, 3-6, 3-12, 4-8,
                                                   4-12, 7-3, C-6
configuration management . .4-10, 4-11, 4-12, 7-15, A-7, B-2, C-6
confinement property . . . . . . . . . . . . . . . . . .C-6, C-17
contingency plan . . . . . . . . . . . . . . . . . . . . C-7, E-2
control zone . . . . . . . . . . . . . . . . . . . . . .C-7, C-15
controlled access. . . . . . . . . . . . . . . . . .A-6, B-1, C-7
controls . . . . . . . .2-3, 2-5, 2-10, 3-9, 5-6, 7-22, A-3, C-1,
            C-2, C-4, C-7, C-10, C-13, C-15, C-16, C-18, E-2, E-3
COTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-2
countermeasures. . . . . . . . . . . . . 7-9, A-6, B-8, C-7, C-15
covert channel . . . . . . . . . . . . . . . . . . . . . . . .C-7
covert timing channel. . . . . . . . . . . . . . . . . . . . .C-7
critical resources . . . . . . . . . . . . . . . . . . . . . .C-7
criticality. .2-3, 2-5, 2-14, 3-1, 3-4, 3-6, 3-7, 3-8, 3-9, 3-12,
                     3-13, 3-17, 4-1, 4-8, 4-10, 4-11, 4-12, 5-5,
                                         6-2, 7-2, 7-9, C-7, C-16
criticality level. . . . . . 2-5, 4-1, 4-8, 4-10, 4-11, 4-12, 7-9
criticality levels . . . . . . . . . . . 4-1, 4-8, 7-2, 7-9, C-16
criticality rating . . . . . . . . . . . . . . . . . . . . . .C-7
cryptography . . . . . . . . . . . . . . . . . . . .A-4, C-7, C-9
data encryption standard . . . . . . . . .A-2, A-3, A-8, B-3, C-8
data owner . . . . . . . . . . . . . . . . . . . . . . .3-12, C-8
data processing installation . . . . . . 1-4, 2-1, 3-15, B-3, C-8
DBMS . . . . . . . . . . . . . . . . . . . . . . . 4-12, 7-7, B-3
DCID . . . . . . . . . . . . . . . . . . . . . . . . . .B-3, C-10
declassification . . . . . .7-16, 7-17, 7-20, C-5, C-8, C-14, D-1
dedicated system . . . . . . . . . . . . . . . . . . . . . . .C-8
degaussing . . . . . . . . . . . . . . . . . .7-16, C-8, D-1, D-2
DES. . . . . . . . . . . . . . . . . . . . . . . . .A-2, A-8, B-3
destruction. . . 2-5, 3-12, 4-11, 7-5, 7-21, C-8, C-10, C-16, D-1
dial-up access . . . . . . . . . . . . . . . . . . . . 4-12, 7-10
disaster recovery plan . . . . . . . . . . . . . . . . . . . 3-15
disclosure . . . . 2-2, 2-5, 3-12, 4-7, 7-1, 7-8, 7-12, C-6, C-8,
                                                       C-11, C-16
discretionary access control . . . . . . . . . . . 7-21, A-7, B-2
DOC. . . . . . . . . . . . . . . . . . . . . . . . . . . 1-6, B-3
DoD. . . . . . . . 1-2, 1-6, 3-22, 4-7, 7-2, 7-9, 7-23, A-6, B-2,
                                       B-3, B-4, C-11, C-12, C-18
DOJ. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-3
domain . . . . . . . . . . . . . 6-2, 7-21, C-8, C-14, C-17, C-18
DPI-AISO . . . 2-1, 2-11, 3-4, 3-9, 3-11, 3-12, 3-14, 3-19, 3-20,
              3-24, 4-8, 5-5, 5-7, 7-4, 7-5, 7-6, 7-7, 7-8, 7-10,
               7-11, 7-12, 7-14, 7-15, 7-17, 7-19, 7-20, B-3, E-1
DPI-CSO. . . . . . . . . . . . . . . . . . . . . . . . . . . .B-3
DTLS . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-3
emanations . . . . . . . . . . . . . . . . . .7-9, C-6, C-8, C-18
emergencies. . . . . . . . . . . . . . . . . . . . . . . . . .5-7
emergency. . . . . . . 3-15, 3-17, 3-18, 3-20, 5-3, 5-7, B-2, C-8
emission . . . . . . . . . . . . . . . . . . . . . . . . C-5, C-8
emission security. . . . . . . . . . . . . . . . . . . . C-5, C-8
encryption . . . . 4-11, 4-12, 6-1, A-2, A-3, A-8, B-3, C-2, C-8,
                                                        C-9, C-18
EO . . . . . . . . . . . . . . 4-4, 4-7, 7-1, B-3, C-4, C-5, C-11
EPL. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-3
ETL. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-3
exclusion area . . . . . . . . . . . . . . . . . . . . . . . .C-9
FBI. . . . . . . . . . . . . . . . . . . . . . . . . . .3-19, B-3
FIPS PUB . . . . . . . . . . . . . . . . . . . . . .A-2, A-3, B-3
FIRMR. . . . . . . . . . . . . . . . . . . . .1-7, 3-24, B-3, C-9
FIRST. . . . . . . . . . . . . . . . . . . . . . . . . . 4-1, B-4
FOCI . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
FOIA . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
formal security policy model . . . . . . . . . . . . . . . . .C-9
FTLS . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
FY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
GSA. . . . . . . . . . . 1-6, 1-7, 3-22, 3-24, 5-1, 5-8, B-3, B-4
GSFC . . . . . . . . . . . . . . . . . . . . . . . 3-20, B-4, C-4
guidelines . . . . .1-2, 1-5, 1-6, 2-8, 3-1, 3-14, 6-1, 7-8, 7-9,
                               7-12, A-2, A-4, A-5, A-6, A-7, A-8
I&A. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
identification . . . . . . . 1-2, 2-3, 3-4, 3-8, 3-9, 4-10, 4-11,
                         5-5, 5-7, 7-21, A-2, A-7, B-4, C-3, C-9,
                                            C-10, C-15, C-19, E-1
identification forwarding. . . . . . . . . . . . . . . . . . C-19
IDS. . . . . . . . . . . . . . . . . . . . . . . . . . .7-13, B-4
incident . . . . . . . . . . . . .1-5, 2-3, 2-5, 3-2, 3-11, 3-18,
                           3-19, 3-20, 3-21, 3-23, 4-1, 5-5, 5-7,
               7-5, 7-8, 7-18, A-4, B-2, B-4, B-5, C-3, C-16, E-2
information owner. . . . . . . . . . . . . . . 1-3, 7-2, C-8, C-9
information resources management . . 2-1, A-2, A-5, B-3, B-4, C-9
information security . i, 1-1, 1-2, 1-3, 1-5, 2-1, 2-3, 2-4, 2-5,
                2-6, 2-7, 2-8, 2-10, 2-11, 2-12, 2-13, 2-14, 3-1,
           3-2, 3-3, 3-4, 3-5, 3-7, 3-11, 3-20, 3-22, 3-23, 3-24,
               4-4, 4-7, 4-10, 5-1, 5-2, 5-3, 5-4, 5-5, 5-6, 5-7,
                5-8, 5-9, 6-1, 7-4, 7-5, A-5, B-1, B-2, B-7, C-3,
                        C-4, C-6, C-8, C-11, C-16, C-18, E-1, E-2
installation chief of security . . . . . . . . . . . . . . . .C-9
institutional management . . . . . . . . . . . . . . . . . . .C-9
integrity. . . .i, 1-1, 1-2, 1-4, 1-5, 2-2, 2-11, 3-6, 3-12, 4-4,
                           4-8, 4-12, 7-3, 7-11, 7-15, 7-19, A-1,
                              A-2, A-4, A-5, A-6, C-3, C-10, C-18
intelligence information . . . . . . . . . . . . . . . .4-4, C-10
internal control . . . . . . . . . . . . . . . . . . . . . . .A-1
internal controls. . . . . . . . . . . .2-10, 3-9, 5-6, C-2, C-10
inventory. . . . . . . . . . . . . . . . . . . . . .4-4, C-6, E-2
IPO-AISM . . . . . . . . . . .2-11, 3-2, 3-19, 5-1, 5-2, 5-3, B-4
IRM. . . . . . . . . . . . . . . . . . . . . . 2-1, A-2, B-4, C-9
ISDN . . . . . . . . . . . . . . . . . . . . . . . . . . A-4, B-4
isolation. . . . . . . . . . . . . . . . . . . . . . . . . . C-10
ISSM . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
ITSP . . . . . . . . . . . . . . . . . . . . . . . . . . 3-4, B-4
JPL. . . . . . . . . . . . . . . . . . . . . . . . .1-4, B-4, C-4
JSC. . . . . . . . . . . . . . . . . . . . . . . . . . . B-4, C-4
KSC. . . . . . . . . . . . . . . . . . . . . . . . . . . B-4, C-4
label. . . . . . . . . . . . . . . . . . . . . . . . . .C-10, D-2
LaRC . . . . . . . . . . . . . . . . . . . . . . . . . . B-5, C-4
LCM. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-4
least privilege. . . . . . . . . . . . . . . . . . . . . . . C-10
LeRC . . . . . . . . . . . . . . . . . . . . . . . . . . B-5, C-4
life critical. . . . . . . . . . . . . . . . . . . . . . . . .4-7
life cycle . . . . . . . . . . . . . 1-1, 2-1, 2-2, 3-6, A-3, B-4
limited access . . . . . . . . . . . . . . . . . . . . . . . C-10
limited area . . . . . . . . . . . . . . . . . . . . . C-10, C-15
log. . . . . . . . . . . 4-12, 7-10, 7-13, 7-14, 7-15, 7-22, C-10
logic bomb . . . . . . . . . . . . . . . . . . . . . . . . . C-10
logs . . . . . . . . . . . . . . . . . . . . . . 3-11, 4-11, 7-15
long-range plan. . . . . . . . . . . . . . . . . . . . . . . C-11
loss . . . . . . .3-6, 3-9, 3-15, 3-17, 3-19, 4-7, 4-8, 6-1, C-3,
                                      C-8, C-13, C-14, C-15, C-16
MAC. . . . . . . . . . . . . . . . . . . . . . . . . . . A-4, B-5
management review. . . . . . . . . . . . . . . . . . . 2-14, C-11
mandatory access control . . . . . . . . . . . . . . . .7-22, B-5
MCSC . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-5
microcomputer platforms. . . . . . . . 3-2, 7-2, 7-11, 7-15, C-11
mission-critical . . . 1-1, 3-4, 3-5, 3-6, 3-9, 3-12, 3-13, 3-14,
                              3-15, 4-1, 4-7, 5-6, C-7, C-8, C-11
MLS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-5
modes of operation . . . . . . . . . . 7-8, 7-20, A-2, C-11, C-15
monitoring . . . . . . . . . . . . 3-11, 5-1, 5-5, 6-2, 7-5, C-15
MSFC . . . . . . . . . . . . . . . . . . . . . . . . . . B-5, C-4
multilevel system. . . . . . . . . . . . . . . . . . . . . . C-11
NACSI. . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-5
NASA classified. . . . . . . . . . . . . . . . . . 7-3, C-11, E-1
NASA computer systems. . . . . . . . . . . . . . .1-4, 3-14, C-11
national security information. . . . . . . . .7-1, A-6, C-5, C-11
NBS. . . . . . . . . . . . . . . . . . . . . . . . . . . A-2, B-5
NCSC . . . .7-3, 7-4, 7-5, 7-7, 7-8, 7-9, 7-11, 7-13, 7-15, 7-19,
                                   7-21, A-6, A-7, A-8, A-9, B-2,
                                              B-5, C-1, C-19, D-1
NCSL . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-5
need-to-know . . .7-8, 7-20, 7-22, C-3, C-5, C-8, C-12, C-17, E-1
network security plan. . . . . . . . . . . . . . . . . .7-8, C-12
NHB. . . . .2, i, 1-1, 1-2, 1-5, 3-6, 3-8, 3-14, 7-8, 7-10, 7-12,
                             7-13, 7-16, A-5, B-5, C-5, C-6, C-8,
                                 C-9, C-11, C-12, C-13, C-15, D-1
NIST . . . . . . . . . . . . . 1-6, 3-24, 5-8, A-9, B-3, B-5, B-7
NMI. . . . i, 1-2, 1-3, 1-5, 2-6, 2-8, 2-10, 3-1, 3-2, 3-9, 3-14,
                              3-24, 3-25, A-5, A-8, B-5, C-4, C-9
non-federal personnel. . . . . . . . . . . . . .2, 3-6, 7-8, C-12
NSA. . . 1-6, 5-8, 7-1, 7-3, 7-7, 7-8, 7-16, 7-19, A-8, A-9, B-3,
                    B-4, B-5, B-6, B-7, B-8, C-13, C-16, D-1, D-2
NSD. . . . . . . . . . . . . . . . . . . . . . . . .7-1, A-6, B-5
NSDD . . . . . . . . . . . . . . . . . . . . . . . . . . . 2, B-5
NSI. . . . . . . . . . . . . . . . . . . . . . . . . . .3-19, B-5
NSM. . . . . . . . . . . . . . . . . . . . . . . .3-18, 3-19, B-5
NSTISSAM . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
NSTISSC. . . . . . . . . . . . . . . . . . . . . . . . . 7-1, B-6
NSTISSD. . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
NSTISSI. . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
NSTISSP. . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
NTCB . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
NTISSAM. . . . . . . . . . . . . . . . . . . . . . . . . A-6, B-6
NTISSC . . . . . . . . . . . . . . . . . . . . 4-7, B-5, B-6, B-7
NTISSD . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
NTISSI . . . . . . . . . . . . . . . . . . . . . . .7-9, A-6, B-6
NTISSP . . . . . . . . . . . . . . . . . . . . . . .7-1, A-6, B-6
object . . . . . . . . . . .7-21, C-1, C-4, C-6, C-12, C-17, C-18
object reuse . . . . . . . . . . . . . . . . . . . . . . . . C-12
OMB. . . . . . . . . 1-4, 1-5, 1-6, 5-1, 5-8, A-1, A-2, B-6, C-16
operations security. . . . . . . . . . . . . . . . 3-2, 3-5, C-12
OPM. . . . . . . . . . . . . . . . . . . 1-4, 1-6, 3-22, A-2, B-6
orange book. . . . . . . . . . . . . . . . . . . . . . 7-23, C-12
password . . . .3-10, 7-11, 7-13, 7-14, 7-15, A-3, A-6, C-12, E-3
password space . . . . . . . . . . . . . . . . . . . . .C-12, E-3
PC . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6, C-11
PCL. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-6
PDS. . . . . . . . . . . . . . . . . . .7-3, 7-4, 7-19, B-6, C-14
penetration testing. . . . . . . . . . . . . . . .C-2, C-13, C-16
periods processing . . . . . . . . . . . . . . . . . . 7-12, C-13
personnel screening. . . . . . . . . . 3-6, 3-14, 3-23, 5-5, C-13
personnel security . . . . 2-10, 3-2, 3-5, 3-19, 4-10, 6-2, 7-11,
                                             7-20, A-8, C-13, E-2
physical security. . . . .2-10, 3-2, 3-5, 4-10, 7-12, 7-13, 7-15,
                                   7-18, A-2, C-5, C-9, C-13, E-2
plan . . . . 2-14, 3-4, 3-5, 3-15, 3-22, 5-1, 5-2, 5-3, 5-4, 5-6,
             5-7, 5-8, 7-4, 7-5, 7-7, 7-8, 7-10, 7-12, 7-19, B-1,
        B-2, B-3, B-4, B-7, C-7, C-11, C-12, C-16, C-17, E-1, E-2
plans. . . . 1-2, 2-5, 2-8, 2-14, 3-1, 3-5, 3-6, 3-7, 3-10, 3-15,
         3-17, 4-4, 4-11, 5-1, 5-2, 5-3, 5-6, 5-7, 5-8, 7-5, 7-8,
                                    7-10, A-1, A-2, C-7, C-8, E-1
PO-AISM. . . . . . . . i, 1-3, 1-4, 2-1, 2-8, 2-11, 7-2, B-7, C-2
policies . i, 1-3, 1-5, 1-6, 2-3, 2-5, 2-6, 2-8, 2-10, 3-1, 3-11,
                4-4, 6-1, 7-1, 7-2, 7-4, 7-8, A-6, A-8, C-1, C-2,
                                             C-4, C-11, C-14, E-1
preferred products list. . . . . . . . . . . . . . . . .A-8, C-13
privacy. . . . . . . . . . . . . . . . . . . .4-4, A-1, A-2, C-16
procedural security. . . . . . . . . . . . . . . . . . . . . C-13
program office . . . . . i, 1-3, 2-1, 2-8, 2-11, 3-4, 3-15, 3-17,
              3-20, 5-1, 5-2, 5-8, A-9, B-4, B-6, B-7, C-10, C-13
property protection area . . . . . . . . . . . . . . . . . . C-13
protect as restricted data . . . . . . . . . . . . . . . . . C-13
protected distribution system. . . . . . . . . . . . . .B-6, C-14
protection index . . . . . . . . . . . . . . . . . . . . . . C-14
protective measures. . . . . .1-4, 2-3, 2-5, 2-14, 3-4, 3-5, 3-6,
               3-7, 3-8, 3-9, 3-11, 3-12, 3-13, 3-14, 3-15, 3-17,
               3-20, 3-23, 3-24, 4-1, 4-8, 4-10, 4-11, 4-12, 5-6,
                6-1, 6-2, 7-4, 7-11, 7-12, 7-16, 7-18, 7-19, C-8,
                                                 C-13, C-14, C-15
PTR. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
purge. . . . . . . . . . . . . . . . . . . . . . . . . .C-14, D-2
qualitative risk assessment. . . . . . . . . . . . . . . . . C-14
quantitative risk assessment . . . . . . . . . . . . . . . . C-14
RAMP . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
read only. . . . . . . . . . . . . . . . . . . . . . . . B-2, C-9
recertification. . . . . . . . . . . . 2-5, 2-14, 3-6, 3-13, C-14
reference monitor concept. . . . . . . . . . . . . . . C-14, C-15
RF . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-9, B-7
RFP. . . . . . . . . . . . . . . . . . . . . . . . . . . A-8, B-7
risk assessment. . . . . . 2, 1-3, 2-1, 2-5, 2-14, 3-6, 3-7, 3-8,
                           3-9, 3-10, 3-23, 3-24, C-14, C-15, E-2
risk management. . . . . . . . . . .1-2, 2-3, 5-5, A-2, A-5, C-15
risk monitoring. . . . . . . . . . . . . . . . . . . . . . . C-15
risk reduction . . . . . . . . . . . . . . . . . . . . .1-2, C-15
RVM. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
safeguards . . . .1-2, 3-20, 7-2, 7-3, 7-12, 7-13, C-1, C-3, C-4,
                                            C-7, C-14, C-15, C-16
SAISS. . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
sanitization . . . . . . . . . . . . . . . . . . . . . .C-5, C-15
SCI. . . . . . . . . . . . . . . . . . . . . . . . . . . 7-2, B-7
security area. . . . . . . .7-19, C-5, C-7, C-9, C-10, C-13, C-15
security design review . . . . . . . . . . . . . . . . . . . C-15
security kernel. . . . . . . . . . . . . . . . . . . . . . . C-15
security modes of operation. . . . . . . . . . . .7-8, C-11, C-15
security test. . . . . . . . . . . . . . . . . . . 7-7, B-7, C-16
security test and evaluation . . . . . . . . . . . . . .B-7, C-16
security testing . .2, 3-7, 3-9, 3-10, 3-11, 5-5, 7-7, C-13, C-16
security tests . . . . . . . . . . . . . . . . . . 3-10, 7-5, 7-7
sensitive ADP position . . . . . . . . . . . . . . . . . . . C-16
sensitive information. . . . . . .4-12, 5-8, 6-1, A-1, C-6, C-11,
                                                       C-13, C-16
sensitivity. . . . .2-3, 2-5, 2-14, 3-1, 3-4, 3-6, 3-7, 3-8, 3-9,
                          3-12, 3-13, 4-1, 4-8, 4-10, 4-11, 4-12,
                      5-5, 7-2, 7-9, 7-22, C-10, C-13, C-14, C-16
SFUG . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
short-range plan . . . . . . . . . . . . . . . . . . . . . . C-16
significant. . . 2-3, 2-5, 3-2, 3-5, 3-9, 3-11, 3-13, 3-18, 3-19,
                        3-20, 3-23, 4-8, 5-5, 5-7, 6-2, 7-8, C-16
significant change . . . . . . . . . . . . . . . .3-9, 3-13, C-16
simple security property . . . . . . . . . . . . . . . .C-4, C-17
SMAP . . . . . . . . . . . . . . . . . . . . . . . . . . A-5, B-7
SPEC PUB . . . . . . . . . . . . . . . . . . . A-3, A-4, A-5, B-7
sponsor/owner. . . . . . . . . . . . . . . . . . . . . 3-17, C-17
spoofing . . . . . . . . . . . . . . . . . . . . . . . . . . C-17
SSC. . . . . . . . . . . . . . . . . . . . . . . . . . . B-7, C-4
ST&E . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
standards. . . 1-6, 6-1, 7-16, 7-19, A-6, A-9, B-3, B-5, C-4, D-1
star property. . . . . . . . . . . . . . . . . . . C-4, C-6, C-17
STS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
subject. . . . . . 4-7, 5-8, 7-21, C-1, C-3, C-4, C-6, C-8, C-10,
                                                       C-12, C-17
system designer. . . . . . . . . . . . . . . . . . . . . . . C-17
system developer . . . . . . . . . . . . . . . . . . . . . . C-17
system high. . . . . . . . . . . . . 7-15, 7-20, 7-21, C-15, C-17
system security plan . . . . 7-4, 7-5, 7-7, 7-12, 7-19, C-17, E-1
system support personnel . . . . . . . . . . . . . . . .7-8, C-17
system users . . . . . . . . . . 7-8, 7-10, 7-21, C-8, C-17, C-18
TCB. . . . . . . . . . . . . . . . . . . . .7-22, B-7, C-15, C-19
TCSEC. . . . . . . . . . . . . . . . . . . . . . . . . . A-6, B-7
technical feasibility analysis . . . . . . . . . . . . . . . C-18
telecommunications security. . . . . . . 3-2, 3-5, B-7, C-18, E-2
tempest. . . . . . . . . 7-4, 7-8, 7-9, A-6, B-7, C-6, C-13, C-18
TFM. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-7
threat . . . . . . . . . . . . . . .6-2, 7-5, 7-9, C-1, C-2, C-18
TLS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-8
TNI. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-8
TNIEG. . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-8
tranquility. . . . . . . . . . . . . . . . . . . . . . . . . C-18
trap door. . . . . . . . . . . . . . . . . . . . . . . . . . C-18
trojan horse . . . . . . . . . . . . . . . . . . . . . . . . C-18
trusted computer system. . . . . . 7-9, 7-23, A-6, A-7, B-7, C-18
trusted computing base . . . . . . . . . . . . . . B-6, B-7, C-19
trusted process. . . . . . . . . . . . . . . . . . . . . . . C-18
trusted system . . . . . . . . . . . . . . . . . . . . C-12, C-18
TSCM . . . . . . . . . . . . . . . . . . . . . . . . . . . . .B-8
U.S.C. . . . . . . . . . . . . . . . . . . . . . . . . . 4-4, B-8
USERID . . . . . . . . . . . . . 7-4, 7-13, 7-14, B-8, C-12, C-19
users. . . .2-2, 3-6, 3-11, 3-17, 4-10, 4-11, 5-5, 5-7, 6-1, 6-2,
               7-4, 7-8, 7-10, 7-14, 7-15, 7-20, 7-21, 7-22, C-8,
               C-9, C-10, C-11, C-14, C-15, C-16, C-17, C-18, E-1
verifiable . . . . . . . . . . . . . . . . . . . . . . C-15, C-19
virus. . . . . . . . . . . . . . . . . . . . . . . 1-5, 6-2, C-19
vulnerability. . . . . . . . . . . . . . . . B-2, C-3, C-16, C-19
worm . . . . . . . . . . . . . . . . . . . . .1-5, B-8, C-19, D-2



                           Appendix G,
                    FEDERAL AIS REQUIREMENTS

          (Table not reproduced in this electronic edition--
                          see hardcopy.)