SD PAMPHLET 800-11

Acquisition Management

GUIDE FOR

INDEPENDENT READINESS REVIEWS

EXPENDABLE LAUNCH VEHICLE (ELV) MISSIONS

10 MARCH 1986

DEPARTMENT OF THE AIR FORCE

HEADQUARTERS SPACE DIVISION (AFSC)

FOREWORD

Existing directives prescribe a series of program reviews to help assure mission readiness. However, circumstances remain which merit the additional precaution of an Independent Readiness Review (IRR). Review redundancy provides an exhaustive and objective check which is unavailable through any other management method. In keeping with our objectives of eliminating risks within the scope of our control, these reviews are recommended for launches which are uncommonly significant. Factors which may warrant conduct of an IRR include, but are not limited to, the following: mission uniqueness, a one-of-a-kind or first-in-a-series launch, unusual complexity or cost, a history of system or subsystem failures, incorporation of untried subsystems, problems encountered during development and testing, and significant cost overruns or schedule changes.

This pamphlet has been developed to recommend practices and serve as a guide for implementing those portions of SDR 540-15, AFSC Field Activity Management Policy, Mission Readiness, and SDR 800-12, Readiness Review Process, that are pertinent to Expendable Launch Vehicle IRRs. All IRR team members and program directors are encouraged to make maximum use of the guidance provided in this publication.

FORREST S. McCARTNEY

Lieutenant General, USAF

Commander

DEPARTMENT OF THE AIR FORCE SD PAMPHLET 800-11

Headquarters Space Division (AFSC)

Post Office Box 92960

Los Angeles CA 90009-2960 10 March 1986

Acquisition Management

GUIDE FOR INDEPENDENT READINESS REVIEWS

Supersedes SDP 800-11, 12 February 1982.

No. of Printed Pages: 43

OPR: SD/YOF (Lt Shipman)

Approved by: Lt Col Craig S. Martin

Editor: Orvil K. Ormond

Distribution: X

HQ AFSC/SD . . . 1

AUL/SE, Maxwell AFB AL 36112 . . . 1

The purpose of the Independent Readiness Review (IRR) is to minimize the risk involved in a forthcoming launch by having an independent group of experts assess the readiness for launch of the flight hardware and the appropriate supporting elements. This pamphlet provides guidelines for the scope, organization, interface with program office/contractors, review tasks, and review procedures for the conduct of an effective review. It is not intended to be all inclusive, since IRRs can vary widely in size and scope. It is, therefore, a document designed to orient an IRR team (IRRT), and to provide a framework and guide for conducting an investigation.

1. Independent Readiness Review Scope:

a. Independent Readiness Reviews (IRRs) for programs utilizing expendable launch vehicles are established under the authority of SDR 540-15 and SDR 800-12. At least 6 months prior to launch, the cognizant System Program Director and/or the Air Force Space Technology Center will send the Space Division Commander (SD/CC) (with an information copy to SD/YOF) a written recommendation for conducting or not conducting an IRR. IRRs may be conducted separately on spacecraft or launch vehicles, or combined into a single review with spacecraft and booster panels. When both the spacecraft and the launch vehicle are to be reviewed, it is normally desirable to conduct one "stacked" IRR.

b. Guidance is provided by SD/CC to the IRRT for establishing the scope of the review. The elements to be reviewed depend on the changes since previous missions, known program problems/deficiencies, and areas of design complexity and risk. The IRRT should consult with the cognizant program director during definition of the scope but, when doing so, should keep in mind the requirement for independence of the review. Paragraph 5 should be used when defining the scope.

c. The IRRT chairperson is responsible for interpreting the scope of the review for the team members and assuring that their activities remain within the scope.

2. IRR Team (IRRT) Organization:

a. The IRRT is comprised of experts selected from government and industry and is chaired by an Air Force officer, with an Aerospace Corporation representative as vice chairperson. The SD Directorate of Spaceflight Requirements (SD/YOF) recommends, for the commander's approval, an Air Force officer to chair the review. The Aerospace Corporation selects an individual to be vice chairperson. These incumbents and additional individuals selected from the IRRT comprise the executive panel. The executive panel forms technical review panels as appropriate for the program being reviewed. Additionally, one to three Air Force personnel are assigned to each IRRT to perform administrative staff functions such as document control, briefing and final report preparation. A typical IRRT organization is shown in figure 1.

Figure 1. Typical IRRT Organization

b. For the review of a spacecraft program, a typical panel structure and the suggested areas of investigation are:

(1) Mechanical. Propulsion, structures, temperature control, materials, ordnance, mass properties, environmental criteria, mechanisms, mechanical interfaces.

(2) Electronic/Electrical. Attitude control, electrical power, electrical distribution, telemetry, tracking, command, communications, sensors (except payload), electrical interfaces.

(3) Integration and Test. System test procedures, test results, configuration management, ground support equipment (GSE), launch operations, orbital operations, software, Air Force Satellite Control Facility (AFSCF) support, documentation, system interfaces, test range instrumentation support and planning, ground and flight safety, and security.

(4) Payload/Sensor. Optics, detectors, focal plane, data processing electronics, communications, payload/sensor interfaces.

(5) Product Assurance. Quality assurance/manufacturing, reliability, parts/materials/processes control, contamination/corrosion control, and system safety.

c. For the review of a launch vehicle program, a typical panel structure and its corresponding areas of investigation are:

(1) Mechanical. Propulsion, thrust vector control mechanical hardware, structures, temperature control, mass properties, ordnance, environmental criteria, materials, mechanical interfaces.

(2) Guidance/Electronics/Electrical. Guidance hardware and software, thrust vector control electronic hardware, telemetry, tracking, electrical power, electrical distribution, electrical interfaces.

(3) Integration and Test. Vehicle performance, system interfaces, system checkout procedures, configuration management, launch operations, range support, documentation, system interfaces.

(4) Product Assurance. Quality assurance/manufacturing, reliability, parts/materials/processes control, contamination/corrosion control, and system safety.

d. Depending upon the need to emphasize individual areas for one or more unique program requirements, it may be advisable to increase the number of panels for a particular review. For example, a launch and orbital operations panel might be formed instead of including these items under areas of investigation of the integration and test panel. It is strongly recommended that a product assurance panel be formed for most IRRs. The SD staff offices of the Directorate of Product Assurance (SD/PD), Deputy for Acquisition Logistics (SD/AL), and Directorate of Safety (SD/SE) should be consulted for selection of panel members.

e. There is no prescribed size or composition for particular panels. Typically, the number of personnel assigned to each panel for the duration of the review should be kept as small as possible. Additional specialists may be tasked to review specific items identified by the panel members and the duration of their involvement should be minimized.

f. The panels should consist of, as appropriate, experts from Space Division and its subordinate units, Aerospace Corporation, the user, the National Aeronautics and Space Administration (NASA), and other military and civilian organizations, as well as other qualified individuals known to have particular expertise. Where successive reviews of the same program occur, a reasonable number of the personnel used for follow-on review(s) should be drawn from the cadre participating in the previous review(s) to minimize indoctrination requirements.

g. Each panel should be assigned a chairperson who is responsible for organizing and conducting the panel review and for reporting results to the executive panel. Normally, panel chairpersons also serve as members of the executive panel.

h. The staff provides administrative support to the review team. Its responsibilities are to:

(1) Maintain a central file of all team documentation.

(2) Edit and standardize team action items and concern writeups.

(3) Act as interface between the IRRT and the program office/contractor(s) for all formal requests for information (action items).

(4) Assure that the program office and contractor(s) are informed of activities.

(5) Provide forms and supplies.

(6) Arrange for briefings and meetings.

(7) Assist the executive panel in preparing interim and final briefings to SD/CC.

(8) Prepare minutes of the IRRT briefing to the SD Commander.

(9) Assist the executive panel in preparation of the IRR final report.

The staff interacts directly with a program office representative who serves as the focal point for information exchanged between the IRRT and the program office/contractor(s).

i. The "mix", if any, of military and civilian personnel serving as members and chairpersons of the various panels is at the discretion of the IRRT Chairperson and is influenced by both technical and management considerations. In general, Aerospace team members are responsible for technical areas of the review. Military members are most frequently included in operational areas (test, integration, etc.) and for staff duties. The chairperson is responsible for identifying and obtaining military personnel to serve on the IRRT. SD/YOF will assist the chairperson in this process and will coordinate administrative actions as needed to effect the assignment of personnel from other than SD units.

j. The chairperson must ensure that all IRRT members are aware of program security requirements and hold the proper security clearances. The chairperson must also obtain team agreements to protect proprietary data when appropriate.

3. Schedule:

a. The independent readiness review should be scheduled to provide sufficient time for an effective review of the spacecraft/vehicle and for the cognizant program office to implement critical recommendations prior to the scheduled launch. The intensive review period (starting with phase II) may be 2 to 6 weeks. Review activities should be scheduled generally as follows:

(1) Phase I - Executive Panel Preparations. The executive panel should become sufficiently familiar with the program and requirements for the review to define the scope of the review and establish the team organization. A detailed schedule for all review activities should then be worked out and agreed upon among the executive panel, program office, and contractor(s) prior to start of phase II. This schedule should be adhered to as rigidly as practical to minimize impact of the review on contractor, program office, and team members. The program office makes contractual arrangements and prepares material to support the review during phases II and III. Phase I should be started approximately 1 month prior to phase II.

(2) Phase II - Kickoff Meetings. Introductory sessions should be conducted for the entire IRRT to explain the purpose of the review, present schedules, provide background data, and receive program familiarization briefings from the program office and contractor(s).

(3) Phase III - Panel Review. Parallel reviews in specific areas are accomplished by the review panels. Close coordination of activities with the executive panel occurs during the phase.

(4) Phase IV - Output Preparation. Detailed results of the various technical reviews are written on individual report forms and organized into a briefing format as discussed in paragraph 7.

(5) Phase V - Reporting. A summary briefing on the results of the review activities, with conclusions and recommendations, is provided to the SD Commander and the cognizant program director, and a final report is prepared (see paragraph 7 and SDR 800-12). When possible, the final report should be published prior to launch.

b. In planning the timing of independent reviews, the objective should be to provide findings to the program office sufficiently in advance of system shipment to the launch site or initiation of final launch preparations to allow maximum time for implementing critical recommendations. At the same time, it is useful to the review effort if the IRR is conducted at a point where the flight configuration of the spacecraft/booster has been generally fixed. Depending upon the schedule of the program being reviewed, a spacecraft IRR would most probably need to be centered around the time of the thermal/vacuum (T/V) tests, as shown in figure 2. In this example, IRRT activities prior to the T/V tests would concentrate on such areas as program familiarization, spacecraft design, previous flight problems, documentation, and training associated with orbital support, range support, etc., while the post-T/V review effort would focus on test anomalies and their resolution. Similarly, booster review activities should be arranged, where practical, to allow the team to participate in the contractor mission readiness review, which is scheduled by the program office approximately one month before launch. A typical launch vehicle IRR schedule is shown in figure 3. In determining a suitable schedule, the chairperson must weigh the convenience and desirability of keying the review to such events against the constraints of minimizing impact on program office/contractor activities and of maximizing the time available for correcting problems identified in the review.
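The back-scheduling sketch below is offered only as an illustration of the timing guidance above; it is not part of the prescribed procedure. The durations used (about one month of phase I preparations, a 2- to 6-week intensive period per paragraph 3a, and an assumed reserve of several weeks between delivery of findings and launch) are assumptions chosen for the example, and actual dates must be negotiated with the program office and contractor(s).

```python
from datetime import date, timedelta

def rough_irr_schedule(launch: date,
                       intensive_weeks: int = 6,
                       weeks_reserved_for_fixes: int = 8) -> dict:
    """Illustrative back-scheduling of IRR phases from a planned launch date.

    All offsets are assumptions for this sketch; the pamphlet only states that
    phase I begins about one month before phase II and that the intensive
    period (phase II onward) runs roughly 2 to 6 weeks.
    """
    findings_delivered = launch - timedelta(weeks=weeks_reserved_for_fixes)
    phase_v_start = findings_delivered - timedelta(weeks=1)            # briefing and final report
    phase_ii_start = phase_v_start - timedelta(weeks=intensive_weeks)  # kickoff meetings
    phase_i_start = phase_ii_start - timedelta(weeks=4)                # executive panel preparations
    return {
        "Phase I start": phase_i_start,
        "Phase II start": phase_ii_start,
        "Phase V start": phase_v_start,
        "Findings delivered": findings_delivered,
        "Launch": launch,
    }

# Example: working back from a launch planned for 1 October.
for milestone, when in rough_irr_schedule(date(1986, 10, 1)).items():
    print(f"{milestone}: {when.isoformat()}")
```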

Figure 2. Typical Spacecraft IRR Schedule (phases I through V - Executive Panel Preparations, Kickoff Meetings, Panel Reviews spanning the spacecraft T/V tests, Output Preparation, and Reporting - shown against months prior to launch, roughly 5 to 3 months before launch)

Figure 3. Typical Launch Vehicle IRR Schedule (phases I through V - Executive Panel Preparations, Kickoff Meetings, Panel Reviews including the contractor mission readiness review, Output Preparation, and Reporting - shown against months prior to launch, roughly 3 to 1 months before launch)

c. After presentation of the IRRT findings and recommendations to the SD Commander and the program office, the team should be disbanded. Upon publication of the final report, the team's activities cease, except that the chairperson may be called upon for subsequent commentary on the IRRT findings and recommendations. The team has no further responsibility with respect to launch readiness unless directed by the SD Commander. The follow-up and closeout of all concerns are the responsibilities of the program office.

4. Program Office/Contractor Support:

a. The program office assists the IRRT by supplying an indoctrination briefing, requested documents, contractual authorization for contractor support, a contact point for information exchange (see paragraphs 2 and 6), and other assistance when requested. The indoctrination briefing should include a discussion of known program problems/discrepancies, including the disposition of previous review actions and concerns. If the IRRT is to visit the launch base and/or the mission control centers, the program office should coordinate the necessary arrangements.

b. The program office ensures that the contractors and vendors give the review all necessary support. The support required of the contractors is defined by the executive panel and discussed in meetings with, and documented in correspondence to, the program office. This should be accomplished at least 2 weeks prior to the start of phase II, giving the contractor adequate time to prepare the required material. Generally, the contractor support includes the following:

(1) Indoctrination briefing/review data package. It is advisable to request that the contractor set up a centralized liaison office where a complete, properly catalogued set of pertinent, up-to-date documents is available. A list of typical items to be requested of the contractors for the indoctrination briefing/review data package is provided in Attachment 1.

(2) Support of panel working group meetings by appropriately qualified personnel.

(3) Collecting and furnishing copies of requested information.

(4) Provision of meeting facilities.

c. During phases II, III, and IV of the review, specific requests to the program office for contractor or vendor information are documented by panel members on AFSC Form 1746, Readiness Review Action Item, with desired suspense times, and are approved by the appropriate panel chairperson and the executive panel. The staff processes these action items by entering them in a status log, forwarding them to the program office contact, and following up to ensure their completion by the program office.

d. While an effective review requires that the independence of the IRRT be maintained, substantial coordination and cooperation between the IRRT and the program office are also necessary. It is recognized that extraordinary support by the program office/contractors/vendors might interfere with their capability to conduct required program affairs. The executive panel should work with the program office and program director to minimize IRRT impact on resources and schedules. In addition, the IRRT should coordinate activities with reviews being conducted by the program office to minimize duplication of effort by contractor and review team support personnel.

e. The program office whose payload is being reviewed is responsible for all costs associated with the independent review (SDR 800-12). These costs may include:

(1) Aerospace Corporation Costs. Manpower funding is accounted for with a separate job order number to allow cost traceability.

(2) Travel Costs. Temporary Duty (TDY)/per diem for both military and nonmilitary team members. Cost accounting is provided to identify IRR travel costs separately.

(3) Contractor Support Costs. All charges and contract modifications resulting from contractor support for IRR investigations.

(4) Civilian Consultant Costs. Funding for expertise purchased from commercial firms to augment the IRR investigation.

5. Review Tasks:

a. Reviews are generally broken down into spacecraft reviews and launch vehicle reviews. Both types of reviews may include other elements which require investigation. In general, the two types of reviews include the following:

(1) Spacecraft Reviews:

(a) Spacecraft

(b) Ground support equipment (GSE)

(c) Ground and airborne software

(d) Orbital support

(e) Satellite support ground stations (AFSCF, program unique)

(f) Orbital operations

(g) Test support equipment

(h) Experiment packages (as required)

(2) Launch Vehicle Reviews:

(a) Launch vehicle

(b) GSE

(c) Ground and airborne software

(d) Launch complex

(e) Range support

(f) Booster support ground stations

(g) Launch operations

(h) Test support equipment

Note that the elements above are included under the various review panels suggested in paragraph 2. Attachment 2 provides a checklist of potential items to be reviewed with each element above. The checklist is not all inclusive and is intended only to begin and guide individual investigations. Also, the program being reviewed may have unique elements, or the review effort may suggest new or additional checklist items. These should be transmitted to SD/YOF for inclusion in revisions of this pamphlet.

b. The extent of the review effort depends on hardware design and development stage of the program, hardware performance history, resources available for the review, changes since previous reviews, and the scope of previous reviews. Where no previous review has been accomplished, and where no flight experience exists, the readiness review may expend more effort than otherwise would be required examining critical design areas, the adequacy of the test program, and adequacy of preparations for launch and flight operations support. Where flight experience exists, and where adequate readiness reviews have been previously accomplished, the effort can be concentrated on review of program changes/discrepancies since the previous review and adequacy of the response to previous review recommendations. A guide for elements to be included and the depth of review is provided in figure 4.

Figure 4. Guide for Elements and Depth of Review
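As a notional aid only, the two cases described above can be restated as a simple decision sketch. The function name and the return strings are invented for this illustration; the actual determination (figure 4) also weighs hardware maturity, failure history, available resources, and the scope of previous reviews.

```python
def suggested_review_emphasis(has_flight_experience: bool,
                              had_adequate_prior_irr: bool) -> list[str]:
    """Notional restatement of the depth-of-review guidance in paragraph 5b."""
    if not has_flight_experience and not had_adequate_prior_irr:
        # New system, never reviewed: examine the design and test program broadly.
        return ["critical design areas",
                "adequacy of the test program",
                "preparations for launch and flight operations support"]
    if has_flight_experience and had_adequate_prior_irr:
        # Mature system with prior reviews: concentrate on what has changed.
        return ["program changes/discrepancies since the previous review",
                "adequacy of responses to previous review recommendations"]
    # Mixed cases fall between these extremes and are scoped by the executive panel.
    return ["scope determined case by case by the executive panel"]
```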

6. Information Exchange:

a. Requests for information from the program office or contractors are documented by team members on AFSC Form 1746, Readiness Review Action Item. Concerns generated by team members as a result of the review effort are documented on AFSC Form 1747, Readiness Review Report. Instructions on the use of these forms are included on the following pages. The executive panel reviews and approves all action items and concerns before they are forwarded to the program office. The staff is responsible for administratively tracking the status of action items and concerns, distributing AFSC Forms 1746 and 1747 to the program office through the established contact point, and obtaining the program office's response. The program director and the IRR chairperson should agree, at the outset of the IRR, on a mutually acceptable standard suspense for response to IRRT requests. Periodic meetings between the program office and IRRT management should be held to review action items and concerns to which the program office cannot respond expeditiously.

b. AFSC Form 1746. This form is used for documentation and administrative handling of all IRRT requests to the program office for information or specified action by the program office or contractor in support of the review effort. It should be treated as a working document; there is no requirement to retain these forms on file after the IRR has been completed. Handwritten entries are encouraged.

(1) The upper half of the form is for use by the IRRT. An individual team member initiates the action item by filling in the SUBJECT, REQUESTED BY, and ACTION REQUESTED blocks and entering a SUSPENSE DATE by which the information or action is required. The action item is then reviewed by the panel chairperson and the executive panel before it is referred to the staff for processing and submission to the program office contact point. Executive panel review of all action items is required to assure the request is consistent with IRR objectives, avoid duplication of requests from the various panels, and provide for management control of the suspenses assigned. This review is indicated by the chairperson or other designated executive panel member, who signs under CERTIFICATION OF EXECUTIVE PANEL REVIEW. The staff, who should maintain a status log for all action items, assigns a control NUMBER and records the TRANSMITTAL DATE when forwarding the action item to the program office. Space provided for indicating when the action item was OPENED and CLOSED may be used by the originator or the staff as desired. (A notional sketch of such a status log appears after subparagraph (2) below.)

(2) The lower half of the form is used by the program office in making a response. Space is provided to indicate the office/individual to whom the action is assigned as well as the office/individual who actually prepares the response. Space is also provided to reflect coordination by appropriate program office or contractor staff agencies and management.
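The sketch below is a notional illustration of the staff's status log for action items; the paper form remains the official record, and the field names are assumptions chosen to mirror the blocks described in subparagraph (1).

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ActionItemLogEntry:
    """Notional status-log record for one AFSC Form 1746 action item."""
    number: str                               # control number assigned by the staff
    subject: str
    requested_by: str
    action_requested: str
    suspense_date: date                       # date by which the response is needed
    executive_panel_certified: bool = False
    transmittal_date: Optional[date] = None   # when forwarded to the program office
    opened: Optional[date] = None
    closed: Optional[date] = None

def overdue(log: list[ActionItemLogEntry], today: date) -> list[ActionItemLogEntry]:
    """Return open action items whose suspense date has passed."""
    return [item for item in log if item.closed is None and item.suspense_date < today]
```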

c. AFSC Form 1747. This form is used for documentation and administrative handling of concerns raised by the IRRT as a result of the review effort. The IRRT's findings and recommendations for correcting the concern, as well as the executive panel's assessment of the level of risk posed by the concern, are recorded on the front of the form. Space is provided on the back of the form to document the program office's response to the concern and the IRRT's evaluation of that response.

(1) Section I, IRRT FINDING. Information blocks in this section are self-explanatory. A NUMBER is assigned by the staff according to administrative procedures determined by the executive panel.

(2) Section II, EXECUTIVE PANEL EVALUATION. The executive panel evaluates the concern and, upon approving it, assigns a level of risk with a recommendation for timing of the action. These assessments are, to a large extent, subjective; for each IRR, the executive panel should devise criteria suitable to the particular requirements of the program and mission under review.

(a) Risk definition should be based on considerations of mission impact and probability of occurrence. The following formula is suggested (see the illustrative sketch after these definitions):

Serious impact, high probability = HIGH RISK

Degraded mission, high probability = MEDIUM RISK

Serious impact, low probability = MEDIUM RISK

Degraded mission, low probability = LOW RISK
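Restated as a simple lookup, purely as a notional sketch: the impact and probability labels below are paraphrased from the definitions above, and the executive panel remains free to tailor the criteria for each mission.

```python
def assess_risk(mission_impact: str, probability: str) -> str:
    """Notional lookup of the suggested risk formula.

    mission_impact: "serious" (serious mission impact) or "degraded" (degraded mission)
    probability:    "high" or "low" probability of occurrence
    """
    table = {
        ("serious", "high"):  "HIGH RISK",
        ("degraded", "high"): "MEDIUM RISK",
        ("serious", "low"):   "MEDIUM RISK",
        ("degraded", "low"):  "LOW RISK",
    }
    return table[(mission_impact.lower(), probability.lower())]

# Example: serious mission impact with low probability of occurrence.
assert assess_risk("serious", "low") == "MEDIUM RISK"
```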

(b) The recommendation to FIX BEFORE LAUNCH or FIX LATER should be based primarily on mission needs but should also take into account the cost and the schedule impact of the corrective action.

(3) Section III, PROGRAM OFFICE RESPONSE. This section is used by the program office to record its position, as initially formulated or subsequently revised, on the concern and IRRT recommendations. Space is included to identify the program office/contractor author of the response and for coordination by program office management.

(4) Section IV, IRRT DISPOSITION. The panel which initiated the concern reviews the program office response; an IRRT evaluation of its adequacy, as approved by the executive panel, is then entered in the space provided. If satisfied that no further action by the program office is necessary, the executive panel indicates that the concern is CLOSED. Otherwise, the disposition of the concern is OPEN until all agreed upon actions by the program office are completed or, if disagreement remains, differences are resolved by the SD Commander.

(5) The completed AFSC Forms 1747 constitute a major portion of the IRRT final report. As published in the report, they should reflect the most recent status of all concerns at the time the report is published. While the IRR is in progress, the report form should be treated as a working document intended to facilitate the exchange of views between the IRRT and program office. The orderly and timely completion of these forms as the information becomes available minimizes the final effort of report preparation and aids the program office in complying with recommendations of the IRRT. Finally, it is important that all valid concerns be documented, including those that are raised and later satisfactorily resolved; this record can save considerable time for subsequent IRRTs.

7. OUTPUT. The output of the IRRT consists of the following:

a. AFSC Forms 1747 recommending program office action. These are furnished to the program office as soon as they are validated by the executive panel.

b. Briefing to the SD Commander, the program director, and Aerospace program office personnel. The briefing should:

(1) Summarize the scope of the IRR.

(2) Provide an assessment of mission readiness.

(3) Describe those concerns which are considered launch limiting, with an assessment of the risk if not corrected and the IRRT's recommendations for corrective action. (Other concerns, not considered launch limiting, may be included at the discretion of the IRRT chairperson.)

(4) Include an opportunity for response by the program office to the concerns briefed by the IRRT.

c. Minutes of the IRR team's briefing to the SD Commander. (Minutes will be coordinated with and copies furnished to the program office and SD/YOF.)

d. A final report, with copies furnished to the program director, SD/YOF, and the Aerospace Corporation program office system engineering director. The report should include:

(1) A description of the charter, objective, scope, and tasks of the IRR.

(2) A listing of team members and their specialty areas.

(3) A very brief summary of team activities.

(4) A collection of AFSC Forms 1747 generated by the IRRT.

(5) A hard copy of the final briefing to SD/CC (including program office briefing charts addressing IRRT findings).

(6) The IRRT recommendation on readiness of the mission to proceed.

e. Feedback to SD/YOF identifying "lessons learned" concerning the IRR process.

8. SD/YOF RESPONSIBILITIES. The SD Directorate of Spaceflight Requirements (SD/YOF) is the overall monitor of the SD readiness review process. SD/YOF's responsibilities with respect to independent readiness reviews include the following:

a. Nominating candidates to the SD Commander to chair the IRR.

b. Assisting, if requested, the chairperson in obtaining military personnel to serve on the IRRT.

c. Providing or securing executive and administrative support as required.

d. Maintaining a file of documentation pertinent to IRRs including final reports.

OFFICIAL

FORREST S. McCARTNEY, Lt Gen, USAF

Commander

CAROLYN O. UECKER, Maj, USAF

Director of Administration

ATTACHMENT 1

TYPICAL ITEMS FOR INDOCTRINATION BRIEFING/REVIEW DATA PACKAGE

I. Indoctrination Briefing

a. Describe baseline system and subsystems in reasonable detail. Include discussion of recent changes in baseline, if any.

b. Review test philosophy, test flow and test facilities. Discuss recent changes, if any. Compare environmental test levels with those called out in Military Standard, MIL-STD-1540A, Test Requirements for Space Vehicles. Include a discussion of methods for analyzing and trending test data. Describe interaction of design engineering with test procedures and data analysis.

c. Discuss methods to ensure that GSE is compatible and current with vehicle and launch site facilities.

d. Review failures, results of failure analysis, and corrective action for current vehicle design/configuration. This should include all significant flight and system level ground test failures for pertinent flights to date as well as system level ground test failures for hardware being reviewed.

e. Survey single point failure analysis and failure mode and effects analyses conducted. Provide a list of all uncorrected single point failures.

f. Explain any significant preliminary design review/critical design review/first article configuration inspection open items and waivers.

g. Discuss predicted failures, if any, to meet specification performance.

h. Describe quality contract requirements and implementation with emphasis on configuration control, failure disposition methods, and piece part quality control.

i. Identify the baseline for launch and deployment activities and discuss recent changes, if any.

j. Provide data on remaining life of time-cycle critical components.

k. Discuss quality assurance (QA) audits, both contractor internal and those conducted by the government.

l. Review nonconforming material reports (waivers and deviations per DOD-STD-480A, Configuration Control - Engineering Changes, Deviations and Waivers):

(1) Attributable to design.

(2) Attributable to quality of work.

m. Review program peculiar security classification guidance.

II. Review Data Package

Provide:

a. Technical description/engineering analysis reports describing system/subsystem.

b. Selected chronological history of each black box since start of acceptance testing at the manufacturer, listing failures and corrective actions.

c. Test discrepancy and failure analysis reports.

d. Test plans and procedures for system level tests.

e. Safety Hazard Analysis (HA) and Accident Risk Assessment Report (ARAR).

f. Launch Base Test Plan.

g. Program Requirements Document (PRD).

ATTACHMENT 2

CHECKLIST FOR INDEPENDENT READINESS REVIEW

Section I - General Program Overview
Section II - Program Software
Section III - Spacecraft
Section IV - Booster
Section V - Launch Vehicle Integration
Section VI - Launch Base Processing
Section VII - Ascent and Orbital Support

SECTION I: GENERAL PROGRAM OVERVIEW

A. Mission Objectives

B. Roles and Responsibilities of Participants

C. System Overview

D. Major Changes

E. Master Schedule Changes

F. General Review Items Not Oriented to Specific Subsystems

A. Mission Objectives.

What are the general mission objectives of the program? How were these mission objectives converted into overall system requirements?

B. Roles and Responsibilities of Participants.

Who are the prime contractor and government participants?

What is the overall management structure?

How are requirements allocated between program segments?

How is coordination between participants handled? Who has final approval authority?

Who has the system responsibility? Who has launch vehicle integration responsibility?

Have previous IRRs been accomplished on the system? What were their findings/dispositions?

C. System Overview.

Obtain a brief system overview which includes all major program segments of the specific hardware being reviewed (including booster, space vehicle, GSE/facilities, orbital operations, ascent).

What are the overall system performance requirements? What are the overall performance and design requirements for each program segment?

What is the planned on-orbit system operation?

How are system tradeoffs made between program segments? What segments have flown before?

What is the flight and ground test failure history?

Are portions of the program using other programs as a baseline? What segments are involved? How much dependence is placed on the baseline program? Have changes been made in either the program or the baseline which invalidate this approach?

D. Major Changes.

What changes have been made to the present configuration baseline? What major changes were made to existing program segments or major subsystems (e.g., engine change, propellant subsystem change)?

Review system interface control documentation for problems and impact of changes.

Were program requirements changed from the originally defined program? What was the reason for the change (e.g., were test levels changed because of difficulty getting a unit through test)?

Were major changes made after a critical program milestone (e.g., loads analysis), and was the analysis adequate?

E. Master Schedule Changes.

What is the overall master program schedule? What is the overall master schedule for each program segment (e.g., booster, spacecraft, upper stage, remote tracking sites)?

How is the present program schedule different from the original schedule? What are the reasons for deviation from the original schedule? What compromises were made? Did these include technical compromises? Was adequate analysis/test performed to verify the changes?

F. General Review Items not Oriented to Specific Subsystems.

What was the technical completeness of failure modes and effects analysis including single point failure determination, elimination, and assessment of risk associated with remaining single point failure?

Were hardware test methods in compliance with MIL-STD-1540A? (Even though the program may predate MIL-STD-1540A, implications of deviations should be examined.) What were the test data collection/analysis techniques? Were there critical functions that were not checked at the launch base? What is the adequacy/status of test procedures?

What is the adequacy/status of internal reviews or audits which have been completed or scheduled for this mission? What failure analyses and corrective actions were performed?

Is the program in compliance with SD Commander's Policies (SDR 540 series)?

SECTION II: PROGRAM SOFTWARE

A. Performance

B. General

A. Performance.

What are the software requirements and specifications (in actual operations and against simulated scenarios)? What specification deviations and waivers have been applied? What stress testing has been accomplished?

What has been software availability (uptime record)? Are there any performance limitations? What is the impact of input data overload? Are there any plans for changes (what, when, why, how will they be tested)?

B. General.

Review validation/certification status. Is there detailed validation of software for flight-unique aspects (such as orbital characteristics, vehicle characteristics, mission requirements, and relationship to/interaction with other active flights)?

What were software design assumptions and testing? Review version identification and configuration control.

Has operator interface been clearly established? Are there verified procedures and manuals?

Has data base readiness been reviewed for completeness, thoroughness of checking, sensor calibration, command repertoire validation, and method for incorporating updates?

Are the documents which describe the vehicle and mission parameters under configuration control? Do these documents contain all parameters from which the guidance logic assemblies and constants are derived? When are the flight program assemblies and flight program and constants tapes placed under configuration control?

SECTION III. SPACECRAFT (Including Associated ASE)

A. Overview

B. Test Philosophy

1. Overview

2. Piece Parts - Procurement and Screening

3. Box Level Tests

4. System/Subsystem Tests

5. Retest

C. Design

1. Overview

2. Electrical Design

3. Mechanical Design

4. Attitude Control Design

D. Product Assurance

1. Quality Assurance/Manufacturing

2. Reliability

3. Parts, Materials, and Processes Control

4. Corrosion and Contamination Control

5. System Safety

E. Configuration Control

F. Environments

A. Overview. Obtain a briefing on the overall design and operation of the spacecraft. This briefing should include the salient features of the design and the intended on-orbit operation to satisfy mission requirements. The following questions should be asked:

What elements of the spacecraft are critical, in the sense of having little design margin over requirements?

What use is made of existing hardware? What is the history of that hardware?

What hardware required new development?

What hardware represents state-of-the-art design?

B. Test Philosophy.

1. Overview:

What are the test environments and how were they derived? Has test software been validated?

What is the sequence of system/subsystem testing?

How are the qualification and acceptance levels related to flight?

Is there a logical flow in test from box level to system level?

How does the test program compare with the requirements of MIL-STD-1540A? What were specification deviations and waivers?

What flight failures have occurred? What were the resulting changes/testing/dispositions?

2. Piece Parts - Procurement and Screening:

What is the overall piece part procurement and testing philosophy? What procurement specifications are used? Is a monitored line used?

What incoming inspection/screening/testing is performed?

What is the parts control system?

Were there any hardening requirements?

What new, commercial, or non-specification parts were used?

What were the parts problems experienced by the program?

3. Box Level Tests:

a. Qualification:

What is the qualification status of all boxes? Which boxes were qualification tested and which were qualified by similarity?

What problems/failures were experienced in qualification and what was their resolution?

What were the time and levels of qualification? What is the hot vibration requirement?

b. Acceptance:

What is the acceptance status of the boxes?

What were the problems/failures for flight boxes and spares? What were the levels and time for acceptance?

What is the hot vibration requirement for acceptance?

4. System/Subsystem Tests:

What development tests were performed? What was their purpose and what were the results? When were these tests performed with respect to major program milestones?

What qualification tests were performed at the system/subsystem level? What were the results of these tests? Were there any configuration design or environment changes after these tests? How were these changes resolved with respect to the qualification status?

What was the system/box test interaction? (If a box is changed out part way through system testing, what significance does this place on system tests accomplished prior to the change-out?)

What acceptance tests were performed? What were the results?

What additional testing is to be performed?

Will the electrical signature of the spacecraft be established before launch?

5. Retest:

What are the requirements for retest, after failure and/or after minor redesign?

What were test discrepancies/extent of resulting rework/adequacy of associated retest? Were any waivers or deviations granted?

C. Design.

1. Overview:

What are the critical design areas and related margins? What is the significance of single point failures in the design?

If flight experience exists, have there been design changes or modifications since the previous flight? What was the purpose of these changes? Was there an impact on interfaces, single point failure susceptibility, or adequacy of retest? (If no previous flight experience exists, this item should include all changes since the Critical Design Review.)

2. Electrical Design. The purpose of these questions is to assure that the design meets the performance criteria. The questions should encompass a review of the design criteria, a review of the changes among the engineering model, the qualification model and the flight model, and an understanding as to why changes were made. Also, conduct a review of the single point failures and understand why these risks have been considered acceptable. Review the redundancy and the margins that have been incorporated into the design. Review the design waivers.

a. Power and Electrical:

(1) Battery.

What are the type and flight history of battery?

What are the battery capacity and test history?

What is the battery conditioning philosophy?

How was the battery storage life verified?

What is the battery build history and what are the storage procedures?

(2) Solar Arrays.

What are the type and flight history of solar arrays?

What performance testing was performed on solar arrays?

(3) Life.

What assumptions were used to predict end-of-life performance for the power system?

What is the compatibility of depth of discharge requirement with the life requirement for batteries?

(4) Distribution.

What is the power conditioning philosophy - box, bus, or both?

What is the power distribution fusing philosophy?

How is state of charge of battery determined both on pad and on-orbit?

What is the grounding philosophy? Is single point ground used throughout the spacecraft and with the upper stage and ASE?

If Teflon wire is used, what special procedures are used to prevent problems as a result of cold flow?

Do all connectors meet Military Specification requirements?

(5) Reliability.

Was there a failure modes and effects analysis (FMEA) for power and electrical subsystem?

Has the power loading profile been calculated? Does testing include this profile? Were contingencies included? What are the contingencies for power subsystem failures? What planning has been performed?

(6) Electromagnetic interference (EMI) and electromagnetic compatibility (EMC) requirements.

How is compliance determined?

What factory and launch base tests are performed?

b. Telemetry and Command Design:

What is the command rate? Is an encryption system used? Is authentication used? How has redundancy been incorporated?

c. Interfaces

What are the interface criteria - How are they tested?

How is redundancy across interfaces established - How was it designed? How was it tested? Is redundancy tested with failed units simulated?

How was the Communication Security interface verified?

3. Mechanical Design.

a. Loads:

(1) Overview

What were the preliminary design loads factors?

What is the loads analysis plan?

How many loads cycles were performed? When were these loads cycles performed with respect to major milestones? Was a ground vibration test and static loads test performed? What method was used? Who has primary responsibility for loads calculation? Does the spacecraft contractor have the capability to calculate loads for the spacecraft? What is the current status of the loads effort?

(2) Models

What are the size and complexity of the spacecraft models for each cycle?

Was independent modeling done?

(3) Events Analyzed

Were all events (ignitions/shutdowns) analyzed? If not, why not?

Which events are critical?

(4) Forcing Functions

How were the forcing functions generated? Were they multiple forcing functions or a composite? How were the forcing functions verified?

(5) Methods

What interface stiffness was assumed?

Were the loads combined (worst case) event phased or time phased?

b. Structural Integrity:

What are the size and complexity of the loads transformation matrix? Was it independently validated?

How did the loads compare with the design capability of the members? What were the margins of safety for each member?

Did the static test envelop all member loads? What were the margins? How was the test performed? How was each member loaded (that is, percent of design load)?

Is there a fracture control plan according to SD-YV-0068, Fracture Control Requirements for DOD Payloads?

How was the ground vibration survey run? What were the test criteria? What spacecraft was used? How did the results compare with the model? When was it run with respect to other program milestones (that is, manufacturing of the flight vehicle or the static loads test)?

c. Mechanisms:

(1) Requirements/Design

Have single point failures been avoided?

Do deployables have adequate clearance margin? For deployables, has the design allowed testing in one g?

Does the design provide adequate force/torque margin?

Have backup modes and/or fail-safe features been properly considered?

(2) Verification of Design

Have all the proper environments been tested? For deployables that require "zero g" supports for functioning, have the supports been proper?

Has a life test been successfully performed?

(3) Hardware Acceptability

How have force/torque margins been determined/demonstrated? Have proper controls been used in protecting the mechanism from debris, oxidation, etc.?

Has proper functioning been demonstrated with the proper environments?

(4) Explosive Ordnance

Has a proper lot acceptance test program been used?

Have storage life limits been specified and is there a good basis for the limits, or is there other adequate storage life control?

Has all ordnance been stored properly?

What ordnance, if any, will exceed storage life prior to use and what provisions have been made by the payload program office for requalification?

d. Launch Vehicle Interfaces:

(1) Mechanical

Is there a flatness/planarity requirement?

Is there a loads peaking requirement - how is it satisfied?

What is the allowable shimming? Is the interface plane the field joint?

Who provides and installs the interface hardware?

Was an interface test or interface tool used to verify the integrity of the interface?

(2) Separation System

Who did the separation analysis? Was it independently verified? Was there a margin analysis (worst case) for the squibs?

Was there an all up system separation test?

What are the tip off requirements?

What margin does the analysis show for tip off? What clearances are available at separation?

Are the springs balanced? Who controls the spring selection?

Are the springs stored in compression? If so, does the analysis take this into account?

How is the flight separation system tested, in the factory, and on the pad?

How is the separation sequence initiated? Is it redundant? What are the criteria? Are there backup commands? Does the power go through a separation connector?

What is the booster control state at separation? Was it considered in the separation analysis?

Is there any potential conflict between the primary separation command and the backup command? Is a collision avoidance maneuver required?

What are the post separation clearances? Was upper stage engine tailoff considered?

(3) Interface Signals

What signals cross the interface? Are they redundant?

Are backup signals carried through separate connectors?

Can a failure condition (for example, a short) on the spacecraft side adversely affect the upper stage or orbital mission? Is this a single point failure condition?

Are there any race conditions that could cause problems?

e. Thermal Control:

Have temperatures to all critical locations been predicted by analysis?

Was there an adequate temperature verification test program?

Are there adequate margins for qualification and acceptance?

Are heaters/thermostats for all critical equipment fully redundant? Will failure of a redundant device allow sufficient margin?

Has there been adequate allowance for degradation of thermal surface characteristics, that is, solar absorption?

Are heaters/thermostats checked out on the equipment so that they or other equipment are not overheated or undercooled?

When is the last time that heaters/thermostats are checked?

f. Propulsion:

(1) Overview

What are the propellant requirements for attitude control, station keeping, and orbit adjust? Is there adequate propellant margin?

What is the overall propulsion system design? Is the system redundant? Can it survive any one line break or failed valve? Are commandable work-arounds available?

(2) Thrusters

Is the design consistent with the expected duty cycle and life?

Are the thrusters redundant in each plane?

(3) Latching Valves

What is the qualification status?

Are the latching valves normally open or closed?

(4) Tanks and Tubing System Safety/Pressure Testing

Has this system flown before? Is the tank qualified?

Are the end-of-life pressurants adequate?

Does the system comply with the Commander's Policy on pressure vessels?

(5) Temperature Control

Is sufficient thermal control of the propellant provided at the tanks, lines, valves, and thrusters? Is it redundant?

4. Attitude Control Design.

a. Overview:

Has the basic control system design flown before?

Based upon control system configuration, is the control system completely redundant?

What are the control system interfaces (power, thermal, payload, TT&C)?

What are the control system performance requirements (pointing accuracy, maneuver requirements, life, pointing references, sun, earth etc.)?

Have the payload dynamics been identified? How have interface changes been identified (payload dynamics, mass properties, etc.)? Have the control system dynamics been verified by flight and/or an independent analysis?

Does the spacecraft have flexible appendages and, if so, are they included in the dynamic analysis?

Have the dynamics during failure been analyzed? Have all operating modes been analyzed?

Can the control system be tested end-to-end?

Are the mass properties critical? How are they verified?

What contingency modes have been identified? What ground control interface is required to handle recovery from failure? What development tests have been conducted to demonstrate control system performance?

Has the mission sequence been tested for the control system including contingency modes?

b. Sensors:

Have the attitude sensors flown previously? Are the sensors completely redundant?

Have potential sensor interference problems been analyzed such as moon in field of view of earth sensor, reflections off spacecraft appendages into sensor, etc.?

Have sensor requirements been defined for all possible spacecraft orientations including failure conditions?

Does the payload provide attitude sensing information - if so, has the interface been defined?

Has the switching between redundant attitude sensors been tested? Are the sensors and control electronics cross-strapped? If so, have all possible combinations been checked?

c. Actuators:

Have dynamic requirements of control actuators been defined (thrust valves, reaction wheels, etc.)?

For dual spin vehicles, what are the requirements for the despin motor bearing and slip-ring assembly?

What are the torque margins on the actuators? Have the margins been demonstrated by test?

What are life requirements of actuators? Have life tests been conducted?

Are actuators redundant? Have all actuator failure modes been identified?

Are there any unique performance requirements such as torque variation (noise, friction torque, etc.)? Have these performance requirements been demonstrated over the life requirements?

d. Control Modes:

Have all unsafe control system modes been identified in the vehicle constraints document?

How are improper commands identified?

Does the system design have a safe mode in the event of component failure? Are control mode transitions automatic or controlled via the ground? What provisions or contingencies are provided for in the event of failure during the mode transition?

Can the vehicle be recovered from a tumble mode? Have the subsystem interfaces been defined for tumble vs. uncontrolled mode before recovery?

Do control modes require ground intervention? Has program software been tested to verify these modes work? What telemetry and command access are available in each control mode?

Is there any way vehicle access can be lost by getting into a dynamic mode in which antenna coverage is inadequate?

e. Stability Analysis:

Have system stability and dynamic performance been verified for all operating modes and reasonable ranges of parameters?

Were the component, vehicle, and environmental models used in the analysis adequately detailed and correct? Were the component models verified by test? Were all backup modes analyzed? What was the basis for model verification, that is, analysis, test, etc.?

Was a mission timeline analyzed?

How were the mass properties determined or obtained? How were the structural dynamics obtained? Was an independent analysis performed by a second agency?

Have the results of the contractor's and independent agency's analyses been compared?

f. Attitude Control Accuracy Requirements:

What are the attitude control system (ACS) accuracy requirements?

Are the ACS accuracy requirements defined for all mission modes? Does the payload set the ACS accuracy requirements? How has the ACS performance accuracy been verified - test, analysis, etc.?

What software is required to process ACS data to determine performance? Has the software been tested with the hardware?

Are the performance requirements consistent with any vehicle maneuvering for sensor pointing?

Does a detailed error analysis exist? How were the error sources determined? What hardware tests have been conducted to verify the error allocations?

D. Product Assurance.

1. Quality Assurance/Manufacturing. Accomplish the following in the order of priority listed:

a. Has the contractor prepared and submitted a Quality Assurance program plan according to Military Specification, MIL-Q-9858A, Quality Program Requirements, or other equivalent government specification?

b. Review and verify the implementation of corrective action for all major system and subsystem acceptance test failures:

(1) Review the original corrective action recommendation appearing on the materials review board or the failure review board paperwork or whatever paperwork contained the original corrective action disposition.

(2) Ensure that corrective action was actually implemented in the hardware being IRR'd by reviewing the paperwork which implemented the corrective action (for example, rework manufacturing shop order). Ensure that rework was inspected by contractor quality assurance personnel and Government quality assurance personnel (if required). Verify rework by actual reinspection if possible.

(3) For items which required retest to verify corrective action, verify that the item being IRR'd actually passed the retest by reviewing retest results. Verify that retest results were bought off by contractor quality assurance personnel and Government quality assurance personnel (if required).

(4) Verify that any generic corrective action resulting from any item in (2) above was implemented by reviewing whatever changes were required (that is, drawing change, specification change, process change, material change or procedure change).

c. Review and verify the implementation of corrective action for all major system and subsystem level physical nonconformances. As in b above, review the original corrective action disposition, and ensure the corrective action was incorporated into the hardware being IRR'd and was bought off by contractor quality assurance personnel and Government quality assurance personnel (if required).

d. Review all major open nonconformances against the end item being IRR'd. Ensure that the contractor has a plan for dispositioning these nonconformances prior to the end item being shipped.

e. Review as-built and test packages of selected critical components. Candidates for review should be chosen on the basis of mission criticality (for example, contain one or more single point failures) or for any reason that would single out certain components as being high risk to mission success.
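
Note: The following illustrative sketch (not a review requirement) shows one way the closure status of the corrective-action audit trail described in b(1) through b(4) above could be tabulated. The record layout and entries are assumed for illustration.

    # Illustrative sketch (hypothetical record layout): a simple closure check
    # of the corrective-action audit trail (disposition, rework, QA buy-off,
    # retest) for each major failure.
    from dataclasses import dataclass

    @dataclass
    class CorrectiveActionRecord:
        failure_id: str
        disposition_paperwork: str   # MRB/FRB or equivalent reference
        rework_order: str            # shop order implementing the fix ("" if none)
        qa_buyoff: bool              # contractor (and Government, if required) sign-off
        retest_required: bool
        retest_passed: bool

        def closed(self) -> bool:
            if not (self.disposition_paperwork and self.qa_buyoff):
                return False
            return self.retest_passed if self.retest_required else True

    records = [
        CorrectiveActionRecord("F-001", "FRB-84-112", "SO-4471", True, True, True),
        CorrectiveActionRecord("F-002", "MRB-85-007", "SO-4519", True, False, False),
    ]
    for r in records:
        print(r.failure_id, "CLOSED" if r.closed() else "OPEN")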

2. Reliability.

a. Reliability Assessment:

What piece part and component life and reliability tests have been conducted?

Is there a reliability program plan?

Are limited life items defined and controlled?

How are the life predictions justified?

Was a failure modes and effects analysis (FMEA) conducted according to MIL-STD-1543, Reliability Program Requirements for Space and Missile Systems, or SD STD 77-2, Failure Modes and Effects Analysis for Satellite, Launch Vehicle, and Reentry Systems?

Were both reliability and design groups involved?

Were single point failures (SPF) and modes identified and reported?

Were special tests and controls for SPFs defined and implemented?

Where in production through launch and operations are SPFs tested or checked?

Is there a reliability prediction model and report?

What are significant changes in the current prediction from earlier predictions?

How were piece part failure rates predicted or estimated?

What derating criterion was defined and used in designs and changes?

What worst case analysis was done?
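
Note: The following illustrative sketch (not a review requirement) shows the form a simple series-system reliability prediction can take, rolling up piece part failure rates with a derating factor. The part counts, failure rates, derating benefit, and mission duration are assumed for illustration.

    # Illustrative sketch (hypothetical part counts and failure rates): a
    # series-system prediction, R = exp(-sum(n_i * lambda_i) * t), with a
    # simple multiplicative factor representing the benefit of derating.
    import math

    # part type: (quantity, base failure rate in failures per 1e6 hours)
    parts = {
        "resistor":   (1200, 0.002),
        "capacitor":  ( 600, 0.005),
        "transistor": ( 300, 0.020),
        "IC":         ( 150, 0.050),
    }
    derating_factor = 0.5          # assumed benefit of electrical/thermal derating
    mission_hours = 5 * 365 * 24   # assumed 5-year mission

    lam_total = sum(n * fr for n, fr in parts.values()) * derating_factor / 1e6
    reliability = math.exp(-lam_total * mission_hours)
    print(f"Total failure rate = {lam_total*1e6:.2f} per 1e6 hours")
    print(f"Predicted mission reliability = {reliability:.4f}")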

b. Failure Reporting and Corrective Action System:

Is there a "closed loop" failure reporting and corrective action system used?

What failures are analyzed and require corrective action?

What is the retest-after-failure practice?

Is failure data systematically collected and analyzed to identify reliability problems or determine trends?

Are failures classed in terms of severity or criticality?

Was program management visibility into failures and analysis status maintained?

Was there a failure review board and how was it managed?

c. Reliability Status:

Are all piece parts and components qualified? How were they qualified? How long has it been since qualification? Have parts/components been properly stored?

Is there a critical parts list and critical items list?

What were the criteria for defining critical items?

Were critical item tests and controls (compensating features) defined and implemented (including SPFs noted above)?

Have all failures been closed out?

Are there any unverified or non-repeatable failures?

What is the contractor's system for responding to part and component alerts from Government-Industry Data Exchange Program (GIDEP), the payload program office, and industry?

Are determinations made that suspect or reported bad parts or components are not installed in system, subsystem, or box level units, including those of subcontractors?

3. Parts, Materials, and Processes (PMP) Control.

a. Is there a PMP program plan?

b. How was the PMP control board or PMP advisory group managed?

c. What variations were accepted from original PMP requirements?

d. Review major failures caused by electronic piece part failures. Review specific (this hardware) and generic corrective action for adequacy. Review configuration and build paperwork to verify incorporation of corrective action on hardware being IRR'd.

e. Complete actions outlined in d above for all major process failures.

f. Complete actions outlined in d above for all major material failures.

4. Contamination/Corrosion Control.

a. Has the contractor prepared and submitted a contamination control plan according to MIL-STD-1246A, Product Cleanliness Levels and Contamination Control Program?

b. Has the contractor prepared and submitted a corrosion control plan that describes how the corrosion control activities are to be accomplished and maintained?

c. Cleanliness Requirements:

What are the cleanliness requirements?

How were they derived? Are they valid?

What are the sources?

Are the requirements satisfied (at the factory, at the launch base)?

5. System Safety.

Has the contractor prepared and submitted a system safety program plan which describes how safety activities are to be accomplished throughout the life cycle?

Are system safety requirements evaluated in terms of their relationship to reliability and quality assurance?

Has the contractor performed the system safety analyses specified by the statement of work (SOW)? Examples of these are:

a. Preliminary hazard analysis.

b. Subsystem hazard analysis.

c. System hazard analysis.

d. Operating hazard analysis.

Have these safety analyses been evaluated during design and program reviews?

What are the methods of hazard detection used by the contractor?

Does the test program incorporate requirements which are used to verify that the system design is safe, and is the system tested to these requirements?

What action was taken to minimize or eliminate hazards which are catastrophic or critical?

What action was taken on those catastrophic or critical hazards which cannot be eliminated or controlled to an allowable limit?

E. Configuration Control.

Are all drawings, specifications, and procedures controlled and up-to-date?

Does the hardware match the paper? (Perform spot comparisons if possible.)

Can hardware fabrication or rework be accomplished to unreleased or marked-up drawings or procedures?

Is there a complete pedigree package for each component?

How are drawings, procedures, etc., controlled - who can approve changes?

Are substitute parts allowed and, if so, how are they controlled?

Is there a Configuration Management Plan?

What is the authority of the responsible engineer during manufacture and test of the unit?

What system does the contractor use to verify that all approved changes have been incorporated?

F. Environments.

What are the "flight levels" for the following environments: acoustics, vibration, shock, and EMI? How were they derived (historical, empirical, analytical, or test), and what was the statistical statement of the data?

What environmental test levels or durations have been compromised or modified to reflect hardware fragility or special test problems?
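
Note: The following illustrative sketch (not a review requirement) shows one way a statistical flight level can be stated, as a mean-plus-two-sigma envelope of measured levels from prior flights. The data values are assumed for illustration.

    # Illustrative sketch (hypothetical flight data): deriving a "flight level"
    # as a statistical envelope (here mean + 2 sigma in dB) of measured overall
    # acoustic levels from prior flights of a similar configuration.
    import statistics

    measured_oaspl_db = [137.2, 138.1, 136.8, 139.0, 137.6, 138.4]  # illustrative
    mean = statistics.mean(measured_oaspl_db)
    sigma = statistics.stdev(measured_oaspl_db)
    flight_level = mean + 2.0 * sigma
    print(f"Mean = {mean:.1f} dB, sigma = {sigma:.2f} dB, "
          f"enveloping flight level (mean + 2 sigma) = {flight_level:.1f} dB")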

SECTION IV: BOOSTER

A. Intra Booster Interfaces

B. Fairing

1. Clearances

2. Contamination

3. Flight Integrity

C. Hardware

1. New Equipment

2. Acceptance and Qualification Status

D. System Test

E. Guidance Software

1. Program Peculiar Changes

2. Software Validation

3. Launch Base Interface

F. Product Assurance

1. Quality Assurance/Manufacturing

2. Reliability

3. Parts, Materials, and Processes Control

4. Contamination and Corrosion Control

5. System Safety

A. Intra Booster Interfaces.

Have there been any changes since the last flight of this booster configuration?

Is this configuration different from the standard or baseline configuration?

What tests are performed to verify the intra booster interfaces?

B. Fairing.

1. Clearances.

Who performs the dynamic clearance analysis? Did it consider stacking tolerances, manufacturing tolerances, and spacecraft dynamics? How was the analysis performed? (An illustrative stack-up sketch appears at the end of this subsection.)

Is the static clearance verified by measurement both in the factory and at the launch base?

What is the clearance to critical points?

Has the dynamic behavior of the fairing been tested or verified? What is the time history of clearance to critical points at separation? What assumptions were made in the analysis?

Have the separation dynamics of the fairing (modes and frequencies) been tested or verified?

Have all thermal effects been considered (that is, hot dogging, etc.)?
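
Note: The following illustrative sketch (not a review requirement) compares worst-case and root-sum-square stack-ups of the clearance-reducing contributors asked about above. The nominal clearance and contributor values are assumed for illustration.

    # Illustrative sketch (hypothetical dimensions): comparing worst-case and
    # root-sum-square (RSS) stack-ups of contributors that reduce the nominal
    # spacecraft-to-fairing clearance at a critical point.
    import math

    nominal_clearance_in = 4.00
    contributors_in = {                        # illustrative reductions of clearance
        "stacking tolerance":            0.30,
        "manufacturing tolerance":       0.25,
        "spacecraft dynamic deflection": 1.10,
        "fairing dynamic deflection":    0.60,
        "thermal distortion":            0.20,
    }
    worst_case = nominal_clearance_in - sum(contributors_in.values())
    rss = nominal_clearance_in - math.sqrt(
        sum(v * v for v in contributors_in.values()))
    print(f"Worst-case remaining clearance = {worst_case:+.2f} in")
    print(f"RSS remaining clearance        = {rss:+.2f} in")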

2. Contamination.

What are the spacecraft cleanliness requirements?

What are the sources of contamination from fairings? Were they verified by analysis and/or test? Were the requirements satisfied? Has the analysis been reviewed by the spacecraft contractor?

3. Flight Integrity.

Has this fairing flown before? Are the loads and thermal environment for the flight encompassed by previous flights or original qualification? If not, has the fairing been requalified?

Are the cutouts, access doors, and/or radio frequency (RF) windows qualified? Has their incorporation significantly changed the structural characteristics of the fairing?

C. Hardware.

1. New Equipment.

Is there any new hardware or equipment that has not previously flown on this vehicle? How was it qualified? Have all interfaces been tested and verified? What development tests were conducted to prove out the design?

2. Acceptance and Qualification Status.

Is all equipment qualified? Have there been any environmental changes since the equipment was qualified?

What is the acceptance status for all flight equipment and flight spares? What were the acceptance test or manufacturing problems? How were they resolved? What piece parts substitutions or design changes have been made?

D. System Test.

What is the test status of the booster? Will it make its scheduled delivery? Are there any outstanding test anomalies that have not been resolved?

Has the interface with the spacecraft vehicle been validated? Will it be validated again in the fully mated condition?

Are there any recent failures on other boosters which might affect the flight status?

Have all booster test requirements been incorporated in the integrated schedule? Have proper provisions been made to validate interfaces?

What functions, equipment, and interfaces will be tested at launch base?

Has an analysis of all critical functions been performed, and when were the critical functions last tested?

What environmental tests were performed on the booster?

E. Guidance Software.

1. Program Peculiar Changes.

Is the guidance program similar to one previously employed on another launch or program? What functional changes have been made? What changes have been made to the program logic, equations, and constants?

2. Software Validation.

Is a software validation plan available? What tools and techniques were used in software validation? Do the test techniques identify all processor alarm conditions during simulations? Describe the tests performed in detail.

Have all contingency actions, implemented in the software, been documented in the mission or vehicle description documents? Has each of these contingency actions been validated?

Have all branches of the flight program been validated? Has the most severe case of processor loading been identified and the remaining margin determined? How are the validation results documented?

Has an independent validation of the flight program/program tape and program constants/constants tape been performed?

Are the documents which describe the vehicle and mission parameters under configuration control? Do these documents contain all parameters from which the guidance logic and constants are derived? When are the flight program assemblies, flight program and constants tapes placed under configuration control?

3. Launch Base Interface.

What technique is employed to ensure that the program and constants tapes have been correctly loaded into the flight computer?
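
Note: The following illustrative sketch (not a review requirement) shows one common verification technique, comparing a checksum of the flight computer memory readback against the checksum of the released tape image. The word format and values are assumed for illustration.

    # Illustrative sketch (hypothetical word format and values): verifying a
    # flight-computer load by comparing checksums of the memory readback and
    # the released program/constants tape image.
    def checksum_16bit(words):
        """Simple modulo-2^16 sum over 16-bit words."""
        total = 0
        for w in words:
            total = (total + (w & 0xFFFF)) & 0xFFFF
        return total

    released_tape   = [0x1A2B, 0x0042, 0x7FFF, 0x0000, 0x3C3C]   # illustrative
    memory_readback = [0x1A2B, 0x0042, 0x7FFF, 0x0000, 0x3C3C]

    ok = checksum_16bit(released_tape) == checksum_16bit(memory_readback)
    print("Load verified" if ok else "Checksum mismatch - reload and reverify")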

F. Product Assurance.

1. Quality Assurance/Manufacturing. Accomplish the following in the order of priority listed:

a. Review and verify the implementation of corrective action for all major system and subsystem acceptance test failures:

(1) Review the original corrective action recommendation appearing on the materials review board or the failure review board paperwork or whatever paperwork contained the original corrective action disposition.

(2) Ensure that the corrective action was actually implemented in the hardware being IRR'd by reviewing the paperwork which implemented the corrective action (for example, rework manufacturing shop order). Ensure that rework was inspected by contractor quality assurance personnel and Government quality assurance personnel (if required). Verify rework by actual reinspection if possible.

(3) For items which required retest to verify corrective action, verify that the item being IRR'd actually passed the retest by reviewing retest results. Verify that retest results were bought off by contractor quality assurance personnel and Government quality assurance personnel (if required).

(4) Verify that any generic corrective action resulting from any item in (2) above was implemented by reviewing whatever changes were required (that is, drawing change, specification change, process change, material change or procedure change).

b. Review and verify the implementation of corrective action for all major system and subsystem level physical nonconformances. As in a above, review the original corrective action disposition, ensure the corrective action was incorporated into the hardware being IRR'd and was bought off by contractor quality assurance personnel and Government quality assurance personnel (if required).

c. Review all major open nonconformances against the end item being IRR'd. Ensure that the contractor has a plan for dispositioning these nonconformances prior to the end item being shipped.

d. Review as-built and test packages of selected critical components. Candidates for review should be chosen on the basis of mission criticality (for example, contain one or more single point failures) or for any other reason that would single out certain components as being high risk to mission success.

2. Reliability.

a. Reliability Assessment.

What piece part and component life and reliability tests have been conducted?

Is there a Reliability Program Plan?

Are limited life items defined and controlled?

How are the life predictions justified?

Was a failure modes and effects analysis (FMEA) conducted according to MIL-STD-1543 or SD STD 77-2?

Were both reliability and design groups involved?

Were single point failures (SPF) and modes identified and reported?

Were special tests and controls for SPFs defined and implemented?

Where in production through launch and operations are SPFs tested or checked?

Is there a reliability prediction model and report?

What are significant changes in the current prediction from earlier predictions?

How were piece part failure rates predicted or estimated?

What derating criterion was defined and used in designs and changes?

What worst case analysis was done?

b. Failure Reporting and Corrective Action System.

Is there a "closed loop" failure reporting and corrective action system used?

What failures are analyzed and require corrective action?

What is the retest-after-failure practice?

Is failure data systematically collected and analyzed to identify reliability problems or determine trends?

Are failures classed in terms of severity or criticality?

Was program management visibility into failures and analysis status maintained?

Was there a failure review board and how was it managed?

c. Reliability Status.

Are all piece parts and components qualified? How were they qualified? How long has it been since qualification? Have parts/components been properly stored?

Is there a critical parts list and critical items list?

What were the criteria for defining critical items?

Were critical item tests and controls (compensating features) defined and implemented (including SPFs noted above)?

Have all failures been closed out?

Are there any unverified or non-repeatable failures?

What is the contractor's system for responding to part and component alerts from GIDEP, the payload program office, and industry?

Are determinations made that suspect or reported bad parts or components are not installed in system, subsystem, or box level units, including those of subcontractors?

3. Parts, Materials, and Processes (PMP) Control.

a. Is there a PMP program plan?

b. How was the PMP control board or PMP advisory group managed?

c. What variations were accepted from original PMP requirements?

d. Review major failures caused by electronic piece part failures. Review specific (this hardware) and generic corrective action for adequacy. Review configuration and build paperwork to verify incorporation of corrective action on hardware being IRR'd.

e. Complete actions outlined in d above for all major process failures.

f. Complete actions outlined in d above for all major material failures.

4. Contamination and Corrosion Control.

What are the cleanliness requirements?

How were they derived? Are they valid?

What are the sources?

Are the requirements satisfied (at the factory, at the launch base)?

5. System Safety.

Has the contractor prepared and submitted a system safety program plan which describes how safety activities are to be accomplished throughout the life cycle?

Are the system safety requirements evaluated in terms of their relationship to reliability and quality assurance?

Has the contractor performed the system safety analyses specified by the statement of work (SOW)? Examples of these are:

a. Preliminary hazard analysis.

b. Subsystem hazard analysis.

c. System hazard analysis.

d. Operating hazard analysis.

Have these safety analyses been evaluated during design and program reviews?

What are the methods of hazard detection used by the contractor?

Does the test program incorporate requirements which are used to verify that the system design is safe, and is the system tested to these requirements?

What action was taken to minimize or eliminate hazards which are catastrophic or critical?

What action was taken on those catastrophic or critical hazards which cannot be eliminated or controlled to an allowable limit?

SECTION V: LAUNCH VEHICLE INTEGRATION

A. Overview

B. Documentation Status

1. Specification Tree

2. Management Documentation

3. Range Documentation

4. AFSCF Documentation

C. Verification/Validation

1. Upper Stage and Spacecraft Loads

2. Stability Analysis

3. Separation

4. Trajectory

5. Injection Error Analysis

6. Environments

D. Trajectory

1. Design Trajectory

2. Injection Errors

3. Launch Constraints

A. Overview.

Who is the integrator of the stacked vehicle? Who has overall system responsibility for the upper stage and ascent support?

What are the roles and responsibilities of all participants? How are they documented?

What is the policy on independent verification and validation? What are the roles of the participants, by task, for prime responsibility review, verification and/or validation?

B. Documentation Status.

1. Specification Tree.

What is the hierarchy of specifications? How are the program requirements delegated to lower level specifications?

Is there a formal and controlled system to track the status of satisfying all requirements? If not, how does the program know that all requirements have been satisfied and the required reviews/validations have been performed?

a. Launch Vehicle System Specification

What is the top specification for the upper stage system? How are requirements delegated within the boost system (that is, reliability, etc.)?

b. Interface Control Documents (ICDs)

What ICDs are on the program? Who do they control? When were they signed relative to the program schedule? Does the validation section (Section IV) call out all methods of validation or is it only for tests?

2. Management Documentation.

How is the program organized? What are the roles and responsibilities of the participants?

How is the program managed? Is there a launch vehicle integration working group? How are action items handled, etc.?

Are the interagency agreements in effect and current? Do they reflect the intended roles and responsibilities? Are they being implemented?

3. Range Documentation.

a. Safety Documents.

Do any unique range safety requirements exist? Are responsibilities spelled out in appropriate documents, that is, memoranda of agreements? Are all safety deviations covered by waivers?

Is emergency documentation current, that is, impact teams, accident investigations? Is approval of hazardous procedures documented?

b. Flight Plan Approval.

Was submittal of the plan timely? Was the plan based on actual or predicted parameters, or on planning parameters only? Is the plan current with any program changes?

c. Requirements/Capabilities.

Have the program requirements document (PRD) and program support plan (PSP) been reviewed and coordinated on by all responsible agencies? Have any unique requirements been identified? Are the operations requirements (OR) and operations directives (OD) on an adequate schedule to support the operation? Do the ORs and ODs differ on critical requirements?

d. Test.

Has the R day schedule been modified to accommodate any peculiar test? Are there any unique launch constraints? Have all integrated test procedures been approved by all necessary agencies?

4. AFSCF Documentation.

Requirements/Capabilities:

What document(s) provides support requirements to the AFSCF? Are all procedures, operating instructions, etc., written and approved?

Are there vehicle anomaly procedures and contingency plans? Who writes them? Are they complete and adequate?

Is all operational documentation available, complete, and adequate (for example, orbital requirements document, orbital test plan, technical operations instructions, etc.)?

Is there a software and data base configuration control system? Is it adequate?

How many rehearsals are planned? Will the rehearsals include planned anomalies? Do the rehearsals include all program segments required during a launch for on-orbit support? Who writes the operational documentation?

What is the review and approval cycle? Is this cycle adequate?

Are the operational procedures complete? Is an operational constraints document available? Is this document formal and binding upon the AFSCF? Does the payload program office have final authority to change this document and authorize deviations?

Will contractor technical analysis personnel be available at launch and during the on-orbit verification phase? Will contractor technical personnel be available during the on-orbit operation phase?

C. Verification/Validation.

What was the method used for verification/validation of:

1. Booster and Spacecraft Loads.

Were independent models derived from drawings? Was a completely independent loads analysis run for each loads cycle? Were all the events checked? Were the forcing functions verified? Were the interface stiffness assumptions verified?

2. Stability Analysis.

Were independently generated models used? Was powered flight stability checked for several flight times?

Were all possible spacecraft propellant loads and slosh modes considered?

Was the on-orbit stability verified?

Was recovery from anomalous conditions verified?

3. Separation.

Was an independent separation analysis conducted? Did it consider spring misalignments, attitude control states, stacking tolerances, manufacturing tolerances, mating preloads, engine tailoff, and worst case mass properties? Is a collision avoidance maneuver required, and has the decision to use or not use one been justified? Were attitude requirements met?

4. Trajectory.

Were independent dispersed, nominal and tag trajectories run? Are all constraints satisfied? How close was the agreement with the primary analysis?

Does everyone understand and use the same performance ground rules? Have all the ground rules been verified?

Has the margin allocation been verified as being optimum? Has the launch window been verified?

5. Injection Error Analysis.

Has an independent injection error analysis been performed? Are the results consistent with the primary analysis and the spacecraft requirements? Was the same analytical method used for both the primary and verification analyses?

6. Environments.

Are the environments consistent with other similar programs? Have the analyses been verified? Are all assumptions consistent?

D. Trajectory.

1. Design Trajectory.

Is there a vehicle data book?

Is there a mission description document?

Has the booster sequence of events been defined? Does the sequence include contingency/backup events (for example, backup stage shutdown and staging commands)?

Will the data and constraints employed in the design trajectory meet all mission requirements without violating any vehicle constraints? What is the remaining performance margin?

2. Injection Errors.

Has an error analysis been performed including vehicle position, velocity, attitude and timing errors at payload release?

Are the insertion errors within the mission error budget requirements? Was the error analysis conducted using linear system covariance methods or using Monte Carlo simulation techniques?

Are the system linearity characteristics consistent with the error analysis techniques employed? Have the error source values used in the analysis been reviewed by the responsible subsystem engineer?
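
Note: The following illustrative sketch (not a review requirement) shows a Monte Carlo estimate of injection velocity error compared against a 3-sigma budget, with the equivalent linear (root-sum-square) roll-up shown for comparison. The error sources, values, and budget are assumed for illustration.

    # Illustrative sketch (hypothetical error sources and budget): Monte Carlo
    # estimate of total injection velocity error versus a 3-sigma mission
    # budget, with a linear (RSS) roll-up of the same sources for comparison.
    import math, random

    random.seed(1)
    # independent 1-sigma velocity error contributors, m/s (illustrative)
    sigmas = {"guidance": 1.5, "thrust tailoff": 2.0, "accelerometer bias": 1.0}
    budget_3sigma = 9.0            # assumed mission budget, m/s

    n = 20000
    samples = [sum(random.gauss(0.0, s) for s in sigmas.values()) for _ in range(n)]
    mc_sigma = math.sqrt(sum(x * x for x in samples) / n)
    lin_sigma = math.sqrt(sum(s * s for s in sigmas.values()))
    print(f"Monte Carlo 3-sigma = {3*mc_sigma:.2f} m/s, "
          f"linear roll-up 3-sigma = {3*lin_sigma:.2f} m/s, "
          f"budget = {budget_3sigma:.1f} m/s")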

3. Launch Constraints.

Have all mandatory requirements for launch been identified? Have timely reporting mechanisms for each of these requirements to the launch base been implemented?

Have the criteria for the launch window been reviewed by the responsible engineer? Has the launch window been identified?

SECTION VI: LAUNCH BASE PROCESSING

A. Overview

B. Roles and Responsibilities

C. Transportation and Handling

1. Spacecraft Transportation

2. Spacecraft-Booster Mating

3. Fairing Mating

4. Ground Support Equipment (GSE) Installation

5. Security and Technical Surveillance

D. Test Sequences

1. General

2. Test Control

3. Support

E. Procedures

F. Facility Checkout

G. Rehearsals

A. Overview.

Is the launch base ready to receive the vehicle? Are there any open paperwork or planning details that could detract from a smooth checkout and launch flow?

B. Roles and Responsibilities.

Review status of interface specifications.

Is there a signed interface document delineating responsibilities at the launch base among the payload program office/test group/Aerospace/contractor? Is the document clear? Does it cover the entire spectrum? Does it appear that working arrangements have been set in accordance with the interface document?

Is there a launch base test plan which defines the sequence of ground tests which the payload must complete prior to launch? Are policies and procedures in effect to assure proper test discipline and provisions for authorizing deviations to approved test sequences and procedures?

C. Transportation and Handling.

1. Spacecraft Transportation.

Have the transporters been adequately tested? Are any special waivers or permits required for road transportation? Will the transporter be properly instrumented? Are escorts provided?

2. Spacecraft-Booster Mating.

Have written procedures for spacecraft-booster mating been prepared and approved? Do they adequately provide for all environmental constraints?

Has safety been adequately provided for?

Review fit check of launch vehicle with launch complex.

3. Fairing Mating.

Have written procedures for fairing mating been prepared? Have adequate precautions been taken to prevent dropping loose objects inside the fairing, or any place inside the vehicle?

How will internal clearances be verified?

4. Ground Support Equipment (GSE) Installation.

Review the launch complex/GSE failure history. Are there any consistent failure modes which could impact this launch?

Has all the GSE been installed and properly checked out?

If the GSE is to go to the launch site with the vehicle, have adequate preparations been made to install and check it out prior to arrival of the vehicle?

Do the facilities provide adequate environmental control for the GSE? Do the facilities provide adequate power, personnel access, security provisions, etc.? Will all GSE be adequately protected from the launch environment?

5. Security and Technical Surveillance.

Will any of the GSE require special handling because of security requirements?

Will there be any special security arrangements required by the vehicle? Have these arrangements been made? Are they adequate?

Will technical surveillance be required? When? Have these arrangements been made?

D. Test Sequences.

1. General.

Has the booster test schedule been integrated into the overall test schedule? Have any special test requirements been worked out with all associated contractors?

Does the spacecraft checkout require cooperation of the Vandenberg Tracking Station (VTS), or other related facilities? Have procedures been coordinated with them? Have they been provided with necessary software, operational constraints, etc.? Has an adequate block of test time been allocated?

Has functional and proof testing (including test plans and reports) been adequate since the last launch?

Will there be at least one simulated countdown run? Will all concerned contractors and facilities participate?

Is the draft of the countdown manual on schedule? Will all concerned parties participate in the review?

Has the countdown manual been reviewed for completeness? Does it include planned holds? Are all contingency modes included? Is the manual clear and easily understood? Have provisions been made for last minute reviews and changes? Will distribution be adequate?

2. Test Control.

Is the test control responsibility documented in appropriate interface agreements? Is there an effective interface between contractors?

Is manning adequate for proper test control?

Are there adequate "confidence" builders in the system, that is, launch test working group (LTWG), launch vehicle integration working group (LVIWG), technical interchange meeting (TIM), readiness reviews?

What agencies review post test data? Does the payload program office/test group/Aerospace review all test data?

How are anomalies resolved and documented? Who gives retest direction to the contractor?

How is the "AF Team" organized to work off problems?

Is data turnaround time adequate to support following tests and the mission director's review? Will post flight data be available in time to support the next launch?

Are post test/launch critiques and reviews held?

3. Support.

Have all support groups received adequate notification of their requirements?

Have all necessary support documents (orbital requirements document, program requirements document) been written, approved, and issued?

Have arrangements been made for last minute wind measurements? Have sufficient computers been made available? Are all wind analysis software programs complete and verified?

Have all range safety requirements been met? Is there an emergency recovery plan? Have emergency recovery teams been adequately manned and instructed?

Have arrangements been made for contractor and payload program office management personnel to be available during countdown?

Will telemetry ships and/or aircraft be required? Have proper arrangements been made, if necessary?

E. Procedures.

Review readiness of test/checkout procedures and planning. Have all procedures been adequately reviewed and approved for both technical content and safety provisions?

Do the procedures, taken as a whole, provide a complete and thorough system checkout? What functions will not be verified at the launch site, and why?

Does the overall checkout provide an adequate test of the vehicle?

Have all vehicle functions been verified at the system level? What tests are run in the factory that cannot be run at the launch base? Is measurement accuracy sufficient to satisfy contract end item (CEI) requirements?

Will the launch base procedures be run in the factory, just prior to shipment? Is factory test data reviewed?

Who establishes that the vehicle is ready to be shipped?

Are all hazardous procedures properly identified? Have all necessary approvals been received from safety?

Do the contractor's pyro handling techniques meet with test group approval?

F. Facility Checkout.

Will there be a qualification test vehicle (QTV) used to validate the launch facility? If a QTV was not used, how will the launch facility be validated?

Will there be a vehicle simulator used on subsequent launches to validate electrical connections?

Will there be adequate launch complex/vehicle/satellite EMI testing?

Will there be adequate rehearsals of all procedures?

Is there a facility checkout plan? Which office will certify that the facility is ready to be connected to the flight hardware?

What is the extent and adequacy of refurbishment since the last launch? Have all changes made to the facility since the last launch been validated?

Have all required facility changes been incorporated? What is the impact on interfaces?

What is the status of spares or replacements for systems critical to launch?

Is there an adequate maintenance plan?

Is obsolete equipment being used? Does its use impact this launch?

G. Rehearsals.

Review experience/training of launch crew. Have adequate rehearsals been scheduled? Are they scheduled early enough to make procedure changes should they become required? Will the rehearsals include all concerned contractors and support groups?

Have the AF Satellite Test Center and/or program related command and control agency received all necessary data (that is, command formats, constraints, etc.)?

Have adequate rehearsals been run to establish that any vehicle data review requirements can be met in the real time launch environment? How will payload program office/test group review vehicle test data?

What are the requirements for the generation of failure reports, and how will they be processed and closed out?

Will daily data review meetings be held?

Who is responsible for declaring a particular test as satisfactory? What will the role of payload program office/test group be in reviewing data during the terminal count?

SECTION VII: ASCENT AND ORBITAL SUPPORT

A. Range Support

1. Support

2. Documentation

3. Analysis

4. Procedures

5. Contingency Planning

B. AFSCF Support

1. Analysis

2. Documentation

3. Procedures

4. Support

5. Test

6. Command Capability

7. Contingency Planning

8. Lines of Control/Authority

C. Program Unique Ground Stations

1. Documentation

2. Support

3. Test

A. Range Support.

1. Support.

What telemetry coverage is required? How is the telemetry coverage obtained? Do the telemetry coverage requirements include all critical events? Will all critical events be covered with a high probability of success? (An illustrative coverage-probability sketch appears at the end of this subsection.)

What is the availability and capability of ships, aircraft, and range systems to support the launch, including down-time history and back-up capability? Have there been recent modifications which could impact compatibility for this mission?
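
Note: The following illustrative sketch (not a review requirement) shows one way the coverage question above can be quantified, as the probability that every critical event is seen by at least one telemetry asset, assuming independent asset availabilities. The events and probabilities are assumed for illustration.

    # Illustrative sketch (hypothetical events and availabilities): probability
    # that every critical event is covered by at least one telemetry asset,
    # assuming independent asset availability.
    coverage = {   # event: availability of each asset able to see it
        "liftoff":               [0.99, 0.95],    # e.g., two land stations
        "stage separation":      [0.90],          # e.g., single ship
        "spacecraft separation": [0.92, 0.85],    # e.g., ship plus aircraft
    }

    p_all = 1.0
    for event, assets in coverage.items():
        p_miss = 1.0
        for p in assets:
            p_miss *= (1.0 - p)
        p_event = 1.0 - p_miss
        p_all *= p_event
        print(f"{event:22s} coverage probability = {p_event:.4f}")
    print(f"All critical events covered: {p_all:.4f}")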

2. Documentation.

What is the status of the program requirements document and program support plan including changes since last launch? What is the status of the vehicle range safety package and flight waivers including changes since last launch?

Is there a record of violations on radio frequency (RF) allocation? What is the means of control?

Has proper documentation been generated to assure safe handling of biological and radiological hazards associated with the launch vehicle and satellite?

Review launch software package and its applicability to the launch vehicle and satellite. Have there been any changes which could affect range safety plots or hazards?

3. Analysis.

What is the status of the telemetry mode and displays? What are the procedures and are they complete and adequate? How is the data handled? Do the procedures specify who is responsible for analyzing each telemetry point? Is the responsibility clear or could there be confusion between organizations?

4. Procedures.

Are there specific range operations procedures for this launch? Are the procedures being provided to all elements and organizations on a timely basis? Does the schedule allow adequate time for review by all organizations?

Are the procedures clear and concise?

5. Contingency Planning.

Does the ascent and orbital procedures documentation include planning for contingencies? Have all critical failure modes been identified and the correct responses included? Do the procedures include both on-orbit vehicle and ground anomalies?

What are the contingency plans and constraints for supporting launch delays?

What is the availability and test status of fluids and gases necessary to support the launch?

B. Air Force Satellite Control Facility (AFSCF) Support.

1. Analysis.

What is the status of the telemetry mode and displays? What is the status of telemetry and command prepasses? What are the procedures and are they complete and adequate? How is the data handled? Who analyzes the telemetry? Do the procedures specify who is responsible for analyzing each telemetry point? Is the responsibility clear or could there be confusion between organizations?

2. Documentation.

What is the status of program office documentation?

a. Orbital requirements document

b. Test Plan

c. Test procedures

e. Calibration data

f. Crypto information

g. Any other vehicle peculiar data

What is the status of AFSCF documentation?

a. Operating procedures

b. Test sequences

c. Test Operations Order (TOO), Annex E

d. Command plans (pass plans)

3. Procedures.

Who writes the orbital operations procedures? Are the procedures being provided to concerned organizations on schedule? Does the schedule allow for adequate review by all organizations? Are the procedures clear and concise?

Are the following procedures/documents available: initial checkout and daily operations scenario, pass planning, communications between Remote Tracking Station (RTS) locations and the Satellite Test Center (STC), loss of communications, handoff procedures between stations? Are these documents clear, concise, and adequate?

Is the procedure to handle on-orbit anomalies complete? Are adequate contingency plans available? Does the program office have final approval on all on-orbit anomaly tests? Is there a formal agreement among all organizations and is it binding?

4. Support.

What is the anticipated support for required test operations?

Have alternatives to the support timelines been developed to avoid operational conflicts?

Are technical advisor support and facilities adequate? What is the operational availability of AFSCF equipment (digital television (DTV), printers, plotters, computers, reproduction equipment, etc.)? Are there any system modifications planned which could conflict with required support?

Has downtime been adequately forecast to minimize conflict with the launch schedule and early orbit checkout periods? Is program peculiar support software operational?

5. Test.

Has vehicle-ground station compatibility test planning/testing been accomplished?

What is the status of tests with assigned external agencies such as Pillar Point and the Development Test Facility?

What is the status of development rehearsals/dress rehearsals?

6. Command Capability.

What command capability exists at each RTS? What authority is required to send commands? Are there deviations to these procedures under specific circumstances?

Are emergency commands available? Are there redundant back-up commands to automatic sequences (for example, separation)? Are these included in the contingency procedures? Under what circumstances can an RTS send these commands?

Are there loss of communications procedures?

Are procedures established such that commands cannot be sent from more than one RTS at the same time?

Are there commands that are protected from inadvertent transmission?

How is command generation accomplished? Has the software been verified?

How is command control accomplished? Is special hardware/software required at the RTS to support the on-orbit verification and/or operational phases? Is it ready? Have sufficient tests been performed?

What is the status of 3D/6D nominal ephemeris and look angles?
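
Note: The following illustrative sketch (not a review requirement) shows one common pattern for protecting critical commands from inadvertent transmission, a two-step arm/execute sequence with a time window. The command names and window are assumed for illustration.

    # Illustrative sketch (hypothetical command set): a two-step arm/execute
    # pattern that protects critical commands from inadvertent transmission.
    import time

    CRITICAL = {"SEPARATE", "DEPLOY_ARRAY"}
    ARM_WINDOW_SEC = 60.0
    _armed = {}    # command -> time armed

    def arm(cmd):
        if cmd in CRITICAL:
            _armed[cmd] = time.monotonic()
            return f"{cmd} armed; execute within {ARM_WINDOW_SEC:.0f} s"
        return f"{cmd} does not require arming"

    def execute(cmd):
        if cmd in CRITICAL:
            t0 = _armed.get(cmd)
            if t0 is None or time.monotonic() - t0 > ARM_WINDOW_SEC:
                return f"{cmd} REJECTED: not armed or arm window expired"
            del _armed[cmd]
        return f"{cmd} transmitted"

    print(arm("SEPARATE"))
    print(execute("SEPARATE"))
    print(execute("DEPLOY_ARRAY"))     # rejected: never armed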

7. Contingency Planning.

Are contingency plans adequate?

Does the ascent and orbital procedures documentation include planning for contingencies? Have all critical failure modes been identified and included in the contingency planning? Do the procedures include failure mode signatures and the correct response? Do the contingencies include both on-orbit vehicle and ground anomalies?

8. Lines of Control/Authority.

Who has overall operations responsibility? How is this delegated? Who has operational control?

What are the specific responsibilities of each organization? What is the operations organization? What are the lines of control/authority, and who has responsibility?

How are operational constraints established? Who can authorize deviations from operational constraints? Are compatibility tests performed among all organizations?

Who has final authority to establish the activity plan/pass plan? Is this plan coordinated and provided to all concerned organizations?

Are handoff procedures between stations adequate?

Who performs discrepancy correction and verification? What is the approval chain? Does the program office have final approval?

Is a formal agreement available which delineates overall operational authority and responsibility for on-orbit operations, vehicle health, and anomaly resolution?

Does the program office have final authority to authorize deviations, to establish constraints, and to authorize on-orbit anomaly investigations?

C. Program Unique Ground Stations.

Note: Items presented in the preceding section should be reviewed for application to program unique ground stations. This section provides supplementary items. Using Section VII.B together with this section will help ensure completeness.

1. Documentation

What is the status of program office delivery of required documentation?

a. Test plan

b. Test procedures

c. Calibration/limit data

d. Crypto information

e. Other vehicle peculiar data

What is the status of operational ground station documentation?

a. Command plans

b. Detailed test procedures

c. Success/fail criteria for each test

d. Verified operational procedures and manuals

2. Support

What is the status of command hardware and software?

Have real time telemetry processing software and offline support programs been validated?

Is planned test/operational team support adequate? Are facilities adequate?

Have ground station readiness meetings been conducted? Are there any outstanding action items?

3. Test

Is planning/preparation for category II testing adequate? Has adequate communication/organization been accomplished? Is scheduled support for category II testing sufficient (including integration of AFSCF tests with operational ground station tests, and the identification of possible conflicts between testing and operational requirements)?

Have equipment and procedures to be used in communicating with the AFSCF and other external agencies been verified/tested?

What is the status of nominal ephemeris?

Has a detailed timeline been established for all tests? Has the test timeline been coordinated with the AFSCF?