3. FSTD Objective

3.1 PURPOSE

Quality metrics are quantitative parameters used for periodic assessment and are crucial for monitoring the efficacy and functionality of FSTDs. They provide a systematic approach to maintaining high standards of device performance and training quality, adapting to evolving requirements and enhancing operational efficiency.

3.2 DATA COLLECTION

To ensure precise and comprehensive analysis and quality evaluation, specific information related to the operation of Flight Simulation Training Devices is meticulously recorded using various methods. These records may be maintained as hard copies, in electronic form, or as a combination of both.

1) Training Log sheet

The training log sheet shall be used as an independent mechanism to receive comments, feedback and a log of discrepancies, along with a record of the training session and FSTD configuration details. Any discrepancy recorded by the user shall be processed in accordance with the relevant section of this manual.

  1. User Data
    1. STD Identification Number
    2. Type of Training
    3. User organisation
    4. Every crew member's name, distinguishing between Crew and Instructor
    5. Scheduled time or booking time for the session, as received from ACAT scheduling
    6. Actual session start and end times, as defined by crew/Engineering personnel
    7. Utilisation time breakup
    8. Any comments written in the log sheet (as required).
  2. User rating on Device performance
    1. 1 = Unsatisfactory
    2. 2 = Poor
    3. 3 = Acceptable
    4. 4 = Good
    5. 5 = Excellent
    Note: Any unfilled ratings will not be considered for analysis.
  3. Technical Data
    1. FSTD down time
    2. Device Failure Time
    3. Lost Training Time
    4. Number of interrupts during a planned training session
    5. Unique Discrepancy Number
    6. Any log on action taken and work done with identification and date
    7. Device Configuration status (SCOC/PFC)
    8. Engineering personnel signoff & comment (if required) with their identification.

2) Engineering documents

  1. Engineering documents related to FSTD
  2. Engineering documents and records are used to gather information such as personnel ID, date and remarks. The main documents include, but are not limited to:
    1. PMT & QTG work book
    2. QTG runs validation
    3. Software configuration book
    4. FSTD modification log
    5. FSB & logistics
  3. FSTD Discrepancy specific
    1. Unique Discrepancy Number
    2. Description of the Discrepancy and date raised
    3. Reported and raised person identification
    4. Categorization and priority
    5. Systems and action taken / work done including date and personnel identification
    6. Logistics details and SPR number (If applicable)
    7. Discrepancy closure details along with personnel identification
    8. MMI, Impact on Training & DL logs of all FSTD devices
  4. Personnel Documents
    1. Qualification & Experience details
    2. Certifications & Training details
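
For illustration only, the log sheet fields described above could be captured as a structured record for electronic storage; the following is a minimal sketch in Python, where the class name, field names and example values are hypothetical and not part of this manual.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

# Hypothetical record mirroring the Training Log sheet fields in 3.2;
# names are illustrative, not prescribed by the manual.
@dataclass
class TrainingLogEntry:
    std_id: str                         # STD Identification Number
    training_type: str                  # Type of Training
    user_organisation: str
    crew: List[str]                     # each name tagged Crew or Instructor
    scheduled_start: datetime           # booking time from ACAT scheduling
    actual_start: datetime              # as defined by crew/Engineering personnel
    actual_end: datetime
    device_rating: Optional[int] = None # 1-5 scale; None = left unfilled
    comments: str = ""

    def is_rated(self) -> bool:
        # Unfilled ratings are not considered for analysis (3.2, Note)
        return self.device_rating is not None

entry = TrainingLogEntry(
    std_id="FSTD-01", training_type="Recurrent", user_organisation="ACAT",
    crew=["A (Crew)", "B (Crew)", "C (Instructor)"],
    scheduled_start=datetime(2024, 1, 5, 8, 0),
    actual_start=datetime(2024, 1, 5, 8, 10),
    actual_end=datetime(2024, 1, 5, 12, 0),
)
print(entry.is_rated())  # False: rating left unfilled, excluded from analysis
```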

3.3 DATA ANALYSIS

  1. Planned Training Day (PTD) = 24 hours in a day - Planned Support Time
  2. Planned Support Time is the summation of the below time categories:
    1. Planned Configuration Time
    2. Planned Engineering Time
    3. Planned Regulatory Time
    4. Planned Out of Service
  3. Actual Training Day (ATD) = 24 hours in a day - Actual Support Time - STD Down Time
  4. Actual Support Time is the summation of the below time categories:
    1. Actual Configuration Time
    2. Actual Engineering Time
    3. Actual Regulatory Time
    4. Actual Out of Service
  5. Training Availability = ((PTD - LTT) / PTD) × 100
    1. Expressed as a percentage of PTD as a function of Lost Training Time (LTT).
    2. Training Availability is from user perspective.
    3. Example:
      1. Planned Training Day (PTD) = 18 hours
      2. Lost Training Time (LTT) = 4 hours
      3. Training Availability = ((18 - 4) / 18) × 100 => 77.77 %
  6. Device Availability = ((PTD - FSTD Down Time) / PTD) × 100
    1. Expressed as a percentage of PTD as a function of FSTD Down Time.
    2. Device Availability is from Engineering perspective. This metric takes into account all events that could affect the availability.
    3. Example:
      1. Planned Training Day (PTD) = 18 hours
      2. FSTD down time = 2 hours
      3. Device Availability = ((18 - 2) / 18) × 100 => 88.88 %
  7. Device Reliability = ((PTD - DFT) / PTD) × 100
    1. Expressed as a percentage of PTD as a function of Device Failure Time (DFT)
    2. Device Reliability is a metric that takes into account device specific events
    3. Example:
      1. Planned Training Day (PTD) = 18 hours
      2. Device Failure Time (DFT) = 1 hour
      3. Device Reliability = ((18 - 1) / 18) × 100 => 94.44 %
  8. FSTD Utilisation = ((Actual Training Time + Actual Other Time) / PTD) × 100
    1. Expressed as a percentage of PTD as a function of FSTD usage
    2. This could be greater than 100%, thus implying more use of the FSTD than what was planned.
  9. Average Discrepancy Turn Around Time = (Sum of total discrepancy open times / Total number of discrepancies)
    1. Discrepancy open time = DR closed date - DR raised date
    2. Expressed as a number of days, time, etc.
    3. Normally an average across a time period, device or fleet.
    4. This metric can be expressed several different ways depending on application.
    5. Priority level may be considered when utilizing this metric for evaluation.
    6. Example:

       DR    Open Time (days)
       1     20
       2     64
       3     180

       ADTAT = (20 + 64 + 180) / 3 => 88 days/discrepancy
  10. Number of interrupts = Count of suspension of training events in a given time period.
    1. Expressed as a number of interrupts per day, week, month.
    2. Can be evaluated per device or across a fleet.
  11. Number of Discrepancies = Count of discrepancies recorded against a FSTD
    1. Expressed as an average number of discrepancies per day, week, month.
    2. Can be evaluated per device or across a fleet.
    3. Can be categorized by priority.
  12. Issue Ageing = Length of time each simulator issue has been unresolved
    1. Expressed as the number of issues open by different periods of time (0-30 days, 31-90 days, 91-180 days, 181-365 days, over 1 year).
    2. May be displayed graphically, showing trending over 12 months.
  13. Average Hours Between Interrupts = ((Actual Training Time + Actual Other Time) / Number of Interrupts)
    1. Expressed as an average number of hours, per day, week, month, etc.
    2. Can be evaluated per device or across a fleet.
  14. Average Quality Rating = (Sum of training session quality ratings / Number of training session quality ratings)
    1. Expressed as an average quality rating per day, week, month, etc.
    2. Can be evaluated per device or across a fleet.
    Note: Points 4, 5, 6, 8, 11 and 12 are critical measurements and shall be calculated on a monthly basis; the other points are optional.
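
For illustration only, the metric definitions above can be expressed as simple calculations; the following Python sketch uses hypothetical function names and reproduces the worked examples from this section.

```python
# Illustrative implementation of the 3.3 quality metrics; function names
# are assumptions for this sketch, not terms defined by the manual.

def pct(numerator: float, ptd: float) -> float:
    """Express a time quantity as a percentage of the Planned Training Day."""
    return (numerator / ptd) * 100

def training_availability(ptd: float, ltt: float) -> float:
    # Training Availability = ((PTD - LTT) / PTD) x 100 (user perspective)
    return pct(ptd - ltt, ptd)

def device_availability(ptd: float, downtime: float) -> float:
    # Device Availability = ((PTD - FSTD Down Time) / PTD) x 100 (engineering perspective)
    return pct(ptd - downtime, ptd)

def device_reliability(ptd: float, dft: float) -> float:
    # Device Reliability = ((PTD - DFT) / PTD) x 100 (device-specific events only)
    return pct(ptd - dft, ptd)

def fstd_utilisation(actual_training: float, actual_other: float, ptd: float) -> float:
    # May exceed 100%, implying more use of the FSTD than was planned
    return pct(actual_training + actual_other, ptd)

def adtat(open_times_days: list) -> float:
    # Average Discrepancy Turn Around Time: open time = closed date - raised date
    return sum(open_times_days) / len(open_times_days)

# Worked examples from this section (PTD = 18 hours)
print(round(training_availability(18, 4), 2))   # 77.78
print(round(device_availability(18, 2), 2))     # 88.89
print(round(device_reliability(18, 1), 2))      # 94.44
print(adtat([20, 64, 180]))                     # 88.0 days/discrepancy
```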

3.4 GOALS AND OBJECTIVE

The primary objective for each FSTD is to serve as a dependable and efficient training resource with minimal disruptions. The goals set forth are designed to uphold and enhance the operational quality of each FSTD, while providing feedback for continuous improvement.

  1. Our specific operational performance goals are:
    1. Achieving an Operational Efficiency Rate (reliability) of at least 98%.
    2. Keeping the number of open Training DRs below 10 per device.
    3. Maintaining an Average Customer Satisfaction Score of at least 4.
  2. Response to underperformance:
    1. For 2 consecutive months below target: the Technical Manager (TM) or an assigned delegate shall devise and implement a plan to address the shortfall in performance.
    2. For 3 consecutive months below target: the TM shall escalate the issue to the Accountable Manager, who shall deliberate on the situation, determine the necessary corrective actions, and document the decision taken in the DTS (refer to section 4.2).