Performance Assessment Guide


MODULE 2

Guide For Developing Performance Measures


TABLE OF CONTENTS
Introduction
Objectives
Accessing GDPM
Organizing Performance Measures and Results
Developing Performance Measures
Entering Data
Viewing Results
GDPM Utilities

Appendix A: Performance Measures
Appendix B: Definitions of Statistical Terms


INTRODUCTION

The Guide for Developing Performance Measures (GDPM) offers an easy way to track organizational performance over time. The software provides sample performance measures that others have found useful. You can use the sample measures as an idea source for measures you create for your organization.

You can use GDPM to:



OBJECTIVES

The GDPM User's Guide will show you how to:


Before collecting real data, walk through the manual and follow the numbered steps as practice. The practice steps create a sample set of performance measures that illustrates various aspects of the software. The guided practice should take about 1 hour to complete.


ACCESSING GDPM

Access GDPM

1. From the DoD Performance Assessment Guide's initial screen, choose Module 2, Guide for Developing Performance Measures.

2. The GDPM Main Menu appears.

Note: If you have entered this module after using Module 1 (Quality and Productivity Self-Assessment Guide), the organization and work unit (at bottom left of the screen) will reflect the names you used for Module 1. These names can be easily changed using the Change Org/Work Unit/Bus Area option in the Utilities menu. If you enter this module first, the organization and work unit area will be blank.

The Main Menu

The Main Menu consists of three elements:


ORGANIZING PERFORMANCE MEASURES AND RESULTS

GDPM provides a convenient way for you to store your sets of performance measures and results. This is done by setting up directories within the Survey Directory on your computer. For practice purposes, you will be setting up 2 new directories and creating 1 set of performance measures.

Setting Up Directories

1. On the Main Menu (from Utilities), select Change Directory.
2. The Disk Drive Specification window appears:

Enter the hard disk name:
Enter the directory name:

Press <Enter> to proceed, or <Esc> to exit...


3. Type in a directory name, then press <Enter> and complete the entries as specified below.
4. A prompt appears reminding you that the directory does not exist and asks if you would like to create it. Choose Y.
5. The Organization Name and Work Unit Name prompt appears.
6. Type the organization and work unit for which performance measures are to be developed, then press <Enter>.
7. The business area list appears. For practice purposes, choose Base Support.
8. Practice by creating another directory. Choose None from the business area list.
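If it helps to picture what Change Directory produces on disk, the sketch below creates a comparable layout by hand. It is an illustration only: the root path and directory names are hypothetical, and GDPM's actual on-disk format is not documented here.

    import os

    # Hypothetical survey root; substitute the hard disk and directory
    # names you entered in the Disk Drive Specification window.
    SURVEY_DIR = r"C:\GDPM\SURVEY"

    # One subdirectory per organization/work unit combination, mirroring
    # the two practice directories created in steps 1-8 above.
    for name in ("PRACTICE1", "PRACTICE2"):
        path = os.path.join(SURVEY_DIR, name)
        os.makedirs(path, exist_ok=True)  # create it if it does not exist (step 4)
        print("created", path)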


DEVELOPING PERFORMANCE MEASURES

You can develop performance measures using the sample measures for your business area as a guide. To begin developing performance measures, you must first access the directory in which the measures are to be stored.


NOTE: For complete definitions and examples of performance measures, see Appendix A. Try to create a balanced set of performance measures (at least 1 measure representing each of the 6 categories) for your organization.

Selecting Directories

1. On the GDPM Main Menu (from Utilities), select Change Directory.
2. Type in the hard drive and directory name for the first directory you created earlier, then press <Enter>.
3. The organization name, work unit name, and business area (Base Support) are displayed in the lower left corner of the GDPM Main Menu screen.

Selecting and Creating Performance Measures

Each business area contains a set of standard performance measures that others have found useful. Categories, other than those labeled "None" and "Other", contain additional measures directly applicable to that type of business area. For practice purposes, Base Support was chosen as the business area for the first directory.

View the Sample Performance Measures

1. On the GDPM Main Menu (from Performance Measures), select Data Entry. The list of performance measures appears.
2. Use the arrow keys or the PgUp/PgDn keys to scroll through the list and review the measures.

Editing the Performance Measures

Now that you've reviewed the performance measures, GDPM makes it easy to choose the measures you'd like to retain and to add others.

Create a set of performance measures using the following input.

1. The display column allows you to select the measures you want to keep. By default, none of the items are selected for display. To select items for display, use the <F2> key or press <Enter> in the display field for the following items:

Performance Category     Numerator Name                                      Denominator Name                        Goal   Display
Direct Outcomes          Milestones Completed                                Milestones Scheduled                    1.00   *
Effectiveness Measures   Customer Satisfaction Rating*                                                               4.00   *
Effectiveness Measures   # Maintenance & Repair Projects Requiring Rework    # Total Maintenance & Repair Projects   .02    *
Efficiency Measures      Production Cost Overrun                             $ Total Production Costs                .01    *
Input Measures           $ Cost of Maintenance & Repair Projects*                                                           *
Output Measures          # Maintenance & Repair Projects/Jobs*                                                              *


* Not all measures require a numerator and denominator. In these cases, use the numerator field to record the measure.


2. The goal column allows you to establish a goal against which your performance measurement data will be judged. For some measures, goals make sense. For example, if you have been collecting data for some period of time, you may have established goals you'd like to reach. In this case, you may record the goal in this column. For other measures, there may not be a goal. Record the goals listed in the table above for each measure.
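To make the numerator, denominator, and goal columns concrete, here is a minimal sketch of how a measure's value relates to its goal. It illustrates the arithmetic only; the function name is hypothetical, and GDPM performs this calculation internally.

    def measure_value(numerator, denominator=None):
        """Return the measure's value: a ratio, or the bare numerator
        for measures that have no denominator (see the * footnote)."""
        if denominator is None:
            return numerator              # e.g., Customer Satisfaction Rating
        return numerator / denominator    # e.g., Milestones Completed / Scheduled

    # Sample milestones entry (entered later in this module): 25 of 30.
    value = measure_value(25.00, 30.00)   # -> 0.83
    print(f"value = {value:.2f} vs. goal 1.00")
    # Whether higher or lower is better depends on the measure: the rework
    # measure (goal .02) improves downward, milestones (goal 1.00) upward.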

Adding New Performance Measures

To add performance measures to the list, press the <F3> key. You will be prompted to select the performance measure category, and then to enter the numerator name, denominator name, and goal. It does not matter where in the list you press <F3>; the measure will be placed with the others in its category after you finish the entry. The new measure will appear in yellow, allowing you to edit or delete it at a later date.

1. Add the following performance measure.

Performance Category     Numerator Name                  Denominator Name       Goal   Display
Impact Measures          # Mission Objectives Attained   # Mission Objectives   1.00   *

2. Press <F10> to save your changes and return to the Main Menu.


ENTERING DATA

1. Enter data by selecting Data Entry.
2. Use the arrow keys to select the second performance measure.
3. Press <F7> to bring up the data entry box.
4. Press <F3> and enter the data shown below. You need to press <F3> after you enter each line of data.
5. Press <F10> to save your entries.
6. Practice by entering data for the customer satisfaction performance measure shown in the Customer Satisfaction Data table below.

Milestones Data

Date       Numerator   Denominator
01/01/92   25.00       30.00
04/01/92   24.00       30.00
07/01/92   28.00       34.00
10/01/92   26.00       31.00
01/01/93   28.00       31.00
04/01/93   31.00       35.00
07/01/93   28.00       30.00
10/01/93   27.00       30.00
01/01/94   29.00       31.00
04/01/94   31.00       31.00
07/01/94   29.00       30.00
10/01/94   32.00       33.00

Customer Satisfaction Data

These data represent results obtained from a survey where 5.0 represents the highest score. Because this measure has no denominator, only the numerator field is used.

Date       Numerator
01/01/94   3.01
01/08/94   3.08
01/15/94   3.05
01/22/94   3.17
01/29/94   3.11
02/03/94   3.25
02/10/94   3.31
02/17/94   3.38
02/24/94   3.41
03/01/94   3.82
03/08/94   3.79
03/15/94   3.92
03/22/94   4.02
03/29/94   4.02


VIEWING RESULTS

Graphs

Once you enter data, you can view results in a variety of ways. The mean of the results across an interval of data is plotted on the graph.

Mean Score: The average score across all results for each data entry within the selected interval.
Standard Deviation: An expression of variability about the mean.
Control Limits: Calculated from the overall mean across all results and the associated standard deviation, the control limits are located at three standard deviations above and below the mean. Any data point above the upper limit indicates a statistically significant improvement in results; any point below the lower limit indicates a statistically significant decline.
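As an illustration of how these three statistics work together, the sketch below applies them to the milestones data entered earlier. It is not GDPM's actual code; it simply computes each quarterly result, the overall mean, and control limits at three standard deviations.

    from statistics import mean, stdev

    # Milestones data from the Entering Data section: (numerator, denominator).
    entries = [(25, 30), (24, 30), (28, 34), (26, 31), (28, 31), (31, 35),
               (28, 30), (27, 30), (29, 31), (31, 31), (29, 30), (32, 33)]

    results = [n / d for n, d in entries]   # one result per quarterly entry
    center = mean(results)                  # overall mean across all results
    sigma = stdev(results)                  # sample standard deviation (n - 1)
    upper, lower = center + 3 * sigma, center - 3 * sigma

    for i, r in enumerate(results, start=1):
        if r > upper:
            status = "significant improvement"
        elif r < lower:
            status = "significant decline"
        else:
            status = "within control limits"
        print(f"entry {i}: {r:.2f} ({status})")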


NOTE: See Appendix B for definitions of statistical terms.


1. From the Main Menu, select Analysis.
2. From Analysis, select Graphs.
3. Select Milestones Completed (under Numerator Name) and press <Enter>.
4. You may view results weekly, monthly, quarterly, or yearly. Choose Quarterly and press <Enter>.
5. You may select any date associated with a data entry as a starting point. Choose the first date and press <Enter>.
6. Select View on Screen and press <Enter>.
7. Press <Esc> to return to the Main Menu.
8. Practice by viewing the customer satisfaction graph (choose Monthly).

Reports

You can use the reports option to produce a printout of your data.

1. From the Main Menu, select Analysis.
2. From Analysis, select Reports.
3. Select Screen to view the report on screen. Select Printer to send the report to the printer.


NOTE: Choose Options from the Analysis menu to turn off the display of goals and control limits.

GDPM UTILITIES

Re-Index Files

Use this option after program execution is interrupted (e.g., a power failure).

Change Directory

Use this option to create or change directories. Additional directories are helpful for storing data separately for organization/work unit combinations.

Change Org/Work Unit/Bus Area

Select to change the organization name, work unit name, or business area.

Change System Date

Use this option to set a new system date.

Change Printer

Use this option to specify the printer on which reports and graphs are to be printed.

Change Printer Port

Use this option to specify the port on your computer to which the selected printer (above) is connected.


APPENDIX A: PERFORMANCE MEASURES

In order to successfully improve organizational performance, senior executives must clearly define:

Strategic Plan

Strategic planning is a disciplined effort to produce fundamental decisions and actions that shape and guide what an organization is, what it does, and why it does it. It requires broad-scale information gathering, an exploration of alternatives, and an emphasis on the future implications of present decisions. The Government Performance and Results Act (1993) requires Federal agencies to develop strategic plans prior to FY 1998. Each strategic plan is to include a mission statement, general performance goals and objectives, a description of how the goals will be achieved, and an indication of how program evaluations were used in establishing or revising the goals.

The mission of an organization describes its reason for existence. Mission statements are broad and expected to remain in effect for an extended period of time. The statement should be clear and concise, summarizing what the organization does by law and presenting the main purposes of all its major functions and operations. Mission statements are often accompanied by an overarching statement of philosophy or strategic purpose intended to convey a vision for the future and an awareness of challenges from a top-level perspective.

Performance goals are sometimes referred to as objectives; the two terms may be used interchangeably. Performance goals or objectives elaborate on the mission statement and constitute a specific set of policy, programmatic, or management objectives for the programs and operations covered in the strategic plan. They must be expressed in a manner that allows a future assessment of whether a goal has been achieved.

A description of how the goals will be achieved must also be included in a strategic plan. The description should include a schedule for significant actions, a description of resources required to achieve the goals, and the identification of key external factors that might affect achievement of the goals.

Program evaluation is an analytic process used to measure program outcomes. The results of program evaluations are used to help establish and/or revise goals.

Annual Performance Plan

Annual performance plans are derived from the strategic plan and set specific performance goals for a fiscal year. The Government Performance and Results Act (1993) requires Federal agencies to prepare annual performance plans for each program activity beginning with FY 1999. A performance plan must include performance goals for the fiscal year and the performance indicators which will be used to assess whether performance goals have been attained. Beginning in FY 2000, agencies will be required to submit annual reports to Congress on the actual performance achieved compared to the goals expressed in the performance plan.

Performance goals are to be expressed in objective, quantifiable, and measurable form. OMB may authorize agencies to use an alternative form if the goals cannot be expressed in a quantifiable manner. However, the alternative form must contain statements which are sufficiently precise to allow an accurate, independent determination of whether the program's actual performance meets the criteria in the statement.

Performance Measures

Performance measures are used to measure goal attainment. They provide a basis for comparing actual program results with established performance goals. A range of measures should be developed for each program.

There are three major categories of performance measures defined by the Comptroller of the Department of Defense, Directorate for Business Management: factor of production measures, outcome measures, and work process measures. It is usually desirable for all three categories to be represented in an organization's set of measures to achieve balanced measurement across the mission.

1. Factor of Production Measures

These measures typically describe the resource-to-output relationship, each focusing on a different aspect of that relationship. There are four distinct types of factor of production measures:

Input Measures

These measures describe the resources, time, and staff utilized for a program. Financial resources can be identified as current dollars or discounted, based on economic or accounting practices. Non-financial resources can be described with proxy measures. Input measures are not expressed as ratios; they often serve as one element of other measures, such as the efficiency and effectiveness measures described later.

Examples:

Output Measures

These measures describe goods or services produced. Outputs can be characterized by a discrete definition of the service or by a proxy measure that represents the product. Highly dissimilar products can be rolled up into a single metric. As with input measures, output measures are not expressed as ratios; they often serve as one element of other measures, such as the efficiency and effectiveness measures described later.

Examples:

Efficiency Measures

Efficiency is the measure of the relationship of outputs to inputs and is usually expressed as a ratio. These measures can be expressed in terms of actual expenditure of resources as compared to expected expenditure of resources. They can also be expressed as the expenditure of resources for a given output.

Examples:
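(The original list of examples is not reproduced here. As a stand-in, the sketch below computes two efficiency ratios in the spirit of this category, using made-up figures; the overrun ratio mirrors the Production Cost Overrun measure from the practice table in Module 2.)

    # Hypothetical figures; only the ratio forms are the point here.
    actual_cost = 105_000.00    # resources actually expended
    expected_cost = 100_000.00  # resources expected to be expended
    jobs_completed = 42         # output over the same period

    # Actual vs. expected expenditure of resources, cf. the
    # Production Cost Overrun measure (goal .01) above.
    overrun_rate = (actual_cost - expected_cost) / actual_cost

    # Expenditure of resources for a given output.
    cost_per_job = actual_cost / jobs_completed

    print(f"overrun rate = {overrun_rate:.3f}, cost per job = ${cost_per_job:,.2f}")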

Effectiveness Measures

These are measures of output conformance to specified characteristics.

Examples:

2. Outcome Measures

Outcome measures describe the results achieved by the product that is produced with given characteristics. There are two types of outcome measures:

Direct Outcome Measures

These measures assess the effect of output against given objective standards.

Examples:

Impact Measures

Impact measures describe how the outcome of a program affects strategic organization or mission objectives.

Example: (1) Impact of materiel readiness on execution of Operation Desert Storm

3. Work Process Measures

Work process measures are indicators of the way work gets done in producing the output at a given level of resources, efficiency, and effectiveness. They are a direct by-product of the work technique and do not measure the attributes of the final product per se. They are typically processes, or tools, used to evaluate and help improve work processes. Some of the common measures include:

Cost-effectiveness

This is an evaluation process to assess changes in the relationship of resources to: (1) an outcome; (2) an efficiency rate; or (3) an effectiveness rate.

Examples:

Efficiency Reviews or Management Analysis

These are basic industrial engineering approaches:

Flow Charting

This is a work process evaluation tool that graphically maps the activities that make up a process, illustrating how different elements and tasks fit together. Flow charts can be used to describe a business process and its physical flow over space and time, and thus provide insight into opportunities for improved efficiency and effectiveness.

Cost-based Activity Modeling System (IDEF Modeling)

These are widely used techniques to capture the processes and structure of information in an organization. The analysis charts work processes, identifies and eliminates non-value added tasks, identifies costs of remaining tasks, and focuses on process changes to accomplish needed tasks at reduced costs.

Theory of Constraints

This is a work engineering process that focuses on throughput, inventory reduction, and turnaround time as key work process indicators.

Macro Management Analysis Reviews

These reviews typically use economic analysis techniques rather than industrial engineering approaches to assess alternative organizations or work processes. An example of this type of review is a consolidation study, which merges organizations or work processes to achieve economies of scale.

Benchmarking

Benchmarking systematically compares performance measures such as efficiency, effectiveness, or outcomes of an organization against similar measures from other internal or external organizations. This analysis helps uncover best practices that can be adopted for improvement.

Statistical Process Control

This is a measurement method used for assessing the performance of processes. Statistical evaluation techniques identify whether a process is in control (i.e., produces results in a predictable manner) and assess the impact of method changes on process results.

Status of Conditions Indicators

These are measures such as accident rates, absenteeism, and turnover rates. They are indirect measures of the quality of work life that impact efficiency and effectiveness.

Organizational Assessment Tools

These are measurement tools designed to identify organization culture and management style, work force and management knowledge, application of quality and process improvement tools, and organizational outcomes. This form of measurement is increasingly used in leading private sector corporations to assess the potential for innovation, employee empowerment, and internal and external customer relations and satisfaction.

Innovation

These measures are typically qualitative indicators of the rate of introduction of managerial or technological innovations into the work process. Innovation can be used as a barometer of organizational health and openness to new methods and processes.

Quality

These indicators identify the costs of waste caused by work processes or methods that produce less-than-standard output. They include such indicators as defect rates, rework, and "cost of quality" measures such as the total time, personnel, and materials engaged in inspection, rework, scrap, etc.


For more information, see:


Cohen, S. & Brand, R. (1993). Total quality management in government: A practical guide for the real world. San Francisco, CA: Jossey-Bass.

Fitz-enz, J. (1993). Benchmarking staff performance. San Francisco, CA: Jossey-Bass.

Rosen, E. D. (1993). Improving public sector productivity. San Francisco, CA: Jossey-Bass.

Nanus, B. (1992). Visionary leadership. San Francisco, CA: Jossey-Bass.

Sims, H. P. & Lorenzi, P. (1992). The new leadership paradigm. Newbury Park, CA: Sage Publications.

Srivastva, S. & Fry, R. E. (1992). Executive and organizational continuity: Managing the paradoxes of stability and change. San Francisco, CA: Jossey-Bass.

Boardman, T. J. & Boardman, E. (1990). Don't touch that funnel! Quality Progress, Dec., 65-69.


For more information about performance measurement, see:

Camp, R. C. (1993). Benchmarking: The search for industry best practices that lead to superior performance. Norcross, GA: Industrial Engineering and Management Press.

Shell, R. L. (1993). Work measurement: Principles and practices. Norcross, GA: Industrial Engineering and Management Press.

Sink, D. S. & Tuttle, T. C. (1993). Planning and measurement in your organization of the future. Norcross, GA: Industrial Engineering and Management Press.

Lynch, R. & Cross, K. (1990). Measure up! Yardsticks for continuous improvement. Colchester, VT: Basil Blackwell, Inc.

Rummler, G. A. & Brache, A. P. (1990). Improving performance: How to manage the white space on the organizational chart. San Francisco, CA: Jossey-Bass, ch. 4.

Brinkerhoff, R. O. & Dressler, D. E. (1989). Productivity measurement: A guide for managers and evaluators. Newbury Park, CA: Sage Publications.

Whiting, E. (1986). A guide to business performance measurements. New York, NY: Macmillan Press.

Nash, M. (1983). Managing organizational performance. San Francisco, CA: Jossey-Bass.


For more information about goal setting, see:

Odiorne, G. S. (1990). The human side of management: Management by integration and self-control. New York, NY: Lexington Books.

Locke, E. A. & Latham, G. P. (1984). Goal setting: A motivational technique that works! Englewood Cliffs, NJ: Prentice-Hall.


APPENDIX B: DEFINITIONS OF STATISTICAL TERMS

Mean

The mean (arithmetic average) is the sum of the values of all results within an interval divided by the number of results.

Standard Deviation

A standard deviation is the square root of the variance of a distribution. The variance is computed by summing the squared differences from the mean for all results and then dividing by 1 less than the number of results. If all results are identical, there is no variance and the standard deviation would be 0. The larger the standard deviation, the more spread out, or inconsistent, the results.
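A short sketch of both computations exactly as defined above, written out in plain Python rather than with library calls so the (n - 1) divisor is visible:

    def mean(results):
        """Sum of all results in the interval divided by their number."""
        return sum(results) / len(results)

    def standard_deviation(results):
        """Square root of the variance, using the (n - 1) divisor above."""
        m = mean(results)
        variance = sum((x - m) ** 2 for x in results) / (len(results) - 1)
        return variance ** 0.5

    scores = [3.01, 3.08, 3.05, 3.17, 3.11]      # first five satisfaction ratings
    print(round(mean(scores), 3))                # 3.084
    print(round(standard_deviation(scores), 3))  # spread about the mean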

Control Limits

Control limits are calculated using the standard deviation and mean.

By looking at the means of the intervals of data and examining whether they lie above or below the control limits, you can determine whether you are significantly improving (or falling behind) on the performance measure.