| Component | Level 1 | Level 2 | Level 3 | Level 4 | Level 5 |
| --- | --- | --- | --- | --- | --- |
| 1. Strategic Direction | The agency has some goals, objectives, and performance measures, but measures are developed in isolation from goals. | The agency is developing a collaborative process to set goals and objectives, with linkages between agency functions and broader societal concerns still being clarified. | The agency has established a collaborative goal-setting process, and there is a common understanding of how measures will be added, modified, and used to track progress. | The agency has a well-established, collaborative goal and objective setting process that is ongoing, with goals and objectives integrated into planning, programming, and employee evaluations. | The agency periodically revisits and refines goals and objectives to reflect internal and external stakeholder needs. |
| 2. Target Setting | The agency has little information and/or understanding of baseline performance or historical trends. | The agency is collaboratively developing a methodology to understand baselines and set targets within agreed-upon performance areas. | The agency has established a well-understood, evidence-based, and data-driven methodology for observing baseline performance, establishing trend lines, and calculating targets (see the illustrative sketch following this table). | The agency has had established targets, an accompanying methodology, and a business process in place for more than one cycle. | The agency has had targets, an established business process, and a documented technical methodology in place for multiple cycles. |
| 3. Performance-Based Planning | Strategy identification is not driven by goals, performance measures, or current performance. | The agency is defining a data-driven process for understanding current and future performance to identify and develop strategies. | The agency has documented a process for strategy development. | The agency has an established process for collaborative strategy identification and prioritization that is based on goals and analysis of current and projected performance trends. | The agency has a collaborative, data-driven process to identify strategies and evaluate tradeoffs across scenarios. |
| 4. Performance-Based Programming | Programming decisions are not linked to goals or planning documents, and lack transparency. | The agency is developing a performance-based programming methodology and process that will enable project selection to reflect agency goals, priorities determined in planning documents, funding constraints, risk factors, and relative needs across performance areas. | The agency has established and documented a performance-based methodology and process to develop the S/TIP and agency budget that considers risk factors and tradeoffs between performance areas. | The agency has established and documented a performance-based methodology and process to program projects within and across performance areas to maximize achievement of multiple goals. | The agency has applied performance-based programming across multiple performance areas for multiple cycles, and a feedback loop exists between performance monitoring and programming. |
| 5. Monitoring and Adjustment | The agency does not have a well-defined output or outcome performance monitoring process. | The agency is developing a plan for system and program/project monitoring tied to the strategic direction, including definition of output and outcome measures, frequency, data sources, external influencing factors, and users. | The agency has defined outcome and program/project output measures linked to the achievement of strategic goals and objectives. | The agency is monitoring outcomes and project outputs and using this information to adjust planning and programming decisions. | The use of performance information to assess program/project effectiveness in driving outcomes is common practice. |
| 6. Reporting and Communication | The agency does not have a system for standard performance reporting, but rather uses ad-hoc reports generated in response to internal requests as they arise. | The agency is defining requirements for internal reports to ensure consistency, alignment with strategic direction, and provision of actionable information. | The agency has internal performance reports that align with the strategic direction and provide actionable information, but the reports have not yet been through a cycle of testing and refinement. | The agency has refined internal performance reports to provide performance information tailored to different audiences, and these reports are actively used at multiple levels of the agency to evaluate progress toward achieving strategic goals. | The agency regularly refines performance reports based on feedback. |
| A. Organization and Culture | The agency’s performance management is the result of heroic efforts by champions, with limited support from leadership. | The agency’s TPM champion(s) have initiated discussions with leadership about the value of performance management. | The agency’s leadership and senior management recognize the value of performance management and are beginning to drive related activities. | Agency leadership is committed to performance management as a core process, and this commitment is demonstrated by words and actions. | The agency sustains performance management across changes in leadership and staff. |
| B. External Collaboration and Coordination | The agency coordinates with partner agencies as needed to meet state and federal requirements, but there is little collaboration with agency partners to set performance targets, define goals and objectives, program projects, or implement joint monitoring. | The agency is meeting with its partners to discuss goals, objectives, and performance measures; identify opportunities for collaboration on strategy development and implementation; and identify opportunities for joint performance monitoring and reporting. | The agency has worked with its partners to identify common goals and objectives and has developed a plan for collaboration on setting performance targets, developing strategies, and programming projects. | The agency has established productive working relationships with its partners on performance-based planning and programming and has collaboratively monitored and reported performance for at least one cycle. | The agency has collaborated on cross-jurisdictional and/or multi-modal projects to achieve desired outcomes, building on potential synergies and avoiding conflicts. |
| C. Data Management | The agency has not established metrics for performance data quality; issues are identified and addressed on an ad-hoc basis rather than through a systematic process. | The agency is developing data quality metrics, quality assurance, and validation methods. | The agency has data quality metrics and standards for performance data sets and has assembled a plan for making needed improvements. | The agency routinely follows standard data quality practices; data is integrated, accessible, and convenient, and can be analyzed in a variety of ways with little additional development effort. | The agency regularly reviews opportunities to improve data integration and consistency. |
| D. Data Usability and Analysis Capabilities | Limited tabular performance reports may exist, but the agency does not have ad-hoc query or drill-down/roll-up capabilities. | The agency is developing exploration and visualization capabilities. | Tools and technologies for providing the data views needed by various users are in place, and requirements have been documented. | Reports and tools meet the needs of different users, enabling employees to easily visualize performance and identify explanatory factors. | The agency routinely improves exploration and visualization capabilities and refines supplemental data based on user feedback. |
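
The Target Setting component refers to observing baseline performance, establishing trend lines, and calculating targets. As an illustration only, the sketch below fits a simple least-squares trend to hypothetical historical values of a condition measure and extrapolates it to a future year to produce a draft target; the measure, years, and values are assumed for the example and are not drawn from the table above.

```python
# Minimal sketch (not from the source): fit a linear trend to historical
# observations of a hypothetical measure and extrapolate it to a future
# year as a draft, data-driven target.
from statistics import mean


def propose_target(years, values, target_year):
    """Least-squares linear trend through (year, value) pairs,
    extrapolated to target_year as a draft target for discussion."""
    x_bar, y_bar = mean(years), mean(values)
    slope = (sum((x - x_bar) * (y - y_bar) for x, y in zip(years, values))
             / sum((x - x_bar) ** 2 for x in years))
    intercept = y_bar - slope * x_bar
    return intercept + slope * target_year


# Hypothetical history: percent of bridge deck area in good condition.
history_years = [2019, 2020, 2021, 2022, 2023]
history_values = [47.2, 47.9, 48.5, 49.4, 50.1]

draft = propose_target(history_years, history_values, target_year=2026)
print(f"Trend-based draft target for 2026: {draft:.1f}% in good condition")
```

A projection like this is only a starting point; as the higher maturity levels in the table note, targets are set within agreed-upon performance areas, in collaboration with partners, and alongside consideration of funding constraints and risk before they are adopted.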