Welcome to the TPM Standard Maturity Assessment.

This tool allows transportation agencies to assess their TPM capabilities and identify areas where they can take steps to improve these capabilities. The assessment is based on 10 components:

  • Strategic Direction: The establishment of an agency’s focus through well-defined goals and objectives, enabling assessment of the agency’s progress toward meeting goals and objectives by specifying a set of aligned performance measures. The Strategic Direction is the foundation upon which all transportation performance management rests.
  • Target Setting: The use of baseline data, information on possible strategies, resource constraints, and forecasting tools to collaboratively establish a quantifiable level of performance the agency wants to achieve within a specific time frame. Targets make the link between investment decisions and performance expectations transparent across all stakeholders.
  • Performance-Based Planning: The use of agency goals and objectives and performance trends to drive the development of strategies and priorities in the long-range transportation plan and other performance-based plans and processes. The resulting planning documents become the blueprint for how an agency intends to achieve its desired performance outcomes.
  • Performance-Based Programming: The use of strategies and priorities to guide the allocation of resources to projects that are selected to achieve goals, objectives, and targets. Performance-based programming establishes clear linkages between investments made and expected performance outputs and outcomes.
  • Monitoring and Adjustment: A set of processes used to track and evaluate actions taken and outcomes achieved, thereby establishing a feedback loop to refine planning, programming, and target setting decisions. It involves using performance data to obtain key insights into the effectiveness of decisions and identifying where adjustments need to be made in order to improve performance.
  • Reporting and Communication: The products, techniques, and processes used to communicate performance information to different audiences for maximum impact. Reporting is an important element for increasing accountability and transparency to external stakeholders and for explaining internally how transportation performance management is driving a data-driven approach to decision making.
  • Organization and Culture: Institutionalization of a transportation performance management culture within the organization, as evidenced by leadership support, employee buy-in, and embedded organizational structures and processes that support transportation performance management.
  • External Collaboration: Established processes to collaborate and coordinate with agency partners and stakeholders on planning/visioning, target setting, programming, data sharing, and reporting. External collaboration allows agencies to leverage partner resources and capabilities, as well as increase understanding of how activities impact and are impacted by external factors.
  • Data Management: A set of coordinated activities for maximizing the value of data to an organization. It includes data collection, creation, processing, storage, backup, organization, documentation, protection, integration, dissemination, archiving, and disposal. Well-managed data are essential for a robust TPM practice.
  • Data Usability and Analysis: Existence of useful and valuable data sets and analysis capabilities available in accessible, convenient forms to support transportation performance management. While many agencies have a wealth of data, such data are often disorganized, or cannot be analyzed effectively to produce useful information to support target setting, decision making, monitoring, or other TPM practices.

Each of these components is further divided into sub-components. For example, Strategic Direction is divided into the following sub-components:

  • Sub-Component 1.1. Goals and Objectives
  • Sub-Component 1.2. Performance Measures

For each sub-component, there are five possible maturity levels:

  1. Initial
  2. Developing
  3. Defined
  4. Functioning
  5. Sustained

The specific criteria for a particular maturity level depend on the sub-component.

Once you have finished providing input, an overall maturity level is assessed for each component and for the TPM program as a whole. Based on the assessed maturity level, assessment results include links to relevant sections of the TPM Guidebook for more information on how to advance TPM practice.
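The exact aggregation rule the tool uses is not spelled out here. Purely as an illustrative sketch, the roll-up could look like the following (the rounded-average rule and the example ratings are assumptions for illustration, not the tool's actual method):

```python
# Illustrative only: the tool's actual aggregation rule is not specified
# here, so a simple rounded average is assumed at both roll-up steps.

LEVEL_NAMES = {1: "Initial", 2: "Developing", 3: "Defined",
               4: "Functioning", 5: "Sustained"}

def component_level(subcomponent_levels):
    """Roll sub-component ratings (1-5) up to one component level."""
    return round(sum(subcomponent_levels) / len(subcomponent_levels))

def overall_level(component_levels):
    """Roll component levels up to an overall TPM program level."""
    return round(sum(component_levels) / len(component_levels))

# Example: Strategic Direction with 1.1 rated Defined and 1.2 Functioning.
level = component_level([3, 4])
print(LEVEL_NAMES[level])  # prints "Functioning"
```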

1: Strategic Direction

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
1.1 Goals and Objectives
Agency goals/objectives developed in isolation and without an understanding of agency and regional priorities. Goals/objectives do not provide a clear strategic direction for the agency and are not used in decision-making.
A collaborative process to establish goals/objectives under development. Baseline performance information being used to create context about key issues. Linkages between agency core functions and broader societal concerns being clarified.
The agency has agreed on a process for goal/objective development including roles of internal staff, external stakeholder involvement, and steps to formally adopt goals/objectives. A strategy outlined to support tracking of goal/objective progress.
Collaborative process to define goals/objectives is well established. There is substantive discussion about the relative priority of different goals. There is ongoing coordination of goals/objectives across planning documents. Goals/objectives are integrated into planning, programming and employee performance evaluations.
Goals/objectives periodically refined to better reflect agency's priorities, communicate transportation's role in broader societal concerns, and reflect new challenges and risk factors. Goals/objectives are part of the agency culture. Employees understand how their actions support the achievement of goals.
1.2 Performance Measures
Performance measures are developed in isolation and without consideration for agency goals/objectives. Supporting data may not exist for all measures and limited documentation of measure calculations exists.
The agency is defining a process to identify measures that enable an agency to track progress towards strategic goals/objectives. Initial documents outlining measure calculations and data sources being developed.
A high-level set of performance measures has been defined and formally approved. Agency has established governance process for modifying or adding measures. Agency has documented methodology for measure calculation and identification of data sources. There is a common understanding of how measures will be used in business processes.
Measures are relied on to track progress towards agency goals/objectives and provide key information that can be used in decision-making. The collection of measures has been refined to provide valuable information to a range of internal users (e.g., system-wide measures for executives and corridor specific for managers). Measures provide the foundation for external communication with stakeholders.
The agency is using a hierarchy of performance measures to support decision-making and to explain results. Measures are periodically refined as new data sources become available, agency priorities change, and stakeholder feedback prompts adjustments.
2: Target Setting

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
2.1 Technical Methodology
Targets do not exist, or are defined without an understanding of baseline performance, trends, the connection between strategies and results, or analysis of what is feasible to achieve.
Evidence-based and data-driven methodology for calculating targets under development. Baseline data being assembled and reviewed. Analysis of historical trends initiated.
Evidence-based and data-driven methodology for calculating targets has been developed and documented. Target parameters defined (format, geography/scope, and time horizon). External and internal influencing factors have been identified and documented (e.g., resource constraints, capital project commitments, demographic trends) and are being considered in future performance forecasts. Influencing factors also used to assess risk. Analytical tools support target calculations.
Agency has used an evidence-based and data-driven methodology for calculating targets for more than one cycle. Agency has the capability to analyze actual performance against target, diagnose reasons for variances, and make adjustments accordingly. Target calculations take into account cross performance area tradeoffs and changes in agency goals and priorities.
Agency has applied the evidence-based and data-driven methodology for calculating targets for multiple cycles. Approach is being continually refined based on experience to account for a range of situations; data and tools are periodically enhanced to better support the target setting business process.
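As one hedged illustration of what an evidence-based target calculation might involve, the sketch below fits a linear trend to historical performance and adjusts the forecast with a stretch factor. The measure, the numbers, and the adjustment are invented for this example and do not represent any agency's actual methodology:

```python
# Hypothetical sketch: set a target by extrapolating a historical trend,
# then applying an ambition ("stretch") adjustment. All values invented.

def linear_trend_forecast(history, horizon):
    """Least-squares linear extrapolation of a yearly performance series."""
    n = len(history)
    x_mean = (n - 1) / 2
    y_mean = sum(history) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(history))
    slope /= sum((x - x_mean) ** 2 for x in range(n))
    return y_mean + slope * ((n - 1 + horizon) - x_mean)

# Percent of pavement in good condition over five years; 2-year horizon.
history = [58.0, 59.5, 60.2, 61.8, 62.4]
forecast = linear_trend_forecast(history, horizon=2)
target = forecast + 0.5  # stretch adjustment reflecting planned investment
```

A real methodology would also fold in the influencing factors described above (resource constraints, project commitments, demographic trends) rather than trend alone.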
2.2 Business Process
Target setting process is ad-hoc and not coordinated across performance areas.
Coordinated and collaborative target setting process under development. Staff responsibilities and roles being clarified. Purpose of the target both internally and externally being established. Benchmarking information being gathered.
Target setting process including roles and responsibilities and steps to formally approve targets has been established and documented. The information to be considered in target setting is documented. A regular schedule has been set allowing for as-needed flexibility for adjustment. There is a common understanding of how different targets will be used.
Agency has undergone target setting process for more than one cycle. Collaboration and coordination across performance areas is well established. Process is an integral component of planning, budgeting, staffing, and employee performance evaluations. A key trigger for the re-assessment of targets is performance results.
Agency has applied target setting process for multiple cycles. Support of target setting and understanding of its value spread across the agency. Approach is being continually refined as needed to address organizational structure changes.
3: Performance-Based Planning

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
3.1 Strategy Identification
Strategy identification is not driven by established goals and performance measures or an understanding of current performance and risk factors. There is limited dialogue among stakeholders in developing a full range of strategies.
The agency is defining a data-driven process for understanding current and future performance and identifying and evaluating strategies to achieve performance goals. The agency is working with a range of internal and external stakeholders to define this process.
The agency has identified and documented a process for strategy development including scope, data sources, analysis requirements, stakeholder involvement, roles and responsibilities and buy-in. The agency has identified exogenous factors that may impact strategy effectiveness (e.g. VMT, population, fuel prices).
Strategy identification is driven by goals and based on analysis and review of current and projected performance trends. Strategies are evaluated on contribution across multiple goals and agency priorities. Future projections incorporate consideration of risks. Strategies are formulated with an understanding of the broad agency-wide or regional context. The agency conducts scenario analysis to evaluate impacts of exogenous factors (e.g. VMT, population, fuel prices) on strategy effectiveness.
A collaborative, data-driven process to identify strategies is well-established. Strategy identification is informed by analysis of the effectiveness of alternative strategies (before/after analysis) with respect to established goals. Risk assessments are regularly conducted, resulting in mitigation strategies that reduce the likelihood of negative events occurring that will impact overall performance.
3.2 Investment Prioritization
The agency lacks information necessary to prioritize strategies based on need, risk, resource constraints and effectiveness towards achieving agency goals and policies.
The agency is defining methods and processes for analyzing tradeoffs based on established agency goals and priorities, relative need across performance areas and alternate investment scenarios. The agency is defining methods and processes for prioritizing strategies based on relative effectiveness to achieve desired outcomes or mitigate risk. Staff responsibilities are being clarified.
The agency has defined methodologies and processes for analyzing tradeoffs and prioritizing strategies based on established goals and priorities. Staff roles and responsibilities have been established. The agency has the necessary data and analysis capabilities in place to analyze tradeoffs across alternate investment scenarios, understand likelihood and consequences of different risks, and evaluate effectiveness of specific strategies. Staff understand how the results of tradeoff analysis and strategy prioritization will be used.
Agency has applied tradeoff analysis and strategy prioritization process for more than one cycle. Prioritization takes into account synergistic effects across strategies, and the effect of a strategy on multiple goals. Long-range transportation plan and other performance-based plans have been developed based on analysis results, and have sufficient clarity to guide programming. Relevant stakeholders actively participate in the process of analyzing alternate investment scenarios and prioritizing strategies.
Agency has applied tradeoff analysis and strategy prioritization for multiple cycles. Process and methodology are periodically refined to provide a better understanding of relative needs and of strategy effectiveness in mitigating risk and achieving the desired balance across goals. Coordination across planning documents and processes regularly assessed. Linkages between planning documents and programs are well-established.
4: Performance-Based Programming

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
4.1 Programming Within Performance Areas
Programming decisions are not linked to agency's goals or supporting planning documents. Resource allocation is based on formulas or historical allocations without analysis of performance impacts. Programming process lacks transparency.
A performance-based programming methodology and process under development. Project selection methodology being established that reflects agency goals, priorities determined in planning documents, funding constraints and risk factors. Staff responsibilities and collaboration opportunities with external stakeholders are being clarified.
The agency has established and documented a performance-based methodology and process to develop the S/TIP and agency budget. Output targets are set to track program delivery and anticipated results. The agency considers risk factors in programming and budgeting decisions. Staff responsibilities and external collaboration processes clarified.
Programming decisions are driven by a clear linkage between investments made and expected performance outputs and outcomes. External stakeholders understand programming decisions being made by the agency.
Agency has applied performance-based programming for multiple cycles. A strong feedback loop exists between performance monitoring and programming. Process and methodology are periodically refined to provide a better understanding of program effectiveness in mitigating risk and achieving the desired outcomes across goals.
4.2 Programming Across Performance Areas
Programming decisions are not linked to agency's goals or supporting planning documents. Resource allocation decisions do not take tradeoffs across program areas into account. Resource allocation is based on formulas or historical allocations without analysis of performance impacts. Programming process lacks transparency.
A performance-based programming methodology and process under development that considers tradeoffs across performance areas. Project selection methodology being established that reflects agency goals, relative needs across program areas, priorities determined in planning documents, funding constraints and risk factors. Staff responsibilities and opportunities to collaborate with external stakeholders are being clarified.
The agency has established and documented a performance-based methodology and process to program projects across performance areas. Output targets are set to evaluate program delivery and anticipated results. The agency identifies risk factors in programming and budgeting decisions. Staff responsibilities and external collaboration processes clarified.
Programming decisions are driven by a clear linkage between investments made and expected performance outputs and outcomes. External stakeholders understand programming decisions being made by the agency. Investments reflect tradeoffs across performance areas and seek to maximize achievement of multiple goals. Programming decisions reflect established priorities across multiple planning documents (e.g., SHSP, CMAQ, State Freight Plan).
Agency has applied performance-based programming across performance areas for multiple cycles. A strong feedback loop exists between performance monitoring and programming. Process and methodology are periodically refined to provide a better understanding of program effectiveness in mitigating risk and achieving the desired outcomes across goals. Collaboration internally and with external stakeholders has resulted in coordinated multi-modal and/or cross-jurisdictional projects to achieve desired outcomes.
5: Monitoring and Adjustment

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
5.1 System Level Monitoring
There is not yet a well-defined process for monitoring system-level performance outcomes. Agency staff are not able to obtain information concerning the impact of external factors on performance outcomes. Limited linkage exists between resource allocation decisions, performance results, and strategic goals.
A plan for outcome monitoring is under development including definition of users, outcome measures, frequency, data sources, and tracking influence of external factors. Plan development includes discussion of how resource allocation, performance results, and strategic goals can be linked.
Outcome measures have been defined and tied to the achievement of strategic goals and objectives. The agency has identified a process for making program adjustments as needed based on performance outcomes.
Managers are monitoring outcomes and assessing progress towards expected results and strategic goals. They are employing the identified adjustment process to identify issues, diagnose problems, and make appropriate adjustments to improve performance outcomes. Staff regularly review information about external factors to gain an understanding of how these factors impact desired outcomes.
Processes for monitoring system outcomes and external factors are periodically refined to improve the agency’s ability to anticipate and adjust to slower than anticipated progress towards strategic goals. Information on resource allocation and performance results is used to improve assumptions used for setting the strategic direction, target setting, planning, and programming.
5.2 Program/Project Level Monitoring
A well-defined process for monitoring program/project outputs and resulting impacts on outcomes is not yet in place. Agency staff are not able to see trends in outputs over time. Limited information exists for assessing the impacts of the program/projects on performance.
An agency-wide approach for monitoring program/project outputs and their impacts on outcomes is being defined. The data foundation for output monitoring and analysis of program/project impacts on performance results is being built.
An approach to monitoring program/project outputs and their resulting impacts on performance outcomes has been developed and documented. Agency is able to review performance trends and access before and after project-level information to determine program/project impacts on outcomes.
Managers are monitoring current performance and conducting analysis of outputs to inform mid-stream adjustments to ensure progress towards desired outcomes.
The use of performance information in assessing program/project effectiveness in driving outcomes is common practice. Output performance monitoring serves as a key feedback loop back to planning and programming and the recalibration of goals, objectives, targets and measures.
6: Reporting and Communication

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
6.1 Internal Reporting and Communication
Performance reporting needs have not been systematically identified. Internal reports are generated ad hoc in response to requests as they arise; standard data sources and review processes have not been established.
Requirements for internal reports are being defined to ensure alignment with strategic direction and provision of actionable information; standards are being developed to ensure consistency; early prototypes or basic reporting capabilities may exist.
Internal performance reports are in place that are in alignment with the strategic direction and provide actionable information. Reports have not yet been through a cycle of testing and refinement. Pockets within the agency have begun using reports.
Internal performance reports have been refined to provide performance information tailored to different audiences (e.g. executive views with drill-downs). Reports are actively used at multiple levels of the agency to evaluate progress toward achieving strategic goals.
The performance reports are regularly refined based on feedback, and are considered to be an essential driver for management decision-making. Reporting is automated and staff support for performance reporting is focused on value-added improvement.
6.2 External Reporting & Communication
External reporting is ad-hoc; standard data sources and review process have not been established.
An effort is underway to develop an external reporting strategy. Requirements for external reports are being defined to ensure effective communication of agency goals, resource allocation decisions, actions and results achieved. Standards are being developed to ensure consistency. Early prototypes or basic reports may exist.
External performance reports are in place that are in alignment with the strategic direction and communicate agency goals, resource allocation decisions, actions and results. Managers understand how reports will be used to communicate with external stakeholders. Reports have not yet been through a cycle of testing and refinement.
External performance reports have been refined to effectively communicate the agency's goals, resource allocation decisions, actions, and results to different external audiences.
Stakeholders rely on the agency's reports to stay informed about actions and progress. Performance reports are widely recognized as critical to continued agency accountability and transparency. Reporting is automated and utilizes maps and graphs to maximize effectiveness of communication. Agency staff periodically refine reports based on stakeholder feedback.
A: Organization and Culture

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
A.1 Leadership Team Support
Performance management is the result of heroic efforts by individual champions. Limited to no support from senior management and executives.
Champion(s) have initiated discussions with senior management and executives about the value of performance management and their role in providing necessary leadership for success.
Agency leadership and senior management recognize the value of performance management. They are beginning to drive activities related to performance management - e.g. prioritizing goals, setting targets, using data to monitor and respond to performance results.
Agency leadership is committed to performance management as a core process and this commitment is demonstrated by what they say and do both internally and externally. Agency strategic goals figure prominently in internal and external communications.
Performance management sustained across changes in leadership.
A.2 Roles and Responsibilities
Implementation of performance management practices is sporadic across the agency. The agency lacks clarity about who is responsible for the various performance management roles.
An effort to identify and define roles and responsibilities necessary to establish a performance management framework is underway. Agency has begun to review its organizational structure to identify potential adjustments.
Roles and responsibilities for performance management have been defined, but not yet fully implemented. Recommended organizational structure changes have been outlined.
Staff at multiple levels of the organization understand their roles with respect to performance management practices. A clear organizational structure for performance management is in place - with sufficient budget and staffing.
Performance management practices have been sustained through changes in staff. Roles and responsibilities are periodically refined to reflect the adoption of new performance management practices.
A.3 Training and Workforce Capacity
Limited to no TPM training exists. Agency lacks an understanding of what core competencies are necessary to carry out performance management. Existing employee skill levels and gaps are not well understood.
Agency has begun to identify core competencies required for performance management. A skill assessment and training strategy are being developed to enable employees to strengthen the necessary capabilities.
Agency has identified core competencies for performance management. A suite of training resources has been developed.
Employees have the appropriate skills and training needed for the roles and responsibilities assigned to them.
Agency encourages a learning climate by periodically organizing seminars on performance management and by participating in TPM conferences, peer exchanges, webinars, and other forms of technology transfer. Training periodically refined to reflect developments and innovation in TPM.
A.4 Management Process Integration
There is no process to incorporate performance information into management practices. Performance data viewed as punitive rather than constructive.
Linkage being established between work group and employee management practices and agency strategic goals. Approach to performance reviews refined to create a clearer connection between individual actions and the agency's goals and targets.
Clear understanding by work group managers and their employees about the linkage between their activities and achieving strategic goals. Annual employee performance plans and evaluations include sufficient specificity to reinforce this linkage.
Performance information regularly included in management discussions at multiple levels. Expectations for employees are regularly set through measures and targets.
Integration of performance data into the agency's management functions has been applied for multiple cycles. Managers and staff have internalized the role of performance management to promote accountability and drive results.
B: External Collaboration

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
B.1 Planning and Programming
The agency coordinates with partner agencies as needed to meet state and federal requirements. However, there is little or no substantive collaboration with agency partners to set performance targets, define goals and objectives and program projects to meet the established targets.
The agency meets with its partners to discuss goals, objectives and performance measures and identify opportunities for collaboration on strategy development and implementation.
The agency has worked with its partners to identify common goals and objectives, and develop a plan for collaboration on setting performance targets, developing strategies, and project programming.
The agency has established productive working relationships with its partners on performance based planning and programming. A collaborative process for establishing and updating performance targets is in place. Development of the long-range transportation plan and other performance-based plans (TAMP, SHSP, Freight) incorporate opportunities for substantive discussion among partners of strategies that address multiple perspectives and needs.
There is proactive communication across partner agencies to capitalize on potential synergies and avoid conflicts. Collaboration has resulted in coordinated cross-jurisdictional and/or multi-modal projects to achieve desired outcomes.
B.2 Monitoring and Adjustment
There is little or no collaboration with partner agencies on performance monitoring and reporting. Each agency is implementing its own monitoring and reporting systems independently.
The agency is working with its partners to identify opportunities for collaboration on performance monitoring and reporting.
The agency has initiated one or more efforts to join forces and collaborate on data collection, data management, and/or reporting. These efforts may include collection of consistent infrastructure condition data across jurisdictions, integration of data collected by multiple agencies, development of multi-modal views, or development of network views including information for both state and locally managed facilities.
Collaborative performance monitoring and reporting systems are in place and have been used for at least one reporting cycle.
Collaborative performance monitoring and reporting systems are well established and have been used for multiple reporting cycles. Initial systems are periodically refined and expanded in recognition of their value-added.
C: Data Management

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
C.1 Data Quality
Performance data quality issues that are identified are addressed on an ad-hoc basis rather than through a systematic process. Metrics for data quality have not been established and quality expectations have not been discussed.
Data quality metrics and minimum acceptable standards are being defined for performance data sets - considering accuracy, completeness, consistency, and timeliness. Data quality assurance and validation methods are being developed.
Data quality metrics and standards have been defined and documented for performance data sets. Baseline data quality has been measured and a plan for data quality improvement is in place. Business rules for assessing data validity have been defined. Standard protocols for data quality assurance and certification or acceptance have been established.
Users of performance data have an understanding of their level of accuracy, completeness, consistency and timeliness. Standard data quality assurance processes are routinely followed. New data collected are reviewed against historical data to identify unexpected changes warranting investigation. Data collection personnel are trained and certified based on demonstrated understanding of standard practices.
Data quality assurance processes are regularly improved based on experience and user feedback. Data validation and cleansing tools are used to identify and address missing or invalid values. Business rules for data validity are built in to data entry and collection applications.
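To make the "review against historical data" and "business rules built in to data entry" ideas above concrete, here is one minimal, hypothetical validation rule. The measure and the 25% tolerance are assumptions chosen for illustration, not a standard:

```python
# Illustrative sketch: flag a newly collected value that diverges sharply
# from its historical baseline, so it can be investigated before acceptance.

def flag_unexpected_change(new_value, historical_values, tolerance=0.25):
    """Return True when a new observation differs from the historical
    mean by more than the given fractional tolerance."""
    baseline = sum(historical_values) / len(historical_values)
    if baseline == 0:
        return new_value != 0
    return abs(new_value - baseline) / abs(baseline) > tolerance

# A roughness reading of 180 against a history averaging 100 warrants review.
print(flag_unexpected_change(180, [95, 100, 105]))  # prints True
```

In practice, a rule like this would run inside the data entry or collection application itself, as the Level 5 description suggests.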
C.2 Data Accessibility
Limited standard performance reports may exist, but variations on these reports are only available by special request. There are no ad-hoc query or drill down/roll up capabilities. Reports are developed within organizational silos and are not integrated.
The agency is exploring needs and opportunities for improving access to integrated agency data in usable forms. Pilot initiatives may be underway.
Requirements have been documented for performance data reports and views needed by different classes of users. Tools and technologies for providing these data views are in place. Reporting and query tools are available for general use within the agency and do not require specialized training.
Reports, dashboards, map interfaces and query tools are available and have been configured to provide convenient access to data by different users. Agency employees can view and analyze a variety of information such as pavement condition, bridge condition, crashes, traffic, programmed projects, and completed projects. Performance data can be viewed in a variety of ways: summary statistics, bar and pie charts, trend lines and map views. Agency employees have the ability to easily visualize trend information on performance together with explanatory factors such as VMT.
Data are shared outside of the agency via a statewide or national GIS portal or clearinghouse, or via a service or API. The agency routinely improves data access and usability based on feedback from users and monitoring of the latest technology developments.
C.3 Data Standardization and Integration
Agency data sets are not consistent with national standards. Agency data sets cannot be integrated due to a lack of standardization in location referencing or other linking or coded fields. The agency has not defined a strategy for combining different data sets in order to provide a consistent view of performance.
Efforts are underway to identify key integration points across data sets and define standards that will enable integration. There is some understanding of user needs for trend analysis and creating snapshot views of data for analysis and reporting, but these needs have not been explored systematically or comprehensively. There is experience with integrating data to create a snapshot in time view, but no repeatable procedures for this have been defined.
Data standards have been defined for location referencing, temporal referencing and common link fields. The agency has defined units for aggregation for different types of data. The agency has identified single authoritative source systems for key performance data elements to provide a single source of truth.
The agency is able to integrate performance data sets based on location and time. The agency has procedures in place to ensure that externally procured data sets and applications adhere to established data standards and can be linked to existing data. The agency has one or more skilled individuals with responsibility for data architecture and integration across systems. Data user requirements for trend analysis, snapshots and other uses of temporal information can be met without major changes to data structures or substantial new development effort.
Opportunities to improve data integration and consistency with other agency data sets are reviewed on an annual basis.
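In practice, integrating performance data sets on location and time comes down to joining records on shared reference keys, which is what the location and temporal referencing standards above make possible. A minimal sketch, assuming both data sets already use the same route/milepost referencing (the route IDs, measures, and values are hypothetical):

```python
# Joining two performance data sets on a shared location-time key.
# Route IDs, mileposts, and measure values are hypothetical.

condition = [
    {"route": "SR-9", "milepost": 12.4, "year": 2023, "iri": 110},
    {"route": "SR-9", "milepost": 13.1, "year": 2023, "iri": 185},
]
traffic = [
    {"route": "SR-9", "milepost": 12.4, "year": 2023, "aadt": 24000},
    {"route": "SR-9", "milepost": 13.1, "year": 2023, "aadt": 31000},
]

def key(rec):
    """Location-time key: standardized route ID, milepost, and year."""
    return (rec["route"], rec["milepost"], rec["year"])

traffic_by_key = {key(r): r for r in traffic}
merged = [
    {**c, "aadt": traffic_by_key[key(c)]["aadt"]}
    for c in condition
    if key(c) in traffic_by_key
]
print(merged[0])
# {'route': 'SR-9', 'milepost': 12.4, 'year': 2023, 'iri': 110, 'aadt': 24000}
```

Without the standardized key, this join fails silently; that is the practical cost of the Level 1 condition, in which data sets cannot be integrated due to inconsistent location referencing.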
C.4 Data Collection Efficiency
Data to support performance management are collected within organizational silos to meet specific needs. There have not been any efforts to coordinate performance data collection across business units or identify where data sources can be repurposed to meet multiple needs. Available data from sources outside of the agency may be used but there are no data sharing arrangements or agreements in place.
Opportunities for coordinating data collection and for sharing data across agency business units have been discussed, but no action has been taken on this yet. Partnerships with other public and private sector organizations are being explored to share data on an ongoing basis.
Opportunities for maximizing use of existing data across the agency have been identified. Necessary system changes are being implemented to support transitions to new data sources. Data sharing agreements are in place with external entities.
Data collection is coordinated across business units. Data are collected once and used for multiple purposes within the agency. Data sharing agreements with external entities have been sustained for two or more years and through multiple data update cycles.
New internal and external agency partnerships on data collection and management are actively sought in order to achieve economies of scale and make best use of limited staff and budget.
C.5 Data Governance
Ownership and accountability for different performance data sets are unclear. Roles for ensuring data quality, value, and appropriate use have not been defined or established. Data improvement needs are not systematically or regularly identified, and the process for making decisions about data improvements is ad-hoc and opportunistic.
A business lead or point person has been designated for each major performance data set, but the responsibilities of the role have not been spelled out. Data improvement needs are identified and communicated to management in an informal manner.
Role(s) have been designated to identify points of accountability for important agency performance data sets. Decision making authority has been defined for collection/acquisition of new data, discontinuation of current data collection, and significant changes to the content of existing data. Data improvement needs to support performance management have been systematically reviewed, assessed, and documented. A standard approach has been defined for establishing business rules for data updates and producing data definitions and metadata.
Business rules for data maintenance are being followed. Metadata is being populated as data are added or changed. Staff with responsibility for data stewardship and management play an active role in defining data improvements and periodically produce reports of progress to their managers. A regular process of data needs assessment is in place, and is used to drive budgeting decisions. Staff with responsibility for data stewardship and management have sufficient time, training and authority to carry out these responsibilities. There is a standard process in place to ensure continuity in data management practices through staff transitions.
Data governance and planning activities are viewed as valuable and necessary in the organization and would have a high probability of continuing through changes in executive leadership. Stewardship roles are periodically reviewed and refined to reflect new or changing data requirements and implementation of new data systems. A centralized approach to management of metadata and business rules has been implemented.
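The standard approach to data definitions and metadata described at Level 3 can be made concrete with a fixed metadata template that every data set must populate. A hypothetical sketch of one such record and a basic check on it (the data set, steward, and business rules shown are invented examples):

```python
# A hypothetical metadata record for one performance data set,
# capturing stewardship, authoritative source, and update rules.

pavement_metadata = {
    "dataset": "pavement_condition",
    "steward": "Pavement Management Unit",   # point of accountability
    "authoritative_source": "PMS database",  # single source of truth
    "update_cycle": "annual",
    "business_rules": [
        "IRI values must fall between 30 and 500 in/mi",
        "Every segment must carry a valid route ID and milepost range",
    ],
    "fields": {
        "iri": {"definition": "International Roughness Index",
                "units": "in/mi"},
        "segment_id": {"definition": "Standard route-milepost segment key"},
    },
}

def validate_metadata(record):
    """Return the required metadata elements that are missing or empty."""
    required = ("dataset", "steward", "authoritative_source", "update_cycle")
    return [k for k in required if not record.get(k)]

print(validate_metadata(pavement_metadata))  # [] -> nothing missing
```

Requiring such a record for every major data set, and checking it automatically, is one way to operationalize the centralized management of metadata and business rules described at Level 5.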
D: Data Usability and Analysis

For each row, click the description that best matches your capabilities.

Level 1 | Level 2 | Level 3 | Level 4 | Level 5
D.1 Performance Data Exploration and Visualization
Limited tabular performance reports may exist. There are no ad-hoc query or drill down/roll up capabilities.
The agency is exploring needs and opportunities for improving capabilities for data exploration and visualization. Pilot initiatives may be underway.
Requirements have been documented for performance data reports and views needed by different classes of users. Tools and technologies for providing these data views are in place.
Reports, dashboards, map interfaces and query tools are available and have been configured to meet needs of different users. Performance data can be viewed in a variety of ways: summary statistics, bar and pie charts, trend lines and map views. Agency employees have the ability to easily visualize trend information on performance together with explanatory factors such as VMT.
The agency routinely improves data exploration and visualization based on feedback from users.
D.2 Performance Diagnostics
Information is not readily available for identifying root causes of performance results.
The agency is identifying supplemental data needed to improve performance diagnostic capabilities. Potential data sources are being investigated, including those that help explain results achieved by a particular project or action and those that help explain system-level performance results.
The agency has identified available supplemental data needed to provide insight into root causes for project and system-level performance results. Performance reports have been modified to include these data or supplemental reports have been developed.
Agency staff regularly review supplemental data along with performance results and use these data to understand root causes at the project and system level.
Supplemental data are regularly refined and augmented based on feedback from users. The value of diagnostic information is continually being improved.
D.3 Predictive Capabilities
A methodology for predicting future performance has not been developed.
A methodology for predicting future performance is under development. Models and analytical tools are being developed or implemented.
Capabilities for predicting future performance under different scenarios are in place, but have not been fully tested.
Predictive capabilities are in place and have been utilized as part of performance-based planning and programming for at least one cycle. Predictive capabilities incorporate consideration of risk factors.
Scenario analysis has been applied through multiple planning and programming cycles. Agency managers and external stakeholders rely on predictions of future performance to set priorities and allocate resources.