Welcome to the TPM CMM Component C: Data Management – Maturity Assessment.

This tool allows transportation agencies to assess their TPM capabilities and identify steps they can take to improve them. The capability maturity model contains 10 components; this assessment covers the ninth, Component C: Data Management.

Component C. Data Management

Data Management encompasses a set of coordinated activities for maximizing the value of data to an organization. It includes data collection, creation, processing, storage, backup, organization, documentation, protection, integration, dissemination, archiving and disposal.

Data Management is divided into the following sub-components:

  • Sub-Component C.1. Data Quality
  • Sub-Component C.2. Data Accessibility
  • Sub-Component C.3. Data Standardization and Integration
  • Sub-Component C.4. Data Collection Efficiency
  • Sub-Component C.5. Data Governance

For each sub-component, there are five possible maturity levels:

  1. Initial
  2. Developing
  3. Defined
  4. Functioning
  5. Sustained

The specific criteria for a particular maturity level depend on the sub-component.

Based on the criteria provided for each level, the user is asked to rate how well the agency meets them using the following scale:

  • Totally Disagree
  • Somewhat Disagree
  • Somewhat Agree
  • Totally Agree

Once input is complete, an overall maturity level is assessed for the component. Based on the assessed maturity level, the assessment results include links to relevant sections of the TPM Guidebook with more information on how to advance TPM practice.
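The tool does not publish its scoring formula, but the mapping from ratings to an overall level can be illustrated with a minimal sketch. Everything below — the numeric rating weights, the "highest consecutive level met" rule, and the averaging — is an assumption for illustration, not the tool's actual algorithm.

```python
# Hypothetical sketch of how an overall maturity level might be derived
# from per-level ratings. The rating weights, level rule, and rounding
# below are assumptions; the tool's actual scoring is not specified here.

RATING_VALUES = {
    "Totally Disagree": 0.0,
    "Somewhat Disagree": 1 / 3,
    "Somewhat Agree": 2 / 3,
    "Totally Agree": 1.0,
}

LEVEL_NAMES = ["Initial", "Developing", "Defined", "Functioning", "Sustained"]

def sub_component_level(ratings_by_level: dict[int, str]) -> int:
    """Assumed rule: the highest consecutive maturity level (1-5) whose
    criteria the agency at least 'Somewhat Agrees' it meets."""
    level = 1
    for lvl in sorted(ratings_by_level):
        if RATING_VALUES[ratings_by_level[lvl]] >= RATING_VALUES["Somewhat Agree"]:
            level = lvl
        else:
            break
    return level

def overall_level(sub_levels: list[int]) -> int:
    """Assumed rule: average the sub-component levels, rounding down so
    the overall maturity level is not overstated."""
    return max(1, int(sum(sub_levels) / len(sub_levels)))

ratings = {1: "Totally Agree", 2: "Totally Agree", 3: "Somewhat Agree",
           4: "Somewhat Disagree", 5: "Totally Disagree"}
lvl = sub_component_level(ratings)
print(lvl, LEVEL_NAMES[lvl - 1])        # 3 Defined
print(overall_level([3, 2, 3, 4, 2]))   # 2
```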



C: Data Management

For each sub-component, click the level description that best matches your capabilities.

C.1 Data Quality

Level 1: Performance data quality issues that are identified are addressed on an ad-hoc basis rather than through a systematic process. Metrics for data quality have not been established and quality expectations have not been discussed.

Level 2: Data quality metrics and minimum acceptable standards are being defined for performance data sets, considering accuracy, completeness, consistency, and timeliness. Data quality assurance and validation methods are being developed.

Level 3: Data quality metrics and standards have been defined and documented for performance data sets. Baseline data quality has been measured and a plan for data quality improvement is in place. Business rules for assessing data validity have been defined. Standard protocols for data quality assurance and certification or acceptance have been established.

Level 4: Users of performance data have an understanding of their level of accuracy, completeness, consistency, and timeliness. Standard data quality assurance processes are routinely followed. New data collected are reviewed against historical data to identify unexpected changes warranting investigation. Data collection personnel are trained and certified based on demonstrated understanding of standard practices.

Level 5: Data quality assurance processes are regularly improved based on experience and user feedback. Data validation and cleansing tools are used to identify and address missing or invalid values. Business rules for data validity are built into data entry and collection applications.
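To make the business rules and validation practices described at Levels 3 through 5 concrete, here is a minimal sketch of rule-based quality checks. The field names, the IRI range, and the 25% change threshold are illustrative assumptions, not agency standards.

```python
# Illustrative sketch of rule-based data quality checks of the kind
# described at Levels 3-5. Field names, the valid IRI range, and the
# 25% change threshold are assumptions, not agency standards.

def check_record(rec: dict) -> list[str]:
    """Apply simple validity business rules to one pavement record."""
    issues = []
    # Completeness: required fields must be present and non-empty.
    for field in ("route_id", "milepost", "iri", "survey_date"):
        if rec.get(field) in (None, ""):
            issues.append(f"missing {field}")
    # Validity: IRI (roughness) must fall in a plausible range.
    iri = rec.get("iri")
    if iri is not None and not (20 <= iri <= 700):
        issues.append(f"iri {iri} outside plausible range")
    return issues

def flag_unexpected_change(current: float, historical: float,
                           threshold: float = 0.25) -> bool:
    """Screen new data against history: flag changes larger than the
    threshold for manual investigation (a Level 4 practice)."""
    if historical == 0:
        return True
    return abs(current - historical) / abs(historical) > threshold

record = {"route_id": "I-80", "milepost": 12.4,
          "iri": 95, "survey_date": "2023-06-01"}
print(check_record(record))                 # [] -> passes the rules
print(flag_unexpected_change(150.0, 95.0))  # True -> warrants review
```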
C.2 Data Accessibility

Level 1: Limited standard performance reports may exist, but variations on these reports are only available by special request. There are no ad-hoc query or drill down/roll up capabilities. Reports are developed within organizational silos and are not integrated.

Level 2: The agency is exploring needs and opportunities for improving access to integrated agency data in usable forms. Pilot initiatives may be underway.

Level 3: Requirements have been documented for performance data reports and views needed by different classes of users. Tools and technologies for providing these data views are in place. Reporting and query tools are available for general use within the agency and do not require specialized training.

Level 4: Reports, dashboards, map interfaces, and query tools are available and have been configured to provide convenient access to data by different users. Agency employees can view and analyze a variety of information such as pavement condition, bridge condition, crashes, traffic, programmed projects, and completed projects. Performance data can be viewed in a variety of ways: summary statistics, bar and pie charts, trend lines, and map views. Agency employees can easily visualize trend information on performance together with explanatory factors such as VMT.

Level 5: Data are shared outside of the agency via a statewide or national GIS portal or clearinghouse, or via a service or API. The agency routinely improves data access and usability based on feedback from users and monitoring of the latest technology developments.
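Level 5 mentions sharing data via a service or API. A minimal read-only endpoint might look like the sketch below; Flask is an assumed implementation choice, and the dataset and route are hypothetical placeholders.

```python
# Minimal read-only data service of the kind Level 5 describes.
# Flask is an assumed implementation choice; the dataset and endpoint
# are hypothetical placeholders for an agency's published data.
from flask import Flask, jsonify

app = Flask(__name__)

# Stand-in for an authoritative performance data source.
PAVEMENT_CONDITION = [
    {"route_id": "I-80", "year": 2023, "pct_good": 62.5},
    {"route_id": "I-80", "year": 2022, "pct_good": 60.1},
]

@app.route("/api/pavement-condition")
def pavement_condition():
    """Expose summary performance data to external users."""
    return jsonify(PAVEMENT_CONDITION)

if __name__ == "__main__":
    app.run(port=8080)
```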
C.3 Data Standardization and Integration

Level 1: Agency data sets are not consistent with national standards. Agency data sets cannot be integrated due to lack of standardization in location referencing or other link or coded fields. The agency has not defined a strategy for combining different data sets in order to provide a consistent, integrated view of performance.

Level 2: Efforts are underway to identify key integration points across data sets and define standards that will enable integration. There is some understanding of user needs for trend analysis and creating snapshot views of data for analysis and reporting, but these needs have not been explored systematically or comprehensively. There is experience with integrating data to create a snapshot-in-time view, but no repeatable procedures for this have been defined.

Level 3: Data standards have been defined for location referencing, temporal referencing, and common link fields. The agency has defined units for aggregation for different types of data. The agency has identified single authoritative source systems for key performance data elements to provide consistent values across reports and applications.

Level 4: The agency is able to integrate performance data sets based on location and time. The agency has procedures in place to ensure that externally procured data sets and applications adhere to established data standards and can be linked to existing data. The agency has one or more skilled individuals with responsibility for data architecture and integration across systems. Data user requirements for trend analysis, snapshots, and other uses of temporal information can be met without major changes to data structures or substantial new development effort.

Level 5: Opportunities to improve data integration and consistency with other agency data sets are reviewed on an annual basis.
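Level 4 describes integrating performance data sets on location and time. The sketch below illustrates the idea with pandas, joining two hypothetical data sets on standardized route, milepost, and year keys; real linear-referenced integration typically requires more careful handling of overlapping segments and temporal alignment.

```python
# Sketch of location- and time-based integration (Level 4) using
# pandas. Column names and the join keys are illustrative.
import pandas as pd

pavement = pd.DataFrame({
    "route_id": ["I-80", "I-80"],
    "milepost": [12.0, 13.0],
    "year": [2023, 2023],
    "iri": [95, 142],
})
traffic = pd.DataFrame({
    "route_id": ["I-80", "I-80"],
    "milepost": [12.0, 13.0],
    "year": [2023, 2023],
    "aadt": [41000, 39500],
})

# Standardized location (route_id, milepost) and temporal (year) keys
# are what make the two authoritative sources linkable.
combined = pavement.merge(traffic, on=["route_id", "milepost", "year"])
print(combined)
```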
C.4 Data Collection Efficiency

Level 1: Data to support performance management are collected within organizational silos to meet specific needs. There have not been any efforts to coordinate performance data collection across business units or identify where data sources can be repurposed to meet multiple needs. Available data from sources outside of the agency may be used, but there are no data sharing arrangements or agreements in place.

Level 2: Opportunities for coordinating data collection and for sharing data across agency business units have been discussed, but no action has been taken yet. Partnerships with other public and private sector organizations are being explored to share data on an ongoing basis.

Level 3: Opportunities for maximizing use of existing data across the agency have been identified. Necessary system changes are being implemented to support transitions to new data sources. Data sharing agreements are in place with external entities.

Level 4: Data collection is coordinated across business units. Data are collected once and used for multiple purposes within the agency. Data sharing agreements with external entities are sustained over 2+ years and through multiple data update cycles.

Level 5: New internal and external agency partnerships on data collection and management are actively sought in order to achieve economies of scale and make best use of limited staff and budget.
C.5 Data Governance

Level 1: Ownership and accountability for different performance data sets are unclear. Roles for ensuring data quality, value, and appropriate use have not been defined or established. Data improvement needs are not systematically or regularly identified, and the process for making decisions about data improvements is ad-hoc and opportunistic.

Level 2: A business lead or point person has been designated for each major performance data set, but the responsibilities of the role have not been spelled out. Data improvement needs are identified and communicated to management in an informal manner.

Level 3: Roles have been designated to identify points of accountability for important agency performance data sets. Decision-making authority has been defined for collection/acquisition of new data, discontinuation of current data collection, and significant changes to the content of existing data. Data improvement needs to support performance management have been systematically reviewed, assessed, and documented. A standard approach has been defined for establishing business rules for data updates and producing data definitions and metadata.

Level 4: Business rules for data maintenance are being followed. Metadata are populated as data are added or changed. Staff with responsibility for data stewardship and management play an active role in defining data improvements and periodically report progress to their managers. A regular process of data needs assessment is in place and is used to drive budgeting decisions. Staff with responsibility for data stewardship and management have sufficient time, training, and authority to carry out these responsibilities. There is a standard process in place to ensure continuity in data management practices through staff transitions.

Level 5: Data governance and planning activities are viewed as valuable and necessary in the organization and would have a high probability of continuing through changes in executive leadership. Stewardship roles are periodically reviewed and refined to reflect new or changing data requirements and implementation of new data systems. A centralized approach to management of metadata and business rules has been implemented.
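Levels 3 through 5 call for standard data definitions, metadata, and centralized business rules. One way to make that tangible is a standard metadata record per data set, sketched below; the fields and values are illustrative assumptions rather than a published standard.

```python
# Sketch of a standard metadata record supporting the stewardship and
# centralized metadata practices described at Levels 3-5. The fields
# chosen here are illustrative assumptions, not a published standard.
from dataclasses import dataclass, field

@dataclass
class DatasetMetadata:
    name: str
    steward: str                 # accountable business lead
    authoritative_system: str    # single authoritative source system
    update_frequency: str
    business_rules: list[str] = field(default_factory=list)

pavement_meta = DatasetMetadata(
    name="Pavement Condition",
    steward="Asset Management Division",
    authoritative_system="PMS",
    update_frequency="annual",
    business_rules=["IRI must be between 20 and 700",
                    "survey_date required for every record"],
)
print(pavement_meta.name, "-", pavement_meta.steward)
```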

