
Comprehensive Data Management System


This page describes the Comprehensive Data Management System (CDMS) tool for use with Hazus. The page details the four CDMS modules: the aggregate, site-specific, backward compatibility, and import modules. It is intended to inform Hazus users of the capabilities, requirements for use, and limitations of CDMS.

[Screenshot of the CDMS home screen]

The Comprehensive Data Management System (CDMS) is a complementary tool to Hazus that provides users with the capability to update and manage the statewide datasets currently used to support analysis in Hazus. CDMS functions as a single-user or shared desktop application.

Currently, Hazus users are required to undertake a large amount of manual effort to incorporate new data into the statewide datasets according to their pre-defined formats. To reduce this effort, CDMS will streamline and automate raw data processing, the conversion of external data sources into Hazus-compliant data, and the transfer of data into and out of the statewide datasets. Processing of site-specific data and of aggregate data at the census block and tract levels will be supported. All new data brought into the system will be validated.

Once data are imported into the statewide datasets, CDMS will allow users to query, sort, export, and print information. A backward compatibility utility will be in place for upgrading databases from previous versions of Hazus. CDMS is automatically downloaded when users download the Hazus software from the FEMA Flood Map Service Center (MSC).

CDMS Modules

CDMS comprises four modules: the aggregate, site-specific, backward compatibility, and import modules. The aggregate, site-specific, and backward compatibility modules work together. Raw data can be used to generate aggregated data, and, if the raw data include location information, site-specific data can be generated at the same time. Similarly, aggregated data can be generated when generating site-specific data. The backward compatibility component will apply the data conversion and use the aggregate and site-specific data components to generate the data in Hazus format. The updated data in a study region can also be applied to the state data geo-databases.
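The rollup from site-specific records to tract-level aggregates can be sketched as follows. This is an illustrative example, not CDMS internals: the record fields (`tract`, `sq_ft`, `exposure`) are assumptions standing in for the real inventory attributes.

```python
from collections import defaultdict

# Illustrative sketch (not CDMS code): rolling site-specific records up to
# census-tract aggregates, as the aggregate and site-specific modules do
# together. Record fields are assumed for the example.
sites = [
    {"tract": "48201223100", "sq_ft": 12000, "exposure": 2.4e6},
    {"tract": "48201223100", "sq_ft": 8000,  "exposure": 1.1e6},
    {"tract": "48201223200", "sq_ft": 5000,  "exposure": 0.6e6},
]

aggregate = defaultdict(lambda: {"bldg_count": 0, "sq_ft": 0, "exposure": 0.0})
for s in sites:
    row = aggregate[s["tract"]]
    row["bldg_count"] += 1        # building count per tract
    row["sq_ft"] += s["sq_ft"]    # total square footage per tract
    row["exposure"] += s["exposure"]
```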

Aggregate Module

[Screenshot of the CDMS categorization screen]

The aggregate module allows the user to:

  • Capture demographic data

  • Update the aggregated data (square footage, building count, building and content exposure, and demographics) and Occupancy to Building Type Mapping Schemes in the state data geo-databases

Site-Specific Module

The site-specific module has the following capabilities:

  • Update earthquake, hurricane, and flood parameters for essential facilities

  • Capture bridge and tunnel data

  • Capture various transportation and utility facilities data

  • Update Hazus state data geo-databases with the captured data, i.e., the ability to update EF.mdb, HPLF.mdb, TRN.mdb, UTIL.mdb, FLSg.mdb, and FlVeh.mdb

Site-Specific Features and Census Tracts

For site-specific features, CDMS requires that users provide a latitude/longitude coordinate in one of the following systems: Geographic Projection, Decimal Degree Coordinate System, or North American Datum 1983 (NAD83). A Census Tract ID is recommended but not required. Site-specific features can consist of the following: user-defined, high potential loss, essential, transportation, and utility facilities.

If a Census Tract ID is not provided, CDMS will utilize the latitude/longitude coordinate to identify the census tract in which a feature falls and will associate the feature with that Census Tract ID. If a site-specific feature, such as a highway bridge, does not have a Census Tract ID and does not fall within a census tract boundary, CDMS will not accept that feature and will warn the user that the feature requires a Census Tract ID before it can be accepted.
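The tract-assignment behavior described above amounts to a point-in-polygon lookup. The sketch below is a minimal illustration, not CDMS code: the tract boundary is a made-up rectangle (real boundaries come from census GIS data), and the ray-casting test stands in for whatever spatial method CDMS actually uses.

```python
def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: is (lon, lat) inside the polygon (vertex list)?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Count edge crossings of a ray extending left from the point.
        if (y1 > lat) != (y2 > lat):
            if lon < (x2 - x1) * (lat - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

# Hypothetical tract boundary; real ones come from census boundary files.
tracts = {"48201223100": [(-95.40, 29.70), (-95.35, 29.70),
                          (-95.35, 29.75), (-95.40, 29.75)]}

def assign_tract(lon, lat):
    """Return the ID of the tract containing the point, or None."""
    for tract_id, boundary in tracts.items():
        if point_in_polygon(lon, lat, boundary):
            return tract_id
    return None  # caller should warn: feature needs a Census Tract ID
```

A `None` result corresponds to the CDMS warning case: a feature outside every tract boundary is rejected until the user supplies a Census Tract ID.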

Null Values

When importing data into Hazus via CDMS, required fields for features should be populated before the data are uploaded and mapped in CDMS. Fields not populated prior to loading will result in null values being transferred to Hazus, and Hazus will not be able to count those features in its analysis. Consult the Hazus Data Dictionary for required fields and expected values/formats during the data preparation phase.
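A simple pre-import scan in the spirit of this guidance is sketched below. The required-field list here is illustrative only; the Hazus Data Dictionary is the authority on the actual fields and formats.

```python
# Sketch of a pre-import null check (illustrative, not CDMS code): flag
# records with unpopulated required fields before loading via CDMS.
REQUIRED = ["Name", "Occupancy", "Latitude", "Longitude"]  # assumed list

def find_null_fields(record):
    """Return the required fields that are missing or empty in a record."""
    return [f for f in REQUIRED
            if record.get(f) in (None, "", "NULL")]

record = {"Name": "Fire Station 7", "Occupancy": "", "Latitude": 29.71}
missing = find_null_fields(record)   # ["Occupancy", "Longitude"]
```

Fixing these gaps in the source data before import avoids features being silently excluded from the Hazus analysis.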

System Requirements for CDMS


Hardware
  • Pentium® IV with 800 MHz system bus and 2.6 GHz (or better) core speed
    Note: Allows for fast import, export, and query for large areas.

Computer Storage (Free Hard Disk Space)
  • 80 MB (moderate): allows for base statewide data and temporary storage for file imports.

Hardware Accessories
  • CD-ROM reader with 12x minimum read speed for software installation
  • Graphics card with 1024 x 768 minimum resolution
  • Mouse, keyboard, and 19" monitor

Software Requirements
The CDMS application requires that the following programs be loaded:
  • Microsoft Visual FoxPro Driver 9.0*
  • Microsoft .NET 4.5 Framework*
  • MS SQL Server 2014 Express Edition*
  • Microsoft Data Access Components 2.8 Service Pack 1*
  • Microsoft J# Distributable File*
  • ESRI ArcGIS 9.3 Service Pack 1 (Engine, ArcView, ArcEditor, or ArcMap)
  • ESRI ArcGIS, ArcView 9.3 Service Pack 1, .NET Runtime
Those programs marked by an asterisk (*) in the above list will be installed by CDMS. Please consult the CDMS instructions document for full details concerning CDMS installation.


CDMS Help and Technical Support

CDMS help is accessible from the pull-down Help Menu. If additional help is required, Technical Support can be contacted via the Hazus Help Desk. Please be sure to provide as much relevant information about your issue as possible, including your operating environment, data formats, error messages, screenshots of errors, and data samples.

SQL Server 2014 Express Limitations

CDMS operates on a free version of Microsoft® SQL Server 2014 Express relational database software, which enables Hazus users to run CDMS without requiring a software purchase. However, SQL Server 2014 Express has the following limitations:

• Supports only 1 CPU but can be installed on any server
• 1 GB addressable RAM
• 10 GB maximum database size

Last Updated: 05/12/2017 - 14:03