Hazus Help Desk Resource and Solutions Page

This Resource and Solutions page has answers to Frequently Asked Questions (FAQs) and lists solutions to previously identified Hazus issues. You can use this page as a reference before submitting a request through the Hazus Help Desk. If you still have questions or need further help, please email helpdesk@support.hazus.us or log in to the Help Desk website at https://support.hazus.us. If you don't have login credentials, email helpdesk@support.hazus.us.

Frequently Asked Questions

  1. Does Hazus Work on ArcGIS 10.1?
  2. When will Hazus-MH be Updated with the 2010 US Census Data?
  3. What are the requirements for Hazus-MH 2.1?
  4. How do I get the updates from the Service Pack (SP) releases on my Hazus-MH application?

Solutions

  1. Configure Hazus-MH to Run with SQL Server 2005 Standard Edition
  2. Hazus 2.0 causes ArcMap to show a blue background by default
  3. Flood aggregation crashes
  4. Flood Options menu missing in Hazus-MH 2.0
  5. User Data - DEM process fails and crashes ArcMap in the Flood Model
  6. Flood: Modeling a Dam Failure in Hazus
  7. "User Data" import processes fail for DEM or User Defined Depth Grids in Flood model
  8. Flood model - User Data DEM import fails in Coastal Surge model / HEC-RAS fails. Due to previous install of ArcGIS 9.3.1
  9. Clicking on the HurrEvac FTP download button in the Hurricane Scenario Wizard produces an error
  10. Flood Model - There is no building or content loss in the results even though there are buildings in the floodplain

Frequently Asked Questions

 

  1. Does Hazus work on ArcGIS 10.1?

    Unfortunately, no. ArcGIS 10.1 was released after the most recent Hazus release, and the two are not yet compatible. Currently, Hazus 2.1 is compatible only with ArcGIS 10.0 Service Pack 2 (SP2). FEMA will notify users once the next Hazus release is scheduled.
     
  2. When will Hazus-MH be updated with the 2010 US census data?

    The 2010 census results are being released to the public in separate census products, starting in early 2011 and continuing through the fall of 2013. FEMA is currently working to update Hazus with the 2010 census results.
     
  3. What are the requirements for Hazus-MH 2.1?
    Hardware Speed: 2.2 GHz dual core or higher; 2 GB or more of memory (RAM)
    Computer Storage / Disk Space: 10 GB of disk space is needed to store one multi-hazard large urban study region. Inventory data size varies by state; 30 GB is needed for the entire US.
    Video / Graphics Adapter: 24-bit capable video card with a minimum of 128 MB of memory; a resolution of 1024 x 768 or higher is recommended
    Supporting Software:
      - Microsoft Windows XP SP3 / Windows 7 Professional/Enterprise* (only US English versions are supported**)
      - ESRI ArcGIS 10.0 SP2 (can be purchased by contacting ESRI at 1-800-447-9778 or online at http://www.esri.com)
      - Spatial Analyst extension (required for the Flood model)

* Hazus-MH 2.1 will run on Windows 7 32-bit and 64-bit. 

** The Hazus-MH installer will allow installation on other operating systems/service packs, but the application is not certified to work flawlessly with them.

  4. How do I get the updates from the Service Pack (SP) releases on my Hazus-MH application?

The latest Service Pack (SP) release is Hazus-MH SP03. This update is applied to the Hazus-MH application automatically once it is installed on your computer; however, for the SP03 updates to take place automatically, SP01 and SP02 must first be installed manually on your computer.

The following are the links to manually install the Service Packs on your machine:

Solutions

  1. Problem:  Configure Hazus-MH to Run with SQL Server 2005 Standard Edition

    Hazus-MH uses Microsoft SQL Server 2005 Express edition as its database engine.  SQL Server 2005 Express is a free product but has a 4 GB database size limit.

    For certain uses of Hazus-MH (e.g., very large study regions), it may be beneficial to use the Standard edition of SQL Server. This solution will guide you through the process of configuring Hazus-MH to use the Standard edition. Additional information and detailed directions can also be found in Appendix L: Running Hazus-MH with SQL Server 2005 in the Hazus-MH Earthquake User's Manual.

    Solution:

    Below are the steps to configure Hazus-MH to run with SQL Server 2005 Standard edition.
    1. Install Hazus-MH, launch at least one time and then close it.
       
    2. Open the Windows Registry Editor. Click Start and select Run to open the Run window, type regedit in the edit box, and click OK.
       
    3. In the Registry Editor, navigate to [HKEY_LOCAL_MACHINE\SOFTWARE\FEMA\HAZUS-MH\General].
       
    4. Double-click "ServerName" and enter the name of the new SQL Server 2005 instance in the format <computername>\<instancename>. For example, if the machine name is ATLHW32P91 and the instance name is SQL2005, the registry entry should read ATLHW32P91\SQL2005. Then open SQL Server Management Studio from Start | Programs | Microsoft SQL Server 2005 | SQL Server Management Studio.
       
    5. Under the SQL Server folder, expand the Security folder, right-click Logins, and select New Login from the pop-up menu.
       
    6. In the "SQL Server Properties - New Login" dialog, enter hazuspuser in the Name field.
       
    7. Select the SQL Server Authentication option and enter the password gohazusplus_01.
       
    8. Uncheck the “User must change password at next login” option if it is checked.
       
    9. Click OK.

      You can get these values by copying them from the registry key [HKEY_LOCAL_MACHINE\SOFTWARE\FEMA\HAZUS-MH\General]:
      1. For the Name field, copy the value of "user identifier" from the registry and paste it into the field.
         
      2. For the Password, copy the value of "pwd" from the registry and paste it into the field.
         
      3. NOTE: It is better to copy these values from the registry to avoid mistakes.
         
    10. Click the "Server Roles" tab and check sysadmin. Click OK.
       
    11. Confirm the password gohazusplus_01 and click OK.
       
    12. Now connect to the HAZUSPLUSSRVR installed by Hazus-MH via the Management Studio. Click the Connect button at the top left corner of the SQL Server Management Studio and select the Database Engine.
       
    13. For the Server name select or type in <YourComputerName>\HAZUSPLUSSRVR.
       
    14. Select Windows Authentication for the Authentication.
       
    15. Click on the Connect button.
       
    16. Now, the new database server will be visible in the Management Studio.
       
    17. Next, navigate to the Databases folder under the HAZUSPLUSSRVR server and expand it. Right-click the syHazus database, select Tasks | Detach, and click OK.
       
    18. Navigate to the folder that represents the new server, right-click its Databases folder, and select the Attach… option.
       
    19. This will launch the Attach Database dialog.
       
    20. Click Add button and browse to the folder where Hazus-MH is installed. Within the Hazus-MH folder open the Data folder.
       
    21. Select syHazus_Data.MDF and click OK twice.
       
    22. You should get a message stating that syHazus was attached successfully.
       
    23. Right-click the new server (ATLHW32P91 in this example) in Management Studio and select Properties from the shortcut menu. This launches the Server Properties dialog. Click the Security page and make sure that Server Authentication is set to SQL Server and Windows Authentication mode.

      Hazus-MH is now ready to be used with the full version of SQL Server. Proceed with creating new study regions.
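
    For reference, the registry change (step 4) and the login creation (steps 5-11) can also be scripted. The sketch below is illustrative only: it assumes a Python 3 interpreter with the pyodbc package installed (neither ships with Hazus), uses the example instance name from step 4, and must be run with administrator rights.

      # Minimal sketch of steps 4-11; not an official Hazus utility.
      # Assumes Python 3 (winreg) and the pyodbc package; run as an administrator.
      import winreg
      import pyodbc

      NEW_INSTANCE = r"ATLHW32P91\SQL2005"   # example <computername>\<instancename> from step 4

      # Step 4: point Hazus-MH at the new SQL Server 2005 Standard instance.
      # On 64-bit Windows the key may be under SOFTWARE\Wow6432Node\FEMA\HAZUS-MH\General.
      key = winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                           r"SOFTWARE\FEMA\HAZUS-MH\General",
                           0, winreg.KEY_SET_VALUE)
      winreg.SetValueEx(key, "ServerName", 0, winreg.REG_SZ, NEW_INSTANCE)
      winreg.CloseKey(key)

      # Steps 5-11: create the hazuspuser login on the new instance and grant sysadmin.
      conn = pyodbc.connect("DRIVER={SQL Server};SERVER=%s;Trusted_Connection=yes" % NEW_INSTANCE,
                            autocommit=True)
      cur = conn.cursor()
      cur.execute("CREATE LOGIN hazuspuser WITH PASSWORD = 'gohazusplus_01', CHECK_POLICY = OFF")
      cur.execute("EXEC sp_addsrvrolemember 'hazuspuser', 'sysadmin'")
      conn.close()

    The detach/attach of syHazus (steps 17-22) is easiest to do in Management Studio as described above.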
       
  2. Problem:  Hazus 2.0 causes ArcMap to show a blue background by default

    Solution:

    This is caused by the hurricane model file "huExtension.dll" in <Hazus installation directory>\BIN\HU\, which causes any ArcMap instance (including all three Hazus models) to display a blue background. This issue is fixed in the upcoming Hazus 2.1 (only the Hurricane/Wind model will display a blue background by default). As a temporary fix, you can unregister "huExtension.dll". This will affect the hurricane model, so if you need to run the hurricane model, please re-register the file afterwards. Note that the commands below assume Hazus is installed in the default location, C:\Program Files\Hazus-MH\; if your installation is different, change the Hazus path in the commands to match your directory.
    1. In Windows Start menu, Click "Run" and paste this command into the box:
      "C:\Program Files\Common Files\ArcGIS\bin\ESRIRegAsm" /u /p:Desktop "C:\Program Files\HAZUS-MH\Bin\HU\huextension.dll"
       
    2. To re-register this file, follow the same steps as above with this command:
      "C:\Program Files\Common Files\ArcGIS\bin\ESRIRegAsm" /p:Desktop "C:\Program Files\HAZUS-MH\Bin\HU\huextension.dll"
       
  3. Problem:  Flood aggregation crashes

    If you try to aggregate a Flood model study region, the process completes about 3/4 of the way, then a shell error pops up.

    The ESRI Data Interoperability extension (the base extension as well as its Service Pack 1) has been shown to conflict with Hazus, causing aggregation to fail when creating a new Flood study region.

    Solution:
    1. Check whether the ESRI Data Interoperability extension is installed. To do so, go to "Add/Remove Programs." If you see "Data Interoperability" in the list, it is installed; note which service pack (if any) of the extension is installed.
    2. Install Service Pack 2 for the ESRI Data Interoperability extension.  This should solve the issue and allow you to aggregate a new Flood study region.
    3. Service Pack 2 can be found here: http://resources.arcgis.com/content/patches-and-service-packs?fa=viewPatch&PID=83&MetaID=1787
    4. You need SP1 of the extension installed before you install SP2. Here is the link to Service Pack 1: http://resources.arcgis.com/content/patches-and-service-packs?fa=viewPatch&PID=83&MetaID=1721
       
  4. Problem:  Flood Options menu missing in Hazus-MH 2.0

    Solution:

    In Hazus-MH 2.0 we did not move the Tools menu to the Customize menu (as stated in the User Manual).  To add the Tools > Flood Options menu, please do the following:
    1. Right-click near your toolbar and select "Customize".
       
    2. In the Commands tab, scroll down in the Categories section and select "[Menus]". In the Commands section, select "Tools".
       
    3. Drag the "Tools" to your toolbar (not at the end, next to Customize).
       
    4. Close the "Customize" dialog

      Now you should have a Tools menu on your toolbar. If you go to Tools > Flood Options, you should see the menu option.
       
  5. Problem:  User Data - DEM process fails and crashes ArcMap in the Flood Model

    The path where you are saving the NED file(s) is either too long or contains a space (or both).

    Other ArcGIS users have experienced this issue here: http://forumsstg.arcgis.com/threads/27012-rasterIO.dll-causing-exception-error-(all-PCs-on-the-network)

    Running "Hazard -> User Data (DEM tab)" in the Flood model
    1. Process fails and ArcMap crashes
       
    2. The "FlHazardLog.txt" located in "C:\Program Files\HAZUS-MH\BIN\FL" shows "GeometryToFeatureClass: Feature class: RivReqPoly" as the last line.  There is no error message in the last few lines of this log file.
       
    3. The Windows "Event Viewer" (Under "Administrative Tools") shows a red "X" under "Application".  The error code message contains "rasterIO.dll"
       
    4. Or, once ArcMap crashes, click on the error message to "Read More" about the error and look for "rasterIO.dll". Refer to the attached screenshot for help.

      Solution:
      1. Copy the NED file(s) to a directory without spaces.  Try "C:\Temp" as a test.
         
      2. Open Hazus and browse to "C:\Temp" to add the NED file(s)
         
      3. Run "User Data"
         
      4. If the process completes successfully, then you must save future NED files to a shorter path that does not contain spaces.
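
    If you want to check a candidate path before importing, the short sketch below flags the conditions described above. It is plain Python (not part of Hazus or ArcGIS), the NED file name is a hypothetical example, and the 100-character threshold is only an illustrative rule of thumb, not a documented limit.

      # Quick pre-check of a NED file path; plain Python, illustrative only.
      import os

      def check_ned_path(path):
          problems = []
          if " " in path:
              problems.append("path contains spaces")
          if len(path) > 100:   # illustrative threshold, not a documented Hazus limit
              problems.append("path is very long (%d characters)" % len(path))
          if not os.path.exists(path):
              problems.append("path does not exist")
          return problems or ["path looks OK"]

      print(check_ned_path(r"C:\Temp\ned_30m.img"))   # hypothetical NED file name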
         
  6. Problem:  Flood: Modeling a Dam Failure in Hazus

    Solution

    In order to model a dam failure scenario using data provided by the USGS, USCOE, and NCDNR, please refer to the following instructions:
    1. Obtain the water discharge (cubic feet per second) at the dam
       
    2. Verify that the DEM depicts the dam in elevation change (you should see a sudden drop in elevation where the dam is located in the DEM)
       
    3. In ArcMap, merge the DEM into one file
       
    4. Create a copy of this merged DEM to work with
       
    5. Breach the dam in the DEM by finding the lowest elevation on both sides of the dam and hydrologically connecting the stream
       
    6. Open Hazus and import the dam-breached DEM in User Data
       
    7. Run hydraulics by discharge
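
    For reference, step 3 (merging the DEM tiles into one file) can also be run as a geoprocessing call. The sketch below is a minimal arcpy outline; it assumes the Python installation that comes with ArcGIS, and the tile names and output folder are hypothetical examples. Breaching the dam (step 5) still requires editing the copied DEM by hand.

      # Sketch of step 3 only (merge DEM tiles); requires arcpy from ArcGIS.
      # Tile names and the output folder are hypothetical examples.
      import arcpy

      arcpy.MosaicToNewRaster_management(
          "dem_tile1.img;dem_tile2.img",   # input DEM tiles, semicolon-separated
          r"C:\Temp",                      # output folder
          "merged_dem.img",                # name of the merged DEM
          "",                              # keep the input coordinate system
          "32_BIT_FLOAT",                  # pixel type
          "",                              # keep the input cell size
          1)                               # number of bands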
       
  7. Problem:  "User Data" import processes fail for DEM or User Defined Depth Grids in Flood model

    ArcGIS sometimes has problems copying files that are stored in directories with long path names or with spaces in the path. The study region name may also contain too many spaces.
    1. Navigate to "Hazard - User Data"
    2. Try importing a DEM or User Depth Grid, etc.
    3. After clicking "OK" the process will complete very quickly but not load the DEM or User Depth Grid
    4. Open the "Hazard Log" file in your study region folder.  Verify that there are error messages.  The message may look like this ("CopyUDG: Error: Failed to copy raster dataset at 6") :

      2012/05/09 16:46:37.660  : ***********************************************************************

      2012/05/09 16:46:37.660  :                             USER DATA

      2012/05/09 16:46:37.660  : ***********************************************************************

      2012/05/09 16:46:37.660  frmUserData - OnOK: StudyRegion Name = Houston_County

      2012/05/09 16:46:37.660  frmUserData - OnOK: HAZUS Version = 12.0.0

      2012/05/09 16:46:37.675  frmUserData - OnOK: ArcGIS Version = 10.0.2414

      2012/05/09 16:46:37.722  frmUserData - OnOK: ArcGIS ServicePack Number =

      2012/05/09 16:46:37.738  frmUserData - OnOK: hzflvbdialogs.dll date = 12/19/2011 8:20:10 PM

      2012/05/09 16:46:38.330  frmUserData - ProcessFIT: Entering

      2012/05/09 16:46:38.377  frmUserData - ProcessFIT: Leaving

      2012/05/09 16:46:38.377  frmUserData - ProcessUDG: Entering

      2012/05/09 16:46:38.549  frmUserData - CopyUDG: Error: Failed to copy raster dataset at 6

      2012/05/09 16:46:38.549  frmUserData - ProcessUDG: Error:  at 6

      2012/05/09 16:46:38.549  frmUserData - OnOK: Error:  at 12

      Solution:

      Move your DEM files or User Depth Grid files to a location with a very short path name. As a test, put the files in "C:\Temp" (a small copy sketch is shown below). Open "User Data" and try to import again. If this does not work and your study region name contains many spaces, create a new study region and use underscores "_" rather than spaces, then try the import process again.
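
    If you have many files to move, a short copy script such as the sketch below does the same thing as copying them by hand. It is plain Python (an illustration, not a Hazus tool), and the source folder is a hypothetical example of a long path with spaces.

      # Copy depth grid / DEM files to a short path with no spaces.
      import os
      import shutil

      src = r"D:\Projects\My Flood Study Depth Grids"   # hypothetical long path with spaces
      dst = r"C:\Temp"

      copied = 0
      for name in os.listdir(src):
          full = os.path.join(src, name)
          if os.path.isfile(full):
              shutil.copy2(full, os.path.join(dst, name))
              copied += 1
      print("copied %d files to %s" % (copied, dst))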
       
  8. Problem:  Flood model - User Data DEM import fails in Coastal Surge model / HEC-RAS fails. Due to previous install of ArcGIS 9.3.1

    This is caused by remnants of a previous ArcGIS 9.3.1 install.
    1. "Hazard - User Data -HEC-RAS" fails.

      or
       
    2. In a Coastal Surge study region, the "Hazard - User Data" DEM import fails.  Either DEM import process quits without finishing and no error or the following message is seen: "hzFlEntryPoints - hzFlAnWorkFlow   Object variable or With block variable not set"
       
    3. Open the "FlHazardLog.txt" file in your study region folder.  Verify that the following error message is in the log:  "ConvertFLTtoRaster: Error: Automation error  Unspecified error  at 2" .
       
    4. You will notice that you cannot use simple ArcMap toolbox tools such as Math > Plus.

      Solution
       
    5. Close Hazus and ArcGIS programs
       
    6. Please navigate to: C:\Program Files\Common Files
       
    7. Notice that you have both "ESRI" and "ArcGIS" folders
       
    8. Rename the "ESRI" folder to "ESRI_old"
       
    9. Restart Hazus and try running the User Data / HEC-RAS process again.
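
    For reference, the folder rename in step 8 can also be done from a script. The sketch below is plain Python (an illustration, not a Hazus tool) and must be run with administrator rights, with Hazus and all ArcGIS programs closed first.

      # Rename the leftover ESRI folder under Common Files; run as an administrator.
      import os

      common_files = r"C:\Program Files\Common Files"
      os.rename(os.path.join(common_files, "ESRI"),
                os.path.join(common_files, "ESRI_old"))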
       
  9. Problem:  Clicking on the HurrEvac FTP download button in the Hurricane Scenario Wizard produces an error
    1. Steps: Open the Hurricane Scenario Wizard: <Create New Scenario>, Import Hurrevac storm advisory, Storm Files {local machine}. Clicking on the FTP download button produces an error message:
       
    2. Mabry.ftpXObj.1

      "Invalid socket or not connected to remote"

      Solution

      The HURREVAC FTP site is no longer available. As a result, the FTP Download Tool in Hazus no longer works. Pending implementation of a permanent fix, please follow the workaround procedure documented in the PDF "Work Around for Downloading HURREVAC Storm Files into Hazus."
       
  10. Problem:  Flood Model - There is no building or content loss in the results even though there are buildings in the floodplain

    Part of the depth grid has "No Data" instead of an actual depth greater than zero.  This depth grid is corrupt and causes the Area Weighted process of the General Building Stock analysis to fail.  The corrupt portion of the depth grid must be removed or modified in order to allow the analysis to return accurate results.

    In the Flood Model, Hydraulics and GBS analysis finish without errors. The user can view the results, but there is no building or content damage; the only losses come from Relocation Cost, Income Loss, Wage Loss, etc. Open the FlAnalysisLog file in your scenario directory. In the Area Weighted process, search for "SC:" (without quotes). This line should be followed by about 40 rows of long text such as the following:

    2012/10/18 12:03:55.240  absp_GBSInsertAW: SC: 2yr Mask flAnAreaWeighted rows=2351

    2012/10/18 12:03:55.397  absp_GBSInsertAW: update flAnAreaWeighted set ft600 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 600

    2012/10/18 12:03:56.433  absp_GBSInsertAW: update flAnAreaWeighted set ft1050 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1050

    2012/10/18 12:03:57.397  absp_GBSInsertAW: update flAnAreaWeighted set ft1400 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1400

    2012/10/18 12:03:58.350  absp_GBSInsertAW: update flAnAreaWeighted set ft900 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 900

    2012/10/18 12:03:59.310  absp_GBSInsertAW: update flAnAreaWeighted set ft1100 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1100

    2012/10/18 12:04:00.270  absp_GBSInsertAW: update flAnAreaWeighted set ft1500 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1500

    2012/10/18 12:04:01.233  absp_GBSInsertAW: update flAnAreaWeighted set ft2400 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 2400

    2012/10/18 12:04:02.193  absp_GBSInsertAW: update flAnAreaWeighted set ft700 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 700

    2012/10/18 12:04:03.177  absp_GBSInsertAW: update flAnAreaWeighted set ft350 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 350

    2012/10/18 12:04:04.287  absp_GBSInsertAW: update flAnAreaWeighted set ft1350 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1350

    2012/10/18 12:04:05.260  absp_GBSInsertAW: update flAnAreaWeighted set ft550 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 550

    2012/10/18 12:04:06.290  absp_GBSInsertAW: update flAnAreaWeighted set ft1550 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1550

    2012/10/18 12:04:07.260  absp_GBSInsertAW: update flAnAreaWeighted set ft950 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 950

    2012/10/18 12:04:08.263  absp_GBSInsertAW: update flAnAreaWeighted set ft300 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 300

    2012/10/18 12:04:09.493  absp_GBSInsertAW: update flAnAreaWeighted set ft850 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 850

    2012/10/18 12:04:10.460  absp_GBSInsertAW: update flAnAreaWeighted set ft500 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 500

    2012/10/18 12:04:11.463  absp_GBSInsertAW: update flAnAreaWeighted set ft1200 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1200

    2012/10/18 12:04:12.423  absp_GBSInsertAW: update flAnAreaWeighted set ft2250 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 2250

    2012/10/18 12:04:13.390  absp_GBSInsertAW: update flAnAreaWeighted set ft1000 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1000

    2012/10/18 12:04:14.353  absp_GBSInsertAW: update flAnAreaWeighted set ft250 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 250

    2012/10/18 12:04:15.777  absp_GBSInsertAW: update flAnAreaWeighted set ft200 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 200

    2012/10/18 12:04:17.490  absp_GBSInsertAW: update flAnAreaWeighted set ft100 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 100

    2012/10/18 12:04:19.543  absp_GBSInsertAW: update flAnAreaWeighted set ft1900 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1900

    2012/10/18 12:04:20.500  absp_GBSInsertAW: update flAnAreaWeighted set ft400 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 400

    2012/10/18 12:04:21.557  absp_GBSInsertAW: update flAnAreaWeighted set ft150 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 150

    2012/10/18 12:04:23.470  absp_GBSInsertAW: update flAnAreaWeighted set ft1650 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1650

    2012/10/18 12:04:24.423  absp_GBSInsertAW: update flAnAreaWeighted set ft800 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 800

    2012/10/18 12:04:25.390  absp_GBSInsertAW: update flAnAreaWeighted set ft1150 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1150

    2012/10/18 12:04:26.347  absp_GBSInsertAW: update flAnAreaWeighted set ft50 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 50

    2012/10/18 12:04:28.490  absp_GBSInsertAW: update flAnAreaWeighted set ft1300 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1300

    2012/10/18 12:04:29.463  absp_GBSInsertAW: update flAnAreaWeighted set ft650 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 650

    2012/10/18 12:04:30.450  absp_GBSInsertAW: update flAnAreaWeighted set ft750 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 750

    2012/10/18 12:04:31.417  absp_GBSInsertAW: update flAnAreaWeighted set ft1250 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 1250

    2012/10/18 12:04:32.370  absp_GBSInsertAW: update flAnAreaWeighted set ft2100 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 2100

    2012/10/18 12:04:33.330  absp_GBSInsertAW: update flAnAreaWeighted set ft450 = t.PctAffected from flAnGBSCombineAWTemp t where flAnAreaWeighted.CensusBlock=t.CensusBlock and flAnAreaWeighted.CoastalZone=t.CoastalZone and flAnAreaWeighted.ControllingHazard=t.ControlHazard and flAnAreaWeighted.StudyCaseID = 6 and flAnAreaWeighted.ReturnPeriodID = '2' and flAnAreaWeighted.AnalysisOptID = 0     and t.Reclass = 450

    2012/10/18 12:04:34.377  absp_GBSInsertAW: DEBUG: AFTER update: flAnAreaWeighted rows=20250

    However, if these lines are missing from the log, you are experiencing the error described in this solution. Your log file may incorrectly show only the following:

    2012/10/17 16:11:07.377  absp_GBSInsertAW: SC: 2yr_meters flAnAreaWeighted rows=2495

    2012/10/17 16:11:07.647  absp_GBSInsertAW: DEBUG: AFTER update: flAnAreaWeighted rows=17899

    In addition, open SQL Server Management Studio Express, connect to your study region database, and run the following query:

    select * from flanareaweighted where studycaseid = '<enter your study region id number>'

    (You can find your study region ID number by running this query:  select * from flstudycase )

    The results will show NULL for all columns except “ft00”.  Also, run this query to prove that all the buildings in the results are undamaged although they are in the floodplain and should be damaged (modify the “studycaseid” to match your scenario):

    select sum(TotalBuildings) TotalBldgs, sum(UndamagedBldgs) Undamaged_Bldgs from absv_FRGBSDmgCountGOccupAll where studycaseid = '4'
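
    The same checks can also be run from a script rather than Management Studio. The sketch below uses the pyodbc package (an assumption; any SQL client will do), and the server, study region database name, and study case ID are illustrative placeholders that you will need to replace.

      # Scripted version of the diagnostic queries above; pyodbc is an assumption.
      # Server, database name, and study case ID are examples - replace with yours.
      import pyodbc

      conn = pyodbc.connect(r"DRIVER={SQL Server};SERVER=.\HAZUSPLUSSRVR;"
                            r"DATABASE=MyStudyRegion;Trusted_Connection=yes")
      cur = conn.cursor()

      # Rows where a depth column such as ft100 is NULL point to the corrupt depth grid.
      cur.execute("select count(*) from flAnAreaWeighted "
                  "where StudyCaseID = 4 and ft100 is null")
      print("area-weighted rows with NULL depth columns:", cur.fetchone()[0])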

    Solution:
    1. Use the Info button to find an area with "No Data" in the floodplain for the depth grid. You will need to remove this boundary polygon and change the depth grid.
       
    2. Once you've identified the problem area, close Hazus, then open ArcMap and the ArcToolbox.
       
    3. Click “Add Data” and navigate to “<study region directory> - <scenario directory> - Riverine – CaseOutput.mdb – Hydrology”
       
    4. Add the appropriate return period boundary polygon, e.g., "BoundaryPolyRP100"
       
    5. Click “Add Data” and navigate to “<study region directory> - <scenario directory> - Riverine – Depth”
       
    6. Add the appropriate return period depth grid, e.g., "rpd100"
       
    7. In ArcToolbox, navigate to "Data Management Tools – Features – Multipart To Singlepart". Choose the boundary polygon from the "Input Features" combo box.
       
    8. Input the desired folder to save the new boundary polygon in the “Output Feature Class” field and give the file a unique name
       
    9. Click “OK”
       
    10. Click “Add Data” and navigate to wherever you saved the new boundary polygon
       
    11. Click “Editor – Start Editing” and select the new boundary polygon
       
    12. Select the polygon which had “No Data” in the depth grid in Hazus, causing the Area Weighted error
       
    13. Delete this polygon
       
    14. Click “Editor – Stop Editing” and “Yes” to save your edit
       
    15. In Arc Toolbox, navigate to “Spatial Analyst Tools – Extraction – Extract by Mask” 
       
    16. Choose the desired return period depth grid from the “Input Raster” combo box.
       
    17. Choose the edited boundary polygon from the “Input Raster or Feature Mask data” combo box.
       
    18. Input the desired folder to save the new depth grid in the “Output Raster” field and give the file a unique name
       
    19. Click “OK”
       
    20. Close ArcMap and open Hazus
       
    21. Navigate to “Hazard – User Data –Depth Grid”
       
    22. Browse to and add the new depth grid and click “OK”
       
    23. Create a new “User Defined Depth Grid” scenario and select the new depth grid added in User Data
       
    24. Run delineate floodplain
       
    25. Run General Building Stock analysis
       
    26. Once it finishes, run this query in SQL Server Management Studio Express to verify that the number of undamaged buildings no longer equals the total number of buildings (modify the "studycaseid" to match your scenario):

      select sum(TotalBuildings) TotalBldgs, sum(UndamagedBldgs) Undamaged_Bldgs from absv_FRGBSDmgCountGOccupAll where studycaseid = '4'
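
    For reference, the geoprocessing portion of the fix (steps 7 and 15-19) can also be scripted. The sketch below is a minimal arcpy outline: the paths are examples, it assumes a Spatial Analyst license, and deleting the "No Data" polygon (steps 10-14) still has to be done interactively in ArcMap before Extract by Mask is run.

      # Sketch of steps 7 and 15-19 only; paths and names are examples.
      # Requires arcpy and a Spatial Analyst license.
      import arcpy
      from arcpy.sa import ExtractByMask

      arcpy.CheckOutExtension("Spatial")

      boundary = (r"C:\HazusData\Regions\MyRegion\Scenario1\Riverine"
                  r"\CaseOutput.mdb\Hydrology\BoundaryPolyRP100")
      singlepart = r"C:\Temp\BoundaryRP100_single.shp"

      # Step 7: explode the boundary polygon so the bad part can be selected and deleted.
      arcpy.MultipartToSinglepart_management(boundary, singlepart)

      # ... delete the offending polygon interactively (steps 10-14) ...

      # Steps 15-19: clip the depth grid to the edited boundary polygon.
      depth_grid = r"C:\HazusData\Regions\MyRegion\Scenario1\Riverine\Depth\rpd100"
      masked = ExtractByMask(depth_grid, singlepart)
      masked.save(r"C:\Temp\rpd100_fixed")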

       
Last Updated: 07/24/2014 - 16:00