PBE Documentation V2.0


PBE

Performance-Based Engineering
Application
Frank McKenna, Adam Zsarnóczay,
Wael Elhaddad, Chaofeng Wang, and Michael Gardner
NHERI SimCenter, UC Berkeley

Version 2.0

October 16, 2019


Licenses and Copyright Notices

The source code of PBE is licensed under a BSD 2-Clause License: “Copyright (c) 2017-2019,
The Regents of the University of California (Regents).” All rights reserved. Redistribution
and use in source and binary forms, with or without modification, are permitted provided
that the following conditions are met:
1. Redistributions of source code must retain the above copyright notice, this list of
conditions and the following disclaimer
2. Redistributions in binary form must reproduce the above copyright notice, this list of
conditions and the following disclaimer in the documentation and/or other materials
provided with the distribution

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
“AS IS” AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS
FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE
COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT,
INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING,
BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY
WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY
OF SUCH DAMAGE.
The views and conclusions contained in the software and documentation are those of
the authors and should not be interpreted as representing official policies, either expressed
or implied, of the FreeBSD Project. The authors take no responsibility whatsoever for the
accuracy of PBE. REGENTS SPECIFICALLY DISCLAIMS ANY WARRANTIES, INCLUDING,
BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY
AND FITNESS FOR A PARTICULAR PURPOSE. THE SOFTWARE AND ACCOMPANYING
DOCUMENTATION, IF ANY, PROVIDED HEREUNDER IS PROVIDED “AS IS”.
THE REGENTS HAVE NO OBLIGATION TO PROVIDE MAINTENANCE, SUPPORT,
UPDATES, ENHANCEMENTS, OR MODIFICATIONS.
The compiled binary form of this application is licensed under a GPL Version 3 license.
The licenses are as published by the Free Software Foundation and appearing in the LICENSE
file included in the packaging of this application.
This software makes use of the Qt packages (unmodified): core, gui, widgets, and network.
Qt is copyright “The Qt Company Ltd” and licensed under the GNU Lesser General Public
License (version 3), which references the GNU General Public License (version 3).
The licenses are as published by the Free Software Foundation and appearing in the
LICENSE file included in the packaging of this application.
Acknowledgments

This material is based upon work supported by the National Science Foundation under Grant
No. 1612843. Any opinions, findings, and conclusions or recommendations expressed in this
material are those of the author(s) and do not necessarily reflect the views of the National
Science Foundation.
Contents

Contents i

List of Figures iii

List of Tables vi

1 About 1

2 Installation Instructions 3
2.1 Download the Application . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
2.2 Set up Python . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.3 Set up for Running Workflows Locally . . . . . . . . . . . . . . . . . . . . . 6
2.4 Test the PBE application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14

3 Usage 17
3.1 GI: General Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.2 SIM: Structural Information Model . . . . . . . . . . . . . . . . . . . . . . . 20
3.3 EVT: Event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.4 FEM: Finite Element Method . . . . . . . . . . . . . . . . . . . . . . . . . . 33
3.5 UQ: Uncertainty Quantification . . . . . . . . . . . . . . . . . . . . . . . . . 34
3.6 DL: Damage and Loss Model . . . . . . . . . . . . . . . . . . . . . . . . . . 40
3.7 RES: Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.8 Push Buttons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49

4 Theory and Implementation 54


4.1 The basic workflow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
4.2 The Workflow with Uncertainty Quantification . . . . . . . . . . . . . . . . . 56

5 Source Code 59

6 User Training 60

7 Examples 61
7.1 One-story two-dimensional portal frame . . . . . . . . . . . . . . . . . . . . . 61

8 Verification and Validation 69


8.1 Estimation of central tendencies . . . . . . . . . . . . . . . . . . . . . . . . . 69

9 Requirements 76

10 Troubleshooting 82
10.1 Problems Starting the Application . . . . . . . . . . . . . . . . . . . . . . . . 82
10.2 Problems Running Simulations . . . . . . . . . . . . . . . . . . . . . . . . . 82

List of Figures

2.1 Download Application . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3


2.2 PBE Application on Startup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2.3 Testing the Python environment. . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.4 Downloading OpenSees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.5 Adding OpenSees to the PATH environment variable on Windows . . . . . . . . 8
2.6 Adding OpenSees to the PATH environment variable on Mac. . . . . . . . . . . 9
2.7 Downloading Dakota Software . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
2.8 Testing OpenSees. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
2.9 Testing Dakota. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.10 Testing Perl. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.11 Testing the dakota Python package. . . . . . . . . . . . . . . . . . . . . . . . . . 13
2.12 Selecting event type and inputting synthetic motion parameters . . . . . . . . . 14
2.13 Specifying distribution type and parameters for random variables in analysis—only
Vs30 in this case . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
2.14 Specifying the damage and loss model for the analysis. . . . . . . . . . . . . . . 15
2.15 Results for test analysis. This tab will open automatically when the analysis
completes, indicating a successful installation . . . . . . . . . . . . . . . . . . . 16

3.1 The User Interface (UI) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17


3.2 General Information Input Panel . . . . . . . . . . . . . . . . . . . . . . . . . . 19
3.3 MDOF or Shear Building Model . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
3.4 OpenSees Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.5 Multiple Existing (SimCenter) Events . . . . . . . . . . . . . . . . . . . . . . . 23
3.6 Multiple PEER Events . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
3.7 Hazard Based Event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.8 Stochastic Ground Motion Event . . . . . . . . . . . . . . . . . . . . . . . . . . 26
3.9 Site Response Analysis Event . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
3.10 Slope definition for 2D Column . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
3.11 Slope definition for 3D Column . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
3.12 Nodal and elemental responses . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
3.13 Soil Layer Modification in Site Response . . . . . . . . . . . . . . . . . . . . . . 30
3.14 Simulated Motion at the Surface of the Ground . . . . . . . . . . . . . . . . . . 30
3.15 User defined event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
3.16 PEER NGA Records Event . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
3.17 Options for OpenSees transient analysis . . . . . . . . . . . . . . . . . . . . . . 33

3.18 Uncertainty Quantification input panel . . . . . . . . . . . . . . . . . . . . . . . 35
3.19 Monte Carlo Sampling input panel . . . . . . . . . . . . . . . . . . . . . . . . . 35
3.20 Latin Hypercube Sampling input panel . . . . . . . . . . . . . . . . . . . . . . . 36
3.21 Importance Sampling input panel . . . . . . . . . . . . . . . . . . . . . . . . . . 36
3.22 GPR forward propagation input panel . . . . . . . . . . . . . . . . . . . . . . . 37
3.23 PCE forward propagation input panel . . . . . . . . . . . . . . . . . . . . . . . 38
3.24 Reliability input panel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.25 Random Variable specification . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.26 The General Damage and Loss Settings panel. (The settings shown in the Figure
serve demonstration purposes and are not the suggested inputs.) . . . . . . . . . 41
3.27 The Building Components panel. (The settings shown in the Figure serve demon-
stration purposes and are not the suggested inputs.) . . . . . . . . . . . . . . . . 44
3.28 The Collapse Modes panel. (The settings shown in the Figure serve demonstration
purposes and are not the suggested inputs.) . . . . . . . . . . . . . . . . . . . . 45
3.29 The General Damage and Loss Settings panel. (The settings shown in the Figure
serve demonstration purposes and are not the suggested inputs.) . . . . . . . . . 46
3.30 Results Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 48
3.31 Results presented graphically and in tabular form . . . . . . . . . . . . . . . . . 50
3.32 Pop-up window shown after clicking the RUN button . . . . . . . . . . . . . . . . 50
3.33 Pop-up window shown after clicking the RUN at DesignSafe button . . . . . . 51

4.1 Schematic of the basic Scientific Workflow for Damage and Loss Assessment . . 55
4.2 Schematic of the Scientific Workflow Application for Damage and Loss Assess-
ment with UQ . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56

7.1 Two-dimensional portal frame model subjected to gravity and earthquake loading 62
7.2 General Information about the building. . . . . . . . . . . . . . . . . . . . . . . 62
7.3 Information about the simulation model. . . . . . . . . . . . . . . . . . . . . . . 63
7.4 Information about the ground motion event. . . . . . . . . . . . . . . . . . . . . 63
7.5 Uncertainty quantification settings. . . . . . . . . . . . . . . . . . . . . . . . . . 63
7.6 General settings for the Damage and Loss model. . . . . . . . . . . . . . . . . . 64
7.7 Component information. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
7.8 Collapse modes and their consequences. . . . . . . . . . . . . . . . . . . . . . . 65
7.9 Requested locations before running the analysis. . . . . . . . . . . . . . . . . . . 66
7.10 The RES panel after running the analysis. . . . . . . . . . . . . . . . . . . . . . 67
7.11 The joint distribution of reconstruction time (assuming parallel work) and recon-
struction cost. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
7.12 The cumulative distribution function of reconstruction time. . . . . . . . . . . . 68
7.13 A histogram showing the marginal probability density function of reconstruction
time. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68

8.1 Cumulative distribution function of event month. . . . . . . . . . . . . . . . . . 70

8.2 Distribution of weekday/weekend realizations. . . . . . . . . . . . . . . . . . . . 70
8.3 Cumulative distribution function of event hour. . . . . . . . . . . . . . . . . . . 70
8.4 Cumulative distribution function of inhabitants. . . . . . . . . . . . . . . . . . . 71
8.5 Distribution of collapse/non-collapse realizations. . . . . . . . . . . . . . . . . . 71
8.6 Distribution of collapse modes. . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
8.7 Cumulative distribution function of realizations that resulted in a red tag. . . . 72
8.8 Joint distribution of reconstruction times with parallel and sequential repair as-
sumptions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
8.9 Joint distribution of reconstruction cost and time. . . . . . . . . . . . . . . . . . 73
8.10 Cumulative distribution function of reconstruction time. . . . . . . . . . . . . . 74
8.11 Cumulative distribution function of injuries. . . . . . . . . . . . . . . . . . . . . 74
8.12 Cumulative distribution function of fatalities. . . . . . . . . . . . . . . . . . . . 75

10.1 Error message for missing Visual C/C++ runtime library . . . . . . . . . . . . . 82

List of Tables

9.1 Requirements for PBE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76

1 About
The intended audience for the PBE Application (PBE App) is researchers and practitioners
interested in predicting the seismic performance of buildings.

This is an open-source research application. The source code at the PBE Github page
provides an application that can be used to assess the performance of a building in an earthquake
scenario. The application focuses on quantifying building performance through decision vari-
ables. Given that the properties of the buildings and the earthquake events are not known
exactly, and that the simulation software and the user make simplifying assumptions in the
numerical modeling of the structure, the estimated response of the structure already exhibits
significant variability. Such response can be estimated using our EE-UQ Application. The
PBE App builds on the EE-UQ App and uses its response estimates to assess the damage to
building components and the consequences of such damage.

The user can characterize the structural model, the damage and loss model, and the
seismic hazard model in this application. All models are interconnected by an uncertainty
quantification framework that allows the user to define a flexible stochastic model for the
problem. Given the stochastic model, the application first performs nonlinear response his-
tory simulations to get the Engineering Demand Parameters (EDPs) that describe structural
response. Then, those EDPs are used to assess the Damage Measures (DMs) and Decision
Variables (DVs) that characterize structural performance.

Depending on the type of structural system, the fidelity of the numerical model and
the number of EDP samples requested, the response history simulations can be computationally
prohibitive. To overcome this impediment, the user has the option to
perform the response simulations on the Stampede2 supercomputer. Stampede2 is located
at the Texas Advanced Computing Center (TACC) and made available to the user through
NHERI DesignSafe-CI, the cyberinfrastructure provider for the distributed, NSF-funded
Natural Hazards Engineering Research Infrastructure (NHERI) facility.

The computations are performed, as will be discussed in Chapter 4, in a workflow
application. That is, the numerical simulations are performed by a sequence of applications.
The PBE backend software runs these applications for the user, taking the outputs from some
programs and providing them as inputs to others. The design of the PBE App is such that
researchers are able to modify the backend application to utilize their own application in the
workflow computations. This ensures that researchers are not limited to using the default
applications we provide, and are encouraged to provide their own applications for others to use.


This is Version 2.0 of the tool. Users are encouraged to comment on what additional
features and capabilities they would like to see in this application. These requests and
feedback can be submitted through an anonymous user survey; we greatly appreciate any
input you have. If there are features you want, chances are many of your colleagues also
would benefit from them. Users are encouraged to review Chapter 9 to see what features
are planned for this application.

2 Installation Instructions
All SimCenter applications are available at the SimCenter website under Research Tools.
The following sections outline the steps necessary to download and install the PBE
application. The SimCenter applications do require that you install a number of other applications
that are needed to run the workflows on your local machine as well as at DesignSafe.

2.1 Download the Application


To download the PBE application navigate to the PBE page and click on the Download App
& User Manual link on the right side of the page. As shown in Figure 2.1, this will bring
you to another page which contains a list of downloadable files and directories.

Figure 2.1: Download Application

There are at least four files available for download from this page:
1. The PDF file is the User Manual that you are reading now.
2. The MOV file is a video that provides an introduction to the usage of the application.
3. The ZIP file is an archive that contains the application files for a Windows operating
system.
4. The DMG file is an archive that contains the application files for a Mac OS X operating
system.


To download the PBE application, click on the link for the appropriate file for your operating
system and then click on the Download button at the bottom right corner of the ensuing
pop-up window. Unpackage the application from the downloaded file and place it in a location
on your filesystem. On Windows, we recommend that you create a C:/SimCenter/PBE
directory and extract the contents of the ZIP archive there. It is also recommended to run
the included installer for the Visual C/C++ runtime library (vc_redist.x64.exe). If you use a
Mac, we recommend you copy the application to either your home folder or your Desktop
folder. You are free to place the application anywhere you wish, but you will need to make the
appropriate adjustments to the following instructions if you do so.

Now test that the application starts. To do this navigate to the location where you placed
the application and open it. You should see the user interface (UI) shown in Figure 2.2 after
starting the application. Now Quit the application. Additional steps are required before
computations can be performed.

Figure 2.2: PBE Application on Startup


1. The SimCenter is not recognized as either a Windows or an Apple vendor, so our
applications are not recognized by the operating system as being signed. Consequently, you
may receive a warning message when you start the PBE application for the first time.

2. On a Mac you will need to right click on the .dmg file to open it. The UI will not start
correctly while inside the DMG file; you need to open the .dmg file and then copy the PBE
application to your Documents or Desktop folder. You can then move the .dmg file to
the trash or eject it after this has been done.

3. The PBE application requires the additional software outlined in the next subsections to
work properly. Even if the application starts correctly, it will not run correctly until this
software is installed.

2.2 Set up Python


The SimCenter workflow applications are managed by Python scripts. These are required to
prepare the input data for running analyses either remotely on DesignSafe or locally. As a
consequence the user must have Python installed on their machine and have the appropriate
environment variables set so that the UI can run these applications.

2.2.1 Install Python


SimCenter products require Python version 3.7 or above to be installed on your machine,
as January 2020 marks the end of life for Python 2.7.

1. Windows:
If you have not yet installed Python 3.7, we recommend installing from Python.org.
We recommend installing using the Windows x86-64 executable installer.
Allow the installer to change your system environment variables so that the install
directory is added to your PATH. Once Python is installed, you need to install the
following Python packages: numpy, scipy, and pandas. To install these packages,
open a terminal window as an Admin user and type:

pip install numpy


pip install scipy
pip install pandas

2. Mac


The Mac comes with Python pre-installed, which is currently the somewhat dated
version 2.7. To install Python 3.7 we recommend installing from Python.org, using
the macOS 64-bit installer given for the latest stable release.
The installer will place a python3 executable in your /usr/local/bin directory, whose
location should be on your system PATH.
You need to install the following Python packages: numpy, scipy, and pandas. To
install these packages, start a terminal window and type:

pip3 install numpy


pip3 install scipy
pip3 install pandas

Notes:
a) To start a terminal window you can use the Spotlight app (the magnifying glass at the
top right of the desktop). Start Spotlight and type in terminal. The Terminal application
should appear as the top hit. Click on it to start it.
b) In the tool preferences, make sure that python3 appears as the python executable. If
you used older versions of SimCenter tools this was the default.

2.2.2 Test Python


Test if the python environment is set up properly by executing python in a terminal window.
After Python starts, test if the packages are installed by executing import numpy, import
scipy, and import pandas. You will receive an error message if a package is missing. If
no error appears, the terminal should look similar to Figure 2.3. Exit Python by executing
the exit() command.
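
For reference, a successful test session might look roughly like the following (use python3 instead of python on a Mac):

python
>>> import numpy
>>> import scipy
>>> import pandas
>>> exit()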

Figure 2.3: Testing the Python environment.

2.3 Set up for Running Workflows Locally


To run the workflows locally, the backend Python application needs additional, publicly
available software to be installed and configured on your machine. If you do not plan to
run the workflows locally,
you will not need these applications.


2.3.1 Install OpenSees


OpenSees is an open-source finite element application publicly available for download from
its download page. OpenSees installation requires the user to install both OpenSees and Tcl.
If you have never downloaded OpenSees before, you will need to register your e-mail to gain
access. After registration, you can proceed to the download page by entering your email
address and clicking the Submit button. The Windows and Mac downloads are in different
locations on the download page, with the appropriate Tcl installer beside the OpenSees link;
see Figure 2.4.

Figure 2.4: Downloading OpenSees

Follow the instructions on the download page to install Tcl (Figure 2.4). On Windows,
you must select the Custom option for installation and you must specify the installation
directory as C:\Program Files\Tcl, which is not the default.

After Tcl is installed, we recommend you put OpenSees in the C:/SimCenter/OpenSees


folder on Windows and in a /usr/local/OpenSees directory on the Mac. (If you use Finder
on the Mac to navigate, use Command-Shift-G in Finder and specify /usr/local as the folder
to go to; create a new folder named OpenSees and copy the OpenSees application to it.)

Now you need to add the OpenSees folder to the system PATH environment variable to
allow the SimCenter workflow applications to find the OpenSees executable on your computer.


The steps to do this depend on your operating system:

1. Windows: To add a folder to the PATH on Windows (Figure 2.5):

Figure 2.5: Adding OpenSees to the PATH environment variable on Windows

a) open Start, type env, and choose Edit the system environment variables;
b) click on the Environment variables... button in the dialog window;
c) find the Path under System Variables in the Variable column;
d) click New and type in the path to your OpenSees.exe (this will be C:\SimCenter\OpenSees
if you put the executable at the recommended location - pay attention to using
backslashes here!);
e) click OK in every dialog to close them and save your changes.


2. MacOS: To add the /usr/local/OpenSees folder to the PATH variable:

a) open a Terminal;
b) execute (type the following in the terminal window and hit the return key) the
following:
nano ${HOME}/.bash_profile
c) if the file contains nothing, add the first 3 lines shown in Figure 2.6 to the file.
This is done in case an existing .bashrc file exists for your system. Adding these
3 lines will test for the existence of this file, and source in any existing commands
if the file does exist.
d) on a new line add the OpenSees executable to the PATH variable, by typing the
following:
export PATH=/usr/local/OpenSees:${PATH}
e) quit by hitting Ctrl+X and then Y when asked if you want to save modifications.
f) to test that it is entered correctly, run the following command in the terminal
window; it should result in no errors:
source ${HOME}/.bash_profile

Figure 2.6: Adding OpenSees to the PATH environment variable on Mac.
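
For reference, a .bash_profile set up as described above might look similar to the following
sketch (the exact contents shown in Figure 2.6 may differ slightly):

# source the user's .bashrc, if one exists
if [ -f ${HOME}/.bashrc ]; then
    source ${HOME}/.bashrc
fi

# add the OpenSees folder to the PATH
export PATH=/usr/local/OpenSees:${PATH}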

2.3.2 Install Dakota


Dakota, an open-source optimization and UQ application from Sandia National Labs, is
publicly available for download at its download page. Select your operating system from
the list and set the other options as shown in Figure 2.7. Download the release as a ZIP
file for Windows or a TAR.GZ file for Mac. We recommend extracting the archive to a
C:/SimCenter/Dakota folder on Windows, and to a /usr/local/Dakota folder on a Mac.


Figure 2.7: Downloading Dakota Software

You now need to add two Dakota folders to the system PATH environment variable to allow
the SimCenter workflow applications to find the Dakota tools on your computer. Follow
the procedure described above for OpenSees to add the following folders to your PATH:

1. Windows
Add the following 2 folders to your Windows PATH variable:
• C:\SimCenter\Dakota\bin
• C:\SimCenter\Dakota\share\dakota\Python
Now you need to create a new variable, PYTHONPATH, and point it to the following
folder (a command-line alternative is shown after this list).

• C:\SimCenter\Dakota\share\dakota\Python

2. MacOS On the Mac you also need to add 2 lines, previously shown in Figure 2.6, to
the .bash_profile file. One line adds the Dakota executable to the PATH variable,
and the other creates a new variable PYTHONPATH and points it to a folder in the
Dakota installation directory.


• export PATH=/usr/local/Dakota/bin:$PATH
• export PYTHONPATH=/usr/local/Dakota/share/dakota/Python
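
On Windows, as an alternative to the dialogs described above, the PYTHONPATH variable can
also be created from a Command Prompt with the setx command (shown here assuming the
recommended Dakota install location):

setx PYTHONPATH "C:\SimCenter\Dakota\share\dakota\Python"

A new Command Prompt must be opened for the change to take effect.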

NOTE: Apple, in the latest release of their operating system, macOS 10.15 Catalina, has
changed the default behavior of Gatekeeper. Gatekeeper, first introduced in OS X Mountain
Lion, is a Mac security feature that helps protect your Mac from malware and other malicious
software. Gatekeeper checks to make sure an application is safe to run by checking it
against the list of apps that Apple has vetted and approved for the Mac App Store, or
approved by Apple even if not offered through the App Store. In previous versions of macOS,
Gatekeeper had three security level options: App Store, App Store and Identified Developers,
and Anywhere. Anywhere has been removed, and this will cause problems with Dakota. As
a consequence, it is necessary to do the following when you update macOS or install
Dakota for the first time on a machine with an updated macOS. From the Terminal app, with
the above .bash_profile settings in place, type the following in the terminal window:

sudo spctl --master-disable


dakota
sudo spctl --master-enable

This will temporarily disable Gatekeeper (basically setting the Gatekeeper option to Anywhere),
allow the Dakota application and its .dylib files to be registered as safe, and then
turn Gatekeeper options back to default.

2.3.3 Install Perl


Mac OS X has Perl pre-installed, but Windows users will have to install it to be able to
use Dakota. We recommend you use Strawberry Perl; you can install it by downloading the
executable from the Strawberry Perl website and running it.

2.3.4 Test the Install of the Local Applications


Before running the PBE application, perform the following tests to make sure that the local
SimCenter working environment is set up appropriately:

• Start a Terminal on Mac or a Command Prompt on Windows.

• On Mac, execute cd ~/Documents to change the active directory to your Documents folder.


On Windows, execute cd C:/ to change the active directory to C:/.

• Test if OpenSees works correctly by executing the OpenSees command. The command
should start OpenSees (Figure 2.8). Close OpenSees with the exit command.


• Test if Dakota works correctly by executing the dakota command. The command
should start Dakota and you should see a message about a missing argument
(Figure 2.9).

• Test if Perl works correctly by executing the perl -v command. The command should
start Perl and return its version number (Figure 2.10).

• Test if the python package in Dakota works correctly by starting Python with the
python command and then executing the import dakota command. This should
import the dakota package. If you do not see errors, then the package is successfully
imported (Figure 2.11). Exit Python with the exit() command.

• If all the above tests ran without errors, your environment is set up appropriately.
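
In summary, the checks above amount to running the following from the terminal (shown here
assuming the PATH and PYTHONPATH changes described earlier are in place):

OpenSees            (type exit to leave the OpenSees interpreter)
dakota              (a message about a missing argument is expected)
perl -v
python              (then import dakota, followed by exit())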

Figure 2.8: Testing OpenSees.


Figure 2.9: Testing Dakota.

Figure 2.10: Testing Perl.

Figure 2.11: Testing the dakota Python package.


2.4 Test the PBE application


Once the local SimCenter working environment has been tested and is functioning correctly,
the PBE Application can be tested. The simplest way to do this is by running an analysis
using the default structural model with synthetic ground motions. By doing this, it is
not necessary to enter any information on the structural model and only inputs for the
synthetic motions, uncertainty quantification, and loss assessment are required. With this
quick setup, the functionality of the PBE UI and the associated backend workflow can be
tested. The necessary steps to perform this testing are provided below.
A full description of how to use this software is provided in Chapter 3. In this quick test,
users will only interface with the event tab (EVT), the uncertainty quantification tab (UQ),
the damage and loss assessment tab (DL), and the results tab (RES).
The first step is to start the PBE application. Once the application is started, the second
step is to input the parameters for the synthetic motions under the EVT (Event) tab. This is
shown in Figure 2.12. Click on the EVT tab which will allow the loading type to be selected.
From the dropdown menu select Stochastic Ground Motion Model. Upon selecting this
loading type, the loading model will be set as Vlachos et al. (2018).

Figure 2.12: Selecting event type and inputting synthetic motion parameters

Only three inputs are required for this test, of which one will be set to a random variable.
As shown in Figure 2.12, set the Moment Magnitude (MW) to 6.5, the Closest-to-Site Rupture
Distance (Rrupt) to 20 km, and the Average shear-wave velocity for the top 30 m (VS30) to
vs30. The Provide seed value radio button should be left unselected. By specifying these
inputs, both MW and Rrupt will have constant values in all realizations while VS30 will have
different values based on the model parameters specified in the uncertainty quantification
(UQ) tab. With these inputs specified, navigate to the UQ tab. Here the distributions and their
relevant parameters will be specified for the random variables defined in the analysis—only
VS30 in this case. Since VS30 was identified as a random variable by inputting the parameter
value as text, it is automatically added as a random variable, as shown in Figure 2.13. Set


the distribution type to normal with a Mean and Standard Dev of 350 m/s and 25 m/s,
respectively.

Figure 2.13: Specifying distribution type and parameters for random variables in analy-
sis—only Vs30 in this case

The last step before running the analysis is to set up a damage and loss model. Navigate
to the DL tab and select HAZUS MH as the loss assessment method to use. Set up the model
according to Figure 2.14.

Figure 2.14: Specifying the damage and loss model for the analysis.


Now, click on the RUN button, which will bring up a pop-up menu that provides informa-
tion on the application directory and the working directory. The application directory should
already be automatically set to where PBE is installed. If desired, the working directory can
be changed. In order to start the analysis, click on the Submit button.

Figure 2.15: Results for test analysis. This tab will open automatically when the analysis
completes, indicating a successful installation

If successful, the application will pause briefly while it runs the analysis before automatically
displaying the simulation results in the RES tab, as shown in Figure 2.15. Remember, the
results shown in Figure 2.15 will most likely not be the same as those from this local test,
since VS30 is a random variable and the values realized in the simulations will be different
while still following the same distribution. In any case, if the simulations completed
and the RES tab is showing simulation results, then the PBE App is properly installed and
configured.

3 Usage
The user interface (UI), as shown in Figure 3.1, is where the analysis is configured and
managed. Here, the user is able to provide the necessary parameters to create the simulation,
start the simulation both locally and remotely, and view the simulation results. The interface
contains several separate areas:

Figure 3.1: The User Interface (UI)

1. Input Panel Selection: This area on the left side provides the user with a selection of
items to choose from:

a) GI: General Information (Section 3.1), for specification of building description,


location and units.


b) SIM: Structure Information Model (Section 3.2), for description of the building
model.
c) EVT: Event (Section 3.3), for selecting the input earthquake motions for the
building.
d) FEM: Finite Element Method (Section 3.4), for specifying the options for struc-
tural response simulation.
e) UQ: Uncertainty Quantification (Section 3.5), for defining the distribution of the
random parameters and UQ method analysis options.
f) DL: Damage and Loss Model (Section 3.6), for specification of the damage and
loss model parameters.
g) RES: Results output (Section 3.7), for looking at the results.

Selecting any of these will change the input panel presented.

2. Input Panel: This is the large central area of the UI where the user provides input for
the application chosen and where they can view the results. For example, if the user
had selected UQ in the input panel selection, it is in this panel that the user would
provide details on the distributions associated with each random variable or select the
sampling method to use and provide the options necessary to run that method.

3. Push Buttons: This is the area near the bottom of the UI in which 4 buttons are
presented to the user:

a) RUN – Run the simulation locally on the user’s desktop machine.


b) RUN at DesignSafe – Process the information, and send to DesignSafe. The
simulation will be run there on a supercomputer, and results will be stored in the
user’s DesignSafe jobs folder.
c) GET from DesignSafe – Obtain the list of jobs for the user from DesignSafe and
select a job to download from that list.
d) Exit: Exit the application.

The first 3 of the above buttons and their use are discussed in more detail in Section 3.8.

4. Login Button: The Login Button is at the top right of the UI. Before the user can launch
any jobs on DesignSafe, they must first login to DesignSafe using their DesignSafe login
and password. Pressing the login button will open up the login window for users to
enter this information. Users can register for an account on the DesignSafe webpage.

5. Message Area: While the application is running, error and status messages will be
displayed here, in the top center of the user interface.


3.1 GI: General Information


The user here provides information about the building and the units the user will work with.
The widget contains 4 separate frames, as shown in Figure 3.2:

1. Building Information: Collects general information about the building: name, type,
and year of construction.

2. Properties: Collects information about number of stories, width, depth, plan area and
height of the building.

3. Location: Collects information about the location of the building. This information is
used in some event widgets to obtain events specific to the building location.

4. Units: Collects information about the units for the inputs and outputs. Some widgets
will require inputs in different units. Those widgets will display units beside those
special entry fields.

Figure 3.2: General Information Input Panel


3.2 SIM: Structural Information Model


The user here defines the structural system of the building. The structural system is the
part of the building provided to resist the gravity loads and those loads arising from the
natural hazard. There are a number of backend applications provided for this part of the
workflow, each responsible for defining the structural analysis model. The user can select
the application to use from the drop-down menu at the top of this panel. As the user
switches between applications, the input panel changes to reflect the inputs each particular
application requires. At present, there are two backend applications available through the
drop down menu:

1. Multiple Degrees of Freedom (MDOF) (Section 3.2.1)


2. OpenSees (Section 3.2.2)

3.2.1 Multiple Degrees of Freedom (MDOF)


This panel is provided for users to quickly create simple shear models of a building. The
panel, as shown in Figure 3.3 is divided into 3 frames:
1. The top left frame allows the user to specify the number of stories and properties that
are constant for every floor and story in the building. The following properties are
available: floor weight, story height, torsional stiffness, initial stiffness, yield strength,
and hardening ratio for each direction in each story. The user has the option of specifying
values for the eccentricity of mass in the x and y directions, and eccentricities for the
location of the response quantities. Here, the one and two directions are orthogonal x
and y axes in plan view.
2. The lower left frame allows the user to override the structural parameters above for
individual floors and stories.
3. The frame on the right is a graphical widget showing the current building. When
entering data into the lower left frame, the stories corresponding to the data being
modified are highlighted in red.
Random Variables: Random Variables can be created by the user if they enter a valid
string instead of a number in the entry fields for any entry except for the Number of floors.
The variable name entered will appear as a Random Variable in the UQ panel; it is there that
the user must specify the distribution associated with the Random Variable.

3.2.2 OpenSees
This panel is for users who have an existing OpenSees model of a building that performs a
gravity analysis and who now wish to subject that building model to one of the EVT options
provided.

Figure 3.3: MDOF or Shear Building Model

The input panel for this option is shown in Figure 3.4. Users need to provide
four pieces of information:

1. Main OpenSees Script: The main script that contains the building model. This script
should build a model and perform any gravity analysis of the building that is required
before the event is applied.

2. Response Nodes: A list of node numbers that define a column line of interest for which
the responses will be determined. The column nodes should be in order from ground
floor to roof. The EDP workflow application uses this information to determine nodes
at which displacement, acceleration, and story drifts are calculated.

3. An entry for the dimension of the model (i.e., 2D or 3D). This information is used
when ground motions are applied.

4. An entry for the number of degrees of freedom at each node in the model.

Random Variables: In OpenSees there is an option to set variables to have certain values
using the pset command, e.g., pset a 5.0 will set the variable a to a value of 5.0 in the
OpenSees script. In PBE, any variable found in the main script to be set using the pset
command will be assumed to be a Random Variable. As such, when a new main script is
loaded all variables set with pset will appear as Random Variables in the UQ panel.
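
As an illustration, a main script might expose a couple of hypothetical material parameters
with pset; each would then show up as a Random Variable in the UQ panel (a minimal sketch
with made-up parameter names and values):

# fragment of a main OpenSees script
model BasicBuilder -ndm 2 -ndf 3

pset fc 4.0   ;# concrete compressive strength (ksi), exposed to PBE as a random variable
pset fy 60.0  ;# steel yield strength (ksi), exposed to PBE as a random variable

# the values are then used like ordinary Tcl variables in the model definition
uniaxialMaterial Concrete01 1 [expr -$fc] -0.002 0.0 -0.006
uniaxialMaterial Steel01    2 $fy 29000.0 0.02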


Figure 3.4: OpenSees Model

3.3 EVT: Event


The event panel presents the user with a drop-down menu with a list of available event
applications. Event applications are applications that, given the building and user supplied
data inputs, will generate a list of events (i.e., typically time-dependent loads that represent
natural disasters) for the building. The following options are available in the drop-down
menu:

1. Multiple Existing SimCenter Events (Section 3.3.1)


2. Multiple PEER Events (Section 3.3.2)
3. Hazard Based Event (Section 3.3.3)
4. Stochastic Ground Motion (Section 3.3.4)
5. Site Response (Section 3.3.5)
6. User Application (Section 3.3.6)

3.3.1 Multiple Existing


This panel is provided for the user to specify multiple existing SimCenter event files. If more
than one event is specified it is done to provide the UQ engine with a discrete set of events to
choose from—it is not done with the intention of specifying that one event follows another.
The panel presented to the user is shown in Figure 3.5.
Use the Add button to add a new event. This adds an empty event to the panel. Pressing
the button multiple times will keep adding events to the panel. Figure 3.5 shows the state
after the button has been pressed twice, and data entered to load the El Centro and Rinaldi
Events.
The path to the event file can be entered manually, or using the Choose button for
convenience. Pushing the button brings up a typical file search screen. By default, a scale
factor of 1.0 is assigned to the event. The user can change this to another floating point
value (DO NOT USE INTEGER), and they can define the scale factor as a random variable
by entering a variable name, such as factorRinaldi for the second event in Figure 3.5.


Figure 3.5: Multiple Existing (SimCenter) Events

Note: the name of the random variable must not start with a number, or contain any
spaces or special characters, such as -, +, %, etc.
The Remove button is used to remove events. To remove an event, the user must first
select events they wish to remove, which is done by clicking in the small circle at the left
side of the event frame. All of the selected events are removed when the Remove button is
pressed.
The Load Directory button provides a convenient method to load multiple events. All
event files shall first be placed into the same folder. We recommend putting the files in a
folder of their own, with no other files besides the earthquake events in it. After pressing
the Load Directory button, the user will be able to choose the directory that contains the
files, and the application will load all event files (i.e., every file with a .json extension) into
the widget automatically.
Initially, every event will be given a load factor of 1.0. Load factors can be assigned
automatically by preparing a Records.txt file in the directory with the events. Each line in
the Records.txt shall represent one event file, and contain two comma separated values: the
event file name and the desired scale factor. The application will open that file automatically
and assign the prescribed load factors to the events. Using a Records.txt file also allows
users to load only a subset of the events from a folder by listing only those in the file. An
example Records.txt is shown below:

ElCentro.json,1.5
Rinaldi.json,2.0

Random Variables: Scale factors can be defined as being random variables by entering a
string in the factor field. The variable name entered will appear as a Random Variable in the
UQ panel and the user must specify its distribution there. If multiple events are specified,
the event itself will also be treated as a random variable, with each event being part of
the discrete set of possible events. For this discrete set the user does not define a distribution
as this is done automatically.


3.3.2 Multiple PEER Event


This event option is provided for the user to specify multiple existing PEER ground motion
files. PEER files contain time-histories in a single degree of freedom; hence, if multi-degree-
of-freedom excitation is desired, the user is required to specify each individual component
for every EVENT. The Add/Remove buttons at the top are to create and remove an event,
as per Section 3.3.1. The + and - buttons add and remove components (see Figure 3.6).
Remove removes all selected components. Each component in a PEER event can have its
own scale factor, which can be defined as a random variable.

Figure 3.6: Multiple PEER Events

The Load Directory button and the Records.txt file work for PEER events as
described in Section 3.3.1. The only difference is that PEER files are expected to have an .AT2
extension. Only files with that extension shall be specified in the Records.txt. An example
Records.txt file for multiple Peer events is shown below:

elCentro.AT2,1.5
Rinaldi228.AT2,2.0
Rinaldi318.AT2,2.0

Random Variables: Random scale factors can be defined by entering a string in the factor
field. The variable name entered will appear as a Random Variable in the UQ panel and the
user must specify its properties there. If multiple events are specified, the event itself will be
treated as a random variable, with each event being part of the discrete set of possible events.
For this discrete set the user does not define a distribution as this is done automatically by
the UI.

3.3.3 Hazard Based Event


The panel for this event application is as shown in Figure 3.7. This application implements
a scenario-based (deterministic) seismic event. In this panel the user specifies an earthquake


rupture (location, geometry and magnitude), a ground motion prediction equation (GMPE),
a record selection database and the intensity measure used for record selection. In the back-
end, this application relies on three other applications to perform seismic hazard analysis,
intensity measures simulation (to create a simulated target spectrum), and ground motion
record selection/scaling. Users interested in learning about those applications are referred
to the documentation of the SimCenter Ground Motion Utilities.

Figure 3.7: Hazard Based Event

3.3.4 Stochastic Ground Motion Model


This option allows users to generate synthetic ground motions for a target seismic event.
In order to do so, the stochastic ground motion model is selected from the drop-down
menu, as shown in Figure 3.8. Depending on the model selected, the user will be asked
to enter values for a number of parameters that are used to generate a seismic event. In
the current release, users can select between the model derived by Vlachos et al. (2018)
and the model developed by Dabaghi & Der Kiureghian (2014, 2017, 2018).
The geometric directivity parameters, as shown in Figure 3.8b, required by the Dabaghi &
Der Kiureghian model are described completely in Somerville et al. (1997).
Additionally, users can provide a seed for the stochastic motion generation if they desire the
same suite of synthetic motions to be generated on multiple occasions. If the seed is not
specified, a different realization of the time history will be generated for each run. The back-
end application that generates the stochastic ground motions relies on smelt, a modular and
extensible C++ library for generating stochastic time histories. Users interested in learning
more about the implementation and design of smelt are referred to its GitHub repository.


All input parameters can be specified as random variables by entering a string in the
parameter field. Please note that information for the inputs that are identified as random
variables needs to be provided in the UQ tab.

(a) Vlachos et al. (2018) model inputs

(b) Dabaghi & Der Kiureghian (2018) model inputs

Figure 3.8: Stochastic Ground Motion Event


3.3.5 Site Response


This option allows users to determine the event at the base of the building by performing an
effective free-field site response analysis of a soil column. In this panel the user specifies a
ground motion at the bottom of the column. After the soil layers have been properly defined,
the motion at the ground surface is obtained at the end of the analysis, and it is this motion
that will be used in the simulation of the building response.

Figure 3.9: Site Response Analysis Event

The UI of the Site Response is shown in Figure 3.9. It is split into the following areas:
1. Soil Column Graphic: The first graphic on the left of the panel shows a visualization
of the soil column. To select a layer, the user must move their cursor over the area and
then select it. The selected layer will be outlined with a red box.
2. FE Mesh Graphic: The second graphic on the left shows the finite element mesh and
profile plots. Selecting any of the tabs on the right inside this graphic (i.e., PGA, γmax,
maxDisp, maxRu, maxRuPWP) will show various results from the simulation at the
mesh points.
3. Operations Area: The right side of this area shows the height, total number of soil
layers, ground water table (GWT), and includes plus and minus buttons. If the user
presses the plus button, a layer is added below the selected layer. If the minus button
is pressed the selected layer is removed. The GWT input field allows the user to specify
the level of the ground water table.
4. Analyze Button: A button the user presses when inputs for this widget are all entered.
The site response tool currently requires that the site response analysis be performed
before the RUN button can be pressed.


5. Soil Layer Table: This table is where the user provides the characteristics of the soil
layer, such as layer thickness, density, Vs30 , material type, and element size in the finite
element mesh.
6. Tabbed Area: This area contains the three tabbed widgets described below.

Figure 3.10: Slope definition for 2D Column

Figure 3.11: Slope definition for 3D Column

a) Configure Tab: This tab allows the user to specify the path to the OpenSees
executable and to a ground motion file that represents the ground shaking at the
bedrock. The rock motion file must follow the SimCenter event format; examples
of SimCenter event files are available with the source code. s3hark determines
whether to use a 2D or a 3D column based on the ground motion file provided:
when a ground motion file is selected from the local computer, or its path is typed
in, s3hark checks whether it contains 1D or 2D shaking. For 1D shaking, all
elements will be 2D; for 2D shaking, all elements will be 3D. The slope definitions
for 2D and 3D columns differ; see Figure 3.10 and Figure 3.11.

Figure 3.12: Nodal and elemental responses
b) Layer Properties Tab: This tab allows the user to enter additional material prop-
erties for the selected soil layer (Figure 3.13).
c) Response Tab: Once the site response analysis has been performed, this tab
provides information about element and nodal time varying response quantities.
See Figure 3.12.
d) Run Tab: Opens a window in which, by using the up and down arrows on the
keyboard, the dino will jump up and down. Something to do if the site response
analysis is taking too long, which it may be if many soil layers are used.

7. Analyze Button: This button shall be used to run the simulation locally. A progress
bar will show the status of the analysis. This allows the user to review the ground
motion predicted at the surface.

Upon completion of the finite element analysis, the ground motion at the soil surface
(Figure 3.14) will be stored in EE-UQ’s input file. This computed motion will be applied
during the simulation.



Figure 3.13: Soil Layer Modification in Site Response

Figure 3.14: Simulated Motion at the Surface of the Ground


Random Variables: The current version of the Site Response event type does not support
random variables.

NOTES:
1. Variables are assumed to have m, kPa, and kN units in the Site Response panel.
2. If the Analyze button is not pressed, no simulation will be performed. If no simulation
is performed there will be no ground motions provided to the building.

3.3.6 User Application


The final option for event definition is a user application. The user specifies the application
name and the input file containing the specific input information needed by the application
when it is running in the backend. As will be discussed later, if the user selects to utilize
an application that is not provided, they are also required to edit the tools registry file.
Here they must include a new event application with the same name as that entered and
they must provide the location where that application can be found relative to the tools
application directory. If running on DesignSafe, that application must be built and must be
available on the Stampede2 supercomputer.
Note: Given how DesignSafe runs the applications through Agave, the file permissions
of this application must be world readable and executable.

Figure 3.15: User defined event

3.3.7 PEER NGA Records


This event allows the user to perform ground motion record selection and scaling using the
PEER NGA West 2 ground motion database. A suite of records can be selected from the
database to represent the uncertainty in the ground motion. The following steps are needed
to use this event:

1. A target response spectrum must be specified by the user to be used for records selection
and scaling.
2. The user specifies selection criteria, such as the number of records and optional ranges
of earthquake magnitude, distance to rupture (Rrup), and shear wave velocity in the
top 30 meters of soil (Vs30).


3. The user runs the record selection and inspects the selected suite of ground motions.
It is important to note that this event requires a PEER NGA West 2 account; users
will be asked to provide their credentials (user name and password) to log in to the
database. Users who do not have an account will be forwarded to the account sign up
web page. Record selection is always done to minimize the mean square error between
the target spectrum and the scaled spectra of the selected records. It is also important
to note that the current version only allows users to specify the ASCE 7-10 design spectrum as a
target. Future versions will allow the users to specify a user-provided target spectrum
or a target spectrum obtained from seismic hazard analysis, such as the uniform hazard
spectrum (UHS) or the conditional mean spectrum (CMS).

Figure 3.16: PEER NGA Records Event

After a suite of records is selected from the database, the list of records is shown
in tabular form for the user to inspect their information, as shown in Figure 3.16.
Additionally, a plot is generated showing the target spectrum, the mean and standard
deviation of the selected suite of records, and the spectra of the selected scaled ground motions.
Users can also highlight particular spectra on the plot by selecting one or more
records in the table provided. This enables the user to inspect the suite of records used
to characterize the ground motions before running the building simulation.

3.4 FEM: Finite Element Method


The FEM panel will present users with a selection of FEM applications that will take
a building model generated by the BIM application and the EVENT from the event
application and perform a deterministic simulation of structural response. Currently,
there is one application available, OpenSees, and there is no application selection box.
That will be modified in future versions to allow users to provide their own simulation
application. The current OpenSees implementation extends the standard OpenSees
executable with a pre- and post-processor to take the BIM and EVENT files and use
OpenSees to simulate the response, and return it in an EDP file.

Figure 3.17: Options for OpenSees transient analysis

For the OpenSees application the user is required to specify the options to be used in
the transient analysis. As shown in Figure 3.17, this includes the choice of
a) Solution algorithm, the default is Newton Raphson.
b) Integration Scheme, the default is Newmark’s linear acceleration method.
c) Convergence Test, the default is a norm on the unbalance force.
d) Convergence tolerance, with a default of 0.01.
e) Damping Ratio, the default is 2% equivalent viscous damping entered as 0.02. If a
damping ratio of 0 is specified, no damping is added by the simulation application.
This allows users to add their own damping in the OpenSees tcl script they load
under SIM.
f) Analysis Script. This shall be left blank by default. Advanced users of OpenSees
who have their preferred analysis script and wish to provide their own damping
model can provide it here.
The options available for each setting can be found in the OpenSees online user manual.

A default transient analysis script is run with these inputs. It is built for Version
3.0.0+ of OpenSees and uses a divide and conquer algorithm to overcome convergence
issues. This new algorithm does not work for every nonlinear problem. The actual
analysis command that is created based on the defaults is the following:

numberer RCM
system Umfpack
integrator Newmark 0.5 0.25
test NormUnbalance 0.01 20
algorithm Newton
analysis Transient -numSubLevels 2 -numSubSteps 10
analyze $numStep $dt

If the user specifies their own analysis script to run instead of the default, they can
take advantage of the numStep and dt variables that are obtained from the EVENT
and are automatically set by the program.

3.5 UQ: Uncertainty Quantification


Throughout the input specification the user is defining variables. As described in the
above sections many of these variables can be specified by the user to be random
variables. The UQ panel is where the user specifies the distribution of these random
variables. Besides the properties of random variables, the sampling method and the
number of requested samples shall also be defined by the user. The panel is split, as
shown in Figure 3.18, into two frames:

a) Sampling Methods
b) Random Variables

3.5.1 Sampling Methods


In the forward propagation problem, the user selects the sampling method to use
from the Sampling Methods dropdown menu. Currently there are five options avail-
able: Monte Carlo Sampling (MCS), Latin Hypercube Sampling (LHS), Importance
Sampling (IS), and sampling based on surrogate models, including Gaussian Process
Regression (GPR) and Polynomial Chaos Expansion (PCE). Depending on the op-
tion selected, the user must specify the appropriate input parameters for each. For
instance, for MCS, the number of samples specifies the number of simulations to be
performed, and providing a random seed allows the user to reproduce the same set of
samples from the random variables multiple times.

Figure 3.18: Uncertainty Quantification input panel

Figure 3.19: Monte Carlo Sampling input panel

Figure 3.19 shows the input panel corresponding to the Monte Carlo Sampling
(MCS) setting. Two input parameters need to be specified, the number of samples to
be executed, as well as the seed used in generating the random samples.
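
The role of these two inputs can be illustrated with a short, self-contained Python sketch (this is not the UQ engine used by PBE; the model function and random variable definitions below are hypothetical placeholders for the full response workflow):

import numpy as np

def model(fc, fy):
    # Placeholder response function standing in for the full workflow
    # (createEVENT -> createSAM -> performSIMULATION) that returns an EDP.
    return 0.002 * fy / fc

rng = np.random.default_rng(seed=42)   # the seed makes the sample set reproducible
n_samples = 500                        # the number of samples requested in the panel

fc = rng.normal(6.0, 0.6, n_samples)   # hypothetical concrete strength, Normal(6, 0.6)
fy = rng.normal(60.0, 6.0, n_samples)  # hypothetical steel yield strength, Normal(60, 6)

edp = model(fc, fy)
print(f"mean EDP = {edp.mean():.4f}, std = {edp.std(ddof=1):.4f}")

Re-running the sketch with the same seed reproduces the same samples and therefore the same statistics.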

Figure 3.20: Latin Hypercube Sampling input panel

Figure 3.20 shows the input panel corresponding to the Latin Hypercube Sam-
pling (LHS) scheme. Two input parameters also need to be specified, the number of
samples to be executed, as well as the seed used in generating the LHS samples.
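
For comparison, a minimal sketch of Latin Hypercube Sampling using the same hypothetical variables is shown below; it assumes scipy is available and uses its qmc module to stratify the unit hypercube before mapping to the physical distributions:

from scipy.stats import norm, qmc

n_samples, seed = 500, 42
sampler = qmc.LatinHypercube(d=2, seed=seed)   # two random variables
u = sampler.random(n_samples)                  # stratified points on the unit hypercube

# Transform the uniform samples to the physical distributions via the inverse CDF
fc = norm(loc=6.0, scale=0.6).ppf(u[:, 0])
fy = norm(loc=60.0, scale=6.0).ppf(u[:, 1])

edp = 0.002 * fy / fc                          # same placeholder model as before
print(f"mean EDP = {edp.mean():.4f}, std = {edp.std(ddof=1):.4f}")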

Figure 3.21: Importance Sampling input panel

For rare event analysis, Figure 3.21 shows the input panel for the Importance Sam-
pling (IS) scheme. Similar to MCS and LHS, IS requires both the number of samples
to be executed and the corresponding seed for generating the random samples. In
addition, the Importance Sampling algorithm can be performed via three different
approaches, as specified by the third input, Method. The options include Basic Sampling,
Adaptive Sampling, and Multimodal Adaptive Sampling.
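
As a rough illustration of the basic idea behind Importance Sampling (independent of the adaptive variants offered in the panel), the following sketch estimates a small exceedance probability by sampling from a proposal density centered on the failure region; the demand distribution and threshold are made up for the example:

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n_samples = 2000

# Hypothetical demand: a standard normal variable; the rare event is X > 3
f = norm(0.0, 1.0)
threshold = 3.0

# Proposal density centered on the failure region -- the essence of basic IS
q = norm(threshold, 1.0)
x = q.rvs(size=n_samples, random_state=rng)

w = f.pdf(x) / q.pdf(x)                    # importance weights
p_fail = np.mean((x > threshold) * w)      # weighted indicator of failure

print(f"IS estimate: {p_fail:.2e}   exact: {f.sf(threshold):.2e}")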

For uncertainty propagation with surrogates, two popular surrogates are available,
namely Gaussian Process Regression (GPR) and Polynomial Chaos Expansion (PCE).
Figure 3.22 shows the input panel for the GPR model, with input panels for
training and sampling.

Figure 3.22: GPR forward propagation input panel

For uncertainty propagation with Gaussian Process Regression (GPR), Figure 3.22
shows the input panel for the GPR model, with input panels for training
and sampling as well. The first set of input parameters in the surrogate training data
specify the dataset used for training the surrogate model, while the second set of in-
put parameters in the surrogate sampling data relate to the dataset used for sampling
the surrogate. Care must be taken in specifying the training dataset to result in an
accurate response surface approximation.
For uncertainty propagation with Polynomial Chaos Expansion (PCE), Figure 3.23
shows the input panel for the PCE model, with input panels for training and
sampling as well, similar to the input GPR panel. The first set of input parameters in
the surrogate training data specify the dataset used for training the surrogate model,
while the second set of input parameters in the surrogate sampling data relate to the
dataset used for sampling the surrogate. Extreme care must be taken in specifying the
parameters of the training dataset to result in an accurate response surface approxi-
mation.
If you are not sure about the training parameters of the surrogates, please refrain from
using the surrogates for forward propagation and use instead conventional sampling
such as MCS and LHS as discussed above.

Figure 3.23: PCE forward propagation input panel
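
To clarify the split between surrogate training data and surrogate sampling data, the following illustrative sketch trains a Gaussian Process surrogate on a small number of runs of a placeholder model and then samples the surrogate many times. It uses scikit-learn, which is not the library used by the PBE backend, and all names and values are hypothetical:

import numpy as np
from scipy.stats import norm, qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

def expensive_model(x):
    # Placeholder for a full response simulation returning a single EDP
    return 0.002 * x[:, 1] / x[:, 0]

# Surrogate training data: a small number of carefully placed (LHS) model runs
u = qmc.LatinHypercube(d=2, seed=0).random(30)
x_train = np.column_stack([norm(6.0, 0.6).ppf(u[:, 0]),
                           norm(60.0, 6.0).ppf(u[:, 1])])
y_train = expensive_model(x_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[1.0, 10.0]),
                              normalize_y=True)
gp.fit(x_train, y_train)

# Surrogate sampling data: many cheap evaluations of the fitted surrogate
rng = np.random.default_rng(1)
x_new = np.column_stack([rng.normal(6.0, 0.6, 10_000),
                         rng.normal(60.0, 6.0, 10_000)])
y_new = gp.predict(x_new)
print(f"surrogate mean EDP = {y_new.mean():.4f}, std = {y_new.std(ddof=1):.4f}")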

3.5.2 Reliability Analysis


For reliability analysis, Figure 3.24 shows the input panel for the reliability
capabilities. Currently, both First-Order Reliability Methods (FORM) and Second-
Order Reliability Methods (SORM) are supported. The user can specify the local or
global solution in the Reliability Scheme input parameter. In addition, the user needs
to specify the method for searching for the Most Probable Point (MPP); if not sure,
do not use any MPP approximation.
For both first- and second-order reliability analysis, the user needs to specify either
the response levels or the probability levels at which the CDF of the QoI needs to be
queried. The user can specify multiple query points, separated by a space.
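
The two query types can be read as evaluating the CDF of the quantity of interest (QoI) and its inverse. The sketch below illustrates the meaning of response levels and probability levels using plain sampling; FORM and SORM obtain these values from an approximation around the Most Probable Point rather than from samples:

import numpy as np

rng = np.random.default_rng(0)
qoi = rng.lognormal(mean=0.0, sigma=0.4, size=100_000)   # stand-in samples of the QoI

# Response levels: "what is the probability that the QoI does not exceed this value?"
for level in (0.5, 1.0, 2.0):
    print(f"P(QoI <= {level}) ~ {np.mean(qoi <= level):.4f}")

# Probability levels: "which response value is not exceeded with this probability?"
for p in (0.10, 0.50, 0.90):
    print(f"{p:.0%} level of the QoI ~ {np.quantile(qoi, p):.3f}")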

3.5.3 Random Variables


The RV panel allows the user to specify the probabilistic distributions of the random
variables for the problem at hand. The following probabilistic distributions for the
random variables are currently supported:

a) Gaussian
b) Lognormal
c) Beta
d) Uniform
e) Weibull
f) Gumbel

Figure 3.24: Reliability input panel

Each distribution has different parameters, and the user needs to provide the
parameters corresponding to the distribution selected for each random variable. Once the user
selects the distribution of the random variable, the corresponding input boxes for the
parameters will show.
Figure 3.25 shows the panel for a problem with four Random Variables with all random
input following Gaussian distributions.

Figure 3.25: Random Variable specification
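
For reference, the sketch below instantiates each of the supported distribution types with illustrative parameters using scipy; the parameter names and conventions of the UQ engine itself may differ, so treat this only as a reminder of what each distribution family looks like:

from scipy import stats

# One frozen distribution per supported type (parameter values are illustrative only;
# the parameterization used by the UQ engine may differ from scipy's conventions).
random_variables = {
    "mass":    stats.norm(loc=2.0, scale=0.2),               # Gaussian (mean, std)
    "E":       stats.lognorm(s=0.1, scale=30000.0),          # Lognormal
    "damping": stats.beta(a=2.0, b=5.0, loc=0.0, scale=0.1), # Beta on [0, 0.1]
    "load":    stats.uniform(loc=10.0, scale=5.0),           # Uniform on [10, 15]
    "wind":    stats.weibull_min(c=2.0, scale=30.0),         # Weibull
    "surge":   stats.gumbel_r(loc=5.0, scale=1.2),           # Gumbel
}

for name, dist in random_variables.items():
    print(f"{name:8s} mean = {dist.mean():9.3f}  std = {dist.std():8.3f}")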

3.6 DL: Damage and Loss Model


The Damage and Loss panel provides users a convenient way to define the damage and
loss model for the building. The dropdown list at the top of the panel allows users to
choose between two loss-assessment methods: FEMA P58 and HAZUS MH.
The method chosen determines the information displayed in the rest of the panel. The
two methods are discussed in the following sections.

3.6.1 FEMA P58


This option implements the loss assessment methodology described in the FEMA P58
documents. The main panel is divided into three parts that can be accessed by clicking
at the tabs at the top of the input panel.

3.6.1.1 General Settings

Figure 3.26 shows the first panel, which corresponds to general damage and loss set-
tings. The panel allows the user to set the following parameters of the loss assessment:

• Uncertainty Quantification
– Realizations: The number of realizations to generate using the stochastic
loss model. Depending on the complexity of the model, a few thousand
realizations might be sufficient to capture central tendencies. A much larger
number is required to get appropriate estimates of the dispersion of results.
– Additional uncertainty: Ground motion and modeling uncertainty per FEMA
P58. The prescribed logarithmic standard deviation values are added to the
dispersion of EDPs to arrive at the description of uncertain building response.
• Decision Variables
These checkboxes allow the user to pick the decision variables of interest and save
computation time and storage space by only focusing on those. Expect this list
to grow as more decision variables are added in the future.
• Inhabitants
– Occupancy type: The type of occupancy is used to describe the temporal
distribution of the inhabitants. Note: the default FEMA P58 distribution
can be overridden by a custom file provided in the Custom Data Sources box.
– Peak Population: The maximum number of people present at each floor of
the building. The example in Figure 3.26 shows a two-story wooden house
with a cripple wall, hence the 0 population in the first floor.

Figure 3.26: The General Damage and Loss Settings panel. (The settings shown in the
Figure serve demonstration purposes and are not the suggested inputs.)

• Building Response
– Yield Drift Ratio: This prescribed value is used to estimate residual drifts
from peak interstory drifts per Section 5.4 in FEMA P58 (see the sketch after
this list). These are only needed if no reliable residual drifts are available from
the simulation. Con-
sidering the large uncertainty in estimated residual drift values, it is recom-
mended to consider using the peak interstory drift as a proxy even if it would
be numerically possible to obtain residual drift values.
– Detection Limits: These limits correspond to the maximum possible values
that the response history analysis can provide. While peak interstory drifts
will certainly have an upper limit, peak floor acceleration will not necessarily
require such a setting. Leaving any of the fields empty corresponds to unlim-
ited confidence in the simulation results.
Note: these limits will be used to consider EDP data as a set of censored
samples when finding the multivariate distribution that fits the simulation
results.

• Building Damage and Loss


– Replacement Cost and Time: The cost (in the currency used to describe
repair costs, typically US dollars) and the time (in days) it takes to replace
the building.
– Irrepairable Residual Drift: Describes the limiting residual drift as a random
variable with a Lognormal distribution. See Figure 2-5 and Section 7.6 in
FEMA P58 for details.
– Collapse Limits: Collapse of the building in each realization is inferred from
the magnitude of EDPs. The collapse limits describe the EDP value beyond
which the building is considered collapsed. Note that collapse limits might
be beyond the detection limits (although that is generally not a good idea)
and certain EDPs might not have collapse limits associated with them (e.g.
PFA).
• Loss Model Dependencies
The PBE App allows you to specify dependencies between various parts of the
loss model. The default FEMA P58 setting would assume all variables are inde-
pendent, except for the fragility data, where the fragility of certain Component
Subgroups (i.e. groups of components with identical behavior within Performance
Groups) is perfectly correlated. This behavior is achieved by setting every other
dependency to Independent and setting the Component Fragilities to per ATC
recommendation.
Every type of prescribed dependency assumes perfect correlation between a cer-
tain subset of the loss model’s variables and no correlation between the others.
Future versions will expand on this approach by introducing more complex cor-
relation structures.
The user can assign perfect correlation between the following logical components
of the model:
– Fragility Groups: Assumes that the selected parameters are correlated be-
tween Fragility Groups (i.e. the highest organizational level) and at every
level below. That is, with this setting, the users assigns perfect correlation
between every single parameter of the selected type in the model. Use this
with caution.
– Performance Groups: Assumes that the selected parameters are correlated be-
tween all Performance Groups and at every logical level below. For instance,
this setting for Component Quantities will lead to identical deviations from
mean quantities among the floors and directions in the building.
– Floors: Assumes that the selected parameters are correlated between Per-
formance Groups at various floors, but not between Performance Groups in
different directions in the building. Also assumes perfect correlation between
the Damage States within each Performance Group. This is useful when the
parameter is direction-dependent and similar deviations are expected among
all floors in the same direction.
– Directions: Assumes that the selected parameters are correlated between
Performance Groups in various (typically two) directions, but not between
different floors of the building. This can be useful when you want to prescribe
similar deviations from mean values within each floor, but want to allow
independent behavior over the height of the building.
– Damage States: Correlation at the lowest organizational level. Assumes that
the selected parameters are correlated between Damage States only. This
type of correlation, for instance, would assume that deviation from the median
reconstruction cost is due to factors that affect all types of damage within a
performance group in identical fashion.
The following model parameters can handle the assigned dependencies:
– Component Quantities: The amount of components in the building (see the
description of the Components tab below for more details).
– Component Fragilities: Each Damage State has a corresponding random EDP
limit. The component fragilities is a collection of such EDP limit variables.
Note: most methodologies assume that such EDP limits are perfectly corre-
lated at least among the Damage States within a Component Subgroup.
– Reconstruction Costs and Times: The cost and time it takes to repair a par-
ticular type of damage to a component. The btw. Rec. Cost and Time
checkbox allows you to define correlation between reconstruction cost and
time on top of the correlations already set above for each of these individu-
ally.
Note: if you do define such a correlation structure, the more general corre-
lation among the settings in the Reconstruction Costs and Reconstruction
Times lines will need to be applied to both cases to respect conditional corre-
lations in the system. (e.g., if you set costs to be correlated between Perfor-
mance Groups and times to correlate between Floors and check the cost and
time correlation as well, times will be forced to become correlated between
Performance Groups.)
– Injuries: The probability of being injured at a given severity when being
in the affected area of a damaged component. Note that the Injuries lines
prescribe correlations between the same level of injury at different places in
the building. Correlation between different levels of injury at the same place
can be prescribed by the btw. Injuries and Fatalities checkbox.
– Red Tag Probabilities: The amount of damage in a given Damage State that
triggers an unsafe placard or red tag.
• Custom Data Sources
The loss assessment is performed using population and fragility data from the
first edition of FEMA P58. Each data source can be overridden by custom user-
defined data.
Note: the loss calculations are performed at the local computer. Consequently,
the locally available fragility and population data files can be used to perform the
calculations even if the response simulations are done at DesignSafe.
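
As noted under Yield Drift Ratio above, residual drifts can be estimated from peak interstory drifts. The following sketch shows one reading of the piecewise relation in Section 5.4 of FEMA P58; the backend damage and loss engine performs the production calculation and may differ in details:

def estimate_residual_drift(peak_drift, yield_drift):
    # Piecewise relation between peak transient drift and residual drift,
    # as we read Section 5.4 of FEMA P58 (illustrative only).
    if peak_drift <= yield_drift:
        return 0.0
    if peak_drift < 4.0 * yield_drift:
        return 0.3 * (peak_drift - yield_drift)
    return peak_drift - 3.0 * yield_drift

# Example: a 1% yield drift ratio and a 2.5% peak interstory drift
print(estimate_residual_drift(0.025, 0.01))   # -> 0.0045, i.e. 0.45% residual drift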

3.6.1.2 Building Components

Figure 3.27 shows the input panel where you can define the components of the building.
The following pieces of information are required for each component:

Figure 3.27: The Building Components panel. (The settings shown in the Figure serve
demonstration purposes and are not the suggested inputs.)

• name: A name that helps you identify the component. It is arbitrary and not
used by the loss assessment engine.
• ID: The ID of the component is the ID used in FEMA P58 or in PACT and this
shall be the same as the name of the json file that contains the component data.
The first custom component in Figure 3.27, for example, has a unique ID and the
corresponding file has been renamed accordingly.
• quantity: A list of component quantities on each floor of the building. The
quantities shall be specified in the units assigned to the component in the fragility
data file. These are the units assigned in FEMA P58 by default.
• cov: Coefficient of variation for the random distribution used to consider the
uncertainty in component quantities.

• distribution: The type of random distribution used to consider the uncertainty in
component quantities.
• unit: The unit assigned to the component in the fragility data file.
• directions: Components within a Fragility Group are separated into Performance
Groups by floor and direction. Components within a Performance Group are
further separated into Component Subgroups that might experience independent
damage and losses depending on the settings in the General tab. The list of
directions provided here specifies the number of Component Subgroups in each
direction. The specified pattern is applied to all floors of the building.
Note: the number of floors is defined by the number of elements in the list of
quantities.
• weights: These weights prescribe the proportion of component quantity in each
floor that shall be assigned to each Component Subgroup. Consequently, these
weights for each component shall sum to one and the number of weights shall be
equal to the number of directions provided (see the sketch after this list).
• structural: This checkbox specifies if the component is a structural or a non-
structural one.
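
The sketch below illustrates how the quantity, directions, and weights entries of a single (hypothetical) component row expand into Component Subgroup quantities on each floor; it is a simplified illustration, not the loss engine's implementation:

# Hypothetical component row: two floors, three subgroups acting in direction 1
quantity   = [20.0, 20.0]      # one entry per floor; the list length sets the floor count
directions = [1, 1, 1]         # three Component Subgroups, all in direction 1
weights    = [0.5, 0.3, 0.2]   # proportion of the floor quantity in each subgroup

assert len(weights) == len(directions), "one weight per direction entry"
assert abs(sum(weights) - 1.0) < 1e-9, "weights shall sum to one"

for floor, floor_quantity in enumerate(quantity, start=1):
    for subgroup, (d, w) in enumerate(zip(directions, weights), start=1):
        print(f"floor {floor}, direction {d}, subgroup {subgroup}: "
              f"quantity = {floor_quantity * w:.1f}")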

3.6.1.3 Collapse Modes

Figure 3.28 shows the input panel where you can specify the collapse modes of the build-
ing. Collapse modes provide information for the estimation of injuries from building
collapse (a sketch of this estimation is shown after the list below). As such, they are
only used if injuries are among the requested Decision Variables. The following pieces
of information are required for each collapse mode:

Figure 3.28: The Collapse Modes panel. (The settings shown in the Figure serve demon-
stration purposes and are not the suggested inputs.)

• name: A name that helps you identify the collapse mode. It is arbitrary and not
used by the loss assessment engine.
• probability: Conditioned on collapse, the likelihood of this collapse mode.

• affected area: The affected area (as a fraction of the total plan area) of the building
at each floor. We assume that the floor area is uniform along the height of the
building.
• injuries: The probability of each level of injury when people are in the affected
area and this collapse mode occurs. (FEMA P58 assumes two levels of severity:
injuries and fatalities).
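
The sketch below illustrates, for a single collapsed realization, the kind of calculation these inputs feed: a collapse mode is drawn according to its conditional probability and casualties are estimated from the affected areas and the population present. All numbers are made up, and the real loss engine treats these quantities as uncertain:

import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical collapse modes; values are illustrative only
collapse_modes = [
    {"name": "complete", "probability": 0.3,
     "affected_area": [1.0, 1.0], "injury_prob": [0.1, 0.9]},   # [injured, fatality]
    {"name": "partial",  "probability": 0.7,
     "affected_area": [0.3, 0.0], "injury_prob": [0.5, 0.1]},
]
population = [20.0, 20.0]   # people on each floor at the time of the event

# In a realization where collapse occurred, draw one mode and count casualties
idx = rng.choice(len(collapse_modes), p=[m["probability"] for m in collapse_modes])
mode = collapse_modes[idx]
for severity, label in enumerate(("injuries", "fatalities")):
    count = sum(people * area * mode["injury_prob"][severity]
                for people, area in zip(population, mode["affected_area"]))
    print(f"{mode['name']} collapse, {label}: {count:.1f}")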

3.6.2 HAZUS MH
This option implements the loss assessment methodology described in the HAZUS MH
Technical Manual document. Figure 3.29 shows the input panel that allows the user
to set the following parameters of the loss assessment:

Figure 3.29: The General Damage and Loss Settings panel. (The settings shown in the
Figure serve demonstration purposes and are not the suggested inputs.)

• Uncertainty Quantification
– Realizations: The number of realizations to generate using the stochastic
loss model. Depending on the complexity of the model, a few thousand

realizations might be sufficient to capture central tendencies. A much larger
number is required to get appropriate estimates of the dispersion of results.
– Additional uncertainty: Ground motion and modeling uncertainty per FEMA
P58 that is referred to as uncertainty in response due to variability of ground
motion demand and variability in the capacity properties of the model build-
ing in HAZUS MH. The prescribed logarithmic standard deviation values are
added to the dispersion of EDPs to arrive at the description of uncertain
building response.
• Decision Variables
These checkboxes allow the user to pick the decision variables of interest and save
computation time and storage space by only focusing on those. Expect this list
to grow as more decision variables are added in the future.
• Inhabitants
– Occupancy type: The type of occupancy is used to describe the temporal
distribution of the inhabitants. Note: the default HAZUS MH distribution
can be overridden by a custom file provided in the Custom Data Sources box.
– Peak Population: The maximum number of people present at each floor of
the building. The example in Figure 3.26 shows a two-story wooden house
with a cripple wall, hence the 0 population in the first floor.
• Building Response
– Detection Limits: These limits correspond to the maximum possible values
that the response history analysis can provide. While peak interstory drifts
will certainly have an upper limit, peak floor acceleration will not necessarily
require such a setting. Leaving any of the fields empty corresponds to unlim-
ited confidence in the simulation results.
Note: these limits will be used to consider EDP data as a set of censored
samples when finding the multivariate distribution that fits the simulation
results.
• Building Damage and Loss
– Replacement Cost and Time: The cost (in the currency used to describe
repair costs, typically US dollars) and the time (in days) it takes to replace
the building.
– Design Information: The Structure Type and the Design Level per HAZUS
MH. These two pieces of information are used to select the appropriate
fragility and consequence functions from those provided in the HAZUS MH
Technical Manual.
Note: Any fragility or consequences function can be edited by the user and
loaded by specifying a directory that contains those custom functions in the
Custom Data Sources box.

• Custom Data Sources


The loss assessment is performed using population and fragility data from the
HAZUS MH Technical Manual. Each data source can be overridden by custom
user-defined data.
Note: the loss calculations are performed at the local computer. Consequently,
the locally available fragility and population data files can be used to perform the
calculations even if the response simulations are done at DesignSafe.

3.7 RES: Results


After the user hits the Run button, the simulation of building response and the loss
assessment is performed automatically in the background. Assuming that the calcu-
lations were successful, the PBE application switches to the RES panel to present the
results. A successful run or download of a job that ran successfully will result in two
tabbed widgets being displayed in this panel.

The first panel (Figure 3.30) shows summary statistics: mean, standard deviation, and
several important percentiles of the Decision Variables.

Figure 3.30: Results Summary

The second panel presents results for each realization in a plot and in tabular format.
By selecting various columns with the left and right mouse buttons in the table below
the graphic, the information in the plot is updated (Figure 3.31). The plot is controlled
as follows:

• Selection with the left mouse button identifies the variable on the Y axis.
• Selection with the right mouse button identifies the variable on the X axis.
• If the same column is selected with both mouse buttons, then the distribution
of the selected Decision Variable is plotted. A left click on the column triggers
a cumulative distribution function plot, while a right click triggers a probability
density function plot.

The columns in the table typically identify the Decision Variable category (e.g. re-
construction, injuries, etc.) and a variable within that category preceded by a for-
ward slash (e.g., injuries/fatalities). Boolean variables (e.g., collapsed?, red tagged?)
are used to describe the occurrence of events. The reconstruction/time impractical?
and cost impractical? variables identify realizations where reconstruction time or cost
would exceed the replacement time or cost, respectively. In such cases, replacement is
assumed instead of reconstruction. The minimum (i.e., based on parallel work) recon-
struction time is used when making this decision.

Besides the results displayed in the application, advanced users can find detailed in-
formation about EDPs, damage, and decision variables in csv files in the Working
directory in the tmp.SimCenter/templatedir folder.

3.8 Push Buttons


There are a number of buttons in the Push Button area of Figure 3.1. This section
describes the usage of these buttons.

3.8.1 RUN
This button allows the user to run the simulation on the local machine. When the
button is pressed a window, as shown in Figure 3.32, will pop up informing the user
that the UQ engine has started running. When completed, this window will disappear
and the RES panel will be selected.
Note: There are two input fields specified in the application preferences that affect
the running of the UQ engine:

Figure 3.31: Results presented graphically and in tabular form

Figure 3.32: Pop-up window shown after clicking the RUN button

• Local Jobs Dir: specifies where the PBE application shall create a tmp.SimCenter
directory for temporary files that are used to perform the simulation. This di-
rectory is created after the Submit button is pressed. As discussed in Chap-
ter 10, when the application creates this directory it copies the files needed to
it (e.g., if you are using an OpenSees input script, it will copy that script to the
tmp.SimCenter directory). ALL FILES IN THE SCRIPT DIRECTORY AND
ALL FILES IN SUBDIRECTORIES OF THAT DIRECTORY GET COPIED, SO
DON'T PLACE THE OPENSEES SCRIPT IN HOME, DOWNLOADS, DOCU-
MENTS, etc.
• Local Application Dir: The PBE application searches for the workflow applica-
tions in this directory. Only edit its location if you are introducing your own
applications or you want to build and modify the applications provided with the
tool.

3.8.2 RUN at DesignSafe


The purpose of this option is to allow the user to run the job on a high performance
computer (HPC) at DesignSafe. After clicking on the button, the window shown in
Figure 3.33 pops up. There are several input fields and a Submit button in the window.

Figure 3.33: Pop-up window shown after clicking the RUN at DesignSafe button

• Job Name: The name the user can use to identify the job in Get from DesignSafe.
• Num Nodes: The number of compute nodes to use on Stampede2. Using the
default App Name the job will run on Stampede2's Knights Landing (KNL) compute
nodes. Each node has 68 cores. The actual number of cores the application will
use on each of these nodes depends on the total number of processes specified.
As per the TACC webpage, for MPI tasks it’s best not to specify more than 64-
68 processes to run. Depending on the numerical computations and amount of
memory each uses, for large simulations you may wish to use more nodes and fewer
processes to avoid page faulting.
• Total Number of Processes: Total number of MPI parallel processes the UQ engine
is going to use.
• Max Wall Time: Use HOURS:MIN:SEC format and be conservative. Your job is
killed after the time limit is reached. On Stampede2 you have a max wall time of
24 hours.

Finally, when the inputs are complete, pressing the Submit button will cause the UI
to package the input information and send it to DesignSafe. At DesignSafe the inputs
will be stored and a job request will be submitted. If the job request is submitted
successfully to DesignSafe, the pop-up window will disappear and a job successfully
started message will appear in the message area. Do not press the Submit button
multiple times; monitor the message area for progress. If the process appears stalled,
it may be due to a large number of requests being processed by DesignSafe. If this is
the case, it is sometimes best to close the popup, save the file, and try again later.
NOTE: Similar to running locally, there are additional options available in the prefer-
ences section that advanced users can modify that affect the running of the tool.

• Remote Jobs Directory: specifies where the PBE application shall create a tmp.SimCenter
directory for temporary files that are used to perform the simulation. This di-
rectory is created after the Submit button is pressed. As discussed in Chap-
ter 10, when the application creates this directory it copies the files needed to
it (e.g., if you are using an OpenSees input script, it will copy that script to the
tmp.SimCenter directory). ALL FILES IN THE SCRIPT DIRECTORY AND
ALL FILES IN SUBDIRECTORIES OF THAT DIRECTORY GET COPIED, SO
DON'T PLACE THE OPENSEES SCRIPT IN HOME, DOWNLOADS, DOC-
UMENTS, etc. The Working Directory is removed after the job has been
submitted successfully.
• Local Applications Directory: The PBE application searches for the workflow ap-
plications in this directory. Only edit its location if you are introducing your own
applications or you want to build and modify the applications provided with the
tool.
• Remote Applications Directory: Remote directory on Stampede2 where applica-
tions needed by the workflow reside. Only modify if you have built the applications
on the supercomputer, currently Stampede2.
• App Name: Name of Agave app to run. Only modify if you have created your own
Agave app.

3.8.3 GET from DesignSafe


Allows you to obtain your list of jobs from DesignSafe and select from that list a job to
update status of, download, or delete. To select a job, place the cursor over the job line
and press the left mouse button. A number of options will appear:

a) Update Job Status. A job submitted to run at DesignSafe goes through a number
of states. Only when the state is 'FINISHED' is it completed and ready for you to
download. By selecting this option, the job status will be retrieved from DesignSafe
and the table row updated.
b) Retrieve Data. Select this option to download a previously run job. It will down-
load the form data and the results. The application will not download input or job
created files. These can be downloaded through your browser from DesignSafe.
This option is only valid when the state of the job is ’FINISHED’.

c) Delete Job. This will delete the job from the list. It does not delete any data files
that were created as part of the job submission and running.
d) Delete Job and Data. This will delete both the job and files at DesignSafe.

3.8.4 Exit
Click this button to exit the application.

4 Theory and Implementation
This chapter describes how the backend of the PBE application works. If you intend to
use the application a lot or extend it, it is important that you read this section.

Some Definitions before we start:

• Workflow: A sequence of steps involved in moving from a beginning state to an


ending state.
• Scientific Workflow Application: An application that automates a workflow pro-
cess through software, with each step in the workflow being performed by a sep-
arate “scientific” software application.
• Scientific Workflow System: Software providing an infrastructure for the set-up,
scheduling, running, and monitoring of a user-defined scientific workflow applica-
tion.

4.1 The basic workflow


The PBE App is a limited scientific workflow system that allows users to create scientific
workflow applications needed for the performance assessment of a building subjected
to earthquake ground motions. It allows the users to run the workflow application
using custom, user-defined data.

The PBE App is composed of two parts:

• Frontend User Interface (UI): This is the application the user interacts with to
define the workflow and its inputs (i.e., which software to use and what data to
use for the various software). The UI is introduced in Chapter 3. Its purpose, as
shown in Figure 4.1 below, is to create the BIM and start the workflow. Currently,
the inputs for the workflow are stored in the BIM file to reduce file overhead.
• Backend Application: This is the application that actually creates and runs the
workflow. It consists of a script that processes the output file from the UI to
determine which applications to run and with which input data. It invokes
these applications using the outputs from one application as the input to an-
other. The workflow is controlled by a Python script, PBE.py, that is stored
in the applications/Workflow/ directory. The input and output from each
application is in the form of JavaScript Object Notation (JSON) files. JSON
is a human-readable file format used widely for passing data between front-end
browser applications (Safari, Firefox, Internet Explorer) and backend servers.

Figure 4.1: Schematic of the basic Scientific Workflow for Damage and Loss Assessment

The software applications invoked by the Backend Application are categorized into
certain types according to their purpose (a sketch of how these steps might be chained
is shown after the list):

a) createEVENT: Given the structure and the user input for hazard application,
define the loads for the building (i.e., the ground motion records that represent
an earthquake scenario). The output file is an EVENT file.
b) createSAM: Given the building description and event, create a response simulation
model (currently, a finite element model) of the building. The output file is a SAM
(Structural Analysis Model) file.
c) createEDP: Given the building, determine what response quantities are required.
The output file is the EDP (Engineering Demand Parameters) file. Currently, the
user has no influence on the EDPs as the StandardEarthquakeEDP application is
the built-in default application.
d) performSIMULATION: Given the response simulation model and the event, per-
form a nonlinear response history simulation. The responsibility of performSIM-
ULATION is to fill in the values in the EDP files.
e) performDL: Given the building response in the form of EDPs, perform a damage
and loss assessment and provide the DVs (Decision Variables) that describe the
performance of the building.
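
The sketch referenced above shows how these steps hand information to one another. It is a purely illustrative, in-memory stand-in: the real driver (PBE.py) launches separate applications that exchange the BIM, EVENT, SAM, EDP, and DL information through JSON files on disk, so every name and value below is hypothetical:

def create_event(bim):
    return {"type": "Seismic", "dT": 0.01, "accel": [0.0, 0.1, -0.2]}   # EVENT

def create_sam(bim, event):
    return {"nodes": 2, "elements": 1}                                   # SAM

def create_edp(bim, event):
    return {"peak_drift": None, "peak_acceleration": None}               # empty EDP

def perform_simulation(sam, event, edp):
    edp.update(peak_drift=0.012, peak_acceleration=0.35)                 # filled-in EDP
    return edp

def perform_dl(edp):
    return {"repair_cost": 120000.0, "red_tag": False}                   # decision variables

bim = {"stories": 2, "planArea": 1500.0}
event = create_event(bim)
sam = create_sam(bim, event)
edp = perform_simulation(sam, event, create_edp(bim, event))
print(perform_dl(edp))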

4.2 The Workflow with Uncertainty Quantification


The need to characterize the uncertainties in the computed response complicates this
workflow. All of the software above can have uncertainties in their inputs and models
(e.g., building geometry or material properties in the building input file; hazard descrip-
tion (magnitude, distance) and corresponding ground motion records in createEVENT;
finite element model parameters in createSAM; integration scheme, damping ratio or
convergence tolerance in performSIMULATION, and component damage and conse-
quence characteristics in performDL) and we want to allow users of the PBE App to
take that into consideration. Figure 4.2 provides an overview of the resulting workflow
with uncertainty quantification.

Figure 4.2: Schematic of the Scientific Workflow Application for Damage and Loss Assess-
ment with UQ

Each application is called with two different sets of input arguments. The first time
every application is called with a getRV input argument. This tells them to return
information about the random variables inside a randomVariables entry in pFiles
(i.e., output files with a p prefix) generated by the application along with other needed
data (e.g., the event type). The randomVariables is a JSON array of random variables,
each with a field for a name, a type, a value, and other type-dependent info. A sample
array is shown below with three variables:

{
    "randomVariables": [
        {
            "distribution": "Normal",
            "mean": 6,
            "name": "fc",
            "stdDev": 0.6,
            "value": "RV.fc",
            "variableClass": "Uncertain"
        },
        {
            "distribution": "Normal",
            "mean": 60,
            "name": "fy",
            "stdDev": 6,
            "value": "RV.fy",
            "variableClass": "Uncertain"
        },
        {
            "distribution": "Normal",
            "mean": 30000,
            "name": "E",
            "stdDev": 3000,
            "value": "RV.E",
            "variableClass": "Uncertain"
        }
    ]
}

After preparing information about all random variables, the control of the workflow is
passed to the UQ engine. It generates the prescribed number of samples of the ran-
dom variables, then runs the workflow through the response simulation and collects the
EDPs for each input sample. The UQ engine searches for value fields in the input files
with RV.variableName values and replaces them with the realizations of the random
variables.
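
A minimal sketch of this substitution step is shown below; the pFile fragment and variable names are hypothetical, and in the actual workflow the UQ engine performs the equivalent operation on the real input files:

import json

def substitute_rvs(node, realization):
    # Recursively replace string values of the form "RV.<name>" with sampled values
    if isinstance(node, dict):
        return {key: substitute_rvs(value, realization) for key, value in node.items()}
    if isinstance(node, list):
        return [substitute_rvs(value, realization) for value in node]
    if isinstance(node, str) and node.startswith("RV."):
        return realization[node[3:]]     # strip the "RV." prefix and look up the sample
    return node

pfile = {"materials": [{"name": "concrete", "fc": "RV.fc"},
                       {"name": "steel", "fy": "RV.fy", "E": "RV.E"}]}
sample = {"fc": 6.2, "fy": 61.5, "E": 29500.0}

print(json.dumps(substitute_rvs(pfile, sample), indent=2))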

The performUQ application is actually a script that calls three applications, as shown
below:

a) PreProcessUQ: Parses all pFiles to build the list of random variables.


b) PerformUQ: Invokes the UQ engine, which fills in the random variable values, and
runs the applications in the workflow with the updated input files.
c) PostProcessUQ: Combines the output results, collecting the EDPs.

The computationally expensive part of the workflow is the response simulation done by
performSIMULATION within performUQ. As discussed earlier, the user has the option
of running the performUQ part locally or remotely at DesignSafe. When the user selects
to run the job remotely, it is only the performUQ part that runs at DesignSafe. The
pFiles are still prepared locally. These files are placed in a directory (along with
all other needed files) and transferred to DesignSafe before an Agave application is
invoked to run the performUQ part of the workflow on Stampede2.
The performDL part of the workflow runs separately because its stochastic loss model
can be decoupled from the other parts of the workflow when analyzing single build-
ings. Decoupling performDL allows us to take advantage of the smaller computational
burden of this part of the workflow and use a considerably larger number of samples
in damage and loss calculation (i.e., 10,000 or more samples versus the typical 100
samples in response estimation).

5 Source Code
The source code for the tool is released under the 2-clause BSD License, commonly
called the FreeBSD license. It is available for download from the tool's GitHub
repository.

6 User Training
User Training consists of an online video available from the tool webpage that demon-
strates tool use. The tool will be presented in user workshops hosted by the SimCenter.

7 Examples
This chapter provides examples that demonstrate the functionality of the PBE applica-
tion.

7.1 One-story two-dimensional portal frame


In this example, the performance of a simple 2D portal frame model is evaluated. This
is an extension of the example in the EE-UQ App.

The model is a linear elastic single-bay single-story model of a reinforced concrete por-
tal frame (Figure 7.1). The analysis of this model considers both gravity loading and
lateral earthquake loading due to El Centro earthquake (Borrego Mountain 04/09/68
0230, El Centro ARRAY #9, 270). The original model and ground motion used in this
example were obtained from example 1b in the OpenSees website, and were modified
to scale the ground motion record from gravity units (g) to the model units (in/sec2).
Files for this example are included with the release of the software and are available in
the Examples/01 PortalFrame2D folder. You can load the settings shown below using
Portal2D input file.json file. This populates all input data automatically, except
for the paths to external files. These will need to be updated as shown below.

The GI and SIM setup is shown in Figure 7.2 and Figure 7.3. Most of the GI data are
arbitrary for this example, except for the plan area that is used to estimate injuries
later. Note: the plan area is defined in square inches following the length units speci-
fied in the GI.

The SIM input panel requires additional input because you need to specify the location
of the Portal2D-UQ.tcl script that describes the simulation model. The file is located
in the 01 PortalFrame2D folder. Use the Choose button to select the file.

The EVT input panel in Figure 7.4 shows that we are going to use a pre-defined JSON
file to specify a ground motion record. You need to use the Choose button here to
update the file location – the BM68elc.json file is available in the 01 PortalFrame2D
folder.

Figure 7.1: Two-dimensional portal frame model subjected to gravity and earthquake loading

Figure 7.2: General Information about the building.

Figure 7.3: Information about the simulation model.

Figure 7.4: Information about the ground motion event.

The FEM panel contains the default settings and there is no need to change those.
The UQ panel allows you to specify the number of samples (we use 10 by default) and
the distribution of random variables. To introduce uncertainty in the model, the mass
and the Young’s modulus are assumed to be normally distributed random variables
with means and standard deviation values shown in Figure 7.5.

Figure 7.5: Uncertainty quantification settings.

A simple damage and loss model is prepared for this example (Figure 7.6). We request
10,000 realizations with moderate levels of additional uncertainty. The building is as-
sumed to be a retail area with customers inside. Hence, we specify 40 people as the
peak population. The replacement cost is set to $300,000 and the replacement time
is almost a year. No dependencies are set for this example and the components are
chosen from the default FEMA P58 database.

Figure 7.6: General settings for the Damage and Loss model.

The list of building components is shown in Figure 7.7. Note: the quantity fields
contain only one number because the building has only one story. Multi-story buildings
would need to have their component quantities specified on every floor in this panel.
The units of each quantity are available in the JSON files in the applications/performDL/resources/
P58 first edition/DL json folder. Future versions of the PBE App will load the
information about units automatically. The data under directions not only defines the
direction of each group of components, but also specifies the number of component
groups in a performance group. Note that the storefront, for example, is assumed to
be only on one side, hence the identical directions, but it is separated into three groups
of components. The quantity corresponding to each group of components is identified
by the weights. In the storefront example the weights show an uneven distribution of
quantity among the three component groups.

The last tab in the DL panel identifies the collapse modes of the structure (Figure 7.8).
We defined two collapse modes for this example: a complete one that results mostly in
fatalities and a partial one that leads to injuries in a smaller affected area and leaves
most people unharmed.

Figure 7.7: Component information.

Figure 7.8: Collapse modes and their consequences.

At this point the PBE application has all the information required to create the work-
flow and run the performance assessment for the building. You can start running the
calculation by clicking on the RUN button. This shows a dialog window requesting two
pieces of information (Figure 7.9). The location of the working directory is arbitrary,
but the application directory has to be the location of the application folder. This
should be the place where you have the PBE executable file. After providing the data,
click Submit to run the analysis. The runtime should not be more than 20 seconds.

If something goes wrong, go to the Working Dir Location and find the workflow-log
file in the tmp.SimCenter/templatedir folder. Contact us, and send us the file.

Figure 7.9: Requested locations before running the analysis.

If all goes well, the RES panel is automatically loaded and shows the distribution of
reconstruction time among the 10,000 realizations (Figure 7.10).

By clicking at different columns with the left and right mouse buttons, you can visualize
the relationship between decision variables, or the cumulative distribution function, or
the probability density function of individual decision variables. The following figures
provide a few examples.

Figure 7.10: The RES panel after running the analysis.

Figure 7.11: The joint distribution of reconstruction time (assuming parallel work) and
reconstruction cost.

Figure 7.12: The cumulative distribution function of reconstruction time.

Figure 7.13: A histogram showing the marginal probability density function of reconstruction
time.

8 Verification and Validation
This chapter collects carefully designed verification exercises that we use to test the
functionality of the PBE application. Verification of the structural response simulation
part of the workflow is presented in the EE-UQ user manual.

8.1 Estimation of central tendencies


This example was designed to verify that the central tendencies estimated by the PBE
App are appropriate. The dispersion of every variable is reduced to a sufficiently low
value (i.e., a 10−4 coefficient of variation is used) that makes it simple to obtain the
decision variables analytically. Verification is performed by comparing analytical solu-
tions to the estimated values. Note: this verification example is identical to the first
system test of the pelicun Python library that the PBE App uses for loss estimation.

The files required to run the verification are available in the Verification/01 Central
tendency folder. Most of the input information can be loaded automatically from the
test central tendency.json file. After loading the file, the following locations need
to be set manually:

• Input Script under SIM: location of SDOF.tcl


• File under EVT/Multiple Existing: location of Rinaldi.json
• Analysis Script under FEM: location of test analysis script.tcl
• Component damage and loss under DL/General/Custom Data Source: location
of the folder that contains T0001.001.xml
• Population distribution under DL/General/Custom Data Source: location of
population test.json

All of the required files shall be available in the Verification/01 Central tendency
folder. After specifying the above locations you should be able to run the performance
assessment. The following comparisons are performed to verify the calculation:

a) The cumulative distribution function (CDF) of event time/month shall show 12


steps that correspond to a uniform distribution of 12 discrete month values (Fig-
ure 8.1).

Figure 8.1: Cumulative distribution function of event month.

b) The probability density function (PDF) of event time/weekday? shall show two
bars with 0.286 / 0.714 likelihoods that correspond to the probability of a day
being a weekend or a weekday, respectively (Figure 8.2).

Figure 8.2: Distribution of weekday/weekend realizations.

c) The CDF of event time/hour shall show 24 steps that correspond to a uniform
distribution of 24 discrete hour values starting at 0 with a maximum of 23 (Fig-
ure 8.3).

Figure 8.3: Cumulative distribution function of event hour.

d) The CDF of inhabitants shall follow the custom temporal distribution prescribed
in population test.json. That distribution assigns 0, 25%, 50% or 100% of the
peak population to the building. The peak population is 10 people. Hence, the
CDF of inhabitants shall show steps at 0, 2.5, 5.0 and 10.0. The step heights can
be determined from the month, weekday and hour distributions and the temporal
change in the number of inhabitants. The following step heights are expected:
14/28 ; 2/28; 7/28; 5/28 (Figure 8.4)

Figure 8.4: Cumulative distribution function of inhabitants.

e) The ground motion record is scaled so that it results in an EDP that is identical
to the collapse limit. The collapse limit is 0.5 g, which is 4.903325 m/s2. Due
to the stochastic nature (and the simplicity) of the model, half of the realizations
will have a slightly lower EDP value than the collapse limit, while the other half
shall have a higher one. This shall lead to a 50% probability of collapse in the
results as shown in Figure 8.5.

Figure 8.5: Distribution of collapse/non-collapse realizations.

f) Only one collapse mode is used in this test. The PDF of the collapse mode results
shall show that the collapse mode is evaluated in only 50% of the realizations
because the others did not lead to collapse (Figure 8.6).

Figure 8.6: Distribution of collapse modes.

g) There is only one type of component in the building and the corresponding Per-
formance Group is not divided further into multiple Component Subgroups (note
that there is only one number provided under directions in the DL/Components
tab). Consequently, the components within the Performance Group are assumed
to have identical and perfectly correlated behavior. When the building is dam-
aged, all the components will be damaged which will trigger a red tag regardless
of the actual limit assigned to the red tags. Only the second and third damage
states trigger red tags. The probability of exceeding damage state (DS) 2 given
the quasi-deterministic EDP value is 50%. Hence, the probability of a red tag
triggered shall be 25% (the probability of non-collapse and exceeding DS2). Note
that the CDF in Figure 8.7 shows that red tag information is only available in 50%
of the cases because the other realizations led to collapse.

Figure 8.7: Cumulative distribution function of realizations that resulted in a red tag.

h) Irrepairability and impractical repairs due to excessive cost or time are not ex-
amined by this test in detail. Similarly to the collapse modes, it can be verified
that only 50% of the realizations provide a value for these decision variables and
the provided value is 0 in all cases.
i) The fragility data is specified so that reconstruction costs are one hundred times
the reconstruction times. Because there is only one component, there should
be no difference between reconstruction times with parallel and sequential work
assumptions. This can be verified by plotting the joint distribution of those two
variables (Figure 8.8) and the joint distribution of reconstruction cost and time
(Figure 8.9).

Figure 8.8: Joint distribution of reconstruction times with parallel and sequential repair
assumptions.

Figure 8.9: Joint distribution of reconstruction cost and time.

j) The distribution of reconstruction/time-parallel (and the other two, perfectly cor-
related decision variables) is more difficult to determine analytically than the pre-
vious results. The three damage states of the single component in the building
have 2.5, 25 and 250 days of repair consequence. The replacement time of the
building is 300 days. Therefore, considering the non-zero likelihood of no dam-
age, the steps in the CDF of repair time shall be at the following values: 0, 2.5,
25, 250, and 300 days. The last step corresponds to collapse and has a 50%
probability of occurrence. The other steps correspond to the probability of oc-
currence of each damage state from DS0-3, where DS0 is no damage. The fragility
of the component is designed to have the following DS exceedance probabilities
at the quasi-deterministic EDP value: 0.8413, 0.5000, 0.1586 for DS1-3, respec-
tively. These correspond to the non-collapsed realizations. Consequently, the
total probability of being in each damage state will be half of those values given
50% probability of collapse (Figure 8.10). A sketch of this calculation is provided
after this list.

Figure 8.10: Cumulative distribution function of reconstruction time.

k) Injuries and fatalities either stem from collapse of the building or from exceeding
DS2 in the component. Component damage leads to a significantly smaller num-
ber of injuries than building collapse. Considering the temporal distribution of
the population, the exceedance probabilities of DS2 and DS3, the probability of
building collapse and the affected area by collapse and component damage, the
following steps are expected on the CDF of injuries (step size shown in parenthe-
sis) (Figure 8.11):
0 (35/56), 0.075 (1/56), 0.15 (3.5/56), 0.25 (2/56), 0.3 (2.5/56), 0.5 (7/56), 1.0
(5/56)

Figure 8.11: Cumulative distribution function of injuries.

l) A similar calculation provides the following steps for the CDF of fatalities (Fig-
ure 8.12):
0 (35/56), 0.025 (1/56), 0.05 (3.5/56), 0.1 (2.5/56), 2.25 (2/56), 4.5 (7/56), 9.0
(5/56)

Figure 8.12: Cumulative distribution function of fatalities.
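
The step heights described in item j) can be reproduced with a few lines of arithmetic, under one consistent reading of the exceedance probabilities and the 50% collapse probability stated above:

p_collapse = 0.5
p_exceed = [0.8413, 0.5000, 0.1586]            # P(DS >= 1), P(DS >= 2), P(DS >= 3)
repair_time = {0: 0.0, 1: 2.5, 2: 25.0, 3: 250.0}
replacement_time = 300.0

# probability of being in DS0..DS3 exactly, given that the building did not collapse
p_in_ds = [1.0 - p_exceed[0],
           p_exceed[0] - p_exceed[1],
           p_exceed[1] - p_exceed[2],
           p_exceed[2]]

steps = {repair_time[ds]: (1.0 - p_collapse) * p for ds, p in enumerate(p_in_ds)}
steps[replacement_time] = p_collapse           # collapsed realizations are replaced

for t, p in sorted(steps.items()):
    print(f"step at {t:6.1f} days, height {p:.4f}")
print("total =", round(sum(steps.values()), 4))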

9 Requirements
This chapter outlines the general features of the PBE application. We show when the
features were introduced and which features you can expect to see in the future, and
when. This provides a roadmap of where this application has come from and where
it is headed. The future features are highly dependent on user feedback. You are
highly encouraged to contact us to discuss any new features you would like to see in
the application.

Table 9.1: Requirements for PBE

# Description Source Priority Version


P1 Ability to determine damage and loss calculations GC M 1.0
for a building subjected to a natural hazard includ-
ing formal treatment of randomness and uncertainty
uncertainty
P1.1 Ability to determine damage and loss for multiple different GC M
hazards
P1.1.1 Damage and Loss for ground shaking due to Earthquake GC M 1.0
P1.1.2 Damage and Loss due to Wind Loading GC M
P1.1.3 Damage and Loss due to water damage due to Tsunami or GC M
Coastal Inundation
P1.1 Ability of Practicing Engineers to use multiple coupled re- GC 1.0
sources (applications, databases, viz tools) in engineering
practice
P1.2 Ability to utilize resources beyond the desktop including GC M 1.0
HPC
P1.3 Tool should incorporate data from WWW GC M 1.0
P1.4 Tool available for download from web GC M 1.0
P1.5 Ability to use new viz tools for viewing large datasets gen- GC M 1.0
erated by PBE
P2 Various Motion Selection Options for Hazard Event SP M 1.0
P2.1 Various Earthquake Events SP M 1.0
Ability to select from Multiple input motions and view UQ GC M 1.0
P2.1.1 due to all the discrete events
P2.1.2 Ability to select from list of SimCenter motions SP M 1.0

76
CHAPTER 9. REQUIREMENTS

P2.1.3 Ability to select from list of PEER motions SP D 1.0


P2.1.4 Ability to use OpenSHA and selection methods to generate UF D 1.0
motions
P2.1.5 Ability to Utilize Own Application in Workflow SP M 1.0
P2.1.6 Ability to use Broadband SP D
P2.1.7 Ability to include Soil Structure Interaction Effects GC M 1.1
P2.1.7.1 1D nonlinear site response with effective stress analysis SP M 1.1
P2.1.7.2 Nonlinear site response with bidirectional loading SP M 1.2
P2.1.7.3 Nonlinear site response with full stochastic characterization SP M
of soil layers
P2.1.7.4 Nonlinear site response, bidirectional different input motions SP M
P2.1.7.5 Building in nonlinear soil domain utilizing large scale rup- GC M
ture simulation
P2.1.7.5.1Interface using DRM method SP M
P2.1.8 Utilize PEER NGA www ground motion selection tool UF D 2.0
P2.1.9 Ability to select from synthetic ground motions SP M 1.0
P2.1.9.1 per Vlachos, Papakonstantinou, Deodatis (2017) SP D 1.1
P2.1.9.2 per Dabaghi, Der Kiureghian (2017) UF D 2.0
P2.2 Various Wind Loading Options SP M
P2.3 Various Water Loading Options SP M
P3 Building Model Generation GC M
P3.1 Ability to quickly create a simple nonlinear building model GC D 1.1
P3.2 Ability to use existing OpenSees model scripts SP M 1.0
P3.3 Ability to define building and use Expert System to generate SP
FE mesh
P3.3.1 Expert system for Concrete Shear Walls SP M
P3.3.2 Expert system for Moment Frames SP M
P3.3.3 Expert system for Braced Frames SP M
P3.4 Ability to define building and use Machine Learning appli- GC
cations to generate FE
P3.4.1 Machine Learning for Concrete Shear Walls SP M
P3.4.2 Machine Learning for Moment Frames SP M
P3.4.3 Machine Learning for Braced Frames SP M
P3.5 Ability to specify connection details for member ends UF M
P3.6 Ability to define a user-defined moment-rotation response UF D 2.2
representing the connection details
P4 Perform Nonlinear Analysis GC M 1.0

P4.1 Ability to specify OpenSees as FEM engine and to specify SP M 1.0
different analysis options
P4.2 Ability to provide own OpenSees Analysis script to SP D 1.0
OpenSees engine.
P4.3 Ability to provide own Python script and use OpenSeesPy UF O
engine.
P4.4 Ability to use alternative FEM engine. SP M 2.0
P5 Uncertainty Quantification Methods GC M 1.0
P5.1 Various Forward Propagation Methods SP M 1.0
P5.1.1 Ability to use basic Monte Carlo and LHS methods SP M 1.0
P5.1.2 Ability to use Importance Sampling SP M 2.0
P5.1.3 Ability to use Gaussian Process Regression SP M 2.0
P5.1.4 Ability to use Own External UQ Engine SP M
P5.2 Various Reliability Methods UF M
P5.2.1 Ability to use First Order Reliability method UF M
P5.2.2 Ability to use Second Order Reliability method UF M
P5.2.3 Ability to use Surrogate Based Reliability UF M
P5.2.4 Ability to use Own External Application to generate Results UF M
P5.3 Various Sensitivity Methods UF M
P5.3.1 Ability to obtain Global Sensitivity Sobol’s indices UF M
P6 Random Variables for Uncertainty Quantification GC M 1.0
P6.1 Ability to Define Variables of certain types: SP M 1.0
P6.1.1 Normal SP M 1.0
P6.1.2 Lognormal SP M 1.0
P6.1.3 Uniform SP M 1.0
P6.1.4 Beta SP M 1.0
P6.1.5 Weibull SP M 1.0
P6.1.6 Gumbel SP M 1.0
P6.2 User defined Distribution SP M
P6.3 Correlated Random Variables SP M
P6.4 Random Fields SP M
P8 Engineering Demand Parameters
P8.1 Ability to Process own Output Parameters UF M
P8.2 Add to Standard Earthquake a variable indicating analysis UF D
failure
P8.3 Allow users to provide their own set of EDPs for the analysis. UF D 2.0
P9 Damage and Loss Assessment GC M 1.0
P9.1 Different Assessment Methods GC M 2.0

P9.1.1 Ability to perform component-based (FEMA P58) loss as- SP M 1.0
sessment for an earthquake hazard.
P9.1.2 Ability to perform component-assembly-based (HAZUS SP D 1.1
MH) loss assessment for an earthquake hazard.
P9.1.3 Ability to perform downtime estimation using the REDi UF D
methodology.
P9.1.4 Ability to describe building performance with additional de- SP D
cision variables from HAZUS (e.g., business interruption,
debris)
P9.1.5 Ability to perform time-based assessment GC M
P9.1.6 Ability to perform damage and loss assessment for hurricane GC M
wind
P9.1.7 Ability to perform damage and loss assessment for storm GC M
surge
P9.2 Control SP M 1.0
P9.2.1 Allow users to set the number of realizations SP M 1.0
P9.2.2 Allow users to specify the added uncertainty to EDPs SP M 1.0
P9.2.3 Allow users to decide which decision variables to calculate SP D 1.0
P9.2.4 Allow users to set the number of inhabitants on each floor SP D 1.0
and customize their temporal distribution.
P9.2.5 Allow users to specify the boundary conditions of repairabil- SP D 1.0
ity.
P9.2.6 Allow users to control collapse through EDP limits. SP D 1.0
P9.2.7 Allow users to specify the replacement cost and time for the SP M 1.0
building.
P9.2.8 Allow users to specify EDP boundaries that correspond to SP D 1.0
reliable simulation results.
P9.2.9 Allow users to specify collapse modes and characterize the SP D 1.0
corresponding likelihood of injuries.
P9.2.10 Allow users to specify the collapse probability of the struc- UF M 1.2
ture.
P9.2.11 Allow users to use empirical EDP data to estimate the col- UF M 1.2
lapse probability of the structure.
P9.2.12 Allow users to choose the type of distribution they want to UF D 1.2
estimate the EDPs with.
P9.2.13 Allow users to perform the EDP fitting only for non- UF M 1.2
collapsed cases.
P9.2.14 Allow users to couple response estimation with loss assess- UF M
ment.
P9.3 Component damage and loss information SP M 1.0

P9.3.1 Make the component damage and loss data from FEMA P58 SP M 1.0
available.
P9.3.2 Ability to use custom components for loss assessment. SP D 1.0
P9.3.3 Allow users to set different component quantities for each SP D 1.0
floor in each direction.
P9.3.4 Allow users to set the number of identical component groups UF D 1.0
and their quantities within each performance group.
P9.3.5 Use a generic JSON data format for building components SP D 1.1
that can be shared by component-based and component-
assembly-based assessments.
P9.3.6 Convert FEMA P58 and HAZUS component damage and SP D 1.1
loss data to the new JSON format and make it available
with the tool.
P9.3.7 Make component definition easier by providing a list of avail- UF D 1.2
able components in the given framework (e.g. FEMA P58
or HAZUS) and not requesting inputs that are already avail-
able in the data files.
P9.3.8 Make the component damage and loss data from FEMA P58 UF M 2.0
2nd edition available.
P9.3.9 Improve component definition by providing complete control UF D 2.0
over every characteristic on every floor and in every direction
P9.3.10 Allow users to view fragility and consequence functions in UF D
the application
P9.3.11 Allow users to edit fragility and consequence functions in UF D
the application
P9.4 Stochastic loss model SP M 1.0
P9.4.1 Allow the user to specify basic dependencies (i.e. indepen- SP D 1.0
dence or perfect correlation) between logically similar parts
of the stochastic model (i.e. within component quantities
or one type of decision variable, but not between quantities
and fragilities)
P9.4.2 Allow the user to specify basic dependencies between recon- SP D 1.0
struction cost and reconstruction time.
P9.4.3 Allow the user to specify basic dependencies between differ- SP D 1.0
ent levels of injuries.
P9.4.4 Allow the user to specify intermediate levels of correlation SP D
(i.e. not limited to 0 or 1) and provide a convenient interface
that makes sure the specified correlation structure is valid.
P9.4.5 Allow the user to specify the correlation for EDPs. SP D
PM Misc. UF M 1.2

PM.1 Tool to allow user to load and save user inputs SP M 1.0
PM.2 Simplify run local and run remote by removing workdir lo- UF D 1.2
cations. Move to preferences
PM.3 Add to EDP a variable indicating analysis failure UF D
PM.4 Installer which installs application and all needed software UF M
PE Ability to gain educational materials that will help GC M 1.0
and encourage PBE
PE.1 Documentation exists on tool usage SP M 1.1
PE.2 Video Exists demonstrating usage SP M 1.1
PE.3 Verification Examples Exist SP M 1.1
PE.4 Validation Examples Exist, validated against tests or other GC D
software

KEY:
Source: GC=Needed for Grand Challenges, SP=Senior Personnel, UF=User Feedback
Priority: M=Mandatory, D=Desirable, P=Possible Future
Version: version in which the basic requirement was first met

10 Troubleshooting

10.1 Problems Starting the Application


On Windows operating systems, if the application fails to start with a message that
MSVCP140.dll is missing, as shown in Figure 10.1, the cause is a missing Visual C/C++
runtime library. You can fix this error by running the installer for the Visual C/C++
redistributable package (vc_redist.x64.exe) that is included with the application.

Figure 10.1: Error message for missing Visual C/C++ runtime library

10.2 Problems Running Simulations


PBE is a complicated tool and it will not always run successfully. Causes of failure
include incorrect setup, non-functioning or poorly functioning websites, and user error.
To diagnose such errors it is useful to understand how the UI and the backend interact
when the user submits a job. A number of things occur when the Submit button is
clicked (a small diagnostic sketch follows the list):

a) The UI creates a folder called tmp.SimCenter in the specified working directory
and, in that folder, creates another folder called templatedir.
b) The UI then iterates through all the chosen widgets, and these widgets place all
files needed for the computation into the templatedir directory.

c) A Python script is run in the templatedir directory to create the input file for
the UQ engine. For example, when Dakota is used, the input file dakota.in is
created and placed in the tmp.SimCenter folder.
d) The UQ engine is then started and runs using the dakota.in input file.
e) As the UQ engine runs, it creates folders in tmp.SimCenter, one folder for each
deterministic run.
f) When completed, the UQ engine leaves the results files in the tmp.SimCenter
folder.
g) The results files are then processed by the UI and presented to the user in the
RES tab.
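
The folders and files named above can be inspected directly when a run misbehaves.
The following Python sketch is a hypothetical diagnostic helper, not part of PBE: the
names tmp.SimCenter, templatedir, and dakota.in come from the workflow described
above, the workdir prefix is an assumption based on a typical Dakota setup, and the
working-directory path is whatever you set in the Preferences menu.

    import os

    def inspect_run_dir(working_dir):
        """Report which artifacts of a PBE run are present under working_dir."""
        tmp = os.path.join(working_dir, "tmp.SimCenter")
        template = os.path.join(tmp, "templatedir")
        dakota_in = os.path.join(tmp, "dakota.in")

        print("tmp.SimCenter exists:", os.path.isdir(tmp))
        print("templatedir exists:  ", os.path.isdir(template))
        print("dakota.in exists:    ", os.path.isfile(dakota_in))

        if os.path.isdir(tmp):
            # one folder is created per deterministic run (step e above);
            # the "workdir" prefix is an assumption
            workdirs = [d for d in os.listdir(tmp) if d.startswith("workdir")]
            print(len(workdirs), "workdir folder(s) found")

    # Example call; replace the path with your Local Jobs Directory:
    # inspect_run_dir("/path/to/LocalJobsDirectory")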

The following is a list of failures we have observed when the UI informs the user of an
error, along with steps the user can take to fix the problem (a short triage sketch for
the Dakota checks follows the list):

a) Could not create working dir: The user does not have permission to create the
tmp.SimCenter folder in the working directory location. Change the Local Jobs
Directory and the Remote Jobs Directory in the application's Preferences menu.
b) No Script File: The user has changed the Local Applications directory location
in Preferences, or the applications folder that accompanies the installation has
been modified. Either set the correct directory location or re-install the tool.
c) ERROR: Dakota failed to finish: This can occur for a number of reasons.
Go to the tmp.SimCenter folder and look for the dakota.err file. If no such file
exists, then Dakota did not start; if the file exists, look at its contents to see if
there are any errors.
i. No dakota.err file and no dakota.in file: the Python script in templatedir
failed to create the necessary files. Look at the workflow log file in the
templatedir folder to see what the error is, as it could indicate an error in
your input. If no workflow log file exists, Python failed to start. Check the
installation of Python.
ii. No dakota.err and dakota.in exists: Dakota failed to run. Check the in-
stallation of Dakota and Python. NOTE: Sometimes when Python starts, it
is not the version of Python you specified in the environment variables when
setting up Python (this is because many applications install their own ver-
sion of Python). If Dakota is installed correctly, set the location of the
Python executable in Preferences.
iii. dakota.err file exists: Open the file and see what the error is. For example,
if it says Error: at least one variable must be specified, no random variables
have been specified: you have only one deterministic event, or you have not
specified any random variables in the EDP.
iv. dakota.err file exists but is empty: This means that Dakota ran but
there was a problem with the simulation. Go to one of the workdir locations.
There is a workflow driver file there that can be run. Run it and see what
the errors are.
v. You ran at DesignSafe and no dakota.out files come back: Go to your
data depot folder at DesignSafe using your browser. Go to archive/jobs and
use the job number shown in the table that pops up when you ask to get the
job from DesignSafe. Study both the .err and .out files in that directory for
clues as to what went wrong.
vi. No results and you used the Site Response to create the event:
You must run a simulated event in the Site Response widget before you can
submit a job to run.
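
The checks in items i.-iv. above can be summarized in a short script. The following
Python sketch is a hypothetical triage helper, not part of PBE; it only checks for the
presence and size of dakota.err and dakota.in in the tmp.SimCenter folder and prints
the corresponding suggestion from the list above.

    import os

    def triage_dakota_failure(tmp_dir):
        """Suggest a likely cause for 'ERROR: Dakota failed to finish'."""
        err = os.path.join(tmp_dir, "dakota.err")
        inp = os.path.join(tmp_dir, "dakota.in")

        if not os.path.isfile(err) and not os.path.isfile(inp):
            return ("No dakota.err and no dakota.in: the workflow script failed; "
                    "check the workflow log in templatedir and the Python setup.")
        if not os.path.isfile(err):
            return ("dakota.in exists but no dakota.err: Dakota did not run; "
                    "check the Dakota and Python installations and Preferences.")
        if os.path.getsize(err) > 0:
            with open(err) as f:
                return "dakota.err reports:\n" + f.read()
        return ("dakota.err is empty: Dakota ran but a simulation failed; "
                "run the workflow driver in one of the workdir folders.")

    # Example call; point the path at your tmp.SimCenter folder:
    # print(triage_dakota_failure("/path/to/tmp.SimCenter"))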

If you are still having trouble, you can always join the PBE Slack channel and look for
similar issues or post a new one.
