openquake.commonlib package

openquake.commonlib.concurrent_futures_process_mpatch module

openquake.commonlib.concurrent_futures_process_mpatch.concurrent_futures_process_monkeypatch()[source]

openquake.commonlib.datastore module

class openquake.commonlib.datastore.ByteCounter(nbytes=0)[source]

Bases: object

A visitor used to measure the dimensions of an HDF5 dataset or group. Use it as ByteCounter.get_nbytes(dset_or_group).

classmethod get_nbytes(dset)[source]
class openquake.commonlib.datastore.DataStore(calc_id=None, datadir='/home/daniele/oqdata', export_dir='.', params=(), mode=None)[source]

Bases: _abcoll.MutableMapping

DataStore class to store the inputs/outputs of a calculation on the filesystem.

Here is a minimal example of usage:

>>> ds = DataStore()
>>> ds['example'] = 'hello world'
>>> ds.items()
[(u'example', 'hello world')]
>>> ds.clear()

When reading the items, the DataStore will return a generator. The items will be ordered lexicographically according to their name.

There is a serialization protocol to store objects in the datastore. An object is serializable if it has a method __toh5__ returning an array and a dictionary, and a method __fromh5__ taking an array and a dictionary and populating the object. For an example of use see openquake.hazardlib.site.SiteCollection.
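
For instance, a minimal serializable class could look like this (an illustrative sketch, not part of the package):

class Probabilities(object):
    """A tiny object following the __toh5__/__fromh5__ protocol"""
    def __init__(self, array, imt):
        self.array = array
        self.imt = imt

    def __toh5__(self):
        # return an array and a dictionary of metadata
        return self.array, dict(imt=self.imt)

    def __fromh5__(self, array, attrs):
        # repopulate the object from the array and the metadata
        self.array = array
        self.imt = attrs['imt']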

build_fname(prefix, postfix, fmt, export_dir=None)[source]

Build a file name from a realization by using the given prefix and extension.

Parameters:
  • prefix – the prefix to use
  • postfix – the postfix to use (can be a realization object)
  • fmt – the extension (‘csv’, ‘xml’, etc)
  • export_dir – export directory (if None use .export_dir)
Returns:

relative pathname including the extension

clear()[source]

Remove the datastore from the file system

close()[source]

Close the underlying hdf5 file

create_dset(key, dtype, size=None, compression=None)[source]

Create a one-dimensional HDF5 dataset.

Parameters:
  • key – name of the dataset
  • dtype – dtype of the dataset (usually composite)
  • size – size of the dataset (if None, the dataset is extendable)
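
For instance, an extendable dataset with a composite dtype could be created as follows (a sketch assuming ds is a DataStore instance as in the example above; the dataset name and dtype are purely illustrative):

import numpy

gmv_dt = numpy.dtype([('sid', numpy.uint32), ('gmv', numpy.float64)])
dset = ds.create_dset('gmf_data/data', gmv_dt)  # size=None, i.e. extendable
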
export_csv(key)[source]

Generic csv exporter

export_path(relname, export_dir=None)[source]

Return the path of the exported file, built by adding the export_dir in front and the calculation ID at the end.

Parameters:
  • relname – relative file name
  • export_dir – export directory (if None use .export_dir)
flush()[source]

Flush the underlying hdf5 file

get(key, default)[source]
Returns:the value associated with the datastore key, or the default
get_attr(key, name, default=None)[source]
Parameters:
  • key – dataset path
  • name – name of the attribute
  • default – value to return if the attribute is missing
getsize(key=None)[source]

Return the size in bytes of the output associated with the given key. If no key is given, returns the total size of all files.

save(key, kw)[source]

Update the object associated with key with the kw dictionary; works for LiteralAttrs objects and automatically flushes.

set_attrs(key, **kw)[source]

Set the HDF5 attributes of the given key

set_nbytes(key, nbytes=None)[source]

Set the nbytes attribute on the HDF5 object identified by key.

set_parent(parent)[source]

Give a parent to a datastore and update its .attrs with the parent attributes, which are assumed to be literal strings.

class openquake.commonlib.datastore.Fake(attrs=None, **kwargs)[source]

Bases: dict

A fake datastore as a dict subclass, useful in tests and such

openquake.commonlib.datastore.get_calc_ids(datadir='/home/daniele/oqdata')[source]

Extract the available calculation IDs from the datadir, in order.

openquake.commonlib.datastore.get_last_calc_id(datadir)[source]

Extract the latest calculation ID from the given directory. If none is found, return 0.

openquake.commonlib.datastore.get_nbytes(dset)[source]

If the dataset has an attribute ‘nbytes’, return it. Otherwise get the size of the underlying array. Returns None if the dataset is actually a group.

openquake.commonlib.datastore.persistent_attribute(key)[source]

Persistent attributes are persisted to the datastore and cached. Modifications to mutable objects are not automagically persisted. If you have a huge object that does not fit in memory use the datastore directly (for instance, open a HDF5 file to create an empty array, then populate it). Notice that you can use any dict-like data structure in place of the datastore, provided you can set attributes on it. Here is an example:

>>> class Datastore(dict):
...     "A fake datastore"
>>> class Store(object):
...     a = persistent_attribute('a')
...     def __init__(self, a):
...         self.datastore = Datastore()
...         self.a = a  # this assignment will store the attribute
>>> store = Store([1])
>>> store.a  # this retrieves the attribute
[1]
>>> store.a.append(2)
>>> store.a = store.a  # remember to store the modified attribute!
Parameters:key – the name of the attribute to be made persistent
Returns:a property to be added to a class with a .datastore attribute
openquake.commonlib.datastore.read(calc_id, mode='r', datadir='/home/daniele/oqdata')[source]
Parameters:
  • calc_id – calculation ID
  • mode – ‘r’ or ‘w’
  • datadir – the directory to look in
Returns:

the corresponding DataStore instance

Read the datastore, if it exists and it is accessible.

openquake.commonlib.hazard_writers module

Classes for serializing various NRML XML artifacts.

class openquake.commonlib.hazard_writers.BaseCurveWriter(dest, **metadata)[source]

Bases: object

Base class for curve writers.

Parameters:
  • dest – File path (including filename) or file-like object for results to be saved to.
  • metadata

    The following keyword args are required:

    • investigation_time: Investigation time (in years) defined in the calculation which produced these results.

    The following are more or less optional (combinational rules noted below where applicable):

    • statistics: ‘mean’ or ‘quantile’
    • quantile_value: Only required if statistics = ‘quantile’.
    • smlt_path: String representing the logic tree path which produced these curves. Only required for non-statistical curves.
    • gsimlt_path: String representing the GSIM logic tree path which produced these curves. Only required for non-statistical curves.
serialize(_data)[source]

Implement in subclasses.

class openquake.commonlib.hazard_writers.DisaggXMLWriter(dest, **metadata)[source]

Bases: object

Parameters:
  • dest – File path (including filename) or file-like object for XML results to be saved to.
  • metadata

    The following keyword args are required:

    • investigation_time: Investigation time (in years) defined in the calculation which produced these results.
    • imt: Intensity measure type used to compute these matrices.
    • lon, lat: Longitude and latitude associated with these results.

    The following attributes define dimension context for the result matrices:

    • mag_bin_edges: List of magnitude bin edges (floats)
    • dist_bin_edges: List of distance bin edges (floats)
    • lon_bin_edges: List of longitude bin edges (floats)
    • lat_bin_edges: List of latitude bin edges (floats)
    • eps_bin_edges: List of epsilon bin edges (floats)
    • tectonic_region_types: List of tectonic region types (strings)
    • smlt_path: String representing the logic tree path which produced these results. Only required for non-statistical results.
    • gsimlt_path: String representing the GSIM logic tree path which produced these results. Only required for non-statistical results.

    The following are optional, depending on the imt:

    • sa_period: Only used with imt = ‘SA’.
    • sa_damping: Only used with imt = ‘SA’.
BIN_EDGE_ATTR_MAP = OrderedDict([('mag_bin_edges', 'magBinEdges'), ('dist_bin_edges', 'distBinEdges'), ('lon_bin_edges', 'lonBinEdges'), ('lat_bin_edges', 'latBinEdges'), ('eps_bin_edges', 'epsBinEdges'), ('tectonic_region_types', 'tectonicRegionTypes')])

Maps metadata keywords to XML attribute names for bin edge information passed to the constructor. The dict here is an OrderedDict so as to give consistent ordering of result attributes.

DIM_LABEL_TO_BIN_EDGE_MAP = {'Dist': 'dist_bin_edges', 'Lon': 'lon_bin_edges', 'Eps': 'eps_bin_edges', 'Mag': 'mag_bin_edges', 'Lat': 'lat_bin_edges', 'TRT': 'tectonic_region_types'}
serialize(data)[source]
Parameters:data

A sequence of data where each datum has the following attributes:

  • matrix: N-dimensional numpy array containing the disaggregation histogram.
  • dim_labels: A list of strings which label the dimensions of a given histogram. For example, for a Magnitude-Distance-Epsilon histogram, we would expect dim_labels to be ['Mag', 'Dist', 'Eps'].
  • poe: The disaggregation Probability of Exceedance level for which these results were produced.
  • iml: Intensity measure level, interpolated from the source hazard curve at the given poe.
class openquake.commonlib.hazard_writers.EventBasedGMFXMLWriter(dest, sm_lt_path, gsim_lt_path)[source]

Bases: object

Parameters:
  • dest – File path (including filename) or a file-like object for XML results to be saved to.
  • sm_lt_path (str) – Source model logic tree branch identifier of the logic tree realization which produced this collection of ground motion fields.
  • gsim_lt_path – GSIM logic tree branch identifier of the logic tree realization which produced this collection of ground motion fields.
serialize(data, fmt='%10.7E')[source]

Serialize a collection of ground motion fields to XML.

Parameters:data

An iterable of “GMF set” objects. Each “GMF set” object should:

  • have an investigation_time attribute
  • have a stochastic_event_set_id attribute
  • be iterable, yielding a sequence of “GMF” objects

Each “GMF” object should:

  • have an imt attribute
  • have an sa_period attribute (only if imt is ‘SA’)
  • have an sa_damping attribute (only if imt is ‘SA’)
  • have a rupture_id attribute (to indicate which rupture contributed to this gmf)
  • be iterable, yielding a sequence of “GMF node” objects

Each “GMF node” object should have:

  • a gmv attribute (to indicate the ground motion value)
  • lon and lat attributes (to indicate the geographical location of the ground motion field)
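
Since the writer relies only on the attributes described above (duck typing), the input can be built with ad-hoc classes. Here is a hedged sketch; GMFNode, GMF and GMFSet are hypothetical helpers, not part of the package, and all values are purely illustrative:

class GMFNode(object):
    def __init__(self, gmv, lon, lat):
        self.gmv, self.lon, self.lat = gmv, lon, lat

class GMF(list):
    def __init__(self, imt, rupture_id, nodes):
        list.__init__(self, nodes)
        self.imt = imt
        self.rupture_id = rupture_id
        self.sa_period = None  # only meaningful when imt is 'SA'
        self.sa_damping = None

class GMFSet(list):
    def __init__(self, ses_id, investigation_time, gmfs):
        list.__init__(self, gmfs)
        self.stochastic_event_set_id = ses_id
        self.investigation_time = investigation_time

gmf_set = GMFSet(1, 50.0, [GMF('PGA', 'rup-001',
                               [GMFNode(0.12, 9.15, 45.16)])])
writer = EventBasedGMFXMLWriter('gmf.xml', 'b1', 'b1')
writer.serialize([gmf_set])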
class openquake.commonlib.hazard_writers.HazardCurveGeoJSONWriter(dest, **metadata)[source]

Bases: openquake.commonlib.hazard_writers.BaseCurveWriter

Writes hazard curves to GeoJSON. Has the same constructor and interface as HazardCurveXMLWriter.

serialize(data)[source]

Write the hazard curves to the given destination as GeoJSON. The GeoJSON format is customized to contain various bits of metadata.

See HazardCurveXMLWriter.serialize() for expected input.

class openquake.commonlib.hazard_writers.HazardCurveXMLWriter(dest, **metadata)[source]

Bases: openquake.commonlib.hazard_writers.BaseCurveWriter

Hazard Curve XML writer. See BaseCurveWriter for a list of general constructor inputs.

The following additional metadata params are required:
  • imt: Intensity measure type used to compute these hazard curves.
  • imls: Intensity measure levels, which represent the x-axis values of each curve.
The following parameters are optional:
  • sa_period: Only used with imt = ‘SA’.
  • sa_damping: Only used with imt = ‘SA’.
add_hazard_curves(root, metadata, data)[source]

Add hazard curves stored into data as child of the root element with metadata. See the documentation of the method serialize and the constructor for a description of data and metadata, respectively.

serialize(data)[source]

Write a sequence of hazard curves to the specified file.

Parameters:data

Iterable of hazard curve data. Each datum must be an object with the following attributes:

  • poes: A list of probability of exceedance values (floats).
  • location: An object representing the location of the curve; must have x and y to represent lon and lat, respectively.
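
A minimal sketch of building such input with ad-hoc classes (Location and CurveDatum are hypothetical helpers and the metadata values are purely illustrative):

class Location(object):
    def __init__(self, x, y):
        self.x, self.y = x, y

class CurveDatum(object):
    def __init__(self, poes, location):
        self.poes, self.location = poes, location

writer = HazardCurveXMLWriter(
    'hazard_curves.xml', investigation_time=50.0, imt='PGA',
    imls=[0.005, 0.007, 0.0098], smlt_path='b1', gsimlt_path='b1')
writer.serialize([CurveDatum([0.98, 0.85, 0.50], Location(9.15, 45.16)),
                  CurveDatum([0.95, 0.80, 0.45], Location(9.15, 45.12))])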
class openquake.commonlib.hazard_writers.HazardMapGeoJSONWriter(dest, **metadata)[source]

Bases: openquake.commonlib.hazard_writers.HazardMapWriter

GeoJSON implementation of a HazardMapWriter. Serializes hazard maps as FeatureCollection artifacts with additional hazard map metadata.

See HazardMapWriter for information about constructor parameters.

serialize(data)[source]

Serialize hazard map data to GeoJSON.

See HazardMapWriter.serialize() for details about the expected input.

class openquake.commonlib.hazard_writers.HazardMapWriter(dest, **metadata)[source]

Bases: object

Parameters:
  • dest – File path (including filename) or a file-like object for results to be saved to.
  • metadata

    The following keyword args are required:

    • investigation_time: Investigation time (in years) defined in the calculation which produced these results.
    • imt: Intensity measure type used to compute these hazard curves.
    • poe: The Probability of Exceedance level for which this hazard map was produced.

    The following are more or less optional (combinational rules noted below where applicable):

    • statistics: ‘mean’ or ‘quantile’
    • quantile_value: Only required if statistics = ‘quantile’.
    • smlt_path: String representing the logic tree path which produced these curves. Only required for non-statistical curves.
    • gsimlt_path: String representing the GSIM logic tree path which produced these curves. Only required for non-statistical curves.
    • sa_period: Only used with imt = ‘SA’.
    • sa_damping: Only used with imt = ‘SA’.
serialize(data)[source]

Write a sequence of hazard map data to the specified file.

Parameters:data – Iterable of hazard map data. Each datum should be a triple of (lon, lat, iml) values.
class openquake.commonlib.hazard_writers.HazardMapXMLWriter(dest, **metadata)[source]

Bases: openquake.commonlib.hazard_writers.HazardMapWriter

NRML/XML implementation of a HazardMapWriter.

See HazardMapWriter for information about constructor parameters.

serialize(data)[source]

Serialize hazard map data to XML.

See HazardMapWriter.serialize() for details about the expected input.
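
For instance, a hazard map could be serialized from plain (lon, lat, iml) triples like this (a sketch; the metadata values are purely illustrative):

writer = HazardMapXMLWriter(
    'hazard_map.xml', investigation_time=50.0, imt='PGA', poe=0.1,
    smlt_path='b1', gsimlt_path='b1')
writer.serialize([(9.15, 45.16, 0.32), (9.15, 45.12, 0.28)])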

class openquake.commonlib.hazard_writers.MultiHazardCurveXMLWriter(dest, metadata_set)[source]

Bases: object

A serializer of multiple hazard curve sets, each one with its own metadata. It uses openquake.commonlib.hazard_writers.HazardCurveXMLWriter to actually serialize each single set of curves.

Attr str dest:The path of the filename to be written, or a file-like object
Attr metadata_set:
 Iterable over metadata suitable to create instances of openquake.commonlib.hazard_writers.HazardCurveXMLWriter
serialize(curve_set)[source]

Write a set of sequences of hazard curves to the specified file.

Parameters:curve_set – Iterable over sequences of curves. Each element returned by the iterable is an iterable suitable to be used by the serialize() method of the class openquake.commonlib.hazard_writers.HazardCurveXMLWriter
class openquake.commonlib.hazard_writers.SESXMLWriter(dest)[source]

Bases: object

Parameters:
  • dest – File path (including filename) or a file-like object for XML results to be saved to.
  • sm_lt_path (str) – Source model logic tree branch identifier of the logic tree realization which produced this collection of stochastic event sets.
  • gsim_lt_path – GSIM logic tree branch identifier of the logic tree realization which produced this collection of stochastic event sets.
serialize(data)[source]

Serialize a collection of stochastic event sets to XML.

Parameters:data

An iterable of “SES” (“Stochastic Event Set”) objects. Each “SES” object should:

  • have an investigation_time attribute
  • have an ordinal attribute
  • be iterable, yielding a sequence of “rupture” objects

Each “rupture” should have the following attributes:

  • etag
  • magnitude
  • strike
  • dip
  • rake
  • tectonic_region_type
  • is_from_fault_source (a bool)
  • is_multi_surface (a bool)
  • lons
  • lats
  • depths

If is_from_fault_source is True, the rupture originated from a simple or complex fault source. In this case, lons, lats, and depths should all be 2D arrays (of uniform shape). These coordinate triples represent nodes of the rupture mesh.

If is_from_fault_source is False, the rupture originated from a point or area source. In this case, the rupture is represented by a quadrilateral planar surface. This planar surface is defined by 3D vertices. In this case, the rupture should have the following attributes:

  • top_left_corner
  • top_right_corner
  • bottom_right_corner
  • bottom_left_corner

Each of these should be a triple of lon, lat, depth.

If is_multi_surface is True, the rupture originated from a multi-surface source. In this case, lons, lats, and depths should have uniform length. The length should be a multiple of 4, where each segment of 4 represents the corner points of a planar surface in the following order:

  • top left
  • top right
  • bottom left
  • bottom right

Each of these should be a triple of lon, lat, depth.

class openquake.commonlib.hazard_writers.UHSXMLWriter(dest, **metadata)[source]

Bases: openquake.commonlib.hazard_writers.BaseCurveWriter

UHS curve XML writer. See BaseCurveWriter for a list of general constructor inputs.

The following additional metadata params are required:
  • poe: Probability of exceedance for which a given set of UHS have been computed.
  • periods: A list of SA (Spectral Acceleration) period values, sorted in ascending order.

serialize(data)[source]

Write a sequence of uniform hazard spectra to the specified file.

Parameters:data

Iterable of UHS data. Each datum must be an object with the following attributes:

  • imls: A sequence of Intensity Measure Levels
  • location: An object representing the location of the curve; must have x and y to represent lon and lat, respectively.
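
A minimal sketch of serializing a single spectrum with ad-hoc classes (Location and UHSDatum are hypothetical helpers; all values are illustrative):

class Location(object):
    def __init__(self, x, y):
        self.x, self.y = x, y

class UHSDatum(object):
    def __init__(self, imls, location):
        self.imls, self.location = imls, location

writer = UHSXMLWriter('uhs.xml', investigation_time=50.0, poe=0.1,
                      periods=[0.025, 0.1, 0.5],
                      smlt_path='b1', gsimlt_path='b1')
writer.serialize([UHSDatum([0.31, 0.45, 0.12], Location(9.15, 45.16))])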
openquake.commonlib.hazard_writers.gen_gmfs(gmf_set)[source]

Generate GMF nodes from a gmf_set.

Parameters:gmf_set – a sequence of GMF objects with attributes imt, sa_period, sa_damping, rupture_id and containing a list of GMF nodes with attributes gmv and location. The nodes are sorted by lon/lat.

openquake.commonlib.hazard_writers.rupture_to_element(rupture, parent=None)[source]

Convert a rupture object into an Element object.

Parameters:
  • rupture – must have attributes .rupture, .etag and .seed
  • parent – if None a new element is created, otherwise a sub element is attached to the parent.

openquake.commonlib.logictree module

Logic tree parser, verifier and processor. See specs at https://blueprints.launchpad.net/openquake-old/+spec/openquake-logic-tree-module

A logic tree object must be iterable, yielding realizations, i.e. objects with attributes value, weight, lt_path and ordinal.

class openquake.commonlib.logictree.BaseLogicTree(filename, validate=True, seed=0, num_samples=0)[source]

Bases: object

Common code for logic tree readers, parsers and verifiers – GMPELogicTree and SourceModelLogicTree.

Parameters:
  • filename – Full pathname of logic tree file
  • validate – Boolean indicating whether or not the tree should be validated while parsed. This should be set to True on initial load of the logic tree (before importing it to the database) and to False on workers side (when loaded from the database).
Raises:
  • ParsingError – If the logic tree file or any of the referenced files cannot be read or parsed.
  • ValidationError – If the logic tree file has a logic error which cannot be prevented by XML schema rules (like referencing sources with a missing id).
FILTERS = ('applyToTectonicRegionType', 'applyToSources', 'applyToSourceType')
apply_branchset(branchset_node, branchset)[source]

Apply branchset to all “open end” branches. See parse_branchinglevel().

Parameters:
  • branchset_node – Same as for parse_branchset().
  • branchset – An instance of BranchSet to be attached as a child to the “open-end” branches.

Can be overridden by subclasses if they want to apply branchsets to branches selectively.

parse_branches(branchset_node, branchset, validate)[source]

Create and attach branches at branchset_node to branchset.

Parameters:
  • branchset_node – Same as for parse_branchset().
  • branchset – An instance of BranchSet.
  • validate – Whether or not branches’ uncertainty values should be validated.

Checks that each branch has a valid value and a unique id, and that all branches have a total weight of 1.0.

Returns:None, all branches are attached to provided branchset.
parse_branchinglevel(branchinglevel_node, depth, validate)[source]

Parse one branching level.

Parameters:
  • branchinglevel_node – etree.Element object with tag “logicTreeBranchingLevel”.
  • depth – The sequential number of this branching level, based on 0.
  • validate – Whether or not the branching level, its branchsets and their branches should be validated.

Enumerates child branchsets and calls parse_branchset(), validate_branchset(), parse_branches() and finally apply_branchset() for each.

Keeps track of “open ends” – the set of branches that don’t have any child branchset on this step of execution. After processing of every branching level only those branches that are listed in it can have child branchsets (if there is one on the next level).

parse_branchset(branchset_node, depth, number, validate)[source]

Create BranchSet object using data in branchset_node.

Parameters:
  • branchset_node – etree.Element object with tag “logicTreeBranchSet”.
  • depth – The sequential number of branchset’s branching level, based on 0.
  • number – Index number of this branchset inside branching level, based on 0.
  • validate – Whether or not filters defined in branchset and the branchset itself should be validated.
Returns:

An instance of BranchSet with filters applied but with no branches (they’re attached in parse_branches()).

parse_filters(branchset_node, uncertainty_type, filters)[source]

Do any kind of type conversion or adaptation on the filters.

Abstract method, must be overridden by subclasses.

Parameters are the same as for validate_filters().

Returns:The filters dictionary to replace the original.
parse_tree(tree_node, validate)[source]

Parse the whole tree and point root_branchset attribute to the tree’s root. If validate is set to True, calls validate_tree() when done. Also passes that value to parse_branchinglevel().

parse_uncertainty_value(node, branchset)[source]

Do any kind of type conversion or adaptation on the uncertainty value.

Abstract method, must be overridden by subclasses.

Parameters are the same as for validate_uncertainty_value().

Returns:Something to replace value as the uncertainty value.
sample_path(rnd)[source]

Return the model name and a list of branch ids.

Parameters:rnd – the random number generator used for the sampling
skip_branchset_condition(attrs)[source]

Override in subclasses to skip a branchset depending on a condition on its attributes.

Parameters:attrs – a dictionary with the attributes of the branchset
validate_branchset(branchset_node, depth, number, branchset)[source]

Check that branchset is valid.

Abstract method, must be overridden by subclasses.

Parameters:
  • branchset_node – etree.Element object with tag “logicTreeBranchSet”.
  • depth – The number of branching level that contains the branchset, based on 0.
  • number – The number of branchset inside the branching level, based on 0.
  • branchset – An instance of BranchSet.
validate_filters(node, uncertainty_type, filters)[source]

Check that the filters are valid for the given uncertainty type.

Abstract method, must be overridden by subclasses.

Parameters:
  • node – etree.Element object with tag “logicTreeBranchSet”.
  • uncertainty_type – String specifying the uncertainty type. See the list in BranchSet.
  • filters – Filters dictionary.
validate_tree(tree_node, root_branchset)[source]

Check the whole parsed tree for consistency and sanity.

Can be overridden by subclasses. The base class implementation does nothing.

Parameters:
  • tree_node – etree.Element object with tag “logicTree”.
  • root_branchset – An instance of BranchSet which is about to become the root branchset for this tree.
validate_uncertainty_value(node, branchset)[source]

Check that the given value is valid to be set for one of the branchset’s branches.

Abstract method, must be overridden by subclasses.

Parameters:
  • node – etree.Element object with tag “uncertaintyModel” (the one that contains the subject value).
  • branchset – An instance of BranchSet which will have the branch with provided value attached once it’s validated.
  • value – The actual value to be checked. Type depends on branchset’s uncertainty type.
class openquake.commonlib.logictree.Branch(branch_id, weight, value)[source]

Bases: object

Branch object, represents a <logicTreeBranch /> element.

Parameters:
  • branch_id – Value of @branchID attribute.
  • weight – Decimal value of weight assigned to the branch. A text node contents of <uncertaintyWeight /> child node.
  • value – The actual uncertainty parameter value. A text node contents of <uncertaintyModel /> child node. Type depends on the branchset’s uncertainty type.
class openquake.commonlib.logictree.BranchSet(uncertainty_type, filters)[source]

Bases: object

Branchset object, represents a <logicTreeBranchSet /> element.

Parameters:
  • uncertainty_type

    String value. According to the spec one of:

    gmpeModel
    Branches contain references to different GMPEs. Values are parsed as strings and are supposed to be one of supported GMPEs. See list at GMPELogicTree.
    sourceModel
    Branches contain references to different PSHA source models. Values are treated as file names, relative to the base path.
    maxMagGRRelative
    Different values to add to Gutenberg-Richter (“GR”) maximum magnitude. Value should be interpretable as float.
    bGRRelative
    Values to add to GR “b” value. Parsed as float.
    maxMagGRAbsolute
    Values to replace GR maximum magnitude. Values expected to be lists of floats separated by space, one float for each GR MFD in a target source in order of appearance.
    abGRAbsolute
    Values to replace “a” and “b” values of GR MFD. Lists of pairs of floats, one pair for one GR MFD in a target source.
    incrementalMFDAbsolute
    Replaces an evenly discretized MFD with the values provided
    simpleFaultDipRelative
    Increases or decreases the angle of fault dip from that given in the original source model
    simpleFaultDipAbsolute
    Replaces the fault dip in the specified source(s)
    simpleFaultGeometryAbsolute
    Replaces the simple fault geometry (trace, upper seismogenic depth, lower seismogenic depth and dip) of a given source with the values provided
    complexFaultGeometryAbsolute
    Replaces the complex fault geometry edges of a given source with the values provided
    characteristicFaultGeometryAbsolute
    Replaces the complex fault geometry surface of a given source with the values provided
  • filters

    Dictionary, a set of filters to specify to which sources the uncertainty should be applied. Represented as branchset element’s attributes in xml:

    applyToSources
    The uncertainty should be applied only to specific sources. This filter is required for absolute uncertainties (also only one source can be used for those). Value should be the list of source ids. Can be used only in source model logic tree.
    applyToSourceType
    Can be used in the source model logic tree definition. Allows one to specify to which source type (area, point, simple fault, complex fault) the uncertainty applies.
    applyToTectonicRegionType
    Can be used in both the source model and GMPE logic trees. Allows one to specify to which tectonic region type (Active Shallow Crust, Stable Shallow Crust, etc.) the uncertainty applies. This filter is required for all branchsets in the GMPE logic tree.
apply_uncertainty(value, source)[source]

Apply this branchset’s uncertainty with the given value to the given source, if it passes the filters.

This method is not called for uncertainties of types “gmpeModel” and “sourceModel”.

Parameters:
  • value – The actual uncertainty value of sampled branch. Type depends on uncertainty type.
  • source – The opensha source data object.
Returns:

None, all changes are applied to MFD in place. Therefore all sources have to be reinstantiated after processing is done in order to sample the tree once again.

enumerate_paths()[source]

Generate all possible paths starting from this branch set.

Returns:Generator of two-item tuples. Each tuple contains weight of the path (calculated as a product of the weights of all path’s branches) and list of path’s Branch objects. Total sum of all paths’ weights is 1.0
filter_source(source)[source]

Apply filters to source and return True if uncertainty should be applied to it.

get_branch_by_id(branch_id)[source]

Return Branch object belonging to this branch set with id equal to branch_id.

class openquake.commonlib.logictree.BranchTuple(bset, id, uncertainty, weight, effective)

Bases: tuple

bset

Alias for field number 0

effective

Alias for field number 4

id

Alias for field number 1

uncertainty

Alias for field number 2

weight

Alias for field number 3

class openquake.commonlib.logictree.GsimLogicTree(fname, tectonic_region_types=['*'], ltnode=None)[source]

Bases: object

A GsimLogicTree instance is an iterable yielding Realization tuples with attributes value, weight and lt_path, where value is a dictionary {trt: gsim}, weight is a number in the interval 0..1 and lt_path is a tuple with the branch ids of the given realization.

Parameters:
  • fname (str) – full path of the gsim_logic_tree file
  • tectonic_region_types – a sequence of distinct tectonic region types
  • ltnode – usually None, but it can also be a openquake.commonlib.nrml.Node object describing the GSIM logic tree XML file, to avoid reparsing it
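
For instance, the realizations can be inspected with a loop like the following (a sketch assuming a GMPE logic tree file gmpe_logic_tree.xml for the Active Shallow Crust region is available):

gsim_lt = GsimLogicTree('gmpe_logic_tree.xml', ['Active Shallow Crust'])
for rlz in gsim_lt:
    print(rlz.lt_path, rlz.weight, rlz.value)  # value is a dict {trt: gsim}
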
check_imts(imts)[source]

Make sure the IMTs are recognized by all GSIMs in the logic tree

classmethod from_(gsim)[source]

Generate a trivial GsimLogicTree from a single GSIM instance.

get_gsim_by_trt(rlz, trt)[source]
Parameters:
  • rlz – a logictree Realization
  • trt – a tectonic region type string
Returns:the GSIM string associated with the given realization
get_num_branches()[source]

Return the number of effective branches for each branchset id, as a dictionary.

get_num_paths()[source]

Return the effective number of paths in the tree.

reduce(trts)[source]

Reduce the GsimLogicTree.

Parameters:trts – a subset of tectonic region types
Returns:a reduced GsimLogicTree instance
exception openquake.commonlib.logictree.InvalidLogicTree[source]

Bases: exceptions.Exception

exception openquake.commonlib.logictree.LogicTreeError(filename, message)[source]

Bases: exceptions.Exception

Base class for errors of loading, parsing and validation of logic trees.

Parameters:
  • filename – The name of the file which contains an error.
  • message – The error message.
openquake.commonlib.logictree.MAX_SINT_32 = 2147483647

Maximum value for a seed number

openquake.commonlib.logictree.MIN_SINT_32 = -2147483648

Minimum value for a seed number

exception openquake.commonlib.logictree.ParsingError(filename, message)[source]

Bases: openquake.commonlib.logictree.LogicTreeError

XML file failed to load: it is not readable or contains invalid xml.

class openquake.commonlib.logictree.Realization(value, weight, lt_path, ordinal, lt_uid)

Bases: tuple

lt_path

Alias for field number 2

lt_uid

Alias for field number 4

ordinal

Alias for field number 3

uid
value

Alias for field number 0

weight

Alias for field number 1

class openquake.commonlib.logictree.SourceModelLogicTree(*args, **kwargs)[source]

Bases: openquake.commonlib.logictree.BaseLogicTree

Source model logic tree parser.

SOURCE_TYPES = ('point', 'area', 'complexFault', 'simpleFault', 'characteristicFault')
apply_branchset(branchset_node, branchset)[source]

See superclass’ method for description and signature specification.

Parses the branchset node’s attribute @applyToBranches to apply the following branchsets to the preceding branches selectively. A branching level can have more than one branchset exactly for this: different branchsets can apply to different open ends.

Checks that branchset tries to be applied only to branches on previous branching level which do not have a child branchset yet.

collect_source_model_data(source_model)[source]

Parse the source model file and collect information about source ids, source types and tectonic region types available in it. That information is then used for validate_filters() and validate_uncertainty_value().

make_apply_uncertainties(branch_ids)[source]

Parse the path through the source model logic tree and return an “apply uncertainties” function.

Parameters:branch_ids – List of string identifiers of branches, representing the path through source model logic tree.
Returns:Function to be applied to all the sources as they get read from the database and converted to hazardlib representation. Function takes one argument, that is the hazardlib source object, and applies uncertainties to it in-place.
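
A sketch of the typical usage (here smlt is assumed to be a SourceModelLogicTree instance, src a hazardlib source object, and the branch ids are purely illustrative):

apply_uncertainties = smlt.make_apply_uncertainties(['b1', 'b12', 'b23'])
apply_uncertainties(src)  # modifies the hazardlib source in place
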
parse_filters(branchset_node, uncertainty_type, filters)[source]

See superclass’ method for description and signature specification.

Converts the “applyToSources” filter value by splitting it into a list.

parse_uncertainty_value(node, branchset)[source]

See superclass’ method for description and signature specification.

Doesn’t change the source model file name; converts other values to either a pair of floats or a single float, depending on the uncertainty type.

samples_by_lt_path()[source]

Returns a dictionary lt_path -> how many times that path was sampled

validate_branchset(branchset_node, depth, number, branchset)[source]

See superclass’ method for description and signature specification.

Checks that the following conditions are met:

  • First branching level must contain exactly one branchset, which must be of type “sourceModel”.
  • All other branchsets must not be of type “sourceModel” or “gmpeModel”.
validate_filters(branchset_node, uncertainty_type, filters)[source]

See superclass’ method for description and signature specification.

Checks that the following conditions are met:

  • “sourceModel” uncertainties can not have filters.
  • Absolute uncertainties must have only one filter – “applyToSources”, with only one source id.
  • All other uncertainty types can have either no or one filter.
  • Filter “applyToSources” must mention only source ids that exist in source models.
  • Filter “applyToTectonicRegionType” must mention only tectonic region types that exist in source models.
  • Filter “applyToSourceType” must mention only source types that exist in source models.
validate_uncertainty_value(node, branchset)[source]

See superclass’ method for description and signature specification.

Checks that the following conditions are met:

  • For uncertainty of type “sourceModel”: referenced file must exist and be readable. This is checked in collect_source_model_data() along with saving the source model information.
  • For uncertainty of type “abGRAbsolute”: value should be two float values.
  • For both absolute uncertainties: the source (only one) must be referenced in branchset’s filter “applyToSources”.
  • For all other cases: value should be a single float value.
exception openquake.commonlib.logictree.ValidationError(node, *args, **kwargs)[source]

Bases: openquake.commonlib.logictree.LogicTreeError

Logic tree file contains a logic error.

Parameters:node – XML node object that causes fail. Used to determine the affected line number.

All other constructor parameters are passed to superclass' constructor.

openquake.commonlib.logictree.get_effective_rlzs(rlzs)[source]

Group together realizations with the same unique identifier (uid) and yield the first representative of each group.

openquake.commonlib.logictree.sample(weighted_objects, num_samples, rnd)[source]

Take random samples of a sequence of weighted objects

Parameters:
  • weighted_objects – A finite sequence of objects with a .weight attribute. The weights must sum up to 1.
  • num_samples – The number of samples to return
  • rnd – Random object. Should have a method random() returning a uniformly distributed random float >= 0 and < 1.
Returns:

A subsequence of the original sequence with num_samples elements
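
A minimal sketch of the usage (WeightedObject is a hypothetical helper, not part of the package):

import random
from collections import namedtuple

WeightedObject = namedtuple('WeightedObject', 'name weight')
objects = [WeightedObject('a', 0.2), WeightedObject('b', 0.3),
           WeightedObject('c', 0.5)]  # weights sum up to 1
rnd = random.Random(42)
chosen = sample(objects, 10, rnd)  # 10 objects drawn according to the weights
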

openquake.commonlib.logictree.sample_one(branches, rnd)[source]

openquake.commonlib.node module

This module defines a Node class, together with a few conversion functions which are able to convert NRML files into hierarchical objects (DOM). That makes it easier to read and write XML from Python and vice versa. Such features are used in the command-line conversion tools. The Node class is kept intentionally similar to an Element class, however it overcomes a limitation of ElementTree: in particular a node can manage a lazy iterable of subnodes, whereas ElementTree wants to keep everything in memory. Moreover the Node class provides a convenient dot notation to access subnodes.

The Node class is instantiated with four arguments:

  1. the node tag (a mandatory string)
  2. the node attributes (a dictionary)
  3. the node value (a string or None)
  4. the subnodes (an iterable over nodes)

If a node has subnodes, its value should be None.

For instance, here is an example of instantiating a root node with two subnodes a and b:

>>> from openquake.commonlib.node import Node
>>> a = Node('a', {}, 'A1')
>>> b = Node('b', {'attrb': 'B'}, 'B1')
>>> root = Node('root', nodes=[a, b])
>>> root
<root {} None ...>

Node objects can be converted into nicely indented strings:

>>> print(root.to_str())
root
  a 'A1'
  b{attrb='B'} 'B1'

The subnodes can be retrieved with the dot notation:

>>> root.a
<a {} A1 >

The value of a node can be extracted with the ~ operator:

>>> ~root.a
'A1'

If there are multiple subnodes with the same name

>>> root.append(Node('a', {}, 'A2'))  # add another 'a' node

the dot notation will retrieve the first node.

It is possible to retrieve the other nodes from the ordinal index:

>>> root[0], root[1], root[2]
(<a {} A1 >, <b {'attrb': 'B'} B1 >, <a {} A2 >)

The list of all subnodes with a given name can be retrieved as follows:

>>> list(root.getnodes('a'))
[<a {} A1 >, <a {} A2 >]

It is also possible to delete a node given its index:

>>> del root[2]

A node is an iterable object yielding its subnodes:

>>> list(root)
[<a {} A1 >, <b {'attrb': 'B'} B1 >]

The attributes of a node can be retrieved with the square bracket notation:

>>> root.b['attrb']
'B'

It is possible to add and remove attributes freely:

>>> root.b['attr'] = 'new attr'
>>> del root.b['attr']

Node objects can be easily converted into ElementTree objects:

>>> node_to_elem(root)  
<Element 'root' at ...>

Then it is trivial to generate the XML representation of a node:

>>> from xml.etree import ElementTree
>>> print(ElementTree.tostring(node_to_elem(root)))
<root><a>A1</a><b attrb="B">B1</b></root>

Generating XML files larger than the available memory requires some care. The trick is to use a node generator, such that it is not necessary to keep the entire tree in memory. Here is an example:

>>> def gen_many_nodes(N):
...     for i in xrange(N):
...         yield Node('a', {}, 'Text for node %d' % i)
>>> lazytree = Node('lazytree', {}, nodes=gen_many_nodes(10))

The lazytree object defined here consumes no memory, because the nodes are not created at instantiation time. They are created as soon as you start iterating on the lazytree. In particular list(lazytree) will generate all of them. If your goal is to store the tree on the filesystem in XML format you should use a writing routine that converts one subnode at a time, without requiring the full list of them. The routines provided by ElementTree are no good; however, commonlib.writers provides a StreamingXMLWriter just for that purpose.

Lazy trees should not be used unless it is absolutely necessary in order to save memory; the problem is that if you use a lazy tree the slice notation will not work (the underlying generator will not accept it); moreover it will not be possible to iterate twice on the subnodes, since the generator will be exhausted. Notice that even accessing a subnode with the dot notation will advance the generator. Finally, nodes containing lazy nodes will not be pickleable.

class openquake.commonlib.node.LiteralNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.Node

Subclasses should define a non-empty dictionary of validators. Known validators:

validators = {}
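
For instance, a subclass could cast the lon and lat attributes into floats (an illustrative sketch; the validators actually used by the package live in openquake.risklib.valid):

class PointNode(LiteralNode):
    "A hypothetical LiteralNode subclass"
    validators = {'lon': float, 'lat': float}
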
class openquake.commonlib.node.MetaLiteralNode[source]

Bases: type

Metaclass adding __slots__ and extending the docstring with a note about the known validators. Moreover it checks for the attribute .validators.

class openquake.commonlib.node.Node(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: object

A class to make it easy to edit hierarchical structures with attributes, such as XML files. Node objects must be pickleable and must consume as little memory as possible. Moreover they must be easily converted from and to ElementTree objects. The advantage over ElementTree objects is that subnodes can be lazily generated and that they can be accessed with the dot notation.

append(node)[source]

Append a new subnode

attrib
getnodes(name)[source]

Return the direct subnodes with name ‘name’

lineno
nodes
tag
text
to_str(expandattrs=True, expandvals=True)[source]

Convert the node into a string, intended for testing/debugging purposes

Parameters:
  • expandattrs – print the values of the attributes if True, else print only the names
  • expandvals – print the values if True, else print only the tag names
class openquake.commonlib.node.SourceLineParser(html=0, target=None, encoding=None)[source]

Bases: xml.etree.ElementTree.XMLParser

A custom parser managing line numbers

openquake.commonlib.node.context(*args, **kwds)[source]

Context manager managing exceptions and adding the line number of the current node and the name of the current file to the error message.

Parameters:
  • fname – the current file being processed
  • node – the current node being processed
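
Here is a sketch of the typical usage pattern (fname and node are assumed to be already defined):

with context(fname, node):
    lon = float(node['lon'])  # any exception raised here is enriched with
                              # the file name and the node line number
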
openquake.commonlib.node.fromstring(text)[source]

Parse an XML string and return a tree

openquake.commonlib.node.iterparse(source, events=('end', ), remove_comments=True, **kw)[source]

Thin wrapper around ElementTree.iterparse

openquake.commonlib.node.node_copy(node, nodefactory=<class 'openquake.commonlib.node.Node'>)[source]

Make a deep copy of the node

openquake.commonlib.node.node_display(root, expandattrs=False, expandvals=False, output=<open file '<stdout>', mode 'w'>)[source]

Write an indented representation of the Node object on the output; this is intended for testing/debugging purposes.

Parameters:
  • root – a Node object
  • expandattrs (bool) – if True, the values of the attributes are also printed, not only the names
  • expandvals (bool) – if True, the values of the tags are also printed, not only the names.
  • output – stream where to write the string representation of the node
openquake.commonlib.node.node_from_dict(dic, nodefactory=<class 'openquake.commonlib.node.Node'>)[source]

Convert a (nested) dictionary with attributes tag, attrib, text, nodes into a Node object.

openquake.commonlib.node.node_from_elem(elem, nodefactory=<class 'openquake.commonlib.node.Node'>, lazy=())[source]

Convert (recursively) an ElementTree object into a Node object.

openquake.commonlib.node.node_from_ini(ini_file, nodefactory=<class 'openquake.commonlib.node.Node'>, root_name='ini')[source]

Convert a .ini file into a Node object.

Parameters:ini_file – a filename or a file like object in read mode
openquake.commonlib.node.node_from_xml(xmlfile, nodefactory=<class 'openquake.commonlib.node.Node'>)[source]

Convert a .xml file into a Node object.

Parameters:xmlfile – a file name or file object open for reading
openquake.commonlib.node.node_to_dict(node)[source]

Convert a Node object into a (nested) dictionary with attributes tag, attrib, text, nodes.

Parameters:node – a Node-compatible object
openquake.commonlib.node.node_to_elem(root)[source]

Convert (recursively) a Node object into an ElementTree object.

openquake.commonlib.node.node_to_ini(node, output=<open file '<stdout>', mode 'w'>)[source]

Convert a Node object with the right structure into a .ini file.

Params node:a Node object
Params output:a file-like object opened in write mode
openquake.commonlib.node.node_to_xml(node, output=<open file '<stdout>', mode 'w'>, nsmap=None)[source]

Convert a Node object into a pretty .xml file without keeping everything in memory. If you just want the string representation use commonlib.writers.tostring(node).

Parameters:
  • node – a Node-compatible object (ElementTree nodes are fine)
  • nsmap – if given, shorten the tags with aliases
openquake.commonlib.node.parse(source, remove_comments=True, **kw)[source]

Thin wrapper around ElementTree.parse

openquake.commonlib.node.pprint(self, stream=None, indent=1, width=80, depth=None)[source]

Pretty print the underlying literal Python object

openquake.commonlib.node.read_nodes(fname, filter_elem, nodefactory=<class 'openquake.commonlib.node.Node'>, remove_comments=True)[source]

Convert an XML file into a lazy iterator over Node objects satisfying the given specification, i.e. a function element -> boolean.

Parameters:
  • fname – file name or file object
  • filter_elem – element specification

In case of errors, add the file name to the error message.
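
For instance, the asset nodes of an exposure file could be extracted lazily as follows (a sketch; the file name is just an example):

assets = read_nodes('exposure.xml',
                    lambda elem: elem.tag.endswith('asset'))
for asset in assets:
    print(asset['id'], asset['taxonomy'])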

openquake.commonlib.node.striptag(tag)[source]

Get the short representation of a fully qualified tag

Parameters:tag (str) – a (fully qualified or not) XML tag
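
For instance (an illustrative example):

>>> striptag('{http://openquake.org/xmlns/nrml/0.4}asset')
'asset'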
openquake.commonlib.node.to_literal(self)[source]

Convert the node into a literal Python object

openquake.commonlib.nrml module

From Node objects to NRML files and vice versa

It is possible to save a Node object into a NRML file by using the function write(nodes, output) where output is a file object. If you want to make sure that the generated file is valid according to the NRML schema just open it in ‘w+’ mode: immediately after writing it will be read and validated. It is also possible to convert a NRML file into a Node object with the routine read(node, input) where input is the path name of the NRML file or a file object opened for reading. The file will be validated as soon as opened.

For instance an exposure file like the following:

<?xml version='1.0' encoding='utf-8'?>
<nrml xmlns="http://openquake.org/xmlns/nrml/0.4"
      xmlns:gml="http://www.opengis.net/gml">
  <exposureModel
      id="my_exposure_model_for_population"
      category="population"
      taxonomySource="fake population datasource">

    <description>
      Sample population
    </description>

    <assets>
      <asset id="asset_01" number="7" taxonomy="IT-PV">
          <location lon="9.15000" lat="45.16667" />
      </asset>

      <asset id="asset_02" number="7" taxonomy="IT-CE">
          <location lon="9.15333" lat="45.12200" />
      </asset>
    </assets>
  </exposureModel>
</nrml>

can be converted as follows:

>> nrml = read(<path_to_the_exposure_file.xml>)

Then subnodes and attributes can be conveniently accessed:

>> nrml.exposureModel.assets[0]['taxonomy']
'IT-PV'
>> nrml.exposureModel.assets[0]['id']
'asset_01'
>> nrml.exposureModel.assets[0].location['lon']
'9.15000'
>> nrml.exposureModel.assets[0].location['lat']
'45.16667'

The Node class provides no facility to cast strings into Python types; this is a job for the LiteralNode class which can be subclassed and supplemented by a dictionary of validators.

class openquake.commonlib.nrml.BcrNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: statistics: Choice(‘quantile’,) ratio: positivefloat pos: lon_lat lossCategory: str aalRetr: positivefloat unit: str interestRate: positivefloat lossType: Choice(‘structural’, ‘nonstructural’, ‘contents’, ‘business_interruption’, ‘occupants’) quantileValue: positivefloat aalOrig: positivefloat assetLifeExpectancy: positivefloat

validators = {'statistics': <openquake.risklib.valid.Choice object at 0x7f48359c2650>, 'ratio': <function positivefloat at 0x7f4835ac38c0>, 'pos': <function lon_lat at 0x7f4835ac3668>, 'lossCategory': <type 'str'>, 'aalRetr': <function positivefloat at 0x7f4835ac38c0>, 'unit': <type 'str'>, 'interestRate': <function positivefloat at 0x7f4835ac38c0>, 'lossType': <openquake.risklib.valid.Choice object at 0x7f48359c25d0>, 'quantileValue': <function positivefloat at 0x7f4835ac38c0>, 'aalOrig': <function positivefloat at 0x7f4835ac38c0>, 'assetLifeExpectancy': <function positivefloat at 0x7f4835ac38c0>}
class openquake.commonlib.nrml.CollapseNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: cf: asset_mean_stddev pos: lon_lat

validators = {'cf': <function asset_mean_stddev at 0x7f48359b6050>, 'pos': <function lon_lat at 0x7f4835ac3668>}
class openquake.commonlib.nrml.CurveNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: node: lon_lat_iml loss_type: Choice(‘structural’, ‘nonstructural’, ‘contents’, ‘business_interruption’, ‘occupants’) saDamping: positivefloat pos: lon_lat IMT: str losses: positivefloats gsimTreePath: <lambda> unit: str poEs: probabilities poE: probability saPeriod: positivefloat sourceModelTreePath: <lambda> IMLs: positivefloats averageLoss: positivefloat investigationTime: positivefloat stdDevLoss: positivefloat quantileValue: positivefloat

validators = {'node': <function lon_lat_iml at 0x7f4835ac36e0>, 'loss_type': <openquake.risklib.valid.Choice object at 0x7f48359c25d0>, 'saDamping': <function positivefloat at 0x7f4835ac38c0>, 'pos': <function lon_lat at 0x7f4835ac3668>, 'IMT': <type 'str'>, 'losses': <function positivefloats at 0x7f4835ac3938>, 'gsimTreePath': <function <lambda> at 0x7f48359bf6e0>, 'unit': <type 'str'>, 'poEs': <function probabilities at 0x7f4835ac3a28>, 'poE': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'saPeriod': <function positivefloat at 0x7f4835ac38c0>, 'sourceModelTreePath': <function <lambda> at 0x7f48359bf8c0>, 'IMLs': <function positivefloats at 0x7f4835ac3938>, 'averageLoss': <function positivefloat at 0x7f4835ac38c0>, 'investigationTime': <function positivefloat at 0x7f4835ac38c0>, 'stdDevLoss': <function positivefloat at 0x7f4835ac38c0>, 'quantileValue': <function positivefloat at 0x7f4835ac38c0>}
class openquake.commonlib.nrml.DamageNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: pos: lon_lat damage: damage_triple damageStates: namelist

validators = {'pos': <function lon_lat at 0x7f4835ac3668>, 'damage': <function damage_triple at 0x7f48359bf5f0>, 'damageStates': <function namelist at 0x7f4835ac32a8>}
class openquake.commonlib.nrml.DisaggNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: saDamping: positivefloat dims: positiveints IMT: str lonBinEdges: longitudes lat: latitude epsBinEdges: integers index: positiveints poE: probability magBinEdges: integers saPeriod: positivefloat lon: longitude iml: positivefloat value: positivefloat distBinEdges: integers investigationTime: positivefloat latBinEdges: latitudes type: namelist

validators = {'saDamping': <function positivefloat at 0x7f4835ac38c0>, 'dims': <function positiveints at 0x7f4835ac4488>, 'IMT': <type 'str'>, 'lonBinEdges': <function longitudes at 0x7f4835ac3500>, 'lat': <function latitude at 0x7f4835ac3488>, 'epsBinEdges': <function integers at 0x7f4835ac4410>, 'index': <function positiveints at 0x7f4835ac4488>, 'poE': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'magBinEdges': <function integers at 0x7f4835ac4410>, 'saPeriod': <function positivefloat at 0x7f4835ac38c0>, 'lon': <function longitude at 0x7f4835ac3410>, 'iml': <function positivefloat at 0x7f4835ac38c0>, 'value': <function positivefloat at 0x7f4835ac38c0>, 'distBinEdges': <function integers at 0x7f4835ac4410>, 'investigationTime': <function positivefloat at 0x7f4835ac38c0>, 'latBinEdges': <function latitudes at 0x7f4835ac3578>, 'type': <function namelist at 0x7f4835ac32a8>}
class openquake.commonlib.nrml.ExposureDataNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: retrofitted: positivefloat name: Choice(‘structural’, ‘nonstructural’, ‘contents’, ‘business_interruption’) insuranceLimit: float_or_flag lon: longitude number: compose(positivefloat,nonzero) value: positivefloat occupants: positivefloat lat: latitude deductible: float_or_flag type: Regex[^[a-zA-Z_]w*$] id: SimpleId(100, ^[w_-]+$) description: utf8_not_empty

validators = {'retrofitted': <function positivefloat at 0x7f4835ac38c0>, 'name': <openquake.risklib.valid.Choice object at 0x7f4835f6f710>, 'insuranceLimit': <function float_or_flag at 0x7f48359bf500>, 'lon': <function longitude at 0x7f4835ac3410>, 'number': <function compose(positivefloat,nonzero) at 0x7f48359bf758>, 'value': <function positivefloat at 0x7f4835ac38c0>, 'occupants': <function positivefloat at 0x7f4835ac38c0>, 'lat': <function latitude at 0x7f4835ac3488>, 'deductible': <function float_or_flag at 0x7f48359bf500>, 'type': <openquake.risklib.valid.Regex object at 0x7f4835f62d50>, 'id': <openquake.risklib.valid.SimpleId object at 0x7f4835f62fd0>, 'description': <function utf8_not_empty at 0x7f4835ac3230>}
class openquake.commonlib.nrml.FragilityNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Literal Node class used to validate fragility functions and consequence functions. Known validators: noDamageLimit: positivefloat dist: Choice(‘LN’,) description: utf8_not_empty format: ChoiceCI(‘discrete’, ‘continuous’) IML: IML lossCategory: Regex[^[a-zA-Z_]w*$] maxIML: positivefloat assetCategory: utf8 id: utf8 poEs: probabilities limitStates: namelist minIML: positivefloat imt: intensity_measure_type stddev: positivefloat poes: <lambda> type: ChoiceCI(‘lognormal’,) mean: positivefloat

validators = {'noDamageLimit': <openquake.risklib.valid.NoneOr object at 0x7f48359c2550>, 'dist': <openquake.risklib.valid.Choice object at 0x7f48359c2450>, 'description': <function utf8_not_empty at 0x7f4835ac3230>, 'format': <openquake.risklib.valid.ChoiceCI object at 0x7f48359c2410>, 'IML': <function IML at 0x7f4835ac3b18>, 'lossCategory': <openquake.risklib.valid.Regex object at 0x7f4835f62d50>, 'maxIML': <function positivefloat at 0x7f4835ac38c0>, 'assetCategory': <function utf8 at 0x7f4835ac31b8>, 'id': <function utf8 at 0x7f4835ac31b8>, 'poEs': <function probabilities at 0x7f4835ac3a28>, 'limitStates': <function namelist at 0x7f4835ac32a8>, 'minIML': <function positivefloat at 0x7f4835ac38c0>, 'imt': <function intensity_measure_type at 0x7f4835ac3b90>, 'stddev': <function positivefloat at 0x7f4835ac38c0>, 'poes': <function <lambda> at 0x7f48359bf848>, 'type': <openquake.risklib.valid.ChoiceCI object at 0x7f48359c2490>, 'mean': <function positivefloat at 0x7f4835ac38c0>}
class openquake.commonlib.nrml.GmfNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Class used to convert nodes such as:

<gmf IMT="PGA" ruptureId="scenario-0000000001" >
   <node gmv="0.365662734506" lat="0.0" lon="0.0"/>
   <node gmv="0.256181251586" lat="0.1" lon="0.0"/>
   <node gmv="0.110685275111" lat="0.2" lon="0.0"/>
</gmf>

into LiteralNode objects. Known validators: lat: latitude lon: longitude gmv: positivefloat

validators = {'lat': <function latitude at 0x7f4835ac3488>, 'lon': <function longitude at 0x7f4835ac3410>, 'gmv': <function positivefloat at 0x7f4835ac38c0>}
class openquake.commonlib.nrml.NRMLFile(dest, mode='r')[source]

Bases: object

Context-managed output object which accepts either a path or a file-like object.

Behaves like a file.

class openquake.commonlib.nrml.UHSNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Known validators: investigationTime: positivefloat poE: probability pos: lon_lat periods: positivefloats IMLs: positivefloats

validators = {'investigationTime': <function positivefloat at 0x7f4835ac38c0>, 'poE': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'pos': <function lon_lat at 0x7f4835ac3668>, 'periods': <function positivefloats at 0x7f4835ac3938>, 'IMLs': <function positivefloats at 0x7f4835ac3938>}
class openquake.commonlib.nrml.ValidNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

A subclass of LiteralNode to be used when parsing sources and ruptures from NRML files. Known validators: discretization: compose(positivefloat,nonzero) weight: probability probs_occur: pmf lowerSeismoDepth: positivefloat posList: posList magnitudes: positivefloats characteristicRate: positivefloat id: SimpleId(100, ^[w_-]+$) probability: probability minMag: positivefloat tectonicRegion: str totalMomentRate: positivefloat lon: longitude downDip: probability strike: FloatRange[0:360] magScaleRel: mag_scale_rel aValue: float rake: FloatRange[-180:180] pos: lon_lat occurRates: positivefloats lat: latitude hypoDepth: probability_depth alongStrike: probability binWidth: positivefloat characteristicMag: positivefloat ruptAspectRatio: positivefloat depth: positivefloat magnitude: positivefloat maxMag: positivefloat bValue: positivefloat dip: FloatRange[0:90] upperSeismoDepth: positivefloat

validators = {'discretization': <function compose(positivefloat,nonzero) at 0x7f48359bf668>, 'weight': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'probs_occur': <function pmf at 0x7f4835ac3f50>, 'lowerSeismoDepth': <function positivefloat at 0x7f4835ac38c0>, 'posList': <function posList at 0x7f4835ac41b8>, 'magnitudes': <function positivefloats at 0x7f4835ac3938>, 'characteristicRate': <function positivefloat at 0x7f4835ac38c0>, 'id': <openquake.risklib.valid.SimpleId object at 0x7f4835f62f50>, 'probability': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'minMag': <function positivefloat at 0x7f4835ac38c0>, 'tectonicRegion': <type 'str'>, 'totalMomentRate': <function positivefloat at 0x7f4835ac38c0>, 'lon': <function longitude at 0x7f4835ac3410>, 'downDip': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'strike': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f150>, 'magScaleRel': <function mag_scale_rel at 0x7f4835ac3ed8>, 'aValue': <type 'float'>, 'rake': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f1d0>, 'pos': <function lon_lat at 0x7f4835ac3668>, 'occurRates': <function positivefloats at 0x7f4835ac3938>, 'lat': <function latitude at 0x7f4835ac3488>, 'hypoDepth': <function probability_depth at 0x7f4835ac4320>, 'alongStrike': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'binWidth': <function positivefloat at 0x7f4835ac38c0>, 'characteristicMag': <function positivefloat at 0x7f4835ac38c0>, 'ruptAspectRatio': <function positivefloat at 0x7f4835ac38c0>, 'depth': <function positivefloat at 0x7f4835ac38c0>, 'magnitude': <function positivefloat at 0x7f4835ac38c0>, 'maxMag': <function positivefloat at 0x7f4835ac38c0>, 'bValue': <function positivefloat at 0x7f4835ac38c0>, 'dip': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f190>, 'upperSeismoDepth': <function positivefloat at 0x7f4835ac38c0>}
class openquake.commonlib.nrml.VulnerabilityNode(fulltag, attrib=None, text=None, nodes=None, lineno=None)[source]

Bases: openquake.commonlib.node.LiteralNode

Literal Node class used to validate discrete vulnerability functions Known validators: coefficientsVariation: positivefloats covLRs: positivefloats dist: Choice(‘LN’, ‘BT’, ‘PM’) lossRatio: positivefloats vulnerabilitySetID: str IML: IML vulnerabilityFunctionID: str lossCategory: utf8 lr: probability imt: intensity_measure_type probabilisticDistribution: Choice(‘LN’, ‘BT’) meanLRs: positivefloats imls: <lambda> assetCategory: str

validators = {'coefficientsVariation': <function positivefloats at 0x7f4835ac3938>, 'covLRs': <function positivefloats at 0x7f4835ac3938>, 'dist': <openquake.risklib.valid.Choice object at 0x7f48359c2350>, 'lossRatio': <function positivefloats at 0x7f4835ac3938>, 'vulnerabilitySetID': <type 'str'>, 'IML': <function IML at 0x7f4835ac3b18>, 'vulnerabilityFunctionID': <type 'str'>, 'lossCategory': <function utf8 at 0x7f4835ac31b8>, 'lr': <openquake.risklib.valid.FloatRange object at 0x7f4835f6f110>, 'imt': <function intensity_measure_type at 0x7f4835ac3b90>, 'probabilisticDistribution': <openquake.risklib.valid.Choice object at 0x7f48359c2390>, 'meanLRs': <function positivefloats at 0x7f4835ac3938>, 'imls': <function <lambda> at 0x7f48359bf7d0>, 'assetCategory': <type 'str'>}
openquake.commonlib.nrml.asset_mean_stddev(value, assetRef, mean, stdDev)[source]
openquake.commonlib.nrml.build_exposure(node, fname)[source]
openquake.commonlib.nrml.convert_fragility_model_04(node, fname, fmcounter=count(1))[source]
Parameters:
  • node – an openquake.commonlib.node.LiteralNode in NRML 0.4
  • fname – path of the fragility file
Returns:

an openquake.commonlib.node.LiteralNode in NRML 0.5

openquake.commonlib.nrml.damage_triple(value, ds, mean, stddev)[source]
openquake.commonlib.nrml.ffconvert(fname, limit_states, ff, min_iml=1e-10)[source]

Convert a fragility function into a numpy array plus a bunch of attributes.

Parameters:
  • fname – path to the fragility model file
  • limit_states – expected limit states
  • ff – fragility function node
Returns:

a pair (array, dictionary)

openquake.commonlib.nrml.float_or_flag(value, isAbsolute=None)[source]

Validate the attributes/tags insuranceLimit and deductible

openquake.commonlib.nrml.get_consequence_model(node, fname)[source]
openquake.commonlib.nrml.get_fragility_model(node, fname)[source]
Parameters:
  • node – a fragilityModel node
  • fname – path of the fragility file
Returns:

a dictionary imt, taxonomy -> fragility function list

openquake.commonlib.nrml.get_fragility_model_04(fmodel, fname)[source]
Parameters:
  • fmodel – a fragilityModel node
  • fname – path of the fragility file
Returns:

an openquake.risklib.scientific.FragilityModel instance

openquake.commonlib.nrml.get_tag_version(nrml_node)[source]

Extract from a node of kind NRML the tag and the version. For instance from ‘{http://openquake.org/xmlns/nrml/0.4}fragilityModel’ one gets the pair (‘fragilityModel’, ‘nrml/0.4’).
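
The same example in doctest form:

>>> get_tag_version('{http://openquake.org/xmlns/nrml/0.4}fragilityModel')
('fragilityModel', 'nrml/0.4')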

openquake.commonlib.nrml.get_vulnerability_functions_04(node, fname)[source]
Parameters:
  • node – a vulnerabilityModel node
  • fname – path to the vulnerability file
Returns:

a dictionary imt, taxonomy -> vulnerability function

openquake.commonlib.nrml.get_vulnerability_functions_05(node, fname)[source]
Parameters:
  • node – a vulnerabilityModel node
  • fname – path of the vulnerability file
Returns:

a dictionary imt, taxonomy -> vulnerability function

openquake.commonlib.nrml.parse(fname, *args)[source]

Parse a NRML file and return an associated Python object. It works by calling nrml.read() and build() in sequence.

openquake.commonlib.nrml.read(source, chatty=True)[source]

Convert a NRML file into a validated LiteralNode object. Keeps the entire tree in memory.

Parameters:source – a file name or file object open for reading
openquake.commonlib.nrml.read_lazy(source, lazytags)[source]

Convert a NRML file into a validated LiteralNode object. The tree is lazy, i.e. you access nodes by iterating on them.

Parameters:
  • source – a file name or file object open for reading
  • lazytags – the names of the nodes whose subnodes must be read lazily
Returns:

a list of nodes; some of them will contain lazy subnodes

openquake.commonlib.nrml.write(nodes, output=<open file '<stdout>', mode 'w'>, fmt='%10.7E', gml=True)[source]

Convert nodes into a NRML file. output must be a file object open in write mode. If you want to perform a consistency check, open it in read-write mode, then it will be read after creation and validated.

Params nodes:an iterable over Node objects
Params output:a file-like object in write or read-write mode
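
A minimal calling sketch, assuming root is a node tree parsed or built elsewhere (the output path is illustrative); opening the destination in 'w+' mode enables the consistency check described above:

with open('/tmp/example.xml', 'w+') as out:
    write([root], out)  # root: a hypothetical Node/LiteralNode tree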

openquake.commonlib.oqvalidation module

class openquake.commonlib.oqvalidation.OqParam(**names_vals)[source]

Bases: openquake.risklib.valid.ParamSet

all_cost_types

Return the cost types of the computation (including occupants if it is there) in order.

area_source_discretization

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
asset_correlation

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
asset_hazard_distance

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
asset_life_expectancy

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
asset_loss_table

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
avg_losses

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
base_path

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
calculation_mode

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
check_gsims(gsims)[source]
Parameters:gsims – a sequence of GSIM instances
check_uniform_hazard_spectra()[source]
compare_with_classical

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
complex_fault_mesh_spacing

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
concurrent_tasks

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
conditional_loss_poes

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
continuous_fragility_discretization

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
coordinate_bin_width

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
description

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
distance_bin_width

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
export_dir

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
export_multi_curves

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
exports

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
file_type
filter_sources

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
ground_motion_correlation_model

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
ground_motion_correlation_params

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
ground_motion_fields

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
gsim

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
hazard_calculation_id

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
hazard_curves_from_gmfs

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
hazard_maps

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
hazard_output_id

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
hypocenter

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
ignore_missing_costs

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
imtls

Returns an OrderedDict with the risk intensity measure types and levels, if given, or the hazard ones.

individual_curves

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
inputs

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
insured_losses

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
intensity_measure_types

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
intensity_measure_types_and_levels

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
interest_rate

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
investigation_time

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
is_valid_complex_fault_mesh_spacing()[source]

The complex_fault_mesh_spacing parameter can be None only if rupture_mesh_spacing is set; in that case it is set equal to the rupture_mesh_spacing.

is_valid_export_dir()[source]

The export_dir parameter must refer to a directory, and the user must have the permission to write on it.

is_valid_geometry()[source]

It is possible to infer the geometry only if exactly one of sites, sites_csv, hazard_curves_csv, gmfs_csv, region and exposure_file is set. You did set more than one, or nothing.

is_valid_hazard_curves()[source]

You must set hazard_curves_from_gmfs if mean_hazard_curves or quantile_hazard_curves are set.

is_valid_inputs()[source]

Invalid calculation_mode=”{calculation_mode}” or missing fragility_file/vulnerability_file in the .ini file.

is_valid_intensity_measure_levels()[source]

In order to compute hazard curves, intensity_measure_types_and_levels must be set or extracted from the risk models.

is_valid_intensity_measure_types()[source]

If the IMTs and levels are extracted from the risk models, they must not be set directly. Moreover, if intensity_measure_types_and_levels is set directly, intensity_measure_types must not be set.

is_valid_maximum_distance()[source]

Invalid maximum_distance={maximum_distance}: {error}

is_valid_poes()[source]

When computing hazard maps and/or uniform hazard spectra, the poes list must be non-empty.

is_valid_region()[source]

If there is a region, a region_grid_spacing must be given.

is_valid_sites_disagg()[source]

The option sites_disagg (when given) requires specific_assets to be set.

is_valid_specific_assets()[source]

Read the specific assets from the parameters specific_assets or specific_assets_csv, if present. You cannot have both. The concept is meaningful only for risk calculators.

is_valid_truncation_level_disaggregation()[source]

Truncation level must be set for disaggregation calculations

loss_curve_resolution

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
loss_dt(dtype=<type 'numpy.float32'>)[source]

Return a composite dtype based on the loss types, including occupants

loss_ratios

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
lrem_steps_per_interval

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
mag_bin_width

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
master_seed

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
maximum_distance

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
mean_hazard_curves

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
minimum_intensity

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
no_imls()[source]

Return True if there are no intensity measure levels

num_epsilon_bins

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
number_of_ground_motion_fields

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
number_of_logic_tree_samples

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
poes

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
poes_disagg

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
quantile_hazard_curves

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
quantile_loss_curves

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
random_seed

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
reference_backarc

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
reference_depth_to_1pt0km_per_sec

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
reference_depth_to_2pt5km_per_sec

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
reference_vs30_type

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
reference_vs30_value

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
region

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
region_constraint

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
region_grid_spacing

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
risk_files
risk_imtls

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
risk_investigation_time

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
rupture_mesh_spacing

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
ses_per_logic_tree_path

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
ses_ratio

The ratio

risk_investigation_time / investigation_time / ses_per_logic_tree_path

set_risk_imtls(risk_models)[source]
Parameters:risk_models – a dictionary taxonomy -> loss_type -> risk_function

Set the attribute risk_imtls.

siteparam = {'backarc': 'reference_backarc', 'z2pt5': 'reference_depth_to_2pt5km_per_sec', 'vs30measured': 'reference_vs30_type', 'vs30': 'reference_vs30_value', 'z1pt0': 'reference_depth_to_1pt0km_per_sec'}
sites

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
sites_disagg

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
sites_per_tile

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
specific_assets

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
steps_per_interval

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
taxonomies_from_model

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
time_event

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
truncation_level

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
tses

Return the total time as investigation_time * ses_per_logic_tree_path * (number_of_logic_tree_samples or 1)
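
For instance, with investigation_time=50, ses_per_logic_tree_path=10 and number_of_logic_tree_samples=2 (illustrative values), tses = 50 * 10 * 2 = 1000 years.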

uniform_hazard_spectra

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
width_of_mfd_bin

A descriptor for validated parameters with a default, to be used as attributes in ParamSet objects.

Parameters:
  • validator – the validator
  • default – the default value
openquake.commonlib.oqvalidation.fix_maximum_distance(max_dist, trts)[source]

Make sure the dictionary maximum_distance (provided by the user in the job.ini file) is filled for all tectonic region types and has no key named ‘default’.
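
For illustration only (the exact semantics are inferred from the description above), a possible input/output pair:

max_dist = {'default': 200, 'Active Shallow Crust': 100}
trts = ['Active Shallow Crust', 'Stable Shallow Crust']
# after fix_maximum_distance(max_dist, trts) one would expect something like
# {'Active Shallow Crust': 100, 'Stable Shallow Crust': 200}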

openquake.commonlib.oqvalidation.getdefault(dic_with_default, key)[source]
Parameters:
  • dic_with_default – a dictionary with a ‘default’ key
  • key – a key that may be present in the dictionary or not
Returns:

the value associated to the key, or to ‘default’
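
A minimal sketch equivalent to the documented behaviour:

def getdefault(dic_with_default, key):
    # return the value for key, falling back to the 'default' entry
    try:
        return dic_with_default[key]
    except KeyError:
        return dic_with_default['default']

getdefault({'default': 200, 'Volcanic': 100}, 'Volcanic')    # -> 100
getdefault({'default': 200, 'Volcanic': 100}, 'Subduction')  # -> 200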

openquake.commonlib.parallel module

TODO: write documentation.

class openquake.commonlib.parallel.NoFlush(monitor, taskname)[source]

Bases: object

class openquake.commonlib.parallel.Pickled(obj)[source]

Bases: object

A utility to manually pickle/unpickle objects. The reason is that celery does not use the HIGHEST_PROTOCOL, so relying on celery is slower. Moreover, Pickled instances have a nice string representation and a length giving the size of the pickled bytestring.

Parameters:obj – the object to pickle
unpickle()[source]

Unpickle the underlying object
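
A minimal usage sketch (the payload is illustrative):

pkl = Pickled({'key': 'value'})  # pickled with the highest protocol
len(pkl)                         # size in bytes of the pickled bytestring
pkl.unpickle()                   # -> {'key': 'value'}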

class openquake.commonlib.parallel.TaskManager(oqtask, name=None)[source]

Bases: object

A manager to submit several tasks of the same type. The usage is:

tm = TaskManager(do_something, logging.info)
tm.send(arg1, arg2)
tm.send(arg3, arg4)
print tm.reduce()

Progress report is built-in.

aggregate_result_set(agg, acc)[source]

Loop on a set of results and update the accumulator by using the aggregation function.

Parameters:
  • agg – the aggregation function, (acc, val) -> new acc
  • acc – the initial value of the accumulator
Returns:

the final value of the accumulator

classmethod apply_reduce(task, task_args, agg=<built-in function add>, acc=None, concurrent_tasks=4, weight=<function <lambda>>, key=<function <lambda>>, name=None)[source]

Apply a task to a tuple of the form (sequence, *other_args) by first splitting the sequence in chunks, according to the weight of the elements and possibly to a key (see openquake.baselib.general.split_in_blocks). Then reduce the results with an aggregation function. The chunks generated internally can be seen directly (useful for debugging purposes) by looking at the attribute ._chunks, right after the apply_reduce function has been called; a calling sketch is given after the parameter list below.

Parameters:
  • task – a task to run in parallel
  • task_args – the arguments to be passed to the task function
  • agg – the aggregation function
  • acc – initial value of the accumulator (default empty AccumDict)
  • concurrent_tasks – hint about how many tasks to generate
  • weight – function to extract the weight of an item in arg0
  • key – function to extract the kind of an item in arg0
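
A minimal calling sketch, assuming a hypothetical task count_items whose first argument is the sequence to split and whose last argument is a monitor object (all names here are illustrative):

from operator import add

def count_items(items, monitor):  # hypothetical task
    return len(items)

# total = TaskManager.apply_reduce(count_items, (items, monitor),
#                                  agg=add, acc=0, concurrent_tasks=4,
#                                  weight=lambda item: 1)
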
executor = <concurrent.futures.process.ProcessPoolExecutor object>
static progress(msg, *args, **kwargs)

Log a message with severity ‘INFO’ on the root logger.

reduce(agg=<built-in function add>, acc=None)[source]

Loop on a set of results and update the accumulator by using the aggregation function.

Parameters:
  • agg – the aggregation function, (acc, val) -> new acc
  • acc – the initial value of the accumulator
Returns:

the final value of the accumulator

classmethod restart()[source]
classmethod starmap(task, task_args, name=None)[source]

Spawn a bunch of tasks with the given list of arguments

Returns:a TaskManager object with a .result method.
submit(*args)[source]

Submit a function with the given arguments to the process pool and add a Future to the list .results. If the variable OQ_DISTRIBUTE is set, the function is run in process and the result is returned.

task_ids = []
wait()[source]

Wait until all the tasks terminate. Discard the results.

Returns:the total number of tasks that were spawned
openquake.commonlib.parallel.check_mem_usage(monitor=<Monitor dummy>, soft_percent=90, hard_percent=100)[source]

Display a warning if we are running out of memory

Parameters:mem_percent (int) – the memory limit as a percentage
openquake.commonlib.parallel.do_not_aggregate(acc, value)[source]

Do-nothing aggregation function; use it in openquake.commonlib.parallel.apply_reduce calls when no aggregation is required.

Parameters:
  • acc – the accumulator
  • value – the value to accumulate
Returns:

the accumulator unchanged
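
An equivalent one-liner, shown for clarity:

def do_not_aggregate(acc, value):
    return acc  # the value is intentionally ignored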

openquake.commonlib.parallel.get_pickled_sizes(obj)[source]

Return the pickled sizes of an object and its direct attributes, ordered by decreasing size. Here is an example:

>> total_size, partial_sizes = get_pickled_sizes(Monitor(''))
>> total_size
345
>> partial_sizes
[('_procs', 214), ('exc', 4), ('mem', 4), ('start_time', 4), ('_start_time', 4), ('duration', 4)]

Notice that the sizes depend on the operating system and the machine.

openquake.commonlib.parallel.litetask(func)

Add monitoring support to the decorated function. The last argument must be a monitor object.

openquake.commonlib.parallel.litetask_futures(func)[source]

Add monitoring support to the decorated function. The last argument must be a monitor object.

openquake.commonlib.parallel.log_percent_gen(taskname, todo, progress)[source]

Generator factory. Each time the generator object is advanced, it logs a message if the percentage is bigger than the last one and yields the number of calls done at the current iteration.

Parameters:
  • taskname (str) – the name of the task
  • todo (int) – the number of times the generator object will be called
  • progress – a logging function for the progress report
openquake.commonlib.parallel.no_distribute()[source]

True if the variable OQ_DISTRIBUTE is “no”

openquake.commonlib.parallel.oq_distribute()[source]

Return the current value of the variable OQ_DISTRIBUTE; if undefined, return ‘futures’.
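
A sketch of the behaviour of the two helpers above, based only on their descriptions:

import os

def oq_distribute():
    # current value of OQ_DISTRIBUTE, defaulting to 'futures'
    return os.environ.get('OQ_DISTRIBUTE', 'futures')

def no_distribute():
    # True if the variable OQ_DISTRIBUTE is 'no'
    return oq_distribute() == 'no'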

openquake.commonlib.parallel.pickle_sequence(objects)[source]

Convert an iterable of objects into a list of pickled objects. If the iterable contains copies, the pickling will be done only once. If the iterable contains objects already pickled, they will not be pickled again.

Parameters:objects – a sequence of objects to pickle
openquake.commonlib.parallel.rec_delattr(mon, name)[source]

Delete attribute from a monitor recursively

openquake.commonlib.parallel.safely_call(func, args, pickle=False)[source]

Call the given function with the given arguments safely, i.e. by trapping the exceptions. Return a pair (result, exc_type) where exc_type is None if no exceptions occur, otherwise it is the exception class and the result is a string containing error message and traceback.

Parameters:
  • func – the function to call
  • args – the arguments
  • pickle – if set, the input arguments are unpickled and the return value is pickled; otherwise they are left unchanged
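
For example (the error message in the second case depends on the traceback):

from openquake.commonlib.parallel import safely_call

res, etype = safely_call(int, ('42',))  # -> (42, None)
res, etype = safely_call(int, ('xx',))  # etype is ValueError; res contains the message and traceback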

openquake.commonlib.readinput module

exception openquake.commonlib.readinput.DuplicatedID[source]

Bases: exceptions.Exception

Raised when two assets with the same ID are found in an exposure model

exception openquake.commonlib.readinput.DuplicatedPoint[source]

Bases: exceptions.Exception

Raised when reading a CSV file with duplicated (lon, lat) pairs

class openquake.commonlib.readinput.Exposure(id, category, description, cost_types, time_events, insurance_limit_is_absolute, deductible_is_absolute, area, assets, taxonomies, asset_refs)

Bases: tuple

area

Alias for field number 7

asset_refs

Alias for field number 10

assets

Alias for field number 8

category

Alias for field number 1

cost_types

Alias for field number 3

deductible_is_absolute

Alias for field number 6

description

Alias for field number 2

id

Alias for field number 0

insurance_limit_is_absolute

Alias for field number 5

taxonomies

Alias for field number 9

time_events

Alias for field number 4

openquake.commonlib.readinput.collect_files(dirpath, cond=<function <lambda>>)[source]

Recursively collect the files contained inside dirpath.

Parameters:
  • dirpath – path to a readable directory
  • cond – condition on the path to collect the file
openquake.commonlib.readinput.extract_from_zip(path, candidates)[source]

Given a zip archive and a list of candidate file names to look for, unzip the archive into a temporary directory and return the full path of the file. Raise an IOError if the file cannot be found within the archive.

Parameters:
  • path – pathname of the archive
  • candidates – list of names to search for
openquake.commonlib.readinput.get_composite_source_model(oqparam, in_memory=True)[source]

Parse the XML and build a complete composite source model in memory.

Parameters:
openquake.commonlib.readinput.get_correl_model(oqparam)[source]

Return a correlation object. See openquake.hazardlib.correlation for more info.

openquake.commonlib.readinput.get_exposure(oqparam)[source]

Read the full exposure in memory and build a list of openquake.risklib.riskmodels.Asset instances. If you don’t want to keep everything in memory, use get_exposure_lazy instead (for experts only).

Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
Returns:an Exposure instance
openquake.commonlib.readinput.get_exposure_lazy(fname, ok_cost_types)[source]
Parameters:
  • fname – path of the XML file containing the exposure
  • ok_cost_types – a set of cost types (as strings)
Returns:

a pair (Exposure instance, list of asset nodes)

openquake.commonlib.readinput.get_gmfs(oqparam)[source]
Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
Returns:sitecol, etags, gmf array
openquake.commonlib.readinput.get_gmfs_from_txt(oqparam, fname)[source]
Parameters:
Returns:

a composite array of shape (N, R) read from a CSV file with format etag indices [gmv1 ... gmvN] * num_imts

openquake.commonlib.readinput.get_gsim_lt(oqparam, trts=['*'])[source]
Parameters:
Returns:

a GsimLogicTree instance obtained by filtering on the provided tectonic region types.

openquake.commonlib.readinput.get_gsims(oqparam)[source]

Return an ordered list of GSIM instances from the gsim name in the configuration file or from the gsim logic tree file.

Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
openquake.commonlib.readinput.get_hcurves(oqparam)[source]
Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
Returns:sitecol, imtls, curve array
openquake.commonlib.readinput.get_hcurves_from_csv(oqparam, fname)[source]
Parameters:
Returns:

the site collection and the hazard curves read from the CSV file

openquake.commonlib.readinput.get_hcurves_from_nrml(oqparam, fname)[source]
Parameters:
Returns:

sitecol, curve array

openquake.commonlib.readinput.get_imts(oqparam)[source]

Return a sorted list of IMTs as hazardlib objects

openquake.commonlib.readinput.get_job_info(oqparam, source_models, sitecol)[source]
Parameters:
Returns:

a dictionary with same parameters of the computation, in particular the input and output weights

openquake.commonlib.readinput.get_mesh(oqparam)[source]

Extract the mesh of points to compute from the sites, the sites_csv, or the region.

Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
openquake.commonlib.readinput.get_mesh_csvdata(csvfile, imts, num_values, validvalues)[source]

Read CSV data in the format IMT lon lat value1 ... valueN.

Parameters:
  • csvfile – a file or file-like object with the CSV data
  • imts – a list of intensity measure types
  • num_values – dictionary with the number of expected values per IMT
  • validvalues – validation function for the values
Returns:

the mesh of points and the data as a dictionary imt -> array of curves for each site
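
A hedged sketch of the expected CSV layout (two IMTs, two sites, two values per IMT; all numbers are illustrative), using an in-memory file; valid.positivefloats stands here for any suitable validation function:

from io import StringIO

csvfile = StringIO(u"PGA 10.0 45.0 0.10 0.05\n"
                   u"PGA 10.5 45.0 0.20 0.10\n"
                   u"PGV 10.0 45.0 0.30 0.15\n"
                   u"PGV 10.5 45.0 0.40 0.20\n")
# mesh, data = get_mesh_csvdata(csvfile, ['PGA', 'PGV'],
#                               {'PGA': 2, 'PGV': 2}, valid.positivefloats)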

openquake.commonlib.readinput.get_mesh_hcurves(oqparam)[source]

Read CSV data in the format lon lat, v1-vN, w1-wN, ....

Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
Returns:the mesh of points and the data as a dictionary imt -> array of curves for each site
openquake.commonlib.readinput.get_oqparam(job_ini, pkg=None, calculators=None, hc_id=None)[source]

Parse a dictionary of parameters from an INI-style config file.

Parameters:
  • job_ini – Path to configuration file/archive or dictionary of parameters
  • pkg – Python package where to find the configuration file (optional)
  • calculators – Sequence of calculator names (optional) used to restrict the valid choices for calculation_mode
  • hc_id – Not None only when called from a post calculation
Returns:

An openquake.commonlib.oqvalidation.OqParam instance containing the validated and cast parameters/values parsed from the job.ini file, as well as a subdictionary ‘inputs’ containing absolute paths to all of the files referenced in the job.ini, keyed by the parameter name.

openquake.commonlib.readinput.get_params(job_inis)[source]

Parse one or more INI-style config files.

Parameters:job_inis – List of configuration files (or list containing a single zip archive)
Returns:A dictionary of parameters
openquake.commonlib.readinput.get_risk_model(oqparam, rmdict)[source]
Parameters:
openquake.commonlib.readinput.get_rupture(oqparam)[source]

Returns a hazardlib rupture by reading the rupture_model file.

Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
openquake.commonlib.readinput.get_scenario_from_nrml(oqparam, fname)[source]
Parameters:
Returns:

a triple (sitecol, etags, gmf array)

openquake.commonlib.readinput.get_site_collection(oqparam, mesh=None, site_model_params=None)[source]

Returns a SiteCollection instance by looking at the points and the site model defined by the configuration parameters.

Parameters:
  • oqparam – an openquake.commonlib.oqvalidation.OqParam instance
  • mesh – a mesh of hazardlib points; if None the mesh is determined by invoking get_mesh
  • site_model_params – object with a method .get_closest returning the closest site model parameters
openquake.commonlib.readinput.get_site_model(oqparam)[source]

Convert the NRML file into an iterator over 6-tuples of the form (z1pt0, z2pt5, measured, vs30, lon, lat)

Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
openquake.commonlib.readinput.get_sitecol_assets(oqparam, exposure)[source]
Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
Returns:two sequences of the same length: the site collection and an array with the assets for each site, collected by taxonomy
openquake.commonlib.readinput.get_source_model_lt(oqparam)[source]
Parameters:oqparam – an openquake.commonlib.oqvalidation.OqParam instance
Returns:a openquake.commonlib.logictree.SourceModelLogicTree instance
openquake.commonlib.readinput.get_source_models(oqparam, gsim_lt, source_model_lt, in_memory=True)[source]

Build all the source models generated by the logic tree.

Parameters:
Returns:

an iterator over openquake.commonlib.source.SourceModel tuples

openquake.commonlib.readinput.possibly_gunzip(fname)[source]

A file can be gzipped to save space (this happens in the Debian package); in that case, gunzip it.

Parameters:fname – a file name (not zipped)
openquake.commonlib.readinput.sitecol_from_coords(oqparam, coords)[source]

Return a SiteCollection instance from an ordered set of coordinates

openquake.commonlib.reportwriter module

Utilities to build a report writer generating a .rst report for a calculation

class openquake.commonlib.reportwriter.ReportWriter(dstore)[source]

Bases: object

A particularly smart view over the datastore

add(name, obj=None)[source]

Add the view named name to the report text

make_report()[source]

Build the report and return a reStructuredText string

save(fname)[source]

Save the report

title = {'short_source_info': 'Slowest sources', 'inputs': 'Input files', 'avglosses_data_transfer': 'Estimated data transfer for the avglosses', 'csm_info': 'Composite source model', 'exposure_info': 'Exposure model', 'times_by_source_class': 'Computation times by source typology', 'rlzs_assoc': 'Realizations per (TRT, GSIM)', 'job_info': 'Informational data', 'task_info': 'Information about the tasks', 'params': 'Parameters', 'ruptures_events': 'Specific information for event based', 'performance': 'Slowest operations', 'required_params_per_trt': 'Required parameters per tectonic region type', 'ruptures_per_trt': 'Number of ruptures per tectonic region type', 'biggest_ebr_gmf': 'Maximum memory allocated for the GMFs'}
openquake.commonlib.reportwriter.build_report(job_ini, output_dir=None)[source]

Write a report.csv file with information about the calculation without running it

Parameters:
  • job_ini – full pathname of the job.ini file
  • output_dir – the directory where the report is written (default the input directory)
openquake.commonlib.reportwriter.indent(text)[source]

openquake.commonlib.risk_parsers module

Module containing parsers for risk input artifacts.

class openquake.commonlib.risk_parsers.AssetData(exposure_metadata, site, asset_ref, taxonomy, area, number, costs, occupancy)

Bases: tuple

area

Alias for field number 4

asset_ref

Alias for field number 2

costs

Alias for field number 6

exposure_metadata

Alias for field number 0

number

Alias for field number 5

occupancy

Alias for field number 7

site

Alias for field number 1

taxonomy

Alias for field number 3

class openquake.commonlib.risk_parsers.Conversions(cost_types, area_type, area_unit, deductible_is_absolute, insurance_limit_is_absolute)[source]

Bases: object

class openquake.commonlib.risk_parsers.Cost(cost_type, value, retrofitted, deductible, limit)

Bases: tuple

cost_type

Alias for field number 0

deductible

Alias for field number 3

limit

Alias for field number 4

retrofitted

Alias for field number 2

value

Alias for field number 1

class openquake.commonlib.risk_parsers.CostType(name, conversion_type, unit, retrofitted_type, retrofitted_unit)

Bases: tuple

conversion_type

Alias for field number 1

name

Alias for field number 0

retrofitted_type

Alias for field number 3

retrofitted_unit

Alias for field number 4

unit

Alias for field number 2

class openquake.commonlib.risk_parsers.ExposureMetadata(exposure_id, taxonomy_source, asset_category, description, conversions)

Bases: tuple

asset_category

Alias for field number 2

conversions

Alias for field number 4

description

Alias for field number 3

exposure_id

Alias for field number 0

taxonomy_source

Alias for field number 1

class openquake.commonlib.risk_parsers.ExposureModelParser(source)[source]

Bases: object

Exposure model parser. This class is implemented as a generator.

For each asset element in the parsed document, it yields an AssetData instance

Parameters:source – Filename or file-like object containing the XML data.
class openquake.commonlib.risk_parsers.Occupancy(occupants, period)

Bases: tuple

occupants

Alias for field number 0

period

Alias for field number 1

class openquake.commonlib.risk_parsers.Site(longitude, latitude)

Bases: tuple

latitude

Alias for field number 1

longitude

Alias for field number 0

openquake.commonlib.risk_writers module

Module containing writers for risk output artifacts.

class openquake.commonlib.risk_writers.AggregateLossCurveXMLWriter(dest, investigation_time, loss_type, source_model_tree_path=None, gsim_tree_path=None, statistics=None, quantile_value=None, unit=None, poe=None, risk_investigation_time=None)[source]

Bases: object

Parameters:
  • dest – File path (including filename) or file-like objects for results to be saved to.
  • investigation_time (float) – Investigation time (also known as Time Span) defined in the calculation which produced these results (in years).
  • loss_type (str) – Loss type used in risk model input for the calculation producing this output (examples: structural, non-structural, business-interruption, occupants)
  • source_model_tree_path (str) – Id of the source model tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • gsim_tree_path (str) – Id of the gsim (ground shaking intensity model) tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • unit (str) – Attribute describing how the value of the assets has been measured.
  • statistics (str) – mean or quantile. When serializing loss curves produced from statistical hazard inputs, it describes the type of statistic used.
  • quantile_value (float) – When serializing loss curves produced from quantile hazard inputs, it describes the quantile value.
serialize(data)[source]

Serialize an aggregation loss curve.

Parameters:data

An object representing an aggregate loss curve. This object should:

  • define an attribute poes, which is a list of floats describing the probabilities of exceedance.
  • define an attribute losses, which is a list of floats describing the losses.
  • define an attribute average_loss, which is a float describing the average loss associated to the loss curve
  • define an attribute stddev_loss, which is a float describing the standard deviation of losses if the loss curve has been computed with an event based approach. Otherwise, it is None

Also, poes, losses values must be indexed coherently, i.e.: the loss at index zero is related to the probability of exceedance at the same index.

class openquake.commonlib.risk_writers.BCRMapXMLWriter(path, interest_rate, asset_life_expectancy, loss_type, source_model_tree_path=None, gsim_tree_path=None, statistics=None, quantile_value=None, unit=None, loss_category=None, poe=None)[source]

Bases: object

Serializer for bcr (benefit cost ratio) maps produced with the classical and probabilistic calculators.

Parameters:
  • dest – File path (including filename) or file-like object for results to be saved to.
  • interest_rate (float) – The inflation discount rate.
  • asset_life_expectancy (float) – The period of time in which the building is expected to be used.
  • loss_type (str) – Loss type used in risk model input for the calculation producing this output (examples: structural, non-structural, business-interruption, occupants)
  • source_model_tree_path (str) – Id of the source model tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • gsim_tree_path (str) – Id of the gsim (ground shaking intensity model) tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • unit (str) – Attribute describing how the value of the assets has been measured.
  • loss_category (str) – Attribute describing the category (economic, population, buildings, etc..) of the losses producing this bcr map.
  • statistics (str) – mean or quantile. When serializing bcr values produced from statistical hazard inputs, it describes the type of statistic used.
  • quantile_value (float) – When serializing bcr values produced from quantile hazard inputs, it describes the quantile value.
serialize(data)[source]

Serialize a collection of (benefit cost) ratios.

Parameters:data

An iterable of bcr objects. Each object should:

  • define an attribute location, which is itself an object defining two attributes, x containing the longitude value and y containing the latitude value. Also, it must define an attribute wkt, which is the Well-known text representation of the location.
  • define an attribute asset_ref, which contains the unique identifier of the asset related to the (benefit cost) ratio.
  • define an attribute average_annual_loss_original, which is the expected average annual economic loss using the original vulnerability of the asset.
  • define an attribute average_annual_loss_retrofitted, which is the expected average annual economic loss using the improved (better design or retrofitted) vulnerability of the asset.
  • define an attribute bcr, which is the value of the (benefit cost) ratio.
class openquake.commonlib.risk_writers.DamageWriter(damage_states)[source]

Bases: object

A class to convert scenario_damage outputs into nodes and then XML.

Parameters:damage_states – a sequence of DamageState objects with attributes .dmg_state and .lsi
asset_node(asset_ref, means, stddevs)[source]
Parameters:
  • asset_ref – asset reference string
  • means – array of means, one per damage state
  • stddevs – array of stddevs, one per damage state
Returns:

an asset node

cm_node(loc, asset_refs, means, stddevs)[source]
Parameters:
  • loc – a location object with attributes x and y
  • asset_refs – asset reference strings
  • means – array of means, one per asset
  • stddevs – array of stddevs, one per asset
Returns:

a CMNode node

collapse_map_node(data)[source]
Parameters:data – a sequence of records with attributes .exposure_data, .mean and .stddev
Returns:a dmgDistPerAsset node
damage_nodes(means, stddevs)[source]
Parameters:
  • means – array of means, one per damage state
  • stddevs – array of stddevs, one per damage state
Returns:

a list of damage nodes

dd_node_taxo(taxonomy, means, stddevs)[source]
Parameters:
  • taxonomy – taxonomy string
  • means – array of means, one per damage state
  • stddevs – array of stddevs, one per damage state
Returns:

a DDNode node

dmg_dist_per_asset_node(data)[source]
Parameters:data – a sequence of records with attributes .exposure_data, .mean and .stddev
Returns:a dmgDistPerAsset node
dmg_dist_per_taxonomy_node(data)[source]
Parameters:data – a sequence of records with attributes .taxonomy, .mean and .stddev
Returns:a dmgDistPerTaxonomy node
dmg_dist_total_node(data)[source]
Parameters:data – a sequence of records with attributes .dmg_state, .mean and .stddev
Returns:a totalDmgDist node
point_node(loc)[source]
Parameters:loc – a location object with attributes x and y
Returns:a gml:Point node
to_nrml(key, data, fname=None, fmt='%.5E')[source]
Parameters:
  • key – dmg_dist_per_asset|dmg_dist_per_taxonomy|dmg_dist_total|collapse_map
  • data – sequence of rows to serialize
Fname:

the path name of the output file; if None, build a name

Returns:

path name of the saved file

class openquake.commonlib.risk_writers.DmgDistPerAsset(exposure_data, dmg_state, mean, stddev)

Bases: tuple

dmg_state

Alias for field number 1

exposure_data

Alias for field number 0

mean

Alias for field number 2

stddev

Alias for field number 3

class openquake.commonlib.risk_writers.DmgDistPerTaxonomy(taxonomy, dmg_state, mean, stddev)

Bases: tuple

dmg_state

Alias for field number 1

mean

Alias for field number 2

stddev

Alias for field number 3

taxonomy

Alias for field number 0

class openquake.commonlib.risk_writers.DmgDistTotal(dmg_state, mean, stddev)

Bases: tuple

dmg_state

Alias for field number 0

mean

Alias for field number 1

stddev

Alias for field number 2

class openquake.commonlib.risk_writers.DmgState(dmg_state, lsi)

Bases: tuple

dmg_state

Alias for field number 0

lsi

Alias for field number 1

class openquake.commonlib.risk_writers.ExposureData(asset_ref, site)

Bases: tuple

asset_ref

Alias for field number 0

site

Alias for field number 1

class openquake.commonlib.risk_writers.LossCurveXMLWriter(dest, investigation_time, loss_type, source_model_tree_path=None, gsim_tree_path=None, statistics=None, quantile_value=None, unit=None, insured=False, poe=None, risk_investigation_time=None)[source]

Bases: object

Parameters:
  • dest – File path (including filename) or file-like object for results to be saved to.
  • investigation_time (float) – Investigation time (also known as Time Span) defined in the calculation which produced these results (in years).
  • loss_type (str) – Loss type used in risk model input for the calculation producing this output (examples: structural, non-structural, business-interruption, occupants)
  • source_model_tree_path (str) – Id of the source model tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • gsim_tree_path (str) – Id of the gsim (ground shaking intensity model) tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • statistics (str) – mean or quantile. When serializing loss curves produced from statistical hazard inputs, it describes the type of statistic used.
  • quantile_value (float) – When serializing loss curves produced from quantile hazard inputs, it describes the quantile value.
  • unit (str) – Attribute describing how the value of the assets has been measured.
  • insured (bool) – True if it is an insured loss curve
serialize(data)[source]

Serialize a collection of loss curves.

Parameters:data

An iterable of loss curve objects. Each object should:

  • define an attribute location, which is itself an object defining two attributes, x containing the longitude value and y containing the latitude value.
  • define an attribute asset_ref, which contains the unique identifier of the asset related to the loss curve.
  • define an attribute poes, which is a list of floats describing the probabilities of exceedance.
  • define an attribute losses, which is a list of floats describing the losses.
  • define an attribute loss_ratios, which is a list of floats describing the loss ratios.
  • define an attribute average_loss, which is a float describing the average loss associated to the loss curve
  • define an attribute stddev_loss, which is a float describing the standard deviation of losses if the loss curve has been computed with an event based approach. Otherwise, it is None

All attributes must be defined, except for loss_ratios that can be None since it is optional in the schema.

Also, poes, losses and loss_ratios values must be indexed coherently, i.e.: the loss (and optionally loss ratio) at index zero is related to the probability of exceedance at the same index.

class openquake.commonlib.risk_writers.LossFractionsWriter(dest, variable, loss_unit, loss_type, loss_category, hazard_metadata, poe=None)[source]

Bases: object

Serializer for loss fractions produced with the classical and event based calculators.

Attr dest: Full path including file name or file-like object where the results will be saved into.

Attr str variable: The variable used for disaggregation.

Attr str loss_unit: Attribute describing how the value of the assets has been measured.

Parameters: loss_type (str) – Loss type used in risk model input for the calculation producing this output (examples: structural, non-structural, business-interruption, occupants)

Attr str loss_category: Attribute describing the category (economic, population, buildings, etc..) of the losses producing this loss map.

Attr object hazard_metadata: metadata of hazard outputs used by risk calculation. It has the attributes: investigation_time, source_model_tree_path, gsim_tree_path, statistics, quantile_value

Attr float poe: Probability of exceedance used to interpolate the losses producing this fraction map.

serialize(total_fractions, locations_fractions)[source]

Actually serialize the fractions.

Parameters:
  • total_fractions (dict) – maps a value of variable with a tuple representing the absolute losses and the fraction
  • locations_fractions (dict) – a dictionary mapping a tuple (longitude, latitude) to bins. Each bin is a dictionary with the same structure of total_fractions.
class openquake.commonlib.risk_writers.LossMapGeoJSONWriter(dest, investigation_time, poe, loss_type, source_model_tree_path=None, gsim_tree_path=None, statistics=None, quantile_value=None, unit=None, loss_category=None, risk_investigation_time=None)[source]

Bases: openquake.commonlib.risk_writers.LossMapWriter

GeoJSON implementation of a LossMapWriter. Serializes loss maps as FeatureCollection artifacts with additional loss map metadata.

See LossMapWriter for information about constructor parameters.

serialize(data)[source]

Serialize loss map data to a file as a GeoJSON feature collection.

See LossMapWriter.serialize() for expected input.

class openquake.commonlib.risk_writers.LossMapWriter(dest, investigation_time, poe, loss_type, source_model_tree_path=None, gsim_tree_path=None, statistics=None, quantile_value=None, unit=None, loss_category=None, risk_investigation_time=None)[source]

Bases: object

Base class for serializing loss maps produced with the classical and probabilistic calculators.

Subclasses must implement the serialize() method, which defines the format of the output.

Parameters:
  • dest – File path (including filename) or file-like object for results to be saved to.
  • investigation_time (float) – Investigation time (also known as Time Span) defined in the calculation which produced these results (in years).
  • poe (float) – Probability of exceedance used to interpolate the losses producing this loss map.
  • loss_type (str) – Loss type used in risk model input for the calculation producing this output (examples: structural, non-structural, business-interruption, occupants)
  • source_model_tree_path (str) – Id of the source model tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • gsim_tree_path (str) – Id of the gsim (ground shaking intensity model) tree path (obtained by concatenating the IDs of the branches the path is made of) for which input hazard curves have been computed.
  • unit (str) – Attribute describing how the value of the assets has been measured.
  • loss_category (str) – Attribute describing the category (economic, population, buildings, etc..) of the losses producing this loss map.
  • statistics (str) – mean or quantile. When serializing loss curves produced from statistical hazard inputs, it describes the type of statistic used.
  • quantile_value (float) – When serializing loss curves produced from quantile hazard inputs, it describes the quantile value.
serialize(data)[source]

Serialize a collection of losses.

Parameters:data

An iterable of loss objects. Each object should:

  • define an attribute location, which is itself an object defining two attributes, x containing the longitude value and y containing the latitude value. Also, it must define an attribute wkt, which is the Well-known text representation of the location.
  • define an attribute asset_ref, which contains the unique identifier of the asset related to the loss curve.
  • define an attribute value, which is the value of the loss.
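
A minimal sketch of loss map records satisfying the protocol above; Location and Loss are hypothetical helpers, not part of the API:

import collections

Location = collections.namedtuple('Location', 'x y wkt')
Loss = collections.namedtuple('Loss', 'location asset_ref value')

data = [
    Loss(Location(9.15, 45.17, 'POINT(9.15 45.17)'), 'a1', 2500.0),
    Loss(Location(9.20, 45.50, 'POINT(9.20 45.50)'), 'a2', 1300.0),
]
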
class openquake.commonlib.risk_writers.LossMapXMLWriter(dest, investigation_time, poe, loss_type, source_model_tree_path=None, gsim_tree_path=None, statistics=None, quantile_value=None, unit=None, loss_category=None, risk_investigation_time=None)[source]

Bases: openquake.commonlib.risk_writers.LossMapWriter

NRML/XML implementation of a LossMapWriter.

See LossMapWriter for information about constructor parameters.

serialize(data)[source]

Serialize loss map data to XML.

See LossMapWriter.serialize() for expected input.
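
A hedged usage sketch, assuming data is a list of records like the ones sketched for LossMapWriter.serialize(); the file name and the numbers are arbitrary, and additional metadata (e.g. source_model_tree_path or statistics) may be required by the hazard metadata validation:

from openquake.commonlib.risk_writers import LossMapXMLWriter

writer = LossMapXMLWriter(
    'loss-map.xml', investigation_time=50., poe=0.1, loss_type='structural',
    unit='EUR', loss_category='buildings')
writer.serialize(data)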

class openquake.commonlib.risk_writers.Site(x, y)[source]

Bases: object

A small wrapper over a lon-lat pair (x, y). It has a .wkt attribute and an ordering. It is used for consistency with the export routines.

openquake.commonlib.risk_writers.notnan(value)[source]

True if the value is not numpy.nan

openquake.commonlib.risk_writers.validate_hazard_metadata(gsim_tree_path=None, source_model_tree_path=None, statistics=None, quantile_value=None)[source]

Validate the hazard input metadata.

openquake.commonlib.riskmodels module

Reading risk models for risk calculators

openquake.commonlib.riskmodels.build_vf_node(vf)[source]

Convert a VulnerabilityFunction object into a LiteralNode suitable for XML conversion.

openquake.commonlib.riskmodels.filter_vset(elem)[source]
openquake.commonlib.riskmodels.get_risk_files(inputs)[source]
Parameters:inputs – a dictionary key -> path name
Returns:a pair (file_type, {cost_type: path})
openquake.commonlib.riskmodels.get_risk_models(oqparam, kind=None)[source]
Parameters:
  • oqparam – an OqParam instance
  • kind – vulnerability|vulnerability_retrofitted|fragility|consequence; if None it is extracted from the oqparam.file_type attribute
Returns:

a dictionary taxonomy -> loss_type -> function

openquake.commonlib.sap module

Here is a minimal example of usage:

>>> from openquake.commonlib import sap
>>> def fun(input, inplace, output=None, out='/tmp'):
...     'Example'
...     for argname, argvalue in sorted(locals().iteritems()):
...         print argname, '=', argvalue

>>> p = sap.Parser(fun)
>>> p.arg('input', 'input file or archive')
>>> p.flg('inplace', 'convert inplace')
>>> p.arg('output', 'output archive')
>>> p.opt('out', 'optional output file')

>>> p.callfunc(['a'])
inplace = False
input = a
out = /tmp
output = None

>>> p.callfunc(['a', 'b', '-i', '-o', 'OUT'])
inplace = True
input = a
out = OUT
output = b

Parsers can be composed too.

class openquake.commonlib.sap.Parser(func, name=None, parentparser=None, help=True)[source]

Bases: object

A simple way to define command processors based on argparse. Each parser is associated with a function, and parsers can be composed together by dispatching on a given name (if not given, the function name is used).

arg(name, help, type=None, choices=None, metavar=None, nargs=None)[source]

Describe a positional argument

callfunc(argv=None)[source]

Parse the argv list and extract a dictionary of arguments which is then passed to the function underlying the Parser.

check_arguments()[source]

Make sure all arguments have a specification

flg(name, help, abbrev=None)[source]

Describe a flag

group(descr)[source]

Add a new group of arguments with the given description

help()[source]

Return the help message as a string

opt(name, help, abbrev=None, type=None, choices=None, metavar=None, nargs=None)[source]

Describe an option

openquake.commonlib.sap.compose(parsers, name='main', description=None, prog=None, version=None)[source]

Collects together different arguments parsers and builds a single Parser dispatching to the subparsers depending on the first argument, i.e. the name of the subparser to invoke.

Parameters:
  • parsers – a list of Parser instances
  • name – the name of the composed parser
  • description – description of the composed parser
  • prog – name of the script printed in the usage message
  • version – version of the script printed with –version
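
A hedged sketch of composing two parsers; the functions, names and messages are hypothetical, and it is assumed that compose() returns a Parser whose callfunc() dispatches on the subcommand name:

from openquake.commonlib import sap

def info(name):
    'Print information about the given calculation'
    print('info about %s' % name)

def purge(calc_id):
    'Remove the given calculation'
    print('purging %s' % calc_id)

p1 = sap.Parser(info)
p1.arg('name', 'calculation name')
p2 = sap.Parser(purge)
p2.arg('calc_id', 'calculation ID')

main = sap.compose([p1, p2], description='example script')
main.callfunc(['info', 'demo'])  # dispatches to info('demo')
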
openquake.commonlib.sap.get_parentparser(parser, description=None, help=True)[source]
Parameters:
  • parser – an argparse.ArgumentParser instance or None
  • description – string used to build a new parser if parser is None
  • help – flag used to build a new parser if parser is None
Returns:

if parser is None the new parser; otherwise the .parentparser attribute (if set) or the parser itself (if not set)

openquake.commonlib.sap.str_choices(choices)[source]

Returns {choice1, ..., choiceN} or the empty string

openquake.commonlib.source module

class openquake.commonlib.source.CompositeSourceModel(gsim_lt, source_model_lt, source_models, set_weight=True)[source]

Bases: _abcoll.Sequence

get_num_sources()[source]
Returns:the total number of sources in the model
get_sources(kind='all')[source]

Extract the sources contained in the source models by optionally filtering and splitting them, depending on the passed parameters.

set_weights()[source]

Update the attributes .weight and src.num_ruptures for each TRT model of the CompositeSourceModel.

trt_models

Yields the TrtModels inside each source model.

class openquake.commonlib.source.CompositionInfo(gsim_lt, seed, num_samples, source_models)[source]

Bases: object

An object to collect information about the composition of a composite source model.

Parameters:
  • source_model_lt – a SourceModelLogicTree object
  • source_models – a list of SourceModel instances
classmethod fake(gsimlt=None)[source]
Returns:a fake CompositionInfo instance with the given gsim logic tree object; if None, builds automatically a fake gsim logic tree
get_num_rlzs(source_model=None)[source]
Parameters:source_model – a SourceModel instance (or None)
Returns:the number of realizations per source model (or all)
get_rlzs_assoc(count_ruptures=None)[source]

Return a RlzsAssoc with fields realizations, gsim_by_trt, rlz_idx and trt_gsims.

Parameters:count_ruptures – a function trt_model -> num_ruptures
get_source_model(trt_model_id)[source]

Return the source model for the given trt_model_id

get_trt(trt_model_id)[source]

Return the TRT string for the given trt_model_id

exception openquake.commonlib.source.DuplicatedID[source]

Bases: exceptions.Exception

Raised when two sources with the same ID are found in a source model

class openquake.commonlib.source.LtRealization(ordinal, sm_lt_path, gsim_rlz, weight, sampleid)[source]

Bases: object

Composite realization built on top of a source model realization and a GSIM realization.

gsim_lt_path
uid

A unique identifier for effective realizations

class openquake.commonlib.source.RlzsAssoc(csm_info)[source]

Bases: _abcoll.Mapping

Realization association class. It should not be instantiated directly, but only via the method openquake.commonlib.source.CompositeSourceModel.get_rlzs_assoc.

Attr realizations:
 list of LtRealization objects
Attr gsim_by_trt:
 list of dictionaries {trt: gsim}
Attr rlzs_assoc:
 dictionary {trt_model_id, gsim: rlzs}
Attr rlzs_by_smodel:
 list of lists of realizations

For instance, for the non-trivial logic tree in openquake.qa_tests_data.classical.case_15, which has 4 tectonic region types and 4 + 2 + 2 realizations, there are the following associations:

(0, ‘BooreAtkinson2008()’) [‘#0-SM1-BA2008_C2003’, ‘#1-SM1-BA2008_T2002’]
(0, ‘CampbellBozorgnia2008()’) [‘#2-SM1-CB2008_C2003’, ‘#3-SM1-CB2008_T2002’]
(1, ‘Campbell2003()’) [‘#0-SM1-BA2008_C2003’, ‘#2-SM1-CB2008_C2003’]
(1, ‘ToroEtAl2002()’) [‘#1-SM1-BA2008_T2002’, ‘#3-SM1-CB2008_T2002’]
(2, ‘BooreAtkinson2008()’) [‘#4-SM2_a3pt2b0pt8-BA2008’]
(2, ‘CampbellBozorgnia2008()’) [‘#5-SM2_a3pt2b0pt8-CB2008’]
(3, ‘BooreAtkinson2008()’) [‘#6-SM2_a3b1-BA2008’]
(3, ‘CampbellBozorgnia2008()’) [‘#7-SM2_a3b1-CB2008’]

combine(results, agg=<function agg_prob>)[source]
Parameters:
  • results – a dictionary (trt_model_id, gsim) -> floats
  • agg – an aggregation function
Returns:

a dictionary rlz -> aggregated floats

Example: a case with tectonic region type T1 with GSIMS A, B, C and tectonic region type T2 with GSIMS D, E.

>> assoc = RlzsAssoc(CompositionInfo([], []))
>> assoc.rlzs_assoc = {
...     ('T1', 'A'): ['r0', 'r1'],
...     ('T1', 'B'): ['r2', 'r3'],
...     ('T1', 'C'): ['r4', 'r5'],
...     ('T2', 'D'): ['r0', 'r2', 'r4'],
...     ('T2', 'E'): ['r1', 'r3', 'r5']}
>> results = {
...     ('T1', 'A'): 0.01,
...     ('T1', 'B'): 0.02,
...     ('T1', 'C'): 0.03,
...     ('T2', 'D'): 0.04,
...     ('T2', 'E'): 0.05}
>> combinations = assoc.combine(results, operator.add)
>> for key, value in sorted(combinations.items()): print key, value
r0 0.05
r1 0.06
r2 0.06
r3 0.07
r4 0.07
r5 0.08

You can check that all the possible sums are performed:

r0: 0.01 + 0.04 (T1A + T2D)
r1: 0.01 + 0.05 (T1A + T2E)
r2: 0.02 + 0.04 (T1B + T2D)
r3: 0.02 + 0.05 (T1B + T2E)
r4: 0.03 + 0.04 (T1C + T2D)
r5: 0.03 + 0.05 (T1C + T2E)

In reality, the combine_curves method is used with hazard curves and the aggregation function is the agg_curves function, a composition of probabilities, which is close to the sum for small probabilities.

combine_curves(results)[source]
Parameters:results – dictionary (trt_model_id, gsim) -> curves
Returns:a dictionary rlz -> aggregate curves
extract(rlz_indices, csm_info)[source]

Extract a RlzsAssoc instance containing only the given realizations.

Parameters:rlz_indices – a list of realization indices from 0 to R - 1
get_rlzs_by_gsim(trt_id)[source]

Returns a dictionary gsim -> rlzs

get_rlzs_by_trt_id()[source]

Returns a dictionary trt_id -> [sorted rlzs]

realizations

Flat list with all the realizations

class openquake.commonlib.source.SourceInfo(trt_model_id, source_id, source_class, weight, sources, filter_time, split_time, calc_time)

Bases: tuple

calc_time

Alias for field number 7

filter_time

Alias for field number 5

source_class

Alias for field number 2

source_id

Alias for field number 1

sources

Alias for field number 4

split_time

Alias for field number 6

trt_model_id

Alias for field number 0

weight

Alias for field number 3

class openquake.commonlib.source.SourceManager(csm, maximum_distance, dstore, monitor, random_seed=None, filter_sources=True, num_tiles=1)[source]

Bases: object

Manager associated to a CompositeSourceModel instance. Filter and split sources and send them to the worker tasks.

gen_args(tiles)[source]

Yield (sources, sitecol, siteidx, rlzs_assoc, monitor) by looping on the tiles and on the source blocks.

get_sources(kind, tile)[source]
Parameters:
  • kind – a string ‘light’, ‘heavy’ or ‘all’
  • tile – a openquake.hazardlib.site.Tile instance
Returns:

the sources of the given kind affecting the given tile

set_serial(src, split_sources=())[source]

Set a serial number per each rupture in a source, managing also the case of split sources, if any.

store_source_info(dstore)[source]

Save the source_info array and its attributes in the datastore.

Parameters:dstore – the datastore
class openquake.commonlib.source.SourceModel(name, weight, path, trt_models, num_gsim_paths, ordinal, samples)[source]

Bases: object

A container of TrtModel instances with some additional attributes describing the source model in the logic tree.

get_skeleton()[source]

Return an empty copy of the source model, i.e. without sources, but with the proper attributes for each TrtModel contained within.

num_sources
class openquake.commonlib.source.SourceModelParser(converter)[source]

Bases: object

A source model parser featuring a cache.

Parameters:converter – an openquake.commonlib.sourceconverter.SourceConverter instance
parse_sources(fname)[source]

Parse all the sources and return them ordered by tectonic region type. It does not count the ruptures, so it is relatively fast.

Parameters:fname – the full pathname of the source model file
parse_trt_models(fname, apply_uncertainties=None)[source]
Parameters:
  • fname – the full pathname of the source model file
  • apply_uncertainties – a function modifying the sources (or None)
class openquake.commonlib.source.TrtModel(trt, sources=None, min_mag=None, max_mag=None, id=0, eff_ruptures=-1)[source]

Bases: _abcoll.Sequence

A container for the following parameters:

Parameters:
  • trt (str) – the tectonic region type all the sources belong to
  • sources (list) – a list of hazardlib source objects
  • min_mag – the minimum magnitude among the given sources
  • max_mag – the maximum magnitude among the given sources
  • gsims – the GSIMs associated to tectonic region type
  • id – an optional numeric ID (default None) useful to associate the model to a database object
classmethod collect(sources)[source]
Parameters:sources – dictionaries with a key ‘tectonicRegion’
Returns:an ordered list of TrtModel instances
tot_ruptures()[source]
update(src)[source]

Update the attributes sources, min_mag, max_mag according to the given source.

Parameters:src – an instance of openquake.hazardlib.source.base.BaseSeismicSource
openquake.commonlib.source.agg_prob(acc, prob)[source]

Aggregation function for probabilities
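
A hedged usage sketch; the composition formula in the comment is an assumption consistent with the note on agg_curves in the RlzsAssoc.combine documentation above (close to the sum for small probabilities), not taken from the source:

from openquake.commonlib.source import agg_prob

# assumed behaviour: agg_prob(acc, prob) composes independent probabilities,
# i.e. something close to 1 - (1 - acc) * (1 - prob)
acc = 0.0
for prob in [0.01, 0.02, 0.03]:
    acc = agg_prob(acc, prob)
print(acc)  # for small probabilities this is close to the plain sum 0.06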

openquake.commonlib.source.capitalize(words)[source]

Capitalize words separated by spaces.

>>> capitalize('active shallow crust')
'Active Shallow Crust'
openquake.commonlib.source.collect_source_model_paths(smlt)[source]

Given a path to a source model logic tree or a file-like object, collect all of the soft-linked path names to the source models it contains and return them as a uniquified list (no duplicates).

Parameters:smlt – source model logic tree file
openquake.commonlib.source.count_eff_ruptures(sources, sitecol, siteidx, rlzs_assoc, monitor)[source]

Count the number of ruptures contained in the given sources and return a dictionary trt_model_id -> num_ruptures. All sources belong to the same tectonic region type.

openquake.commonlib.source.source_info_iadd(self, other)[source]

openquake.commonlib.sourceconverter module

class openquake.commonlib.sourceconverter.MultiRuptureSource(ruptures, source_id, tectonic_region_type, trt_model_id)[source]

Bases: object

Fake source class used to encapsulate a set of ruptures.

Parameters:
  • ruptures – a list of openquake.hazardlib.source.rupture.ParametricProbabilisticRupture instances
  • source_id – an ID for the MultiRuptureSource
  • tectonic_region_type – the tectonic region type
  • trt_model_id – ID of the tectonic region model the source belongs to
count_ruptures()[source]

Return the block size

filter_sites_by_distance_to_source(maxdist, sitecol)[source]

The source has already been filtered; return the sitecol

iter_ruptures()[source]

Yield the ruptures

classmethod split(src, block_size)[source]

Split the given fault source into MultiRuptureSources depending on the given block size.

class openquake.commonlib.sourceconverter.RuptureConverter(rupture_mesh_spacing, complex_fault_mesh_spacing=None)[source]

Bases: object

Convert ruptures from nodes into Hazardlib ruptures.

convert_complexFaultRupture(node, mag, rake, hypocenter)[source]

Convert a complexFaultRupture node.

Parameters:
  • node – the rupture node
  • mag – the rupture magnitude
  • rake – the rupture rake angle
  • hypocenter – the rupture hypocenter
convert_multiPlanesRupture(node, mag, rake, hypocenter)[source]

Convert a multiPlanesRupture node.

Parameters:
  • node – the rupture node
  • mag – the rupture magnitude
  • rake – the rupture rake angle
  • hypocenter – the rupture hypocenter
convert_node(node)[source]

Convert the given rupture node into a hazardlib rupture, depending on the node tag.

Parameters:node – a node representing a rupture
convert_simpleFaultRupture(node, mag, rake, hypocenter)[source]

Convert a simpleFaultRupture node.

Parameters:
  • node – the rupture node
  • mag – the rupture magnitude
  • rake – the rupture rake angle
  • hypocenter – the rupture hypocenter
convert_singlePlaneRupture(node, mag, rake, hypocenter)[source]

Convert a singlePlaneRupture node.

Parameters:
  • node – the rupture node
  • mag – the rupture magnitude
  • rake – the rupture rake angle
  • hypocenter – the rupture hypocenter
convert_surfaces(surface_nodes)[source]

Utility to convert a list of surface nodes into a single hazardlib surface. There are three possibilities:

  1. there is a single simpleFaultGeometry node; returns a openquake.hazardlib.geo.simpleFaultSurface instance
  2. there is a single complexFaultGeometry node; returns a openquake.hazardlib.geo.complexFaultSurface instance
  3. there is a list of PlanarSurface nodes; returns a openquake.hazardlib.geo.MultiSurface instance
Parameters:surface_nodes – surface nodes as just described
fname = None
geo_line(edge)[source]

Utility function to convert a node of kind edge into a openquake.hazardlib.geo.Line instance.

Parameters:edge – a node describing an edge
geo_lines(edges)[source]

Utility function to convert a list of edges into a list of openquake.hazardlib.geo.Line instances.

Parameters:edges – a list of nodes, each describing an edge
geo_planar(surface)[source]

Utility to convert a PlanarSurface node with subnodes topLeft, topRight, bottomLeft, bottomRight into a openquake.hazardlib.geo.PlanarSurface instance.

Parameters:surface – PlanarSurface node
class openquake.commonlib.sourceconverter.SourceConverter(investigation_time, rupture_mesh_spacing, complex_fault_mesh_spacing=None, width_of_mfd_bin=1.0, area_source_discretization=None)[source]

Bases: openquake.commonlib.sourceconverter.RuptureConverter

Convert sources from valid nodes into Hazardlib objects.

convert_areaSource(node)[source]

Convert the given node into an area source object.

Parameters:node – a node with tag areaSource
Returns:an openquake.hazardlib.source.AreaSource instance
convert_characteristicFaultSource(node)[source]

Convert the given node into a characteristic fault object.

Parameters:node – a node with tag characteristicFaultSource
Returns:an openquake.hazardlib.source.CharacteristicFaultSource instance
convert_complexFaultSource(node)[source]

Convert the given node into a complex fault object.

Parameters:node – a node with tag complexFaultSource
Returns:an openquake.hazardlib.source.ComplexFaultSource instance
convert_hpdist(node)[source]

Convert the given node into a probability mass function for the hypo depth distribution.

Parameters:node – a hypoDepthDist node
Returns:a openquake.hazardlib.pmf.PMF instance
convert_mfdist(node)[source]

Convert the given node into a Magnitude-Frequency Distribution object.

Parameters:node – a node of kind incrementalMFD or truncGutenbergRichterMFD
Returns:an openquake.hazardlib.mfd.EvenlyDiscretizedMFD or openquake.hazardlib.mfd.TruncatedGRMFD instance
convert_node(node)[source]

Convert the given node into a hazardlib source, depending on the node tag.

Parameters:node – a node representing a source
convert_nonParametricSeismicSource(node)[source]

Convert the given node into a non parametric source object.

Parameters:node – a node with tag nonParametricSeismicSource
Returns:an openquake.hazardlib.source.NonParametricSeismicSource instance
convert_npdist(node)[source]

Convert the given node into a Nodal Plane Distribution.

Parameters:node – a nodalPlaneDist node
Returns:a openquake.hazardlib.geo.NodalPlane instance
convert_pointSource(node)[source]

Convert the given node into a point source object.

Parameters:node – a node with tag pointSource
Returns:an openquake.hazardlib.source.PointSource instance
convert_simpleFaultSource(node)[source]

Convert the given node into a simple fault object.

Parameters:node – a node with tag simpleFaultSource
Returns:an openquake.hazardlib.source.SimpleFaultSource instance
openquake.commonlib.sourceconverter.area_to_point_sources(area_src)[source]

Split an area source into a generator of point sources.

MFDs will be rescaled appropriately for the number of points in the area mesh.

Parameters:area_src – an openquake.hazardlib.source.AreaSource instance
openquake.commonlib.sourceconverter.filter_sources(sources, sitecol, maxdist)[source]

Filter a list of hazardlib sources according to the maximum distance.

Parameters:
  • sources – the original sources
  • sitecol – a SiteCollection instance
  • maxdist – maximum distance
Returns:

the filtered sources ordered by source_id

openquake.commonlib.sourceconverter.get_set_num_ruptures(src)[source]

Extract the number of ruptures and set it

openquake.commonlib.sourceconverter.parse_ses_ruptures(fname)[source]

Convert a stochasticEventSetCollection file into a set of SES, each one containing ruptures with an etag and a seed.

openquake.commonlib.sourceconverter.split_coords_2d(seq)[source]
Parameters:seq – a flat list with lons and lats
Returns:a validated list of pairs (lon, lat)
>>> split_coords_2d([1.1, 2.1, 2.2, 2.3])
[(1.1, 2.1), (2.2, 2.3)]
openquake.commonlib.sourceconverter.split_coords_3d(seq)[source]
Parameters:seq – a flat list with lons, lats and depths
Returns:a validated list of (lon, lat, depths) triplets
>>> split_coords_3d([1.1, 2.1, 0.1, 2.3, 2.4, 0.1])
[(1.1, 2.1, 0.1), (2.3, 2.4, 0.1)]
openquake.commonlib.sourceconverter.split_fault_source(src, block_size)[source]

Generator splitting a fault source into several fault sources.

Parameters:src – an instance of openquake.hazardlib.source.base.SeismicSource
openquake.commonlib.sourceconverter.split_fault_source_by_magnitude(src)[source]

Utility splitting a fault source into fault sources with a single magnitude bin.

Parameters:src – an instance of openquake.hazardlib.source.base.SeismicSource
openquake.commonlib.sourceconverter.split_source(src, block_size=1)[source]

Split an area source into point sources and a fault source into smaller fault sources.

Parameters:src – an instance of openquake.hazardlib.source.base.SeismicSource

openquake.commonlib.sourcewriter module

Source model XML Writer

openquake.commonlib.sourcewriter.build_arbitrary_mfd(mfd)[source]

Parses the arbitrary MFD as a Node.

Parameters:mfd – MFD as instance of openquake.hazardlib.mfd.arbitrary.ArbitraryMFD
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_area_source_geometry(area_source)[source]

Returns the area source geometry as a Node.

Parameters:area_source – Area source model as an instance of openquake.hazardlib.source.area.AreaSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_area_source_node(area_source)[source]

Parses an area source to a Node class.

Parameters:area_source – Area source as instance of openquake.hazardlib.source.area.AreaSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_characteristic_fault_source_node(source)[source]
openquake.commonlib.sourcewriter.build_complex_fault_geometry(fault_source)[source]

Returns the complex fault source geometry as a Node.

Parameters:fault_source – Complex fault source model as an instance of openquake.hazardlib.source.complex_fault.ComplexFaultSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_complex_fault_source_node(fault_source)[source]

Parses a complex fault source to a Node class.

Parameters:fault_source – Complex fault source as instance of openquake.hazardlib.source.complex_fault.ComplexFaultSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_evenly_discretised_mfd(mfd)[source]

Returns the evenly discretized MFD as a Node.

Parameters:mfd – MFD as instance of openquake.hazardlib.mfd.evenly_discretized.EvenlyDiscretizedMFD
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_hypo_depth_dist(hdd)[source]

Returns the hypocentral depth distribution as a Node instance.

Parameters:hdd – Hypocentral depth distribution as an instance of openquake.hazardlib.pmf.PMF
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_hypo_list_node(hypo_list)[source]
Parameters:hypo_list – an array of shape (N, 3) with columns (alongStrike, downDip, weight)
Returns:a hypoList node containing N hypo nodes
openquake.commonlib.sourcewriter.build_linestring_node(line, with_depth=False)[source]

Parses a line to a Node class.

Parameters:
  • line – Line as instance of openquake.hazardlib.geo.line.Line
  • with_depth (bool) – Include the depth values (True) or not (False)
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_nodal_plane_dist(npd)[source]

Returns the nodal plane distribution as a Node instance.

Parameters:npd – Nodal plane distribution as instance of openquake.hazardlib.pmf.PMF
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_nonparametric_source_node(source)[source]
openquake.commonlib.sourcewriter.build_point_source_geometry(point_source)[source]

Returns the point source geometry as a Node.

Parameters:point_source – Point source model as an instance of openquake.hazardlib.source.point.PointSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_point_source_node(point_source)[source]

Parses a point source to a Node class.

Parameters:point_source – Point source as instance of openquake.hazardlib.source.point.PointSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_rupture_node(rupt, probs_occur)[source]
Parameters:
  • rupt – a hazardlib rupture object
  • probs_occur – a list of floats with sum 1
openquake.commonlib.sourcewriter.build_simple_fault_geometry(fault_source)[source]

Returns the simple fault source geometry as a Node.

Parameters:fault_source – Simple fault source model as an instance of openquake.hazardlib.source.simple_fault.SimpleFaultSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_simple_fault_source_node(fault_source)[source]

Parses a simple fault source to a Node class.

Parameters:fault_source – Simple fault source as instance of openquake.hazardlib.source.simple_fault.SimpleFaultSource
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_slip_list_node(slip_list)[source]
Parameters:slip_list – an array of shape (N, 2) with columns (slip, weight)
Returns:a slipList node containing N slip nodes
openquake.commonlib.sourcewriter.build_truncated_gr_mfd(mfd)[source]

Parses the truncated Gutenberg-Richter MFD as a Node.

Parameters:mfd – MFD as instance of openquake.hazardlib.mfd.truncated_gr.TruncatedGRMFD
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.build_youngs_coppersmith_mfd(mfd)[source]

Parses the Youngs & Coppersmith MFD as a node. Note that the MFD does not hold the total moment rate, but only the characteristic rate. Therefore the node is written to the characteristic rate version regardless of whether or not it was originally created from total moment rate.

Parameters:mfd – MFD as instance of openquake.hazardlib.mfd.youngs_coppersmith_1985.YoungsCoppersmith1985MFD
Returns:Instance of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.get_distributed_seismicity_source_nodes(source)[source]

Returns a list of nodes with the attributes common to all distributed seismicity source classes.

Parameters:source – Seismic source as instance of openquake.hazardlib.source.area.AreaSource or openquake.hazardlib.source.point.PointSource
Returns:List of instances of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.get_fault_source_nodes(source)[source]

Returns a list of nodes with the attributes common to all fault source classes.

Parameters:source – Fault source as instance of openquake.hazardlib.source.simple_fault.SimpleFaultSource or openquake.hazardlib.source.complex_fault.ComplexFaultSource
Returns:List of instances of openquake.commonlib.node.Node
openquake.commonlib.sourcewriter.get_source_attributes(source)[source]

Retrieves a dictionary of source attributes from the source class.

Parameters:source – Seismic source as instance of openquake.hazardlib.source.base.BaseSeismicSource
Returns:Dictionary of source attributes
openquake.commonlib.sourcewriter.write_source_model(dest, sources, name=None)[source]

Writes a source model to XML.

Parameters:
  • dest (str) – Destination path
  • sources (list) – Source model as list of instance of the openquake.hazardlib.source.base.BaseSeismicSource
  • name (str) – Name of the source model (if missing, extracted from the filename)
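
A hedged round-trip sketch: parse an existing NRML source model and write it back. The input file name and the converter parameters are examples, not requirements:

from openquake.commonlib.sourceconverter import SourceConverter
from openquake.commonlib.source import SourceModelParser
from openquake.commonlib.sourcewriter import write_source_model

conv = SourceConverter(investigation_time=50., rupture_mesh_spacing=1.,
                       complex_fault_mesh_spacing=1., width_of_mfd_bin=0.1,
                       area_source_discretization=10.)
sources = SourceModelParser(conv).parse_sources('source_model.xml')
write_source_model('copy_of_source_model.xml', sources, name='Copy')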

openquake.commonlib.util module

class openquake.commonlib.util.Rupture(etag, indices=None)[source]

Bases: object

Simplified Rupture class with attributes etag, indices, ses_idx, used in export.

openquake.commonlib.util.compose_arrays(a1, a2, firstfield='etag')[source]

Compose composite arrays by generating an extended datatype containing all the fields. The two arrays must have the same length.
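
A hedged sketch, assuming two equal-length record arrays are merged into a single array carrying all the fields (the exact output dtype is an assumption):

import numpy
from openquake.commonlib.util import compose_arrays

a1 = numpy.array([('e1', 9.1), ('e2', 9.2)],
                 dtype=[('etag', 'S3'), ('lon', float)])
a2 = numpy.array([(0.1,), (0.2,)], dtype=[('gmv', float)])
composed = compose_arrays(a1, a2)
print(composed.dtype.names)  # expected: ('etag', 'lon', 'gmv')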

openquake.commonlib.util.get_assets(dstore)[source]
Parameters:dstore – a datastore with keys ‘assetcol’
Returns:an ordered array of records (asset_ref, taxonomy, lon, lat)
openquake.commonlib.util.get_serial(etag)[source]
>>> get_serial("trt=00~ses=0007~src=1-3~rup=018-01")
'018'
openquake.commonlib.util.get_ses_idx(etag)[source]
>>> get_ses_idx("trt=00~ses=0007~src=1-3~rup=018-01")
7
openquake.commonlib.util.max_rel_diff(curve_ref, curve, min_value=0.01)[source]

Compute the maximum relative difference between two curves. Only values greater than or equal to min_value are considered.

>>> curve_ref = [0.01, 0.02, 0.03, 0.05, 1.0]
>>> curve = [0.011, 0.021, 0.031, 0.051, 1.0]
>>> round(max_rel_diff(curve_ref, curve), 2)
0.1
openquake.commonlib.util.max_rel_diff_index(curve_ref, curve, min_value=0.01)[source]

Compute the maximum relative difference between two sets of curves. Only values greater than or equal to min_value are considered. Return both the maximum difference and its location (array index).

>>> curve_refs = [[0.01, 0.02, 0.03, 0.05], [0.01, 0.02, 0.04, 0.06]]
>>> curves = [[0.011, 0.021, 0.031, 0.051], [0.012, 0.022, 0.032, 0.051]]
>>> max_rel_diff_index(curve_refs, curves)
(0.2, 1)
openquake.commonlib.util.rmsep(array_ref, array, min_value=0.01)[source]

Root Mean Square Error Percentage for two arrays.

Parameters:
  • array_ref – reference array
  • array – another array
  • min_value – compare only the elements larger than min_value
Returns:

the relative distance between the arrays

>>> curve_ref = numpy.array([[0.01, 0.02, 0.03, 0.05],
... [0.01, 0.02, 0.04, 0.06]])
>>> curve = numpy.array([[0.011, 0.021, 0.031, 0.051],
... [0.012, 0.022, 0.032, 0.051]])
>>> round(rmsep(curve_ref, curve), 5)
0.11292

openquake.commonlib.writers module

class openquake.commonlib.writers.CsvWriter(sep=', ', fmt='%12.8E')[source]

Bases: object

Class used in the exporters to save a bunch of CSV files

getsaved()[source]

Returns the list of files saved by this CsvWriter

save(data, fname, header=None)[source]

Save data on fname.

Parameters:
  • data – numpy array or list of lists
  • fname – path name
  • header – header to use
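
A hedged usage sketch; the file names are hypothetical and the automatic header derivation from the dtype is an assumption:

import numpy
from openquake.commonlib.writers import CsvWriter

writer = CsvWriter(fmt='%10.6E')
data = numpy.array([(1, 0.1), (2, 0.2)],
                   dtype=[('idx', numpy.uint32), ('value', float)])
writer.save(data, '/tmp/values.csv')  # header presumably derived from the dtype
writer.save([[1, 2], [3, 4]], '/tmp/pairs.csv', header=['a', 'b'])
print(writer.getsaved())  # the list of files written so far
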
class openquake.commonlib.writers.HeaderTranslator(**descr)[source]

Bases: object

A utility to convert the headers in CSV files. When reading, the column names are converted into column descriptions with the method .read; when writing, column descriptions are converted into column names with the method .write. The usage is

>>> htranslator = HeaderTranslator(
...     asset_ref='asset_ref:|S20',
...     rup_id='rup_id:uint32',
...     taxonomy='taxonomy:|S100')
>>> htranslator.read('asset_ref value:5'.split())
['asset_ref:|S20', 'value:5']
>>> htranslator.write('asset_ref:|S20 value:5'.split())
['asset_ref', 'value:5']
read(names)[source]
write(descr)[source]
class openquake.commonlib.writers.StreamingXMLWriter(bytestream, indent=4, encoding='utf-8', nsmap=None)[source]

Bases: object

A binary stream XML writer. The typical usage is something like this:

with StreamingXMLWriter(output_file) as writer:
    writer.start_tag('root')
    for node in nodegenerator():
        writer.serialize(node)
    writer.end_tag('root')
emptyElement(name, attrs)[source]

Add an empty element (may have attributes)

end_tag(name)[source]

Close an XML tag

serialize(node)[source]

Serialize a node object (typically an ElementTree object)

shorten(tag)[source]

Get the short representation of a fully qualified tag

Parameters:tag (str) – a (fully qualified or not) XML tag
start_tag(name, attrs=None)[source]

Open an XML tag

openquake.commonlib.writers.build_header(dtype)[source]

Convert a numpy nested dtype into a list of strings suitable as the header of a CSV file.

>>> imt_dt = numpy.dtype([('PGA', float, 3), ('PGV', float, 4)])
>>> build_header(imt_dt)
['PGA:float64:3', 'PGV:float64:4']
>>> gmf_dt = numpy.dtype([('A', imt_dt), ('B', imt_dt),
...                       ('idx', numpy.uint32)])
>>> build_header(gmf_dt)
['A~PGA:float64:3', 'A~PGV:float64:4', 'B~PGA:float64:3', 'B~PGV:float64:4', 'idx:uint32']
openquake.commonlib.writers.castable_to_int(s)[source]

Return True if the string s can be interpreted as an integer

openquake.commonlib.writers.extract_from(data, fields)[source]

Extract data from numpy arrays with nested records.

>>> imt_dt = numpy.dtype([('PGA', float, 3), ('PGV', float, 4)])
>>> a = numpy.array([([1, 2, 3], [4, 5, 6, 7])], imt_dt)
>>> extract_from(a, ['PGA'])
array([[ 1.,  2.,  3.]])
>>> gmf_dt = numpy.dtype([('A', imt_dt), ('B', imt_dt),
...                       ('idx', numpy.uint32)])
>>> b = numpy.array([(([1, 2, 3], [4, 5, 6, 7]),
...                  ([1, 2, 4], [3, 5, 6, 7]), 8)], gmf_dt)
>>> extract_from(b, ['idx'])
array([8], dtype=uint32)
>>> extract_from(b, ['B', 'PGV'])
array([[ 3.,  5.,  6.,  7.]])
openquake.commonlib.writers.floatformat(*args, **kwds)[source]

Context manager to change the default format string for the function openquake.commonlib.writers.scientificformat().

Parameters:fmt_string – the format to use; for instance ‘%13.9E’
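
A small usage sketch; the format string is arbitrary:

from openquake.commonlib.writers import floatformat, scientificformat

with floatformat('%.2E'):
    print(scientificformat([0.12345, 0.6]))  # e.g. '1.23E-01 6.00E-01'
print(scientificformat([0.12345]))  # back to the default '%13.9E' precision
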
openquake.commonlib.writers.parse_header(header)[source]

Convert a list of the form [‘fieldname:fieldtype:fieldsize’,...] into a numpy composite dtype. The parser understands headers generated by openquake.commonlib.writers.build_header(). Here is an example:

>>> parse_header(['PGA', 'PGV:float64', 'avg:2'])
(['PGA', 'PGV', 'avg'], dtype([('PGA', '<f4'), ('PGV', '<f8'), ('avg', '<f4', (2,))]))
Parameters:header – a list of type descriptions
Returns:column names and the corresponding composite dtype
openquake.commonlib.writers.read_array(fname, sep=', ')[source]

Convert a CSV file without header into a numpy array of floats.

>>> from openquake.baselib.general import writetmp
>>> print read_array(writetmp('.1 .2, .3 .4, .5 .6\n'))
[[[ 0.1  0.2]
  [ 0.3  0.4]
  [ 0.5  0.6]]]
openquake.commonlib.writers.read_composite_array(fname, sep=', ')[source]

Convert a CSV file with header into a numpy array of records.

>>> from openquake.baselib.general import writetmp
>>> fname = writetmp('PGA:float64:3,PGV:float64:2,avg:float64:1\n'
...                  '.1 .2 .3,.4 .5,.6\n')
>>> print read_composite_array(fname)  # array of shape (1,)
[([0.1, 0.2, 0.3], [0.4, 0.5], [0.6])]
openquake.commonlib.writers.scientificformat(value, fmt='%13.9E', sep=' ', sep2=':')[source]
Parameters:
  • value – the value to convert into a string
  • fmt – the formatting string to use for float values
  • sep – separator to use for vector-like values
  • sep2 – second separator to use for matrix-like values

Convert a float or an array into a string by using the scientific notation and a fixed precision (by default 10 decimal digits). For instance:

>>> scientificformat(-0E0)
'0.000000000E+00'
>>> scientificformat(-0.004)
'-4.000000000E-03'
>>> scientificformat([0.004])
'4.000000000E-03'
>>> scientificformat([0.01, 0.02], '%10.6E')
'1.000000E-02 2.000000E-02'
>>> scientificformat([[0.1, 0.2], [0.3, 0.4]], '%4.1E')
'1.0E-01:2.0E-01 3.0E-01:4.0E-01'
openquake.commonlib.writers.tostring(node, indent=4, nsmap=None)[source]

Convert a node into an XML string by using the StreamingXMLWriter. This is useful for testing purposes.

Parameters:
  • node – a node object (typically an ElementTree object)
  • indent – the indentation to use in the XML (default 4 spaces)
openquake.commonlib.writers.write_csv(dest, data, sep=', ', fmt='%.6E', header=None)[source]
Parameters:
  • dest – destination filename or io.StringIO instance
  • data – array to save
  • sep – separator to use (default comma)
  • fmt – formatting string (default ‘%12.8E’)
  • header – optional list with the names of the columns to display
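
A minimal usage sketch; the destination path is hypothetical and the automatic header derivation is an assumption:

import numpy
from openquake.commonlib.writers import write_csv

data = numpy.array([(1, 0.15), (2, 0.30)],
                   dtype=[('idx', numpy.uint32), ('poe', float)])
write_csv('/tmp/poes.csv', data)  # header presumably built from the dtype
write_csv('/tmp/poes2.csv', data, header=['idx', 'poe'])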

Module contents

exception openquake.commonlib.InvalidFile[source]

Bases: exceptions.Exception