openquake.hazardlib.calc package#

Hazardlib calculators#

Disaggregation (disagg)#

openquake.hazardlib.calc.disagg contains the Disaggregator class and the disaggregation() function, as well as several aggregation functions for extracting a specific PMF from the result of disaggregation().

class openquake.hazardlib.calc.disagg.BinData(dists, lons, lats, pnes)#

Bases: tuple

dists#

Alias for field number 0

lats#

Alias for field number 2

lons#

Alias for field number 1

pnes#

Alias for field number 3

class openquake.hazardlib.calc.disagg.Disaggregator(srcs_or_ctxs, site, cmaker, bin_edges, imts=None)[source]#

Bases: object

A class to perform single-site disaggregation with methods .disagg_by_magi (called in standard disaggregation) and .disagg_mag_dist_eps (called in disaggregation by relevant source). Internally the attributes .mea and .std are set, with shape (G, M, U), for each magnitude bin.

disagg_by_magi(imtls, rlzs, rwdic, src_mutex, mon0, mon1, mon2, mon3)[source]#
Parameters:
  • imtls – a dictionary imt->imls

  • rlzs – an array of realization indices

  • rwdic – a dictionary rlz_id->weight; if non-empty, it is used to compute the mean

  • src_mutex – dictionary used to set the self.src_mutex slices

Yields:

a dictionary with keys trti, magi, sid, rlzi, mean for each magi

disagg_mag_dist_eps(imldic, rlz_weights, src_mutex={})[source]#
Parameters:
  • imldic – a dictionary imt->iml

  • src_mutex – a dictionary with keys src_id, weight or empty

  • rlz_weights – an array with the realization weights

Returns:

a 4D matrix of rates of shape (Ma, D, E, M)

init(magi, src_mutex, mon0=<Monitor disagg mean_stds[runner]>, mon1=<Monitor disagg by eps[runner]>, mon2=<Monitor composing pnes[runner]>, mon3=<Monitor disagg matrix[runner]>)[source]#
openquake.hazardlib.calc.disagg.assert_same_shape(arrays)[source]#

Raises an AssertionError if the shapes are not consistent
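
A minimal illustration (the arrays below are arbitrary placeholders):

import numpy
from openquake.hazardlib.calc.disagg import assert_same_shape

assert_same_shape([numpy.zeros((2, 3)), numpy.ones((2, 3))])  # consistent shapes, no error
# assert_same_shape([numpy.zeros((2, 3)), numpy.zeros((3, 2))])  # would raise an AssertionError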

openquake.hazardlib.calc.disagg.collect_std(disaggs)[source]#
Returns:

an array of shape (Ma, D, M’, G)

openquake.hazardlib.calc.disagg.disagg_source(groups, site, reduced_lt, edges_shapedic, oq, imldic, monitor=<Monitor [runner]>)[source]#

Compute disaggregation for the given source.

Parameters:
  • groups – groups containing a single source ID

  • site – a Site object

  • reduced_lt – a FullLogicTree reduced to the source ID

  • edges_shapedic – pair (bin_edges, shapedic)

  • oq – OqParam instance

  • imldic – dictionary imt->iml

  • monitor – a Monitor instance

Returns:

sid, src_id, std(Ma, D, G, M), rates(Ma, D, E, M), rates(M, L1)

openquake.hazardlib.calc.disagg.disaggregation(sources, site, imt, iml, gsim_by_trt, truncation_level, n_epsilons=None, mag_bin_width=None, dist_bin_width=None, coord_bin_width=None, source_filter=<openquake.hazardlib.calc.filters.SourceFilter object>, epsstar=False, bin_edges={}, **kwargs)[source]#

Compute “Disaggregation” matrix representing conditional probability of an intensity measure type imt exceeding, at least once, an intensity measure level iml at a geographical location site, given rupture scenarios classified in terms of:

  • rupture magnitude

  • Joyner-Boore distance from rupture surface to site

  • longitude and latitude of the surface projection of a rupture’s point closest to site

  • epsilon: number of standard deviations by which an intensity measure level deviates from the median value predicted by a GSIM, given the rupture parameters

  • rupture tectonic region type

In other words, the disaggregation matrix makes it possible to compute the probability that each scenario with the specified properties (e.g., magnitude, or magnitude and distance) causes one or more exceedances of a given hazard level.

For more detailed information about the disaggregation, see for instance “Disaggregation of Seismic Hazard”, Paolo Bazzurro, C. Allin Cornell, Bulletin of the Seismological Society of America, Vol. 89, pp. 501-520, April 1999.

Parameters:
  • sources – Seismic source model; as for the PSHA calculator, it should be an iterator of seismic sources.

  • site – Site of interest to calculate the disaggregation matrix for.

  • imt – Instance of intensity measure type class.

  • iml – Intensity measure level. A float value in units of imt.

  • gsim_by_trt – Tectonic region type to GSIM objects mapping.

  • truncation_level – Float, number of standard deviations for truncation of the intensity distribution.

  • n_epsilons – Integer number of epsilon histogram bins in the result matrix.

  • mag_bin_width – Magnitude discretization step, width of one magnitude histogram bin.

  • dist_bin_width – Distance histogram discretization step, in km.

  • coord_bin_width – Longitude and latitude histograms discretization step, in decimal degrees.

  • source_filter – Optional source-site filter function. See openquake.hazardlib.calc.filters.

  • epsstar – A boolean. When true, disaggregation results including epsilon are in terms of epsilon star rather than epsilon.

  • bin_edges – Bin edges provided by the user. These override the ones automatically computed by the OQ Engine.

Returns:

A tuple of two items. The first is itself a tuple of bin edge information for (in the specified order) magnitude, distance, longitude, latitude, epsilon and tectonic region types.

The second item is a 6d array representing the full disaggregation matrix. Dimensions are in the same order as the bin edges in the first item of the result tuple. The matrix can be used directly by the pmf-extractor functions.
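
A hedged sketch of how the two return items are typically unpacked; sources, site and gsim_by_trt are assumed to come from an existing hazard model and the numeric values below are placeholders:

from openquake.hazardlib.imt import PGA
from openquake.hazardlib.calc.disagg import disaggregation

# `sources`, `site` and `gsim_by_trt` are assumed to already exist
bin_edges, matrix = disaggregation(
    sources, site, PGA(), iml=0.2, gsim_by_trt=gsim_by_trt,
    truncation_level=3.0, n_epsilons=4, mag_bin_width=0.5,
    dist_bin_width=10.0, coord_bin_width=0.5)
mag_edges, dist_edges, lon_edges, lat_edges, eps_edges, trts = bin_edges
# matrix has 6 dimensions: magnitude, distance, longitude, latitude, epsilon, trt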

openquake.hazardlib.calc.disagg.get_edges_shapedic(oq, sitecol, num_tot_rlzs=None)[source]#
Returns:

(mag dist lon lat eps trt) edges and shape dictionary

openquake.hazardlib.calc.disagg.get_eps4(eps_edges, truncation_level)[source]#
Returns:

eps_min, eps_max, eps_bands, eps_cum

openquake.hazardlib.calc.disagg.get_ints(src_ids)[source]#
Returns:

array of integers from source IDs following the colon convention

openquake.hazardlib.calc.disagg.lon_lat_bins(lon, lat, size_km, coord_bin_width)[source]#

Define lon, lat bin edges for disaggregation histograms.

Parameters:
  • lon – longitude of the site

  • lat – latitude of the site

  • size_km – total size of the bins in km

  • coord_bin_width – bin width in degrees

Returns:

two arrays lon bins, lat bins
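
For instance, a sketch with arbitrary example values:

from openquake.hazardlib.calc.disagg import lon_lat_bins

# bins of total size 200 km around a site at lon=10.0, lat=45.0,
# with a coordinate bin width of 0.5 degrees
lon_bins, lat_bins = lon_lat_bins(10.0, 45.0, 200.0, 0.5)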

openquake.hazardlib.calc.disagg.split_by_magbin(ctxt, mag_edges)[source]#
Parameters:
  • ctxt – a context array

  • mag_edges – magnitude bin edges

Returns:

a dictionary magbin -> ctxt

openquake.hazardlib.calc.disagg.uniform_bins(min_value, max_value, bin_width)[source]#

Returns an array of bins including all values:

>>> uniform_bins(1, 10, 1.)
array([ 1.,  2.,  3.,  4.,  5.,  6.,  7.,  8.,  9., 10.])
>>> uniform_bins(1, 10, 1.1)
array([ 0. ,  1.1,  2.2,  3.3,  4.4,  5.5,  6.6,  7.7,  8.8,  9.9, 11. ])

Filters (filters)#

exception openquake.hazardlib.calc.filters.FilteredAway[source]#

Bases: Exception

class openquake.hazardlib.calc.filters.IntegrationDistance[source]#

Bases: dict

A dictionary trt -> [(mag, dist), …]

cut(min_mag_by_trt)[source]#

Cut the lower magnitudes. For instance

>>> maxdist = IntegrationDistance.new('[(4., 50), (8., 200.)]')
>>> maxdist.cut({'default': 5.})
>>> maxdist
{'default': [(5.0, 87.5), (8.0, 200.0)]}
>>> maxdist = IntegrationDistance.new('200')
>>> maxdist.cut({"Active Shallow Crust": 5.2, "default": 4.})
>>> maxdist
{'default': [(4.0, 200.0), (10.2, 200)], 'Active Shallow Crust': [(5.2, 200.0), (10.2, 200)]}
get_bounding_box(lon, lat, trt=None)[source]#

Build a bounding box around the given lon, lat by computing the maximum_distance at the given tectonic region type and magnitude.

Parameters:
  • lon – longitude

  • lat – latitude

  • trt – tectonic region type, possibly None

Returns:

min_lon, min_lat, max_lon, max_lat

get_dist_bins(trt, nbins=51)[source]#
Returns:

an array of distance bins, from 10m to maxdist

classmethod new(value)[source]#
Parameters:

value – string to be converted

Returns:

IntegrationDistance dictionary

>>> md = IntegrationDistance.new('50')
>>> md
{'default': [(2.5, 50), (10.2, 50)]}
class openquake.hazardlib.calc.filters.SourceFilter(sitecol, integration_distance={'default': [(2.5, 1000), (10.2, 1000)]})[source]#

Bases: object

Filter objects have a .filter method yielding the filtered sources and the IDs of the sites within the given maximum distance. The filtering uses self.sitecol.within_bbox, which is based on numpy.

close_sids(src_or_rec, trt=None, maxdist=None)[source]#
Parameters:
  • src_or_rec – a source or a rupture record

  • trt – passed only if src_or_rec is a rupture record

Returns:

the site indices within the maximum_distance of the hypocenter, plus the maximum size of the bounding box

filter(sources)[source]#
Parameters:

sources – a sequence of sources

Yields:

pairs (sources, sites)
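
A minimal sketch of the typical filtering loop; sitecol and sources are assumed to come from an existing hazard model:

from openquake.hazardlib.calc.filters import SourceFilter, IntegrationDistance

# `sitecol` and `sources` are assumed to already exist
srcfilter = SourceFilter(sitecol, IntegrationDistance.new('200'))
for srcs, sites in srcfilter.filter(sources):
    pass  # process only the sources with at least one site within the maximum distance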

get_close(secparams)[source]#
Parameters:

secparams – a structured array with fields tl0, tl1, tr0, tr1

Returns:

an array with the number of close sites per secparams

get_close_sites(source)[source]#

Returns the sites within the integration distance from the source, or None.

get_enlarged_box(src, maxdist=None)[source]#

Get the enlarged bounding box of a source.

Parameters:
  • src – a source object

  • maxdist – a scalar maximum distance (or None)

Returns:

a bounding box (min_lon, min_lat, max_lon, max_lat)

get_nsites(rups, trt)[source]#
Returns:

array of float32 with the number of close sites per rupture

get_rectangle(src)[source]#
Parameters:

src – a source object

Returns:

((min_lon, min_lat), width, height), useful for plotting

reduce(multiplier=5)[source]#

Reduce the SourceFilter to a subset of sites

split(sources)[source]#
Yields:

pairs (split, sites)

openquake.hazardlib.calc.filters.context(src)[source]#

Used to add the source_id to the error message. To be used as

with context(src):
    operation_with(src)

Typically the operation is filtering a source, which can fail for tricky geometries.

openquake.hazardlib.calc.filters.floatdict(value)[source]#
Parameters:

value – input string corresponding to a literal Python number or dictionary

Returns:

a Python dictionary key -> number

>>> floatdict("200")
{'default': 200}
>>> floatdict("{'active shallow crust': 250., 'default': 200}")
{'active shallow crust': 250.0, 'default': 200}
openquake.hazardlib.calc.filters.get_distances(rupture, sites, param)[source]#
Parameters:
  • rupture – a rupture

  • sites – a mesh of points or a site collection

  • param – the kind of distance to compute (default rjb)

  • dcache – distance cache dictionary or None if disabled

Returns:

an array of distances from the given sites

openquake.hazardlib.calc.filters.get_dparam(surface, sites, param)[source]#
openquake.hazardlib.calc.filters.getdefault(dic_with_default, key)[source]#
Parameters:
  • dic_with_default – a dictionary with a ‘default’ key

  • key – a key that may be present in the dictionary or not

Returns:

the value associated to the key, or to ‘default’
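
For instance, assuming a dictionary with a 'default' key:

from openquake.hazardlib.calc.filters import getdefault

dic = {'Active Shallow Crust': 250, 'default': 200}
getdefault(dic, 'Active Shallow Crust')  # -> 250
getdefault(dic, 'Stable Shallow Crust')  # -> 200, falls back to 'default'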

openquake.hazardlib.calc.filters.magdepdist(pairs)[source]#
Parameters:

pairs – a list of pairs [(mag, dist), …]

Returns:

a scipy.interpolate.interp1d function
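
A small sketch with arbitrary pairs:

from openquake.hazardlib.calc.filters import magdepdist

# maximum distance of 50 km at magnitude 4 and 200 km at magnitude 8
interp = magdepdist([(4.0, 50.0), (8.0, 200.0)])
interp(6.0)  # interpolated maximum distance at magnitude 6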

openquake.hazardlib.calc.filters.magstr(mag)[source]#
Returns:

a string representation of the magnitude

openquake.hazardlib.calc.filters.split_source(src)[source]#
Parameters:

src – a splittable (or not splittable) source

Returns:

the underlying sources (or the source itself)

openquake.hazardlib.calc.filters.unique_sorted(items)[source]#

Check that the items are unique and sorted

openquake.hazardlib.calc.filters.upper_maxdist(idist)[source]#
Returns:

the maximum distance in a dictionary trt->dists

Ground Motion Fields (gmf)#

Module gmf exports ground_motion_fields().

exception openquake.hazardlib.calc.gmf.CorrelationButNoInterIntraStdDevs(corr, gsim)[source]#

Bases: Exception

class openquake.hazardlib.calc.gmf.GmfComputer(rupture, sitecol, cmaker, correlation_model=None, cross_correl=None, amplifier=None, sec_perils=())[source]#

Bases: object

Given an earthquake rupture, the GmfComputer computes ground shaking over a set of sites, by randomly sampling a ground shaking intensity model.

Parameters:
  • rupture – EBRupture to calculate ground motion fields radiated from.

  • sitecol – a complete openquake.hazardlib.site.SiteCollection

  • cmaker – an openquake.hazardlib.gsim.base.ContextMaker instance

  • correlation_model – Instance of a spatial correlation model object. See openquake.hazardlib.correlation. Can be None, in which case non-correlated ground motion fields are calculated. Correlation model is not used if truncation_level is zero.

  • cross_correl – Instance of a cross correlation model object. See openquake.hazardlib.cross_correlation. Can be None, in which case non-cross-correlated ground motion fields are calculated.

  • amplifier – None or an instance of Amplifier

  • sec_perils – Tuple of secondary perils. See openquake.hazardlib.sep. Can be None, in which case no secondary perils need to be evaluated.

build_sig_eps(se_dt)[source]#
Returns:

a structured array of size E with fields (eid, rlz_id, sig_inter_IMT, eps_inter_IMT)

compute_all(mean_stds=None, max_iml=None, mmon=<Monitor [runner]>, cmon=<Monitor [runner]>, umon=<Monitor [runner]>)[source]#
Returns:

DataFrame with fields eid, rlz, sid, gmv_X, …

init_eid_rlz_sig_eps()[source]#

Initialize the attributes eid, rlz, sig, eps with shapes E, E, EM, EM

mtp_dt = dtype([('rup_id', '<i8'), ('site_id', '<u4'), ('gsim_id', '<u2'), ('imt_id', 'u1'), ('mea', '<f4'), ('tau', '<f4'), ('phi', '<f4')])#
strip_zeros(data)[source]#
Returns:

a DataFrame with the nonzero GMVs

update(data, array, rlzs, mean, max_iml=None)[source]#

Updates the data dictionary with the values coming from the array of GMVs. Also indirectly updates the arrays .sig and .eps.

openquake.hazardlib.calc.gmf.build_eid_sid_rlz(allrlzs, sids, eids, rlzs)[source]#
openquake.hazardlib.calc.gmf.calc_gmf_simplified(ebrupture, sitecol, cmaker)[source]#

A simplified version of the GmfComputer for event based calculations. Used only for pedagogical purposes. Here is an example of usage:

from unittest.mock import Mock
import numpy
from openquake.hazardlib import valid, contexts, site, geo
from openquake.hazardlib.source.rupture import EBRupture, build_planar
from openquake.hazardlib.calc.gmf import calc_gmf_simplified, GmfComputer

imts = ['PGA']
rlzs = numpy.arange(3, dtype=numpy.uint32)
rlzs_by_gsim = {valid.gsim('BooreAtkinson2008'): rlzs}
lons = [0., 0.]
lats = [0., 1.]
siteparams = Mock(reference_vs30_value=760.)
sitecol = site.SiteCollection.from_points(lons, lats, sitemodel=siteparams)
hypo = geo.point.Point(0, .5, 20)
rup = build_planar(hypo, mag=7., rake=0.)
cmaker = contexts.simple_cmaker(rlzs_by_gsim, imts, truncation_level=3.)
ebr = EBRupture(rup, 0, 0, n_occ=2, id=1)
ebr.seed = 42
print(cmaker)
print(sitecol.array)
print(ebr)

gmfa = calc_gmf_simplified(ebr, sitecol, cmaker)
print(gmfa)  # numbers considering the full site collection
sites = site.SiteCollection.from_points([0], [1], sitemodel=siteparams)
gmfa = calc_gmf_simplified(ebr, sites, cmaker)
print(gmfa)  # different numbers considering half of the site collection
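
The same objects can drive the full GmfComputer; a minimal sketch reusing ebr, sitecol and cmaker from the example above:

gc = GmfComputer(ebr, sitecol, cmaker)
gmf_df = gc.compute_all()  # DataFrame with columns eid, rlz, sid and one gmv_ column per IMT
print(gmf_df)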

openquake.hazardlib.calc.gmf.exp(vals, notMMI)[source]#

Exponentiate the values unless the IMT is MMI

openquake.hazardlib.calc.gmf.ground_motion_fields(rupture, sites, imts, gsim, truncation_level, realizations, correlation_model=None, seed=0)[source]#

Given an earthquake rupture, the ground motion field calculator computes ground shaking over a set of sites, by randomly sampling a ground shaking intensity model. A ground motion field represents a possible ‘realization’ of the ground shaking due to an earthquake rupture.

Note

This calculator uses random numbers. In order to reproduce the same results, the numpy random number generator needs to be seeded, see http://docs.scipy.org/doc/numpy/reference/generated/numpy.random.seed.html

Parameters:
  • rupture (openquake.hazardlib.source.rupture.Rupture) – Rupture to calculate ground motion fields radiated from.

  • sites (openquake.hazardlib.site.SiteCollection) – Sites of interest to calculate GMFs.

  • imts – List of intensity measure type objects (see openquake.hazardlib.imt).

  • gsim – Ground-shaking intensity model, instance of subclass of either GMPE or IPE.

  • truncation_level – Float, number of standard deviations for truncation of the intensity distribution

  • realizations – Integer number of GMF simulations to compute.

  • correlation_model – Instance of correlation model object. See openquake.hazardlib.correlation. Can be None, in which case non-correlated ground motion fields are calculated. Correlation model is not used if truncation_level is zero.

  • seed (int) – The seed used in the numpy random number generator

Returns:

Dictionary mapping intensity measure type objects (same as in parameter imts) to 2d numpy arrays of floats, representing different simulations of ground shaking intensity for all sites in the collection. First dimension represents sites and second one is for simulations.
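
A hedged usage sketch; rupture, sites and gsim are assumed to be already built:

from openquake.hazardlib.imt import PGA
from openquake.hazardlib.calc.gmf import ground_motion_fields

# `rupture`, `sites` and `gsim` are assumed to already exist
gmfs = ground_motion_fields(rupture, sites, [PGA()], gsim,
                            truncation_level=3.0, realizations=10, seed=42)
gmfs[PGA()].shape  # (number of sites, number of realizations)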

openquake.hazardlib.calc.gmf.set_max_min(array, mean, max_iml, min_iml, mmi_index)[source]#

Hazard Curves (hazard_curve)#

openquake.hazardlib.calc.hazard_curve implements calc_hazard_curves(). Here is an example of a classical PSHA parallel calculator computing the hazard curves for each realization in less than 20 lines of code:

import sys
from openquake.commonlib import logs
from openquake.calculators.base import calculators

def main(job_ini):
    with logs.init(job_ini) as log:
        calc = calculators(log.get_oqparam(), log.calc_id)
        calc.run(individual_rlzs='true', shutdown=True)
        print('The hazard curves are in %s::/hcurves-rlzs'
             % calc.datastore.filename)

if __name__ == '__main__':
    main(sys.argv[1])  # path to a job.ini file

NB: the implementation in the engine is smarter and more efficient. Here we start a parallel computation for each realization, whereas the engine manages all the realizations at once.

openquake.hazardlib.calc.hazard_curve.calc_hazard_curve(site1, src, gsims, oqparam, monitor=<Monitor [runner]>)[source]#
Parameters:
  • site1 – site collection with a single site

  • src – a seismic source object

  • gsims – a list of GSIM objects

  • oqparam – an object with attributes .maximum_distance, .imtls

  • monitor – a Monitor instance (optional)

Returns:

an array of shape (L, G)

openquake.hazardlib.calc.hazard_curve.calc_hazard_curves(groups, srcfilter, imtls, gsim_by_trt, truncation_level=99.0, apply=<function sequential_apply>, reqv=None, **kwargs)[source]#

Compute hazard curves on a list of sites, given a set of seismic source groups and a dictionary of ground shaking intensity models (one per tectonic region type).

The probability of ground motion exceedance is computed in different ways depending on whether the sources are independent or mutually exclusive.

Parameters:
  • groups – A sequence of groups of seismic source objects (instances of BaseSeismicSource).

  • srcfilter – A source filter over the site collection or the site collection itself

  • imtls – Dictionary mapping intensity measure type strings to lists of intensity measure levels.

  • gsim_by_trt – Dictionary mapping tectonic region types (members of openquake.hazardlib.const.TRT) to GMPE or IPE objects.

  • truncation_level – Float, number of standard deviations for truncation of the intensity distribution.

  • apply – apply function to use (default sequential_apply)

  • reqv – If not None, an instance of RjbEquivalent

Returns:

An array of size N, where N is the number of sites, whose elements are records with fields given by the intensity measure types; the size of each field is given by the number of levels in imtls.
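
A hedged usage sketch; groups, sitecol and gsim_by_trt are assumed to come from an existing hazard model, and the intensity levels below are placeholders:

from openquake.hazardlib.calc.hazard_curve import calc_hazard_curves

imtls = {'PGA': [0.01, 0.02, 0.05, 0.1, 0.2, 0.5]}
curves = calc_hazard_curves(groups, sitecol, imtls, gsim_by_trt,
                            truncation_level=3.0)
curves['PGA']  # one row of exceedance probabilities per site, one column per level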

openquake.hazardlib.calc.hazard_curve.classical(group, sitecol, cmaker)[source]#

Compute the hazard curves for a set of sources belonging to the same tectonic region type for all the GSIMs associated to that TRT. The arguments are the same as in calc_hazard_curves(), except for gsims, which is a list of GSIM instances.

Returns:

a dictionary with keys pmap, source_data, rup_data, extra

Stochastic Event Set (stochastic)#

openquake.hazardlib.calc.stochastic contains stochastic_event_set().

openquake.hazardlib.calc.stochastic.get_rup_array(ebruptures, srcfilter=<openquake.hazardlib.calc.filters.SourceFilter object>, model='???', model_geom=None)[source]#

Convert a list of EBRuptures into a numpy composite array, by filtering out the ruptures far away from every site. If a shapely polygon is passed in model_geom, ruptures outside the polygon are discarded.

openquake.hazardlib.calc.stochastic.sample_cluster(group, num_ses, ses_seed)[source]#

Yields ruptures generated by a cluster of sources

Parameters:
  • group – A sequence of sources of the same group

  • num_ses – Number of stochastic event sets

  • ses_seed – Global seed for rupture sampling

Yields:

dictionaries with keys rup_array, source_data, eff_ruptures

openquake.hazardlib.calc.stochastic.sample_ruptures(sources, cmaker, sitecol=None, monitor=<Monitor [runner]>)[source]#
Parameters:
  • sources – a sequence of sources of the same group

  • cmaker – a ContextMaker instance with ses_per_logic_tree_path, ses_seed

  • sitecol – SiteCollection instance used for filtering (None for no filtering)

  • monitor – monitor instance

Yields:

dictionaries with keys rup_array, source_data
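
A hedged sketch of consuming the generator; sources, cmaker and sitecol are assumed to come from an existing event based configuration, with cmaker carrying ses_per_logic_tree_path and ses_seed:

from openquake.hazardlib.calc.stochastic import sample_ruptures

for dic in sample_ruptures(sources, cmaker, sitecol):
    rup_array = dic['rup_array']      # composite array of sampled ruptures
    source_data = dic['source_data']  # per-source bookkeeping information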