openquake.calculators package

openquake.calculators.base module

exception openquake.calculators.base.AssetSiteAssociationError[source]

Bases: exceptions.Exception

Raised when there are no hazard sites close enough to any asset

class openquake.calculators.base.BaseCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: object

Abstract base class for all calculators.

Parameters:
  • oqparam – OqParam object
  • monitor – monitor object
  • calc_id – numeric calculation ID
assetcol
before_export()[source]

Collect the realizations and set the nbytes attributes

core_task(*args)[source]

Core routine running on the workers.

csm
execute()[source]

Execution phase. Usually will run in parallel the core function and return a dictionary with the results.

export(exports=None)[source]

Export all the outputs in the datastore in the given export formats.

Returns:dictionary output_key -> sorted list of exported paths
is_stochastic = False
performance
post_execute(result)[source]

Post-processing phase of the aggregated output. It must be overridden with the export code. It will return a dictionary of output files.

pre_calculator = None
pre_execute()[source]

Initialization phase.

run(pre_execute=True, concurrent_tasks=None, close=True, **kw)[source]

Run the calculation and return the exported outputs.
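The lifecycle documented above (pre_execute, execute, post_execute, chained by run) follows the template-method pattern. A minimal sketch, not the actual engine code and omitting monitoring, export and parallelization, could look like this:

```python
# Hypothetical sketch of the calculator lifecycle: run() fixes the
# control flow, subclasses override the individual phases.
class SketchCalculator:
    def pre_execute(self):
        # initialization phase
        self.inputs = [1, 2, 3]

    def execute(self):
        # execution phase; here serial, the engine runs core tasks in parallel
        return {'result': sum(self.inputs)}

    def post_execute(self, result):
        # post-processing phase of the aggregated output
        self.saved = result

    def run(self, pre_execute=True):
        if pre_execute:
            self.pre_execute()
        result = self.execute()
        self.post_execute(result)
        return self.saved

calc = SketchCalculator()
print(calc.run())  # -> {'result': 6}
```

Concrete calculators such as HazardCalculator override the individual phases while the overall control flow stays fixed.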

save_params(**kw)[source]

Update the current calculation parameters and save engine_version

set_log_format()[source]

Set the format of the root logger

sitecol
taxonomies[source]
class openquake.calculators.base.HazardCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.BaseCalculator

Base class for hazard calculators based on source models

assoc_assets_sites(sitecol)[source]
Parameters:sitecol – a sequence of sites
Returns:a pair (filtered_sites, assets_by_site)

The new site collection is different from the original one if some assets were discarded or if there were missing assets for some sites.

basic_pre_execute()[source]
compute_previous()[source]
count_assets()[source]

Count how many assets are taken into consideration by the calculator

init()[source]

To be overridden to initialize the datasets needed by the calculation

load_riskmodel()[source]

Read the risk model and set the attribute .riskmodel. The riskmodel can be empty for hazard calculations. Save the loss ratios (if any) in the datastore.

post_process()[source]

For compatibility with the engine

pre_execute()[source]

Check if there is a pre_calculator or a previous calculation ID. If yes, read the inputs by invoking the precalculator or by retrieving the previous calculation; if not, read the inputs directly.

read_exposure()[source]

Read the exposure, the riskmodel and update the attributes .exposure, .sitecol, .assets_by_site, .taxonomies.

read_previous(precalc_id)[source]
read_risk_data()[source]

Read the exposure (if any), the risk model (if any) and then the site collection, possibly extracted from the exposure.

save_data_transfer(iter_result)[source]

Save information about the data transfer in the risk calculation as attributes of agg_loss_table

exception openquake.calculators.base.InvalidCalculationID[source]

Bases: exceptions.Exception

Raised when running a post-calculation on top of an incompatible pre-calculation

class openquake.calculators.base.RiskCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Base class for all risk calculators. A risk calculator must set the attributes .riskmodel, .sitecol, .assets_by_site, .exposure, .riskinputs in the pre_execute phase.

build_riskinputs(hazards_by_key, eps=array([], dtype=float64))[source]
Parameters:
  • hazards_by_key – a dictionary key -> IMT -> array of length num_sites
  • eps – a matrix of epsilons (possibly empty)
Returns:

a list of RiskInputs objects, sorted by IMT.

check_poes(curves_by_trt_gsim)[source]

Overridden in ClassicalDamage

execute()[source]

Parallelize on the riskinputs and return a dictionary of results. Requires a .core_task to be defined with signature (riskinputs, riskmodel, rlzs_assoc, monitor).

extra_args = ()
make_eps(num_ruptures)[source]
Parameters:num_ruptures – the size of the epsilon array for each asset
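Only the signature of make_eps is documented here. As a hypothetical sketch, the epsilon matrix can be pictured as standard normal deviates with one row per asset and one column per rupture; the seeding below, and ignoring asset correlation, are assumptions of this sketch, not the engine's actual behaviour:

```python
import numpy

# Hedged sketch only: standard normal epsilons of shape
# (num_assets, num_ruptures), with a fixed seed for reproducibility.
# The real make_eps may sample and correlate differently.
def make_eps_sketch(num_assets, num_ruptures, seed=42):
    rng = numpy.random.RandomState(seed)
    return rng.normal(size=(num_assets, num_ruptures))

eps = make_eps_sketch(3, 5)
print(eps.shape)  # -> (3, 5)
```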
class openquake.calculators.base.Site

Bases: tuple

Site(sid, lon, lat)

lat

Alias for field number 2

lon

Alias for field number 1

sid

Alias for field number 0
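As the field-number aliases above indicate, Site is a plain namedtuple; an equivalent construction:

```python
from collections import namedtuple

# Equivalent to the documented Site tuple:
# field 0 is sid, field 1 is lon, field 2 is lat.
Site = namedtuple('Site', 'sid lon lat')

s = Site(sid=0, lon=9.15, lat=45.17)
print(s.sid, s.lon, s.lat)  # -> 0 9.15 45.17
print(s[1] == s.lon)        # -> True: lon is field number 1
```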

openquake.calculators.base.check_precalc_consistency(calc_mode, precalc_mode)[source]

Defensive programming against users providing an incorrect pre-calculation ID (with --hazard-calculation-id)

Parameters:
  • calc_mode – calculation_mode of the current calculation
  • precalc_mode – calculation_mode of the previous calculation
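A hedged sketch of this defensive check; the ALLOWED compatibility table below is purely illustrative, since the real engine determines the valid combinations itself:

```python
# Sketch only: raise InvalidCalculationID when the pre-calculation
# mode is not compatible with the current calculation mode.
class InvalidCalculationID(Exception):
    pass

ALLOWED = {'classical_risk': ('classical',)}  # made-up table for illustration

def check_precalc_consistency(calc_mode, precalc_mode):
    if precalc_mode not in ALLOWED.get(calc_mode, ()):
        raise InvalidCalculationID(
            'A %s calculation cannot start from a pre-calculation '
            'of kind %s' % (calc_mode, precalc_mode))

check_precalc_consistency('classical_risk', 'classical')  # passes silently
```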
openquake.calculators.base.check_time_event(oqparam, time_events)[source]

Check the time_event parameter in the datastore, by comparing with the periods found in the exposure.

openquake.calculators.base.gsim_names(rlz)[source]

Names of the underlying GSIMs separated by spaces

openquake.calculators.base.set_array(longarray, shortarray)[source]
Parameters:
  • longarray – a numpy array of floats of length L >= l
  • shortarray – a numpy array of floats of length l

Fill longarray with the values of shortarray, starting from the left. If shortarray is shorter than longarray, the remaining elements on the right are filled with numpy.nan values.
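The documented behaviour can be reimplemented in a few lines; this sketch follows the semantics above:

```python
import numpy

# Copy shortarray into longarray from the left, pad the rest with nan.
def set_array(longarray, shortarray):
    n = len(shortarray)
    longarray[:n] = shortarray
    longarray[n:] = numpy.nan

long = numpy.zeros(5)
set_array(long, numpy.array([1.0, 2.0, 3.0]))
# first three values copied, last two filled with numpy.nan
```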

openquake.calculators.calc module

openquake.calculators.classical module

class openquake.calculators.classical.BBdict[source]

Bases: openquake.baselib.general.AccumDict

A serializable dictionary containing bounding box information

dt = dtype([('lt_model_id', '<u2'), ('site_id', '<u2'), ('min_dist', '<f8'), ('max_dist', '<f8'), ('east', '<f8'), ('west', '<f8'), ('south', '<f8'), ('north', '<f8')])
class openquake.calculators.classical.BoundingBox(lt_model_id, site_id)[source]

Bases: object

A class to store the bounding box in distances, longitudes and magnitudes, given a source model and a site. This is used for disaggregation calculations. The goal is to determine the minimum and maximum distances of the ruptures generated from the model to the site; moreover, the minimum and maximum longitudes and magnitudes are stored, taking into account the international date line.

bins_edges(dist_bin_width, coord_bin_width)[source]

Define bin edges for disaggregation histograms, from the bin data collected from the ruptures.

Parameters:
  • dist_bin_width – distance_bin_width from job.ini
  • coord_bin_width – coordinate_bin_width from job.ini
update(dists, lons, lats)[source]

Compare the current bounding box with the value in the arrays dists, lons, lats and enlarge it if needed.

Parameters:
  • dists – a sequence of distances
  • lons – a sequence of longitudes
  • lats – a sequence of latitudes
update_bb(bb)[source]

Compare the current bounding box with the given bounding box and enlarge it if needed.

Parameters:bb – an instance of :class: openquake.engine.calculators.hazard.classical.core.BoundingBox
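A minimal sketch of the enlargement logic described above, tracking running min/max values (the real class also handles the international date line, which is omitted here):

```python
# Sketch only: enlarge the stored bounds whenever new rupture
# distances, longitudes and latitudes fall outside them.
class BoundingBoxSketch:
    def __init__(self):
        self.min_dist = self.max_dist = None
        self.west = self.east = self.south = self.north = None

    def update(self, dists, lons, lats):
        if self.min_dist is None:  # first update: initialize the bounds
            self.min_dist, self.max_dist = min(dists), max(dists)
            self.west, self.east = min(lons), max(lons)
            self.south, self.north = min(lats), max(lats)
        else:  # enlarge the bounds if needed
            self.min_dist = min(self.min_dist, *dists)
            self.max_dist = max(self.max_dist, *dists)
            self.west = min(self.west, *lons)
            self.east = max(self.east, *lons)
            self.south = min(self.south, *lats)
            self.north = max(self.north, *lats)

bb = BoundingBoxSketch()
bb.update([10.0, 30.0], [9.0, 9.5], [45.0, 45.3])
bb.update([5.0], [8.8], [45.6])  # enlarges min_dist, west and north
print(bb.min_dist, bb.west, bb.north)  # -> 5.0 8.8 45.6
```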
class openquake.calculators.classical.ClassicalCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.classical.PSHACalculator

Classical PSHA calculator

core_task(pmap_by_grp, sids, pstats, rlzs_assoc, monitor)
Parameters:
  • pmap_by_grp – dictionary of probability maps by source group ID
  • sids – array of site IDs
  • pstats – instance of PmapStats
  • rlzs_assoc – instance of RlzsAssoc
  • monitor – instance of Monitor
Returns:

a dictionary kind -> ProbabilityMap

The “kind” is a string of the form ‘rlz-XXX’, ‘mean’ or ‘quantile-XXX’, used to specify the kind of output.

execute()[source]

Builds hcurves and stats from the stored PoEs

gen_args(pmap_by_grp)[source]
Parameters:pmap_by_grp – dictionary of ProbabilityMaps keyed by src_grp_id
Yields:arguments for the function build_hcurves_and_stats
post_execute(acc)[source]

Save the number of bytes for each dataset

pre_calculator = 'psha'
save_hcurves(acc, pmap_by_kind)[source]

Works by side effect, saving hcurves and statistics on the datastore; the accumulator stores the number of bytes saved.

Parameters:
  • acc – dictionary kind -> nbytes
  • pmap_by_kind – a dictionary of ProbabilityMaps
class openquake.calculators.classical.HazardCurve

Bases: tuple

HazardCurve(location, poes)

location

Alias for field number 0

poes

Alias for field number 1

class openquake.calculators.classical.PSHACalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Classical PSHA calculator

agg_dicts(acc, val)[source]

Aggregate dictionaries of hazard curves by updating the accumulator.

Parameters:
  • acc – accumulator dictionary
  • val – a nested dictionary grp_id -> ProbabilityMap
core_task(sources, sitecol, gsims, monitor)
Parameters:
  • sources – a non-empty sequence of sources of homogeneous tectonic region type
  • sitecol – a SiteCollection instance
  • gsims – a list of GSIMs for the current tectonic region type
  • monitor – a monitor instance
Returns:

an AccumDict rlz -> curves

count_eff_ruptures(result_dict, src_group)[source]

Returns the number of ruptures in the src_group (after filtering) or 0 if the src_group has been filtered away.

Parameters:
  • result_dict – a dictionary with keys (grp_id, gsim)
  • src_group – a SourceGroup instance
execute()[source]

Run core_task(sources, sitecol, gsims, monitor) in parallel, distributing the sources according to their weight and tectonic region type.

gen_args(src_groups, oq, monitor)[source]

Used in the case of large source model logic trees.

Parameters:
Yields:

(sources, sites, gsims, monitor) tuples

post_execute(pmap_by_grp_id)[source]

Collect the hazard curves by realization and export them.

Parameters:pmap_by_grp_id – a dictionary grp_id -> hazard curves
source_info
store_source_info(infos)[source]
zerodict()[source]

Initial accumulator, a dict grp_id -> ProbabilityMap(L, G)

openquake.calculators.classical.build_hcurves_and_stats(pmap_by_grp, sids, pstats, rlzs_assoc, monitor)[source]
Parameters:
  • pmap_by_grp – dictionary of probability maps by source group ID
  • sids – array of site IDs
  • pstats – instance of PmapStats
  • rlzs_assoc – instance of RlzsAssoc
  • monitor – instance of Monitor
Returns:

a dictionary kind -> ProbabilityMap

The “kind” is a string of the form ‘rlz-XXX’, ‘mean’ or ‘quantile-XXX’, used to specify the kind of output.

openquake.calculators.classical.classical(sources, sitecol, gsims, monitor)[source]
Parameters:
  • sources – a non-empty sequence of sources of homogeneous tectonic region type
  • sitecol – a SiteCollection instance
  • gsims – a list of GSIMs for the current tectonic region type
  • monitor – a monitor instance
Returns:

an AccumDict rlz -> curves

openquake.calculators.classical.saving_sources_by_task(iterargs, dstore)[source]

Yield the iterargs again by populating ‘task_info/source_ids’

openquake.calculators.classical.split_filter_source(src, sites, ss_filter, random_seed)[source]
Parameters:
  • src – a heavy source
  • sites – sites affected by the source
  • ss_filter – a SourceSitesFilter instance
  • random_seed – used only for event based calculations
Returns:

a list of split sources

openquake.calculators.classical_bcr module

class openquake.calculators.classical_bcr.ClassicalBCRCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.classical_risk.ClassicalRiskCalculator

Classical BCR Risk calculator

core_task(riskinput, riskmodel, rlzs_assoc, bcr_dt, monitor)

Compute and return the average losses for each asset.

Parameters:
post_execute(result)[source]
pre_execute()[source]
openquake.calculators.classical_bcr.classical_bcr(riskinput, riskmodel, rlzs_assoc, bcr_dt, monitor)[source]

Compute and return the average losses for each asset.

Parameters:

openquake.calculators.classical_damage module

class openquake.calculators.classical_damage.ClassicalDamageCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.classical_risk.ClassicalRiskCalculator

Scenario damage calculator

check_poes(curves_by_trt_gsim)[source]

Raise an error if one PoE = 1, since it would produce a log(0) in openquake.risklib.scientific.annual_frequency_of_exceedence

core_task(riskinput, riskmodel, rlzs_assoc, monitor)

Core function for a classical damage computation.

Parameters:
Returns:

a nested dictionary rlz_idx -> asset -> <damage array>

damages
post_execute(result)[source]

Export the result in CSV format.

Parameters:result – a dictionary asset -> fractions per damage state
openquake.calculators.classical_damage.classical_damage(riskinput, riskmodel, rlzs_assoc, monitor)[source]

Core function for a classical damage computation.

Parameters:
Returns:

a nested dictionary rlz_idx -> asset -> <damage array>

openquake.calculators.classical_risk module

class openquake.calculators.classical_risk.ClassicalRiskCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Classical Risk calculator

avg_losses
core_task(riskinput, riskmodel, rlzs_assoc, monitor)

Compute and return the average losses for each asset.

Parameters:
post_execute(result)[source]

Save the losses in a compact form.

pre_calculator = 'classical'
pre_execute()[source]

Associate the assets to the sites and build the riskinputs.

save_loss_curves(result)[source]

Save the loss curves in the datastore.

Parameters:result – aggregated result of the task classical_risk
save_loss_maps(result)[source]

Save the loss maps in the datastore.

Parameters:result – aggregated result of the task classical_risk
openquake.calculators.classical_risk.by_l_assets(output)[source]
openquake.calculators.classical_risk.classical_risk(riskinput, riskmodel, rlzs_assoc, monitor)[source]

Compute and return the average losses for each asset.

Parameters:

openquake.calculators.disaggregation module

Disaggregation calculator core functionality

class openquake.calculators.disaggregation.BinData

Bases: tuple

BinData(mags, dists, lons, lats, trts, pnes)

dists

Alias for field number 1

lats

Alias for field number 3

lons

Alias for field number 2

mags

Alias for field number 0

pnes

Alias for field number 5

trts

Alias for field number 4

class openquake.calculators.disaggregation.DisaggregationCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.classical.ClassicalCalculator

Classical PSHA disaggregation calculator

agg_result(acc, result)[source]

Collect the results coming from compute_disagg into self.results, a dictionary with key (sid, rlz.id, poe, imt, iml, trt_names) and values which are probability arrays.

Parameters:
  • acc – dictionary accumulating the results
  • result – dictionary with the result coming from a task
full_disaggregation()[source]

Run the disaggregation phase after hazard curve finalization.

get_curves(sid)[source]

Get all the relevant hazard curves for the given site ordinal. Returns a dictionary {(rlz_id, imt) -> curve}.

post_execute(nbytes_by_kind)[source]

Performs the disaggregation

save_disagg_result(site_id, bin_edges, trt_names, matrix, rlz_id, investigation_time, imt_str, iml, poe)[source]

Save a computed disaggregation matrix to hzrdr.disagg_result (see DisaggResult).

Parameters:
  • site_id – id of the current site
  • bin_edges – The 5-tuple (mag, dist, lon, lat, eps)
  • trt_names – The list of Tectonic Region Types
  • matrix – A probability array
  • rlz_id – ordinal of the realization to which the results belong.
  • investigation_time (float) – Investigation time (years) for the calculation.
  • imt_str – Intensity measure type string (PGA, SA, etc.)
  • iml (float) – Intensity measure level interpolated (using poe) from the hazard curve at the site.
  • poe (float) – Disaggregation probability of exceedance value for this result.
save_disagg_results(results)[source]

Save all the results of the disaggregation. NB: the number of results to save is #sites * #rlzs * #disagg_poes * #IMTs.

Parameters:results – a dictionary of probability arrays
openquake.calculators.disaggregation.compute_disagg(sitecol, sources, src_group_id, rlzs_assoc, trt_names, curves_dict, bin_edges, oqparam, monitor)[source]
Parameters:
  • sitecol – a openquake.hazardlib.site.SiteCollection instance
  • sources – list of hazardlib source objects
  • src_group_id – numeric ID of a SourceGroup instance
  • rlzs_assoc – a openquake.commonlib.source.RlzsAssoc instance
  • trt_names – a tuple of names for the given tectonic region types
  • curves_dict – a dictionary with the hazard curves for sites, realizations and IMTs
  • bin_edges – a dictionary site_id -> edges
  • oqparam – the parameters in the job.ini file
  • monitor – monitor of the currently running job
Returns:

a dictionary of probability arrays, with composite key (sid, rlz.id, poe, imt, iml, trt_names).

openquake.calculators.event_based module

class openquake.calculators.event_based.EBRupture(rupture, indices, events, source_id, grp_id, serial)[source]

Bases: object

An event based rupture. It is a wrapper over a hazardlib rupture object, containing an array of site indices affected by the rupture, as well as the tags of the corresponding seismic events.

eids[source]

An array with the underlying event IDs

etags[source]

An array of tags for the underlying seismic events

export(mesh)[source]

Yield openquake.commonlib.util.Rupture objects, with all the attributes set, suitable for export in XML format.

multiplicity[source]

How many times the underlying rupture occurs.

class openquake.calculators.event_based.EventBasedCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.classical.ClassicalCalculator

Event based PSHA calculator generating the ground motion fields and the hazard curves from the ruptures, depending on the configuration parameters.

combine_pmaps_and_save_gmfs(acc, res)[source]

Combine the hazard curves (if any) and save the gmfs (if any) sequentially; notice that the gmfs may come from different tasks in any order.

Parameters:
  • acc – an accumulator for the hazard curves
  • res – a dictionary rlzi, imt -> [gmf_array, curves_by_imt]
Returns:

a new accumulator

core_task(getter, rlzs, monitor)
Parameters:
  • eb_ruptures – a list of blocks of EBRuptures of the same SESCollection
  • sitecol – a openquake.hazardlib.site.SiteCollection instance
  • imts – a list of intensity measure type strings
  • rlzs_by_gsim – a dictionary gsim -> associated realizations
  • monitor – a Monitor instance
Returns:

a dictionary with keys gmfcoll and hcurves

execute()[source]

Run core_task in parallel, parallelizing over the ruptures according to their weight and tectonic region type.

gen_args(ebruptures)[source]
Parameters:ebruptures – a list of EBRupture objects to be split
Yields:the arguments for compute_gmfs_and_curves
is_stochastic = True
post_execute(result)[source]
Parameters:result – a dictionary (src_group_id, gsim) -> haz_curves or an empty dictionary if hazard_curves_from_gmfs is false
pre_calculator = 'event_based_rupture'
class openquake.calculators.event_based.EventBasedRuptureCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.classical.PSHACalculator

Event based PSHA calculator generating the ruptures only

agg_dicts(acc, ruptures_by_grp_id)[source]

Accumulate dictionaries of ruptures and populate the events dataset in the datastore.

Parameters:
  • acc – accumulator dictionary
  • ruptures_by_grp_id – a nested dictionary grp_id -> ruptures
core_task(sources, sitecol, gsims, monitor)
Parameters:
  • sources – List of commonlib.source.Source tuples
  • sitecol – a openquake.hazardlib.site.SiteCollection instance
  • gsims – a list of GSIMs for the current tectonic region model
  • monitor – monitor instance
Returns:

a dictionary src_group_id -> [Rupture instances]

count_eff_ruptures(ruptures_by_grp_id, src_group)[source]

Returns the number of ruptures sampled in the given src_group.

Parameters:
  • ruptures_by_grp_id – a dictionary with key grp_id
  • src_group – a SourceGroup instance
init()[source]

Set the random seed passed to the SourceManager and the minimum_intensity dictionary.

is_stochastic = True
post_execute(result)[source]

Save the SES collection

save_ruptures(ruptures_by_grp_id)[source]

Extend the ‘events’ dataset with the given ruptures

zerodict()[source]

Initial accumulator, a dictionary (grp_id, gsim) -> curves

openquake.calculators.event_based.build_eb_ruptures(src, num_occ_by_rup, rupture_filter, random_seed, rup_mon)[source]

Filter the ruptures stored in the dictionary num_occ_by_rup and yield pairs (rupture, <list of associated EBRuptures>)

openquake.calculators.event_based.compute_gmfs_and_curves(getter, rlzs, monitor)[source]
Parameters:
  • eb_ruptures – a list of blocks of EBRuptures of the same SESCollection
  • sitecol – a openquake.hazardlib.site.SiteCollection instance
  • imts – a list of intensity measure type strings
  • rlzs_by_gsim – a dictionary gsim -> associated realizations
  • monitor – a Monitor instance
Returns:

a dictionary with keys gmfcoll and hcurves

openquake.calculators.event_based.compute_ruptures(sources, sitecol, gsims, monitor)[source]
Parameters:
  • sources – List of commonlib.source.Source tuples
  • sitecol – a openquake.hazardlib.site.SiteCollection instance
  • gsims – a list of GSIMs for the current tectonic region model
  • monitor – monitor instance
Returns:

a dictionary src_group_id -> [Rupture instances]

openquake.calculators.event_based.get_geom(surface, is_from_fault_source, is_multi_surface)[source]

The following fields can be interpreted in different ways, depending on the value of is_from_fault_source.

If is_from_fault_source is True, each of these fields should contain a 2D numpy array (all of the same shape). Each triple (lon, lat, depth) for a given index represents a node of a rectangular mesh.

If is_from_fault_source is False, each of these fields should contain a sequence (tuple, list or numpy array, for example) of 4 values. In order, the triples (lon, lat, depth) represent the top left, top right, bottom left and bottom right corners of the rupture’s planar surface.

There is also a third case: if the rupture originated from a characteristic fault source with a multi-planar-surface geometry, lons, lats and depths will contain one or more sets of 4 points, similar to how planar surface geometry is stored (see above).

Parameters:
  • rupture – an instance of openquake.hazardlib.source.rupture.BaseProbabilisticRupture
  • is_from_fault_source – a boolean
  • is_multi_surface – a boolean
openquake.calculators.event_based.get_mean_curves(dstore)[source]

Extract the mean hazard curves from the datastore, as a composite array of length nsites.

openquake.calculators.event_based.sample_ruptures(src, num_ses, num_samples, seed)[source]

Sample the ruptures contained in the given source.

Parameters:
  • src – a hazardlib source object
  • num_ses – the number of Stochastic Event Sets to generate
  • num_samples – how many samples for the given source
  • seed – master seed from the job.ini file
Returns:

a dictionary of dictionaries rupture -> {ses_id: num_occurrences}
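A hedged sketch of a sampling scheme consistent with the documented return value, assuming Poissonian occurrence rates; the rupture identifiers and rates below are placeholders, not the hazardlib API:

```python
import numpy

# Sketch only: for each rupture, draw a Poisson number of occurrences
# per Stochastic Event Set and keep only the SESs where it occurs.
def sample_ruptures_sketch(rupture_rates, num_ses, seed):
    rng = numpy.random.RandomState(seed)
    num_occ_by_rup = {}
    for rup, rate in rupture_rates.items():
        occ = {ses_id: int(n)
               for ses_id, n in enumerate(rng.poisson(rate, num_ses), 1)
               if n > 0}
        if occ:
            num_occ_by_rup[rup] = occ  # rupture -> {ses_id: num_occurrences}
    return num_occ_by_rup

sampled = sample_ruptures_sketch(
    {'rup-1': 0.5, 'rup-2': 0.01}, num_ses=10, seed=1)
```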

openquake.calculators.event_based_risk module

class openquake.calculators.event_based_risk.EbriskCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Event based PSHA calculator generating the total losses by taxonomy

build_starmap(ssm, sitecol, assetcol, riskmodel, imts, trunc_level, correl_model, min_iml, monitor)[source]
Parameters:
  • ssm – CompositeSourceModel containing a single source model
  • sitecol – a SiteCollection instance
  • assetcol – an AssetCollection instance
  • riskmodel – a RiskModel instance
  • imts – a list of Intensity Measure Types
  • trunc_level – truncation level
  • correl_model – correlation model
  • min_iml – vector of minimum intensities, one per IMT
  • monitor – a Monitor instance
Returns:

a pair (starmap, dictionary)

static compute_ruptures(sources, sitecol, gsims, monitor)
Parameters:
  • sources – List of commonlib.source.Source tuples
  • sitecol – a openquake.hazardlib.site.SiteCollection instance
  • gsims – a list of GSIMs for the current tectonic region model
  • monitor – monitor instance
Returns:

a dictionary src_group_id -> [Rupture instances]

execute()[source]

Run the calculator and aggregate the results

gen_args()[source]

Yield the arguments required by build_starmap, i.e. the source models, the asset collection, the riskmodel and others.

is_stochastic = True
post_execute(num_events)[source]

Save an array of losses by taxonomy of shape (T, L, R).

pre_calculator = None
save_agglosses(agglosses, offset)[source]

Save the event loss tables incrementally.

Parameters:
  • agglosses – a dictionary lr -> {eid: loss}
  • offset – realization offset
save_results(allres, num_rlzs)[source]
Parameters:
  • allres – an iterable of result iterators
  • num_rlzs – the total number of realizations
Returns:

the total number of events

class openquake.calculators.event_based_risk.EventBasedRiskCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Event based PSHA calculator generating the event loss table and fixed ratios loss curves.

agg(acc, result)[source]

Aggregate losses and store them in the datastore.

Parameters:
  • acc – accumulator dictionary
  • result – dictionary coming from event_based_risk
build_agg_curve_and_stats(builder)[source]

Build a single loss curve per realization. It is NOT obtained by aggregating the loss curves; instead, it is obtained without generating the loss curves, directly from the aggregate losses.

build_agg_curve_stats(builder, agg_curve, loss_curve_dt)[source]

Build and save agg_curve-stats in the HDF5 file.

Parameters:
compute_store_stats(rlzs, builder)[source]

Compute and store the statistical outputs.

Parameters:rlzs – list of realizations

core_task(riskinput, riskmodel, rlzs_assoc, assetcol, monitor)
Parameters:
Returns:

a dictionary of numpy arrays of shape (L, R)

execute()[source]

Run the event_based_risk calculator and aggregate the results

is_stochastic = True
post_execute(result)[source]

Save the event loss table in the datastore.

Parameters:result – the dictionary returned by the .execute method
pre_calculator = 'event_based'
pre_execute()[source]

Read the precomputed ruptures (or compute them on the fly)

openquake.calculators.event_based_risk.build_agg_curve(lr_data, insured_losses, ses_ratio, curve_resolution, L, monitor)[source]

Build the aggregate loss curve in parallel for each loss type and realization pair.

Parameters:
  • lr_data – a list of triples (l, r, data) where l is the loss type index, r is the realization index and data is an array of kind (rupture_id, loss) or (rupture_id, loss, loss_ins)
  • insured_losses (bool) – job.ini configuration parameter
  • ses_ratio – a ratio obtained from ses_per_logic_tree_path
  • curve_resolution – the number of discretization steps for the loss curve
  • L – the number of loss types
  • monitor – a Monitor instance
Returns:

a dictionary (r, l, i) -> (losses, poes, avg)

openquake.calculators.event_based_risk.build_el_dtypes(insured_losses)[source]
Parameters:insured_losses (bool) – job.ini configuration parameter
Returns:ela_dt and elt_dt i.e. the data types for event loss assets and event loss table respectively
openquake.calculators.event_based_risk.event_based_risk(riskinput, riskmodel, rlzs_assoc, assetcol, monitor)[source]
Parameters:
Returns:

a dictionary of numpy arrays of shape (L, R)

openquake.calculators.event_based_risk.losses_by_taxonomy(riskinput, riskmodel, rlzs_assoc, assetcol, monitor)[source]
Parameters:
Returns:

a numpy array of shape (T, L, R)

openquake.calculators.event_based_risk.square(L, R, factory)[source]
Parameters:
  • L – the number of loss types
  • R – the number of realizations
  • factory – thunk used to initialize the elements
Returns:

a numpy matrix of shape (L, R)
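square can be reimplemented directly from its documented contract:

```python
import numpy

# An (L, R) object matrix whose cells are initialized by calling the
# factory thunk once per cell.
def square(L, R, factory):
    mat = numpy.empty((L, R), dtype=object)
    for lti in range(L):
        for r in range(R):
            mat[lti, r] = factory()
    return mat

acc = square(2, 3, list)   # each cell gets its own fresh list
acc[0, 0].append('loss')
print(acc[0, 0], acc[1, 2])  # -> ['loss'] []
```

Calling the thunk once per cell guarantees that every element is a distinct mutable object, which a fill with a single shared instance would not.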

openquake.calculators.risk module

openquake.calculators.scenario module

class openquake.calculators.scenario.ScenarioCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Scenario hazard calculator

execute()[source]

Compute the GMFs and return a dictionary rlzi -> array gmv_dt

init()[source]
is_stochastic = True
post_execute(gmfa_by_rlzi)[source]
Parameters:gmfa – a dictionary rlzi -> gmfa
pre_execute()[source]

Read the site collection and initialize GmfComputer and seeds

openquake.calculators.scenario_damage module

class openquake.calculators.scenario_damage.ScenarioDamageCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Scenario damage calculator

core_task(riskinput, riskmodel, rlzs_assoc, monitor)

Core function for a damage computation.

Parameters:
Returns:

a dictionary {‘d_asset’: [(l, r, a, mean-stddev), ...], ‘d_taxonomy’: damage array of shape (T, L, R, E, D), ‘c_asset’: [(l, r, a, mean-stddev), ...], ‘c_taxonomy’: damage array of shape (T, L, R, E)}

d_asset and d_taxonomy are related to the damage distributions, whereas c_asset and c_taxonomy are the consequence distributions. If there is no consequence model, c_asset is an empty list and c_taxonomy is a zero-valued array.

is_stochastic = True
post_execute(result)[source]

Compute stats for the aggregated distributions and save the results on the datastore.

pre_calculator = 'scenario'
pre_execute()[source]
openquake.calculators.scenario_damage.dist_by_asset(data, multi_stat_dt)[source]
Parameters:
  • data – array of shape (N, L, R, 2, ...)
  • multi_stat_dt – numpy dtype for statistical outputs
Returns:

array of shape (N, R) with records of type multi_stat_dt

openquake.calculators.scenario_damage.dist_by_taxon(data, multi_stat_dt)[source]
Parameters:
  • data – array of shape (T, L, R, ...)
  • multi_stat_dt – numpy dtype for statistical outputs
Returns:

array of shape (T, R) with records of type multi_stat_dt

openquake.calculators.scenario_damage.dist_total(data, multi_stat_dt)[source]
Parameters:
  • data – array of shape (T, L, R, ...)
  • multi_stat_dt – numpy dtype for statistical outputs
Returns:

array of shape (R,) with records of type multi_stat_dt

openquake.calculators.scenario_damage.scenario_damage(riskinput, riskmodel, rlzs_assoc, monitor)[source]

Core function for a damage computation.

Parameters:
Returns:

a dictionary {‘d_asset’: [(l, r, a, mean-stddev), ...], ‘d_taxonomy’: damage array of shape (T, L, R, E, D), ‘c_asset’: [(l, r, a, mean-stddev), ...], ‘c_taxonomy’: damage array of shape (T, L, R, E)}

d_asset and d_taxonomy are related to the damage distributions, whereas c_asset and c_taxonomy are the consequence distributions. If there is no consequence model, c_asset is an empty list and c_taxonomy is a zero-valued array.

openquake.calculators.scenario_risk module

class openquake.calculators.scenario_risk.ScenarioRiskCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Run a scenario risk calculation

core_task(riskinput, riskmodel, rlzs_assoc, monitor)

Core function for a scenario computation.

Parameters:
Returns:

a dictionary { ‘agg’: array of shape (E, L, R, 2), ‘avg’: list of tuples (lt_idx, rlz_idx, asset_idx, statistics) } where E is the number of simulated events, L the number of loss types, R the number of realizations and statistics is an array of shape (n, R, 4), with n the number of assets in the current riskinput object

is_stochastic = True
post_execute(result)[source]

Compute stats for the aggregated distributions and save the results on the datastore.

pre_calculator = 'scenario'
pre_execute()[source]

Compute the GMFs, build the epsilons, the riskinputs, and a dictionary with the unit of measure, used in the export phase.

openquake.calculators.scenario_risk.scenario_risk(riskinput, riskmodel, rlzs_assoc, monitor)[source]

Core function for a scenario computation.

Parameters:
Returns:

a dictionary { ‘agg’: array of shape (E, L, R, 2), ‘avg’: list of tuples (lt_idx, rlz_idx, asset_idx, statistics) } where E is the number of simulated events, L the number of loss types, R the number of realizations and statistics is an array of shape (n, R, 4), with n the number of assets in the current riskinput object
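As a rough illustration, the ‘agg’ array described above can be reduced to per-realization statistics with plain numpy. The shapes follow the docstring; treating the final axis of size 2 as (losses, insured losses) is an assumption here, and the reduction is a generic sketch, not the engine's own post-processing:

```python
import numpy

E, L, R = 100, 2, 3  # events, loss types, realizations
rng = numpy.random.default_rng(42)
agg = rng.random((E, L, R, 2))  # assumed last axis: (losses, insured losses)

# Mean and standard deviation of the aggregate losses across events,
# one value per loss type, realization and loss component
mean = agg.mean(axis=0)            # shape (L, R, 2)
stddev = agg.std(axis=0, ddof=1)   # shape (L, R, 2)
```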

openquake.calculators.ucerf_event_based module

class openquake.calculators.ucerf_event_based.ImperfectPlanarSurface(mesh_spacing, strike, dip, top_left, top_right, bottom_right, bottom_left)[source]

Bases: openquake.hazardlib.geo.surface.planar.PlanarSurface

The planar surface class sets a narrow tolerance for how far the rectangular plane may be distorted in Cartesian space. Ruptures with aspect ratios << 1.0, and with a dip of less than 90 degrees, cannot be generated in a manner that is consistent with the definitions, and thus cannot be instantiated. This subclass modifies the original planar surface class so that the tolerance checks are overridden. We find that distance errors with respect to a simple fault surface with a mesh spacing of 0.001 km are only on the order of < 0.15% for Rrup (< 2% for Rjb, < 3.0E-5% for Rx).

IMPERFECT_RECTANGLE_TOLERANCE = inf
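
The override above is just a class attribute set to infinity, which effectively disables the planarity check inherited from the base class. The pattern, sketched with a hypothetical base class rather than the real hazardlib PlanarSurface:

```python
import math

class PlanarSurfaceLike:
    # Maximum allowed distortion of the rectangle, in km (hypothetical value)
    IMPERFECT_RECTANGLE_TOLERANCE = 0.002

    def check_corners(self, distortion):
        """Raise if the corner points deviate from a plane by more
        than the class-level tolerance."""
        if distortion > self.IMPERFECT_RECTANGLE_TOLERANCE:
            raise ValueError('corner points do not lie on the same plane')

class ImperfectPlanarSurfaceLike(PlanarSurfaceLike):
    # Disable the tolerance check entirely, as the UCERF subclass does
    IMPERFECT_RECTANGLE_TOLERANCE = math.inf

ImperfectPlanarSurfaceLike().check_corners(1e9)  # never raises
```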
class openquake.calculators.ucerf_event_based.UCERFEventBasedCalculator(oqparam, monitor=<Monitor dummy>, calc_id=None)[source]

Bases: openquake.calculators.event_based.EventBasedCalculator

Event based PSHA calculator generating the ruptures only

execute()[source]

Run the ucerf calculation

is_stochastic = True
pre_execute()[source]

Parse the logic tree and the source model input

class openquake.calculators.ucerf_event_based.UCERFSESControl(source_file, id, investigation_time, min_mag, npd=<openquake.hazardlib.pmf.PMF object at 0x2b0a1406ad90>, hdd=<openquake.hazardlib.pmf.PMF object at 0x2b0a1406a9d0>, aspect=1.5, upper_seismogenic_depth=0.0, lower_seismogenic_depth=15.0, msr=<WC1994>, mesh_spacing=1.0, trt='Active Shallow Crust', integration_distance=1000)[source]

Bases: object

Parameters:
  • source_file – Path to an existing HDF5 file containing the UCERF model
  • id (str) – Valid branch of UCERF
  • investigation_time (float) – Investigation time of event set (years)
  • min_mag (float) – Minimum magnitude for consideration of background sources
  • npd – Nodal plane distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • hdd – Hypocentral depth distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • aspect (float) – Aspect ratio
  • upper_seismogenic_depth (float) – Upper seismogenic depth (km)
  • lower_seismogenic_depth (float) – Lower seismogenic depth (km)
  • msr – Magnitude scaling relation
  • mesh_spacing (float) – Spacing (km) of fault mesh
  • trt (str) – Tectonic region type
  • integration_distance (float) – Maximum distance from rupture to site for consideration
static build_idx_set(branch_code)[source]

Builds a dictionary of indices based on the branch code

Parameters:branch_code (str) – Code for the branch
generate_event_set(branch_id, sites=None, integration_distance=1000.0)[source]

Generates the event set corresponding to a particular branch

get_background_sids(branch_key, sites, integration_distance=1000.0)[source]

Applies the filtering of the background sites as a pre-processing step; this is done here rather than in the sampling of the ruptures themselves

get_min_max_mag()[source]
update_seed(seed)[source]

Updates the random seed associated with the source

class openquake.calculators.ucerf_event_based.UCERFSourceConverter(investigation_time, rupture_mesh_spacing, complex_fault_mesh_spacing=None, width_of_mfd_bin=1.0, area_source_discretization=None)[source]

Bases: openquake.commonlib.sourceconverter.SourceConverter

Adjustment of the UCERF Source Converter to return the source information as an instance of the UCERF SES Control object

convert_UCERFSource(node)[source]

Converts the UCERF source node into an SES Control object

openquake.calculators.ucerf_event_based.compute_ruptures_gmfs_curves(source_models, sitecol, rlzs_assoc, monitor)[source]

Returns the ruptures as a TRT set.

Parameters:
  • source_models – A list of UCERF source models, one per branch
  • sitecol – Site collection openquake.hazardlib.site.SiteCollection
  • rlzs_assoc – Instance of openquake.commonlib.source.RlzsAssoc
  • monitor – Instance of openquake.baselib.performance.Monitor
Returns:

Dictionary of rupture instances associated to a TRT ID

openquake.calculators.ucerf_event_based.generate_background_ruptures(tom, locations, occurrence, mag, npd, hdd, upper_seismogenic_depth, lower_seismogenic_depth, msr=<WC1994>, aspect=1.5, trt='Active Shallow Crust')[source]
Parameters:
  • tom – Temporal occurrence model as instance of :class: openquake.hazardlib.tom.TOM
  • locations (numpy.ndarray) – Array of locations [Longitude, Latitude] of the point sources
  • occurrence (numpy.ndarray) – Annual rates of occurrence
  • mag (float) – Magnitude
  • npd – Nodal plane distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • hdd – Hypocentral depth distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • upper_seismogenic_depth (float) – Upper seismogenic depth (km)
  • lower_seismogenic_depth (float) – Lower seismogenic depth (km)
  • msr – Magnitude scaling relation
  • aspect (float) – Aspect ratio
  • trt (str) – Tectonic region type
Returns:

List of ruptures

openquake.calculators.ucerf_event_based.get_rupture_dimensions(mag, nodal_plane, msr, rupture_aspect_ratio, upper_seismogenic_depth, lower_seismogenic_depth)[source]

Calculate and return the rupture length and width for given magnitude mag and nodal plane.

Parameters:nodal_plane – Instance of openquake.hazardlib.geo.nodalplane.NodalPlane.
Returns:Tuple of two items: rupture length and width, in km.

The rupture area is calculated using the get_median_area() method of the source’s magnitude-scaling relationship; in any case, the product of the returned dimensions equals that value. The area is then decomposed into length and width according to the source’s rupture aspect ratio.

If the calculated rupture width, inclined at the nodal plane’s dip angle, would not fit between the upper and lower seismogenic depths, the rupture width is shrunk to the maximum possible and the rupture length is extended to preserve the same area.
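
The decomposition described above can be sketched as follows. The function is a self-contained illustration working directly from an area in km^2, skipping the magnitude-scaling step:

```python
import math

def rupture_dimensions(area, aspect, dip, usd, lsd):
    """Decompose a rupture area (km^2) into (length, width) honouring the
    aspect ratio, shrinking the width if the inclined rupture would not
    fit between the upper (usd) and lower (lsd) seismogenic depths, in km."""
    length = math.sqrt(area * aspect)
    width = length / aspect
    # maximum width that fits in the seismogenic layer at this dip angle
    max_width = (lsd - usd) / math.sin(math.radians(dip))
    if width > max_width:
        width = max_width
        length = area / width  # extend the length to preserve the area
    return length, width

# a 300 km^2 rupture dipping 45 degrees in a 15 km thick seismogenic layer
length, width = rupture_dimensions(300.0, 1.5, 45.0, 0.0, 15.0)
```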

openquake.calculators.ucerf_event_based.get_rupture_surface(mag, nodal_plane, hypocenter, msr, rupture_aspect_ratio, upper_seismogenic_depth, lower_seismogenic_depth, mesh_spacing=1.0)[source]

Create and return rupture surface object with given properties.

Parameters:
  • mag – Magnitude value, used to calculate rupture dimensions, see _get_rupture_dimensions().
  • nodal_plane – Instance of openquake.hazardlib.geo.nodalplane.NodalPlane describing the rupture orientation.
  • hypocenter – Point representing rupture’s hypocenter.
Returns:

Instance of PlanarSurface.

openquake.calculators.ucerf_event_based.get_ucerf_rupture(hdf5, iloc, idx_set, tom, sites, integration_distance, mesh_spacing=1.0, trt='Active Shallow Crust')[source]
Parameters:
  • hdf5 – Source Model hdf5 object as instance of :class: h5py.File
  • iloc (int) – Location of the rupture plane in the hdf5 file
  • idx_set (dict) – Set of indices for the branch
  • tom – Temporal occurrence model as instance of :class: openquake.hazardlib.tom.TOM
  • sites – Sites for consideration (can be None!)
  • integration_distance (float) – Maximum distance from rupture to site for consideration
openquake.calculators.ucerf_event_based.prefilter_background_model(hdf5, branch_key, sites, integration_distance, msr=<WC1994>, aspect=1.5)[source]

Identify those points within the integration distance.

Parameters:
  • sites – Sites under consideration
  • integration_distance (float) – Maximum distance from rupture to site for consideration
  • msr – Magnitude scaling relation
  • aspect (float) – Aspect ratio
Returns:

List of site IDs within the integration distance
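
The idea behind the prefiltering is a plain distance cut. A self-contained sketch using the haversine great-circle distance (the helper name and the brute-force loop are illustrative only; the engine works on its own site collection objects):

```python
import math

def within_distance(site_lons, site_lats, lon, lat, max_dist_km):
    """Return the indices of the sites within max_dist_km of the point
    (lon, lat), using the haversine distance with Earth radius 6371 km."""
    ids = []
    for i, (slon, slat) in enumerate(zip(site_lons, site_lats)):
        dlon = math.radians(slon - lon)
        dlat = math.radians(slat - lat)
        a = (math.sin(dlat / 2) ** 2 +
             math.cos(math.radians(lat)) * math.cos(math.radians(slat)) *
             math.sin(dlon / 2) ** 2)
        dist = 2 * 6371.0 * math.asin(math.sqrt(a))
        if dist <= max_dist_km:
            ids.append(i)
    return ids

close = within_distance([0.0, 0.5, 10.0], [0.0, 0.5, 10.0], 0.0, 0.0, 100.0)
```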

openquake.calculators.ucerf_event_based.prefilter_ruptures(hdf5, ridx, idx_set, sites, integration_distance)[source]

Determines if a rupture is likely to be inside the integration distance by considering the set of fault plane centroids.

Parameters:
  • hdf5 – Source of UCERF file as h5py.File object
  • ridx (list) – List of indices composing the rupture sections
  • idx_set (dict) – Set of indices for the branch
  • sites – Sites for consideration (can be None!)
  • integration_distance (float) – Maximum distance from rupture to site for consideration
openquake.calculators.ucerf_event_based.sample_background_model(hdf5, branch_key, tom, filter_idx, min_mag, npd, hdd, upper_seismogenic_depth, lower_seismogenic_depth, msr=<WC1994>, aspect=1.5, trt='Active Shallow Crust')[source]

Generates a rupture set from a sample of the background model.

Parameters:
  • branch_key – Key to indicate the branch for selecting the background model
  • tom – Temporal occurrence model as instance of :class: openquake.hazardlib.tom.TOM
  • filter_idx – Sites for consideration (can be None!)
  • min_mag (float) – Minimum magnitude for consideration of background sources
  • npd – Nodal plane distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • hdd – Hypocentral depth distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • aspect (float) – Aspect ratio
  • upper_seismogenic_depth (float) – Upper seismogenic depth (km)
  • lower_seismogenic_depth (float) – Lower seismogenic depth (km)
  • msr – Magnitude scaling relation
  • integration_distance (float) – Maximum distance from rupture to site for consideration

openquake.calculators.views module

openquake.calculators.views.avglosses_data_transfer(token, dstore)[source]

Determine the amount of average losses transferred from the workers to the controller node in a risk calculation.

openquake.calculators.views.classify_gsim_lt(gsim_lt)[source]
Returns:“trivial”, “simple” or “complex”
openquake.calculators.views.ebr_data_transfer(token, dstore)[source]

Display the data transferred in an event based risk calculation

openquake.calculators.views.form(value)[source]

Format numbers in a nice way.

>>> form(0)
'0'
>>> form(0.0)
'0.0'
>>> form(0.0001)
'1.000E-04'
>>> form(1003.4)
'1,003'
>>> form(103.4)
'103'
>>> form(9.3)
'9.300'
>>> form(-1.2)
'-1.2'
openquake.calculators.views.get_max_gmf_size(dstore)[source]

Upper limit for the size of the GMFs

openquake.calculators.views.performance_view(dstore)[source]

Returns the performance view as a numpy array.

openquake.calculators.views.portfolio_loss_from_agg_loss_table(agg_loss_table, loss_dt)[source]
openquake.calculators.views.portfolio_loss_from_losses_by_taxon(losses_by_taxon, loss_dt)[source]
openquake.calculators.views.rst_table(data, header=None, fmt=None)[source]

Build a .rst table from a matrix.

>>> tbl = [['a', 1], ['b', 2]]
>>> print(rst_table(tbl, header=['Name', 'Value']))
==== =====
Name Value
==== =====
a    1    
b    2    
==== =====
openquake.calculators.views.stats(name, array, *extras)[source]

Returns statistics from an array of numbers.

Parameters:name – a descriptive string
Returns:(name, mean, std, min, max, len)
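A sketch of what the returned tuple contains (the helper name is hypothetical, and whether the engine uses the sample or population standard deviation is not specified here):

```python
import numpy

def stats_sketch(name, array):
    """Return (name, mean, std, min, max, len) for a numeric array."""
    a = numpy.asarray(array, float)
    return (name, a.mean(), a.std(ddof=1), a.min(), a.max(), len(a))

row = stats_sketch('losses', [1.0, 2.0, 3.0, 4.0])
```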
openquake.calculators.views.sum_table(records)[source]

Used to compute summaries. The records are assumed to have numeric fields, except the first field which is ignored, since it typically contains a label. Here is an example:

>>> sum_table([('a', 1), ('b', 2)])
['total', 3]
openquake.calculators.views.sum_tbl(tbl, kfield, vfields)[source]

Aggregate a composite array and compute the totals on a given key.

>>> dt = numpy.dtype([('name', (bytes, 10)), ('value', int)])
>>> tbl = numpy.array([('a', 1), ('a', 2), ('b', 3)], dt)
>>> sum_tbl(tbl, 'name', ['value'])['value']
array([3, 3])
openquake.calculators.views.view_assetcol(token, dstore)[source]

Display the exposure in CSV format

openquake.calculators.views.view_assets_by_site(token, dstore)[source]

Display statistical information about the distribution of the assets

openquake.calculators.views.view_biggest_ebr_gmf(token, dstore)[source]

Returns the size of the biggest GMF in an event based risk calculation

openquake.calculators.views.view_contents(token, dstore)[source]

Returns the size of the contents of the datastore and its total size

openquake.calculators.views.view_csm_info(token, dstore)[source]
openquake.calculators.views.view_exposure_info(token, dstore)[source]

Display info about the exposure model

openquake.calculators.views.view_fullreport(token, dstore)[source]

Display an .rst report about the computation

openquake.calculators.views.view_inputs(token, dstore)[source]
openquake.calculators.views.view_job_info(token, dstore)[source]

Determine the amount of data transferred from the controller node to the workers and back in a classical calculation.

openquake.calculators.views.view_loss_curves_avg(token, dstore)[source]

Returns the average losses computed from the loss curves; for each asset shows all realizations.

openquake.calculators.views.view_mean_avg_losses(token, dstore)[source]
openquake.calculators.views.view_params(token, dstore)[source]
openquake.calculators.views.view_performance(token, dstore)[source]

Display performance information

openquake.calculators.views.view_portfolio_loss(token, dstore)[source]

The loss for the full portfolio, for each realization and loss type, extracted from the event loss table.

openquake.calculators.views.view_required_params_per_trt(token, dstore)[source]

Display the parameters needed by each tectonic region type

openquake.calculators.views.view_ruptures_events(token, dstore)[source]
openquake.calculators.views.view_ruptures_per_trt(token, dstore)[source]
openquake.calculators.views.view_short_source_info(token, dstore, maxrows=20)[source]
openquake.calculators.views.view_slow_sources(token, dstore, maxrows=20)[source]

Returns the slowest sources

openquake.calculators.views.view_task_durations(token, dstore)[source]

Display the raw task durations. Here is an example of usage:

$ oq show task_durations:classical
openquake.calculators.views.view_task_info(token, dstore)[source]

Display statistical information about the tasks performance. It is possible to get full information about a specific task with a command like this one, for a classical calculation:

$ oq show task_info:classical
openquake.calculators.views.view_task_slowest(token, dstore)[source]

Display info about the slowest classical task.

openquake.calculators.views.view_times_by_source_class(token, dstore)[source]

Returns the calculation times depending on the source typology

openquake.calculators.views.view_totlosses(token, dstore)[source]

This is a debugging view. You can use it to check that the total losses, i.e. the losses obtained by summing the average losses over all assets, are indeed equal to the aggregate losses. This is a sanity check for the correctness of the implementation.

Module contents