openquake.calculators package

base module

class openquake.calculators.base.BaseCalculator(oqparam, calc_id)[source]

Bases: object

Abstract base class for all calculators.

  • oqparam – OqParam object
  • monitor – monitor object
  • calc_id – numeric calculation ID
accept_precalc = []

Defensive programming against users providing an incorrect pre-calculation ID (with --hazard-calculation-id).

Parameters:precalc_mode – calculation_mode of the previous calculation

Core routine running on the workers.


Execution phase. Usually runs the core function in parallel and returns a dictionary with the results.


Export all the outputs in the datastore in the given export formats. Individual outputs are not exported if there are multiple realizations.

from_engine = False

Gzip the inputs and save them in the datastore

is_stochastic = False
monitor(operation='', **kw)[source]
Returns:a new Monitor instance

Post-processing phase of the aggregated output. It must be overridden with the export code. It will return a dictionary of output files.


Checks to run after the pre_execute but before the execute


Initialization phase.

precalc = None
run(pre_execute=True, concurrent_tasks=None, remove=True, shutdown=False, **kw)[source]

Run the calculation and return the exported outputs.

  • pre_execute – set it to False to avoid running pre_execute
  • concurrent_tasks – set it to 0 to disable parallelization
  • remove – set it to False to keep the hdf5cache file (if any)
  • shutdown – set it to True to shutdown the ProcessPool

Update the current calculation parameters and save engine_version

class openquake.calculators.base.HazardCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.BaseCalculator

Base class for hazard calculators based on source models

Returns:the number of stored events
Returns:the total number of sites
Returns:the number of realizations

Overridden in event based

Returns:True if there are fewer than max_sites_disagg sites

Defined in MultiRiskCalculator


To be overridden to initialize the datasets needed by the calculation


Read the risk models and set the attribute .crmodel. The crmodel can be empty for hazard calculations. Save the loss ratios (if any) in the datastore.

load_insurance_data(ins_types, ins_files)[source]

Read the insurance files and populate the policy_dict


For compatibility with the engine


Check if there is a previous calculation ID. If yes, read the inputs by retrieving the previous calculation; if not, read the inputs directly.


Read the exposure, the risk models and update the attributes .sitecol, .assetcol


Read risk data and sources if any


Save the risk models in the datastore

Returns:a SourceFilter

Save info about the composite source model inside the full_lt dataset

Parameters:rel_ruptures – dictionary TRT -> number of relevant ruptures
store_source_info(calc_times, nsites=False)[source]

Save (eff_ruptures, num_sites, calc_time) inside the source_info

exception openquake.calculators.base.InvalidCalculationID[source]

Bases: Exception

Raised when running a post-calculation on top of an incompatible pre-calculation

class openquake.calculators.base.RiskCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.HazardCalculator

Base class for all risk calculators. A risk calculator must set the attributes .crmodel, .sitecol, .assetcol, .riskinputs in the pre_execute phase.

Parameters:kind – kind of hazard getter, can be ‘poe’ or ‘gmf’
Returns:a list of RiskInputs objects, sorted by IMT.
combine(acc, res)[source]

Combine the outputs assuming acc and res are dictionaries


Parallelize on the riskinputs and return a dictionary of results. Requires a .core_task to be defined with signature (riskinputs, crmodel, param, monitor).

read_shakemap(haz_sitecol, assetcol)[source]

Enabled only if there is a shakemap_id parameter in the job.ini. Download, unzip, parse USGS shakemap files and build a corresponding set of GMFs which are then filtered with the hazard site collection and stored in the datastore.

Returns:an array with the realization weights of shape R
openquake.calculators.base.check_amplification(ampl_df, sitecol)[source]

Make sure the amplification codes in the site collection match the ones in the amplification table.

  • ampl_df – the amplification table as a pandas DataFrame
  • sitecol – the site collection
openquake.calculators.base.check_time_event(oqparam, occupancy_periods)[source]

Check the time_event parameter in the datastore, by comparing with the periods found in the exposure.

openquake.calculators.base.consistent(dic1, dic2)[source]

Check that two dictionaries with a ‘default’ key are consistent:

>>> consistent({'PGA': 0.05, 'SA(0.3)': 0.05}, {'default': 0.05})
>>> consistent({'SA(0.3)': 0.1, 'SA(0.6)': 0.05},
... {'default': 0.1, 'SA(0.3)': 0.1, 'SA(0.6)': 0.05})
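
The check can be sketched in pure Python (a hypothetical reimplementation, not the engine's code): each value in dic1 must match the explicit value in dic2 or, failing that, dic2['default'].

```python
def consistent(dic1, dic2):
    # sketch: every key in dic1 must map to the value given explicitly in
    # dic2, falling back on dic2['default']; raise ValueError otherwise
    for key, value in dic1.items():
        expected = dic2.get(key, dic2.get('default'))
        if expected != value:
            raise ValueError('%s: %s != %s' % (key, value, expected))
```

If the dictionaries are consistent the function returns None, mirroring the silent doctests above.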
openquake.calculators.base.create_gmf_data(dstore, M, sec_imts=(), data=None)[source]

Create and possibly populate the datasets in the gmf_data group


Physically, an extremely small intensity measure level can have an extremely large probability of exceedance; however, that probability cannot be exactly 1 unless the level is exactly 0. Numerically, the PoE can be 1 and this gives issues when calculating the damage (there is a log(0) in openquake.risklib.scientific.annual_frequency_of_exceedence). Here we solve the issue by replacing the unphysical probability 1 with .9999999999999999 (the float64 closest to 1).
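
As a minimal illustration of the problem and the fix (a pure-Python sketch with assumed names, not the engine's implementation):

```python
import math

MAX_POE = .9999999999999999  # the float64 closest to 1

def annual_frequency_of_exceedance(poe, time_span):
    # afe = -ln(1 - poe) / t; a PoE of exactly 1 would give log(0)
    return -math.log(1 - poe) / time_span

def clamp_poe(poe):
    # replace the unphysical probability 1 with the float64 closest to 1
    return min(poe, MAX_POE)

# clamping keeps the frequency finite (though very large) at poe == 1
afe = annual_frequency_of_exceedance(clamp_poe(1.0), 50)
```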

openquake.calculators.base.get_calc(job_ini, calc_id)[source]

Factory function returning a Calculator instance

  • job_ini – path to job.ini file
  • calc_id – calculation ID
openquake.calculators.base.import_gmfs(dstore, oqparam, sids)[source]

Import in the datastore a ground motion field CSV file.

  • dstore – the datastore
  • oqparam – an OqParam instance
  • sids – the complete site IDs


openquake.calculators.base.save_agg_values(dstore, assetcol, lossnames, tagnames)[source]

Store agg_keys, agg_values. :returns: the aggkey dictionary key -> tags

openquake.calculators.base.set_array(longarray, shortarray)[source]
  • longarray – a numpy array of floats of length L >= l
  • shortarray – a numpy array of floats of length l

Fill longarray with the values of shortarray, starting from the left. If shortarray is shorter than longarray, the remaining elements on the right are filled with numpy.nan values.
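
The padding behaviour can be sketched with plain lists (a hypothetical version; the real function works in place on numpy arrays):

```python
def set_array(longarray, shortarray):
    # fill longarray from the left with shortarray; pad the tail with NaN
    for i in range(len(longarray)):
        longarray[i] = shortarray[i] if i < len(shortarray) else float('nan')

arr = [0.0] * 5
set_array(arr, [1.0, 2.0])  # arr becomes [1.0, 2.0, nan, nan, nan]
```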

getters module

class openquake.calculators.getters.GmfDataGetter(sid, df, num_events, num_rlzs)[source]

Bases: object

An object with an .init() and .get_hazard() method

Parameters:gsim – ignored
Returns:the underlying DataFrame
class openquake.calculators.getters.GmfGetter(rupgetter, srcfilter, oqparam, amplifier=None, sec_perils=())[source]

Bases: object

A hazard getter with methods .get_gmfdata and .get_hazard returning ground motion values.

Returns:a dict with keys gmfdata, hcurves

Yield a GmfComputer instance for each non-discarded rupture

get_gmfdata(mon=<Monitor [jenkins]>)[source]
Returns:a DataFrame with fields eid, sid, gmv_
Parameters:gsim – ignored
class openquake.calculators.getters.PmapGetter(dstore, weights, sids, imtls=(), poes=())[source]

Bases: object

Read hazard curves from the datastore for all realizations or for a specific realization.

  • dstore – a DataStore instance or file system path to it
  • sids – the subset of sites to consider (if None, all sites)
Parameters:gsim – ignored
Returns:R probability curves for the given site
get_hcurves(pmap, rlzs_by_gsim)[source]
Parameters:pmap_by_et_id – a dictionary of ProbabilityMaps by group ID
Returns:an array of PoEs of shape (N, R, M, L)

Compute the mean curve as a ProbabilityMap

Parameters:grp – if not None must be a string of the form “grp-XX”; in that case returns the mean considering only the contribution for group XX
Returns:a list of R probability curves with shape L

Read the poes and set the .data attribute with the hazard curves

class openquake.calculators.getters.RuptureGetter(proxies, filename, et_id, trt, rlzs_by_gsim)[source]

Bases: object

  • proxies – a list of RuptureProxies
  • filename – path to the HDF5 file containing a ‘rupgeoms’ dataset
  • et_id – source group index
  • trt – tectonic region type string
  • rlzs_by_gsim – dictionary gsim -> rlzs for the group
Returns:a list of RuptureProxies
Returns:a dictionary with the parameters of the rupture
split(srcfilter, maxw)[source]
Yields:RuptureProxies with weight < maxw
class openquake.calculators.getters.ZeroGetter(sid, rlzs, R)[source]

Bases: openquake.calculators.getters.GmfDataGetter

An object with an .init() and .get_hazard() method

openquake.calculators.getters.build_stat_curve(poes, imtls, stat, weights)[source]

Build statistics by taking into account IMT-dependent weights
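
Ignoring the IMT dependence, the weighted-mean part of the statistic can be sketched in pure Python (hypothetical names; the real function also supports quantiles and IMT-dependent weights):

```python
def weighted_mean_curve(poes_by_rlz, weights):
    # poes_by_rlz: R curves of L PoEs each; weights: R floats summing to 1.
    # The mean curve is the per-level weighted average across realizations.
    num_levels = len(poes_by_rlz[0])
    return [sum(w * poes[lvl] for w, poes in zip(weights, poes_by_rlz))
            for lvl in range(num_levels)]

mean = weighted_mean_curve([[.9, .5], [.7, .3]], [.25, .75])  # close to [0.75, 0.35]
```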

openquake.calculators.getters.gen_rupture_getters(dstore, ct=0, slc=slice(None, None, None))[source]



Extract EBRuptures from the datastore

openquake.calculators.getters.get_gmfgetter(dstore, rup_id)[source]
Returns:GmfGetter associated to the given rupture
Returns:a list of slices
Returns:a composite data type for the sig_eps output

classical module

class openquake.calculators.classical.ClassicalCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.HazardCalculator

Classical PSHA calculator

accept_precalc = ['classical']
agg_dicts(acc, dic)[source]

Aggregate dictionaries of hazard curves by updating the accumulator.

  • acc – accumulator dictionary
  • dic – dict with keys pmap, calc_times, rup_data
core_task(srcs, rlzs_by_gsim, params, monitor)

Read the SourceFilter and call the classical calculator in hazardlib


Store some empty datasets in the datastore


Run core_task(sources, sitecol, monitor) in parallel, distributing the sources according to their weight and tectonic region type.

get_args(grp_ids, hazard)[source]
Returns:a list of Starmap arguments
Returns:the unique source IDs contained in the composite model

Compute the statistical hazard curves

save_hazard(acc, pmap_by_kind)[source]

Works by side effect by saving hcurves and hmaps on the datastore

  • acc – ignored
  • pmap_by_kind – a dictionary of ProbabilityMaps

Set the pointsource_distance

class openquake.calculators.classical.Hazard(dstore, full_lt, pgetter, srcidx)[source]

Bases: object

Helper class for storing the PoEs

init(pmaps, grp_id)[source]

Initialize the pmaps dictionary with zeros, if needed


Store data inside disagg_by_src/disagg_by_grp

store_poes(grp_id, pmap)[source]

Store the pmap of the given group inside the _poes dataset

openquake.calculators.classical.build_hazard(pgetter, N, hstats, individual_curves, max_sites_disagg, amplifier, monitor)[source]
  • pgetter – an openquake.commonlib.getters.PmapGetter
  • N – the total number of sites
  • hstats – a list of pairs (statname, statfunc)
  • individual_curves – if True, also build the individual curves
  • max_sites_disagg – if there are fewer sites than this, store rup info
  • amplifier – instance of Amplifier or None
  • monitor – instance of Monitor

a dictionary kind -> ProbabilityMap

The “kind” is a string of the form ‘rlz-XXX’, ‘mean’ or ‘quantile-XXX’ used to specify the kind of output.

openquake.calculators.classical.classical(srcs, rlzs_by_gsim, params, monitor)[source]

Read the SourceFilter and call the classical calculator in hazardlib

openquake.calculators.classical.get_extreme_poe(array, imtls)[source]
  • array – array of shape (L, G) with L=num_levels, G=num_gsims
  • imtls – DictArray imt -> levels

the maximum PoE corresponding to the maximum level for IMTs and GSIMs
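
A rough sketch of the idea (a hypothetical reimplementation where imt_slices stands in for the DictArray slices):

```python
def extreme_poe(array, imt_slices):
    # array: L x G nested lists of PoEs; imt_slices: imt -> slice into the
    # L levels. Take the PoE row at the highest level of each IMT, then the
    # maximum across IMTs and GSIMs.
    return max(max(array[sl.stop - 1]) for sl in imt_slices.values())

poes = [[.9, .8], [.5, .4], [.7, .6], [.1, .2]]
extreme = extreme_poe(poes, {'PGA': slice(0, 2), 'SA(1.0)': slice(2, 4)})
```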

openquake.calculators.classical.make_hmap_png(hmap, lons, lats)[source]
  • hmap – a dictionary with keys calc_id, m, p, imt, poe, inv_time, array
  • lons – an array of longitudes
  • lats – an array of latitudes

an Image object containing the hazard map

openquake.calculators.classical.preclassical(srcs, srcfilter, params, monitor)[source]

Weight the sources. Also split them if split_sources is true. If ps_grid_spacing is set, grid the point sources before weighting them.

NB: srcfilter can be on a reduced site collection for performance reasons

openquake.calculators.classical.run_preclassical(csm, oqparam, h5)[source]
  • csm – a CompositeSourceModel with attribute .srcfilter
  • oqparam – the parameters in job.ini file
  • h5 – a DataStore instance
openquake.calculators.classical.store_ctxs(dstore, rupdata, grp_id)[source]

Store contexts with the same magnitude in the datastore

classical_bcr module

class openquake.calculators.classical_bcr.ClassicalBCRCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.classical_risk.ClassicalRiskCalculator

Classical BCR Risk calculator

accept_precalc = ['classical']
core_task(riskinputs, param, monitor)

Compute and return the average losses for each asset.

openquake.calculators.classical_bcr.classical_bcr(riskinputs, param, monitor)[source]

Compute and return the average losses for each asset.


classical_damage module

class openquake.calculators.classical_damage.ClassicalDamageCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.classical_risk.ClassicalRiskCalculator

Scenario damage calculator

accept_precalc = ['classical']
core_task(riskinputs, param, monitor)

Core function for a classical damage computation.


dictionaries asset_ordinal -> damage(R, L, D)


Export the result in CSV format.

Parameters:result – a dictionary asset_ordinal -> array(R, L, D)
openquake.calculators.classical_damage.classical_damage(riskinputs, param, monitor)[source]

Core function for a classical damage computation.


dictionaries asset_ordinal -> damage(R, L, D)

classical_risk module

class openquake.calculators.classical_risk.ClassicalRiskCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.RiskCalculator

Classical Risk calculator

accept_precalc = ['classical']
core_task(riskinputs, param, monitor)

Compute and return the average losses for each asset.


Save the loss curves in the datastore.

Parameters:result – aggregated result of the task classical_risk

Associate the assets to the sites and build the riskinputs.

precalc = 'classical'
openquake.calculators.classical_risk.classical_risk(riskinputs, param, monitor)[source]

Compute and return the average losses for each asset.


disaggregation module

Disaggregation calculator core functionality

class openquake.calculators.disaggregation.DisaggregationCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.HazardCalculator

Classical PSHA disaggregation calculator

accept_precalc = ['classical', 'disaggregation']
agg_result(acc, result)[source]

Collect the results coming from compute_disagg into self.results.

  • acc – dictionary sid -> trti, magi -> 6D array
  • result – dictionary with the result coming from a task

Submit disaggregation tasks and return the results


Performs the disaggregation


Run the disaggregation phase.

get_curve(sid, rlzs)[source]

Get the hazard curves for the given site ID and realizations.

  • sid – site ID
  • rlzs – a matrix of indices of shape Z

a list of Z arrays of PoEs


Save all the results of the disaggregation. NB: the number of results to save is #sites * #rlzs * #disagg_poes * #IMTs.

Parameters:results – a dictionary sid, imti, kind -> trti -> disagg matrix

Checks on the number of sites, atomic groups and size of the disaggregation matrix.

precalc = 'classical'

Save disagg-bins


Save the computed PMFs in the datastore

Parameters:results – a dict s, m, k -> 6D-matrix of shape (T, Ma, Lo, La, P, Z) or (T, Ma, D, E, P, Z) depending if k is 0 or k is 1
openquake.calculators.disaggregation.compute_disagg(dstore, slc, cmaker, hmap4, trti, magi, bin_edges, monitor)[source]
  • dstore – a DataStore instance
  • slc – a slice of ruptures
  • cmaker – a openquake.hazardlib.gsim.base.ContextMaker instance
  • hmap4 – an ArrayWrapper of shape (N, M, P, Z)
  • trti – tectonic region type index
  • magi – magnitude bin indices
  • bin_edges – a quartet (dist_edges, lon_edges, lat_edges, eps_edges)
  • monitor – monitor of the currently running job

a dictionary sid, imti -> 6D-array

openquake.calculators.disaggregation.get_outputs_size(shapedic, disagg_outputs)[source]
Returns:the total size of the outputs
Parameters:mat6 – a 6D matrix with axis (D, Lo, La, E, P, Z)
Returns:two matrices of shape (D, E, P, Z) and (Lo, La, P, Z)
openquake.calculators.disaggregation.output_dict(shapedic, disagg_outputs)[source]

event_based module

class openquake.calculators.event_based.EventBasedCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.HazardCalculator

Event based PSHA calculator generating the ground motion fields and the hazard curves from the ruptures, depending on the configuration parameters.


Initial accumulator, a dictionary (et_id, gsim) -> curves

accept_precalc = ['event_based', 'ebrisk', 'event_based_risk']
agg_dicts(acc, result)[source]
  • acc – accumulator dictionary
  • result – an AccumDict with events, ruptures, gmfs and hcurves

Prefilter the composite source model and store the source_info

core_task(rupgetter, param, monitor)

Compute GMFs and optionally hazard curves

is_stochastic = True

Compute and save avg_gmf, unless there are too many GMFs

openquake.calculators.event_based.compute_avg_gmf(gmf_df, weights, min_iml)[source]
  • gmf_df – a DataFrame with columns eid, sid, rlz, gmv…
  • weights – E weights associated to the realizations
  • min_iml – array of M minimum intensities

a dictionary site_id -> array of shape (2, M)

openquake.calculators.event_based.compute_gmfs(rupgetter, param, monitor)[source]

Compute GMFs and optionally hazard curves

openquake.calculators.event_based.get_mean_curves(dstore, imt)[source]

Extract the mean hazard curves from the datastore, as an array of shape (N, L1)

post_risk module

class openquake.calculators.post_risk.PostRiskCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.RiskCalculator

Compute losses and loss curves starting from an event loss table.


Sanity check on tot_losses

openquake.calculators.post_risk.get_loss_builder(dstore, return_periods=None, loss_dt=None)[source]
Parameters:dstore – datastore for an event based risk calculation
Returns:a LossCurvesMapsBuilder instance
openquake.calculators.post_risk.get_src_loss_table(dstore, L)[source]
Returns:(source_ids, array of losses of shape (Ns, L))
openquake.calculators.post_risk.post_risk(builder, kr_losses, monitor)[source]
Returns:dictionary kr -> L loss curves
openquake.calculators.post_risk.reagg_idxs(num_tags, tagnames)[source]
  • num_tags – dictionary tagname -> number of tags with that tagname
  • tagnames – subset of tagnames of interest

T = T1 x … x TN indices with repetitions

Reaggregate indices. Consider for instance a case with 3 tagnames, taxonomy (4 tags), region (3 tags) and country (2 tags):

>>> num_tags = dict(taxonomy=4, region=3, country=2)

There are T = T1 x T2 x T3 = 4 x 3 x 2 = 24 combinations. The function will return 24 reaggregated indices with repetitions depending on the selected subset of tagnames.

For instance reaggregating by taxonomy and region would give:

>>> list(reagg_idxs(num_tags, ['taxonomy', 'region']))  # 4x3
[0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 10, 11, 11]

Reaggregating by taxonomy and country would give:

>>> list(reagg_idxs(num_tags, ['taxonomy', 'country']))  # 4x2
[0, 1, 0, 1, 0, 1, 2, 3, 2, 3, 2, 3, 4, 5, 4, 5, 4, 5, 6, 7, 6, 7, 6, 7]

Reaggregating by region and country would give:

>>> list(reagg_idxs(num_tags, ['region', 'country']))  # 3x2
[0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5, 0, 1, 2, 3, 4, 5]

Here is an example of single tag aggregation:

>>> list(reagg_idxs(num_tags, ['taxonomy']))  # 4
[0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2, 3, 3, 3, 3, 3, 3]
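
The index arithmetic above can be reproduced with itertools.product (an equivalent sketch, not the engine's code):

```python
import itertools

def reagg_idxs(num_tags, tagnames):
    # enumerate all T = T1 x ... x TN tag combinations in insertion order,
    # then map each combination to the index of its projection on tagnames,
    # numbering projected keys in order of first appearance
    names = list(num_tags)
    kept = [i for i, name in enumerate(names) if name in tagnames]
    seen = {}
    for combo in itertools.product(*[range(num_tags[n]) for n in names]):
        key = tuple(combo[i] for i in kept)
        yield seen.setdefault(key, len(seen))
```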

reportwriter module

Utilities to build a report writer generating a .rst report for a calculation

class openquake.calculators.reportwriter.ReportWriter(dstore)[source]

Bases: object

A particularly smart view over the datastore

add(name, obj=None)[source]

Add the view named name to the report text


Build the report and return a reStructuredText string


Save the report

title = {'params': 'Parameters', 'inputs': 'Input files', 'full_lt': 'Composite source model', 'required_params_per_trt': 'Required parameters per tectonic region type', 'ruptures_events': 'Specific information for event based', 'job_info': 'Data transfer', 'biggest_ebr_gmf': 'Maximum memory allocated for the GMFs', 'avglosses_data_transfer': 'Estimated data transfer for the avglosses', 'exposure_info': 'Exposure model', 'slow_sources': 'Slowest sources', 'task:start_classical:0': 'Fastest task', 'task:start_classical:-1': 'Slowest task', 'task_info': 'Information about the tasks', 'eff_ruptures': 'Computation times by source typology', 'performance': 'Slowest operations'}
openquake.calculators.reportwriter.build_report(job_ini, output_dir=None)[source]

Write a report.csv file with information about the calculation without running it

  • job_ini – full pathname of the job.ini file
  • output_dir – the directory where the report is written (default the input directory)

scenario_damage module

class openquake.calculators.scenario_damage.ScenarioDamageCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.RiskCalculator

Damage calculator

accept_precalc = ['scenario', 'event_based', 'event_based_risk']
combine(acc, res)[source]

Combine the results and grow dd_data

core_task(riskinputs, param, monitor)

Core function for a damage computation.


a dictionary of arrays

is_stochastic = True

Compute stats for the aggregated distributions and save the results on the datastore.

precalc = 'event_based'

Sanity check on the total number of buildings

openquake.calculators.scenario_damage.bin_ddd(fractions, n, seed)[source]

Convert fractions into discrete damage distributions using bincount and numpy.random.choice

Parameters:numbers – an array of numbers
Returns:number of non-uint32 numbers
openquake.calculators.scenario_damage.run_sec_sims(damages, haz, sec_sims, seed)[source]
  • damages – array of shape (E, D) for a given asset
  • haz – dataframe of size E with a probability field
  • sec_sims – pair (probability field, number of simulations)
  • seed – random seed to use

Run secondary simulations and update the array damages

openquake.calculators.scenario_damage.scenario_damage(riskinputs, param, monitor)[source]

Core function for a damage computation.


a dictionary of arrays

scenario_risk module

class openquake.calculators.scenario_risk.EventBasedRiskCalculator(oqparam, calc_id)[source]

Bases: openquake.calculators.base.RiskCalculator

Run a scenario/event_based risk calculation

accept_precalc = ['scenario', 'event_based', 'event_based_risk', 'ebrisk']
combine(acc, res)[source]
core_task(riskinputs, param, monitor)

Core function for a scenario_risk/event_based_risk computation.


a dictionary {‘alt’: AggLossTable instance, ‘losses_by_asset’: list of tuples (lt_idx, rlz_idx, asset_ordinal, totloss)}

is_stochastic = True

Compute stats for the aggregated distributions and save the results on the datastore.


Compute the GMFs, build the epsilons, the riskinputs, and a dictionary with the unit of measure, used in the export phase.

precalc = 'event_based'
openquake.calculators.scenario_risk.event_based_risk(riskinputs, param, monitor)[source]

Core function for a scenario_risk/event_based_risk computation.


a dictionary {‘alt’: AggLossTable instance, ‘losses_by_asset’: list of tuples (lt_idx, rlz_idx, asset_ordinal, totloss)}

openquake.calculators.scenario_risk.run_sec_sims(out, loss_types, sec_sims, seed)[source]
  • out – a dictionary of losses
  • loss_types – the loss types
  • sec_sims – pair (probability field, number of simulations)
  • seed – random seed to use

Run secondary simulations and update the losses

views module

class openquake.calculators.views.GmpeExtractor(dstore)[source]

Bases: object

extract(et_ids, rlz_ids)[source]
class openquake.calculators.views.Source(source_id, code, num_ruptures, checksum)

Bases: tuple


Alias for field number 3


Alias for field number 1


Alias for field number 2


Alias for field number 0

openquake.calculators.views.avglosses_data_transfer(token, dstore)[source]

Determine the amount of average losses transferred from the workers to the controller node in a risk calculation.

openquake.calculators.views.binning_error(values, eids, nbins=10)[source]
  • values – E values
  • eids – E integer event indices

std/mean for the sums of the values

Group the values in nbins depending on the eids and returns the variability of the sums relative to the mean.


Format numbers in a nice way.

>>> form(0)
>>> form(0.0)
>>> form(0.0001)
>>> form(1003.4)
>>> form(103.4)
>>> form(9.3)
>>> form(-1.2)
openquake.calculators.views.portfolio_damage_error(token, dstore)[source]

The damages and errors for the full portfolio, extracted from the asset damage table.

openquake.calculators.views.rst_table(data, header=None, fmt=None)[source]

Build a .rst table from a matrix or a DataFrame

>>> tbl = [['a', 1], ['b', 2]]
>>> print(rst_table(tbl, header=['Name', 'Value']))
==== =====
Name Value
==== =====
a    1    
b    2    
==== =====
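
A minimal version producing the same layout might look like this (a sketch handling only the list-of-rows case with an explicit header, unlike the real function):

```python
def rst_table(data, header):
    # pad every cell to its column width; .rst simple tables delimit the
    # header with rows of '=' signs matching each column width
    rows = [list(header)] + [[str(cell) for cell in row] for row in data]
    widths = [max(len(row[i]) for row in rows) for i in range(len(header))]
    sep = ' '.join('=' * w for w in widths)
    lines = [sep, ' '.join(c.ljust(w) for c, w in zip(rows[0], widths)), sep]
    for row in rows[1:]:
        lines.append(' '.join(c.ljust(w) for c, w in zip(row, widths)))
    lines.append(sep)
    return '\n'.join(lines)
```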
openquake.calculators.views.stats(name, array, *extras)[source]

Returns statistics from an array of numbers.

Parameters:name – a descriptive string
Returns:(name, mean, rel_std, min, max, len)

Used to compute summaries. The records are assumed to have numeric fields, except the first field which is ignored, since it typically contains a label. Here is an example:

>>> sum_table([('a', 1), ('b', 2)])
['total', 3]
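
The summary logic can be sketched as follows (a hypothetical reimplementation, assuming records of equal length with a leading label):

```python
def sum_table(records):
    # ignore the first (label) field and sum the numeric fields column-wise
    totals = ['total']
    for i in range(1, len(records[0])):
        totals.append(sum(rec[i] for rec in records))
    return totals
```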
openquake.calculators.views.view_assets_by_site(token, dstore)[source]

Display statistical information about the distribution of the assets

openquake.calculators.views.view_bad_ruptures(token, dstore)[source]

Display the ruptures degenerating to a point

openquake.calculators.views.view_contents(token, dstore)[source]

Returns the size of the contents of the datastore and its total size

openquake.calculators.views.view_disagg_times(token, dstore)[source]

Display slow tasks for disaggregation

openquake.calculators.views.view_ebrups_by_mag(token, dstore)[source]

Show how many event based ruptures there are for each magnitude

openquake.calculators.views.view_eff_ruptures(token, dstore)[source]
openquake.calculators.views.view_event_rates(token, dstore)[source]

Show the number of events per realization multiplied by risk_time/eff_time

openquake.calculators.views.view_events_by_mag(token, dstore)[source]

Show how many events there are for each magnitude

openquake.calculators.views.view_exposure_info(token, dstore)[source]

Display info about the exposure model

openquake.calculators.views.view_extreme(token, dstore)[source]

Show sites where the mean hazard map reaches maximum values

openquake.calculators.views.view_extreme_gmvs(token, dstore)[source]

Display table of extreme GMVs with fields (eid, gmv_0, sid, rlz, rup)

openquake.calculators.views.view_extreme_groups(token, dstore)[source]

Show the source groups contributing the most to the highest IML

openquake.calculators.views.view_full_lt(token, dstore)[source]
openquake.calculators.views.view_fullreport(token, dstore)[source]

Display an .rst report about the computation

openquake.calculators.views.view_global_gmfs(token, dstore)[source]

Display GMFs on the first IMT averaged on everything for debugging purposes

openquake.calculators.views.view_global_hazard(token, dstore)[source]

Display the global hazard for the calculation. This is used for debugging purposes when comparing the results of two calculations.

openquake.calculators.views.view_global_hmaps(token, dstore)[source]

Display the global hazard maps for the calculation. They are used for debugging purposes when comparing the results of two calculations. They are the mean over the sites of the mean hazard maps.

openquake.calculators.views.view_global_poes(token, dstore)[source]

Display global probabilities averaged on all sites and all GMPEs

openquake.calculators.views.view_gmf(token, dstore)[source]

Display a mean gmf for debugging purposes

openquake.calculators.views.view_gmf_error(token, dstore)[source]

Display a gmf relative error for seed dependency

openquake.calculators.views.view_gmvs(token, dstore)[source]

Show the GMVs on a given site ID

openquake.calculators.views.view_gmvs_to_hazard(token, dstore)[source]

Show the number of GMFs over the highest IML

openquake.calculators.views.view_inputs(token, dstore)[source]
openquake.calculators.views.view_job_info(token, dstore)[source]

Determine the amount of data transferred from the controller node to the workers and back in a classical calculation.

openquake.calculators.views.view_maximum_intensity(token, dstore)[source]

Show intensities at minimum and maximum distance for the highest magnitude

openquake.calculators.views.view_mean_disagg(token, dstore)[source]

Display mean quantities for the disaggregation. Useful for checking differences between two calculations.

openquake.calculators.views.view_num_units(token, dstore)[source]

Display the number of units by taxonomy

openquake.calculators.views.view_params(token, dstore)[source]
openquake.calculators.views.view_performance(token, dstore)[source]

Display performance information

openquake.calculators.views.view_portfolio_damage(token, dstore)[source]

The mean full portfolio damage for each loss type, extracted from the average damages

openquake.calculators.views.view_portfolio_loss(token, dstore)[source]

The mean portfolio loss for each loss type, extracted from the event loss table.

openquake.calculators.views.view_portfolio_losses(token, dstore)[source]

The losses for the full portfolio, for each realization and loss type, extracted from the event loss table.

openquake.calculators.views.view_required_params_per_trt(token, dstore)[source]

Display the parameters needed by each tectonic region type

openquake.calculators.views.view_ruptures_events(token, dstore)[source]
openquake.calculators.views.view_short_source_info(token, dstore, maxrows=20)[source]
openquake.calculators.views.view_slow_ruptures(token, dstore, maxrows=25)[source]

Show the slowest ruptures

openquake.calculators.views.view_slow_sources(token, dstore, maxrows=20)[source]

Returns the slowest sources

openquake.calculators.views.view_sum(token, dstore)[source]

Show the sum of an array on the first axis; used to debug the damages

openquake.calculators.views.view_task_durations(token, dstore)[source]

Display the raw task durations. Here is an example of usage:

$ oq show task_durations:classical
openquake.calculators.views.view_task_ebrisk(token, dstore)[source]

Display info about ebrisk tasks:

$ oq show task_ebrisk:-1 # the slowest task

openquake.calculators.views.view_task_hazard(token, dstore)[source]

Display info about a given task. Here are a few examples of usage:

$ oq show task:classical:0  # the fastest task
$ oq show task:classical:-1  # the slowest task
openquake.calculators.views.view_task_info(token, dstore)[source]

Display statistical information about the tasks performance. It is possible to get full information about a specific task with a command like this one, for a classical calculation:

$ oq show task_info:classical
openquake.calculators.views.view_totlosses(token, dstore)[source]

This is a debugging view. You can use it to check that the total losses, i.e. the losses obtained by summing the average losses over all assets, are indeed equal to the aggregate losses. This is a sanity check on the correctness of the implementation.

openquake.calculators.views.view_zero_losses(token, dstore)[source]

Sanity check on avg_losses and avg_gmf

extract module

class openquake.calculators.extract.Extract[source]

Bases: dict

A callable dictionary of functions with a single instance called extract. Then extract(dstore, fullkey) dispatches to the function determined by the first part of fullkey (a slash-separated string) by passing as argument the second part of fullkey.

For instance extract(dstore, ‘sitecol’).
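The dispatch mechanism can be illustrated with a minimal sketch (this mirrors the described behavior of Extract, not its actual implementation; the names are hypothetical):

```python
# A callable dict of functions keyed by the first slash-separated
# component of the key, as described above.
class Dispatcher(dict):
    def add(self, key):
        """Decorator registering a function under the given key"""
        def decorator(func):
            self[key] = func
            return func
        return decorator

    def __call__(self, dstore, fullkey):
        # split 'sitecol/vs30' into 'sitecol' and 'vs30'
        key, _, rest = fullkey.partition('/')
        return self[key](dstore, rest)

extract_sketch = Dispatcher()

@extract_sketch.add('sitecol')
def sitecol(dstore, what):
    # a real extractor would read from the datastore; here we just echo
    return ('sitecol', what)
```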

add(key, cache=False)[source]
class openquake.calculators.extract.Extractor(calc_id)[source]

Bases: object

A class to extract data from a calculation.

Parameters:calc_id – a calculation ID

NB: instantiating the Extractor opens the datastore.


Close the datastore

get(what, asdict=False)[source]
Parameters:what – what to extract
Returns:an ArrayWrapper instance or a dictionary if asdict is True
exception openquake.calculators.extract.NotFound[source]

Bases: Exception

class openquake.calculators.extract.RuptureData(trt, gsims)[source]

Bases: object

Container for information about the ruptures of a given tectonic region type.


Convert a list of rupture proxies into an array of dtype RuptureData.dt

exception openquake.calculators.extract.WebAPIError[source]

Bases: RuntimeError

Wrapper for an error on a WebAPI server

class openquake.calculators.extract.WebExtractor(calc_id, server=None, username=None, password=None)[source]

Bases: openquake.calculators.extract.Extractor

A class to extract data from the WebAPI.

  • calc_id – a calculation ID
  • server – hostname of the webapi server (can be ‘’)
  • username – login username (can be ‘’)
  • password – login password (can be ‘’)

NB: instantiating the WebExtractor opens a session.


Close the session


Dump the remote datastore on a local path.

Parameters:what – what to extract
Returns:an ArrayWrapper instance

Array of bytes

openquake.calculators.extract.build_damage_array(data, damage_dt)[source]
  • data – an array of shape (A, L, D)
  • damage_dt – a damage composite data type loss_type -> states

a composite array of length A and dtype damage_dt
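A damage composite data type of the kind described here can be sketched with NumPy structured dtypes; the loss types and damage states below are hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical loss types and damage states (not the engine's actual ones)
loss_types = ['structural', 'nonstructural']
states = ['no_damage', 'moderate', 'complete']

# damage_dt maps each loss type to a sub-dtype with one float per state
damage_dt = np.dtype([(lt, [(ds, np.float32) for ds in states])
                      for lt in loss_types])

# flatten a (A, L, D) cube of damage fractions into a composite
# array of length A, one field per loss type
A = 4
data = np.random.random((A, len(loss_types), len(states))).astype(np.float32)
out = np.zeros(A, damage_dt)
for li, lt in enumerate(loss_types):
    for di, ds in enumerate(states):
        out[lt][ds] = data[:, li, di]
```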

Parameters:dstore – a datastore instance
Returns:a composite dtype loss_type -> (ds1, ds2, …)
openquake.calculators.extract.cast(loss_array, loss_dt)[source]
openquake.calculators.extract.crm_attrs(dstore, what)[source]
Returns:the attributes of the risk model, i.e. limit_states, loss_types, min_iml and covs, needed by the risk exporters.
openquake.calculators.extract.extract_(dstore, dspath)[source]

Extracts an HDF5 path object from the datastore, for instance extract(dstore, ‘sitecol’).

openquake.calculators.extract.extract_agg_curves(dstore, what)[source]

Aggregate loss curves from the ebrisk calculator:

/extract/agg_curves?kind=stats&absolute=1&loss_type=occupants&occupancy=RES

Returns an array of shape (P, S, 1…) or (P, R, 1…)

openquake.calculators.extract.extract_agg_damages(dstore, what)[source]

Aggregate damages of the given loss type and tags. Use it as /extract/agg_damages/structural?taxonomy=RC&custom_site_id=20126

Returns:an array of shape (R, D), where R is the number of realizations and D the number of damage states, or an array of length 0 if there is no data for the given tags
openquake.calculators.extract.extract_agg_loss_table(dstore, what)[source]
openquake.calculators.extract.extract_agg_losses(dstore, what)[source]

Aggregate losses of the given loss type and tags. Use it as /extract/agg_losses/structural?taxonomy=RC&custom_site_id=20126 /extract/agg_losses/structural?taxonomy=RC&custom_site_id=*

Returns:an array of shape (T, R) if one of the tag names has a * value; an array of shape (R,), where R is the number of realizations; or an array of length 0 if there is no data for the given tags
openquake.calculators.extract.extract_aggregate(dstore, what)[source]

/extract/aggregate/avg_losses?kind=mean&loss_type=structural&tag=taxonomy&tag=occupancy

openquake.calculators.extract.extract_asset_risk(dstore, what)[source]

Extract an array of assets + risk fields, optionally filtered by tag. Use it as /extract/asset_risk?taxonomy=RC&taxonomy=MSBC&occupancy=RES

openquake.calculators.extract.extract_asset_tags(dstore, tagname)[source]

Extract an array of asset tags for the given tagname. Use it as /extract/asset_tags or /extract/asset_tags/taxonomy

openquake.calculators.extract.extract_assets(dstore, what)[source]

Extract an array of assets, optionally filtered by tag. Use it as /extract/assets?taxonomy=RC&taxonomy=MSBC&occupancy=RES

openquake.calculators.extract.extract_avg_gmf(dstore, what)[source]
openquake.calculators.extract.extract_damages_npz(dstore, what)[source]
openquake.calculators.extract.extract_disagg(dstore, what)[source]

Extract a disaggregation output. Example: disagg?kind=Mag_Dist&imt=PGA&poe_id=0&site_id=1

openquake.calculators.extract.extract_disagg_by_src(dstore, what)[source]

Extract the disagg_by_src information. Example:

openquake.calculators.extract.extract_disagg_layer(dstore, what)[source]

Extract a disaggregation layer containing all sites and outputs. Example:

openquake.calculators.extract.extract_effect(dstore, what)[source]

Extracts the effect of ruptures. Use it as /extract/effect

openquake.calculators.extract.extract_eids_by_gsim(dstore, what)[source]

Returns a dictionary gsim -> event_ids for the first TRT. Example:

openquake.calculators.extract.extract_event_info(dstore, eidx)[source]

Extract information about the given event index. Example:

openquake.calculators.extract.extract_exposure_metadata(dstore, what)[source]

Extract the loss categories and the tags of the exposure. Use it as /extract/exposure_metadata

openquake.calculators.extract.extract_extreme_event(dstore, eidx)[source]

Extract information about the given event index. Example:

openquake.calculators.extract.extract_gmf_npz(dstore, what)[source]
openquake.calculators.extract.extract_gridded_sources(dstore, what)[source]

Extract information about the gridded sources (requires ps_grid_spacing). Use it as /extract/gridded_sources?task_no=0. Returns a JSON string id -> lonlats

openquake.calculators.extract.extract_gsims_by_trt(dstore, what)[source]

Extract the dictionary gsims_by_trt

openquake.calculators.extract.extract_hcurves(dstore, what)[source]

Extracts hazard curves. Use it as /extract/hcurves?kind=mean&imt=PGA or /extract/hcurves?kind=rlz-0&imt=SA(1.0)

openquake.calculators.extract.extract_hmaps(dstore, what)[source]

Extracts hazard maps. Use it as /extract/hmaps?imt=PGA

openquake.calculators.extract.extract_losses_by_asset(dstore, what)[source]
openquake.calculators.extract.extract_mean_std_curves(dstore, what)[source]

Yield imls/IMT and poes/IMT containing mean and stddev for all sites

openquake.calculators.extract.extract_mfd(dstore, what)[source]

Display num_ruptures by magnitude for event based calculations. Example:

openquake.calculators.extract.extract_num_events(dstore, what)[source]
Returns:the number of events (if any)
openquake.calculators.extract.extract_oqparam(dstore, dummy)[source]

Extract job parameters as a JSON npz. Use it as /extract/oqparam

openquake.calculators.extract.extract_realizations(dstore, dummy)[source]

Extract an array of realizations. Use it as /extract/realizations

openquake.calculators.extract.extract_relevant_events(dstore, dummy=None)[source]

Extract the relevant events. Example:

openquake.calculators.extract.extract_rups_by_mag_dist(dstore, what)[source]

Extracts the number of ruptures by mag, dist. Use it as /extract/rups_by_mag_dist

openquake.calculators.extract.extract_rupture_info(dstore, what)[source]

Extract some information about the ruptures, including the boundary. Example:

openquake.calculators.extract.extract_ruptures(dstore, what)[source]

Extract some information about the ruptures, including the boundary. Example:

openquake.calculators.extract.extract_sitecol(dstore, what)[source]

Extracts the site collection array (not the complete object, otherwise it would need to be pickled). Use it as /extract/sitecol?field=vs30

openquake.calculators.extract.extract_sources(dstore, what)[source]

Extract information about a source model. Use it as /extract/sources?limit=10 or /extract/sources?source_id=1&source_id=2 or /extract/sources?code=A&code=B

openquake.calculators.extract.extract_task_info(dstore, what)[source]

Extracts the task distribution. Use it as /extract/task_info?kind=classical

openquake.calculators.extract.extract_tot_curves(dstore, what)[source]

Aggregate loss curves from the ebrisk calculator:

/extract/tot_curves?kind=stats&absolute=1&loss_type=occupants

Returns an array of shape (P, S) or (P, R)

openquake.calculators.extract.extract_uhs(dstore, what)[source]

Extracts uniform hazard spectra. Use it as /extract/uhs?kind=mean or /extract/uhs?kind=rlz-0, etc.

openquake.calculators.extract.extract_weights(dstore, what)[source]

Extract the realization weights

Returns:{‘stats’: dic, ‘loss_types’: dic, ‘num_rlzs’: R}
openquake.calculators.extract.get_mesh(sitecol, complete=True)[source]
Returns:a lon-lat or lon-lat-depth array, depending on whether the site collection is at sea level
openquake.calculators.extract.get_ruptures_within(dstore, bbox)[source]

Extract the ruptures within the given bounding box, a string minlon,minlat,maxlon,maxlat. Example: ,44,10,46
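The bounding-box check can be sketched as follows (a simplification that ignores the international date line; the coordinate values in the test are illustrative, not taken from the docs):

```python
# Parse a "minlon,minlat,maxlon,maxlat" string and test point containment
def parse_bbox(bbox):
    minlon, minlat, maxlon, maxlat = map(float, bbox.split(','))
    return minlon, minlat, maxlon, maxlat

def contains(bbox, lon, lat):
    minlon, minlat, maxlon, maxlat = bbox
    return minlon <= lon <= maxlon and minlat <= lat <= maxlat
```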

openquake.calculators.extract.hazard_items(dic, mesh, *extras, **kw)[source]
  • dic – dictionary of arrays of the same shape
  • mesh – a mesh array with lon, lat fields of the same length
  • extras – optional triples (field, dtype, values)
  • kw – dictionary of parameters (like investigation_time)

a list of pairs (key, value) suitable for storage in .npz format
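Assuming all arrays in dic share the mesh length, the pairing logic can be sketched like this (a simplified stand-in for the real function):

```python
import numpy as np

# Pack per-field arrays plus a lon/lat mesh into (key, value) pairs
# suitable for np.savez
def hazard_pairs(dic, mesh, **kw):
    # build a composite dtype: lon, lat, then one field per key in dic
    dtlist = [('lon', np.float64), ('lat', np.float64)]
    for key, arr in dic.items():
        dtlist.append((key, arr.dtype, arr.shape[1:]))
    out = np.zeros(len(mesh), dtlist)
    out['lon'], out['lat'] = mesh['lon'], mesh['lat']
    for key, arr in dic.items():
        out[key] = arr
    return [('all', out)] + sorted(kw.items())
```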


Apply ast.literal_eval to the string if possible; otherwise return it unchanged

openquake.calculators.extract.norm(qdict, params)[source]
openquake.calculators.extract.parse(query_string, info={})[source]
Returns:a normalized query_dict as in the following examples:
>>> parse('kind=stats', {'stats': {'mean': 0, 'max': 1}})
{'kind': ['mean', 'max'], 'k': [0, 1], 'rlzs': False}
>>> parse('kind=rlzs', {'stats': {}, 'num_rlzs': 3})
{'kind': ['rlz-000', 'rlz-001', 'rlz-002'], 'k': [0, 1, 2], 'rlzs': True}
>>> parse('kind=mean', {'stats': {'mean': 0, 'max': 1}})
{'kind': ['mean'], 'k': [0], 'rlzs': False}
>>> parse('kind=rlz-3&imt=PGA&site_id=0', {'stats': {}})
{'kind': ['rlz-3'], 'imt': ['PGA'], 'site_id': [0], 'k': [3], 'rlzs': True}
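The normalization shown in the doctests above can be reproduced with a simplified reimplementation (a sketch, not the engine's actual code):

```python
import ast
from urllib.parse import parse_qs

def literal(v):
    # ast.literal_eval the string if possible, else return it unchanged
    try:
        return ast.literal_eval(v)
    except (ValueError, SyntaxError):
        return v

def parse_sketch(query_string, info={}):
    out = {k: [literal(v) for v in vs]
           for k, vs in parse_qs(query_string).items()}
    [kind] = out['kind']
    if kind == 'stats':          # expand to all known statistics
        out['kind'] = list(info['stats'])
        out['k'] = list(info['stats'].values())
        out['rlzs'] = False
    elif kind == 'rlzs':         # expand to all realizations
        out['kind'] = ['rlz-%03d' % r for r in range(info['num_rlzs'])]
        out['k'] = list(range(info['num_rlzs']))
        out['rlzs'] = True
    elif kind.startswith('rlz-'):  # a single realization
        out['k'] = [int(kind[4:])]
        out['rlzs'] = True
    else:                        # a single statistic such as 'mean'
        out['k'] = [info['stats'][kind]]
        out['rlzs'] = False
    return out
```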

Replace /, ?, & characters with underscores and ‘=’ with ‘-‘