openquake.calculators package

base module

class openquake.calculators.base.BaseCalculator(oqparam, calc_id=None)[source]

Bases: object

Abstract base class for all calculators.

Parameters:
  • oqparam – OqParam object
  • monitor – monitor object
  • calc_id – numeric calculation ID
before_export()[source]

Set the nbytes attributes

core_task(*args)[source]

Core routine running on the workers.

execute()[source]

Execution phase. Usually will run in parallel the core function and return a dictionary with the results.

export(exports=None)[source]

Export all the outputs in the datastore in the given export formats. Individual outputs are not exported if there are multiple realizations.

from_engine = False
is_stochastic = False
monitor(operation, **kw)[source]
Returns:a new Monitor instance
post_execute(result)[source]

Post-processing phase of the aggregated output. It must be overridden with the export code. It will return a dictionary of output files.

pre_calculator = None
pre_execute()[source]

Initialization phase.

run(pre_execute=True, concurrent_tasks=None, close=True, **kw)[source]

Run the calculation and return the exported outputs.
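The run() method drives the template-method lifecycle shared by all calculators: pre_execute, execute, post_execute, then export. A minimal standalone sketch of that flow (the MiniCalculator class and its payload are hypothetical; the real BaseCalculator also manages the datastore, logging and monitoring):

```python
# Minimal sketch of the BaseCalculator template-method lifecycle
# (hypothetical standalone class, not part of the package).
class MiniCalculator:
    def pre_execute(self):           # initialization phase
        self.data = [1, 2, 3]

    def execute(self):               # usually runs in parallel; returns a dict
        return {'total': sum(self.data)}

    def post_execute(self, result):  # post-processing of the aggregated output
        self.result = result

    def run(self):
        self.pre_execute()
        result = self.execute()
        self.post_execute(result)
        return self.result

calc = MiniCalculator()
print(calc.run())  # {'total': 6}
```

Concrete calculators override these hooks while run() keeps the orchestration in one place.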

save_params(**kw)[source]

Update the current calculation parameters and save engine_version

set_log_format()[source]

Set the format of the root logger

class openquake.calculators.base.HazardCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.BaseCalculator

Base class for hazard calculators based on source models

can_read_parent()[source]
Returns:the parent datastore if it is present and can be read from the workers, None otherwise
compute_previous()[source]
count_eff_ruptures(result_dict, src_group_id)[source]

Returns the number of ruptures in the src_group (after filtering) or 0 if the src_group has been filtered away.

Parameters:
  • result_dict – a dictionary with keys (grp_id, gsim)
  • src_group_id – the source group ID
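Since the result dictionary is keyed on (grp_id, gsim) pairs, the count can be obtained by summing over the keys matching the requested group; an empty sum naturally yields 0 for a group that was filtered away. A hedged sketch (the dictionary layout here is illustrative, not the engine's exact one):

```python
# Illustrative sketch of count_eff_ruptures: sum the effective-rupture
# counts stored under keys (grp_id, gsim) for the requested source group.
def count_eff_ruptures(result_dict, src_group_id):
    return sum(eff_ruptures
               for (grp_id, gsim), eff_ruptures in result_dict.items()
               if grp_id == src_group_id)

result = {(0, 'GSIM_A'): 10, (0, 'GSIM_B'): 10, (1, 'GSIM_A'): 5}
print(count_eff_ruptures(result, 0))  # 20
print(count_eff_ruptures(result, 2))  # 0: the group was filtered away
```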
filter_csm()[source]
Returns:(filtered CompositeSourceModel, SourceFilter)
get_min_iml(oq)[source]
init()[source]

To be overridden to initialize the datasets needed by the calculation

load_riskmodel()[source]

Read the risk model and set the attribute .riskmodel. The riskmodel can be empty for hazard calculations. Save the loss ratios (if any) in the datastore.

post_process()[source]

For compatibility with the engine

pre_execute()[source]

Check if there is a pre_calculator or a previous calculation ID. If yes, read the inputs by invoking the precalculator or by retrieving the previous calculation; if not, read the inputs directly.

precalc = None
read_exposure(haz_sitecol=None)[source]

Read the exposure, the riskmodel and update the attributes .sitecol, .assetcol

read_inputs()[source]

Read risk data and sources if any

read_previous(precalc_id)[source]

Read the previous calculation datastore by checking the consistency of the calculation_mode, then read the risk data.

read_risk_data()[source]

Read the exposure (if any), the risk model (if any) and then the site collection, possibly extracted from the exposure.

store_source_info(infos, acc)[source]
exception openquake.calculators.base.InvalidCalculationID[source]

Bases: Exception

Raised when running a post-calculation on top of an incompatible pre-calculation

class openquake.calculators.base.RiskCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Base class for all risk calculators. A risk calculator must set the attributes .riskmodel, .sitecol, .assetcol, .riskinputs in the pre_execute phase.

R
Returns:the number of realizations as read from csm_info
build_riskinputs(kind, eps=None, num_events=0)[source]
Parameters:
  • kind – kind of hazard getter, can be ‘poe’ or ‘gmf’
  • eps – a matrix of epsilons (or None)
  • num_events – how many events there are
Returns:

a list of RiskInputs objects, sorted by IMT.

combine(acc, res)[source]
execute()[source]

Parallelize on the riskinputs and return a dictionary of results. Requires a .core_task to be defined with signature (riskinputs, riskmodel, rlzs_assoc, monitor).

make_eps(num_ruptures)[source]
Parameters:num_ruptures – the size of the epsilon array for each asset
read_shakemap(haz_sitecol, assetcol)[source]

Enabled only if there is a shakemap_id parameter in the job.ini. Download, unzip, parse USGS shakemap files and build a corresponding set of GMFs which are then filtered with the hazard site collection and stored in the datastore.

openquake.calculators.base.check_precalc_consistency(calc_mode, precalc_mode)[source]

Defensive programming against users providing an incorrect pre-calculation ID (with --hazard-calculation-id)

Parameters:
  • calc_mode – calculation_mode of the current calculation
  • precalc_mode – calculation_mode of the previous calculation
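A hedged sketch of the consistency check: the real implementation also accepts a table of compatible mode pairs (e.g. a risk mode on top of its hazard mode); here, for illustration only, equality is required:

```python
class InvalidCalculationID(Exception):
    """Raised when running a post-calculation on top of an
    incompatible pre-calculation"""

# Simplified sketch: the engine's real check also allows compatible
# (calc_mode, precalc_mode) pairs, not just identical modes.
def check_precalc_consistency(calc_mode, precalc_mode):
    if calc_mode != precalc_mode:
        raise InvalidCalculationID(
            'In order to run a calculation of kind %r, '
            'you need a pre-calculation of kind %r, not %r'
            % (calc_mode, calc_mode, precalc_mode))
```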
openquake.calculators.base.check_time_event(oqparam, occupancy_periods)[source]

Check the time_event parameter in the datastore, by comparing with the periods found in the exposure.

openquake.calculators.base.get_gmv_data(sids, gmfs)[source]

Convert an array of shape (R, N, E, I) into an array of type gmv_data_dt

openquake.calculators.base.import_gmfs(dstore, fname, sids)[source]

Import in the datastore a ground motion field CSV file.

Parameters:
  • dstore – the datastore
  • fname – the CSV file
  • sids – the site IDs (complete)
Returns:

event_ids, num_rlzs

openquake.calculators.base.save_gmdata(calc, n_rlzs)[source]

Save a composite array gmdata in the datastore.

Parameters:
  • calc – a calculator with a dictionary .gmdata {rlz: data}
  • n_rlzs – the total number of realizations
openquake.calculators.base.save_gmf_data(dstore, sitecol, gmfs, eids=())[source]
Parameters:
openquake.calculators.base.save_gmfs(calculator)[source]
Parameters:calculator – a scenario_risk/damage or event_based_risk calculator
Returns:a pair (eids, R) where R is the number of realizations
openquake.calculators.base.set_array(longarray, shortarray)[source]
Parameters:
  • longarray – a numpy array of floats of length L >= l
  • shortarray – a numpy array of floats of length l

Fill longarray with the values of shortarray, starting from the left. If shortarray is shorter than longarray, then the remaining elements on the right are filled with numpy.nan values.
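The described semantics can be sketched in a few lines (a minimal reimplementation for illustration, not the engine's source):

```python
import numpy

# Sketch of set_array: copy shortarray into the left part of
# longarray and pad the remainder with NaN.
def set_array(longarray, shortarray):
    n = len(shortarray)
    longarray[:n] = shortarray
    longarray[n:] = numpy.nan

arr = numpy.zeros(5)
set_array(arr, numpy.array([1., 2., 3.]))
print(arr)  # [ 1.  2.  3. nan nan]
```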

classical module

class openquake.calculators.classical.ClassicalCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.classical.PSHACalculator

Classical PSHA calculator

core_task(pgetter, hstats, monitor)
Parameters:
  • pgetter – an openquake.commonlib.getters.PmapGetter
  • hstats – a list of pairs (statname, statfunc)
  • monitor – instance of Monitor
Returns:

a dictionary kind -> ProbabilityMap

The “kind” is a string of the form ‘rlz-XXX’, ‘mean’ or ‘quantile-XXX’ used to specify the kind of output.

execute()[source]

Build statistical hazard curves from the stored PoEs

gen_args()[source]
Yields:pgetter, hstats, monitor
post_execute(acc)[source]

Save the number of bytes per each dataset

pre_calculator = 'psha'
save_hcurves(acc, pmap_by_kind)[source]

Works by side effect by saving hcurves and statistics on the datastore; the accumulator stores the number of bytes saved.

Parameters:
  • acc – dictionary kind -> nbytes
  • pmap_by_kind – a dictionary of ProbabilityMaps
class openquake.calculators.classical.PSHACalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Classical PSHA calculator

agg_dicts(acc, pmap_by_grp)[source]

Aggregate dictionaries of hazard curves by updating the accumulator.

Parameters:
  • acc – accumulator dictionary
  • pmap_by_grp – dictionary grp_id -> ProbabilityMap
core_task(group, src_filter, gsims, param, monitor=<Monitor dummy>)

Compute the hazard curves for a set of sources belonging to the same tectonic region type for all the GSIMs associated to that TRT. The arguments are the same as in calc_hazard_curves(), except for gsims, which is a list of GSIM instances.

Returns:a dictionary {grp_id: pmap} with attributes .grp_ids, .calc_times, .eff_ruptures
execute()[source]

Run core_task(sources, sitecol, monitor) in parallel, distributing the sources according to their weight and tectonic region type.

gen_args(monitor)[source]

Used in the case of large source model logic trees.

Parameters:monitor – a openquake.baselib.performance.Monitor
Yields:(sources, sites, gsims, monitor) tuples
post_execute(pmap_by_grp_id)[source]

Collect the hazard curves by realization and export them.

Parameters:pmap_by_grp_id – a dictionary grp_id -> hazard curves
zerodict()[source]

Initial accumulator, a dict grp_id -> ProbabilityMap(L, G)

class openquake.calculators.classical.PreCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.classical.PSHACalculator

Calculator to filter the sources and compute the number of effective ruptures

core_task(sources, srcfilter, gsims, param, monitor)

Count the number of ruptures contained in the given sources by applying a raw source filtering on the integration distance. Return a dictionary src_group_id -> {}. All sources must belong to the same tectonic region type.

openquake.calculators.classical.build_hcurves_and_stats(pgetter, hstats, monitor)[source]
Parameters:
  • pgetter – an openquake.commonlib.getters.PmapGetter
  • hstats – a list of pairs (statname, statfunc)
  • monitor – instance of Monitor
Returns:

a dictionary kind -> ProbabilityMap

The “kind” is a string of the form ‘rlz-XXX’, ‘mean’ or ‘quantile-XXX’ used to specify the kind of output.

openquake.calculators.classical.count_ruptures(sources, srcfilter, gsims, param, monitor)[source]

Count the number of ruptures contained in the given sources by applying a raw source filtering on the integration distance. Return a dictionary src_group_id -> {}. All sources must belong to the same tectonic region type.

openquake.calculators.classical.fix_ones(pmap)[source]

Physically, an extremely small intensity measure level can have an extremely large probability of exceedance; however, that probability cannot be exactly 1 unless the level is exactly 0. Numerically, the PoE can be 1, and this gives issues when calculating the damage (there is a log(0) in openquake.risklib.scientific.annual_frequency_of_exceedence). Here we solve the issue by replacing the unphysical probabilities 1 with .9999999999999999 (the float64 closest to 1).
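The substitution can be demonstrated on a plain array (a sketch of the idea; the real fix_ones operates on a ProbabilityMap):

```python
import numpy

# Replace PoEs equal to 1 with the largest float64 below 1, so that
# downstream code computing log(1 - poe) does not hit log(0).
def fix_ones(poes):
    poes[poes == 1.] = numpy.nextafter(1., 0.)  # .9999999999999999
    return poes

poes = numpy.array([0.1, 1.0, 0.5])
fix_ones(poes)
print(poes[1] < 1.0)  # True, but only by one ULP
```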

openquake.calculators.classical.get_src_ids(sources)[source]
Returns:a string with the source IDs of the given sources, stripping the extension after the colon, if any
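The colon-stripping behaviour can be sketched as follows (for simplicity the sources here are plain strings; the real code reads a source_id attribute from source objects):

```python
# Sketch of get_src_ids: join the distinct source IDs, stripping the
# extension after the colon used for sub-sources ('src_1:0', 'src_1:1').
def get_src_ids(sources):
    ids = []
    for src in sources:
        sid = src.split(':')[0]  # real code would use src.source_id
        if sid not in ids:
            ids.append(sid)
    return ' '.join(ids)

print(get_src_ids(['src_1:0', 'src_1:1', 'src_2']))  # src_1 src_2
```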
openquake.calculators.classical.saving_sources_by_task(iterargs, dstore)[source]

Yield the iterargs again by populating ‘task_info/source_data’

classical_bcr module

class openquake.calculators.classical_bcr.ClassicalBCRCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.classical_risk.ClassicalRiskCalculator

Classical BCR Risk calculator

core_task(riskinput, riskmodel, param, monitor)

Compute and return the average losses for each asset.

Parameters:
post_execute(result)[source]
openquake.calculators.classical_bcr.classical_bcr(riskinput, riskmodel, param, monitor)[source]

Compute and return the average losses for each asset.

Parameters:

classical_damage module

class openquake.calculators.classical_damage.ClassicalDamageCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.classical_risk.ClassicalRiskCalculator

Scenario damage calculator

core_task(riskinput, riskmodel, param, monitor)

Core function for a classical damage computation.

Parameters:
Returns:

a nested dictionary rlz_idx -> asset -> <damage array>

post_execute(result)[source]

Export the result in CSV format.

Parameters:result – a dictionary asset -> fractions per damage state
openquake.calculators.classical_damage.classical_damage(riskinput, riskmodel, param, monitor)[source]

Core function for a classical damage computation.

Parameters:
Returns:

a nested dictionary rlz_idx -> asset -> <damage array>

classical_risk module

class openquake.calculators.classical_risk.ClassicalRiskCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Classical Risk calculator

core_task(riskinput, riskmodel, param, monitor)

Compute and return the average losses for each asset.

Parameters:
post_execute(result)[source]

Saving loss curves in the datastore.

Parameters:result – aggregated result of the task classical_risk
pre_calculator = 'classical'
pre_execute()[source]

Associate the assets to the sites and build the riskinputs.

openquake.calculators.classical_risk.classical_risk(riskinput, riskmodel, param, monitor)[source]

Compute and return the average losses for each asset.

Parameters:

disaggregation module

Disaggregation calculator core functionality

class openquake.calculators.disaggregation.DisaggregationCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Classical PSHA disaggregation calculator

POE_TOO_BIG = "You are trying to disaggregate for poe=%s.\nHowever the source model #%d, '%s',\nproduces at most probabilities of %.7f for rlz=#%d, IMT=%s.\nThe disaggregation PoE is too big or your model is wrong,\nproducing too small PoEs."
agg_result(acc, result)[source]

Collect the results coming from compute_disagg into self.results, a dictionary with key (sid, rlzi, poe, imt, trti) and values which are probability arrays.

Parameters:
  • acc – dictionary k -> dic accumulating the results
  • result – dictionary with the result coming from a task
build_disagg_by_src(iml4)[source]
Parameters:
  • dstore – a datastore
  • iml4 – 4D array of IMLs with shape (N, 1, M, P)
build_stats(results, hstats)[source]
Parameters:
  • results – dict key -> 6D disagg_matrix
  • hstats – (statname, statfunc) pairs
check_poes_disagg(curves)[source]

Raise an error if the given poes_disagg are too small compared to the hazard curves.

execute()[source]

Performs the disaggregation

full_disaggregation(curves)[source]

Run the disaggregation phase.

Parameters:curves – a list of hazard curves, one per site

The curves can be all None if iml_disagg is set in the job.ini

get_NRPM()[source]
Returns:(num_sites, num_rlzs, num_poes, num_imts)
get_curves(sid)[source]

Get all the relevant hazard curves for the given site ordinal. Returns a dictionary rlz_id -> curve_by_imt.

post_execute(results)[source]

Save all the results of the disaggregation. NB: the number of results to save is #sites * #rlzs * #disagg_poes * #IMTs.

Parameters:results – a dictionary (sid, rlzi, poe, imt) -> trti -> disagg matrix
pre_execute()[source]
save_bin_edges()[source]

Save disagg-bins

save_disagg_result(dskey, results)[source]

Save the computed PMFs in the datastore

Parameters:
  • dskey – dataset key; can be ‘disagg’ or ‘disagg-stats’
  • results – a dictionary sid, rlz, poe, imt -> 6D disagg_matrix
openquake.calculators.disaggregation.agg_probs(*probs)[source]

Aggregate probabilities with the usual formula 1 - (1 - P1) … (1 - Pn)
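The formula assumes independent events: the aggregated probability is the complement of the probability that none of the events occurs. A minimal sketch:

```python
import numpy

# Aggregate independent probabilities: 1 - (1 - P1) ... (1 - Pn).
def agg_probs(*probs):
    acc = 1.
    for prob in probs:
        acc *= 1. - numpy.asarray(prob)
    return 1. - acc

print(agg_probs(0.5, 0.5))       # 0.75
print(agg_probs(0.1, 0.2, 0.3))  # ~0.496
```

The same function works elementwise on arrays of probabilities, since the arithmetic broadcasts.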

openquake.calculators.disaggregation.compute_disagg(src_filter, sources, cmaker, iml4, trti, bin_edges, oqparam, monitor)[source]
Parameters:
  • src_filter – a openquake.hazardlib.calc.filter.SourceFilter instance
  • sources – list of hazardlib source objects
  • cmaker – a openquake.hazardlib.gsim.base.ContextMaker instance
  • iml4 – an array of intensities of shape (N, R, M, P)
  • trti (dict) – tectonic region type index
  • bin_edges – a dictionary site_id -> edges
  • oqparam – the parameters in the job.ini file
  • monitor – monitor of the currently running job
Returns:

a dictionary of probability arrays, with composite key (sid, rlzi, poe, imt, iml, trti).

event_based module

class openquake.calculators.event_based.EventBasedCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Event based PSHA calculator generating the ground motion fields and the hazard curves from the ruptures, depending on the configuration parameters.

combine_pmaps_and_save_gmfs(acc, results)[source]

Combine the hazard curves (if any) and save the gmfs (if any) sequentially; notice that the gmfs may come from different tasks in any order.

Parameters:
  • acc – an accumulator for the hazard curves
  • results – dictionaries rlzi, imt -> [gmf_array, curves_by_imt]
Returns:

a new accumulator

core_task(getters, oq, monitor)
Parameters:
  • getters – a list of GmfGetter instances
  • oq – an OqParam instance
  • monitor – a Monitor instance
Returns:

a list of dictionaries with keys gmfcoll and hcurves

execute()[source]

Run core_task(sources, sitecol, monitor) in parallel, distributing the ruptures according to their weight and tectonic region type.

gen_args()[source]
Yields:the arguments for compute_gmfs_and_curves
is_stochastic = True
post_execute(result)[source]
Parameters:result – a dictionary (src_group_id, gsim) -> haz_curves or an empty dictionary if hazard_curves_from_gmfs is false
pre_calculator = 'event_based_rupture'
save_gmf_bytes()[source]

Save the attribute nbytes in the gmf_data datasets

class openquake.calculators.event_based.EventBasedRuptureCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Event based PSHA calculator generating the ruptures only

agg_dicts(acc, ruptures_by_grp_id)[source]

Accumulate dictionaries of ruptures and populate the events dataset in the datastore.

Parameters:
  • acc – accumulator dictionary
  • ruptures_by_grp_id – a nested dictionary grp_id -> ruptures
core_task(sources, src_filter, gsims, param, monitor)
Parameters:
  • sources – a sequence of sources of the same group
  • src_filter – a source site filter
  • gsims – a list of GSIMs for the current tectonic region model
  • param – a dictionary of additional parameters
  • monitor – monitor instance
Returns:

a dictionary src_group_id -> [Rupture instances]

execute()[source]
gen_args(csm, monitor)[source]

Used in the case of large source model logic trees.

Parameters:
Yields:

(sources, sites, gsims, monitor) tuples

init()[source]

Set the random seed passed to the SourceManager and the minimum_intensity dictionary.

is_stochastic = True
post_execute(result)[source]

Save the SES collection

save_ruptures(ruptures_by_grp_id)[source]

Extend the ‘events’ dataset with the events from the given ruptures; also, save the ruptures if the flag save_ruptures is on.

Parameters:ruptures_by_grp_id – a dictionary grp_id -> list of EBRuptures
zerodict()[source]

Initial accumulator, a dictionary (grp_id, gsim) -> curves

openquake.calculators.event_based.compute_gmfs_and_curves(getters, oq, monitor)[source]
Parameters:
  • getters – a list of GmfGetter instances
  • oq – an OqParam instance
  • monitor – a Monitor instance
Returns:

a list of dictionaries with keys gmfcoll and hcurves

openquake.calculators.event_based.compute_ruptures(sources, src_filter, gsims, param, monitor)[source]
Parameters:
  • sources – a sequence of sources of the same group
  • src_filter – a source site filter
  • gsims – a list of GSIMs for the current tectonic region model
  • param – a dictionary of additional parameters
  • monitor – monitor instance
Returns:

a dictionary src_group_id -> [Rupture instances]

openquake.calculators.event_based.get_events(ebruptures)[source]

Extract an array of dtype stored_event_dt from a list of EBRuptures

openquake.calculators.event_based.get_mean_curves(dstore)[source]

Extract the mean hazard curves from the datastore, as a composite array of length nsites.

openquake.calculators.event_based.set_counts(dstore, dsetname)[source]
Parameters:
  • dstore – a DataStore instance
  • dsetname – name of dataset with a field grp_id
Returns:

a dictionary grp_id > counts

openquake.calculators.event_based.set_random_years(dstore, name, investigation_time)[source]

Set year labels on the events dataset, depending on the SES ordinal and the investigation time.

Parameters:
  • dstore – a DataStore instance
  • name – name of the dataset (‘events’)
  • investigation_time – investigation time
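A hedged sketch of the year-labelling idea: assign each event a reproducible random year in [1, investigation_time] (the helper below is hypothetical; the real function also offsets the years by the SES ordinal and writes them into the datastore):

```python
import numpy

# Hypothetical helper illustrating random year labels: each of
# num_events events gets a year in [1, investigation_time].
def random_years(num_events, investigation_time, seed=42):
    rng = numpy.random.RandomState(seed)
    # randint's upper bound is exclusive, hence the + 1
    return rng.randint(1, investigation_time + 1, num_events)

years = random_years(10, 50)
print(years.min() >= 1 and years.max() <= 50)  # True
```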
openquake.calculators.event_based.update_nbytes(dstore, key, array)[source]

event_based_risk module

class openquake.calculators.event_based_risk.EbrCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Event based PSHA calculator generating the total losses by taxonomy

combine(dummy, res)[source]
Parameters:
  • dummy – unused parameter
  • res – a result dictionary
core_task(riskinput, riskmodel, param, monitor)
Parameters:
Returns:

a dictionary of numpy arrays of shape (L, R)

epsilon_getter()[source]
Returns:a callable (start, stop) producing a slice of epsilons
execute()[source]

Run the calculator and aggregate the results

gen_args()[source]

Yield the arguments required by build_ruptures, i.e. the source models, the asset collection, the riskmodel and others.

is_stochastic = True
post_execute(result)[source]

Save risk data and possibly execute the EbrPostCalculator

postproc()[source]

Build aggregate loss curves and run EbrPostCalculator

pre_calculator = 'event_based_rupture'
pre_execute()[source]
save_losses(dic, offset=0)[source]

Save the event loss tables incrementally.

Parameters:
  • dic – dictionary with agglosses, assratios, avglosses, lrs_idx
  • offset – realization offset
save_results(allres, num_rlzs)[source]
Parameters:
  • allres – an iterable of result iterators
  • num_rlzs – the total number of realizations
Returns:

the total number of events

start_tasks(sm_id, sitecol, assetcol, riskmodel, imtls, trunc_level, correl_model, min_iml)[source]
Parameters:
  • sm_id – source model ordinal
  • sitecol – a SiteCollection instance
  • assetcol – an AssetCollection instance
  • riskmodel – a RiskModel instance
  • imtls – Intensity Measure Types and Levels
  • trunc_level – truncation level
  • correl_model – correlation model
  • min_iml – vector of minimum intensities, one per IMT
Returns:

an IterResult instance

class openquake.calculators.event_based_risk.EbrPostCalculator(calc)[source]

Bases: openquake.calculators.base.RiskCalculator

execute()[source]
post_execute()[source]
pre_execute()[source]
save_curves_maps(acc, res)[source]

Save the loss curves and maps (if any).

Returns:the total number of stored bytes.
openquake.calculators.event_based_risk.build_curves_maps(avalues, builder, lrgetter, stats, clp, monitor)[source]

Build loss curves and optionally maps if conditional_loss_poes are set.

openquake.calculators.event_based_risk.build_loss_tables(dstore)[source]

Compute the total losses by rupture and losses by rlzi.

openquake.calculators.event_based_risk.event_based_risk(riskinput, riskmodel, param, monitor)[source]
Parameters:
Returns:

a dictionary of numpy arrays of shape (L, R)

reportwriter module

Utilities to build a report writer generating a .rst report for a calculation

class openquake.calculators.reportwriter.ReportWriter(dstore)[source]

Bases: object

A particularly smart view over the datastore

add(name, obj=None)[source]

Add the view named name to the report text

make_report()[source]

Build the report and return a restructed text string

save(fname)[source]

Save the report

title = {'task_classical:0': 'Fastest task', 'dupl_sources': 'Duplicated sources', 'short_source_info': 'Slowest sources', 'rlzs_assoc': 'Realizations per (TRT, GSIM)', 'params': 'Parameters', 'job_info': 'Data transfer', 'performance': 'Slowest operations', 'task_info': 'Information about the tasks', 'times_by_source_class': 'Computation times by source typology', 'required_params_per_trt': 'Required parameters per tectonic region type', 'biggest_ebr_gmf': 'Maximum memory allocated for the GMFs', 'exposure_info': 'Exposure model', 'task_classical:-1': 'Slowest task', 'avglosses_data_transfer': 'Estimated data transfer for the avglosses', 'ruptures_events': 'Specific information for event based', 'inputs': 'Input files', 'csm_info': 'Composite source model', 'ruptures_per_trt': 'Number of ruptures per tectonic region type'}
openquake.calculators.reportwriter.build_report(job_ini, output_dir=None)[source]

Write a report.csv file with information about the calculation without running it

Parameters:
  • job_ini – full pathname of the job.ini file
  • output_dir – the directory where the report is written (default the input directory)
openquake.calculators.reportwriter.indent(text)[source]

scenario module

class openquake.calculators.scenario.ScenarioCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.HazardCalculator

Scenario hazard calculator

execute()[source]

Compute the GMFs and return a dictionary gsim -> array(N, E, I)

init()[source]
is_stochastic = True
post_execute(dummy)[source]
pre_execute()[source]

Read the site collection and initialize GmfComputer and seeds

scenario_damage module

class openquake.calculators.scenario_damage.ScenarioDamageCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Scenario damage calculator

core_task(riskinput, riskmodel, param, monitor)

Core function for a damage computation.

Parameters:
Returns:

a dictionary {‘d_asset’: [(l, r, a, mean-stddev), …], ‘d_event’: damage array of shape (R, L, E, D), ‘c_asset’: [(l, r, a, mean-stddev), …], ‘c_event’: consequence array of shape (R, L, E)}

d_asset and d_event describe the damage distributions, whereas c_asset and c_event are the consequence distributions. If there is no consequence model, c_asset is an empty list and c_event is a zero-valued array.

is_stochastic = True
post_execute(result)[source]

Compute stats for the aggregated distributions and save the results on the datastore.

pre_calculator = 'scenario'
pre_execute()[source]
openquake.calculators.scenario_damage.dist_by_asset(data, multi_stat_dt, number)[source]
Parameters:
  • data – array of shape (N, R, L, 2, …)
  • multi_stat_dt – numpy dtype for statistical outputs
  • number – expected number of units per asset
Returns:

array of shape (N, R) with records of type multi_stat_dt

openquake.calculators.scenario_damage.scenario_damage(riskinput, riskmodel, param, monitor)[source]

Core function for a damage computation.

Parameters:
Returns:

a dictionary {‘d_asset’: [(l, r, a, mean-stddev), …], ‘d_event’: damage array of shape (R, L, E, D), ‘c_asset’: [(l, r, a, mean-stddev), …], ‘c_event’: consequence array of shape (R, L, E)}

d_asset and d_event describe the damage distributions, whereas c_asset and c_event are the consequence distributions. If there is no consequence model, c_asset is an empty list and c_event is a zero-valued array.

scenario_risk module

class openquake.calculators.scenario_risk.ScenarioRiskCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.base.RiskCalculator

Run a scenario risk calculation

core_task(riskinput, riskmodel, param, monitor)

Core function for a scenario computation.

Parameters:
Returns:

a dictionary { ‘agg’: array of shape (E, L, R, 2), ‘avg’: list of tuples (lt_idx, rlz_idx, asset_ordinal, statistics) } where E is the number of simulated events, L the number of loss types, R the number of realizations and statistics is an array of shape (n, R, 4), with n the number of assets in the current riskinput object

is_stochastic = True
post_execute(result)[source]

Compute stats for the aggregated distributions and save the results on the datastore.

pre_calculator = 'scenario'
pre_execute()[source]

Compute the GMFs, build the epsilons, the riskinputs, and a dictionary with the unit of measure, used in the export phase.

openquake.calculators.scenario_risk.scenario_risk(riskinput, riskmodel, param, monitor)[source]

Core function for a scenario computation.

Parameters:
Returns:

a dictionary { ‘agg’: array of shape (E, L, R, 2), ‘avg’: list of tuples (lt_idx, rlz_idx, asset_ordinal, statistics) } where E is the number of simulated events, L the number of loss types, R the number of realizations and statistics is an array of shape (n, R, 4), with n the number of assets in the current riskinput object

ucerf_classical module

class openquake.calculators.ucerf_classical.UCERFClassicalCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.classical.ClassicalCalculator

pre_calculator = 'ucerf_psha'
class openquake.calculators.ucerf_classical.UcerfPSHACalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.classical.PSHACalculator

UCERF classical calculator.

core_task(rupset_idx, ucerf_source, src_filter, gsims, monitor)
Parameters:
  • rupset_idx – indices of the rupture sets
  • ucerf_source – an object taking the place of a source for UCERF
  • src_filter – a source filter returning the sites affected by the source
  • gsims – a list of GSIMs
  • monitor – a monitor instance
Returns:

a ProbabilityMap

execute()[source]

Run core_task(sources, sitecol, monitor) in parallel, distributing the sources according to their weight and tectonic region type.

is_stochastic = False
pre_execute()[source]

Parse the logic tree and the source model input

openquake.calculators.ucerf_classical.convert_UCERFSource(self, node)[source]

Converts the Ucerf Source node into an SES Control object

openquake.calculators.ucerf_classical.ucerf_classical(rupset_idx, ucerf_source, src_filter, gsims, monitor)[source]
Parameters:
  • rupset_idx – indices of the rupture sets
  • ucerf_source – an object taking the place of a source for UCERF
  • src_filter – a source filter returning the sites affected by the source
  • gsims – a list of GSIMs
  • monitor – a monitor instance
Returns:

a ProbabilityMap

ucerf_event_based module

class openquake.calculators.ucerf_event_based.ImperfectPlanarSurface(strike, dip, top_left, top_right, bottom_right, bottom_left)[source]

Bases: openquake.hazardlib.geo.surface.planar.PlanarSurface

The planar surface class sets a narrow tolerance for the rectangular plane to be distorted in Cartesian space. Ruptures with aspect ratios << 1.0, and with a dip of less than 90 degrees, cannot be generated in a manner consistent with the definitions, and thus cannot be instantiated. This subclass modifies the original planar surface class so that the tolerance checks are overridden. We find that distance errors with respect to a simple fault surface with a mesh spacing of 0.001 km are only on the order of < 0.15% for Rrup (< 2% for Rjb, < 3.0E-5% for Rx).

IMPERFECT_RECTANGLE_TOLERANCE = inf
class openquake.calculators.ucerf_event_based.List[source]

Bases: list

Trivial container returned by compute_losses

class openquake.calculators.ucerf_event_based.UCERFHazardCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.event_based.EventBasedCalculator

Runs a standard event based calculation starting from UCERF ruptures

pre_calculator = 'ucerf_rupture'
class openquake.calculators.ucerf_event_based.UCERFRiskCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.event_based_risk.EbrCalculator

Event based risk calculator for UCERF, parallelizing on the source models

execute()[source]
gen_args()[source]

Yield the arguments required by build_ruptures, i.e. the source models, the asset collection, the riskmodel and others.

pre_execute()

Parse the logic tree and the source model input

class openquake.calculators.ucerf_event_based.UCERFRuptureCalculator(oqparam, calc_id=None)[source]

Bases: openquake.calculators.event_based.EventBasedRuptureCalculator

Event based PSHA calculator generating the ruptures only

core_task(sources, src_filter, gsims, param, monitor)
Parameters:
  • sources – a list with a single UCERF source
  • src_filter – a SourceFilter instance
  • gsims – a list of GSIMs
  • param – extra parameters
  • monitor – a Monitor instance
Returns:

an AccumDict grp_id -> EBRuptures

gen_args(csm, monitor)[source]

Generate a task for each branch

pre_execute()[source]

Parse the logic tree and the source model input

class openquake.calculators.ucerf_event_based.UCERFSource(source_file, id, investigation_time, start_date, min_mag, npd=<openquake.hazardlib.pmf.PMF object>, hdd=<openquake.hazardlib.pmf.PMF object>, aspect=1.5, upper_seismogenic_depth=0.0, lower_seismogenic_depth=15.0, msr=<WC1994>, mesh_spacing=1.0, trt='Active Shallow Crust', integration_distance=1000)[source]

Bases: object

Parameters:
  • source_file – Path to an existing HDF5 file containing the UCERF model
  • id (str) – Valid branch of UCERF
  • investigation_time (float) – Investigation time of event set (years)
  • start_date – Starting date of the investigation (None for time independent)
  • min_mag (float) – Minimum magnitude for consideration of background sources
  • npd – Nodal plane distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • hdd – Hypocentral depth distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • aspect (float) – Aspect ratio
  • upper_seismogenic_depth (float) – Upper seismogenic depth (km)
  • lower_seismogenic_depth (float) – Lower seismogenic depth (km)
  • msr – Magnitude scaling relation
  • mesh_spacing (float) – Spacing (km) of fault mesh
  • trt (str) – Tectonic region type
  • integration_distance (float) – Maximum distance from rupture to site for consideration
count_ruptures()[source]

The length of the rupture array if the branch_id is set, else 0

gen_trace_planes(ridx)[source]
Yields:trace and rupture planes for the given rupture index
generate_event_set(background_sids, src_filter, seed)[source]

Generates the event set corresponding to a particular branch

get_background_sids(src_filter)[source]

The filtering of the background sites can be applied as a pre-processing step; this is done here rather than in the sampling of the ruptures themselves

get_background_sources(src_filter)[source]

Turn the background model of a given branch into a set of point sources

Parameters:src_filter – SourceFilter instance
get_centroids(ridx)[source]
Returns:array of centroids for the given rupture index
get_min_max_mag()[source]

Called when updating the SourceGroup

get_ridx(iloc)[source]

List of rupture indices for the given iloc

get_rupture_sites(ridx, src_filter, mag)[source]

Determines if a rupture is likely to be inside the integration distance by considering the set of fault plane centroids and returns the affected sites if any.

Parameters:
  • ridx – List of indices composing the rupture sections
  • src_filter – SourceFilter instance
  • mag – Magnitude of the rupture for consideration
Returns:

The sites affected by the rupture (or None)

get_ucerf_rupture(iloc, src_filter)[source]
Parameters:
  • iloc – Location of the rupture plane in the hdf5 file
  • src_filter – Sites for consideration and maximum distance
iter_ruptures()[source]

Yield ruptures for the current set of indices (.rupset_idx)

mags
new(grp_id, branch_id)[source]
Parameters:
  • grp_id – ordinal of the source group
  • branch_id – string associated to the branch
Returns:

a new UCERFSource associated to the branch_id

rake
rate
tectonic_region_type = 'Active Shallow Crust'
weight

Weight of the source, equal to the number of ruptures contained

openquake.calculators.ucerf_event_based.build_idx_set(branch_id, start_date)[source]

Builds a dictionary of keys based on the branch code

openquake.calculators.ucerf_event_based.compute_losses(ssm, src_filter, param, riskmodel, imts, trunc_level, correl_model, min_iml, monitor)[source]

Compute the losses for a single source model. Returns the ruptures as an attribute .ruptures_by_grp of the list of losses.

Parameters:
  • ssm – CompositeSourceModel containing a single source model
  • src_filter – a SourceFilter instance
  • param – a dictionary of parameters
  • riskmodel – a RiskModel instance
  • imts – a list of Intensity Measure Types
  • trunc_level – truncation level
  • correl_model – correlation model
  • min_iml – vector of minimum intensities, one per IMT
  • monitor – a Monitor instance
Returns:

a List containing the losses by taxonomy and some attributes

openquake.calculators.ucerf_event_based.compute_ruptures(sources, src_filter, gsims, param, monitor)[source]
Parameters:
  • sources – a list with a single UCERF source
  • src_filter – a SourceFilter instance
  • gsims – a list of GSIMs
  • param – extra parameters
  • monitor – a Monitor instance
Returns:

an AccumDict grp_id -> EBRuptures

openquake.calculators.ucerf_event_based.generate_background_ruptures(tom, locations, occurrence, mag, npd, hdd, upper_seismogenic_depth, lower_seismogenic_depth, msr=<WC1994>, aspect=1.5, trt='Active Shallow Crust')[source]
Parameters:
  • tom – Temporal occurrence model as instance of :class: openquake.hazardlib.tom.TOM
  • locations (numpy.ndarray) – Array of locations [Longitude, Latitude] of the point sources
  • occurrence (numpy.ndarray) – Annual rates of occurrence
  • mag (float) – Magnitude
  • npd – Nodal plane distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • hdd – Hypocentral depth distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • upper_seismogenic_depth (float) – Upper seismogenic depth (km)
  • lower_seismogenic_depth (float) – Lower seismogenic depth (km)
  • msr – Magnitude scaling relation
  • aspect (float) – Aspect ratio
  • trt (str) – Tectonic region type
Returns:

List of ruptures
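The expansion over the nodal-plane and hypocentral-depth distributions amounts to splitting each annual rate across every (nodal plane, depth) combination, weighted by the product of the two probabilities. A hedged, self-contained illustration (the real distributions are `openquake.hazardlib.pmf.PMF` instances; the pair-of-tuples representation and the helper name are assumptions for this sketch):

```python
def expand_rates(rate, npd, hdd):
    """Split an annual occurrence rate over nodal-plane and hypocentral
    depth distributions, each given as a list of (probability, value)
    pairs. Returns a list of (partial_rate, nodal_plane, depth) tuples,
    one per rupture; the partial rates sum back to the input rate."""
    out = []
    for np_prob, nodal_plane in npd:
        for hd_prob, depth in hdd:
            out.append((rate * np_prob * hd_prob, nodal_plane, depth))
    return out

# One rupture per (nodal plane, depth) combination, illustrative values:
combos = expand_rates(
    0.01,
    npd=[(0.5, "strike=0,dip=90"), (0.5, "strike=90,dip=45")],
    hdd=[(1.0, 5.0)],
)
```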

openquake.calculators.ucerf_event_based.get_composite_source_model(oq)[source]
Parameters:oq – an openquake.commonlib.oqvalidation.OqParam instance
Returns:an openquake.commonlib.source.CompositeSourceModel instance
openquake.calculators.ucerf_event_based.get_rupture_dimensions(mag, nodal_plane, msr, rupture_aspect_ratio, upper_seismogenic_depth, lower_seismogenic_depth)[source]

Calculate and return the rupture length and width for given magnitude mag and nodal plane.

Parameters:nodal_plane – Instance of openquake.hazardlib.geo.nodalplane.NodalPlane.
Returns:Tuple of two items: rupture length and width, in km.

The rupture area is calculated using the get_median_area() method of the source’s magnitude-scaling relationship; in any case the product of the returned dimensions is equal to that area. The area is then decomposed into length and width according to the source’s rupture aspect ratio.

If the rupture width, inclined by the nodal plane’s dip angle, would not fit between the upper and lower seismogenic depths, the width is shrunk to the maximum possible and the length is extended so as to preserve the area.
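The decomposition and clamping described above can be written down directly. A hedged sketch (not the engine's code) under the assumption that the area, aspect ratio, dip and seismogenic depths are already known:

```python
import math

def rupture_dims(area, aspect, dip, usd, lsd):
    """Decompose a rupture area (km^2) into length and width (km),
    honouring the aspect ratio where possible and preserving the area
    when the down-dip width must be clamped to the seismogenic layer."""
    length = math.sqrt(area * aspect)   # length/width == aspect
    width = area / length
    # maximum down-dip width fitting between the seismogenic depths
    max_width = (lsd - usd) / math.sin(math.radians(dip))
    if width > max_width:
        width = max_width
        length = area / width  # stretch the length to preserve the area
    return length, width
```

For example, a 100 km² rupture with aspect ratio 1.0 on a vertical fault limited to a 5 km seismogenic layer comes out as 20 km long and 5 km wide rather than 10 km by 10 km, with the area unchanged.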

openquake.calculators.ucerf_event_based.get_rupture_surface(mag, nodal_plane, hypocenter, msr, rupture_aspect_ratio, upper_seismogenic_depth, lower_seismogenic_depth, mesh_spacing=1.0)[source]

Create and return rupture surface object with given properties.

Parameters:
  • mag – Magnitude value, used to calculate rupture dimensions, see _get_rupture_dimensions().
  • nodal_plane – Instance of openquake.hazardlib.geo.nodalplane.NodalPlane describing the rupture orientation.
  • hypocenter – Point representing rupture’s hypocenter.
Returns:

Instance of PlanarSurface.

openquake.calculators.ucerf_event_based.sample_background_model(hdf5, branch_key, tom, seed, filter_idx, min_mag, npd, hdd, upper_seismogenic_depth, lower_seismogenic_depth, msr=<WC1994>, aspect=1.5, trt='Active Shallow Crust')[source]

Generates a rupture set from a sample of the background model

Parameters:
  • branch_key – Key to indicate the branch for selecting the background model
  • tom – Temporal occurrence model as instance of :class: openquake.hazardlib.tom.TOM
  • seed – Random seed to use in the call to tom.sample_number_of_occurrences
  • filter_idx – Sites for consideration (can be None!)
  • min_mag (float) – Minimum magnitude for consideration of background sources
  • npd – Nodal plane distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • hdd – Hypocentral depth distribution as instance of :class: openquake.hazardlib.pmf.PMF
  • aspect (float) – Aspect ratio
  • upper_seismogenic_depth (float) – Upper seismogenic depth (km)
  • lower_seismogenic_depth (float) – Lower seismogenic depth (km)
  • msr – Magnitude scaling relation
  • integration_distance (float) – Maximum distance from rupture to site for consideration

views module

openquake.calculators.views.avglosses_data_transfer(token, dstore)[source]

Determine the amount of average losses transferred from the workers to the controller node in a risk calculation.

openquake.calculators.views.classify_gsim_lt(gsim_lt)[source]
Returns:“trivial”, “simple” or “complex”
openquake.calculators.views.ebr_data_transfer(token, dstore)[source]

Display the data transferred in an event based risk calculation

openquake.calculators.views.form(value)[source]

Format numbers in a nice way.

>>> form(0)
'0'
>>> form(0.0)
'0.0'
>>> form(0.0001)
'1.000E-04'
>>> form(1003.4)
'1,003'
>>> form(103.4)
'103'
>>> form(9.3)
'9.30000'
>>> form(-1.2)
'-1.2'
openquake.calculators.views.performance_view(dstore)[source]

Returns the performance view as a numpy array.

openquake.calculators.views.rst_table(data, header=None, fmt=None)[source]

Build a .rst table from a matrix.

>>> tbl = [['a', 1], ['b', 2]]
>>> print(rst_table(tbl, header=['Name', 'Value']))
==== =====
Name Value
==== =====
a    1    
b    2    
==== =====
openquake.calculators.views.stats(name, array, *extras)[source]

Returns statistics from an array of numbers.

Parameters:name – a descriptive string
Returns:(name, mean, std, min, max, len)
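A minimal sketch of such a summary using only the standard library (the engine's version operates on numpy arrays; the choice of population standard deviation here is an assumption):

```python
import statistics

def stats(name, values):
    # Mirror the documented return shape: (name, mean, std, min, max, len)
    return (name,
            statistics.mean(values),
            statistics.pstdev(values),  # population std: an assumption
            min(values),
            max(values),
            len(values))

row = stats("durations", [1.0, 2.0, 3.0])
```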
openquake.calculators.views.sum_table(records)[source]

Used to compute summaries. The records are assumed to have numeric fields, except the first field which is ignored, since it typically contains a label. Here is an example:

>>> sum_table([('a', 1), ('b', 2)])
['total', 3]
openquake.calculators.views.sum_tbl(tbl, kfield, vfields)[source]

Aggregate a composite array and compute the totals on a given key.

>>> dt = numpy.dtype([('name', (bytes, 10)), ('value', int)])
>>> tbl = numpy.array([('a', 1), ('a', 2), ('b', 3)], dt)
>>> sum_tbl(tbl, 'name', ['value'])['value']
array([3, 3])
openquake.calculators.views.view_assets_by_site(token, dstore)[source]

Display statistical information about the distribution of the assets

openquake.calculators.views.view_contents(token, dstore)[source]

Returns the size of the contents of the datastore and its total size

openquake.calculators.views.view_csm_info(token, dstore)[source]
openquake.calculators.views.view_dupl_sources(token, dstore)[source]

Display the duplicated sources from source_info

openquake.calculators.views.view_elt(token, dstore)[source]

Display the event loss table averaged by event

openquake.calculators.views.view_exposure_info(token, dstore)[source]

Display info about the exposure model

openquake.calculators.views.view_flat_hcurves(token, dstore)[source]

Display the flat hazard curves for the calculation. They are used for debugging purposes when comparing the results of two calculations. They are the mean over the sites of the mean hazard curves.

openquake.calculators.views.view_flat_hmaps(token, dstore)[source]

Display the flat hazard maps for the calculation. They are used for debugging purposes when comparing the results of two calculations. They are the mean over the sites of the mean hazard maps.

openquake.calculators.views.view_fullreport(token, dstore)[source]

Display an .rst report about the computation

openquake.calculators.views.view_global_poes(token, dstore)[source]

Display global probabilities averaged on all sites and all GMPEs

openquake.calculators.views.view_hmap(token, dstore)[source]

Display the highest 20 points of the mean hazard map. Called as $ oq show hmap:0.1 # 10% PoE

openquake.calculators.views.view_inputs(token, dstore)[source]
openquake.calculators.views.view_job_info(token, dstore)[source]

Determine the amount of data transferred from the controller node to the workers and back in a classical calculation.

openquake.calculators.views.view_mean_avg_losses(token, dstore)[source]
openquake.calculators.views.view_mean_disagg(token, dstore)[source]

Display mean quantities for the disaggregation. Useful for checking differences between two calculations.

openquake.calculators.views.view_num_units(token, dstore)[source]

Display the number of units by taxonomy

openquake.calculators.views.view_params(token, dstore)[source]
openquake.calculators.views.view_performance(token, dstore)[source]

Display performance information

openquake.calculators.views.view_pmap(token, dstore)[source]

Display the mean ProbabilityMap associated to a given source group name

openquake.calculators.views.view_portfolio_loss(token, dstore)[source]

The loss for the full portfolio, for each realization and loss type, extracted from the event loss table.

openquake.calculators.views.view_required_params_per_trt(token, dstore)[source]

Display the parameters needed by each tectonic region type

openquake.calculators.views.view_ruptures_events(token, dstore)[source]
openquake.calculators.views.view_ruptures_per_trt(token, dstore)[source]
openquake.calculators.views.view_short_source_info(token, dstore, maxrows=20)[source]
openquake.calculators.views.view_slow_sources(token, dstore, maxrows=20)[source]

Returns the slowest sources

openquake.calculators.views.view_task_classical(token, dstore)[source]

Display info about a given task. Here are a few examples of usage:

$ oq show task_classical:0  # the fastest task
$ oq show task_classical:-1  # the slowest task
openquake.calculators.views.view_task_durations(token, dstore)[source]

Display the raw task durations. Here is an example of usage:

$ oq show task_durations:classical
openquake.calculators.views.view_task_info(token, dstore)[source]

Display statistical information about the tasks performance. It is possible to get full information about a specific task with a command like this one, for a classical calculation:

$ oq show task_info:classical
openquake.calculators.views.view_task_risk(token, dstore)[source]

Display info about a given risk task. Here are a few examples of usage:

$ oq show task_risk:0  # the fastest task
$ oq show task_risk:-1  # the slowest task
openquake.calculators.views.view_times_by_source_class(token, dstore)[source]

Returns the calculation times depending on the source typology

openquake.calculators.views.view_totlosses(token, dstore)[source]

This is a debugging view. You can use it to check that the total losses, i.e. the losses obtained by summing the average losses on all assets are indeed equal to the aggregate losses. This is a sanity check for the correctness of the implementation.
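The check amounts to comparing two aggregations of the same loss data: the sum of per-asset average losses against the sum of the event loss table. A hedged sketch with made-up numbers:

```python
# Illustrative values for one realization and loss type: per-asset
# average losses on one side, per-event aggregate losses on the other.
avg_losses = {"asset1": 120.0, "asset2": 80.0}
event_losses = [50.0, 150.0]

total_from_assets = sum(avg_losses.values())
total_from_events = sum(event_losses)

# the two totals must agree up to numerical tolerance
assert abs(total_from_assets - total_from_events) < 1e-9
```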