openquake.risklib package#
openquake.risklib.riskinput module#
- class openquake.risklib.riskinput.RiskInput(hazard_getter, asset_df)[source]#
- Bases: object
- Site-specific inputs.
- Parameters:
- hazard_getter – a callable returning the hazard data for all realizations 
- asset_df – a DataFrame of assets on the given site 
 
 
openquake.risklib.riskmodels module#
- class openquake.risklib.riskmodels.CompositeRiskModel(oqparam, risklist, consdict=())[source]#
- Bases: Mapping
- A container (riskid, kind) -> riskmodel
- Parameters:
- oqparam – an openquake.commonlib.oqvalidation.OqParam instance
- risklist – a list of risk functions with attributes .id, .loss_type, .kind 
- consdict – a dictionary riskid -> loss_type -> consequence functions 
 
 - compute_csq(assets, dd5, tmap_df, oq)[source]#
- Parameters:
- assets – asset array 
- dd5 – distribution functions of shape (P, A, E, L, D) 
- tmap_df – DataFrame corresponding to the given taxonomy 
- oq – OqParam instance with .loss_types and .time_event 
 
- Returns:
- a dict (consequence_name, loss_type) -> array[P, A, E] 
 
 - get_outputs(asset_df, haz, sec_losses=(), rndgen=None, country='?')[source]#
- Parameters:
- asset_df – a DataFrame of assets with the same taxonomy and country 
- haz – a DataFrame of GMVs on the sites of the assets 
- sec_losses – a list of functions 
- rndgen – a MultiEventRNG instance 
 
- Returns:
- a list of dictionaries loss_type -> output 
 
 - classmethod read(dstore, oqparam)[source]#
- Parameters:
- dstore – a DataStore instance 
- Returns:
- a CompositeRiskModel instance
 
 - reduce(taxonomies)[source]#
- Parameters:
- taxonomies – a set of taxonomies 
- Returns:
- a new CompositeRiskModel reduced to the given taxonomies 
 
 - set_tmap(tmap_df, taxidx)[source]#
- Set the attribute .tmap_df if the risk IDs in the taxonomy mapping are consistent with the fragility functions. 
 - property taxonomy_dict#
- Returns:
- a dict taxonomy string -> taxonomy index 
 
 - tmap_df = ()#
 
- class openquake.risklib.riskmodels.PerilDict[source]#
- Bases: dict
>>> pd = PerilDict({('groundshaking', 'structural'): .23})
>>> pd['structural']
0.23
>>> pd['structurl']
Traceback (most recent call last):
...
KeyError: ('groundshaking', 'structurl')
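The doctest above implies that a bare loss-type key falls back to the ('groundshaking', loss_type) pair. A minimal sketch of how such a fallback can be implemented (an illustration consistent with the doctest, not necessarily the engine's actual code):

```python
class PerilDict(dict):
    """Dict keyed by (peril, loss_type) pairs with a groundshaking fallback."""
    def __missing__(self, key):
        if isinstance(key, str):
            # a bare loss type string defaults to the 'groundshaking' peril
            return self[('groundshaking', key)]
        raise KeyError(key)

pd = PerilDict({('groundshaking', 'structural'): .23})
print(pd['structural'])  # -> 0.23
```

A misspelled loss type raises `KeyError(('groundshaking', 'structurl'))`, exactly as in the doctest.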
- class openquake.risklib.riskmodels.RiskFuncList(iterable=(), /)[source]#
- Bases: list
- A list of risk functions with attributes .id, .loss_type, .kind 
- class openquake.risklib.riskmodels.RiskModel(calcmode, taxonomy, risk_functions, **kw)[source]#
- Bases: object
- Base class. Can be used in the tests as a mock.
- Parameters:
- taxonomy – a taxonomy string 
- risk_functions – a dict peril -> (loss_type, kind) -> risk_function 
 
 - classical_bcr(peril, loss_type, assets, hazard, rng=None)[source]#
- Parameters:
- loss_type – the loss type 
- assets – a list of N assets of the same taxonomy 
- hazard – a dictionary col -> hazard curve 
- rng – dummy parameter, unused 
 
- Returns:
- a list of triples (eal_orig, eal_retro, bcr_result) 
 
 - classical_damage(peril, loss_type, assets, hazard_curve, rng=None)[source]#
- Parameters:
- loss_type – the loss type 
- assets – a list of N assets of the same taxonomy 
- hazard_curve – a dictionary col -> hazard curve 
 
- Returns:
- an array of N x D elements 
- where N is the number of assets and D the number of damage states. 
 - classical_risk(peril, loss_type, assets, hazard_curve, rng=None)[source]#
- Parameters:
- loss_type (str) – the loss type considered 
- assets – an iterator over A openquake.risklib.scientific.Asset instances 
- hazard_curve – an array of poes 
- rng – ignored, here only for API compatibility with other calculators 
 
- Returns:
- a composite array (loss, poe) of shape (A, C) 
 
 - compositemodel = None#
 - ebrisk(peril, loss_type, assets, gmf_df, rndgen)#
- Returns:
- a DataFrame with columns eid, aid, loss 
 
 - event_based_damage(peril, loss_type, assets, gmf_df, rng=None)#
- Parameters:
- loss_type – the loss type 
- assets – a list of A assets of the same taxonomy 
- gmf_df – a DataFrame of GMFs 
- rng – dummy parameter, unused 
 
- Returns:
- an array of shape (A, E, D) 
 - where A is the number of assets, E the number of events and D the number of damage states. 
 - event_based_risk(peril, loss_type, assets, gmf_df, rndgen)[source]#
- Returns:
- a DataFrame with columns eid, aid, loss 
 
 - property loss_types#
- The list of loss types in the underlying vulnerability functions, in lexicographic order 
 - scenario(peril, loss_type, assets, gmf_df, rndgen)#
- Returns:
- a DataFrame with columns eid, aid, loss 
 
 - scenario_damage(peril, loss_type, assets, gmf_df, rng=None)[source]#
- Parameters:
- loss_type – the loss type 
- assets – a list of A assets of the same taxonomy 
- gmf_df – a DataFrame of GMFs 
- rng – dummy parameter, unused 
 
- Returns:
- an array of shape (A, E, D) 
 - where A is the number of assets, E the number of events and D the number of damage states. 
 - scenario_risk(peril, loss_type, assets, gmf_df, rndgen)#
- Returns:
- a DataFrame with columns eid, aid, loss 
 
 - time_event = None#
 
- openquake.risklib.riskmodels.build_vf_node(vf)[source]#
- Convert a VulnerabilityFunction object into a Node suitable for XML conversion. 
- openquake.risklib.riskmodels.get_risk_files(inputs)[source]#
- Parameters:
- inputs – a dictionary key -> path name 
- Returns:
- a dictionary “peril/kind/cost_type” -> fname 
 
- openquake.risklib.riskmodels.get_risk_functions(oqparam)[source]#
- Parameters:
- oqparam – an OqParam instance 
- Returns:
- a list of risk functions 
 
- openquake.risklib.riskmodels.get_riskmodel(taxonomy, oqparam, risk_functions)[source]#
- Return an instance of the correct risk model class, depending on the attribute calculation_mode of the object oqparam. - Parameters:
- taxonomy – a taxonomy string 
- oqparam – an object containing the parameters needed by the RiskModel class 
- risk_functions – a dict peril -> (loss_type, kind) -> risk_function 
 
 
openquake.risklib.scientific module#
This module includes the scientific API of oq-risklib.
- class openquake.risklib.scientific.ConsequenceModel(id, assetCategory, lossCategory, description, limitStates)[source]#
- Bases: dict
- Dictionary of consequence functions. You can access each function given its name with the square bracket notation.
- Parameters:
- id (str) – ID of the model 
- assetCategory (str) – asset category (i.e. buildings, population) 
- lossCategory (str) – loss type (i.e. structural, contents, …) 
- description (str) – description of the model 
- limitStates – a list of limit state strings 
 
 - kind = 'consequence'#
 
- class openquake.risklib.scientific.CurveParams(index, loss_type, curve_resolution, ratios, user_provided)#
- Bases: - tuple- curve_resolution#
- Alias for field number 2 
 - index#
- Alias for field number 0 
 - loss_type#
- Alias for field number 1 
 - ratios#
- Alias for field number 3 
 - user_provided#
- Alias for field number 4 
 
- class openquake.risklib.scientific.FragilityFunctionContinuous(limit_state, mean, stddev, minIML, maxIML, nodamage=0)[source]#
- Bases: - object- kind = 'fragility'#
 
- class openquake.risklib.scientific.FragilityFunctionDiscrete(limit_state, imls, poes, no_damage_limit=None)[source]#
- Bases: - object- kind = 'fragility'#
 
- class openquake.risklib.scientific.FragilityFunctionList(array, **attrs)[source]#
- Bases: list
- A list of fragility functions with common attributes; there is a function for each limit state.
- build(limit_states, discretization=20, steps_per_interval=1)[source]#
- Parameters:
- limit_states – a sequence of limit states 
- discretization – continuous fragility discretization parameter 
- steps_per_interval – steps_per_interval parameter 
 
- Returns:
- a populated FragilityFunctionList instance 
 
 - kind = 'fragility'#
 
- class openquake.risklib.scientific.FragilityModel(id, assetCategory, lossCategory, description, limitStates)[source]#
- Bases: dict
- Container for a set of fragility functions. You can access each function given the IMT and taxonomy with the square bracket notation.
- Parameters:
- id (str) – ID of the model 
- assetCategory (str) – asset category (i.e. buildings, population) 
- lossCategory (str) – loss type (i.e. structural, contents, …) 
- description (str) – description of the model 
- limitStates – a list of limit state strings 
 
 
- class openquake.risklib.scientific.LossCurvesMapsBuilder(conditional_loss_poes, return_periods, loss_dt, weights, eff_time, risk_investigation_time, pla_factor=None)[source]#
- Bases: object
- Build loss curves and maps for all loss types at the same time.
- Parameters:
- conditional_loss_poes – a list of PoEs, possibly empty 
- return_periods – ordered array of return periods 
- loss_dt – composite dtype for the loss types 
- weights – weights of the realizations 
- num_events – number of events for each realization 
- eff_time – ses_per_logic_tree_path * hazard investigation time 
 
 
- class openquake.risklib.scientific.MultiEventRNG(master_seed, eids, asset_correlation=0)[source]#
- Bases: object
- An object MultiEventRNG(master_seed, eids, asset_correlation=0) has a method .get(A, eids) which returns a matrix of (A, E) normally distributed random numbers. If the asset_correlation is 1 the numbers are the same.
>>> rng = MultiEventRNG(
...     master_seed=42, eids=[0, 1, 2], asset_correlation=1)
>>> eids = numpy.array([1] * 3)
>>> means = numpy.array([.5] * 3)
>>> covs = numpy.array([.1] * 3)
>>> rng.lognormal(eids, means, covs)
array([0.38892466, 0.38892466, 0.38892466])
>>> rng.beta(eids, means, covs)
array([0.4372343 , 0.57308132, 0.56392573])
>>> fractions = numpy.array([[[.8, .1, .1]]])
>>> rng.discrete_dmg_dist([0], fractions, [10])
array([[[8, 2, 0]]], dtype=uint32)
- beta(eids, means, covs)[source]#
- Parameters:
- eids – event IDs 
- means – array of floats in the range 0..1 
- covs – array of floats with the same shape 
 
- Returns:
- array of floats following the beta distribution 
- This function works properly even when some or all of the stddevs are zero: in that case it returns the means, since the distribution becomes extremely peaked. It also works properly when some or all of the means are zero, returning zero in that case. 
 - boolean_dist(probs, num_sims)[source]#
- Convert E probabilities into an array of (E, S) booleans, being S the number of secondary simulations.
>>> rng = MultiEventRNG(master_seed=42, eids=[0, 1, 2])
>>> dist = rng.boolean_dist(probs=[.1, .2, 0.], num_sims=100)
>>> dist.sum(axis=1)  # around 10% and 20% respectively
array([12., 17.,  0.])
 
- class openquake.risklib.scientific.RiskComputer(crm, taxidx, country_str='?')[source]#
- Bases: dict
- A callable dictionary of risk models able to compute average losses according to the taxonomy mapping. It also computes secondary losses after the average (this is a hugely simplifying approximation).
- Parameters:
- crm – a CompositeRiskModel 
- taxidx – a taxonomy index 
 
 - get_dd5(adf, gmf_df, rng=None, C=0, crm=None)[source]#
- Parameters:
- adf – DataFrame of assets on the given site with the same taxonomy 
- gmf_df – GMFs on the given site for E events 
- rng – MultiEvent random generator or None 
- C – Number of consequences 
 
- Returns:
- damage distribution of shape (P, A, E, L, D+C) 
 
 - output(asset_df, haz, sec_losses=(), rndgen=None)[source]#
- Compute averages by using the taxonomy mapping - Parameters:
- asset_df – assets on the same site with the same taxonomy 
- haz – a DataFrame of GMFs or an array of PoEs 
- sec_losses – a list of functions updating the loss dict 
- rndgen – None or MultiEventRNG instance 
 
- Yields:
- dictionaries {loss_type: loss_output} 
 
 
- class openquake.risklib.scientific.Sampler(distname, rng, lratios=(), cols=None)[source]#
- Bases: - object
- class openquake.risklib.scientific.VulnerabilityFunction(vf_id, imt, imls, mean_loss_ratios, covs=None, distribution='LN')[source]#
- Bases: object
- dtype = dtype([('iml', '<f8'), ('loss_ratio', '<f8'), ('cov', '<f8')])#
 - interpolate(gmf_df, col)[source]#
- Parameters:
- gmf_df – DataFrame of GMFs 
- col – name of the column to consider 
- Returns:
- DataFrame of interpolated loss ratios and covs 
 
 - kind = 'vulnerability'#
 - mean_imls()[source]#
- Compute the mean IMLs (Intensity Measure Levels) of the vulnerability function. 
 
 - mean_loss_ratios_with_steps(steps)[source]#
- Split the mean loss ratios, producing a new set of loss ratios. The new set of loss ratios always includes 0.0 and 1.0 - Parameters:
- steps (int) – the number of steps we make to go from one loss ratio to the next. For example, if we have [0.5, 0.7]:
 - steps = 1 produces [0.0, 0.5, 0.7, 1] 
 - steps = 2 produces [0.0, 0.25, 0.5, 0.6, 0.7, 0.85, 1] 
 - steps = 3 produces [0.0, 0.17, 0.33, 0.5, 0.57, 0.63, 0.7, 0.8, 0.9, 1] 
 
 - seed = None#
 
- class openquake.risklib.scientific.VulnerabilityFunctionWithPMF(vf_id, imt, imls, loss_ratios, probs)[source]#
- Bases: VulnerabilityFunction
- Vulnerability function with an explicit distribution of probabilities
- Parameters:
- vf_id (str) – vulnerability function ID 
- imt (str) – Intensity Measure Type 
- imls – intensity measure levels (L) 
- loss_ratios – an array of mean loss ratios (M) 
- probs – a matrix of probabilities of shape (M, L) 
 
 - interpolate(gmf_df, col)[source]#
- Parameters:
- gmf_df – DataFrame of GMFs 
- col – name of the column to consider 
 
- Returns:
- DataFrame of interpolated probabilities 
 
 - loss_ratio_exceedance_matrix(loss_ratios)[source]#
- Compute the LREM (Loss Ratio Exceedance Matrix). Required for the Classical Risk and BCR Calculators. Currently left unimplemented as the PMF format is used only for the Scenario and Event Based Risk Calculators. - Parameters:
- loss_ratios – a list of loss ratios 
 
 
- class openquake.risklib.scientific.VulnerabilityModel(id=None, assetCategory=None, lossCategory=None)[source]#
- Bases: dict
- Container for a set of vulnerability functions. You can access each function given the IMT and taxonomy with the square bracket notation.
- Parameters:
- id (str) – ID of the model 
- assetCategory (str) – asset category (i.e. buildings, population) 
- lossCategory (str) – loss type (i.e. structural, contents, …) 
 
- All such attributes are None for a vulnerability model coming from an NRML 0.4 file. 
- openquake.risklib.scientific.annual_frequency_of_exceedence(poe, t_haz)[source]#
- Parameters:
- poe – array of probabilities of exceedance in time t_haz 
- t_haz – hazard investigation time 
 
- Returns:
- array of frequencies (with +inf values where poe=1) 
 
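The docstring (finite frequencies, +inf exactly where poe=1) is consistent with the standard Poissonian relation poe = 1 - exp(-afoe * t_haz). A minimal sketch of that conversion (an illustration, not necessarily the engine's exact code):

```python
import numpy

def annual_frequency_of_exceedence(poe, t_haz):
    # Poissonian relation: poe = 1 - exp(-afoe * t_haz)
    # => afoe = -log(1 - poe) / t_haz, which is +inf where poe == 1
    with numpy.errstate(divide='ignore'):  # log(0) -> -inf without warnings
        return -numpy.log(1. - numpy.asarray(poe)) / t_haz

print(annual_frequency_of_exceedence([0., .5, 1.], t_haz=50.))
```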
- openquake.risklib.scientific.average_loss(lc)[source]#
- Given a loss curve array with poe and loss fields, computes the average loss on a period of time. - Note:
- Since the loss curve is piecewise linear (it is the result of a linear interpolation), we compute an exact integral by using the trapezoidal rule with the width given by the loss bin width. 
 
- openquake.risklib.scientific.bcr(eal_original, eal_retrofitted, interest_rate, asset_life_expectancy, asset_value, retrofitting_cost)[source]#
- Compute the Benefit-Cost Ratio.
- BCR = (EALo - EALr)(1-exp(-r*t))/(r*C)
- Where:
- BCR – Benefit cost ratio 
- EALo – Expected annual loss for original asset 
- EALr – Expected annual loss for retrofitted asset 
- r – Interest rate 
- t – Life expectancy of the asset 
- C – Retrofitting cost 
 
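Given the formula above, a worked sketch (assuming, as the extra asset_value parameter in the signature suggests, that the EALs are loss ratios to be scaled by the asset value; the engine's exact conventions may differ):

```python
import math

def bcr(eal_original, eal_retrofitted, interest_rate,
        asset_life_expectancy, asset_value, retrofitting_cost):
    # BCR = (EALo - EALr) * (1 - exp(-r*t)) / (r*C), with the EALs given
    # as loss ratios and scaled by the asset value (assumption)
    return ((eal_original - eal_retrofitted) * asset_value
            * (1 - math.exp(-interest_rate * asset_life_expectancy))
            / (interest_rate * retrofitting_cost))

# retrofitting halves a 10% expected annual loss ratio on a 1000-unit
# asset, at a 5% interest rate over 40 years, for a cost of 500 units
print(bcr(.10, .05, .05, 40, 1000, 500))  # -> about 1.73
```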
- openquake.risklib.scientific.broadcast(func, composite_array, *args)[source]#
- Broadcast an array function over a composite array 
- openquake.risklib.scientific.build_imls(ff, continuous_fragility_discretization, steps_per_interval=0)[source]#
- Build intensity measure levels from a fragility function. If the function is continuous, they are produced simply as a linear space between minIML and maxIML. If the function is discrete, they are generated with a complex logic depending on the noDamageLimit and the parameter steps per interval. - Parameters:
- ff – a fragility function object 
- continuous_fragility_discretization – .ini file parameter 
- steps_per_interval – .ini file parameter 
 
- Returns:
- generated imls 
 
- openquake.risklib.scientific.build_loss_curve_dt(curve_resolution, insurance_losses=False)[source]#
- Parameters:
- curve_resolution – dictionary loss_type -> curve_resolution 
- insurance_losses – configuration parameter 
 
- Returns:
- loss_curve_dt 
 
- openquake.risklib.scientific.classical(vulnerability_function, hazard_imls, hazard_poes, loss_ratios, investigation_time, risk_investigation_time)[source]#
- Parameters:
- vulnerability_function – an instance of - openquake.risklib.scientific.VulnerabilityFunctionrepresenting the vulnerability function used to compute the curve.
- hazard_imls – the hazard intensity measure type and levels 
- hazard_poes – the PoEs of the hazard curve 
- loss_ratios – a tuple of C loss ratios 
- investigation_time – hazard investigation time 
- risk_investigation_time – risk investigation time 
 
- Returns:
- an array of shape (2, C) 
 
- openquake.risklib.scientific.classical_damage(fragility_functions, hazard_imls, hazard_poes, investigation_time, risk_investigation_time, steps_per_interval=1)[source]#
- Parameters:
- fragility_functions – a list of fragility functions for each damage state 
- hazard_imls – Intensity Measure Levels 
- hazard_poes – hazard curve 
- investigation_time – hazard investigation time 
- risk_investigation_time – risk investigation time 
- steps_per_interval – steps per interval 
 
- Returns:
- an array of D probabilities of occurrence, where D is the number of damage states. 
 
- openquake.risklib.scientific.compose_dds(dmg_dists)[source]#
- Compose an array of N damage distributions:
>>> compose_dds([[.6, .2, .1, .1], [.5, .3, .1, .1]])
array([0.3 , 0.34, 0.17, 0.19])
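The doctest is consistent with composing the distributions in PoE space: convert each distribution to exceedance probabilities, combine them as independent events, and convert back. A sketch under that assumption (not necessarily the engine's actual implementation):

```python
import numpy

def compose_dds(dmg_dists):
    dds = numpy.array(dmg_dists)
    # exceedance probabilities: reversed cumulative sum per distribution
    poes = dds[:, ::-1].cumsum(axis=1)[:, ::-1]
    # combine the distributions as independent events
    composed = 1. - numpy.prod(1. - poes, axis=0)
    # back to a distribution via successive differences
    return numpy.diff(composed[::-1], prepend=0.)[::-1]

print(compose_dds([[.6, .2, .1, .1], [.5, .3, .1, .1]]))
```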
- openquake.risklib.scientific.conditional_loss_ratio(loss_ratios, poes, probability)[source]#
- Return the loss ratio corresponding to the given PoE (Probability of Exceedance). We can have four cases: - If probability is in poes, it takes the larger of the corresponding loss ratios. 
- If it is in (poe1, poe2), where both poe1 and poe2 are in poes, then we perform a linear interpolation on the corresponding losses. 
- If the given probability is smaller than the lowest PoE defined, it returns the maximum loss ratio. 
- If the given probability is greater than the highest PoE defined, it returns zero. 
 - Parameters:
- loss_ratios – non-decreasing loss ratio values (float32) 
- poes – non-increasing probabilities of exceedance values (float32) 
- probability (float) – the probability value used to interpolate the loss curve 
 
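The four cases can be sketched with numpy.interp, reversing the arrays since poes is non-increasing (an illustration; the real implementation, in particular its handling of tied PoEs, may differ):

```python
import numpy

def conditional_loss_ratio(loss_ratios, poes, probability):
    loss_ratios = numpy.asarray(loss_ratios)  # non-decreasing
    poes = numpy.asarray(poes)                # non-increasing
    if probability > poes[0]:   # above the highest PoE defined
        return 0.0
    if probability < poes[-1]:  # below the lowest PoE defined
        return float(loss_ratios[-1])
    # numpy.interp wants increasing x, hence the reversed arrays
    return float(numpy.interp(probability, poes[::-1], loss_ratios[::-1]))

print(conditional_loss_ratio([0., .1, .2, .4], [.9, .5, .2, .1], .35))
```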
 
- openquake.risklib.scientific.consequence(consequence, assets, coeff, loss_type, time_event)[source]#
- Parameters:
- consequence – kind of consequence 
- assets – asset array (shape A) 
- coeff – composite array of coefficients of shape (A, E) 
- loss_type – the loss type string 
- time_event – time event string 
 
- Returns:
- array of shape (A, E) 
 
- openquake.risklib.scientific.dds_to_poes(dmg_dists)[source]#
- Convert an array of damage distributions into an array of PoEs
>>> dds_to_poes([[.7, .2, .1], [0., 0., 1.0]])
array([[1. , 0.3, 0.1],
       [1. , 1. , 1. ]])
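The doctest corresponds to a reversed cumulative sum over the damage-state axis, since the PoE of a damage state is the probability of reaching it or any worse state. A sketch:

```python
import numpy

def dds_to_poes(dmg_dists):
    # the PoE of damage state i is the sum of the fractions of the
    # states >= i, i.e. a reversed cumulative sum along the last axis
    dds = numpy.asarray(dmg_dists)
    return dds[..., ::-1].cumsum(axis=-1)[..., ::-1]

print(dds_to_poes([[.7, .2, .1], [0., 0., 1.0]]))
```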
- openquake.risklib.scientific.fine_graining(points, steps)[source]#
- Parameters:
- points – a list of floats 
- steps (int) – expansion steps (>= 2) 
 
>>> fine_graining([0, 1], steps=0)
[0, 1]
>>> fine_graining([0, 1], steps=1)
[0, 1]
>>> fine_graining([0, 1], steps=2)
array([0. , 0.5, 1. ])
>>> fine_graining([0, 1], steps=3)
array([0.        , 0.33333333, 0.66666667, 1.        ])
>>> fine_graining([0, 0.5, 0.7, 1], steps=2)
array([0.  , 0.25, 0.5 , 0.6 , 0.7 , 0.85, 1.  ])
- N points become S * (N - 1) + 1 points with S > 0 
- openquake.risklib.scientific.fix_losses(orig_losses, num_events, eff_time=0, sorting=True)[source]#
- Possibly add zeros and sort the passed losses. - Parameters:
- orig_losses – an array of size num_losses 
- num_events – an integer >= num_losses 
 
- Returns:
- three arrays of size num_events 
 
- openquake.risklib.scientific.get_agg_value(consequence, agg_values, agg_id, xltype, time_event)[source]#
- Returns:
- sum of the values corresponding to agg_id for the given consequence 
 
- openquake.risklib.scientific.insurance_loss_curve(curve, deductible, insurance_limit)[source]#
- Compute an insured loss ratio curve given a loss ratio curve - Parameters:
- curve – an array 2 x R (where R is the curve resolution) 
- deductible (float) – the deductible limit in fraction form 
- insurance_limit (float) – the insured limit in fraction form 
 
>>> losses = numpy.array([3, 20, 101])
>>> poes = numpy.array([0.9, 0.5, 0.1])
>>> insurance_loss_curve(numpy.array([losses, poes]), 5, 100)
array([[ 3.        , 20.        ],
       [ 0.85294118,  0.5       ]])
- openquake.risklib.scientific.insurance_losses(asset_df, losses_by_lt, policy_df)[source]#
- Parameters:
- asset_df – DataFrame of assets 
- losses_by_lt – loss_type -> DataFrame[eid, aid, variance, loss] 
- policy_df – a DataFrame of policies 
 
 
- openquake.risklib.scientific.insured_losses(losses, deductible, insurance_limit)[source]#
- Parameters:
- losses – array of ground-up losses 
- deductible – array of deductible values 
- insurance_limit – array of insurance limit values 
 
- Compute insured losses for the given asset and losses, from the point of view of the insurance company. For instance:
>>> insured_losses(numpy.array([3, 20, 101]),
...                numpy.array([5, 5, 5]), numpy.array([100, 100, 100]))
array([ 0, 15, 95])
- if the loss is 3 (< 5) the company does not pay anything 
- if the loss is 20 the company pays 20 - 5 = 15 
- if the loss is 101 the company pays 100 - 5 = 95 
 
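The three cases above reduce to clipping the losses between deductible and limit and subtracting the deductible. A sketch consistent with the doctest (an illustration, not necessarily the engine's exact code):

```python
import numpy

def insured_losses(losses, deductible, insurance_limit):
    # nothing below the deductible, losses - deductible in between,
    # insurance_limit - deductible above the limit
    return numpy.clip(losses, deductible, insurance_limit) - deductible

print(insured_losses(numpy.array([3, 20, 101]),
                     numpy.array([5, 5, 5]),
                     numpy.array([100, 100, 100])))  # -> [ 0 15 95]
```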
- openquake.risklib.scientific.loss_maps(curves, conditional_loss_poes)[source]#
- Parameters:
- curves – an array of loss curves 
- conditional_loss_poes – a list of conditional loss poes 
 
- Returns:
- a composite array of loss maps with the same shape 
 
- openquake.risklib.scientific.losses_by_period(losses, return_periods, num_events, eff_time=None, sorting=True, name='curve', pla_factor=None)[source]#
- Parameters:
- losses – simulated losses as an array, list or DataFrame column 
- return_periods – return periods of interest 
- num_events – the number of events (>= number of losses) 
- eff_time – investigation_time * ses_per_logic_tree_path 
 
- Returns:
- a dictionary with the interpolated losses for the return periods, possibly with NaNs and possibly also a post-loss-amplified curve 
- NB: the return periods must be ordered integers >= 1. The interpolated losses are defined inside the interval min_time < time < eff_time, where min_time = eff_time / num_events. On the right of the interval they have NaN values; on the left, zero values. If num_events is not passed, it is inferred from the number of losses; if eff_time is not passed, it is inferred from the longest return period. Here is an example:
>>> losses = [3, 2, 3.5, 4, 3, 23, 11, 2, 1, 4, 5, 7, 8, 9, 13]
>>> losses_by_period(losses, [1, 2, 5, 10, 20, 50, 100], 20)
{'curve': array([ 0. ,  0. ,  0. ,  3.5,  8. , 13. , 23. ])}
- openquake.risklib.scientific.maximum_probable_loss(losses, return_period, eff_time, sorting=True)[source]#
- Returns:
- Maximum Probable Loss at the given return period 
>>> losses = [1000., 0., 2000., 1500., 780., 900., 1700., 0., 100., 200.]
>>> float(maximum_probable_loss(losses, 2000, 10_000))
900.0
- openquake.risklib.scientific.mean_std(fractions)[source]#
- Given an N x M matrix, returns mean and std computed on the rows, i.e. two M-dimensional vectors. 
- openquake.risklib.scientific.normalize_curves_eb(curves)[source]#
- A more sophisticated version of normalize_curves, used in the event based calculator. - Parameters:
- curves – a list of pairs (losses, poes) 
- Returns:
- first losses, all_poes 
 
- openquake.risklib.scientific.pairwise(iterable)[source]#
- Parameters:
- iterable – a sequence of N values (s0, s1, …) 
- Returns:
- N-1 pairs (s0, s1), (s1, s2), (s2, s3), … 
>>> list(pairwise('ABC'))
[('A', 'B'), ('B', 'C')]
- openquake.risklib.scientific.pairwise_diff(values, addlast=False)[source]#
- Differences between a value and the next value in a sequence. If addlast is set the last value is added to the difference, i.e. N values are returned instead of N-1. 
- openquake.risklib.scientific.pairwise_mean(values)[source]#
- Averages between a value and the next value in a sequence 
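Both helpers can be sketched in a few lines of numpy (an illustration; the addlast semantics follow the docstring above, appending the last value to the differences):

```python
import numpy

def pairwise_mean(values):
    # average of each value with the next one: N values -> N-1 means
    values = numpy.asarray(values)
    return (values[:-1] + values[1:]) / 2.

def pairwise_diff(values, addlast=False):
    # difference of each value with the next one: N values -> N-1 diffs;
    # with addlast=True the last value is appended, giving N values
    values = numpy.asarray(values)
    diffs = values[:-1] - values[1:]
    if addlast:
        diffs = numpy.concatenate([diffs, values[-1:]])
    return diffs

print(pairwise_mean([0., 1., 2.]))  # -> [0.5 1.5]
print(pairwise_diff([1., .7, .36, .19], addlast=True))
```

Applied to a non-increasing PoE curve, pairwise_diff recovers the probabilities of occurrence, which is how such helpers are typically used in loss-curve computations.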
- openquake.risklib.scientific.pla_factor(df)[source]#
- Post-Loss-Amplification factor interpolator. To be instantiated with a DataFrame with columns return_period and pla_factor. 
- openquake.risklib.scientific.probability_of_exceedance(afoe, t_risk)[source]#
- Parameters:
- afoe – array of annual frequencies of exceedance 
- t_risk – risk investigation time 
 
- Returns:
- array of probabilities of exceedance in time t_risk 
 
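This is the inverse of annual_frequency_of_exceedence under the usual Poissonian assumption, poe = 1 - exp(-afoe * t_risk). A sketch (an illustration, not necessarily the engine's exact code):

```python
import numpy

def probability_of_exceedance(afoe, t_risk):
    # inverse Poissonian relation: poe = 1 - exp(-afoe * t_risk);
    # expm1 keeps precision for small frequencies
    return -numpy.expm1(-numpy.asarray(afoe) * t_risk)

# round trip with the annual frequency corresponding to poe=0.5 in 50 years
afoe = -numpy.log(1. - .5) / 50.
print(probability_of_exceedance(afoe, 50.))  # -> 0.5 (up to rounding)
```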
- openquake.risklib.scientific.return_periods(eff_time, num_losses)[source]#
- Parameters:
- eff_time – ses_per_logic_tree_path * investigation_time 
- num_losses – used to determine the minimum period 
 
- Returns:
- an array of periods of dtype uint32 
- Here are a few examples:
>>> return_periods(1, 1)
Traceback (most recent call last):
...
ValueError: eff_time too small: 1
>>> return_periods(2, 2)
array([1, 2], dtype=uint32)
>>> return_periods(2, 10)
array([1, 2], dtype=uint32)
>>> return_periods(100, 2)
array([ 50, 100], dtype=uint32)
>>> return_periods(1000, 1000)
array([   1,    2,    5,   10,   20,   50,  100,  200,  500, 1000],
      dtype=uint32)
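The doctests are consistent with picking "round" periods of the form {1, 2, 5} * 10^k inside the interval [eff_time / num_losses, eff_time]. A sketch under that assumption (the error threshold and the handling of the upper endpoint are guesses, not the engine's actual code):

```python
import numpy

def return_periods(eff_time, num_losses):
    if eff_time < 2:  # assumed threshold, matching the doctest error
        raise ValueError('eff_time too small: %s' % eff_time)
    min_time = eff_time / num_losses
    # "round" candidate periods 1, 2, 5, 10, 20, 50, ...
    candidates = [m * 10 ** e for e in range(10) for m in (1, 2, 5)]
    periods = {p for p in candidates if min_time <= p <= eff_time}
    return numpy.array(sorted(periods | {int(eff_time)}), numpy.uint32)

print(return_periods(100, 2))  # -> [ 50 100]
```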
Module contents#
Checks on the risk inputs#
In the regular workflow the engine processes the inputs in the following order:
- the risk functions are read 
- the site collection is read 
- the exposure is read and associated to the sites 
- the taxonomy mapping is read, the taxonomies are checked against the exposure taxonomies and the risk IDs are checked against the risk functions 
- the consequence functions are checked in set_tmap 
The perils are checked and you can get errors like
ValueError: Invalid key in job_r.ini: groundshking_fragility_file
The keys are checked and you can get errors like
ValueError: Unknown key groundshaking_fraglity_file in job_r.ini
The consequences are checked and for an unknown peril you will get an error
InvalidFile: consequences.csv: unknown peril='groundsaking' at line=1
The consequences are checked and for an unknown taxonomy you will get a warning:
In consequences.csv there are taxonomies missing in the exposure: {'Concrete'}