openquake.calculators.export package

Submodules

openquake.calculators.export.hazard module

class openquake.calculators.export.hazard.DisaggMatrix(poe, iml, dim_labels, matrix)

Bases: tuple

dim_labels

Alias for field number 2

iml

Alias for field number 1

matrix

Alias for field number 3

poe

Alias for field number 0

class openquake.calculators.export.hazard.GmfCollection(sitecol, imts, ruptures, investigation_time)[source]

Bases: object

Object converting the parameters

Parameters:
  • sitecol – SiteCollection
  • imts – a list of intensity measure types
  • ruptures – a list of ruptures
  • investigation_time – investigation time

into an object with the right form for the EventBasedGMFXMLWriter. Iterating over a GmfCollection yields GmfSet objects.

class openquake.calculators.export.hazard.GmfSet(gmfset, investigation_time, ses_idx)[source]

Bases: object

Small wrapper around the list of Gmf objects associated with the given SES.

class openquake.calculators.export.hazard.GroundMotionField(imt, sa_period, sa_damping, rupture_id, gmf_nodes)[source]

Bases: object

The Ground Motion Field generated by the given rupture

class openquake.calculators.export.hazard.GroundMotionFieldNode(gmv, loc)[source]

Bases: object

class openquake.calculators.export.hazard.HazardCurve(location, poes)

Bases: tuple

location

Alias for field number 0

poes

Alias for field number 1
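
The classes above with "Bases: tuple" are namedtuples: the "Alias for field number N" notes are simply the tuple indices of the fields. As a sketch, using HazardCurve as an example (an equivalent definition, not the actual import):

```python
from collections import namedtuple

# Equivalent definition of HazardCurve: a namedtuple whose fields can be
# accessed both by name and by tuple index (the "field number").
HazardCurve = namedtuple('HazardCurve', ['location', 'poes'])

hc = HazardCurve(location=(10.0, 45.0), poes=[0.1, 0.02])
# hc[0] is hc.location (field number 0); hc[1] is hc.poes (field number 1)
```

The same pattern applies to HazardMap, UHS, Rup and the risk-side tuples below.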

class openquake.calculators.export.hazard.HazardMap(lon, lat, iml)

Bases: tuple

iml

Alias for field number 2

lat

Alias for field number 1

lon

Alias for field number 0

class openquake.calculators.export.hazard.Location(xyz)[source]

Bases: object

class openquake.calculators.export.hazard.Rup(eid, ses_idx, indices, gmfa)

Bases: tuple

eid

Alias for field number 0

gmfa

Alias for field number 3

indices

Alias for field number 2

ses_idx

Alias for field number 1

class openquake.calculators.export.hazard.UHS(imls, location)

Bases: tuple

imls

Alias for field number 0

location

Alias for field number 1

openquake.calculators.export.hazard.add_imt(fname, imt)[source]
>>> add_imt('/path/to/hcurve_23.csv', 'SA(0.1)')
'/path/to/hcurve-SA(0.1)_23.csv'
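
A minimal reimplementation consistent with the doctest above (a sketch; `add_imt_sketch` is a hypothetical name, not the actual implementation) inserts the IMT before the trailing `_<number>` suffix of the file name:

```python
import os
import re

def add_imt_sketch(fname, imt):
    """Insert the IMT before the trailing _<number> suffix:
    '/path/to/hcurve_23.csv' -> '/path/to/hcurve-SA(0.1)_23.csv'
    """
    head, tail = os.path.split(fname)
    newtail = re.sub(r'_(\d+\.)', '-%s_\\1' % imt, tail)
    return os.path.join(head, newtail)
```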
openquake.calculators.export.hazard.build_hcurves(getter, imtls, monitor)[source]
openquake.calculators.export.hazard.convert_to_array(pmap, sitemesh, imtls)[source]

Convert the probability map into a composite array with a header of the form PGA-0.1, PGA-0.2, ...

Parameters:
  • pmap – probability map
  • sitemesh – mesh of N sites
  • imtls – a DictArray with IMT and levels
Returns:

a composite array of length N
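
As an illustration of the output layout (a sketch: the exact field set, including the lon/lat coordinate fields, is an assumption), the composite array for `imtls = {'PGA': [0.1, 0.2]}` would carry one field per (IMT, level) pair:

```python
import numpy

imtls = {'PGA': [0.1, 0.2]}  # IMT -> levels, as in a DictArray
fields = [('lon', float), ('lat', float)] + [
    ('%s-%s' % (imt, iml), float)
    for imt, imls in imtls.items() for iml in imls]
arr = numpy.zeros(3, numpy.dtype(fields))  # composite array for N=3 sites
# arr.dtype.names -> ('lon', 'lat', 'PGA-0.1', 'PGA-0.2')
```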

openquake.calculators.export.hazard.export_disagg_csv(ekey, dstore)[source]
openquake.calculators.export.hazard.export_disagg_xml(ekey, dstore)[source]
openquake.calculators.export.hazard.export_fullreport(ekey, dstore)[source]
openquake.calculators.export.hazard.export_gmf(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.hazard.export_gmf_data_csv(ekey, dstore)[source]
openquake.calculators.export.hazard.export_gmf_scenario_npz(ekey, dstore)[source]
openquake.calculators.export.hazard.export_gmf_xml(key, dest, sitecol, imts, ruptures, rlz, investigation_time)[source]
Parameters:
  • key – output_type and export_type
  • dest – name of the exported file
  • sitecol – the full site collection
  • imts – the list of intensity measure types
  • ruptures – an ordered list of ruptures
  • rlz – a realization object
  • investigation_time – investigation time (None for scenario)
openquake.calculators.export.hazard.export_hazard_csv(key, dest, sitemesh, pmap, imtls, comment)[source]

Export the curves of the given realization into CSV.

Parameters:
  • key – output_type and export_type
  • dest – name of the exported file
  • sitemesh – site collection
  • pmap – a ProbabilityMap
  • imtls (dict) – intensity measure types and levels
  • comment – comment to use as header of the exported CSV file
openquake.calculators.export.hazard.export_hcurves_by_imt_csv(key, kind, rlzs_assoc, fname, sitecol, pmap, oq)[source]

Export the curves of the given realization into CSV.

Parameters:
  • key – output_type and export_type
  • kind – a string with the kind of output (realization or statistics)
  • rlzs_assoc – a openquake.commonlib.source.RlzsAssoc instance
  • fname – name of the exported file
  • sitecol – site collection
  • pmap – a probability map
  • oq – job.ini parameters
openquake.calculators.export.hazard.export_hcurves_csv(ekey, dstore)[source]

Exports the hazard curves into several .csv files

Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.hazard.export_hcurves_npz(ekey, dstore)[source]
openquake.calculators.export.hazard.export_hcurves_rlzs(ekey, dstore)[source]

Export all hazard curves in a single .hdf5 file. This is not recommended, even though this exporter is parallel and very efficient: I was able to export 6 GB of curves per minute. However, for large calculations it then becomes impossible to view the .hdf5 file with the hdfviewer, because you will run out of memory. Also, compression is not enabled, since otherwise all the time would be spent in the compression phase on the controller node, with the workers doing nothing. The recommended way to postprocess large computations is to instantiate the PmapGetter and work on one block of sites at a time, discarding what is not needed. The exporter here is meant for small/medium calculations and as an example of what you should implement yourself if you need to postprocess the hazard curves.
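
The blockwise approach recommended above can be sketched generically (the PmapGetter API itself is not documented here, so only the site-block iteration is shown):

```python
def blocks(site_ids, block_size):
    # Yield consecutive blocks of site IDs, to be processed one at a
    # time instead of loading all curves into memory at once.
    for start in range(0, len(site_ids), block_size):
        yield site_ids[start:start + block_size]

# e.g. 10 sites in blocks of 4 -> [0..3], [4..7], [8, 9]
site_blocks = list(blocks(list(range(10)), 4))
```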

openquake.calculators.export.hazard.export_hcurves_xml_json(ekey, dstore)[source]
openquake.calculators.export.hazard.export_hmaps_npz(ekey, dstore)[source]
openquake.calculators.export.hazard.export_hmaps_xml_json(ekey, dstore)[source]
openquake.calculators.export.hazard.export_realizations(ekey, dstore)[source]
openquake.calculators.export.hazard.export_ruptures_xml(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.hazard.export_ses_csv(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.hazard.export_sourcegroups(ekey, dstore)[source]
openquake.calculators.export.hazard.export_uhs_npz(ekey, dstore)[source]
openquake.calculators.export.hazard.export_uhs_xml(ekey, dstore)[source]
openquake.calculators.export.hazard.get_mesh(sitecol, complete=True)[source]
openquake.calculators.export.hazard.get_metadata(realizations, kind)[source]
Parameters:
  • realizations (list) – realization objects
  • kind (str) – kind of data, i.e. a key in the datastore
Returns:

a dictionary with smlt_path, gsimlt_path, statistics, quantile_value

openquake.calculators.export.hazard.hazard_curve_name(dstore, ekey, kind, rlzs_assoc)[source]
Parameters:
  • dstore – the datastore (from which the calculation ID is taken)
  • ekey – the export key
  • kind – the kind of key
  • rlzs_assoc – a RlzsAssoc instance
openquake.calculators.export.hazard.save_disagg_to_csv(metadata, matrices)[source]

Save disaggregation matrices to multiple .csv files.

openquake.calculators.export.risk module

class openquake.calculators.export.risk.AggCurve(losses, poes, average_loss, stddev_loss)

Bases: tuple

average_loss

Alias for field number 2

losses

Alias for field number 0

poes

Alias for field number 1

stddev_loss

Alias for field number 3

class openquake.calculators.export.risk.BcrData(location, asset_ref, average_annual_loss_original, average_annual_loss_retrofitted, bcr)

Bases: tuple

asset_ref

Alias for field number 1

average_annual_loss_original

Alias for field number 2

average_annual_loss_retrofitted

Alias for field number 3

bcr

Alias for field number 4

location

Alias for field number 0

class openquake.calculators.export.risk.Location(x, y)[source]

Bases: object

class openquake.calculators.export.risk.LossCurve(location, asset_ref, poes, losses, loss_ratios, average_loss, stddev_loss)

Bases: tuple

asset_ref

Alias for field number 1

average_loss

Alias for field number 5

location

Alias for field number 0

loss_ratios

Alias for field number 4

losses

Alias for field number 3

poes

Alias for field number 2

stddev_loss

Alias for field number 6

class openquake.calculators.export.risk.LossMap(location, asset_ref, value, std_dev)

Bases: tuple

asset_ref

Alias for field number 1

location

Alias for field number 0

std_dev

Alias for field number 3

value

Alias for field number 2

class openquake.calculators.export.risk.Output(ltype, path, array)

Bases: tuple

array

Alias for field number 2

ltype

Alias for field number 0

path

Alias for field number 1

openquake.calculators.export.risk.add_quotes(values)[source]
openquake.calculators.export.risk.build_damage_array(data, damage_dt)[source]
Parameters:
  • data – an array of length N with fields ‘mean’ and ‘stddev’
  • damage_dt – a damage composite data type loss_type -> states
Returns:

a composite array of length N and dtype damage_dt

openquake.calculators.export.risk.build_damage_dt(dstore)[source]
Parameters:dstore – a datastore instance
Returns:a composite dtype loss_type -> (mean_ds1, stdv_ds1, ...)
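
As an illustration of such a composite dtype (a sketch: the mean_<state>/stdv_<state> field naming and the example damage states are assumptions), for a single loss type 'structural' and two damage states:

```python
import numpy

states = ['no_damage', 'collapse']  # hypothetical damage states
inner = numpy.dtype([('%s_%s' % (kind, ds), float)
                     for ds in states for kind in ('mean', 'stdv')])
damage_dt = numpy.dtype([('structural', inner)])
# damage_dt['structural'].names ->
# ('mean_no_damage', 'stdv_no_damage', 'mean_collapse', 'stdv_collapse')
```
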
openquake.calculators.export.risk.copy_to(elt, rup_data, rup_ids)[source]

Copy information from the ruptures into the elt array for the given rupture serials.

openquake.calculators.export.risk.export_agg_curve_rlzs(ekey, dstore)[source]
openquake.calculators.export.risk.export_agg_losses(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.risk.export_agg_losses_ebr(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.risk.export_agglosses(ekey, dstore)[source]
openquake.calculators.export.risk.export_all_losses_npz(ekey, dstore)[source]
openquake.calculators.export.risk.export_asset_loss_table(ekey, dstore)[source]

Export in parallel the asset loss table from the datastore.

NB1: for large calculations this may run out of memory. NB2: due to a heisenbug in the parallel reading of .hdf5 files, this works reliably only if the datastore has been created by a different process.

The recommendation is: do not use this exporter; rather, study its source code and write what you need. Every postprocessing task is different.

openquake.calculators.export.risk.export_avg_losses(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.risk.export_bcr_map(ekey, dstore)[source]
openquake.calculators.export.risk.export_bcr_map_rlzs(ekey, dstore)[source]
openquake.calculators.export.risk.export_csq_by_taxon_csv(ekey, dstore)[source]
openquake.calculators.export.risk.export_damage(ekey, dstore)[source]
openquake.calculators.export.risk.export_damage_taxon(ekey, dstore)[source]
openquake.calculators.export.risk.export_damage_total(ekey, dstore)[source]
openquake.calculators.export.risk.export_dmg_by_asset_csv(ekey, dstore)[source]
openquake.calculators.export.risk.export_dmg_by_taxon_csv(ekey, dstore)[source]
openquake.calculators.export.risk.export_dmg_totalcsv(ekey, dstore)[source]
openquake.calculators.export.risk.export_dmg_xml(key, dstore, damage_states, dmg_data, lt, rlz)[source]

Export damage outputs in XML format.

Parameters:
  • key – dmg_dist_per_asset|dmg_dist_per_taxonomy|dmg_dist_total|collapse_map
  • dstore – the datastore
  • damage_states – the list of damage states
  • dmg_data – a list [(loss_type, unit, asset_ref, mean, stddev), ...]
  • lt – loss type string
  • rlz – a realization object
openquake.calculators.export.risk.export_loss_curves(ekey, dstore)[source]
openquake.calculators.export.risk.export_loss_curves_rlzs(ekey, dstore)[source]
openquake.calculators.export.risk.export_loss_curves_stats(ekey, dstore)[source]
openquake.calculators.export.risk.export_loss_maps_csv(ekey, dstore)[source]
openquake.calculators.export.risk.export_loss_maps_npz(ekey, dstore)[source]
openquake.calculators.export.risk.export_loss_maps_rlzs_xml_geojson(ekey, dstore)[source]
openquake.calculators.export.risk.export_loss_maps_stats_xml_geojson(ekey, dstore)[source]
openquake.calculators.export.risk.export_losses_by_asset(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.risk.export_losses_by_asset_npz(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.risk.export_losses_by_event(ekey, dstore)[source]
Parameters:
  • ekey – export key, i.e. a pair (datastore key, fmt)
  • dstore – datastore object
openquake.calculators.export.risk.export_losses_by_taxon_csv(ekey, dstore)[source]
openquake.calculators.export.risk.export_losses_total_csv(ekey, dstore)[source]
openquake.calculators.export.risk.export_rlzs_by_asset_csv(ekey, dstore)[source]
openquake.calculators.export.risk.get_eids_years_serials(events_by_grp, eids)[source]
openquake.calculators.export.risk.get_loss_maps(dstore, kind)[source]
Parameters:
  • dstore – a DataStore instance
  • kind – ‘rlzs’ or ‘stats’
openquake.calculators.export.risk.get_loss_ratios(lrgetter, aids, monitor)[source]
openquake.calculators.export.risk.get_paths(rlz)[source]
Parameters:rlz – a logic tree realization (composite or simple)
Returns:a dict {‘source_model_tree_path’: string, ‘gsim_tree_path’: string}
openquake.calculators.export.risk.get_rup_data(ebruptures)[source]
openquake.calculators.export.risk.indices(*sizes)[source]

Module contents

exception openquake.calculators.export.MissingExporter[source]

Bases: exceptions.Exception

Raised when there is no exporter for the given pair (dskey, fmt)

openquake.calculators.export.export_csv(ekey, dstore)[source]

Default csv exporter for arrays stored in the output.hdf5 file

Parameters:
  • ekey – export key
  • dstore – datastore object
Returns:

a list with the path of the exported file

openquake.calculators.export.keyfunc(ekey)[source]

Extract the name before the slash:

>>> keyfunc(('agg_loss_table', 'csv'))
('agg_loss_table', 'csv')
>>> keyfunc(('agg_loss_table/1', 'csv'))
('agg_loss_table', 'csv')
>>> keyfunc(('agg_loss_table/1/0', 'csv'))
('agg_loss_table', 'csv')
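
Consistent with the doctest above, a minimal reimplementation (a sketch; `keyfunc_sketch` is a hypothetical name) keeps only the part of the datastore key before the first slash:

```python
def keyfunc_sketch(ekey):
    # Drop any '/<index>' suffixes from the datastore key, keeping the
    # export format unchanged.
    dskey, fmt = ekey
    return dskey.split('/')[0], fmt
```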