openquake.server package¶
Subpackages¶
Submodules¶
openquake.server.dbapi module¶
One of the worst things about Python is the DB API 2.0 specification, which is unusable except for building frameworks. It should have been a stepping stone towards a usable DB API 3.0 that never happened. So, instead of a good low-level API, we got a proliferation of Object Relational Mappers making our lives a lot harder. Fortunately, there have always been good Pythonistas in the anti-ORM camp.
This module is heavily inspired by the dbapiext module by Martin Blais, which is part of the antiorm package. The main (only) difference is that I am using the question mark (?) for the placeholders instead of the percent sign (%), to avoid confusion with other uses of %s, in particular in LIKE queries and in expressions like strftime('%s', time) used in SQLite.
In less than 200 lines of code there is enough support to build dynamic SQL queries and to make an ORM unnecessary, since we do not need database independence.
dbapi tutorial¶
The only thing you need to know is the Db class, which is a lazy wrapper over a database connection. You instantiate it by passing a connection function and its arguments:
>>> import sqlite3
>>> db = Db(sqlite3.connect, ':memory:')
Now you have an interface to your database, the db object. This object is lazy, i.e. the connection is not instantiated yet, but it will be when you access its .conn attribute. This attribute is accessed automatically when you call the interface to run a query, for instance to create an empty table:
>>> curs = db('CREATE TABLE job ('
... 'id INTEGER PRIMARY KEY AUTOINCREMENT, value INTEGER)')
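At this point the connection has been instantiated; a quick hedged check that it is created only once and then cached (an assumption consistent with the lazy behaviour described above):
>>> db.conn is db.conn
True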
You can populate the table by using the .insert method:
>>> db.insert('job', ['value'], [(42,), (43,)])
<sqlite3.Cursor object at ...>
Notice that this method returns a standard DB API 2.0 cursor, so you have access to all of its features: for instance, here you could extract the lastrowid.
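A small hedged sketch of doing just that, using a single-row INSERT (the extra row is deleted right away, so the rest of the tutorial is unaffected):
>>> curs = db('INSERT INTO job (value) VALUES (?x)', 99)
>>> curs.lastrowid > 0
True
>>> curs = db('DELETE FROM job WHERE value=?x', 99)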
Then you can run SELECT queries:
>>> rows = db('SELECT * FROM job')
The dbapi provides a Row class which is used to hold the results of SELECT queries and works as one would expect:
>>> rows
[<Row(id=1, value=42)>, <Row(id=2, value=43)>]
>>> tuple(rows[0])
(1, 42)
>>> rows[0].id
1
>>> rows[0].value
42
>>> rows[0]._fields
['id', 'value']
The queries can have different kinds of ? parameters:
?s is for interpolated string parameters:
>>> db('SELECT * FROM ?s', 'job')  # ?s is replaced by 'job'
[<Row(id=1, value=42)>, <Row(id=2, value=43)>]
?x is for escaped parameters (to avoid SQL injection):
>>> db('SELECT * FROM job WHERE id=?x', 1)  # ?x is replaced by 1
[<Row(id=1, value=42)>]
?s and ?x are for scalar parameters; ?S and ?X are for sequences:
>>> db('INSERT INTO job (?S) VALUES (?X)', ['id', 'value'], (3, 44))
<sqlite3.Cursor object at ...>
You can see how the interpolation works by calling the expand method, which returns the interpolated template (alternatively, there is a debug=True flag when calling db that prints the same information). In this case:
>>> db.expand('INSERT INTO job (?S) VALUES (?X)', ['id', 'value'], [3, 44])
'INSERT INTO job (id, value) VALUES (?, ?)'
As you see, ?S parameters work by replacing a list of strings with a comma-separated string, whereas ?X parameters are replaced by a comma-separated sequence of question marks, i.e. the low-level placeholders for SQLite. The interpolation performs a regular search and replace, so if your template contains a literal ?-string that is not meant to be a placeholder, you can run into issues. This is an error:
>>> match("SELECT * FROM job WHERE id=?x AND description='Lots of ?s'", 1)
Traceback (most recent call last):
...
ValueError: Incorrect number of ?-parameters in SELECT * FROM job WHERE id=?x AND description='Lots of ?s', expected 1
This is correct:
>>> match("SELECT * FROM job WHERE id=?x AND description=?x", 1, 'Lots of ?s')
('SELECT * FROM job WHERE id=? AND description=?', (1, 'Lots of ?s'))
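The same expansion rules apply to the scalar placeholders; here is a small hedged sketch using the expand method introduced above (assuming it accepts the template followed by the positional arguments, as in the earlier example):
>>> db.expand('SELECT * FROM ?s WHERE id=?x', 'job', 1)
'SELECT * FROM job WHERE id=?'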
There are three other ? parameters:
?D is for dictionaries and it is used mostly in UPDATE queries:
>>> match('UPDATE mytable SET ?D WHERE id=?x', dict(value=33, other=5), 1)
('UPDATE mytable SET other=?, value=? WHERE id=?', (5, 33, 1))
?A is for dictionaries and it is used in AND queries:
>>> match('SELECT * FROM job WHERE ?A', dict(value=33, id=5))
('SELECT * FROM job WHERE id=? AND value=?', (5, 33))
?O is for dictionaries and it is used in OR queries:
>>> match('SELECT * FROM job WHERE ?O', dict(value=33, id=5))
('SELECT * FROM job WHERE id=? OR value=?', (5, 33))
The dictionary parameters are ordered by field name, just to make the templates reproducible. ?A and ?O are smart enough to treat None parameters specially, turning them into NULL:
>>> match('SELECT * FROM job WHERE ?A', dict(value=None, id=5))
('SELECT * FROM job WHERE id=? AND value IS NULL', (5,))
The ? parameters are matched positionally; it is also possible to pass a few keyword arguments to the db object to tune the standard behavior. In particular, if you know that a query must return a single row, you can do the following:
>>> db('SELECT * FROM job WHERE id=?x', 1, one=True)
<Row(id=1, value=42)>
Without the one=True the query would have returned a list with a single element. If you know that the query must return a scalar you can do the following:
>>> db('SELECT value FROM job WHERE id=?x', 1, scalar=True)
42
If a query that should return a scalar returns something else, or if a query that should return a row returns a different number of rows, appropriate errors are raised:
>>> db('SELECT * FROM job WHERE id=?x', 1, scalar=True)
Traceback (most recent call last):
...
TooManyColumns: 2, expected 1
>>> db('SELECT * FROM job', None, one=True)
Traceback (most recent call last):
...
TooManyRows: 3, expected 1
If a row is expected but not found, a NotFound exception is raised:
>>> db('SELECT * FROM job WHERE id=?x', None, one=True)
Traceback (most recent call last):
...
NotFound
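In application code you would typically catch it; a small hedged sketch reusing the query above:
>>> try:
...     db('SELECT * FROM job WHERE id=?x', None, one=True)
... except NotFound:
...     print('no job found')
no job found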
class openquake.server.dbapi.Db(connect, *args, **kw)[source]¶
Bases: object
A wrapper over a DB API 2 connection. See the tutorial.

conn¶

classmethod expand(m_templ, *m_args)[source]¶
Performs partial interpolation of the template. Used for debugging.

path¶
Path to the underlying sqlite file
exception openquake.server.dbapi.NotFound[source]¶
Bases: exceptions.Exception
Raised when a scalar query has no output
class openquake.server.dbapi.Row(fields, values)[source]¶
Bases: _abcoll.Sequence
A pickleable row, working both as a tuple and an object:
>>> row = Row(['id', 'value'], (1, 2))
>>> tuple(row)
(1, 2)
>>> assert row[0] == row.id and row[1] == row.value
Parameters:
- fields – a sequence of field names
- values – a sequence of values (one per field)
class openquake.server.dbapi.Table(fields, rows)[source]¶
Bases: list
Just a list of Rows with an attribute _fields
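A minimal hedged sketch, assuming the constructor takes the field names and the raw rows, as the signature above suggests:
>>> t = Table(['id', 'value'], [(1, 42), (2, 43)])
>>> t._fields
['id', 'value']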
exception openquake.server.dbapi.TooManyColumns[source]¶
Bases: exceptions.Exception
Raised when a scalar query has more than one column
openquake.server.dbserver module¶
openquake.server.manage module¶
openquake.server.middleware module¶
class openquake.server.middleware.LoginRequiredMiddleware[source]¶
Middleware that requires a user to be authenticated to view any page other than LOGIN_URL. Exemptions to this requirement can be specified in settings via a list of regular expressions in LOGIN_EXEMPT_URLS.
Requires authentication middleware and template context processors to be loaded. You’ll get an error if they aren’t.
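A hedged sketch of the corresponding settings (the regular expressions below are illustrative values, not the project defaults; only the LOGIN_URL and LOGIN_EXEMPT_URLS names come from the description above):

# settings.py (illustrative values)
LOGIN_URL = '/accounts/login/'
# requests whose path matches one of these regexps bypass the
# authentication check performed by LoginRequiredMiddleware
LOGIN_EXEMPT_URLS = (
    r'^accounts/login/$',
    r'^static/',
)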
openquake.server.settings module¶
openquake.server.urls module¶
openquake.server.utils module¶
openquake.server.utils.check_webserver_running(url='http://localhost:8800', max_retries=30)[source]¶
Returns True if a given URL is responding within a given timeout.
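A hedged usage sketch, assuming the function is called before talking to the WebUI (the URL below is just the documented default):

from openquake.server.utils import check_webserver_running

# wait until the WebUI answers on the default URL, then proceed
if check_webserver_running('http://localhost:8800', max_retries=30):
    print('WebUI is responding')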
openquake.server.views module¶
openquake.server.views.DEFAULT_EXPORT_TYPE = 'xml'¶
For exporting calculation outputs, the client can request a specific format (xml, geojson, csv, etc.). If the client does not specify one, give them (NRML) XML by default.
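A hedged illustration of how such a default is typically applied; the export_type parameter name and the helper below are hypothetical, only the DEFAULT_EXPORT_TYPE constant comes from this module:

from openquake.server.views import DEFAULT_EXPORT_TYPE

def chosen_export_type(request_params):
    # hypothetical helper: fall back to the documented default ('xml')
    # when the client did not ask for a specific format
    return request_params.get('export_type', DEFAULT_EXPORT_TYPE)

print(chosen_export_type({}))                        # prints: xml
print(chosen_export_type({'export_type': 'csv'}))    # prints: csv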