aiida.backends.djsite.db.migrations package

exception aiida.backends.djsite.db.migrations.DeserializationException[source]

Bases: aiida.common.exceptions.AiidaException

__module__ = 'aiida.backends.djsite.db.migrations'
class aiida.backends.djsite.db.migrations.ModelModifierV0025(apps, model_class)[source]

Bases: object

AIIDA_ATTRIBUTE_SEP = '.'
__dict__ = mappingproxy({'__module__': 'aiida.backends.djsite.db.migrations', 'AIIDA_ATTRIBUTE_SEP': '.', '_subspecifier_field_name': 'dbnode', '_sep': '.', '__init__': <function ModelModifierV0025.__init__>, 'apps': <property object>, 'subspecifiers_dict': <function ModelModifierV0025.subspecifiers_dict>, 'subspecifier_pk': <function ModelModifierV0025.subspecifier_pk>, 'validate_key': <function ModelModifierV0025.validate_key>, 'get_value_for_node': <function ModelModifierV0025.get_value_for_node>, 'getvalue': <function ModelModifierV0025.getvalue>, 'set_value_for_node': <function ModelModifierV0025.set_value_for_node>, 'del_value_for_node': <function ModelModifierV0025.del_value_for_node>, 'del_value': <function ModelModifierV0025.del_value>, 'set_value': <function ModelModifierV0025.set_value>, 'create_value': <function ModelModifierV0025.create_value>, '__dict__': <attribute '__dict__' of 'ModelModifierV0025' objects>, '__weakref__': <attribute '__weakref__' of 'ModelModifierV0025' objects>, '__doc__': None})
__init__(apps, model_class)[source]

Initialize self. See help(type(self)) for accurate signature.

__module__ = 'aiida.backends.djsite.db.migrations'
__weakref__

list of weak references to the object (if defined)

_sep = '.'
_subspecifier_field_name = 'dbnode'
apps
create_value(key, value, subspecifier_value=None, other_attribs={})[source]

Create a new list of attributes, without storing them, associated with the given key/value pair (and with the given subspecifier, e.g. the DbNode for DbAttribute and DbExtra).

Note:

No queries are made to the DB; in particular, no check is performed on the existence of the given nodes.

Parameters:
  • key – a string with the key to create (can contain the separator self._sep if this is a sub-attribute: indeed, this function calls itself recursively)
  • value – the value to store (a basic data type or a list or a dict)
  • subspecifier_value – must be None if this class has no subspecifier set (e.g., the DbSetting class). Must be the value of the subspecifier (e.g., the dbnode) for classes that define it (e.g. DbAttribute and DbExtra)
  • other_attribs – a dictionary of other parameters, to store only on the level-zero attribute (e.g. for description in DbSetting).
Returns:

always a list of class instances; it is the user's responsibility to store these entries (typically with a Django bulk_create() call).
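As an illustration of the flattening this method performs, a nested value becomes one unsaved row per (sub)key, with subkeys joined by the separator. The sketch below is not the real implementation: plain dicts stand in for unstored model instances, and the field names merely mirror the datatype/tval/ival columns described elsewhere on this page.

```python
def flatten(key, value, sep='.'):
    """Sketch: expand a nested value into flat EAV-style rows (illustrative only)."""
    rows = []
    if isinstance(value, dict):
        # the container row records its declared length in 'ival'
        rows.append({'key': key, 'datatype': 'dict', 'ival': len(value)})
        for subkey, subvalue in value.items():
            rows.extend(flatten(f'{key}{sep}{subkey}', subvalue, sep))
    elif isinstance(value, list):
        rows.append({'key': key, 'datatype': 'list', 'ival': len(value)})
        for idx, item in enumerate(value):
            rows.extend(flatten(f'{key}{sep}{idx}', item, sep))
    elif isinstance(value, int):
        rows.append({'key': key, 'datatype': 'int', 'ival': value})
    else:
        rows.append({'key': key, 'datatype': 'txt', 'tval': str(value)})
    return rows

rows = flatten('a', [2, 'yy'])
# rows[0] describes the list itself; rows[1:] are its elements 'a.0' and 'a.1'
```

In a real migration the instances returned by create_value would then be persisted in one go, typically with Django's bulk_create().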

del_value(key, only_children=False, subspecifier_value=None)[source]

Delete a value associated with the given key (if existing).

Note:

No exceptions are raised if no entry is found.

Parameters:
  • key – the key to delete. Can contain the separator self._sep if you want to delete a subkey.
  • only_children – if True, delete only children and not the entry itself.
  • subspecifier_value – must be None if this class has no subspecifier set (e.g., the DbSetting class). Must be the value of the subspecifier (e.g., the dbnode) for classes that define it (e.g. DbAttribute and DbExtra)
del_value_for_node(dbnode, key)[source]

Delete an attribute from the database for the given dbnode.

Note:

no exception is raised if no attribute with the given key is found in the DB.

Parameters:
  • dbnode – the dbnode for which you want to delete the key.
  • key – the key to delete.
get_value_for_node(dbnode, key)[source]

Get an attribute from the database for the given dbnode.

Returns:the value stored in the DB table, converted to the correct type.
Raises:AttributeError – if no key is found for the given dbnode
getvalue(attr)[source]

Called on a given row, return the corresponding value, cast to the correct type.

set_value(key, value, with_transaction=False, subspecifier_value=None, other_attribs={}, stop_if_existing=False)[source]

Set a new value in the DB, possibly associated to the given subspecifier.

Note:

This method also stores the value directly in the DB.

Parameters:
  • key – a string with the key to create (must be a level-0 attribute, that is it cannot contain the separator cls._sep).
  • value – the value to store (a basic data type or a list or a dict)
  • subspecifier_value – must be None if this class has no subspecifier set (e.g., the DbSetting class). Must be the value of the subspecifier (e.g., the dbnode) for classes that define it (e.g. DbAttribute and DbExtra)
  • with_transaction – True if you want this function to be managed with transactions. Set to False if you already have a manual management of transactions in the block where you are calling this function (useful for speed improvements to avoid recursive transactions)
  • other_attribs – a dictionary of other parameters, to store only on the level-zero attribute (e.g. for description in DbSetting).
  • stop_if_existing – if True, raise a UniquenessError if the new entry would violate a uniqueness constraint in the DB (same key, or same key+node, depending on the specific subclass). Otherwise, first delete the old value, if it exists. Setting this to True is useful if you want to use a given attribute as a “locking” value, e.g. to avoid performing an action twice on the same node. Note that, if you are using transactions, you may get the error only when the transaction is committed.
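The stop_if_existing semantics can be sketched without a database: a plain dict stands in for the table, and a local exception class for the UniquenessError mentioned above (both purely illustrative).

```python
class UniquenessError(Exception):
    """Illustrative stand-in for the real UniquenessError."""

def set_value_sketch(store, key, value, stop_if_existing=False):
    """Sketch: write a key, either failing on conflict or replacing the old value."""
    if key in store:
        if stop_if_existing:
            # with True, a pre-existing key aborts the write
            raise UniquenessError(f'key {key!r} already exists')
        # with False, the old value is deleted first, then re-created
        del store[key]
    store[key] = value

store = {}
set_value_sketch(store, 'lock', 1, stop_if_existing=True)   # first write succeeds
try:
    set_value_sketch(store, 'lock', 2, stop_if_existing=True)
except UniquenessError:
    pass  # the second writer loses: the "locking" use case described above
```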
set_value_for_node(dbnode, key, value, with_transaction=False, stop_if_existing=False)[source]

This is the raw-level method that accesses the DB. No checks are done to prevent the user from (re)setting a valid key. To be used only internally.

Todo:

there may be errors on concurrent writes; this case is not checked!

Parameters:
  • dbnode – the dbnode for which the attribute should be stored; if an integer is passed, it is used as the PK of the dbnode, without any further check (for speed reasons)
  • key – the key of the attribute to store; must be a level-zero attribute (i.e., no separators in the key)
  • value – the value of the attribute to store
  • with_transaction – if True (default), do this within a transaction, so that nothing gets stored if a subitem cannot be created. Otherwise, if this parameter is False, no transaction management is performed.
  • stop_if_existing – if True, raise a UniquenessError if the key already exists for the given node. Otherwise, first delete the old value, if it exists. Setting this to True is useful if you want to use a given attribute as a “locking” value, e.g. to avoid performing an action twice on the same node. Note that, if you are using transactions, you may get the error only when the transaction is committed.
Raises:

ValueError – if the key contains the separator symbol used internally to unpack dictionaries and lists (defined in cls._sep).

subspecifier_pk(attr)[source]

Return the subspecifier PK in the database (or None, if no subspecifier should be used)

subspecifiers_dict(attr)[source]

Return a dict to narrow down the query to only those matching also the subspecifier.

validate_key(key)[source]

Validate the key string (e.g., check that it does not contain the separator symbol).

Returns:None if the key is valid
Raises:aiida.common.ValidationError – if the key is not valid
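A hypothetical re-implementation of the check this method performs (the real method raises aiida.common.ValidationError; a plain ValueError is used here to keep the sketch self-contained):

```python
AIIDA_ATTRIBUTE_SEP = '.'  # the separator constant documented above

def validate_key_sketch(key):
    """Sketch: reject keys containing the separator used to flatten dicts/lists."""
    if not isinstance(key, str):
        raise ValueError('key must be a string')
    if AIIDA_ATTRIBUTE_SEP in key:
        raise ValueError(f'key cannot contain the separator {AIIDA_ATTRIBUTE_SEP!r}')

validate_key_sketch('energy')           # valid: returns None
try:
    validate_key_sketch('energy.unit')  # invalid: contains the separator
except ValueError as exc:
    print(exc)
```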
aiida.backends.djsite.db.migrations._deserialize_attribute(mainitem, subitems, sep, original_class=None, original_pk=None, lesserrors=False)[source]

Deserialize a single attribute.

Parameters:
  • mainitem – the main item (either the attribute itself for base types (None, string, …), or the main item for lists and dicts). Must contain the ‘key’ key and also the following keys: datatype, tval, fval, ival, bval, dval. NOTE that a type check is not performed! tval is expected to be a string, dval a date, etc.
  • subitems – must be a dictionary of dictionaries. In the top-level dictionary, the key must be the key of the attribute, stripped of all prefixes (i.e., if the mainitem has key ‘a.b’ and we pass subitems ‘a.b.0’, ‘a.b.1’, ‘a.b.1.c’, their keys must be ‘0’, ‘1’, ‘1.c’). It must be None if the value is not iterable (int, str, float, …). It is an empty dictionary if there are no subitems.
  • sep – a string, the separator between subfields (to separate the name of a dictionary from the keys it contains, for instance)
  • original_class – if these elements come from a specific subclass of DbMultipleValueAttributeBaseClass, pass here the class (note: the class, not the instance!). This is used only in case the wrong number of elements is found in the raw data, to print a more meaningful message (if the class has a dbnode associated to it)
  • original_pk – if the elements come from a specific subclass of DbMultipleValueAttributeBaseClass that has a dbnode associated to it, pass here the PK integer. This is used only in case the wrong number of elements is found in the raw data, to print a more meaningful message
  • lesserrors – If set to True, in some cases where the content of the DB is not consistent but data is still recoverable, it will just log the message rather than raising an exception (e.g. if the number of elements of a dictionary is different from the number declared in the ival field).
Returns:

the deserialized value

Raises:

aiida.backends.djsite.db.migrations.DeserializationException – if an error occurs

aiida.backends.djsite.db.migrations._update_schema_version(version, apps, schema_editor)[source]

Update the schema version. This uses the current models (checking whether the value is stored in EAV mode or JSONB) to avoid relying on the DbSetting schema, which may change (as it did with the migration of the settings table to JSONB).
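Throughout this module, each migration's operations list ends with a RunPython step whose forward and reverse callables are built with functools.partial, pre-binding the version string. The pattern can be shown without Django by recording the version in a dict instead of the DbSetting table (the dict and the function body here are purely illustrative):

```python
import functools

# Illustrative stand-in for the real _update_schema_version(version, apps,
# schema_editor), which writes the version into the settings table; here we
# record it in a dict so the partial-binding pattern runs without a database.
RECORDED = {}

def _update_schema_version(version, apps, schema_editor):
    RECORDED['schema_version'] = version

# forward/reverse callables as they would be passed to migrations.RunPython:
forward = functools.partial(_update_schema_version, '1.0.2')
reverse = functools.partial(_update_schema_version, '1.0.1')

forward(apps=None, schema_editor=None)
assert RECORDED['schema_version'] == '1.0.2'
```

This is why each operations dump below ends with a RunPython entry carrying two functools.partial objects: one for the upgrade, one for the downgrade.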

aiida.backends.djsite.db.migrations.current_schema_version()[source]
aiida.backends.djsite.db.migrations.deserialize_attributes(data, sep, original_class=None, original_pk=None)[source]

Deserialize the attributes from the format internally stored in the DB to the actual format (dictionaries, lists, integers, …).

Parameters:
  • data – must be a dictionary of dictionaries. In the top-level dictionary, the key must be the key of the attribute. The value must be a dictionary with the following keys: datatype, tval, fval, ival, bval, dval. Other keys are ignored. NOTE that a type check is not performed! tval is expected to be a string, dval a date, etc.
  • sep – a string, the separator between subfields (to separate the name of a dictionary from the keys it contains, for instance)
  • original_class – if these elements come from a specific subclass of DbMultipleValueAttributeBaseClass, pass here the class (note: the class, not the instance!). This is used only in case the wrong number of elements is found in the raw data, to print a more meaningful message (if the class has a dbnode associated to it)
  • original_pk – if the elements come from a specific subclass of DbMultipleValueAttributeBaseClass that has a dbnode associated to it, pass here the PK integer. This is used only in case the wrong number of elements is found in the raw data, to print a more meaningful message
Returns:

a dictionary mapping each top-level key to its value, deserialized back to lists, dictionaries, etc. Example: if data = {'a': {'datatype': "list", "ival": 2, ...}, 'a.0': {'datatype': "int", "ival": 2, ...}, 'a.1': {'datatype': "txt", "tval": "yy", ...}}, it will return {"a": [2, "yy"]}
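A minimal sketch of how such flattened rows are rebuilt, handling only the 'int', 'txt' and 'list' datatypes from the example above (this is an illustration, not the real implementation, which also handles nesting, dicts and the other datatypes):

```python
def deserialize_sketch(data, sep='.'):
    """Sketch: rebuild flat EAV-style rows into Python values (illustrative only)."""
    result = {}
    for key, row in data.items():
        if sep in key:
            continue  # only top-level keys; subkeys are consumed by their parent
        if row['datatype'] == 'list':
            # 'ival' on the container row declares the number of elements
            items = []
            for idx in range(row['ival']):
                sub = data[f'{key}{sep}{idx}']
                items.append(sub['ival'] if sub['datatype'] == 'int' else sub['tval'])
            result[key] = items
        elif row['datatype'] == 'int':
            result[key] = row['ival']
        elif row['datatype'] == 'txt':
            result[key] = row['tval']
    return result

data = {
    'a': {'datatype': 'list', 'ival': 2},
    'a.0': {'datatype': 'int', 'ival': 2},
    'a.1': {'datatype': 'txt', 'tval': 'yy'},
}
print(deserialize_sketch(data))  # → {'a': [2, 'yy']}
```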

aiida.backends.djsite.db.migrations.upgrade_schema_version(up_revision, down_revision)[source]

Submodules

class aiida.backends.djsite.db.migrations.0001_initial.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0001_initial'
dependencies = [('auth', '0001_initial')]
operations = [<CreateModel name='DbUser', fields=[('id', <django.db.models.fields.AutoField>), ('password', <django.db.models.fields.CharField>), ('last_login', <django.db.models.fields.DateTimeField>), ('is_superuser', <django.db.models.fields.BooleanField>), ('email', <django.db.models.fields.EmailField>), ('first_name', <django.db.models.fields.CharField>), ('last_name', <django.db.models.fields.CharField>), ('institution', <django.db.models.fields.CharField>), ('is_staff', <django.db.models.fields.BooleanField>), ('is_active', <django.db.models.fields.BooleanField>), ('date_joined', <django.db.models.fields.DateTimeField>), ('groups', <django.db.models.fields.related.ManyToManyField>), ('user_permissions', <django.db.models.fields.related.ManyToManyField>)], options={'abstract': False}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbAttribute', fields=[('id', <django.db.models.fields.AutoField>), ('key', <django.db.models.fields.CharField>), ('datatype', <django.db.models.fields.CharField>), ('tval', <django.db.models.fields.TextField>), ('fval', <django.db.models.fields.FloatField>), ('ival', <django.db.models.fields.IntegerField>), ('bval', <django.db.models.fields.NullBooleanField>), ('dval', <django.db.models.fields.DateTimeField>)], options={'abstract': False}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbAuthInfo', fields=[('id', <django.db.models.fields.AutoField>), ('auth_params', <django.db.models.fields.TextField>), ('metadata', <django.db.models.fields.TextField>), ('enabled', <django.db.models.fields.BooleanField>), ('aiidauser', <django.db.models.fields.related.ForeignKey>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbCalcState', fields=[('id', <django.db.models.fields.AutoField>), ('state', <django.db.models.fields.CharField>), ('time', <django.db.models.fields.DateTimeField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel 
name='DbComment', fields=[('id', <django.db.models.fields.AutoField>), ('uuid', <django.db.models.fields.CharField>), ('ctime', <django.db.models.fields.DateTimeField>), ('mtime', <django.db.models.fields.DateTimeField>), ('content', <django.db.models.fields.TextField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbComputer', fields=[('id', <django.db.models.fields.AutoField>), ('uuid', <django.db.models.fields.CharField>), ('name', <django.db.models.fields.CharField>), ('hostname', <django.db.models.fields.CharField>), ('description', <django.db.models.fields.TextField>), ('enabled', <django.db.models.fields.BooleanField>), ('transport_type', <django.db.models.fields.CharField>), ('scheduler_type', <django.db.models.fields.CharField>), ('transport_params', <django.db.models.fields.TextField>), ('metadata', <django.db.models.fields.TextField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbExtra', fields=[('id', <django.db.models.fields.AutoField>), ('key', <django.db.models.fields.CharField>), ('datatype', <django.db.models.fields.CharField>), ('tval', <django.db.models.fields.TextField>), ('fval', <django.db.models.fields.FloatField>), ('ival', <django.db.models.fields.IntegerField>), ('bval', <django.db.models.fields.NullBooleanField>), ('dval', <django.db.models.fields.DateTimeField>)], options={'abstract': False}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbGroup', fields=[('id', <django.db.models.fields.AutoField>), ('uuid', <django.db.models.fields.CharField>), ('name', <django.db.models.fields.CharField>), ('type', <django.db.models.fields.CharField>), ('time', <django.db.models.fields.DateTimeField>), ('description', <django.db.models.fields.TextField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbLink', fields=[('id', <django.db.models.fields.AutoField>), ('label', <django.db.models.fields.CharField>)], options={}, 
bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbLock', fields=[('key', <django.db.models.fields.CharField>), ('creation', <django.db.models.fields.DateTimeField>), ('timeout', <django.db.models.fields.IntegerField>), ('owner', <django.db.models.fields.CharField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbLog', fields=[('id', <django.db.models.fields.AutoField>), ('time', <django.db.models.fields.DateTimeField>), ('loggername', <django.db.models.fields.CharField>), ('levelname', <django.db.models.fields.CharField>), ('objname', <django.db.models.fields.CharField>), ('objpk', <django.db.models.fields.IntegerField>), ('message', <django.db.models.fields.TextField>), ('metadata', <django.db.models.fields.TextField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbNode', fields=[('id', <django.db.models.fields.AutoField>), ('uuid', <django.db.models.fields.CharField>), ('type', <django.db.models.fields.CharField>), ('label', <django.db.models.fields.CharField>), ('description', <django.db.models.fields.TextField>), ('ctime', <django.db.models.fields.DateTimeField>), ('mtime', <django.db.models.fields.DateTimeField>), ('nodeversion', <django.db.models.fields.IntegerField>), ('public', <django.db.models.fields.BooleanField>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbPath', fields=[('id', <django.db.models.fields.AutoField>), ('depth', <django.db.models.fields.IntegerField>), ('entry_edge_id', <django.db.models.fields.IntegerField>), ('direct_edge_id', <django.db.models.fields.IntegerField>), ('exit_edge_id', <django.db.models.fields.IntegerField>), ('child', <django.db.models.fields.related.ForeignKey>), ('parent', <django.db.models.fields.related.ForeignKey>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbSetting', fields=[('id', <django.db.models.fields.AutoField>), ('key', 
<django.db.models.fields.CharField>), ('datatype', <django.db.models.fields.CharField>), ('tval', <django.db.models.fields.TextField>), ('fval', <django.db.models.fields.FloatField>), ('ival', <django.db.models.fields.IntegerField>), ('bval', <django.db.models.fields.NullBooleanField>), ('dval', <django.db.models.fields.DateTimeField>), ('description', <django.db.models.fields.TextField>), ('time', <django.db.models.fields.DateTimeField>)], options={'abstract': False}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbWorkflow', fields=[('id', <django.db.models.fields.AutoField>), ('uuid', <django.db.models.fields.CharField>), ('ctime', <django.db.models.fields.DateTimeField>), ('mtime', <django.db.models.fields.DateTimeField>), ('label', <django.db.models.fields.CharField>), ('description', <django.db.models.fields.TextField>), ('nodeversion', <django.db.models.fields.IntegerField>), ('lastsyncedversion', <django.db.models.fields.IntegerField>), ('state', <django.db.models.fields.CharField>), ('report', <django.db.models.fields.TextField>), ('module', <django.db.models.fields.TextField>), ('module_class', <django.db.models.fields.TextField>), ('script_path', <django.db.models.fields.TextField>), ('script_md5', <django.db.models.fields.CharField>), ('user', <django.db.models.fields.related.ForeignKey>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbWorkflowData', fields=[('id', <django.db.models.fields.AutoField>), ('name', <django.db.models.fields.CharField>), ('time', <django.db.models.fields.DateTimeField>), ('data_type', <django.db.models.fields.CharField>), ('value_type', <django.db.models.fields.CharField>), ('json_value', <django.db.models.fields.TextField>), ('aiida_obj', <django.db.models.fields.related.ForeignKey>), ('parent', <django.db.models.fields.related.ForeignKey>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <CreateModel name='DbWorkflowStep', fields=[('id', 
<django.db.models.fields.AutoField>), ('name', <django.db.models.fields.CharField>), ('time', <django.db.models.fields.DateTimeField>), ('nextcall', <django.db.models.fields.CharField>), ('state', <django.db.models.fields.CharField>), ('calculations', <django.db.models.fields.related.ManyToManyField>), ('parent', <django.db.models.fields.related.ForeignKey>), ('sub_workflows', <django.db.models.fields.related.ManyToManyField>), ('user', <django.db.models.fields.related.ForeignKey>)], options={}, bases=(<class 'django.db.models.base.Model'>,)>, <AlterUniqueTogether name='dbworkflowstep', unique_together={('parent', 'name')}>, <AlterUniqueTogether name='dbworkflowdata', unique_together={('parent', 'name', 'data_type')}>, <AlterUniqueTogether name='dbsetting', unique_together={('key',)}>, <AddField model_name='dbnode', name='children', field=<django.db.models.fields.related.ManyToManyField>, preserve_default=True>, <AddField model_name='dbnode', name='dbcomputer', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AddField model_name='dbnode', name='outputs', field=<django.db.models.fields.related.ManyToManyField>, preserve_default=True>, <AddField model_name='dbnode', name='user', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AddField model_name='dblink', name='input', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AddField model_name='dblink', name='output', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AlterUniqueTogether name='dblink', unique_together={('input', 'output'), ('output', 'label')}>, <AddField model_name='dbgroup', name='dbnodes', field=<django.db.models.fields.related.ManyToManyField>, preserve_default=True>, <AddField model_name='dbgroup', name='user', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AlterUniqueTogether name='dbgroup', unique_together={('name', 'type')}>, <AddField model_name='dbextra', 
name='dbnode', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AlterUniqueTogether name='dbextra', unique_together={('dbnode', 'key')}>, <AddField model_name='dbcomment', name='dbnode', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AddField model_name='dbcomment', name='user', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AddField model_name='dbcalcstate', name='dbnode', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AlterUniqueTogether name='dbcalcstate', unique_together={('dbnode', 'state')}>, <AddField model_name='dbauthinfo', name='dbcomputer', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AlterUniqueTogether name='dbauthinfo', unique_together={('aiidauser', 'dbcomputer')}>, <AddField model_name='dbattribute', name='dbnode', field=<django.db.models.fields.related.ForeignKey>, preserve_default=True>, <AlterUniqueTogether name='dbattribute', unique_together={('dbnode', 'key')}>, <RunPython functools.partial(<function _update_schema_version>, '1.0.1') reverse_code=functools.partial(<function _update_schema_version>, '1.0.0')>]
class aiida.backends.djsite.db.migrations.0002_db_state_change.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0002_db_state_change'
dependencies = [('db', '0001_initial')]
operations = [<AlterField model_name='dbcalcstate', name='state', field=<django.db.models.fields.CharField>, preserve_default=True>, <RunPython <function fix_calc_states>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.2') reverse_code=functools.partial(<function _update_schema_version>, '1.0.1')>]
aiida.backends.djsite.db.migrations.0002_db_state_change.fix_calc_states(apps, schema_editor)[source]

class aiida.backends.djsite.db.migrations.0004_add_daemon_and_uuid_indices.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0004_add_daemon_and_uuid_indices'
dependencies = [('db', '0003_add_link_type')]
operations = [<RunSQL '\n CREATE INDEX tval_idx_for_daemon\n ON db_dbattribute (tval)\n WHERE ("db_dbattribute"."tval"\n IN (\'COMPUTED\', \'WITHSCHEDULER\', \'TOSUBMIT\'))'>, <AlterField model_name='dbnode', name='uuid', field=<django.db.models.fields.CharField>, preserve_default=True>, <RunPython functools.partial(<function _update_schema_version>, '1.0.4') reverse_code=functools.partial(<function _update_schema_version>, '1.0.3')>]
class aiida.backends.djsite.db.migrations.0005_add_cmtime_indices.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0005_add_cmtime_indices'
dependencies = [('db', '0004_add_daemon_and_uuid_indices')]
operations = [<AlterField model_name='dbnode', name='ctime', field=<django.db.models.fields.DateTimeField>, preserve_default=True>, <AlterField model_name='dbnode', name='mtime', field=<django.db.models.fields.DateTimeField>, preserve_default=True>, <RunPython functools.partial(<function _update_schema_version>, '1.0.5') reverse_code=functools.partial(<function _update_schema_version>, '1.0.4')>]
class aiida.backends.djsite.db.migrations.0006_delete_dbpath.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0006_delete_dbpath'
dependencies = [('db', '0005_add_cmtime_indices')]
operations = [<RemoveField model_name='dbpath', name='child'>, <RemoveField model_name='dbpath', name='parent'>, <RemoveField model_name='dbnode', name='children'>, <DeleteModel name='DbPath'>, <RunSQL '\n DROP TRIGGER IF EXISTS autoupdate_tc ON db_dblink;\n DROP FUNCTION IF EXISTS update_tc();\n '>, <RunPython functools.partial(<function _update_schema_version>, '1.0.6') reverse_code=functools.partial(<function _update_schema_version>, '1.0.5')>]
class aiida.backends.djsite.db.migrations.0007_update_linktypes.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0007_update_linktypes'
dependencies = [('db', '0006_delete_dbpath')]
operations = [<RunSQL "\n UPDATE db_dblink set type='createlink' WHERE db_dblink.id IN (\n SELECT db_dblink_1.id\n FROM db_dbnode AS db_dbnode_1\n JOIN db_dblink AS db_dblink_1 ON db_dblink_1.input_id = db_dbnode_1.id\n JOIN db_dbnode AS db_dbnode_2 ON db_dblink_1.output_id = db_dbnode_2.id\n WHERE db_dbnode_1.type LIKE 'calculation.inline.%'\n AND db_dbnode_2.type LIKE 'data.%'\n AND db_dblink_1.type = 'returnlink'\n );\n ">, <RunSQL "\n UPDATE db_dblink set type='inputlink' where id in (\n SELECT db_dblink_1.id\n FROM db_dbnode AS db_dbnode_1\n JOIN db_dblink AS db_dblink_1 ON db_dblink_1.input_id = db_dbnode_1.id\n JOIN db_dbnode AS db_dbnode_2 ON db_dblink_1.output_id = db_dbnode_2.id\n WHERE ( db_dbnode_1.type LIKE 'data.%' or db_dbnode_1.type = 'code.Code.' )\n AND db_dbnode_2.type LIKE 'calculation.%'\n AND ( db_dblink_1.type = null OR db_dblink_1.type = '')\n );\n ">, <RunSQL "\n UPDATE db_dblink set type='createlink' where id in (\n SELECT db_dblink_1.id\n FROM db_dbnode AS db_dbnode_1\n JOIN db_dblink AS db_dblink_1 ON db_dblink_1.input_id = db_dbnode_1.id\n JOIN db_dbnode AS db_dbnode_2 ON db_dblink_1.output_id = db_dbnode_2.id\n WHERE db_dbnode_2.type LIKE 'data.%'\n AND (\n db_dbnode_1.type LIKE 'calculation.job.%'\n OR\n db_dbnode_1.type = 'calculation.inline.InlineCalculation.'\n )\n AND ( db_dblink_1.type = null OR db_dblink_1.type = '')\n );\n ">, <RunSQL "\n UPDATE db_dblink set type='returnlink' where id in (\n SELECT db_dblink_1.id\n FROM db_dbnode AS db_dbnode_1\n JOIN db_dblink AS db_dblink_1 ON db_dblink_1.input_id = db_dbnode_1.id\n JOIN db_dbnode AS db_dbnode_2 ON db_dblink_1.output_id = db_dbnode_2.id\n WHERE db_dbnode_2.type LIKE 'data.%'\n AND db_dbnode_1.type = 'calculation.work.WorkCalculation.'\n AND ( db_dblink_1.type = null OR db_dblink_1.type = '')\n );\n ">, <RunSQL "\n UPDATE db_dblink set type='calllink' where id in (\n SELECT db_dblink_1.id\n FROM db_dbnode AS db_dbnode_1\n JOIN db_dblink AS db_dblink_1 ON db_dblink_1.input_id 
= db_dbnode_1.id\n JOIN db_dbnode AS db_dbnode_2 ON db_dblink_1.output_id = db_dbnode_2.id\n WHERE db_dbnode_1.type = 'calculation.work.WorkCalculation.'\n AND db_dbnode_2.type LIKE 'calculation.%'\n AND ( db_dblink_1.type = null OR db_dblink_1.type = '')\n );\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.8') reverse_code=functools.partial(<function _update_schema_version>, '1.0.7')>]
class aiida.backends.djsite.db.migrations.0008_code_hidden_to_extra.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0008_code_hidden_to_extra'
dependencies = [('db', '0007_update_linktypes')]
operations = [<RunSQL "\n INSERT INTO db_dbextra (key, datatype, tval, fval, ival, bval, dval, dbnode_id) (\n SELECT db_dbattribute.key, db_dbattribute.datatype, db_dbattribute.tval, db_dbattribute.fval, db_dbattribute.ival, db_dbattribute.bval, db_dbattribute.dval, db_dbattribute.dbnode_id\n FROM db_dbattribute JOIN db_dbnode ON db_dbnode.id = db_dbattribute.dbnode_id\n WHERE db_dbattribute.key = 'hidden'\n AND db_dbnode.type = 'code.Code.'\n );\n ">, <RunSQL "\n DELETE FROM db_dbattribute\n WHERE id in (\n SELECT db_dbattribute.id\n FROM db_dbattribute\n JOIN db_dbnode ON db_dbnode.id = db_dbattribute.dbnode_id\n WHERE db_dbattribute.key = 'hidden' AND db_dbnode.type = 'code.Code.'\n );\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.8') reverse_code=functools.partial(<function _update_schema_version>, '1.0.7')>]
class aiida.backends.djsite.db.migrations.0009_base_data_plugin_type_string.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0009_base_data_plugin_type_string'
dependencies = [('db', '0008_code_hidden_to_extra')]
operations = [<RunSQL "\n UPDATE db_dbnode SET type = 'data.bool.Bool.' WHERE type = 'data.base.Bool.';\n UPDATE db_dbnode SET type = 'data.float.Float.' WHERE type = 'data.base.Float.';\n UPDATE db_dbnode SET type = 'data.int.Int.' WHERE type = 'data.base.Int.';\n UPDATE db_dbnode SET type = 'data.str.Str.' WHERE type = 'data.base.Str.';\n UPDATE db_dbnode SET type = 'data.list.List.' WHERE type = 'data.base.List.';\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.9') reverse_code=functools.partial(<function _update_schema_version>, '1.0.8')>]
class aiida.backends.djsite.db.migrations.0010_process_type.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0010_process_type'
dependencies = [('db', '0009_base_data_plugin_type_string')]
operations = [<AddField model_name='dbnode', name='process_type', field=<django.db.models.fields.CharField>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.10') reverse_code=functools.partial(<function _update_schema_version>, '1.0.9')>]
class aiida.backends.djsite.db.migrations.0011_delete_kombu_tables.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0011_delete_kombu_tables'
dependencies = [('db', '0010_process_type')]
operations = [<RunSQL "\n DROP TABLE IF EXISTS kombu_message;\n DROP TABLE IF EXISTS kombu_queue;\n DELETE FROM db_dbsetting WHERE key = 'daemon|user';\n DELETE FROM db_dbsetting WHERE key = 'daemon|task_stop|retriever';\n DELETE FROM db_dbsetting WHERE key = 'daemon|task_start|retriever';\n DELETE FROM db_dbsetting WHERE key = 'daemon|task_stop|updater';\n DELETE FROM db_dbsetting WHERE key = 'daemon|task_start|updater';\n DELETE FROM db_dbsetting WHERE key = 'daemon|task_stop|submitter';\n DELETE FROM db_dbsetting WHERE key = 'daemon|task_start|submitter';\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.11') reverse_code=functools.partial(<function _update_schema_version>, '1.0.10')>]
class aiida.backends.djsite.db.migrations.0012_drop_dblock.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0012_drop_dblock'
dependencies = [('db', '0011_delete_kombu_tables')]
operations = [<DeleteModel name='DbLock'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.12') reverse_code=functools.partial(<function _update_schema_version>, '1.0.11')>]
class aiida.backends.djsite.db.migrations.0013_django_1_8.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0013_django_1_8'
dependencies = [('db', '0012_drop_dblock')]
operations = [<AlterField model_name='dbuser', name='last_login', field=<django.db.models.fields.DateTimeField>>, <AlterField model_name='dbuser', name='email', field=<django.db.models.fields.EmailField>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.13') reverse_code=functools.partial(<function _update_schema_version>, '1.0.12')>]

Add a uniqueness constraint to the uuid column of the DbNode table.

class aiida.backends.djsite.db.migrations.0014_add_node_uuid_unique_constraint.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Add a uniqueness constraint to the uuid column of the DbNode table.

__module__ = 'aiida.backends.djsite.db.migrations.0014_add_node_uuid_unique_constraint'
dependencies = [('db', '0013_django_1_8')]
operations = [<RunPython <function verify_node_uuid_uniqueness> reverse_code=<function reverse_code>>, <AlterField model_name='dbnode', name='uuid', field=<django.db.models.fields.CharField>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.14') reverse_code=functools.partial(<function _update_schema_version>, '1.0.13')>]
aiida.backends.djsite.db.migrations.0014_add_node_uuid_unique_constraint.reverse_code(apps, schema_editor)[source]
aiida.backends.djsite.db.migrations.0014_add_node_uuid_unique_constraint.verify_node_uuid_uniqueness(apps, schema_editor)[source]

Check whether the database contains nodes with duplicate UUIDs.

Note that we have to redefine this method from aiida.manage.database.integrity.verify_node_uuid_uniqueness because the migrations.RunPython command that invokes this function passes two arguments, so this wrapper needs a different function signature.

Raises: IntegrityError if the database contains nodes with duplicate UUIDs.
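The wrapper pattern described above can be sketched in pure Python (no Django imports; the row shapes are hypothetical stand-ins for the real models, and RuntimeError stands in for IntegrityError):

```python
# Hypothetical sketch: migrations.RunPython always calls its function as
# code(apps, schema_editor), which is why the utility from
# aiida.manage.database.integrity must be wrapped in a two-argument function.

def find_duplicate_uuids(rows):
    """Return the set of UUID values that occur more than once."""
    seen, duplicates = set(), set()
    for row in rows:
        if row['uuid'] in seen:
            duplicates.add(row['uuid'])
        seen.add(row['uuid'])
    return duplicates

def verify_node_uuid_uniqueness(apps, schema_editor):
    # RunPython-compatible signature: exactly two positional arguments.
    rows = schema_editor  # in this sketch the caller passes the rows directly
    duplicates = find_duplicate_uuids(rows)
    if duplicates:
        raise RuntimeError('database contains nodes with duplicate UUIDs: %s' % sorted(duplicates))
```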

Invalidating the node hashes - users should rehash nodes for caching

class aiida.backends.djsite.db.migrations.0015_invalidating_node_hash.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Invalidating the node hashes - users should rehash nodes for caching

__module__ = 'aiida.backends.djsite.db.migrations.0015_invalidating_node_hash'
dependencies = [('db', '0014_add_node_uuid_unique_constraint')]
operations = [<RunSQL " DELETE FROM db_dbextra WHERE key='_aiida_hash';" reverse_sql=" DELETE FROM db_dbextra WHERE key='_aiida_hash';">, <RunPython functools.partial(<function _update_schema_version>, '1.0.15') reverse_code=functools.partial(<function _update_schema_version>, '1.0.14')>]
class aiida.backends.djsite.db.migrations.0016_code_sub_class_of_data.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0016_code_sub_class_of_data'
dependencies = [('db', '0015_invalidating_node_hash')]
operations = [<RunSQL sql="UPDATE db_dbnode SET type = 'data.code.Code.' WHERE type = 'code.Code.';", reverse_sql="UPDATE db_dbnode SET type = 'code.Code.' WHERE type = 'data.code.Code.';">, <RunPython functools.partial(<function _update_schema_version>, '1.0.16') reverse_code=functools.partial(<function _update_schema_version>, '1.0.15')>]
class aiida.backends.djsite.db.migrations.0017_drop_dbcalcstate.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

__module__ = 'aiida.backends.djsite.db.migrations.0017_drop_dbcalcstate'
dependencies = [('db', '0016_code_sub_class_of_data')]
operations = [<DeleteModel name='DbCalcState'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.17') reverse_code=functools.partial(<function _update_schema_version>, '1.0.16')>]

Migration for the upgrade to Django 1.11

class aiida.backends.djsite.db.migrations.0018_django_1_11.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Migration for the upgrade to Django 1.11

This migration switches from the django_extensions UUID field to the native UUIDField of Django 1.11.

It also introduces unique constraints on all uuid columns (previously these existed only on dbnode).

__module__ = 'aiida.backends.djsite.db.migrations.0018_django_1_11'
dependencies = [('db', '0017_drop_dbcalcstate')]
operations = [<RunPython <function _verify_uuid_uniqueness> reverse_code=<function reverse_code>>, <AlterField model_name='dbcomment', name='uuid', field=<django.db.models.fields.UUIDField>>, <AlterField model_name='dbcomputer', name='uuid', field=<django.db.models.fields.UUIDField>>, <AlterField model_name='dbgroup', name='uuid', field=<django.db.models.fields.UUIDField>>, <AlterField model_name='dbnode', name='uuid', field=<django.db.models.fields.CharField>>, <AlterField model_name='dbnode', name='uuid', field=<django.db.models.fields.UUIDField>>, <AlterField model_name='dbuser', name='email', field=<django.db.models.fields.EmailField>>, <AlterField model_name='dbuser', name='groups', field=<django.db.models.fields.related.ManyToManyField>>, <AlterField model_name='dbworkflow', name='uuid', field=<django.db.models.fields.UUIDField>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.18') reverse_code=functools.partial(<function _update_schema_version>, '1.0.17')>]
aiida.backends.djsite.db.migrations.0018_django_1_11._verify_uuid_uniqueness(apps, schema_editor)[source]

Check whether the respective tables contain rows with duplicate UUIDs.

Note that we have to redefine this method from aiida.manage.database.integrity because the migrations.RunPython command that invokes this function passes two arguments, so this wrapper needs a different function signature.

Raises: IntegrityError if the database contains rows with duplicate UUIDs.
aiida.backends.djsite.db.migrations.0018_django_1_11.reverse_code(apps, schema_editor)[source]

Migration to reflect the name change of the built-in calculation entry points in the database.

class aiida.backends.djsite.db.migrations.0019_migrate_builtin_calculations.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Migration to remove entry point groups from process type strings and prefix unknown types with a marker.

__module__ = 'aiida.backends.djsite.db.migrations.0019_migrate_builtin_calculations'
dependencies = [('db', '0018_django_1_11')]
operations = [<RunSQL sql="\n UPDATE db_dbnode SET type = 'calculation.job.arithmetic.add.ArithmeticAddCalculation.'\n WHERE type = 'calculation.job.simpleplugins.arithmetic.add.ArithmeticAddCalculation.';\n\n UPDATE db_dbnode SET type = 'calculation.job.templatereplacer.TemplatereplacerCalculation.'\n WHERE type = 'calculation.job.simpleplugins.templatereplacer.TemplatereplacerCalculation.';\n\n UPDATE db_dbnode SET process_type = 'aiida.calculations:arithmetic.add'\n WHERE process_type = 'aiida.calculations:simpleplugins.arithmetic.add';\n\n UPDATE db_dbnode SET process_type = 'aiida.calculations:templatereplacer'\n WHERE process_type = 'aiida.calculations:simpleplugins.templatereplacer';\n\n UPDATE db_dbattribute AS a SET tval = 'arithmetic.add'\n FROM db_dbnode AS n WHERE a.dbnode_id = n.id\n AND a.key = 'input_plugin'\n AND a.tval = 'simpleplugins.arithmetic.add'\n AND n.type = 'data.code.Code.';\n\n UPDATE db_dbattribute AS a SET tval = 'templatereplacer'\n FROM db_dbnode AS n WHERE a.dbnode_id = n.id\n AND a.key = 'input_plugin'\n AND a.tval = 'simpleplugins.templatereplacer'\n AND n.type = 'data.code.Code.';\n ", reverse_sql="\n UPDATE db_dbnode SET type = 'calculation.job.simpleplugins.arithmetic.add.ArithmeticAddCalculation.'\n WHERE type = 'calculation.job.arithmetic.add.ArithmeticAddCalculation.';\n\n UPDATE db_dbnode SET type = 'calculation.job.simpleplugins.templatereplacer.TemplatereplacerCalculation.'\n WHERE type = 'calculation.job.templatereplacer.TemplatereplacerCalculation.';\n\n UPDATE db_dbnode SET process_type = 'aiida.calculations:simpleplugins.arithmetic.add'\n WHERE process_type = 'aiida.calculations:arithmetic.add';\n\n UPDATE db_dbnode SET process_type = 'aiida.calculations:simpleplugins.templatereplacer'\n WHERE process_type = 'aiida.calculations:templatereplacer';\n\n UPDATE db_dbattribute AS a SET tval = 'simpleplugins.arithmetic.add'\n FROM db_dbnode AS n WHERE a.dbnode_id = n.id\n AND a.key = 'input_plugin'\n AND a.tval = 
'arithmetic.add'\n AND n.type = 'data.code.Code.';\n\n UPDATE db_dbattribute AS a SET tval = 'simpleplugins.templatereplacer'\n FROM db_dbnode AS n WHERE a.dbnode_id = n.id\n AND a.key = 'input_plugin'\n AND a.tval = 'templatereplacer'\n AND n.type = 'data.code.Code.';\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.19') reverse_code=functools.partial(<function _update_schema_version>, '1.0.18')>]

Migration after the provenance redesign

class aiida.backends.djsite.db.migrations.0020_provenance_redesign.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Migration to effectuate changes introduced by the provenance redesign

This includes in order:

  • Rename the type column of process nodes
  • Remove illegal links
  • Rename link types

The exact reverse operation is not possible because the renaming of the type string of JobCalculation nodes is done in a lossy way. Originally, this type string contained the exact subclass of the JobCalculation, but in the migration this is changed to always be node.process.calculation.calcjob.CalcJobNode.. In the reverse operation, it can then only be reset to calculation.job.JobCalculation. and the information on the exact subclass is lost.
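The lossiness can be illustrated with a minimal pure-Python sketch of the forward and reverse type-string mapping (the PwCalculation type string below is a hypothetical example, not taken from the migration):

```python
# Minimal illustration of the lossy rename: every `calculation.job.*` subclass
# maps forward to the single CalcJobNode type string, so the reverse mapping
# can only restore the generic JobCalculation base type.

def forward(type_string):
    if type_string.startswith('calculation.job.'):
        return 'node.process.calculation.calcjob.CalcJobNode.'
    return type_string

def reverse(type_string):
    if type_string == 'node.process.calculation.calcjob.CalcJobNode.':
        return 'calculation.job.JobCalculation.'  # exact subclass is lost
    return type_string

# A hypothetical subclass type string does not round-trip:
round_trip = reverse(forward('calculation.job.quantumespresso.pw.PwCalculation.'))
```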

__module__ = 'aiida.backends.djsite.db.migrations.0020_provenance_redesign'
dependencies = [('db', '0019_migrate_builtin_calculations')]
operations = [<RunPython <function migrate_infer_calculation_entry_point> reverse_code=<function reverse_code>, atomic=True>, <RunPython <function detect_unexpected_links> reverse_code=<function reverse_code>, atomic=True>, <RunSQL "\n DELETE FROM db_dblink WHERE db_dblink.id IN (\n SELECT db_dblink.id FROM db_dblink\n INNER JOIN db_dbnode ON db_dblink.input_id = db_dbnode.id\n WHERE\n (db_dbnode.type LIKE 'calculation.job%' OR db_dbnode.type LIKE 'calculation.inline%')\n AND db_dblink.type = 'returnlink'\n ); -- Delete all outgoing RETURN links from JobCalculation and InlineCalculation nodes\n\n DELETE FROM db_dblink WHERE db_dblink.id IN (\n SELECT db_dblink.id FROM db_dblink\n INNER JOIN db_dbnode ON db_dblink.input_id = db_dbnode.id\n WHERE\n (db_dbnode.type LIKE 'calculation.job%' OR db_dbnode.type LIKE 'calculation.inline%')\n AND db_dblink.type = 'calllink'\n ); -- Delete all outgoing CALL links from JobCalculation and InlineCalculation nodes\n\n DELETE FROM db_dblink WHERE db_dblink.id IN (\n SELECT db_dblink.id FROM db_dblink\n INNER JOIN db_dbnode ON db_dblink.input_id = db_dbnode.id\n WHERE\n (db_dbnode.type LIKE 'calculation.function%' OR db_dbnode.type LIKE 'calculation.work%')\n AND db_dblink.type = 'createlink'\n ); -- Delete all outgoing CREATE links from FunctionCalculation and WorkCalculation nodes\n\n UPDATE db_dbnode SET type = 'calculation.work.WorkCalculation.'\n WHERE type = 'calculation.process.ProcessCalculation.';\n -- First migrate very old `ProcessCalculation` to `WorkCalculation`\n\n UPDATE db_dbnode SET type = 'node.process.workflow.workfunction.WorkFunctionNode.' 
FROM db_dbattribute\n WHERE db_dbattribute.dbnode_id = db_dbnode.id\n AND type = 'calculation.work.WorkCalculation.'\n AND db_dbattribute.key = 'function_name';\n -- WorkCalculations that have a `function_name` attribute are FunctionCalculations\n\n UPDATE db_dbnode SET type = 'node.process.workflow.workchain.WorkChainNode.'\n WHERE type = 'calculation.work.WorkCalculation.';\n -- Update type for `WorkCalculation` nodes - all what is left should be `WorkChainNodes`\n\n UPDATE db_dbnode SET type = 'node.process.calculation.calcjob.CalcJobNode.'\n WHERE type LIKE 'calculation.job.%'; -- Update type for JobCalculation nodes\n\n UPDATE db_dbnode SET type = 'node.process.calculation.calcfunction.CalcFunctionNode.'\n WHERE type = 'calculation.inline.InlineCalculation.'; -- Update type for InlineCalculation nodes\n\n UPDATE db_dbnode SET type = 'node.process.workflow.workfunction.WorkFunctionNode.'\n WHERE type = 'calculation.function.FunctionCalculation.'; -- Update type for FunctionCalculation nodes\n\n UPDATE db_dblink SET type = 'create' WHERE type = 'createlink'; -- Rename `createlink` to `create`\n UPDATE db_dblink SET type = 'return' WHERE type = 'returnlink'; -- Rename `returnlink` to `return`\n\n UPDATE db_dblink SET type = 'input_calc' FROM db_dbnode\n WHERE db_dblink.output_id = db_dbnode.id AND db_dbnode.type LIKE 'node.process.calculation%'\n AND db_dblink.type = 'inputlink';\n -- Rename `inputlink` to `input_calc` if the target node is a calculation type node\n\n UPDATE db_dblink SET type = 'input_work' FROM db_dbnode\n WHERE db_dblink.output_id = db_dbnode.id AND db_dbnode.type LIKE 'node.process.workflow%'\n AND db_dblink.type = 'inputlink';\n -- Rename `inputlink` to `input_work` if the target node is a workflow type node\n\n UPDATE db_dblink SET type = 'call_calc' FROM db_dbnode\n WHERE db_dblink.output_id = db_dbnode.id AND db_dbnode.type LIKE 'node.process.calculation%'\n AND db_dblink.type = 'calllink';\n -- Rename `calllink` to `call_calc` if the 
target node is a calculation type node\n\n UPDATE db_dblink SET type = 'call_work' FROM db_dbnode\n WHERE db_dblink.output_id = db_dbnode.id AND db_dbnode.type LIKE 'node.process.workflow%'\n AND db_dblink.type = 'calllink';\n -- Rename `calllink` to `call_work` if the target node is a workflow type node\n\n " reverse_sql="\n UPDATE db_dbnode SET type = 'calculation.job.JobCalculation.'\n WHERE type = 'node.process.calculation.calcjob.CalcJobNode.';\n\n UPDATE db_dbnode SET type = 'calculatison.inline.InlineCalculation.'\n WHERE type = 'node.process.calculation.calcfunction.CalcFunctionNode.';\n\n UPDATE db_dbnode SET type = 'calculation.function.FunctionCalculation.'\n WHERE type = 'node.process.workflow.workfunction.WorkFunctionNode.';\n\n UPDATE db_dbnode SET type = 'calculation.work.WorkCalculation.'\n WHERE type = 'node.process.workflow.workchain.WorkChainNode.';\n\n\n UPDATE db_dblink SET type = 'inputlink'\n WHERE type = 'input_call' OR type = 'input_work';\n\n UPDATE db_dblink SET type = 'calllink'\n WHERE type = 'call_call' OR type = 'call_work';\n\n UPDATE db_dblink SET type = 'createlink'\n WHERE type = 'create';\n\n UPDATE db_dblink SET type = 'returnlink'\n WHERE type = 'return';\n\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.20') reverse_code=functools.partial(<function _update_schema_version>, '1.0.19')>]

aiida.backends.djsite.db.migrations.0020_provenance_redesign.detect_unexpected_links(apps, schema_editor)[source]

Scan the database for any links that are unexpected.

The checks verify that there are no outgoing call or return links from calculation nodes, and that if a workflow node has a create link, it at least has an accompanying return link to the same data node, or a call link to a calculation node that takes the created data node as input.
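A pure-Python sketch of the first check (the link and node shapes are hypothetical stand-ins for the database rows; only the control flow is illustrated):

```python
# Sketch of the check that calculation nodes have no outgoing 'return' or
# 'call' links. Each link is a dict holding the pk of its source node;
# node_types maps pk to the (pre-migration) type string.

def find_unexpected_links(links, node_types):
    unexpected = []
    for link in links:
        source_type = node_types[link['input_id']]
        is_calculation = source_type.startswith(('calculation.job', 'calculation.inline'))
        if is_calculation and link['type'] in ('returnlink', 'calllink'):
            unexpected.append(link)
    return unexpected
```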

aiida.backends.djsite.db.migrations.0020_provenance_redesign.migrate_infer_calculation_entry_point(apps, schema_editor)[source]

Set the process type for calculation nodes by inferring it from their type string.

aiida.backends.djsite.db.migrations.0020_provenance_redesign.reverse_code(apps, schema_editor)[source]

Reversing the inference of the process type is not possible and not necessary.

Migration that renames name and type columns to label and type_string

class aiida.backends.djsite.db.migrations.0021_dbgroup_name_to_label_type_to_type_string.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Migration that renames name and type columns to label and type_string

__module__ = 'aiida.backends.djsite.db.migrations.0021_dbgroup_name_to_label_type_to_type_string'
dependencies = [('db', '0020_provenance_redesign')]
operations = [<RenameField model_name='dbgroup', old_name='name', new_name='label'>, <RenameField model_name='dbgroup', old_name='type', new_name='type_string'>, <AlterUniqueTogether name='dbgroup', unique_together={('label', 'type_string')}>, <RunPython functools.partial(<function _update_schema_version>, '1.0.21') reverse_code=functools.partial(<function _update_schema_version>, '1.0.20')>]

Migration after the update of group_types

class aiida.backends.djsite.db.migrations.0022_dbgroup_type_string_change_content.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Migration after the update of group_types

__module__ = 'aiida.backends.djsite.db.migrations.0022_dbgroup_type_string_change_content'
dependencies = [('db', '0021_dbgroup_name_to_label_type_to_type_string')]
operations = [<RunSQL sql="UPDATE db_dbgroup SET type_string = 'user' WHERE type_string = '';\nUPDATE db_dbgroup SET type_string = 'data.upf' WHERE type_string = 'data.upf.family';\nUPDATE db_dbgroup SET type_string = 'auto.import' WHERE type_string = 'aiida.import';\nUPDATE db_dbgroup SET type_string = 'auto.run' WHERE type_string = 'autogroup.run';", reverse_sql="UPDATE db_dbgroup SET type_string = '' WHERE type_string = 'user';\nUPDATE db_dbgroup SET type_string = 'data.upf.family' WHERE type_string = 'data.upf';\nUPDATE db_dbgroup SET type_string = 'aiida.import' WHERE type_string = 'auto.import';\nUPDATE db_dbgroup SET type_string = 'autogroup.run' WHERE type_string = 'auto.run';">, <RunPython functools.partial(<function _update_schema_version>, '1.0.22') reverse_code=functools.partial(<function _update_schema_version>, '1.0.21')>]

Migration of ProcessNode attributes for metadata options whose key changed.

class aiida.backends.djsite.db.migrations.0023_calc_job_option_attribute_keys.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Migration of ProcessNode attributes for metadata options whose key changed.

Renamed attribute keys:

  • custom_environment_variables -> environment_variables (CalcJobNode)
  • jobresource_params -> resources (CalcJobNode)
  • _process_label -> process_label (ProcessNode)
  • parser -> parser_name (CalcJobNode)
Deleted attributes:
  • linkname_retrieved (we do not actually delete it, just in case something relies on it)
__module__ = 'aiida.backends.djsite.db.migrations.0023_calc_job_option_attribute_keys'
dependencies = [('db', '0022_dbgroup_type_string_change_content')]
operations = [<RunSQL sql="\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^custom_environment_variables', 'environment_variables')\n FROM db_dbnode AS node\n WHERE\n (\n attribute.key = 'custom_environment_variables' OR\n attribute.key LIKE 'custom\\_environment\\_variables.%'\n ) AND\n node.type = 'node.process.calculation.calcjob.CalcJobNode.' AND\n node.id = attribute.dbnode_id;\n -- custom_environment_variables -> environment_variables\n\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^jobresource_params', 'resources')\n FROM db_dbnode AS node\n WHERE\n (\n attribute.key = 'jobresource_params' OR\n attribute.key LIKE 'jobresource\\_params.%'\n ) AND\n node.type = 'node.process.calculation.calcjob.CalcJobNode.' AND\n node.id = attribute.dbnode_id;\n -- jobresource_params -> resources\n\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^_process_label', 'process_label')\n FROM db_dbnode AS node\n WHERE\n attribute.key = '_process_label' AND\n node.type LIKE 'node.process.%' AND\n node.id = attribute.dbnode_id;\n -- _process_label -> process_label\n\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^parser', 'parser_name')\n FROM db_dbnode AS node\n WHERE\n attribute.key = 'parser' AND\n node.type = 'node.process.calculation.calcjob.CalcJobNode.' AND\n node.id = attribute.dbnode_id;\n -- parser -> parser_name\n ", reverse_sql="\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^environment_variables', 'custom_environment_variables')\n FROM db_dbnode AS node\n WHERE\n (\n attribute.key = 'environment_variables' OR\n attribute.key LIKE 'environment\\_variables.%'\n ) AND\n node.type = 'node.process.calculation.calcjob.CalcJobNode.' 
AND\n node.id = attribute.dbnode_id;\n -- environment_variables -> custom_environment_variables\n\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^resources', 'jobresource_params')\n FROM db_dbnode AS node\n WHERE\n (\n attribute.key = 'resources' OR\n attribute.key LIKE 'resources.%'\n ) AND\n node.type = 'node.process.calculation.calcjob.CalcJobNode.' AND\n node.id = attribute.dbnode_id;\n -- resources -> jobresource_params\n\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^process_label', '_process_label')\n FROM db_dbnode AS node\n WHERE\n attribute.key = 'process_label' AND\n node.type LIKE 'node.process.%' AND\n node.id = attribute.dbnode_id;\n -- process_label -> _process_label\n\n UPDATE db_dbattribute AS attribute\n SET key = regexp_replace(attribute.key, '^parser_name', 'parser')\n FROM db_dbnode AS node\n WHERE\n attribute.key = 'parser_name' AND\n node.type = 'node.process.calculation.calcjob.CalcJobNode.' AND\n node.id = attribute.dbnode_id;\n -- parser_name -> parser\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.23') reverse_code=functools.partial(<function _update_schema_version>, '1.0.22')>]
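The rename applied by the SQL above can be sketched in pure Python. Note that a key must be renamed both when it matches exactly and when it prefixes a nested sub-key separated by '.', which is why the SQL matches both `key = '...'` and `key LIKE '....%'`:

```python
import re

# Sketch of the attribute-key rename: rename the key when it matches exactly
# or when it prefixes a nested sub-key (separated by '.'), but never when it
# merely shares a prefix with an unrelated key.

def rename_attribute_key(key, old, new):
    return re.sub('^' + re.escape(old) + r'(?=$|\.)', new, key)
```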

Migration for the update of the DbLog table and the addition of UUIDs.

class aiida.backends.djsite.db.migrations.0024_dblog_update.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

This migration updates the DbLog schema and adds UUIDs for the correct export of DbLog entries. More specifically, it adds UUIDs, exports to files the log entries that are no longer needed (those corresponding to legacy workflows and unknown entities), creates a foreign key to the dbnode table, and transfers the objpk data to the new dbnode column (simply altering the objpk column and making it a foreign key while it contained data raised problems); finally, the objpk and objname columns are removed.

__module__ = 'aiida.backends.djsite.db.migrations.0024_dblog_update'
dependencies = [('db', '0023_calc_job_option_attribute_keys')]
operations = [<RunPython <function export_and_clean_workflow_logs> reverse_code=<function RunPython.noop>>, <RunPython <function clean_dblog_metadata> reverse_code=<function enrich_dblog_metadata>>, <RunSQL '' reverse_sql='UPDATE db_dblog SET objname=db_dbnode.type FROM db_dbnode WHERE db_dbnode.id = db_dblog.objpk'>, <RemoveField model_name='dblog', name='objname'>, <AddField model_name='dblog', name='dbnode', field=<django.db.models.fields.related.ForeignKey>>, <RunSQL 'UPDATE db_dblog SET dbnode_id=objpk' reverse_sql='UPDATE db_dblog SET objpk=dbnode_id'>, <AlterField model_name='dblog', name='dbnode', field=<django.db.models.fields.related.ForeignKey>>, <RemoveField model_name='dblog', name='objpk'>, <AddField model_name='dblog', name='uuid', field=<django.db.models.fields.UUIDField>>, <RunPython <function set_new_uuid> reverse_code=<function RunPython.noop>>, <AlterField model_name='dblog', name='uuid', field=<django.db.models.fields.UUIDField>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.24') reverse_code=functools.partial(<function _update_schema_version>, '1.0.23')>]
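The per-row data transfer described above can be sketched over a plain dict standing in for a DbLog row (column names follow the text above; the metadata handling is a hypothetical simplification, not the actual migration code):

```python
import uuid

# Sketch of the per-row transformation: copy objpk into the new dbnode_id
# column, drop objname, assign a fresh UUID, and strip the now-redundant
# objpk/objname keys from the metadata.

def migrate_log_row(row):
    row = dict(row)
    row['dbnode_id'] = row.pop('objpk')
    row.pop('objname', None)
    row['uuid'] = str(uuid.uuid4())
    row['metadata'] = {key: value for key, value in row.get('metadata', {}).items()
                       if key not in ('objpk', 'objname')}
    return row
```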
aiida.backends.djsite.db.migrations.0024_dblog_update.clean_dblog_metadata(apps, _)[source]

Remove objpk and objname from the DbLog table metadata.

aiida.backends.djsite.db.migrations.0024_dblog_update.enrich_dblog_metadata(apps, _)[source]

Add objpk and objname to the DbLog table metadata.

aiida.backends.djsite.db.migrations.0024_dblog_update.export_and_clean_workflow_logs(apps, schema_editor)[source]

Export the logs records that correspond to legacy workflows and to unknown entities.

aiida.backends.djsite.db.migrations.0024_dblog_update.get_legacy_workflow_log_number(schema_editor)[source]

Get the number of log records that correspond to legacy workflows.

aiida.backends.djsite.db.migrations.0024_dblog_update.get_logs_with_no_nodes_number(schema_editor)[source]

Get the number of log records that do not correspond to a node.

aiida.backends.djsite.db.migrations.0024_dblog_update.get_serialized_legacy_workflow_logs(schema_editor)[source]

Get the serialized log records that correspond to legacy workflows.

aiida.backends.djsite.db.migrations.0024_dblog_update.get_serialized_logs_with_no_nodes(schema_editor)[source]

Get the serialized log records that do not correspond to a node.

aiida.backends.djsite.db.migrations.0024_dblog_update.get_serialized_unknown_entity_logs(schema_editor)[source]

Get the serialized log records that correspond to unknown entities.

aiida.backends.djsite.db.migrations.0024_dblog_update.get_unknown_entity_log_number(schema_editor)[source]

Get the number of log records that correspond to unknown entities.

aiida.backends.djsite.db.migrations.0024_dblog_update.set_new_uuid(apps, _)[source]

Set new UUIDs for all logs.

Data migration for Data nodes after they were moved into the aiida.orm.node module, changing the type string.

class aiida.backends.djsite.db.migrations.0025_move_data_within_node_module.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Data migration for Data nodes after they were moved into the aiida.orm.node module, changing the type string.

__module__ = 'aiida.backends.djsite.db.migrations.0025_move_data_within_node_module'
dependencies = [('db', '0024_dblog_update')]
operations = [<RunSQL sql="\n UPDATE db_dbnode\n SET type = regexp_replace(type, '^data.', 'node.data.')\n WHERE type LIKE 'data.%'\n ", reverse_sql="\n UPDATE db_dbnode\n SET type = regexp_replace(type, '^node.data.', 'data.')\n WHERE type LIKE 'node.data.%'\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.25') reverse_code=functools.partial(<function _update_schema_version>, '1.0.24')>]

Data migration for TrajectoryData nodes where symbol lists are moved from repository array to attribute.

This process has to be done in two separate consecutive migrations to prevent data loss in between.

class aiida.backends.djsite.db.migrations.0026_trajectory_symbols_to_attribute.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Store symbols in TrajectoryData nodes as attributes, while keeping the numpy arrays. The TrajectoryData symbols arrays are deleted in the next migration. We split the migration in two because every migration is wrapped in an atomic transaction, and we want to avoid deleting the data while it is being written to the database.

__module__ = 'aiida.backends.djsite.db.migrations.0026_trajectory_symbols_to_attribute'
dependencies = [('db', '0025_move_data_within_node_module')]
operations = [<RunPython <function create_trajectory_symbols_attribute> reverse_code=<function delete_trajectory_symbols_attribute>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.26') reverse_code=functools.partial(<function _update_schema_version>, '1.0.25')>]
aiida.backends.djsite.db.migrations.0026_trajectory_symbols_to_attribute.create_trajectory_symbols_attribute(apps, _)[source]

Create the symbols attribute from the repository array for all TrajectoryData nodes.

aiida.backends.djsite.db.migrations.0026_trajectory_symbols_to_attribute.delete_trajectory_symbols_attribute(apps, _)[source]

Delete the symbols attribute for all TrajectoryData nodes.
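The two-step pattern used by migrations 0026 and 0027 can be sketched with plain dicts standing in for TrajectoryData nodes (each step would run in its own atomic transaction; the data shapes are hypothetical):

```python
# Sketch of the two-phase migration: the first step duplicates the symbols
# into an attribute, the second deletes the array. Because each step commits
# separately, a failure in the second step can never lose data that the first
# has not already copied.

def copy_symbols_to_attribute(node):   # forward of migration 0026
    node['attributes']['symbols'] = list(node['arrays']['symbols'])

def delete_symbols_array(node):        # forward of migration 0027
    del node['arrays']['symbols']

node = {'arrays': {'symbols': ['H', 'O', 'H']}, 'attributes': {}}
copy_symbols_to_attribute(node)
delete_symbols_array(node)
```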

Data migration for TrajectoryData nodes where symbol lists are moved from repository array to attribute.

This process has to be done in two separate consecutive migrations to prevent data loss in between.

class aiida.backends.djsite.db.migrations.0027_delete_trajectory_symbols_array.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Delete the duplicated information stored in the TrajectoryData symbols numpy arrays.

__module__ = 'aiida.backends.djsite.db.migrations.0027_delete_trajectory_symbols_array'
dependencies = [('db', '0026_trajectory_symbols_to_attribute')]
operations = [<RunPython <function delete_trajectory_symbols_array> reverse_code=<function create_trajectory_symbols_array>>, <RunPython functools.partial(<function _update_schema_version>, '1.0.27') reverse_code=functools.partial(<function _update_schema_version>, '1.0.26')>]
aiida.backends.djsite.db.migrations.0027_delete_trajectory_symbols_array.create_trajectory_symbols_array(apps, _)[source]

Create the symbols array for all TrajectoryData nodes.

aiida.backends.djsite.db.migrations.0027_delete_trajectory_symbols_array.delete_trajectory_symbols_array(apps, _)[source]

Delete the symbols array from all TrajectoryData nodes.

Final data migration for nodes, removing the node. prefix now that the aiida.orm.nodes reorganization has been finalized.

class aiida.backends.djsite.db.migrations.0028_remove_node_prefix.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

All node subclasses now live in aiida.orm.nodes, so the node. prefix can be removed.

__module__ = 'aiida.backends.djsite.db.migrations.0028_remove_node_prefix'
dependencies = [('db', '0027_delete_trajectory_symbols_array')]
operations = [<RunSQL sql="\n UPDATE db_dbnode\n SET type = regexp_replace(type, '^node.data.', 'data.')\n WHERE type LIKE 'node.data.%';\n\n UPDATE db_dbnode\n SET type = regexp_replace(type, '^node.process.', 'process.')\n WHERE type LIKE 'node.process.%';\n ", reverse_sql="\n UPDATE db_dbnode\n SET type = regexp_replace(type, '^data.', 'node.data.')\n WHERE type LIKE 'data.%';\n\n UPDATE db_dbnode\n SET type = regexp_replace(type, '^process.', 'node.process.')\n WHERE type LIKE 'process.%';\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.28') reverse_code=functools.partial(<function _update_schema_version>, '1.0.27')>]

Data migration after ParameterData was renamed to Dict.

class aiida.backends.djsite.db.migrations.0029_rename_parameter_data_to_dict.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Data migration after ParameterData was renamed to Dict.

__module__ = 'aiida.backends.djsite.db.migrations.0029_rename_parameter_data_to_dict'
dependencies = [('db', '0028_remove_node_prefix')]
operations = [<RunSQL sql="UPDATE db_dbnode SET type = 'data.dict.Dict.' WHERE type = 'data.parameter.ParameterData.';", reverse_sql="\n UPDATE db_dbnode SET type = 'data.parameter.ParameterData.' WHERE type = 'data.dict.Dict.';\n ">, <RunPython functools.partial(<function _update_schema_version>, '1.0.29') reverse_code=functools.partial(<function _update_schema_version>, '1.0.28')>]

Renaming DbNode.type to DbNode.node_type

class aiida.backends.djsite.db.migrations.0030_dbnode_type_to_dbnode_node_type.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Renaming DbNode.type to DbNode.node_type

__module__ = 'aiida.backends.djsite.db.migrations.0030_dbnode_type_to_dbnode_node_type'
dependencies = [('db', '0029_rename_parameter_data_to_dict')]
operations = [<RenameField model_name='dbnode', old_name='type', new_name='node_type'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.30') reverse_code=functools.partial(<function _update_schema_version>, '1.0.29')>]

Remove DbComputer.enabled

class aiida.backends.djsite.db.migrations.0031_remove_dbcomputer_enabled.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Remove DbComputer.enabled

__module__ = 'aiida.backends.djsite.db.migrations.0031_remove_dbcomputer_enabled'
dependencies = [('db', '0030_dbnode_type_to_dbnode_node_type')]
operations = [<RemoveField model_name='dbcomputer', name='enabled'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.31') reverse_code=functools.partial(<function _update_schema_version>, '1.0.30')>]

Remove legacy workflows.

class aiida.backends.djsite.db.migrations.0032_remove_legacy_workflows.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Remove legacy workflows.

__module__ = 'aiida.backends.djsite.db.migrations.0032_remove_legacy_workflows'
dependencies = [('db', '0031_remove_dbcomputer_enabled')]
operations = [<RunPython <function export_workflow_data> reverse_code=<function RunPython.noop>>, <RemoveField model_name='dbworkflow', name='user'>, <AlterUniqueTogether name='dbworkflowdata', unique_together=set()>, <RemoveField model_name='dbworkflowdata', name='aiida_obj'>, <RemoveField model_name='dbworkflowdata', name='parent'>, <AlterUniqueTogether name='dbworkflowstep', unique_together=set()>, <RemoveField model_name='dbworkflowstep', name='calculations'>, <RemoveField model_name='dbworkflowstep', name='parent'>, <RemoveField model_name='dbworkflowstep', name='sub_workflows'>, <RemoveField model_name='dbworkflowstep', name='user'>, <DeleteModel name='DbWorkflow'>, <DeleteModel name='DbWorkflowData'>, <DeleteModel name='DbWorkflowStep'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.32') reverse_code=functools.partial(<function _update_schema_version>, '1.0.31')>]
aiida.backends.djsite.db.migrations.0032_remove_legacy_workflows.export_workflow_data(apps, _)[source]

Export existing legacy workflow data to a JSON file.
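Conceptually, `export_workflow_data` serializes the legacy workflow rows to a JSON file before the tables are deleted. A minimal sketch of that idea, under the assumption that each row has already been reduced to a plain dict (the helper name and row structure are hypothetical):

```python
import json

def export_rows_to_json(rows, path):
    """Dump a list of row dicts to ``path`` as JSON; return the row count."""
    with open(path, 'w', encoding='utf8') as handle:
        json.dump(rows, handle)
    return len(rows)
```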

Drop the columns nodeversion and public from the DbNode model.

class aiida.backends.djsite.db.migrations.0034_drop_node_columns_nodeversion_public.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Drop the columns nodeversion and public from the DbNode model.

__module__ = 'aiida.backends.djsite.db.migrations.0034_drop_node_columns_nodeversion_public'
dependencies = [('db', '0033_replace_text_field_with_json_field')]
operations = [<RemoveField model_name='dbnode', name='nodeversion'>, <RemoveField model_name='dbnode', name='public'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.34') reverse_code=functools.partial(<function _update_schema_version>, '1.0.33')>]

Simplify the DbUser model.

class aiida.backends.djsite.db.migrations.0035_simplify_user_model.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Simplify the DbUser model by dropping unused columns.

__module__ = 'aiida.backends.djsite.db.migrations.0035_simplify_user_model'
dependencies = [('db', '0034_drop_node_columns_nodeversion_public')]
operations = [<AlterField model_name='dbuser', name='password', field=<django.db.models.fields.CharField>>, <RemoveField model_name='dbuser', name='password'>, <RemoveField model_name='dbuser', name='date_joined'>, <RemoveField model_name='dbuser', name='groups'>, <RemoveField model_name='dbuser', name='is_active'>, <RemoveField model_name='dbuser', name='is_staff'>, <AlterField model_name='dbuser', name='is_superuser', field=<django.db.models.fields.BooleanField>>, <RemoveField model_name='dbuser', name='is_superuser'>, <RemoveField model_name='dbuser', name='last_login'>, <RemoveField model_name='dbuser', name='user_permissions'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.35') reverse_code=functools.partial(<function _update_schema_version>, '1.0.34')>]

Drop the transport_params from the Computer database model.

class aiida.backends.djsite.db.migrations.0036_drop_computer_transport_params.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Drop the transport_params from the Computer database model.

__module__ = 'aiida.backends.djsite.db.migrations.0036_drop_computer_transport_params'
dependencies = [('db', '0035_simplify_user_model')]
operations = [<RemoveField model_name='dbcomputer', name='transport_params'>, <RunPython functools.partial(<function _update_schema_version>, '1.0.36') reverse_code=functools.partial(<function _update_schema_version>, '1.0.35')>]

Data migration for legacy JobCalculations.

These old nodes have already been migrated to the correct CalcJobNode type in a previous migration, but they can still contain a state attribute with a deprecated JobCalcState value, and they are missing values for process_state, process_status, process_label and exit_status. The process_label is impossible to infer consistently in SQL, so it will be omitted. The others will be mapped from the state attribute as follows:

Old state            | Process state  | Exit status | Process status
---------------------|----------------|-------------|----------------------------------------------------------
`NEW`                | `killed`       |    `None`   | Legacy `JobCalculation` with state `NEW`
`TOSUBMIT`           | `killed`       |    `None`   | Legacy `JobCalculation` with state `TOSUBMIT`
`SUBMITTING`         | `killed`       |    `None`   | Legacy `JobCalculation` with state `SUBMITTING`
`WITHSCHEDULER`      | `killed`       |    `None`   | Legacy `JobCalculation` with state `WITHSCHEDULER`
`COMPUTED`           | `killed`       |    `None`   | Legacy `JobCalculation` with state `COMPUTED`
`RETRIEVING`         | `killed`       |    `None`   | Legacy `JobCalculation` with state `RETRIEVING`
`PARSING`            | `killed`       |    `None`   | Legacy `JobCalculation` with state `PARSING`
`SUBMISSIONFAILED`   | `excepted`     |    `None`   | Legacy `JobCalculation` with state `SUBMISSIONFAILED`
`RETRIEVALFAILED`    | `excepted`     |    `None`   | Legacy `JobCalculation` with state `RETRIEVALFAILED`
`PARSINGFAILED`      | `excepted`     |    `None`   | Legacy `JobCalculation` with state `PARSINGFAILED`
`FAILED`             | `finished`     |      2      |  -
`FINISHED`           | `finished`     |      0      |  -
`IMPORTED`           |       -        |      -      |  -

Note that the IMPORTED state was never actually stored in the state attribute, so we do not have to consider it. The old state attribute has to be removed after the data is migrated, because its value is no longer valid or useful.

Note: in addition to the three attributes mentioned in the table, all matched nodes will get Legacy JobCalculation as their process_label, which is one of the default columns of verdi process list.
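The mapping table above can be rendered as a small pure-Python sketch. The real migration performs these updates directly with jsonb operators in SQL; the dict and function names here are illustrative only.

```python
# Old state -> (process_state, exit_status); exit_status None means the
# node instead gets a descriptive process_status (see the table above).
STATE_MAPPING = {
    'NEW': ('killed', None),
    'TOSUBMIT': ('killed', None),
    'SUBMITTING': ('killed', None),
    'WITHSCHEDULER': ('killed', None),
    'COMPUTED': ('killed', None),
    'RETRIEVING': ('killed', None),
    'PARSING': ('killed', None),
    'SUBMISSIONFAILED': ('excepted', None),
    'RETRIEVALFAILED': ('excepted', None),
    'PARSINGFAILED': ('excepted', None),
    'FAILED': ('finished', 2),
    'FINISHED': ('finished', 0),
}

def migrate_job_calc_attributes(attributes):
    """Return a migrated copy of a node's attributes: drop the legacy
    ``state`` and add the new process attributes per the table above."""
    attributes = dict(attributes)
    state = attributes.pop('state', None)
    mapping = STATE_MAPPING.get(state)
    if mapping is None:  # no (valid) legacy state: nothing to do
        return attributes
    process_state, exit_status = mapping
    attributes['process_state'] = process_state
    attributes['process_label'] = 'Legacy JobCalculation'
    if exit_status is not None:
        attributes['exit_status'] = exit_status
    else:
        attributes['process_status'] = 'Legacy `JobCalculation` with state `{}`'.format(state)
    return attributes
```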

class aiida.backends.djsite.db.migrations.0038_data_migration_legacy_job_calculations.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Data migration for legacy JobCalculations.

__module__ = 'aiida.backends.djsite.db.migrations.0038_data_migration_legacy_job_calculations'
dependencies = [('db', '0037_attributes_extras_settings_json')]
operations = [<RunSQL sql='\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `NEW`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "NEW"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `TOSUBMIT`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "TOSUBMIT"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `SUBMITTING`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "SUBMITTING"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `WITHSCHEDULER`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "WITHSCHEDULER"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `COMPUTED`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "COMPUTED"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `RETRIEVING`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "RETRIEVING"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || 
\'{"process_state": "killed", "process_status": "Legacy `JobCalculation` with state `PARSING`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "PARSING"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "excepted", "process_status": "Legacy `JobCalculation` with state `SUBMISSIONFAILED`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "SUBMISSIONFAILED"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "excepted", "process_status": "Legacy `JobCalculation` with state `RETRIEVALFAILED`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "RETRIEVALFAILED"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "excepted", "process_status": "Legacy `JobCalculation` with state `PARSINGFAILED`", "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "PARSINGFAILED"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "finished", "exit_status": 2, "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "FAILED"}\';\n UPDATE db_dbnode\n SET attributes = attributes - \'state\' || \'{"process_state": "finished", "exit_status": 0, "process_label": "Legacy JobCalculation"}\'\n WHERE node_type = \'process.calculation.calcjob.CalcJobNode.\' AND attributes @> \'{"state": "FINISHED"}\';\n ', reverse_sql=''>, <RunPython functools.partial(<function _update_schema_version>, '1.0.38') reverse_code=functools.partial(<function _update_schema_version>, '1.0.37')>]

Invalidate the node hash; users should rehash nodes to re-enable caching.

class aiida.backends.djsite.db.migrations.0039_reset_hash.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Invalidate the node hash; users should rehash nodes to re-enable caching.

__module__ = 'aiida.backends.djsite.db.migrations.0039_reset_hash'
dependencies = [('db', '0038_data_migration_legacy_job_calculations')]
operations = [<RunPython <function notify_user> reverse_code=<function notify_user>>, <RunSQL "UPDATE db_dbnode SET extras = extras #- '{_aiida_hash}'::text[];" reverse_sql="UPDATE db_dbnode SET extras = extras #- '{_aiida_hash}'::text[];">, <RunPython functools.partial(<function _update_schema_version>, '1.0.39') reverse_code=functools.partial(<function _update_schema_version>, '1.0.38')>]
aiida.backends.djsite.db.migrations.0039_reset_hash.notify_user(apps, schema_editor)[source]
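The migration's SQL uses the jsonb `#-` operator to delete the `_aiida_hash` key from each node's extras. A minimal Python equivalent (the key name is taken from the migration; the helper name is illustrative):

```python
def reset_hash(extras):
    """Return a copy of ``extras`` without the ``_aiida_hash`` key."""
    return {key: value for key, value in extras.items() if key != '_aiida_hash'}
```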

Data migration for some legacy process attributes.

Attribute keys that are renamed:

  • _sealed -> sealed

Attribute keys that are removed entirely:

  • _finished
  • _failed
  • _aborted
  • _do_abort

Finally, after these first migrations, any remaining process node that still does not have a sealed attribute has it set to True, excluding nodes whose process_state attribute is one of the active states (created, running or waiting), since those are valid active processes that are not yet sealed.
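The three steps above (rename, remove, backfill sealed) can be sketched in pure Python; the real migration operates on jsonb columns directly in PostgreSQL, and the function name here is illustrative. For simplicity this sketch treats a node with no process_state attribute as inactive.

```python
ACTIVE_STATES = ('created', 'running', 'waiting')

def migrate_process_attributes(attributes):
    """Return a migrated copy of a process node's attributes."""
    attributes = dict(attributes)
    # Rename _sealed -> sealed.
    if '_sealed' in attributes:
        attributes['sealed'] = attributes.pop('_sealed')
    # Remove the obsolete keys entirely.
    for key in ('_finished', '_failed', '_aborted', '_do_abort'):
        attributes.pop(key, None)
    # Backfill sealed=True for nodes that are not in an active state.
    if 'sealed' not in attributes and attributes.get('process_state') not in ACTIVE_STATES:
        attributes['sealed'] = True
    return attributes
```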

class aiida.backends.djsite.db.migrations.0040_data_migration_legacy_process_attributes.Migration(name, app_label)[source]

Bases: django.db.migrations.migration.Migration

Data migration for legacy process attributes.

__module__ = 'aiida.backends.djsite.db.migrations.0040_data_migration_legacy_process_attributes'
dependencies = [('db', '0039_reset_hash')]
operations = [<RunSQL sql='\n UPDATE db_dbnode\n SET attributes = jsonb_set(attributes, \'{"sealed"}\', attributes->\'_sealed\')\n WHERE attributes ? \'_sealed\' AND node_type LIKE \'process.%\';\n -- Copy `_sealed` -> `sealed`\n\n UPDATE db_dbnode SET attributes = attributes - \'_sealed\'\n WHERE attributes ? \'_sealed\' AND node_type LIKE \'process.%\';\n -- Delete `_sealed`\n\n UPDATE db_dbnode SET attributes = attributes - \'_finished\'\n WHERE attributes ? \'_finished\' AND node_type LIKE \'process.%\';\n -- Delete `_finished`\n\n UPDATE db_dbnode SET attributes = attributes - \'_failed\'\n WHERE attributes ? \'_failed\' AND node_type LIKE \'process.%\';\n -- Delete `_failed`\n\n UPDATE db_dbnode SET attributes = attributes - \'_aborted\'\n WHERE attributes ? \'_aborted\' AND node_type LIKE \'process.%\';\n -- Delete `_aborted`\n\n UPDATE db_dbnode SET attributes = attributes - \'_do_abort\'\n WHERE attributes ? \'_do_abort\' AND node_type LIKE \'process.%\';\n -- Delete `_do_abort`\n\n UPDATE db_dbnode\n SET attributes = jsonb_set(attributes, \'{"sealed"}\', to_jsonb(True))\n WHERE\n node_type LIKE \'process.%\' AND\n NOT (attributes ? \'sealed\') AND\n attributes->>\'process_state\' NOT IN (\'created\', \'running\', \'waiting\');\n -- Set `sealed=True` for process nodes that do not yet have a `sealed` attribute AND are not in an active state\n ', reverse_sql=''>, <RunPython functools.partial(<function _update_schema_version>, '1.0.40') reverse_code=functools.partial(<function _update_schema_version>, '1.0.39')>]