aiida.orm.implementation.general package

Submodules

class aiida.orm.implementation.general.code.AbstractCode(**kwargs)[source]

Bases: aiida.orm.implementation.django.node.Node

A code entity. It can either be ‘local’, or ‘remote’.

  • Local code: it is a collection of files/dirs (added using the add_path() method), where one file is flagged as executable (using the set_local_executable() method).
  • Remote code: it is a pair (remotecomputer, remotepath_of_executable) set using the set_remote_computer_exec() method.

For both types of code, one can specify code to be executed right before or right after the execution of the code itself, using the set_prepend_text() and set_append_text() methods (e.g., set_prepend_text() can be used to load specific modules required for the code to run).
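As an illustrative sketch (a stand-in, not AiiDA's actual script generation; the function name is hypothetical), the prepend/append texts conceptually wrap the execution line in the submission script:

```python
# Hypothetical helper: shows how prepend/append texts conceptually wrap
# the execution line of a code in a submission script.
def build_script_section(prepend_text, execname, append_text):
    """Assemble the script fragment for one code execution."""
    lines = []
    if prepend_text:
        lines.append(prepend_text)  # e.g. 'module load quantum-espresso'
    lines.append(execname)          # './pw.x' (local) or '/usr/bin/pw.x' (remote)
    if append_text:
        lines.append(append_text)   # e.g. cleanup commands
    return "\n".join(lines)
```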

HIDDEN_KEY = 'hidden'
__abstractmethods__ = frozenset(['set_remote_computer_exec', 'can_run_on', 'get', 'list_for_plugin', '_set_local', 'get_from_string'])
__module__ = 'aiida.orm.implementation.general.code'
__str__() <==> str(x)[source]
_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>
_hide()[source]

Hide the code (prevents it from being shown in the verdi code list)

_init_internal_params()[source]

This function is called by the init method

_is_hidden()[source]

Determines whether the Code is hidden or not

_linking_as_output(dest, link_type)[source]

Raise a ValueError if a link from self to dest is not allowed.

An output of a code can only be a calculation

_logger = <logging.Logger object>
_plugin_type_string = 'code.AbstractCode.'
_query_type_string = 'code.'
_reveal()[source]

Reveal the code (allows it to be shown in the verdi code list). By default, a code is revealed.

_set_local()[source]

Set the code as a ‘local’ code, meaning that all the files belonging to the code will be copied to the cluster, and the file set with set_local_executable will be run.

It also deletes the flags related to the remote case (if any)

_set_remote()[source]

Set the code as a ‘remote’ code, meaning that the code itself has no files attached, but only a location on a remote computer (with an absolute path of the executable on the remote computer).

It also deletes the flags related to the local case (if any)

_validate()[source]

Check if the attributes and files retrieved from the DB are valid. Raise a ValidationError if something is wrong.

Must be able to work even before storing: therefore, use the get_attr and similar methods that automatically read either from the DB or from the internal attribute cache.

For the base class, this is always valid. Subclasses will reimplement this. In the subclass, always call the super()._validate() method first!

add_link_from(src, label=None, link_type=LinkType.UNSPECIFIED)[source]

Add a link to the current node from the ‘src’ node. Both nodes must be a Node instance (or a subclass of Node). :note: In subclasses, change only this. Moreover, remember to call the super() method in order to properly use the caching logic!

Parameters:
  • src – the source object
  • label (str) – the name of the label to set the link from src. Default = None.
  • link_type – The type of link, must be one of the enum values from LinkType
can_run_on(computer)[source]

Return True if this code can run on the given computer, False otherwise.

Local codes can run on any machine; remote codes can run only on the machine on which they reside.

TODO: add filters to mask the remote machines on which a local code can run.
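The rule described above can be sketched with a stand-in class (not AiiDA's actual implementation; names are hypothetical):

```python
# Stand-in sketch of the can_run_on rule: local codes run anywhere,
# remote codes only on the computer on which they reside.
class CodeSketch:
    def __init__(self, is_local, remote_computer=None):
        self._is_local = is_local
        self._remote_computer = remote_computer

    def can_run_on(self, computer):
        if self._is_local:
            return True
        return computer == self._remote_computer
```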

full_text_info

Return a (multiline) string with human-readable, detailed information about this code.

classmethod get(pk=None, label=None, machinename=None)[source]

Get a Code object with a given identifier, which can either be the numeric ID (pk), or the label (and machine name), if unique.

Parameters:
  • pk – the numeric ID (pk) for code
  • label – the code label identifying the code to load
  • machinename – the machine name where code is setup
Raises:
get_append_text()[source]

Return the code that will be put in the scheduler script after the execution, or an empty string if no append text was defined.

classmethod get_code_helper(label, machinename=None)[source]
Parameters:
  • label – the code label identifying the code to load
  • machinename – the machine name where code is setup
Raises:
get_desc()[source]

Return a string with information retrieved from the node’s properties.

get_execname()[source]

Return the executable string to be put in the script. For local codes, it is ./LOCAL_EXECUTABLE_NAME For remote codes, it is the absolute path to the executable.
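The execname rule stated above can be sketched as follows (a stand-in function, not AiiDA's code; argument names are hypothetical):

```python
# Sketch of the execname rule: local codes are invoked relative to the
# working directory, remote codes via their absolute path.
def get_execname(is_local, local_executable=None, remote_exec_path=None):
    if is_local:
        return "./{}".format(local_executable)  # e.g. './pw.x'
    return remote_exec_path                     # e.g. '/usr/bin/pw.x'
```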

classmethod get_from_string(code_string)[source]

Get a Code object with a given identifier string in the format label@machinename. See the note below for details on the string parsing algorithm.

Note

the (leftmost) ‘@’ symbol is always used to split the code label from the computer name. Therefore do not use ‘@’ in the code label if you want to use this function (‘@’ symbols in the computer name are instead valid).

Parameters:

code_string – the code string identifying the code to load

Raises:
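The leftmost-‘@’ splitting described in the note above can be sketched with str.partition (an illustrative sketch, not AiiDA's implementation):

```python
# Split 'label@machinename' on the leftmost '@'; '@' symbols in the
# computer name are preserved. Returns (label, machinename-or-None).
def split_code_string(code_string):
    label, sep, machinename = code_string.partition('@')
    return (label, machinename if sep else None)
```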
get_input_plugin_name()[source]

Return the name of the default input plugin (or None if no input plugin was set).

get_local_executable()[source]
get_prepend_text()[source]

Return the code that will be put in the scheduler script before the execution, or an empty string if no pre-exec code was defined.

get_remote_computer()[source]
get_remote_exec_path()[source]
is_local()[source]

Return True if the code is ‘local’, False if it is ‘remote’ (see also documentation of the set_local and set_remote functions).

classmethod list_for_plugin(plugin, labels=True)[source]

Return a list of valid code strings for a given plugin.

Parameters:
  • plugin – The string of the plugin.
  • labels – if True, return a list of code names, otherwise return the code PKs (integers).
Returns:

a list of strings with the code names if labels is True, otherwise a list of integers with the code PKs.

new_calc(*args, **kwargs)[source]

Create and return a new Calculation object (unstored) with the correct plugin subclass, as obtained by the self.get_input_plugin_name() method.

Parameters are passed to the calculation __init__ method.

Note:

it also directly creates the link to this code (that will of course be cached, since the new node is not stored yet).

Raises:
  • MissingPluginError – if the specified plugin does not exist.
  • ValueError – if no plugin was specified.
set_append_text(code)[source]

Pass a string of code that will be put in the scheduler script after the execution of the code.

set_files(files)[source]

Given a list of filenames (or a single filename string), add them to the path (all at level zero, i.e. without subfolders). Therefore, be careful with files that share the same name!

Todo: decide whether to check if the Code must be a local executable to be able to call this function.
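Since all files are added at level zero, two paths with the same basename would collide; a hypothetical helper (not part of AiiDA's API) to detect such clashes up front:

```python
import os

# Hypothetical helper illustrating the name-clash caveat of set_files:
# files are stored without subfolders, so identical basenames collide.
def find_clashing_basenames(filenames):
    seen, clashes = set(), set()
    for path in filenames:
        base = os.path.basename(path)
        if base in seen:
            clashes.add(base)
        seen.add(base)
    return clashes
```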
set_input_plugin_name(input_plugin)[source]

Set the name of the default input plugin, to be used for the automatic generation of a new calculation.

set_local_executable(exec_name)[source]

Set the filename of the local executable. Implicitly set the code as local.

set_prepend_text(code)[source]

Pass a string of code that will be put in the scheduler script before the execution of the code.

set_remote_computer_exec(remote_computer_exec)[source]

Set the code as remote, and pass the computer on which it resides and the absolute path on that computer.

Parameters:remote_computer_exec – a tuple (computer, remote_exec_path), where computer is an aiida.orm.Computer or an aiida.backends.djsite.db.models.DbComputer object, and remote_exec_path is the absolute path of the main executable on the remote computer.
classmethod setup(**kwargs)[source]
aiida.orm.implementation.general.code.delete_code(code)[source]

Delete a code from the DB. Check before that there are no output nodes.

NOTE! Not thread-safe. Do not use while many users are accessing the DB at the same time.

Implemented as a function on purpose; otherwise, complicated logic would be needed to set the internal state of the object after calling code.delete().

class aiida.orm.implementation.general.comment.AbstractComment(**kwargs)[source]

Bases: object

__abstractmethods__ = frozenset(['set_ctime', 'get_ctime', 'uuid', 'set_mtime', 'get_content', 'set_user', 'to_be_stored', 'pk', 'get_user', 'delete', 'get_mtime', 'id', '__init__', 'set_content'])
__init__(**kwargs)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.orm.implementation.general.comment'
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>
_logger = <logging.Logger object>
delete()[source]
get_content()[source]
get_ctime()[source]
get_mtime()[source]
get_user()[source]
id
pk
set_content(val)[source]
set_ctime(val)[source]
set_mtime(val)[source]
set_user(val)[source]
to_be_stored
uuid
class aiida.orm.implementation.general.computer.AbstractComputer(**kwargs)[source]

Bases: object

Base class to map a node in the DB + its permanent repository counterpart.

Stores attributes starting with an underscore.

Caches files and attributes before the first save, and saves everything only on store(). After the call to store(), attributes cannot be changed.

Only after storing (or upon loading from uuid) can the metadata be modified, and in this case it is directly set on the DB.

In the plugin, also set the _plugin_type_string, to be set in the DB in the ‘type’ field.

__init__(**kwargs)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__int__()[source]

Convert the class to an integer. This is needed to allow querying with Django. Be careful, though, not to pass it to a wrong field! This only returns the local DB principal key value.

__module__ = 'aiida.orm.implementation.general.computer'
__repr__() <==> repr(x)[source]
__str__() <==> str(x)[source]
__weakref__

list of weak references to the object (if defined)

classmethod _append_text_validator(append_text)[source]

Validates the append text string.

_cleanup_default_mpiprocs_per_machine()[source]

Called by the command line utility in case the _shouldcall_ routine returns False, to remove possible values that were previously set (e.g. if one used a pbspro scheduler before, set default_mpiprocs_per_machine, and then switched to sge, the question is not asked, but the value should also be removed from the DB).

_conf_attributes = [('hostname', 'Fully-qualified hostname', 'The fully qualified host-name of this computer', False), ('description', 'Description', 'A human-readable description of this computer', False), ('enabled_state', 'Enabled', 'True or False; if False, the computer is disabled and calculations\nassociated with it will not be submitted', False), ('transport_type', 'Transport type', 'The name of the transport to be used. Valid names are: local,ssh', False), ('scheduler_type', 'Scheduler type', 'The name of the scheduler to be used. Valid names are: lsf,pbspro,torque,slurm,direct,sge', False), ('shebang', 'shebang line at the beginning of the submission script', 'this line specifies the first line of the submission script for this computer', False), ('workdir', 'AiiDA work directory', 'The absolute path of the directory on the computer where AiiDA will\nrun the calculations (typically, the scratch of the computer). You\ncan use the {username} replacement, that will be replaced by your\nusername on the remote computer', False), ('mpirun_command', 'mpirun command', 'The mpirun command needed on the cluster to run parallel MPI\nprograms. You can use the {tot_num_mpiprocs} replacement, that will be \nreplaced by the total number of cpus, or the other scheduler-dependent\nreplacement fields (see the scheduler docs for more information)', False), ('default_mpiprocs_per_machine', 'Default number of CPUs per machine', 'Enter here the default number of CPUs per machine (node) that \nshould be used if nothing is otherwise specified. Leave empty \nif you do not want to provide a default value.\n', False), ('prepend_text', 'Text to prepend to each command execution', 'This is a multiline string, whose content will be prepended inside\nthe submission script before the real execution of the job. 
It is\nyour responsibility to write proper bash code!', True), ('append_text', 'Text to append to each command execution', 'This is a multiline string, whose content will be appended inside\nthe submission script after the real execution of the job. It is\nyour responsibility to write proper bash code!', True)]
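The {username} and {tot_num_mpiprocs} replacement fields mentioned in the attribute descriptions above are plain Python format fields; a minimal sketch (the template values below are made up for illustration):

```python
# Illustrative templates using the documented replacement fields; the
# actual paths and commands depend on the computer being configured.
workdir_template = "/scratch/{username}/aiida_run/"
mpirun_template = "mpirun -np {tot_num_mpiprocs}"

workdir = workdir_template.format(username="alice")
mpirun = mpirun_template.format(tot_num_mpiprocs=16)
```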
_default_mpiprocs_per_machine_validator(def_cpus_per_machine)[source]

Validates the default number of CPUs per machine (node)

_del_property(k, raise_exception=True)[source]
classmethod _description_validator(description)[source]

Validates the description.

classmethod _enabled_state_validator(enabled_state)[source]

Validates the enabled state.

_get_append_text_string()[source]
_get_default_mpiprocs_per_machine_string()[source]

Get the default number of CPUs per machine (node) as a string

_get_description_string()[source]
_get_enabled_state_string()[source]
_get_hostname_string()[source]
_get_metadata()[source]
_get_mpirun_command_string()[source]
_get_prepend_text_string()[source]
_get_property(k, *args)[source]
_get_scheduler_type_string()[source]
_get_shebang_string()[source]
_get_transport_type_string()[source]
_get_workdir_string()[source]
classmethod _hostname_validator(hostname)[source]

Validates the hostname.

_logger = <logging.Logger object>
_mpirun_command_validator(mpirun_cmd)[source]

Validates the mpirun_command variable. MUST be called after properly checking for a valid scheduler.

classmethod _name_validator(name)[source]

Validates the name.

classmethod _prepend_text_validator(prepend_text)[source]

Validates the prepend text string.

classmethod _scheduler_type_validator(scheduler_type)[source]

Validates the scheduler type string.

_set_append_text_string(string)[source]

Set the append_text starting from a string.

_set_default_mpiprocs_per_machine_string(string)[source]

Set the default number of CPUs per machine (node) from a string (set to None if the string is empty)

_set_description_string(string)[source]

Set the description starting from a string.

_set_enabled_state_string(string)[source]

Set the enabled state starting from a string.

_set_hostname_string(string)[source]

Set the hostname starting from a string.

_set_metadata(metadata_dict)[source]

Set the metadata.

_set_mpirun_command_string(string)[source]

Set the mpirun command string (from a string to a list).

_set_prepend_text_string(string)[source]

Set the prepend_text starting from a string.

_set_property(k, v)[source]
_set_scheduler_type_string(string)[source]

Set the scheduler_type starting from a string.

_set_shebang_string(string)[source]

Set the shebang line.

_set_transport_type_string(string)[source]

Set the transport_type starting from a string.

_set_workdir_string(string)[source]

Set the workdir starting from a string.

_shouldcall_default_mpiprocs_per_machine()[source]

Return True if the scheduler can accept ‘default_mpiprocs_per_machine’, False otherwise.

If there is a problem in determining the scheduler, return True to avoid exceptions.

classmethod _transport_type_validator(transport_type)[source]

Validates the transport string.

classmethod _workdir_validator(workdir)[source]

Validates the workdir string.

copy()[source]

Return a copy of the current object to work with, not stored yet.

dbcomputer
description
full_text_info

Return a (multiline) string with a human-readable detailed information on this computer.

classmethod get(computer)[source]

Return a computer from its name (or from another Computer or DbComputer instance)

get_append_text()[source]
get_calculations_on_computer()[source]
get_dbauthinfo(user)[source]

Return the aiida.backends.djsite.db.models.DbAuthInfo instance for the given user on this computer, if the computer is configured for the given user.

Parameters:user – a DbUser instance.
Returns:a aiida.backends.djsite.db.models.DbAuthInfo instance
Raises:NotExistent – if the computer is not configured for the given user.
get_default_mpiprocs_per_machine()[source]

Return the default number of CPUs per machine (node) for this computer, or None if it was not set.

get_description()[source]
get_hostname()[source]
get_mpirun_command()[source]

Return the mpirun command. Must be a list of strings, which will then be joined with spaces when submitting.

A sensible default is also provided, which may be suitable in many cases.

get_name()[source]
get_prepend_text()[source]
get_scheduler()[source]
get_scheduler_type()[source]
static get_schema()[source]
Every node property contains:
  • display_name: display name of the property
  • help_text: short help text of the property
  • is_foreign_key: whether the property is a foreign key to another node type
  • type: type of the property, e.g. str, dict, int
Returns:the schema of the computer
get_shebang()[source]
get_transport_class()[source]
get_transport_params()[source]
get_transport_type()[source]
get_workdir()[source]
hostname
id

Return the principal key in the DB.

is_enabled()[source]
is_user_configured(user)[source]

Return True if the computer is configured for the given user, False otherwise.

Parameters:user – a DbUser instance.
Returns:a boolean.
is_user_enabled(user)[source]

Return True if the computer is enabled for the given user (looking only at the per-user setting: the computer could still be globally disabled).

Note:Return False also if the user is not configured for the computer.
Parameters:user – a DbUser instance.
Returns:a boolean.
classmethod list_names()[source]

Return a list with all the names of the computers in the DB.

logger
name
pk

Return the principal key in the DB.

set(**kwargs)[source]
set_append_text(val)[source]
set_default_mpiprocs_per_machine(def_cpus_per_machine)[source]

Set the default number of CPUs per machine (node) for this computer. Accepts None if you do not want to set this value.

set_description(val)[source]
set_enabled_state(enabled)[source]
set_hostname(val)[source]
set_mpirun_command(val)[source]

Set the mpirun command. It must be a list of strings (you can use string.split() if you have a single, space-separated string).
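The list/string conversion suggested above is a plain split/join round trip (illustrative only):

```python
# The mpirun command is stored as a list of strings; a single
# space-separated string can be converted with split(), and the list is
# joined back with spaces at submission time.
mpirun_cmd = "mpirun -np {tot_num_mpiprocs}".split()
submission_line = " ".join(mpirun_cmd)
```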

set_name(val)[source]
set_prepend_text(val)[source]
set_scheduler_type(val)[source]
set_shebang(val)[source]
Parameters:val (str) – A valid shebang line
set_transport_params(val)[source]
set_transport_type(val)[source]
set_workdir(val)[source]
store()[source]

Store the computer in the DB.

Unlike Nodes, a computer can be re-stored if its properties are to be changed (e.g. a new mpirun command, etc.)

to_be_stored
uuid

Return the UUID in the DB.

validate()[source]

Check if the attributes and files retrieved from the DB are valid. Raise a ValidationError if something is wrong.

Must be able to work even before storing: therefore, use the get_attr and similar methods that automatically read either from the DB or from the internal attribute cache.

For the base class, this is always valid. Subclasses will reimplement this. In the subclass, always call the super().validate() method first!

class aiida.orm.implementation.general.computer.Util[source]

Bases: object

__abstractmethods__ = frozenset(['delete_computer'])
__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.orm.implementation.general.computer'
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>
delete_computer(pk)[source]
class aiida.orm.implementation.general.group.AbstractGroup(**kwargs)[source]

Bases: object

An AiiDA ORM implementation of group of nodes.

__abstractmethods__ = frozenset(['__int__', 'remove_nodes', 'type_string', 'description', 'is_stored', 'user', 'query', 'nodes', '__init__', 'name', 'uuid', 'id', 'dbgroup', 'pk', 'add_nodes', 'store', 'delete'])
__init__(**kwargs)[source]

Create a new group. Either pass a dbgroup parameter, to reload a group from the DB (in which case no further parameters are allowed), or pass the parameters for the group creation.

Parameters:
  • dbgroup – the dbgroup object, if you want to reload the group from the DB rather than creating a new one.
  • name – The group name, required on creation
  • description – The group description (by default, an empty string)
  • user – The owner of the group (by default, the automatic user)
  • type_string – a string identifying the type of group (by default, an empty string, indicating a user-defined group)
__int__()[source]

Convert the class to an integer. This is needed to allow querying with Django. Be careful, though, not to pass it to a wrong field! This only returns the local DB principal key (pk) value.

Returns:the integer pk of the node or None if not stored.
__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.orm.implementation.general.group'
__repr__() <==> repr(x)[source]
__str__() <==> str(x)[source]
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>
add_nodes(nodes)[source]

Add a node or a set of nodes to the group.

Note:The group must be already stored.
Note:each of the nodes passed to add_nodes must be already stored.
Parameters:nodes – a Node or DbNode object to add to the group, or a list of Nodes or DbNodes to add.
classmethod create(*args, **kwargs)[source]

Create and store a new group.

Note: This method does not check for presence of the group. You may want to use get_or_create().

Returns:group
dbgroup
Returns:the corresponding Django DbGroup object.
delete()[source]

Delete the group from the DB

description
Returns:the description of the group as a string
classmethod get(*args, **kwargs)[source]
classmethod get_from_string(string)[source]

Get a group from a string. If only the name is provided, without colons, only user-defined groups are searched; add ‘:type_str’ after the group name to also select the type of the group, with type equal to ‘type_str’ (e.g. ‘data.upf’, ‘import’, etc.)

Raises:
  • ValueError – if the group type does not exist.
  • NotExistent – if the group is not found.
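The ‘name:type_str’ convention described above can be sketched as follows (an illustrative stand-in assuming the first colon is the separator, not AiiDA's implementation):

```python
# Split 'name:type_str' into (name, type_string); an empty type string
# denotes a user-defined group.
def split_group_string(string):
    name, sep, type_str = string.partition(':')
    return (name, type_str) if sep else (name, '')
```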
classmethod get_or_create(*args, **kwargs)[source]

Try to retrieve a group from the DB with the given arguments; create (and store) a new group if such a group was not present yet.

Returns:(group, created), where group is the group (new or existing, in either case already stored) and created is a boolean indicating whether the group was just created.
static get_schema()[source]
Every node property contains:
  • display_name: display name of the property
  • help_text: short help text of the property
  • is_foreign_key: whether the property is a foreign key to another type of node
  • type: type of the property. e.g. str, dict, int
Returns:the schema of the group
id
Returns:the principal key (the ID) as an integer, or None if the node was not stored yet
is_stored
Returns:True if the respective DbNode has been already saved in the DB, False otherwise
is_user_defined()[source]
Returns:True if the group is user defined, False otherwise
name
Returns:the name of the group as a string
nodes

Return a generator/iterator that iterates over all nodes and returns the respective AiiDA subclasses of Node; it also allows asking for the number of nodes in the group using len().

pk
Returns:the principal key (the ID) as an integer, or None if the node was not stored yet
classmethod query(name=None, type_string='', pk=None, uuid=None, nodes=None, user=None, node_attributes=None, past_days=None, **kwargs)[source]

Query for groups.

Note:

By default, query for user-defined groups only (type_string==""). If you want to query for all types of groups, pass type_string=None. If you want to query for a specific type of group, pass that specific string as the type_string argument.

Parameters:
  • name – the name of the group
  • nodes – a node or list of nodes that belongs to the group (alternatively, you can also pass a DbNode or list of DbNodes)
  • pk – the pk of the group
  • uuid – the uuid of the group
  • type_string – the string for the type of node; by default, look only for user-defined groups (see note above).
  • user – by default, query for groups of all users; if specified, must be a DbUser object, or a string for the user email.
  • past_days – by default, query for all groups; if specified, query the groups created in the last past_days. Must be a datetime object.
  • node_attributes – if not None, must be a dictionary with format {k: v}. It will filter and return only groups where there is at least one node with an attribute with key=k and value=v. Different keys of the dictionary are joined with AND (that is, the group must satisfy all requirements). v can be of a base data type (str, bool, int, float, …). If it is a list or iterable, the condition is checked so that, for each value in the iterable, there is at least one node in the group with key=k and that value.
  • kwargs

    any other filter to be passed to DbGroup.objects.filter

    Example: if node_attributes = {'elements': ['Ba', 'Ti'], 'md5sum': 'xxx'},
    it will find groups that contain the node with md5sum = ‘xxx’, and moreover contain at least one node for element ‘Ba’ and one node for element ‘Ti’.
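The AND/iterable semantics of node_attributes can be mimicked on plain dictionaries (a sketch of the matching rule only; the real query is executed in the database):

```python
def group_matches(group_nodes, node_attributes):
    """True if, for every key, each required value (a scalar, or every
    element of a list/tuple) is carried by at least one node in the group."""
    for key, value in node_attributes.items():
        wanted_values = value if isinstance(value, (list, tuple)) else [value]
        for wanted in wanted_values:
            if not any(node.get(key) == wanted for node in group_nodes):
                return False
    return True
```

With nodes carrying {'elements': 'Ba'} and {'elements': 'Ti', 'md5sum': 'xxx'}, the filter {'elements': ['Ba', 'Ti'], 'md5sum': 'xxx'} matches, mirroring the example above.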
remove_nodes(nodes)[source]

Remove a node or a set of nodes from the group.

Note:The group must be already stored.
Note:each of the nodes passed to remove_nodes must be already stored.
Parameters:nodes – a Node or DbNode object to remove from the group, or a list of Nodes or DbNodes to remove.
store()[source]
type_string
Returns:the string defining the type of the group
user
Returns:a Django DbUser object, representing the user associated to this group.
uuid
Returns:a string with the uuid
aiida.orm.implementation.general.group.get_group_type_mapping()[source]

Return a dictionary with {short_name: proper_long_name_in_DB} format, where short_name is the name to use on the command line, while proper_long_name_in_DB is the string stored in the type field of the DbGroup table.

It is defined as a function so that the import statements are confined inside here.
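The lookup can be sketched as follows (the mapping values here are invented placeholders; consult the actual function for the real ones):

```python
# Hypothetical {short_name: proper_long_name_in_DB} mapping
EXAMPLE_MAPPING = {'data.upf': 'data.upf.family', 'import': 'aiida.import'}

def resolve_group_type(short_name, mapping=EXAMPLE_MAPPING):
    """Translate a command-line short name into the string stored in the
    type field of the DbGroup table; raise ValueError for unknown names."""
    try:
        return mapping[short_name]
    except KeyError:
        raise ValueError("unknown group type: {}".format(short_name))
```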

class aiida.orm.implementation.general.lock.AbstractLock(dblock)[source]

Bases: object

ORM class to handle the DbLock objects.

Handles the release of the Lock, offers utility functions to test if a Lock is expired or still valid and to get the lock key.

__init__(dblock)[source]

Initialize the Lock object with a DbLock.

Parameters:dblock – a DbLock object generated by the LockManager

__module__ = 'aiida.orm.implementation.general.lock'
__weakref__

list of weak references to the object (if defined)

isexpired

Test whether a lock is expired or still valid

key

Get the DbLock key.

Returns:a string with the lock key

release(owner='None')[source]

Release the lock, deleting the DbLock from the database.

Parameters:owner – a string with the Lock's owner name
Raises:
  • ModificationNotAllowed – if the input owner is not the lock owner
  • InternalError – if something goes wrong with the database

class aiida.orm.implementation.general.lock.AbstractLockManager[source]

Bases: object

Management class to generate locks in a db-safe way.

The class handles the generation of locks through the creation of database records with unique key fields, using transaction-safe methods.
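A minimal in-memory sketch of the same idea, one lock per unique key with a timeout, without any database backend (the class and method names here are illustrative; the real manager spells its method aquire and uses transaction-safe DbLock records):

```python
import time

class TinyLockManager:
    """Toy lock manager: at most one valid lock per key, expiring after
    `timeout` seconds."""

    def __init__(self):
        self._locks = {}  # key -> expiry timestamp

    def acquire(self, key, timeout=3600):
        now = time.time()
        expiry = self._locks.get(key)
        if expiry is not None and expiry > now:
            raise RuntimeError("lock already present for key: {}".format(key))
        self._locks[key] = now + timeout  # expired locks are overwritten
        return key

    def release(self, key):
        self._locks.pop(key, None)
```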

__module__ = 'aiida.orm.implementation.general.lock'
__weakref__

list of weak references to the object (if defined)

aquire(key, timeout=3600, owner='None')[source]

Try to generate a new DbLock object with a key that is unique in the model. If the creation succeeds, the Lock is generated and returned; if not, an error is raised.

Parameters:
  • key – the unique lock key, a string
  • timeout – how long (in seconds) the lock remains valid before it is considered expired
Returns:a Lock object
Raises:
  • InternalError – if there is an expired lock with the same input key
  • LockPresent – if there is a Lock already present with the same key

clear_all()[source]

Clear all the Locks, whether expired or not; useful for the bootstrap.

class aiida.orm.implementation.general.node.AbstractNode(**kwargs)[source]

Bases: object

Base class to map a node in the DB + its permanent repository counterpart.

Stores attributes starting with an underscore.

Caches files and attributes before the first save, and saves everything only on store(). After the call to store(), attributes cannot be changed.

Only after storing (or upon loading from uuid) extras can be modified and in this case they are directly set on the db.

In the plugin, also set the _plugin_type_string, to be set in the DB in the ‘type’ field.

__abstractmethods__ = frozenset(['get_computer', '_increment_version_number_db', '_set_db_extra', '_del_db_attr', '_get_db_description_field', 'get_subclass_from_pk', '_db_iterextras', 'mtime', 'query', '_db_store_all', 'get_user', '__init__', 'add_comment', '_get_dbcomments', '_del_db_extra', '_get_db_input_links', '_update_db_description_field', '_set_db_attr', 'type', '_update_comment', '_get_db_attr', '_update_db_label_field', 'get_subclass_from_uuid', 'get_comments', 'copy', '_replace_dblink_from', '_store_cached_input_links', '_remove_comment', 'ctime', '_remove_dblink_from', '_db_attrs', '_db_store', '_db_iterattrs', '_get_db_output_links', '_reset_db_extras', '_add_dblink_from', '_get_db_extra', '_set_db_computer', '_get_db_label_field'])
__del__()[source]

Called only upon real object destruction from memory. It merely tries to remove junk whenever possible; do not trust this function too much!

__init__(**kwargs)[source]

Initialize the object Node.

Parameters:uuid – if present, the Node with given uuid is loaded from the database. (It is not possible to assign a uuid to a new Node.)
__int__()[source]
class __metaclass__[source]

Bases: abc.ABCMeta

Some Python black magic to correctly set the logger also in subclasses.

__module__ = 'aiida.orm.implementation.general.node'
static __new__(name, bases, attrs)[source]
__module__ = 'aiida.orm.implementation.general.node'
__repr__() <==> repr(x)[source]
__str__() <==> str(x)[source]
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>

_add_cachelink_from(src, label, link_type)[source]

Add a link in the cache.

_add_dblink_from(src, label=None, link_type=<LinkType.UNSPECIFIED: 'unspecified'>)[source]

Add a link to the current node from the ‘src’ node. Both nodes must be a Node instance (or a subclass of Node).

Note:

this function should not be called directly; it acts directly on the database.

Parameters:
  • src – the source object
  • label (str) – the name of the label to set the link from src. Default = None.
_add_outputs_from_cache(cache_node)[source]
_append_to_attr(key, value, clean=True)[source]

Append value to an attribute of the Node (in the DbAttribute table).

Parameters:
  • key – key name of the “list-type” attribute. If the attribute doesn’t exist, it is created.
  • value – the value to append to the list
  • clean – whether to clean the value WARNING: when set to False, storing will throw errors for any data types not recognized by the db backend
Raises:

ValidationError – if the key is not valid, e.g. it contains the separator symbol

_cacheable = True
_check_are_parents_stored()[source]

Check if all parents are already stored, otherwise raise.

Raises:ModificationNotAllowed – if one of the input nodes is not already stored.
_db_attrs()[source]

Returns the keys of the attributes as a generator, directly from the DB.

DO NOT USE DIRECTLY.

_db_iterattrs()[source]

Iterator over the attributes (directly in the DB!)

DO NOT USE DIRECTLY.

_db_iterextras()[source]

Iterator over the extras (directly in the DB!)

DO NOT USE DIRECTLY.

_db_store(with_transaction=True)[source]

Store a new node in the DB, also saving its repository directory and attributes.

After being called attributes cannot be changed anymore! Instead, extras can be changed only AFTER calling this store() function.

Note:After successful storage, those links that are in the cache, and for which also the parent node is already stored, will be automatically stored. The others will remain unstored.
Parameters:with_transaction – if False, no transaction is used. This is meant to be used ONLY if the outer calling function has already a transaction open!
_db_store_all(with_transaction=True, use_cache=None)[source]

Store the node, together with all input links, if cached, and also the linked nodes, if they were not stored yet.

Parameters:
  • with_transaction – if False, no transaction is used. This is meant to be used ONLY if the outer calling function has already a transaction open!
  • use_cache (bool) – Determines whether caching is used to find an equivalent node.
_del_all_attrs()[source]

Delete all attributes associated to this node.

Raises:ModificationNotAllowed – if the Node was already stored.
_del_attr(key, stored_check=True)[source]

Delete an attribute.

Parameters:
  • key – attribute to delete.
  • stored_check – when set to False will disable the mutability check
Raises:
_del_db_attr(key)[source]

Delete an attribute directly from the DB

DO NOT USE DIRECTLY.

Parameters:key – The key of the attribute to delete
_del_db_extra(key)[source]

Delete an extra, directly on the DB.

DO NOT USE DIRECTLY.

Parameters:key – key name
_get_db_attr(key)[source]

Return the attribute value, directly from the DB.

DO NOT USE DIRECTLY.

Parameters:key – the attribute key
Returns:the attribute value
Raises:AttributeError – if the attribute does not exist.
_get_db_description_field()[source]

Get the description of this node, acting directly at the DB level

_get_db_extra(key)[source]

Get an extra, directly from the DB.

DO NOT USE DIRECTLY.

Parameters:key – key name
Returns:the key value
Raises:AttributeError – if the key does not exist

_get_db_input_links(link_type)[source]

Return a list of tuples (label, aiida_class) for each input link, possibly filtering only by those of a given type.

Parameters:link_type – if not None, a link type to filter results
Returns:a list of tuples (label, aiida_class)
_get_db_label_field()[source]

Get the label field acting directly on the DB

Returns:a string.

_get_db_output_links(link_type)[source]

Return a list of tuples (label, aiida_class) for each output link, possibly filtering only by those of a given type.

Parameters:link_type – if not None, a link type to filter results
Returns:a list of tuples (label, aiida_class)
_get_dbcomments(pk=None)[source]

Return a sorted list of DbComment associated with the Node.

Parameters:pk – integer or list of integers. If it is specified, returns the comment values with desired pks. (pk refers to DbComment.pk)
Returns:the list of DbComment, sorted by pk.
_get_folder_pathsubfolder

Get the subfolder in the repository.

Returns:a Folder object.
_get_objects_to_hash()[source]

Return a list of objects which should be included in the hash.

_get_same_node()[source]

Returns a stored node from which the current Node can be cached, meaning that the returned Node is a valid cache, and its _aiida_hash attribute matches self.get_hash().

If there are multiple valid matches, the first one is returned. If no matches are found, None is returned.

Note that after self is stored, this function can return self.
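The lookup rule (compare self.get_hash() against the _aiida_hash extra of stored nodes and return the first valid match) can be sketched as:

```python
def find_cached_node(target_hash, stored_nodes):
    """Return the first stored node whose recorded hash matches and which
    is a valid cache, or None.

    `stored_nodes` is an iterable of (node, aiida_hash, is_valid_cache)
    triples; in AiiDA the hash is kept in the `_aiida_hash` extra.
    """
    for node, aiida_hash, is_valid_cache in stored_nodes:
        if is_valid_cache and aiida_hash == target_hash:
            return node
    return None
```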

_get_temp_folder()[source]

Get the folder of the Node in the temporary repository.

Returns:a SandboxFolder object mapping the node in the repository.

_has_cached_links()[source]

Return True if there is at least one cached (input) link, that is, a link that is not yet stored in the database; False otherwise.

_hash_ignored_attributes = []
_increment_version_number_db()[source]

This function increments the version number in the DB. This should be called every time you need to increment the version (e.g. on adding an extra or attribute).

Note:Do not manually increment the version number, because if two different threads are adding/changing an attribute concurrently, the version number would be incremented only once.
_init_internal_params()[source]

Set the default values for this class; this method is automatically called by the init.

Note:if you inherit this function, ALWAYS remember to call super()._init_internal_params() as the first thing in your inherited function.
_is_valid_cache()[source]

Subclass hook to exclude certain Nodes (e.g. failed calculations) from being considered in the caching process.

_iter_all_same_nodes()[source]

Returns an iterator of all same nodes.

_linking_as_output(dest, link_type)[source]

Raise a ValueError if a link from self to dest is not allowed. Implement in subclasses.

Parameters:dest – the destination output Node
Returns:a boolean (True)
_logger = <logging.Logger object>
_path_subfolder_name = 'path'
_plugin_type_string = 'node.AbstractNode.'
_query_type_string = 'node.'
_remove_comment(comment_pk, user)[source]

Function called by verdi comment remove

_remove_dblink_from(label, link_type)[source]

Remove from the DB the input link with the given label.

Note:

this function should not be called directly; it acts directly on the database.

Note:

No checks are done to verify that the link actually exists.

Parameters:
  • label (str) – the label of the link from src to the current Node
  • link_type – The type of link, must be one of the enum values from LinkType

_remove_link_from(label, link_type)[source]

Remove from the DB the input link with the given label.

Note:

In subclasses, change only this. Moreover, remember to call the super() method in order to properly use the caching logic!

Note:

No error is raised if the link does not exist.

Parameters:
  • label (str) – the name of the label to set the link from src.
  • link_type – The type of link, must be one of the enum values from LinkType

_replace_dblink_from(src, label, link_type)[source]

Replace an input link with the given label and type, or simply create it if it does not exist.

Note:

this function should not be called directly; it acts directly on the database.

Parameters:
  • src – the source object.
  • label (str) – the label of the link from src to the current Node
  • link_type – The type of link, must be one of the enum values from LinkType

_replace_link_from(src, label, link_type)[source]

Replace an input link with the given label, or simply create it if it does not exist.

Note:

In subclasses, change only this. Moreover, remember to call the super() method in order to properly use the caching logic!

Parameters:
  • src – the source object
  • label (str) – the name of the label to set the link from src.
_repository_folder

Get the permanent repository folder. Use preferentially the folder property.

Returns:the permanent RepositoryFolder object
_reset_db_extras(new_extras)[source]

Resets the extras (replacing existing ones) directly in the DB

DO NOT USE DIRECTLY!

Parameters:new_extras – dictionary with new extras
_section_name = 'node'
_set_attr(key, value, clean=True, stored_check=True)[source]

Set a new attribute to the Node (in the DbAttribute table).

Parameters:
  • key – key name
  • value – its value
  • clean – whether to clean values. WARNING: when set to False, storing will throw errors for any data types not recognized by the db backend
  • stored_check – when set to False will disable the mutability check
Raises:
_set_db_attr(key, value)[source]

Set the value directly in the DB, without checking if it is stored, or using the cache.

DO NOT USE DIRECTLY.

Parameters:
  • key – key name
  • value – its value
_set_db_computer(computer)[source]

Set the computer directly inside the dbnode member, in the DB.

DO NOT USE DIRECTLY.

Parameters:computer – the computer object
_set_db_extra(key, value, exclusive)[source]

Store extra directly in the DB, without checks.

DO NOT USE DIRECTLY.

Parameters:
  • key – key name
  • value – key value
  • exclusive – (default=False). If exclusive is True, it raises a UniquenessError if an Extra with the same name already exists in the DB (useful e.g. to “lock” a node and avoid to run multiple times the same computation on it).
_set_defaults

Default values to set in the __init__, if no value is explicitly provided for the given key. It is a dictionary, with k=v; if the key k is not provided to the __init__, and a value is present here, this is set.

_set_incompatibilities = []
_set_internal(arguments, allow_hidden=False)[source]

Works as self.set(), but takes a dictionary as the ‘arguments’ variable, instead of reading it from the kwargs; moreover, it allows allow_hidden to be set to True. In this case, if a key starts with an underscore, as for instance _state, it will not call the function set__state but rather _set_state.
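The setter-dispatch rule above (a hidden key _state maps to _set_state, not set__state) can be sketched as:

```python
def setter_name(key, allow_hidden=False):
    """Map an argument key to the name of the setter method that would
    be called for it."""
    if key.startswith('_'):
        if not allow_hidden:
            raise ValueError("hidden key not allowed: {}".format(key))
        return '_set_{}'.format(key.lstrip('_'))
    return 'set_{}'.format(key)
```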

_set_with_defaults(**kwargs)[source]

Calls the set() method, but also adds the class-defined default values (defined in the self._set_defaults attribute), if they are not provided by the user.

Note:for the default values, also allow to define ‘hidden’ methods, meaning that if a default value has a key “_state”, it will not call the function “set__state” but rather “_set_state”. This is not allowed, instead, for the standard set() method.

_store_cached_input_links(with_transaction=True)[source]

Store all input links that are in the local cache, transferring them to the DB.

Note:This can be called only if all parents are already stored.
Note:Links are stored only after the input nodes are stored. Moreover, link storage is done in a transaction, and if one of the links cannot be stored, an exception is raised and all links will remain in the cache.
Note:This function can be called only after the node is stored. After that, it can be called multiple times, and nothing will be executed if no links are still in the cache.
Parameters:with_transaction – if False, no transaction is used. This is meant to be used ONLY if the outer calling function has already a transaction open!
_store_from_cache(cache_node, with_transaction)[source]
_store_input_nodes()[source]

Find all input nodes, and store them, checking that they do not have unstored inputs in turn.

Note:this function stores all nodes without transactions; always call it from within a transaction!
_update_comment(new_field, comment_pk, user)[source]

Function called by verdi comment update

_update_db_description_field(field_value)[source]

Update the description of this node, acting directly at the DB level

_update_db_label_field(field_value)[source]

Set the label field acting directly on the DB

_validate()[source]

Check if the attributes and files retrieved from the DB are valid. Raise a ValidationError if something is wrong.

Must be able to work even before storing: therefore, use the get_attr and similar methods that automatically read either from the DB or from the internal attribute cache.

For the base class, this is always valid. Subclasses will reimplement this. In the subclass, always call the super()._validate() method first!

add_comment(content, user=None)[source]

Add a new comment.

Parameters:content – string with comment

add_link_from(src, label=None, link_type=<LinkType.UNSPECIFIED: 'unspecified'>)[source]

Add a link to the current node from the ‘src’ node. Both nodes must be a Node instance (or a subclass of Node).

Note:In subclasses, change only this. Moreover, remember to call the super() method in order to properly use the caching logic!

Parameters:
  • src – the source object
  • label (str) – the name of the label to set the link from src. Default = None.
  • link_type – The type of link, must be one of the enum values from LinkType
add_path(src_abs, dst_path)[source]

Copy a local file or folder into the repository directory. If there is a subpath, folders will be created.

Copy to a cache directory if the entry has not been saved yet.

Parameters:
  • src_abs (str) – the absolute path of the file to copy.
  • dst_path (str) – the (relative) path at which to copy it.
Todo:

in the future, add an add_attachment() that has the same meaning as an extras file. Decide also how to store. If in two separate subfolders, remember to reset the limit.

attrs()[source]

Returns the keys of the attributes as a generator.

Returns:a generator of strings
clear_hash()[source]

Sets the stored hash of the Node to None.

copy(**kwargs)[source]

Return a copy of the current object to work with, not stored yet.

This is a completely new entry in the DB, with its own UUID. Works both on stored instances and with not-stored ones.

Copies files and attributes, but not the extras. Does not store the Node to allow modification of attributes.

Returns:an object copy
ctime

Return the creation time of the node.

dbnode
Returns:the corresponding DbNode object.
del_extra(key)[source]

Delete an extra, acting directly on the DB! The action is immediately performed on the DB. Since extras can be added only after storing the node, it is meaningful to call this function only after the .store() method.

Parameters:key – key name
Raise:AttributeError: if key starts with underscore
Raise:ModificationNotAllowed: if the node is not stored yet
description

Get the description of the node.

Returns:a string
Return type:str
extras()[source]

Get the keys of the extras.

Returns:a list of strings
folder

Get the folder associated with the node, whether it is in the temporary or the permanent repository.

Returns:the RepositoryFolder object.
get_abs_path(path=None, section=None)[source]

Get the absolute path to the folder associated with the Node in the AiiDA repository.

Parameters:
  • path (str) – the name of the subfolder inside the section. If None returns the abspath of the folder. Default = None.
  • section – the name of the subfolder (‘path’ by default).
Returns:

a string with the absolute path

For the moment this works only for one kind of file, ‘path’ (internal files)

get_all_same_nodes()[source]

Return a list of stored nodes which match the type and hash of the current node. For the stored nodes, the _aiida_hash extra is checked to determine the hash, while self.get_hash() is executed on the current node.

Only nodes which are a valid cache are returned. If the current node is already stored, it can be included in the returned list if self.get_hash() matches its _aiida_hash.

get_attr(key, default=())[source]

Get the attribute.

Parameters:
  • key – name of the attribute
  • default – if no attribute key is found, returns default
Returns:

attribute value

Raises:

AttributeError – If no attribute is found and there is no default
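The default=() signature above uses an empty tuple as a sentinel meaning “no default given”; the pattern can be sketched as:

```python
_NO_DEFAULT = ()

def get_attr(attributes, key, default=_NO_DEFAULT):
    """Return attributes[key]; fall back to `default` when one was
    supplied, otherwise raise AttributeError."""
    try:
        return attributes[key]
    except KeyError:
        if default is _NO_DEFAULT:
            raise AttributeError("attribute '{}' not found".format(key))
        return default
```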

get_attrs()[source]

Return a dictionary with all attributes of this node.

get_comments(pk=None)[source]

Return a sorted list of comment values, one for each comment associated to the node.

Parameters:pk – integer or list of integers. If it is specified, returns the comment values with desired pks. (pk refers to DbComment.pk)
Returns:the list of comments, sorted by pk; each element of the list is a dictionary, containing (pk, email, ctime, mtime, content)
get_computer()[source]

Get the computer associated to the node.

Returns:the Computer object or None.
get_desc()[source]

Return a string with information retrieved from the node’s properties. This method is typically overridden by the inheriting classes.

Returns:a description string
get_extra(key, *args)[source]

Get the value of an extra, reading directly from the DB! Since extras can be added only after storing the node, this function is meaningful to be called only after the .store() method.

Parameters:
  • key – key name
  • value – if no attribute key is found, returns value
Returns:

the key value

Raises:

ValueError – If more than two arguments are passed to get_extra

get_extras()[source]

Get the values of all extras, reading directly from the DB! Since extras can be added only after storing the node, this function is meaningful to be called only after the .store() method.

Returns:the dictionary of extras ({} if no extras)
get_folder_list(subfolder='.')[source]

Get the list of files/directories in the repository of the object.

Parameters:subfolder – get the list of a subfolder
Returns:a list of strings.
get_hash(ignore_errors=True, **kwargs)[source]

Create a hash based on the node’s attributes

get_inputs(node_type=None, also_labels=False, only_in_db=False, link_type=None)[source]

Return a list of nodes that enter (directly) in this node

Parameters:
  • node_type – If specified, should be a class, and it filters only elements of that specific type (or a subclass of ‘type’)
  • also_labels – If False (default) only return a list of input nodes. If True, return a list of tuples, where each tuple has the following format: (‘label’, Node), with ‘label’ the link label, and Node a Node instance or subclass
  • only_in_db – Return only the inputs that are in the database, ignoring those that are in the local cache. Otherwise, return all links.
  • link_type – Only get inputs of this link type, if None then returns all inputs of all link types.
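The node_type and also_labels filtering described above can be sketched in pure Python. This is illustrative only (not AiiDA code); the links list of (label, node) pairs is an assumed stand-in for the node’s input links:

```python
def get_inputs(links, node_type=None, also_labels=False):
    """Filter (label, node) pairs by isinstance; optionally keep the labels."""
    selected = [(label, node) for label, node in links
                if node_type is None or isinstance(node, node_type)]
    if also_labels:
        return selected          # list of ('label', node) tuples
    return [node for _, node in selected]  # plain list of nodes
```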
get_inputs_dict(only_in_db=False, link_type=None)[source]

Return a dictionary where the key is the label of the input link, and the value is the input node.

Parameters:
  • only_in_db – If true only get stored links, not cached
  • link_type – Only get inputs of this link type, if None then returns all inputs of all link types.
Returns:

a dictionary {label:object}

get_outputs(node_type=None, also_labels=False, link_type=None)[source]

Return a list of nodes that exit (directly) from this node

Parameters:
  • node_type – if specified, should be a class, and it filters only elements of that specific node_type (or a subclass of ‘node_type’)
  • also_labels – if False (default) only return a list of input nodes. If True, return a list of tuples, where each tuple has the following format: (‘label’, Node), with ‘label’ the link label, and Node a Node instance or subclass
  • link_type – Only return outputs connected by links of this type.
get_outputs_dict(link_type=None)[source]

Return a dictionary where the key is the label of the output link, and the value is the output node. As some Nodes (Datas in particular) can have more than one output with the same label, all keys have the name of the link with the pk of the output node appended. The key without the pk appended corresponds to the oldest node.

Returns:a dictionary {linkname:object}
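The key-disambiguation scheme described above can be sketched as follows. This is an illustrative reimplementation, not AiiDA code; it assumes links arrive as (label, pk, node) tuples and that a lower pk means an older node:

```python
def outputs_dict(links):
    """Build {'<label>_<pk>': node} entries, plus a bare '<label>' entry
    pointing at the oldest (lowest-pk) node carrying that label."""
    result = {}
    oldest = {}
    for label, pk, node in links:
        result['{}_{}'.format(label, pk)] = node
        if label not in oldest or pk < oldest[label]:
            oldest[label] = pk
            result[label] = node
    return result
```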
static get_schema()[source]
Every node property contains:
  • display_name: display name of the property
  • help text: short help text of the property
  • is_foreign_key: is the property foreign key to other type of the node
  • type: type of the property. e.g. str, dict, int
Returns:get schema of the node
classmethod get_subclass_from_pk(pk)[source]

Get a node object from the pk (the integer primary key used in this database), loading the proper subclass of Node where appropriate.

Parameters:pk – the pk (integer) of the object to be loaded.
Returns:the object of the proper subclass.
Raise:NotExistent: if there is no entry of the desired object kind with the given pk.
classmethod get_subclass_from_uuid(uuid)[source]

Get a node object from the uuid, with the proper subclass of Node. (if Node(uuid=…) is called, only the Node class is loaded).

Parameters:uuid – a string with the uuid of the object to be loaded.
Returns:the object of the proper subclass.
Raise:NotExistent: if there is no entry of the desired object kind with the given uuid.
get_user()[source]

Get the user.

Returns:a DbUser model object
has_children

Property to check whether children are attached to the node.

Returns:a boolean

has_parents

Property to check whether parents are attached to the node.

Returns:a boolean

id
Returns:the principal key (the ID) as an integer, or None if the node was not stored yet
inp

Traverse the graph of the database. Returns a database object linked to the current node via the given link name. Example: B = A.inp.parameters returns the object B, which has a link from B to A with link name ‘parameters’; C = A.inp returns an InputManager, an object meant to be accessed as in the previous example.

is_stored

Return True if the node is stored, False otherwise.

iterattrs()[source]

Iterator over the attributes, returning tuples (key, value)

iterextras()[source]

Iterator over the extras, returning tuples (key, value)

Todo:verify that I am not creating a list internally
label

Get the label of the node.

Returns:a string.
logger

Get the logger of the Node object.

Returns:Logger object
mtime

Return the modification time of the node.

out

Traverse the graph of the database. Returns a database object linked to the current node via the given link name. Example: B = A.out.results returns the object B, which has a link from A to B with link name ‘results’.

pk
Returns:the principal key (the ID) as an integer, or None if the node was not stored yet
classmethod query(*args, **kwargs)[source]

Map to the aiidaobjects manager of the DbNode, that returns Node objects (or their subclasses) instead of DbNode entities.

TODO (very important): the recognition of a subclass from the type does not work if the modules defining the subclasses are not put in subfolders. In the future, fix it either by making a cache and storing the full dependency tree, or by saving also the path.

querybuild(**kwargs)[source]

Instantiates and returns a QueryBuilder instance.

The QueryBuilder’s path has one vertex so far, namely this class. Additional parameters (e.g. filters or a label) can be passed as keyword arguments.

Parameters:
  • label – Label to give
  • filters – filters to apply
  • project – projections

This is a combomethod (see combomethod()), therefore it can be called as a class or instance method. If called as an instance method, it adds a filter on the id.

rehash()[source]

Re-generates the stored hash of the Node.

remove_path(path)[source]

Remove a file or directory from the repository directory. Can be called only before storing.

Parameters:path (str) – relative path to file/directory.
reset_extras(new_extras)[source]

Deletes existing extras and creates new ones.

Parameters:new_extras – dictionary with the new extras
Returns:nothing; an exception is raised in several circumstances

set(**kwargs)[source]

For each k=v pair passed as kwargs, call the corresponding set_k(v) method (e.g., calling self.set(property=5, mass=2) will call self.set_property(5) and self.set_mass(2)). Useful especially in the __init__.

Note:it uses the _set_incompatibilities list of the class to check that we are not setting methods that cannot be set at the same time. _set_incompatibilities must be a list of tuples, and each tuple specifies the elements that cannot be set at the same time. For instance, if _set_incompatibilities = [(‘property’, ‘mass’)], then the call self.set(property=5, mass=2) will raise a ValueError. If a tuple has more than two values, it raises ValueError if all keys are provided at the same time, but it does not give any error if at least one of the keys is not present.
Note:If one element of _set_incompatibilities is a tuple with only one element, this element will not be settable using this function.
Raises:ValueError – if the corresponding set_k method does not exist in self, or if the methods cannot be set at the same time.
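The dispatch and incompatibility check described above can be sketched in a few lines. This is an illustrative stand-in, not AiiDA’s implementation; the Settable class, its setters, and the values dict are assumptions used only to demonstrate the pattern:

```python
class Settable(object):
    """Sketch of the set(**kwargs) dispatch with an _set_incompatibilities check."""
    _set_incompatibilities = [('property', 'mass')]

    def __init__(self):
        self.values = {}

    def set_property(self, value):
        self.values['property'] = value

    def set_mass(self, value):
        self.values['mass'] = value

    def set(self, **kwargs):
        # Reject a combination only when ALL keys of an incompatible tuple
        # are provided at the same time, as described in the note above.
        for group in self._set_incompatibilities:
            if all(key in kwargs for key in group):
                raise ValueError('cannot set {} together'.format(', '.join(group)))
        for key, value in kwargs.items():
            setter = getattr(self, 'set_' + key, None)
            if setter is None:
                raise ValueError('no method set_{} exists'.format(key))
            setter(value)
```

Note how a single-element tuple in _set_incompatibilities would make that key unsettable through set(), matching the note above.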
set_computer(computer)[source]

Set the computer to be used by the node.

Note that the computer makes sense only for some nodes: Calculation, RemoteData, …

Parameters:computer – the computer object
set_extra(key, value, exclusive=False)[source]

Sets an extra of a calculation directly in the DB; no .store() call is needed. Can be used only after the node has been stored.

Parameters:
  • key – key name
  • value – key value
  • exclusive – (default=False). If exclusive is True, it raises a UniquenessError if an extra with the same name already exists in the DB (useful e.g. to “lock” a node and avoid running the same computation on it multiple times).
Raises:

UniquenessError – if extra already exists and exclusive is True.
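The exclusive-mode “locking” use case described above can be sketched as follows. This is illustrative only (not AiiDA code); RuntimeError stands in for AiiDA’s UniquenessError, and the ExtrasStore class is an assumption:

```python
class ExtrasStore(object):
    """Sketch of the exclusive-extra behavior: with exclusive=True, setting
    an already-present key raises instead of overwriting."""

    def __init__(self):
        self._extras = {}

    def set_extra(self, key, value, exclusive=False):
        if exclusive and key in self._extras:
            # Stand-in for AiiDA's UniquenessError.
            raise RuntimeError('extra {!r} already exists'.format(key))
        self._extras[key] = value
```

The first caller to set an exclusive extra wins; concurrent workers attempting the same key fail fast, which is what makes it usable as a lock.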

set_extra_exclusive(key, value)[source]

Set an extra in exclusive mode (raises if the extra is already there). Deprecated: use set_extra() with exclusive=True

Parameters:
  • key – key name
  • value – key value
set_extras(the_dict)[source]

Immediately sets several extras of a calculation directly in the DB! No .store() call is needed. Can be used only after the node has been stored.

Parameters:the_dict – a dictionary of key:value to be set as extras
store(with_transaction=True, use_cache=None)[source]

Store a new node in the DB, also saving its repository directory and attributes.

After store() is called, attributes cannot be changed anymore! Extras, instead, can be changed only AFTER calling this store() function.

Note:After successful storage, those links that are in the cache, and for which also the parent node is already stored, will be automatically stored. The others will remain unstored.
Parameters:with_transaction – if False, no transaction is used. This is meant to be used ONLY if the outer calling function has already a transaction open!
store_all(with_transaction=True, use_cache=None)[source]

Store the node, together with all input links, if cached, and also the linked nodes, if they were not stored yet.

Parameters:with_transaction – if False, no transaction is used. This is meant to be used ONLY if the outer calling function has already a transaction open!
type

Get the type of the node.

Returns:a string.
uuid
Returns:a string with the uuid
class aiida.orm.implementation.general.node.AttributeManager(node)[source]

Bases: object

An object used internally to return the attributes as a dictionary.

Note:Important! It cannot be used to change variables, just to read them. To change values (of unstored nodes), use the proper Node methods.
__dict__ = dict_proxy({'__module__': 'aiida.orm.implementation.general.node', '__getitem__': <function __getitem__>, '__getattr__': <function __getattr__>, '__iter__': <function __iter__>, '__dir__': <function __dir__>, '__dict__': <attribute '__dict__' of 'AttributeManager' objects>, '_get_dict': <function _get_dict>, '__weakref__': <attribute '__weakref__' of 'AttributeManager' objects>, '__doc__': '\n An object used internally to return the attributes as a dictionary.\n\n :note: Important! It cannot be used to change variables, just to read\n them. To change values (of unstored nodes), use the proper Node methods.\n ', '__init__': <function __init__>})
__dir__()[source]

Allows listing the keys of the dictionary

__getattr__(name)[source]

Interface to get to dictionary values, using the key as an attribute.

Note:it works only for attributes that only contain letters, numbers and underscores, and do not start with a number.
Parameters:name – name of the key whose value is required.
__getitem__(name)[source]

Interface to get to dictionary values as a dictionary.

Parameters:name – name of the key whose value is required.
__init__(node)[source]
Parameters:node – the node object.
__iter__()[source]

Return the keys as an iterator

__module__ = 'aiida.orm.implementation.general.node'
__weakref__

list of weak references to the object (if defined)

_get_dict()[source]

Return the internal dictionary

class aiida.orm.implementation.general.node.NodeInputManager(node)[source]

Bases: object

To document

__dict__ = dict_proxy({'__dict__': <attribute '__dict__' of 'NodeInputManager' objects>, '__module__': 'aiida.orm.implementation.general.node', '__init__': <function __init__>, '__getitem__': <function __getitem__>, '__dir__': <function __dir__>, '__weakref__': <attribute '__weakref__' of 'NodeInputManager' objects>, '__iter__': <function __iter__>, '__getattr__': <function __getattr__>, '__doc__': '\n To document\n '})
__dir__()[source]

Allows listing all valid input links

__getattr__(name)[source]
Parameters:name – name of the attribute to be asked to the parser results.
__getitem__(name)[source]

Interface to get the parser results as a dictionary.

Parameters:name – name of the attribute to be asked to the parser results.
__init__(node)[source]
Parameters:node – the node object.
__iter__()[source]
__module__ = 'aiida.orm.implementation.general.node'
__weakref__

list of weak references to the object (if defined)

class aiida.orm.implementation.general.node.NodeOutputManager(node)[source]

Bases: object

To document

__dict__ = dict_proxy({'__dict__': <attribute '__dict__' of 'NodeOutputManager' objects>, '__module__': 'aiida.orm.implementation.general.node', '__init__': <function __init__>, '__getitem__': <function __getitem__>, '__dir__': <function __dir__>, '__weakref__': <attribute '__weakref__' of 'NodeOutputManager' objects>, '__iter__': <function __iter__>, '__getattr__': <function __getattr__>, '__doc__': '\n To document\n '})
__dir__()[source]

Allows listing all valid output links

__getattr__(name)[source]
Parameters:name – name of the attribute to be asked to the parser results.
__getitem__(name)[source]

Interface to get the parser results as a dictionary.

Parameters:name – name of the attribute to be asked to the parser results.
__init__(node)[source]
Parameters:node – the node object.
__iter__()[source]
__module__ = 'aiida.orm.implementation.general.node'
__weakref__

list of weak references to the object (if defined)

aiida.orm.implementation.general.node.clean_value(value)[source]

Get value from input and (recursively) replace, if needed, all occurrences of BaseType AiiDA data nodes with their value, and List with a standard list.

It also makes a deep copy of everything.

Note however that there is no logic to avoid infinite loops when the user passes some perverse recursive dictionary or list. In any case, however, this would not be storable by AiiDA…

Parameters:value – A value to be set as an attribute or an extra
Returns:a “cleaned” value, potentially identical to value, but with values replaced where needed.
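The recursive replacement described above can be sketched as follows. This is illustrative only, not AiiDA’s implementation; an object exposing a .value attribute stands in for a BaseType AiiDA data node, and tuples are normalized to lists:

```python
def clean_value(value):
    """Recursively replace value-wrapping objects by their .value, deep-copying
    dicts and lists along the way (sketch; no recursion-cycle protection,
    mirroring the note above)."""
    if hasattr(value, 'value'):          # stand-in for a BaseType node
        return clean_value(value.value)
    if isinstance(value, dict):
        return {k: clean_value(v) for k, v in value.items()}
    if isinstance(value, (list, tuple)):
        return [clean_value(v) for v in value]
    return value
```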
class aiida.orm.implementation.general.user.AbstractUser(**kwargs)[source]

Bases: object

An AiiDA ORM implementation of a user.

REQUIRED_FIELDS = ['first_name', 'last_name', 'institution']
__abstractmethods__ = frozenset(['last_name', 'is_active', 'is_staff', 'search_for_users', 'institution', '__init__', 'date_joined', 'first_name', 'is_superuser', 'id', 'force_save', '_set_password', 'last_login', 'pk', 'save', 'email', '_get_password'])
__dict__ = dict_proxy({'__module__': 'aiida.orm.implementation.general.user', 'last_name': <abc.abstractproperty object>, '__metaclass__': <class 'abc.ABCMeta'>, 'has_usable_password': <function has_usable_password>, '_logger': <logging.Logger object>, '_abc_negative_cache': <_weakrefset.WeakSet object>, 'is_active': <abc.abstractproperty object>, 'is_staff': <abc.abstractproperty object>, 'search_for_users': <aiida.common.utils.abstractclassmethod object>, '__dict__': <attribute '__dict__' of 'AbstractUser' objects>, 'get_all_users': <classmethod object>, 'password': <property object>, '__weakref__': <attribute '__weakref__' of 'AbstractUser' objects>, 'id': <abc.abstractproperty object>, '__init__': <function __init__>, 'date_joined': <abc.abstractproperty object>, '_abc_cache': <_weakrefset.WeakSet object>, '_abc_negative_cache_version': 39, 'first_name': <abc.abstractproperty object>, 'get_schema': <staticmethod object>, '__abstractmethods__': frozenset(['last_name', 'is_active', 'is_staff', 'search_for_users', 'institution', '__init__', 'date_joined', 'first_name', 'is_superuser', 'id', 'force_save', '_set_password', 'last_login', 'pk', 'save', 'email', '_get_password']), 'force_save': <function force_save>, 'REQUIRED_FIELDS': ['first_name', 'last_name', 'institution'], '__doc__': '\n An AiiDA ORM implementation of a user.\n ', 'institution': <abc.abstractproperty object>, 'is_superuser': <abc.abstractproperty object>, '_set_password': <function _set_password>, 'last_login': <abc.abstractproperty object>, 'pk': <abc.abstractproperty object>, 'save': <function save>, '_abc_registry': <_weakrefset.WeakSet object>, 'email': <abc.abstractproperty object>, '_get_password': <function _get_password>})
__init__(**kwargs)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.orm.implementation.general.user'
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>
_get_password()[source]
_logger = <logging.Logger object>
_set_password()[source]
date_joined
email
first_name
force_save()[source]
classmethod get_all_users()[source]
static get_schema()[source]
Every node property contains:
  • display_name: display name of the property
  • help text: short help text of the property
  • is_foreign_key: is the property foreign key to other type of the node
  • type: type of the property. e.g. str, dict, int
Returns:get schema of the user
has_usable_password()[source]
id
institution
is_active
is_staff
is_superuser
last_login
last_name
password
pk
save()[source]
classmethod search_for_users(**kwargs)[source]

Search for users with the passed keys.

Parameters:kwargs – The keys to search for the user with.
Returns:A list of users matching the search criteria.
class aiida.orm.implementation.general.user.Util[source]

Bases: object

__abstractmethods__ = frozenset(['delete_user'])
__dict__ = dict_proxy({'_abc_cache': <_weakrefset.WeakSet object>, '__module__': 'aiida.orm.implementation.general.user', '__metaclass__': <class 'abc.ABCMeta'>, '_abc_registry': <_weakrefset.WeakSet object>, '__abstractmethods__': frozenset(['delete_user']), 'delete_user': <function delete_user>, '_abc_negative_cache_version': 39, '_abc_negative_cache': <_weakrefset.WeakSet object>, '__dict__': <attribute '__dict__' of 'Util' objects>, '__weakref__': <attribute '__weakref__' of 'Util' objects>, '__doc__': None})
__metaclass__

alias of abc.ABCMeta

__module__ = 'aiida.orm.implementation.general.user'
__weakref__

list of weak references to the object (if defined)

_abc_cache = <_weakrefset.WeakSet object>
_abc_negative_cache = <_weakrefset.WeakSet object>
_abc_negative_cache_version = 39
_abc_registry = <_weakrefset.WeakSet object>
delete_user(pk)[source]

Delete the user with the given pk.

Parameters:pk – the user pk

aiida.orm.implementation.general.utils.get_db_columns(db_class)[source]

This function returns a dictionary where the keys are the columns of the table corresponding to the db_class and the values are the column properties such as type, is_foreign_key and if so, the related table and column.

The function applies only to the sqlalchemy backend and for other backends it requires a mapping sqlalchemy–>new_backend. For Django this mapping is the dummy_model.

A similar logic applies to the QueryBuilder that is entirely built upon SQLAlchemy (+ dummy model, for Django)

Parameters:db_class – the database model whose schema has to be returned
Returns:a dictionary
aiida.orm.implementation.general.utils.get_foreign_key_infos(foreign_key)[source]

Takes a SQLAlchemy ForeignKey object and returns the referent column name and the referred relation and column names.

Parameters:foreign_key – a SQLAlchemy ForeignKey object
Returns:a tuple of strings

class aiida.orm.implementation.general.workflow.AbstractWorkflow(**kwargs)[source]

Bases: object

Base class to represent a workflow. This is the superclass of any workflow implementations, and provides all the methods necessary to interact with the database.

The typical use case is workflows stored in the aiida.workflow packages, that are initiated either by the user in the shell or by some scripts, and that are monitored by the AiiDA daemon.

Workflows can have steps, and each step must contain some calculations to be executed. At the end of the step’s calculations the workflow is reloaded in memory and the next method is called.

__dict__ = dict_proxy({'attach_calculation': <function attach_calculation>, '_increment_version_number_db': <function _increment_version_number_db>, 'add_path': <function add_path>, '__str__': <function __str__>, 'kill': <function kill>, 'get_report': <function get_report>, '__dict__': <attribute '__dict__' of 'AbstractWorkflow' objects>, 'query': <aiida.common.utils.abstractclassmethod object>, '__weakref__': <attribute '__weakref__' of 'AbstractWorkflow' objects>, 'get_subclass_from_dbnode': <aiida.common.utils.abstractclassmethod object>, 'kill_step_calculations': <function kill_step_calculations>, 'uuid': <abc.abstractproperty object>, 'is_subworkflow': <function is_subworkflow>, 'has_step': <function has_step>, 'next': <function next>, 'current_folder': <property object>, '_update_db_description_field': <function _update_db_description_field>, 'repo_folder': <property object>, 'pk': <abc.abstractproperty object>, 'logger': <property object>, '__doc__': "\n Base class to represent a workflow. This is the superclass of any workflow implementations,\n and provides all the methods necessary to interact with the database.\n\n The typical use case are workflow stored in the aiida.workflow packages, that are initiated\n either by the user in the shell or by some scripts, and that are monitored by the aiida daemon.\n\n Workflow can have steps, and each step must contain some calculations to be executed. At the\n end of the step's calculations the workflow is reloaded in memory and the next methods is called.\n\n .. 
todo: verify if there are other places (beside label and description) where\n the _increment_version_number_db routine needs to be called to increase\n the nodeversion after storing\n ", 'store': <function store>, 'append_to_report': <function append_to_report>, 'get_parameters': <function get_parameters>, '_section_name': 'workflow', 'add_results': <function add_results>, 'has_failed': <function has_failed>, 'get_subclass_from_uuid': <aiida.common.utils.abstractclassmethod object>, 'get_step_calculations': <function get_step_calculations>, 'get_state': <function get_state>, 'get_steps': <function get_steps>, 'info': <function info>, '_path_subfolder_name': 'path', 'ctime': <abc.abstractproperty object>, 'add_result': <function add_result>, 'sleep': <function sleep>, 'dbworkflowinstance': <abc.abstractproperty object>, 'add_attribute': <function add_attribute>, '__module__': 'aiida.orm.implementation.general.workflow', 'set_state': <function set_state>, 'get_abs_path': <function get_abs_path>, 'is_new': <function is_new>, 'get_subclass_from_pk': <aiida.common.utils.abstractclassmethod object>, 'has_finished_ok': <function has_finished_ok>, 'remove_path': <function remove_path>, '__init__': <function __init__>, 'get_step': <function get_step>, 'label': <abc.abstractproperty object>, 'get_step_workflows': <function get_step_workflows>, 'exit': <function exit>, 'get_folder_list': <function get_folder_list>, '_get_dbworkflowinstance': <function _get_dbworkflowinstance>, 'get_results': <function get_results>, 'get_all_calcs': <function get_all_calcs>, 'description': <abc.abstractproperty object>, '_update_db_label_field': <function _update_db_label_field>, 'get_attribute': <function get_attribute>, 'is_running': <function is_running>, 'step': <classmethod object>, 'get_parameter': <function get_parameter>, 'add_attributes': <function add_attributes>, 'attach_workflow': <function attach_workflow>, 'get_result': <function get_result>, '_get_folder_pathsubfolder': 
<property object>, 'get_temp_folder': <function get_temp_folder>, 'clear_report': <function clear_report>, 'set_params': <function set_params>, '__repr__': <function __repr__>, 'get_attributes': <function get_attributes>})
__init__(**kwargs)[source]

Initializes the Workflow super class, store the instance in the DB and in case stores the starting parameters.

If initialized with a uuid, the Workflow is loaded from the DB; if not, a new workflow is generated and added to the DB following the stack frames. This means that only modules inside aiida.workflows are allowed to implement the workflow superclass and be stored. The caller names, modules and files are retrieved from the stack.

Parameters:
  • uuid – a string with the uuid of the object to be loaded.
  • params – a dictionary of storable objects to initialize the specific workflow
Raise:

NotExistent: if there is no entry of the desired workflow kind with the given uuid.

__module__ = 'aiida.orm.implementation.general.workflow'
__repr__() <==> repr(x)[source]
__str__() <==> str(x)[source]
__weakref__

list of weak references to the object (if defined)

_get_dbworkflowinstance()[source]
_get_folder_pathsubfolder

Get the subfolder in the repository.

Returns:a Folder object.
_increment_version_number_db()[source]

This function increments the version number in the DB. This should be called every time you need to increment the version (e.g. on adding an extra or attribute).

_path_subfolder_name = 'path'
_section_name = 'workflow'
_update_db_description_field(field_value)[source]

Safety method to store the description of the workflow

Returns:a string
_update_db_label_field(field_value)[source]

Safety method to store the label of the workflow

Returns:a string
add_attribute(_name, _value)[source]

Add one attribute to the Workflow. If another attribute is present with the same name it will be overwritten.

Parameters:
  • name – a string with the attribute name to store
  • value – a storable object to store

add_attributes(_params)[source]

Add a set of attributes to the Workflow. If another attribute is present with the same name it will be overwritten.

Parameters:_params – a dictionary of attribute name:value pairs to store

add_path(src_abs, dst_path)[source]

Copy a file or folder from a local path into the repository directory. If there is a subpath, folders will be created.

Copies to a cache directory if the entry has not been saved yet.

Parameters:
  • src_abs – the absolute path of the file to copy
  • dst_path – the (relative) destination path inside the repository

add_result(_name, _value)[source]

Add one result to the Workflow. If another result is present with the same name it will be overwritten.

Parameters:
  • name – a string with the result name to store
  • value – a storable object to store

add_results(_params)[source]

Add a set of results to the Workflow. If another result is present with the same name it will be overwritten.

Parameters:_params – a dictionary of result name:value pairs to store

append_to_report(text)[source]

Adds text to the Workflow report.

Note:Previously, if the workflow was a subworkflow of another Workflow, this method called the parent append_to_report method; this is no longer the case.
attach_calculation(calc)[source]

Adds a calculation to the caller step in the database. This is a lazy call: no calculations will be launched until the next method gets called. For a step to be completed, all the linked calculations have to be in the RETRIEVED state, after which the next method gets called from the workflow manager.

Parameters:calc – a JobCalculation object
Raises:AiidaException – in case the input is not of JobCalculation type

attach_workflow(sub_wf)[source]

Adds a workflow to the caller step in the database. This is a lazy call: no workflow will be started until the next method gets called. For a step to be completed, all the linked workflows have to be in the FINISHED state, after which the next method gets called from the workflow manager.

Parameters:sub_wf – a Workflow object

clear_report()[source]

Wipe the Workflow report. In case the workflow is a subworkflow of any other Workflow this method calls the parent clear_report method.

ctime

Get the creation time of the workflow

current_folder

Get the current repository folder, whether the temporary or the permanent.

Returns:the RepositoryFolder object.
dbworkflowinstance

Get the DbWorkflow object stored in the super class.

Returns:DbWorkflow object from the database
description

Get the description of the workflow.

Returns:a string
exit()[source]

This is the method to call in next to finish the Workflow. When exit is the next method, and no errors are found, the Workflow is set to FINISHED and removed from the execution manager duties.

get_abs_path(path, section=None)[source]

TODO: For the moment works only for one kind of files, ‘path’ (internal files)

get_all_calcs(calc_class=<class 'aiida.orm.implementation.django.calculation.job.JobCalculation'>, calc_state=None, depth=15)[source]

Get all calculations connected with this workflow and all its sub-workflows up to a given depth. The list of calculations can be restricted to a given calculation type and state.

Parameters:calc_class – the calculation class to which the calculations should belong (default: JobCalculation)

Parameters:
  • calc_state – a specific state to filter the calculations to retrieve
  • depth – the maximum depth level the recursion on sub-workflows will try to reach (0 means we stay at the step level and don’t go into sub-workflows, 1 means we go down to one step level of the sub-workflows, etc.)
Returns:

a list of JobCalculation objects
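The depth-limited recursion into sub-workflows described above can be sketched as follows. This is illustrative only, not AiiDA code; each workflow is modeled as a plain dict with 'calcs' and 'subworkflows' keys:

```python
def all_calcs(workflow, depth=15):
    """Collect this workflow's calculations plus those of its sub-workflows,
    recursing at most `depth` levels (depth=0 stays at the top level)."""
    found = list(workflow['calcs'])
    if depth > 0:
        for sub in workflow['subworkflows']:
            found.extend(all_calcs(sub, depth - 1))
    return found
```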

get_attribute(_name)[source]

Get one Workflow attribute.

Parameters:name – a string with the attribute name to retrieve
Returns:a dictionary of storable objects

get_attributes()[source]

Get the Workflow attributes.

Returns:a dictionary of storable objects

get_folder_list(subfolder='.')[source]

Get the list of files/directories in the repository of the object.

Parameters:subfolder – get the list of a subfolder
Returns:a list of strings.
get_parameter(_name)[source]

Get one Workflow parameter.

Parameters:name – a string with the parameter name to retrieve
Returns:a dictionary of storable objects

get_parameters()[source]

Get the Workflow parameters.

Returns:a dictionary of storable objects

get_report()[source]

Return the Workflow report.

Note:previously, if the workflow was a subworkflow of another Workflow, this method called the parent get_report method. This is not the case anymore.
Returns:a list of strings
get_result(_name)[source]

Get one Workflow result.

Parameters:name – a string with the result name to retrieve
Returns:a dictionary of storable objects

get_results()[source]

Get the Workflow results.

Returns:a dictionary of storable objects

get_state()[source]

Get the Workflow’s state.

Returns:a state from wf_states in aiida.common.datastructures

get_step(step_method)[source]

Retrieves by name a step from the Workflow.

Parameters:step_method – a string with the name of the step to retrieve, or a method
Raises:ObjectDoesNotExist – if there is no step with the specific name
Returns:a DbWorkflowStep object

get_step_calculations(step_method, calc_state=None)[source]

Retrieves all the calculations connected to a specific step in the database. If the step does not exist it returns None, which is useful for simpler grammar in the workflow definition.

Parameters:
  • step_method – a Workflow step (decorated) method
  • calc_state – a specific state to filter the calculations to retrieve
Returns:a list of JobCalculation objects

get_step_workflows(step_method)[source]

Retrieves all the workflows connected to a specific step in the database. If the step does not exist it returns None, which is useful for simpler grammar in the workflow definition.

Parameters:step_method – a Workflow step (decorated) method

get_steps(state=None)[source]

Retrieves all the steps of this Workflow, with the possibility to limit the list to steps in a specific state. :param state: a state from wf_states in aiida.common.datastructures :return: a list of DbWorkflowStep objects.

classmethod get_subclass_from_dbnode(wf_db)[source]

Loads the workflow object and reloads the python script in memory with the importlib library; the main class is searched for and then loaded. :param wf_db: a specific DbWorkflowNode object representing the Workflow :return: a Workflow subclass from the specific source code

classmethod get_subclass_from_pk(pk)[source]

Calls get_subclass_from_dbnode, selecting the DbWorkflowNode from the given pk. :param pk: a primary key index for the DbWorkflowNode :return: a Workflow subclass from the specific source code

classmethod get_subclass_from_uuid(uuid)[source]

Calls get_subclass_from_dbnode, selecting the DbWorkflowNode from the given uuid. :param uuid: a uuid for the DbWorkflowNode :return: a Workflow subclass from the specific source code

get_temp_folder()[source]

Get the folder of the Node in the temporary repository.

Returns:a SandboxFolder object mapping the node in the repository.
has_failed()[source]

Returns True if the Workflow’s state is ERROR

has_finished_ok()[source]

Returns True if the Workflow’s state is FINISHED

has_step(step_method)[source]

Return whether the Workflow has a step with a specific name. :param step_method: a string with the name of the step to retrieve, or a method

info()[source]

Returns a list with all the information about the module, file, and class needed to locate the workflow source code

is_new()[source]

Returns True if the Workflow’s state is CREATED

is_running()[source]

Returns True if the Workflow’s state is RUNNING

is_subworkflow()[source]

Return True if this is a subworkflow (i.e., if it has a parent), False otherwise.

kill(verbose=False)[source]

Stop the Workflow execution and change its state to FINISHED.

This method calls the kill method for each Calculation and each subworkflow linked to each RUNNING step.

Parameters:verbose – True to print the pk of each subworkflow killed
Raises:InvalidOperation – if some calculations cannot be killed (the workflow will be also put to SLEEP so that it can be killed later on)
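The recursive kill behaviour, with failures from subworkflows collected into an exception carrying an error_message_list (as WorkflowKillError does, documented further below), might be sketched like this. KillError and Node are hypothetical stand-ins, not AiiDA classes.

```python
# Hypothetical sketch: kill() recurses into children (calculations and
# subworkflows of RUNNING steps), collecting failure messages; if any
# child cannot be killed, an exception with error_message_list is raised.
class KillError(Exception):
    def __init__(self, msg, error_message_list=None):
        super().__init__(msg)
        self.error_message_list = error_message_list or []

class Node:
    def __init__(self, name, killable=True, children=()):
        self.name, self.killable, self.children = name, killable, list(children)

    def kill(self, verbose=False):
        errors = []
        for child in self.children:
            try:
                child.kill(verbose)
            except KillError as exc:
                errors.extend(exc.error_message_list)
        if not self.killable:
            errors.append("cannot kill %s" % self.name)
        if errors:
            # Propagate all collected messages to the caller at once.
            raise KillError("kill failed", errors)
        if verbose:
            print("killed", self.name)

root = Node("wf", children=[Node("calc1"), Node("sub", killable=False)])
try:
    root.kill()
except KillError as exc:
    print(exc.error_message_list)  # ['cannot kill sub']
```

Collecting messages instead of aborting on the first failure lets the caller see every subworkflow that resisted killing, which matches the error_message_list semantics described for WorkflowKillError.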
kill_step_calculations(step)[source]

Calls the kill method for each Calculation linked to the step method passed as argument. :param step: a Workflow step (decorated) method

label

Get the label of the workflow.

Returns:a string
logger

Get the logger of the Workflow object, so that it also logs to the DB.

Returns:LoggerAdapter object, that works like a logger, but also has the ‘extra’ embedded
next(next_method)[source]

Adds a new step to be called after the completion of the caller method’s calculations and subworkflows.

This method must be called inside a Workflow step, otherwise an error is thrown. The code finds the caller method and stores the input next_method in the database as the next method to be called. At this point no execution is made, only configuration updates in the database.

If during the execution of the caller method the user launched calculations or subworkflows, this method will add them to the database, making them available to the workflow manager to be launched. In fact, all calculation and subworkflow submissions are lazy, and are actually executed by this call.

Parameters:next_method – a Workflow step method to execute after the caller method
Raise:AiidaException: in case the caller method cannot be found or validated
Returns:the wrapped method, decorated with the correct step name
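The "configuration only, no execution" semantics of next() can be illustrated with a minimal sketch. The _next_steps dictionary is a hypothetical stand-in for the database record; the real method discovers the caller from the call stack rather than taking it as an argument.

```python
# Hypothetical sketch: next() only records which step follows the current
# one; nothing is executed at registration time.
class Workflow:
    def __init__(self):
        self._next_steps = {}  # stand-in for the database state

    def next(self, current_step, next_method):
        # Pure configuration update: store the name of the next step.
        self._next_steps[current_step] = next_method.__name__

    def step_one(self):
        # ... launch calculations here (lazily registered) ...
        self.next("step_one", self.step_two)

    def step_two(self):
        pass

wf = Workflow()
wf.step_one()
print(wf._next_steps)  # {'step_one': 'step_two'}
```

The workflow manager later reads this stored configuration to decide which step to execute once the current step's calculations and subworkflows have completed.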
pk

Returns the DbWorkflow pk

classmethod query(*args, **kwargs)[source]

Map to the aiidaobjects manager of the DbWorkflow, that returns Workflow objects instead of DbWorkflow entities.

remove_path(path)[source]

Remove a file or directory from the repository directory.

Can be called only before storing.

repo_folder

Get the permanent repository folder. Prefer using the current_folder method.

Returns:the permanent RepositoryFolder object
set_params(params, force=False)[source]

Adds parameters to the Workflow that are both stored and used every time the workflow engine re-initializes the specific workflow to launch the new methods.

set_state(state)[source]

Set the Workflow’s state :param state: a state from wf_states in aiida.common.datastructures

sleep()[source]

Changes the workflow state to SLEEP; this can only be called from a Workflow step-decorated method.

classmethod step(fun)[source]

This method is used as a decorator for workflow steps, and handles the method’s execution, the state updates and any errors.

The decorator generates a wrapper around the input function to execute, adding the correct step name and a utility variable that makes it distinguishable from non-step methods.

When a step is launched, the wrapper tries to run the function; in case of error, the state of the workflow is moved to ERROR and the traceback is stored in the report. In general the input method is a step obtained from the Workflow object, and the decorator simply handles a controlled execution of the step, allowing the code not to break in case of an error in the step’s source code.

The wrapper also ensures that the same step is not run twice, unless the Workflow is in the ERROR state; in that case all the calculations and subworkflows of the step are killed and a new execution is allowed.

Parameters:fun – a method to wrap, making it a Workflow step
Raise:AiidaException: in case the workflow state doesn’t allow the execution
Returns:the wrapped method
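The decorator pattern just described (tag the method as a step, guard against re-execution except from ERROR, convert exceptions into the ERROR state with the traceback kept in the report) can be sketched in a self-contained way. This is an illustration of the pattern, not AiiDA's actual source.

```python
import functools
import traceback

# Hypothetical sketch of a step decorator: the attribute names
# (is_wf_step, executed_steps, report) are illustrative only.
def step(fun):
    @functools.wraps(fun)
    def wrapper(self, *args, **kwargs):
        if fun.__name__ in self.executed_steps and self.state != "ERROR":
            raise RuntimeError("step %s already executed" % fun.__name__)
        self.executed_steps.add(fun.__name__)
        try:
            return fun(self, *args, **kwargs)
        except Exception:
            # Controlled failure: record the traceback, flip state to ERROR.
            self.state = "ERROR"
            self.report.append(traceback.format_exc())
    wrapper.is_wf_step = True  # distinguishes steps from plain methods
    return wrapper

class Workflow:
    def __init__(self):
        self.state, self.report, self.executed_steps = "RUNNING", [], set()

    @step
    def start(self):
        raise ValueError("boom")

wf = Workflow()
wf.start()           # the exception is caught by the wrapper
print(wf.state)      # ERROR
print(len(wf.report))  # 1
```

Because the re-execution guard is skipped when the state is ERROR, a failed step can be retried after the problem is fixed, matching the restart behaviour described above.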
store()[source]

Stores the DbWorkflow object data in the database

uuid

Returns the DbWorkflow uuid

exception aiida.orm.implementation.general.workflow.WorkflowKillError(*args, **kwargs)[source]

Bases: aiida.common.exceptions.AiidaException

An exception raised when a workflow failed to be killed. The error_message_list attribute contains the error messages from all the subworkflows.

__init__(*args, **kwargs)[source]

x.__init__(…) initializes x; see help(type(x)) for signature

__module__ = 'aiida.orm.implementation.general.workflow'
exception aiida.orm.implementation.general.workflow.WorkflowUnkillable[source]

Bases: aiida.common.exceptions.AiidaException

Raised when a workflow cannot be killed because it is in the FINISHED or ERROR state.

__module__ = 'aiida.orm.implementation.general.workflow'
aiida.orm.implementation.general.workflow.get_workflow_info(w, tab_size=2, short=False, pre_string='', depth=16)[source]

Return a string with all the information regarding the given workflow and all its calculations and subworkflows. This is a recursive function (to print all subworkflows info as well).

Parameters:
  • w – a DbWorkflow instance
  • tab_size – number of spaces to use for the indentation
  • short – if True, provide a shorter output (only total number of calculations, rather than the state of each calculation)
  • pre_string – string appended at the beginning of each line
  • depth – the maximum depth level the recursion on sub-workflows will try to reach (0 means we stay at the step level and don’t go into sub-workflows, 1 means we go down to one step level of the sub-workflows, etc.)
Return lines:

list of lines to be output
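The recursion over subworkflows, with tab_size indentation, a pre_string prefix, and the depth cut-off, might look like the following sketch. WfInfo is a hypothetical stand-in for DbWorkflow, and the calculation reporting is omitted.

```python
# Hypothetical sketch of get_workflow_info's recursion: indent each
# sub-workflow by tab_size spaces and stop descending at depth 0.
class WfInfo:
    def __init__(self, label, subworkflows=()):
        self.label, self.subworkflows = label, list(subworkflows)

def get_workflow_info(w, tab_size=2, pre_string='', depth=16):
    lines = [pre_string + w.label]
    if depth > 0:  # depth 0: stay at this level, skip sub-workflows
        for sub in w.subworkflows:
            lines.extend(get_workflow_info(
                sub, tab_size, pre_string + ' ' * tab_size, depth - 1))
    return lines

wf = WfInfo("main", [WfInfo("sub1", [WfInfo("subsub")]), WfInfo("sub2")])
print('\n'.join(get_workflow_info(wf)))
```

Passing pre_string down with tab_size extra spaces at each level is what produces the nested indentation, and decrementing depth on each recursive call enforces the maximum recursion depth described above.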

aiida.orm.implementation.general.workflow.kill_all()[source]

Kills all the workflows not in the FINISHED state by running the kill_from_uuid method in a loop.

aiida.orm.implementation.general.workflow.kill_from_pk()[source]

Kills a workflow from its pk.

Parameters:pk – the pk of the workflow to kill