QuantumProvenanceData¶
- class lsst.daf.butler.QuantumProvenanceData(*, predicted_inputs: set[uuid.UUID], available_inputs: set[uuid.UUID], actual_inputs: set[uuid.UUID], predicted_outputs: set[uuid.UUID], actual_outputs: set[uuid.UUID], datastore_records: dict[str, lsst.daf.butler.datastore.record_data.SerializedDatastoreRecordData])¶
Bases:
_BaseModelCompat
A serializable struct for per-quantum provenance information and datastore records.
Notes
This class slightly duplicates information from the Quantum class itself (the predicted_inputs and predicted_outputs sets should have the same IDs present in Quantum.inputs and Quantum.outputs), but overall it assumes the original Quantum is also available to reconstruct the complete provenance (e.g. by associating dataset IDs with data IDs, dataset types, and RUN names).
Note that the pydantic method parse_raw() is not going to work correctly for this class; use the direct method instead.
Attributes Summary
Methods Summary
collect_and_transfer(butler, quanta, provenance) — Transfer output datasets from multiple quanta to a more permanent Butler repository.
construct([_fields_set]) — Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
copy(*[, include, exclude, update, deep]) — Duplicate a model, optionally choosing which fields to include, exclude, and change.
dict(*[, include, exclude, by_alias, ...]) — Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
direct(*, predicted_inputs, ...) — Construct an instance directly without validators.
from_orm(obj)
json(*[, include, exclude, by_alias, ...]) — Generate a JSON representation of the model; include and exclude arguments as per dict().
model_construct([_fields_set])
model_dump(*[, mode, include, exclude, ...])
model_dump_json(*[, indent, include, ...])
model_rebuild(*[, force, raise_errors, ...])
model_validate(obj, *[, strict, ...])
model_validate_json(json_data, *[, strict, ...])
parse_file(path, *[, content_type, ...])
parse_obj(obj)
parse_raw(*args, **kwargs)
schema([by_alias, ref_template])
schema_json(*[, by_alias, ref_template])
update_forward_refs(**localns) — Try to update ForwardRefs on fields based on this Model, globalns and localns.
validate(value)
Attributes Documentation
- model_fields = {'actual_inputs': ModelField(name='actual_inputs', type=Set[UUID], required=True), 'actual_outputs': ModelField(name='actual_outputs', type=Set[UUID], required=True), 'available_inputs': ModelField(name='available_inputs', type=Set[UUID], required=True), 'datastore_records': ModelField(name='datastore_records', type=Mapping[str, SerializedDatastoreRecordData], required=True), 'predicted_inputs': ModelField(name='predicted_inputs', type=Set[UUID], required=True), 'predicted_outputs': ModelField(name='predicted_outputs', type=Set[UUID], required=True)}¶
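The six required fields are five sets of dataset UUIDs plus a mapping of serialized datastore records. The input sets describe progressively narrower subsets of the quantum's inputs, which one would expect to nest (actual within available within predicted). This is a minimal stdlib sketch of that relationship, using plain Python sets as stand-ins for the model's fields rather than real butler data:

```python
import uuid

# Illustrative stand-ins for the model's UUID-set fields.
predicted_inputs = {uuid.UUID(int=n) for n in (1, 2, 3)}   # everything the graph predicted
available_inputs = {uuid.UUID(int=n) for n in (1, 2)}      # the subset actually found
actual_inputs = {uuid.UUID(int=1)}                         # the subset actually read

# The expected containment between the three input sets:
assert actual_inputs <= available_inputs <= predicted_inputs
```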
Methods Documentation
- static collect_and_transfer(butler: Butler, quanta: Iterable[Quantum], provenance: Iterable[QuantumProvenanceData]) None ¶
Transfer output datasets from multiple quanta to a more permanent Butler repository.
- Parameters:
- butler : Butler
Full butler representing the data repository to transfer datasets to.
- quanta : Iterable[Quantum]
Iterable of Quantum objects that carry information about predicted outputs. May be a single-pass iterator.
- provenance : Iterable[QuantumProvenanceData]
Provenance and datastore data for each of the given quanta, in the same order. May be a single-pass iterator.
Notes
Input-output provenance data is not actually transferred yet, because Registry has no place to store it.
This method probably works most efficiently if run on all quanta for a single task label at once, because this will gather all datasets of a particular type together into a single vectorized Registry import. It should still behave correctly if run on smaller groups of quanta or even quanta from multiple tasks.
Currently this method transfers datastore record data unchanged, with no possibility of actually moving (e.g.) files. Datastores that are present only in execution or only in the more permanent butler are ignored.
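Since the notes above suggest calling this on all quanta of one task label at once, a caller holding mixed quanta might pre-group them by label first. The sketch below shows that grouping step with hypothetical placeholder tuples standing in for (label, quantum, provenance) triples; only the final commented-out call touches the real API:

```python
from itertools import groupby

# Hypothetical (task label, quantum, provenance) triples.
records = [
    ("isr", "quantum1", "prov1"),
    ("calibrate", "quantum3", "prov3"),
    ("isr", "quantum2", "prov2"),
]
records.sort(key=lambda r: r[0])  # groupby requires sorted input
groups = {
    label: [(q, p) for _, q, p in grp]
    for label, grp in groupby(records, key=lambda r: r[0])
}
# Each group could then feed one vectorized transfer:
#   quanta = [q for q, _ in groups[label]]
#   provenance = [p for _, p in groups[label]]
#   QuantumProvenanceData.collect_and_transfer(butler, quanta, provenance)
```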
- classmethod construct(_fields_set: SetStr | None = None, **values: Any) Model ¶
Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
- copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) Model ¶
Duplicate a model, optionally choosing which fields to include, exclude, and change.
- Parameters:
include – fields to include in the new model
exclude – fields to exclude from the new model; as with values, this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model; you should trust this data
deep – set to True to make a deep copy of the model
- Returns:
new model instance
- dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny ¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- classmethod direct(*, predicted_inputs: Iterable[str | UUID], available_inputs: Iterable[str | UUID], actual_inputs: Iterable[str | UUID], predicted_outputs: Iterable[str | UUID], actual_outputs: Iterable[str | UUID], datastore_records: Mapping[str, Mapping]) QuantumProvenanceData ¶
Construct an instance directly without validators.
- Parameters:
- predicted_inputs : Iterable of str or uuid.UUID
The predicted inputs.
- available_inputs : Iterable of str or uuid.UUID
The available inputs.
- actual_inputs : Iterable of str or uuid.UUID
The actual inputs.
- predicted_outputs : Iterable of str or uuid.UUID
The predicted outputs.
- actual_outputs : Iterable of str or uuid.UUID
The actual outputs.
- datastore_records : Mapping[str, Mapping]
The datastore records.
- Returns:
- provenance : QuantumProvenanceData
Serializable model of the quantum provenance.
Notes
This differs from the Pydantic "construct" method in that the arguments are explicitly what the model requires, and it will recurse through members, constructing them from their corresponding direct methods.
This method should only be called when the inputs are trusted.
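Because each ID parameter accepts either str or uuid.UUID, a validator-free constructor has to normalize the mix itself. This is an illustrative helper showing that kind of coercion, not the actual implementation inside direct:

```python
import uuid

def to_uuid_set(values):
    # Coerce a mix of str and uuid.UUID values into a set of UUIDs,
    # the normalization a validator-free constructor must perform itself.
    return {v if isinstance(v, uuid.UUID) else uuid.UUID(v) for v in values}

mixed = ["00000000-0000-0000-0000-000000000001", uuid.UUID(int=2)]
ids = to_uuid_set(mixed)
```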
- json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode ¶
Generate a JSON representation of the model; include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(); other arguments as per json.dumps().
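Since parse_raw() is documented above as not working for this class, a caller round-tripping json() output has to convert UUID strings back explicitly (e.g. before handing them to direct). A stdlib sketch of that conversion, with a hand-built payload standing in for real json() output:

```python
import json
import uuid

# UUIDs appear as strings in JSON output; a round trip must convert
# them back explicitly rather than relying on parse_raw().
predicted = {uuid.UUID(int=1), uuid.UUID(int=2)}
payload = json.dumps(sorted(str(u) for u in predicted))
restored = {uuid.UUID(s) for s in json.loads(payload)}
assert restored == predicted
```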
- model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) dict[str, Any] ¶
- model_dump_json(*, indent: int | None = None, include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) str ¶
- classmethod model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: dict[str, Any] | None = None) bool | None ¶
- classmethod model_validate(obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None) Self ¶
- classmethod model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None) Self ¶
- classmethod parse_file(path: str | Path, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- classmethod parse_raw(*args: Any, **kwargs: Any) QuantumProvenanceData ¶
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny ¶
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode ¶