QuantumProvenanceData¶
- class lsst.daf.butler.QuantumProvenanceData(*, predicted_inputs: Set[UUID], available_inputs: Set[UUID], actual_inputs: Set[UUID], predicted_outputs: Set[UUID], actual_outputs: Set[UUID], datastore_records: Dict[str, SerializedDatastoreRecordData])¶
Bases: BaseModel
A serializable struct for per-quantum provenance information and datastore records.
Notes
This class slightly duplicates information from the Quantum class itself (the predicted_inputs and predicted_outputs sets should have the same IDs present in Quantum.inputs and Quantum.outputs), but overall it assumes the original Quantum is also available to reconstruct the complete provenance (e.g. by associating dataset IDs with data IDs, dataset types, and RUN names).
Note that the pydantic method parse_raw() is not going to work correctly for this class; use the direct() method instead.
Methods Summary
- collect_and_transfer(butler, quanta, provenance): Transfer output datasets from multiple quanta to a more permanent Butler repository.
- construct([_fields_set]): Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
- copy(*[, include, exclude, update, deep]): Duplicate a model, optionally choose which fields to include, exclude and change.
- dict(*[, include, exclude, by_alias, ...]): Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- direct(*, predicted_inputs, ...): Construct an instance directly without validators.
- from_orm(obj)
- json(*[, include, exclude, by_alias, ...]): Generate a JSON representation of the model; include and exclude arguments as per dict().
- parse_file(path, *[, content_type, ...])
- parse_obj(obj)
- parse_raw(*args, **kwargs)
- schema([by_alias, ref_template])
- schema_json(*[, by_alias, ref_template])
- update_forward_refs(**localns): Try to update ForwardRefs on fields based on this Model, globalns and localns.
- validate(value)
Methods Documentation
- static collect_and_transfer(butler: Butler, quanta: Iterable[Quantum], provenance: Iterable[QuantumProvenanceData]) None¶
Transfer output datasets from multiple quanta to a more permanent Butler repository.
- Parameters:
  - butler : Butler
    Full butler representing the data repository to transfer datasets to.
  - quanta : Iterable[Quantum]
    Iterable of Quantum objects that carry information about predicted outputs. May be a single-pass iterator.
  - provenance : Iterable[QuantumProvenanceData]
    Provenance and datastore data for each of the given quanta, in the same order. May be a single-pass iterator.
Notes
Input-output provenance data is not actually transferred yet, because Registry has no place to store it.
This method probably works most efficiently if run on all quanta for a single task label at once, because this will gather all datasets of a particular type together into a single vectorized Registry import. It should still behave correctly if run on smaller groups of quanta or even quanta from multiple tasks.
Currently this method transfers datastore record data unchanged, with no possibility of actually moving (e.g.) files. Datastores that are present only in execution or only in the more permanent butler are ignored.
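A minimal usage sketch follows; the repository path and the executed_quanta / provenance_records collections are placeholders for whatever the execution harness actually produced, not part of the API.

```python
from lsst.daf.butler import Butler, Quantum, QuantumProvenanceData

# Placeholders: in practice these come from the system that executed the quanta.
executed_quanta: list[Quantum] = []
provenance_records: list[QuantumProvenanceData] = []

# Full butler for the permanent repository (the path here is hypothetical).
butler = Butler("/repo/main", writeable=True)

# Both iterables may be single-pass, but must be in the same order.
QuantumProvenanceData.collect_and_transfer(
    butler,
    quanta=executed_quanta,
    provenance=provenance_records,
)
```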
- classmethod construct(_fields_set: SetStr | None = None, **values: Any) Model¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set since it adds all passed values.
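As an illustration only, a sketch of construct() with already-valid (empty) field values; nothing is validated, so correctness is the caller's responsibility.

```python
from lsst.daf.butler import QuantumProvenanceData

# No validators run here; the field values must already be valid.
prov = QuantumProvenanceData.construct(
    predicted_inputs=set(),
    available_inputs=set(),
    actual_inputs=set(),
    predicted_outputs=set(),
    actual_outputs=set(),
    datastore_records={},
)
```

For this class, the direct() method documented below is usually the better validation-skipping entry point, since it also recurses into nested members.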
- copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) Model¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters:
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data
deep – set to True to make a deep copy of the model
- Returns:
new model instance
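A small sketch of the inherited pydantic copy() behaviour, using an empty instance built purely for illustration.

```python
from lsst.daf.butler import QuantumProvenanceData

# Empty instance, purely for illustration.
prov = QuantumProvenanceData(
    predicted_inputs=set(), available_inputs=set(), actual_inputs=set(),
    predicted_outputs=set(), actual_outputs=set(), datastore_records={},
)

# deep=True copies nested members instead of sharing references.
snapshot = prov.copy(deep=True)

# update replaces field values without validation; they must already be valid.
rewritten = prov.copy(update={"actual_outputs": set()})
```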
- dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
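A short sketch of dict() on the same kind of illustrative empty instance; excluding datastore_records keeps the output to just the dataset ID sets.

```python
from lsst.daf.butler import QuantumProvenanceData

prov = QuantumProvenanceData(
    predicted_inputs=set(), available_inputs=set(), actual_inputs=set(),
    predicted_outputs=set(), actual_outputs=set(), datastore_records={},
)

# Plain-dict view, e.g. for logging, without the (potentially large) datastore records.
summary = prov.dict(exclude={"datastore_records"})
```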
- classmethod direct(*, predicted_inputs: Iterable[str | UUID], available_inputs: Iterable[str | UUID], actual_inputs: Iterable[str | UUID], predicted_outputs: Iterable[str | UUID], actual_outputs: Iterable[str | UUID], datastore_records: Mapping[str, Mapping]) QuantumProvenanceData¶
Construct an instance directly without validators.
This differs from the pydantic “construct” method in that the arguments are explicitly what the model requires, and it will recurse through members, constructing them from their corresponding direct() methods.
This method should only be called when the inputs are trusted.
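A minimal sketch of reading provenance back from JSON: because parse_raw() does not work correctly for this class, the JSON is decoded with the standard library and the resulting fields are passed to direct(). The dataset ID below is made up for illustration.

```python
import json
import uuid

from lsst.daf.butler import QuantumProvenanceData

dataset_id = str(uuid.uuid4())  # stand-in for a real dataset UUID
blob = json.dumps(
    {
        "predicted_inputs": [dataset_id],
        "available_inputs": [dataset_id],
        "actual_inputs": [dataset_id],
        "predicted_outputs": [],
        "actual_outputs": [],
        "datastore_records": {},
    }
)

# Decode with the standard json module, then build the model without validators.
prov = QuantumProvenanceData.direct(**json.loads(blob))
```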
- json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode¶
Generate a JSON representation of the model; include and exclude arguments as per dict().
encoder is an optional function to supply as default to json.dumps(); other arguments as per json.dumps().
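A short sketch of json() on an illustrative empty instance; include and exclude behave exactly as in dict().

```python
from lsst.daf.butler import QuantumProvenanceData

prov = QuantumProvenanceData(
    predicted_inputs=set(), available_inputs=set(), actual_inputs=set(),
    predicted_outputs=set(), actual_outputs=set(), datastore_records={},
)

# JSON payload without the datastore records.
payload = prov.json(exclude={"datastore_records"})
```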
- classmethod parse_file(path: str | Path, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model¶
- classmethod parse_raw(*args: Any, **kwargs: Any) QuantumProvenanceData¶
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny¶
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode¶
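For completeness, a sketch of dumping the model's JSON schema; extra keyword arguments such as indent are forwarded to json.dumps().

```python
from lsst.daf.butler import QuantumProvenanceData

# Pretty-printed JSON schema describing the provenance fields.
print(QuantumProvenanceData.schema_json(indent=2))
```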