SerializedQuantum
class lsst.daf.butler.SerializedQuantum(*, taskName: str | None = None, dataId: SerializedDataCoordinate | None = None, datasetTypeMapping: Mapping[str, SerializedDatasetType], initInputs: Mapping[str, tuple[lsst.daf.butler.core.datasets.ref.SerializedDatasetRef, list[int]]], inputs: Mapping[str, list[tuple[lsst.daf.butler.core.datasets.ref.SerializedDatasetRef, list[int]]]], outputs: Mapping[str, list[tuple[lsst.daf.butler.core.datasets.ref.SerializedDatasetRef, list[int]]]], dimensionRecords: dict[int, lsst.daf.butler.core.dimensions._records.SerializedDimensionRecord] | None = None, datastoreRecords: dict[str, lsst.daf.butler.core.datastoreRecordData.SerializedDatastoreRecordData] | None = None)
Bases: _BaseModelCompat

Simplified model of a Quantum suitable for serialization.

Attributes Summary

- model_fields

Methods Summary

- construct([_fields_set]): Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
- copy(*[, include, exclude, update, deep]): Duplicate a model, optionally choosing which fields to include, exclude, and change.
- dict(*[, include, exclude, by_alias, ...]): Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- direct(*, taskName, dataId, ...): Construct a SerializedQuantum directly without validators.
- from_orm(obj)
- json(*[, include, exclude, by_alias, ...]): Generate a JSON representation of the model; include and exclude arguments as per dict().
- model_construct([_fields_set])
- model_dump(*[, mode, include, exclude, ...])
- model_dump_json(*[, indent, include, ...])
- model_rebuild(*[, force, raise_errors, ...])
- model_validate(obj, *[, strict, ...])
- model_validate_json(json_data, *[, strict, ...])
- parse_file(path, *[, content_type, ...])
- parse_obj(obj)
- parse_raw(b, *[, content_type, encoding, ...])
- schema([by_alias, ref_template])
- schema_json(*[, by_alias, ref_template])
- update_forward_refs(**localns): Try to update ForwardRefs on fields based on this Model, globalns and localns.
- validate(value)

Attributes Documentation

model_fields = {
    'dataId': ModelField(name='dataId', type=Optional[SerializedDataCoordinate], required=False, default=None),
    'datasetTypeMapping': ModelField(name='datasetTypeMapping', type=Mapping[str, SerializedDatasetType], required=True),
    'datastoreRecords': ModelField(name='datastoreRecords', type=Optional[Mapping[str, SerializedDatastoreRecordData]], required=False, default=None),
    'dimensionRecords': ModelField(name='dimensionRecords', type=Optional[Mapping[int, SerializedDimensionRecord]], required=False, default=None),
    'initInputs': ModelField(name='initInputs', type=Mapping[str, tuple[lsst.daf.butler.core.datasets.ref.SerializedDatasetRef, list[int]]], required=True),
    'inputs': ModelField(name='inputs', type=Mapping[str, list[tuple[lsst.daf.butler.core.datasets.ref.SerializedDatasetRef, list[int]]]], required=True),
    'outputs': ModelField(name='outputs', type=Mapping[str, list[tuple[lsst.daf.butler.core.datasets.ref.SerializedDatasetRef, list[int]]]], required=True),
    'taskName': ModelField(name='taskName', type=Optional[str], required=False, default=None)
}
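Instances are normally built through ordinary pydantic validation. As a rough sketch, the snippet below creates a minimal instance; it assumes that empty mappings are acceptable for the four required fields, and the task name is a hypothetical placeholder:

```python
from lsst.daf.butler import SerializedQuantum

# Hypothetical minimal instance: the four required mappings are present but
# empty, and every optional field keeps its default of None.
quantum = SerializedQuantum(
    taskName="lsst.example.ExampleTask",  # placeholder task name
    datasetTypeMapping={},
    initInputs={},
    inputs={},
    outputs={},
)
print(quantum.taskName)
```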
Methods Documentation

classmethod construct(_fields_set: SetStr | None = None, **values: Any) → Model
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
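A minimal sketch of the validation-free path, continuing the hypothetical instance from above; the values are trusted here only because they were just produced by a validated model:

```python
# construct() performs no validation, so pass only trusted, pre-validated
# values. Note it also does not recurse into nested models (see direct()).
trusted = quantum.dict()
fast_copy = SerializedQuantum.construct(**trusted)
```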
copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) → Model
Duplicate a model, optionally choosing which fields to include, exclude, and change.

Parameters:
- include – fields to include in the new model
- exclude – fields to exclude from the new model; as with values, this takes precedence over include
- update – values to change or add in the new model. Note: the data is not validated before creating the new model, so you should trust this data
- deep – set to True to make a deep copy of the model

Returns:

new model instance
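A short sketch of both options, again using the hypothetical instance from above:

```python
# Override one field in the duplicate; update values are not re-validated,
# so the replacement taskName is trusted as-is.
renamed = quantum.copy(update={"taskName": "lsst.example.RenamedTask"})

# A deep copy also duplicates nested containers rather than sharing them.
independent = quantum.copy(deep=True)
```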
 
dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
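For example, continuing the sketch from above:

```python
# Drop fields that still hold their defaults, or keep only a named subset.
compact = quantum.dict(exclude_defaults=True)
subset = quantum.dict(include={"taskName", "outputs"})
```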
classmethod direct(*, taskName: str | None, dataId: dict | None, datasetTypeMapping: Mapping[str, dict], initInputs: Mapping[str, tuple[dict, list[int]]], inputs: Mapping[str, list[tuple[dict, list[int]]]], outputs: Mapping[str, list[tuple[dict, list[int]]]], dimensionRecords: dict[int, dict] | None, datastoreRecords: dict[str, dict] | None) → SerializedQuantum
Construct a SerializedQuantum directly without validators.

This differs from the pydantic "construct" method in that the arguments are explicitly what the model requires, and it will recurse through members, constructing them from their corresponding direct methods.

This method should only be called when the inputs are trusted.
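A sketch of the call shape, per the signature above: nested models are supplied as plain dicts, and every keyword is required, so empty or None placeholders must be passed explicitly. The task name is again a hypothetical placeholder:

```python
# direct() trusts its inputs as-is; with empty mappings there is nothing
# for it to recurse into, which keeps this placeholder example safe.
raw = SerializedQuantum.direct(
    taskName="lsst.example.ExampleTask",  # placeholder task name
    dataId=None,
    datasetTypeMapping={},
    initInputs={},
    inputs={},
    outputs={},
    dimensionRecords=None,
    datastoreRecords=None,
)
```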
json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode
Generate a JSON representation of the model; include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(); other arguments as per json.dumps().
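A small sketch, continuing the instance from above; the indent keyword is one of the **dumps_kwargs forwarded to json.dumps():

```python
import json

# Serialize to a JSON string and parse it back as a plain dict.
text = quantum.json(exclude_none=True, indent=2)
roundtrip = json.loads(text)
print(roundtrip["taskName"])
```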
model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) → dict[str, Any]

model_dump_json(*, indent: int | None = None, include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) → str

classmethod model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: dict[str, Any] | None = None) → bool | None

classmethod model_validate(obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None) → Self

classmethod model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None) → Self

classmethod parse_file(path: str | Path, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model

classmethod parse_raw(b: str | bytes, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model

classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny

classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode