SerializedDatasetRef¶
- class lsst.daf.butler.SerializedDatasetRef(*, id: UUID, datasetType: SerializedDatasetType | None = None, dataId: SerializedDataCoordinate | None = None, run: StrictStr | None = None, component: StrictStr | None = None)¶
Bases: _BaseModelCompat

Simplified model of a DatasetRef suitable for serialization.

Attributes Summary

model_fields

Methods Summary
construct([_fields_set])
    Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
copy(*[, include, exclude, update, deep])
    Duplicate a model, optionally choose which fields to include, exclude and change.
dict(*[, include, exclude, by_alias, ...])
    Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
direct(*, id, run[, datasetType, dataId, ...])
    Construct a SerializedDatasetRef directly without validators.
from_orm(obj)
json(*[, include, exclude, by_alias, ...])
    Generate a JSON representation of the model, include and exclude arguments as per dict().
model_construct([_fields_set])
model_dump_json(*[, indent, include, ...])
model_rebuild(*[, force, raise_errors, ...])
model_validate(obj, *[, strict, ...])
parse_file(path, *[, content_type, ...])
parse_obj(obj)
parse_raw(b, *[, content_type, encoding, ...])
schema([by_alias, ref_template])
schema_json(*[, by_alias, ref_template])
update_forward_refs(**localns)
    Try to update ForwardRefs on fields based on this Model, globalns and localns.
validate(value)

Attributes Documentation
- model_fields = {'component': ModelField(name='component', type=Optional[StrictStr], required=False, default=None), 'dataId': ModelField(name='dataId', type=Optional[SerializedDataCoordinate], required=False, default=None), 'datasetType': ModelField(name='datasetType', type=Optional[SerializedDatasetType], required=False, default=None), 'id': ModelField(name='id', type=UUID, required=True), 'run': ModelField(name='run', type=Optional[StrictStr], required=False, default=None)}¶
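These fields map one-to-one onto the model's constructor keywords. A minimal sketch of ordinary, validated construction, assuming only the required id plus a hypothetical run name (any cross-field validators on the real model are not exercised here):

    import uuid

    from lsst.daf.butler import SerializedDatasetRef

    # Ordinary pydantic construction with validation; only `id` is
    # required, the remaining fields default to None.
    ref = SerializedDatasetRef(id=uuid.uuid4(), run="u/example/run")
    print(ref.id, ref.run, ref.component)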
Methods Documentation
- classmethod construct(_fields_set: SetStr | None = None, **values: Any) Model ¶
Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set since it adds all passed values.
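A hedged sketch of the trade-off: construct skips validation entirely, so every value must already be of the correct type (a real UUID object, not a string). The UUID and run name below are placeholders.

    import uuid

    from lsst.daf.butler import SerializedDatasetRef

    # No validation is run, so pass already-correct values only.
    trusted = SerializedDatasetRef.construct(
        id=uuid.UUID("3f9f5e0e-9c2b-4d8a-9a2e-2f5d3c1b7a10"),  # placeholder UUID
        run="u/example/run",                                    # hypothetical run
    )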
- copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) Model ¶
Duplicate a model, optionally choose which fields to include, exclude and change.
- Parameters:
include – fields to include in new model
exclude – fields to exclude from new model, as with values this takes precedence over include
update – values to change/add in the new model. Note: the data is not validated before creating the new model: you should trust this data
deep – set to True to make a deep copy of the model
- Returns:
new model instance
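For illustration, a small sketch of copy with an update override; the run names are hypothetical and, as noted above, the overriding value is not re-validated.

    import uuid

    from lsst.daf.butler import SerializedDatasetRef

    ref = SerializedDatasetRef(id=uuid.uuid4(), run="u/example/run")

    # Duplicate the model, overriding one field without re-validation.
    moved = ref.copy(update={"run": "u/example/other_run"})
    assert moved.id == ref.id and moved.run == "u/example/other_run"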
- dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) DictStrAny ¶
Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
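A brief sketch using the keyword arguments shown in the signature; exclude_none keeps the output compact by dropping optional fields that are still None. The run name is hypothetical.

    import uuid

    from lsst.daf.butler import SerializedDatasetRef

    ref = SerializedDatasetRef(id=uuid.uuid4(), run="u/example/run")

    # Only the populated fields survive; datasetType, dataId and
    # component are dropped because they are still None.
    payload = ref.dict(exclude_none=True)
    # -> {'id': UUID('...'), 'run': 'u/example/run'}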
- classmethod direct(*, id: str, run: str, datasetType: dict[str, Any] | None = None, dataId: dict[str, Any] | None = None, component: str | None = None) SerializedDatasetRef ¶
Construct a SerializedDatasetRef directly without validators.

Notes

This differs from the pydantic “construct” method in that the arguments are explicitly what the model requires, and it will recurse through members, constructing them from their corresponding direct methods.

The id parameter is a string representation of the dataset ID; it is converted to a UUID by this method.

This method should only be called when the inputs are trusted.
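A sketch of trusted construction via direct, following the signature above; the UUID string and run name are placeholders.

    from lsst.daf.butler import SerializedDatasetRef

    # Validators are bypassed; `id` is given as a string and converted
    # to a UUID internally, so the inputs must be trusted.
    ref = SerializedDatasetRef.direct(
        id="3f9f5e0e-9c2b-4d8a-9a2e-2f5d3c1b7a10",  # placeholder dataset ID
        run="u/example/run",                        # hypothetical run name
    )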
- json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) unicode ¶
Generate a JSON representation of the model, include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(), other arguments as per json.dumps().
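A round-trip sketch under the same assumptions as the dict() example above; parse_raw rebuilds an equivalent model from the JSON text.

    import uuid

    from lsst.daf.butler import SerializedDatasetRef

    ref = SerializedDatasetRef(id=uuid.uuid4(), run="u/example/run")

    # Serialize to a JSON string, omitting fields that are still None...
    blob = ref.json(exclude_none=True)

    # ...then rebuild an equivalent model from that string.
    restored = SerializedDatasetRef.parse_raw(blob)
    assert restored == ref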
- model_dump_json(*, indent: int | None = None, include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) str ¶
- classmethod model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: dict[str, Any] | None = None) bool | None ¶
- classmethod model_validate(obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None) Self ¶
- classmethod parse_file(path: str | Path, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- classmethod parse_raw(b: str | bytes, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) Model ¶
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') DictStrAny ¶
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) unicode ¶
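The inherited schema helpers can be used to inspect the serialized form; a minimal sketch:

    from lsst.daf.butler import SerializedDatasetRef

    # JSON Schema describing the serialized representation.
    schema = SerializedDatasetRef.schema()
    print(sorted(schema["properties"]))  # field names, e.g. component, dataId, datasetType, id, run

    # Same schema rendered as a JSON string, pretty-printed.
    print(SerializedDatasetRef.schema_json(indent=2))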