SerializedDatasetType

- class lsst.daf.butler.SerializedDatasetType(*, name: StrictStr, storageClass: StrictStr | None = None, dimensions: SerializedDimensionGraph | list[pydantic.types.StrictStr] | None = None, parentStorageClass: StrictStr | None = None, isCalibration: StrictBool = False)
Bases: _BaseModelCompat

Simplified model of a DatasetType suitable for serialization.

Attributes Summary

- model_fields
Methods Summary

- construct([_fields_set]) – Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data.
- copy(*[, include, exclude, update, deep]) – Duplicate a model, optionally choosing which fields to include, exclude, or change.
- dict(*[, include, exclude, by_alias, ...]) – Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
- direct(*, name[, storageClass, dimensions, ...]) – Construct a SerializedDatasetType directly, without validators.
- from_orm(obj)
- json(*[, include, exclude, by_alias, ...]) – Generate a JSON representation of the model; include and exclude arguments as per dict().
- model_construct([_fields_set])
- model_dump(*[, mode, include, exclude, ...])
- model_dump_json(*[, indent, include, ...])
- model_rebuild(*[, force, raise_errors, ...])
- model_validate(obj, *[, strict, ...])
- model_validate_json(json_data, *[, strict, ...])
- parse_file(path, *[, content_type, ...])
- parse_obj(obj)
- parse_raw(b, *[, content_type, encoding, ...])
- schema([by_alias, ref_template])
- schema_json(*[, by_alias, ref_template])
- update_forward_refs(**localns) – Try to update ForwardRefs on fields based on this Model, globalns and localns.
- validate(value)

Attributes Documentation
- model_fields = {'dimensions': ModelField(name='dimensions', type=Union[SerializedDimensionGraph, list[pydantic.types.StrictStr], NoneType], required=False, default=None), 'isCalibration': ModelField(name='isCalibration', type=StrictBool, required=False, default=False), 'name': ModelField(name='name', type=StrictStr, required=True), 'parentStorageClass': ModelField(name='parentStorageClass', type=Optional[StrictStr], required=False, default=None), 'storageClass': ModelField(name='storageClass', type=Optional[StrictStr], required=False, default=None)}
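Only name is required; the other fields fall back to the defaults recorded above. A minimal sketch (the dataset type name is an arbitrary example string):

    from lsst.daf.butler import SerializedDatasetType

    # Only ``name`` is required; the remaining fields use their declared defaults.
    sdt = SerializedDatasetType(name="calexp")
    assert sdt.storageClass is None
    assert sdt.isCalibration is False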
Methods Documentation
- classmethod construct(_fields_set: SetStr | None = None, **values: Any) → Model

Creates a new model setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
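A minimal sketch, assuming the field values have already been validated elsewhere (the values shown are illustrative):

    from lsst.daf.butler import SerializedDatasetType

    # construct() skips validation, so the mapping must be trusted.
    values = {"name": "calexp", "storageClass": "ExposureF"}
    sdt = SerializedDatasetType.construct(**values)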
- copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) → Model

Duplicate a model, optionally choosing which fields to include, exclude, or change.

- Parameters:
  - include – fields to include in the new model
  - exclude – fields to exclude from the new model; as with values, this takes precedence over include
  - update – values to change or add in the new model. Note: the data is not validated before creating the new model; you should trust this data
  - deep – set to True to make a deep copy of the model
- Returns:
  new model instance
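A minimal sketch using the update argument to override one field (the values are illustrative):

    from lsst.daf.butler import SerializedDatasetType

    original = SerializedDatasetType(name="bias", storageClass="ExposureF")
    # ``update`` values are applied without validation, so they must be trusted.
    calib = original.copy(update={"isCalibration": True})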
- dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
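For example, exclude_none drops optional fields that were never set (values illustrative):

    from lsst.daf.butler import SerializedDatasetType

    # Optional fields left as None are omitted from the resulting dict.
    payload = SerializedDatasetType(name="calexp").dict(exclude_none=True)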
- classmethod direct(*, name: str, storageClass: str | None = None, dimensions: list | dict | None = None, parentStorageClass: str | None = None, isCalibration: bool = False) → SerializedDatasetType

Construct a SerializedDatasetType directly, without validators.

This differs from Pydantic's model_construct method in that the arguments are explicitly what the model requires, and it will recurse through members, constructing them from their corresponding direct methods.

This method should only be called when the inputs are trusted.

- Parameters:
  - name (str) – The name of the dataset type.
  - storageClass (str or None) – The name of the storage class.
  - dimensions (list or dict or None) – The dimensions associated with this dataset type.
  - parentStorageClass (str or None) – The parent storage class name if this is a component.
  - isCalibration (bool) – Whether this dataset type represents calibrations.
- Returns:
  SerializedDatasetType – A Pydantic model representing a dataset type.
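A minimal sketch; the dataset type name, storage class, and dimension names below are illustrative rather than taken from a real repository:

    from lsst.daf.butler import SerializedDatasetType

    # All inputs are assumed trusted; no validation is performed.
    sdt = SerializedDatasetType.direct(
        name="calexp",
        storageClass="ExposureF",
        dimensions=["instrument", "visit", "detector"],
        parentStorageClass=None,
        isCalibration=False,
    )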
- json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) → unicode

Generate a JSON representation of the model; include and exclude arguments as per dict(). encoder is an optional function to supply as default to json.dumps(); other arguments as per json.dumps().
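A minimal sketch (values illustrative):

    from lsst.daf.butler import SerializedDatasetType

    # Serialize to a JSON string, omitting optional fields that are None.
    json_str = SerializedDatasetType(name="calexp").json(exclude_none=True)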
- model_dump(*, mode: Literal['json', 'python'] | str = 'python', include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) → dict[str, Any]
- model_dump_json(*, indent: int | None = None, include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) → str
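These are the Pydantic v2-style counterparts of dict() and json() above. A minimal sketch (values illustrative):

    from lsst.daf.butler import SerializedDatasetType

    sdt = SerializedDatasetType(name="calexp")
    data = sdt.model_dump(exclude_none=True)  # dict, analogous to dict()
    text = sdt.model_dump_json(indent=2)      # JSON string, analogous to json()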
- classmethod model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: dict[str, Any] | None = None) → bool | None
- classmethod model_validate(obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None) → Self
- classmethod model_validate_json(json_data: str | bytes | bytearray, *, strict: bool | None = None, context: dict[str, Any] | None = None) → Self
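Both methods validate untrusted input into a model instance. A minimal sketch (payloads illustrative):

    from lsst.daf.butler import SerializedDatasetType

    # From a mapping and from a raw JSON string, respectively.
    sdt = SerializedDatasetType.model_validate({"name": "calexp"})
    sdt2 = SerializedDatasetType.model_validate_json('{"name": "flat", "isCalibration": true}')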
- classmethod parse_file(path: str | Path, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
- classmethod parse_raw(b: str | bytes, *, content_type: unicode = None, encoding: unicode = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
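A minimal sketch for parse_raw (the JSON payload is illustrative):

    from lsst.daf.butler import SerializedDatasetType

    # Parse a raw JSON payload into a validated model instance.
    sdt = SerializedDatasetType.parse_raw('{"name": "calexp", "storageClass": "ExposureF"}')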
- classmethod schema(by_alias: bool = True, ref_template: unicode = '#/definitions/{model}') → DictStrAny
- classmethod schema_json(*, by_alias: bool = True, ref_template: unicode = '#/definitions/{model}', **dumps_kwargs: Any) → unicode
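A minimal sketch; the generated JSON Schema can be used to document or check the serialized wire format:

    from lsst.daf.butler import SerializedDatasetType

    # JSON Schema as a dict and as a pretty-printed JSON string.
    schema_dict = SerializedDatasetType.schema()
    schema_text = SerializedDatasetType.schema_json(indent=2)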