MockDataset

class lsst.pipe.base.tests.mocks.MockDataset(*, ref: SerializedDatasetRef, quantum: MockDatasetQuantum | None = None, output_connection_name: str | None = None, converted_from: MockDataset | None = None, parent: MockDataset | None = None, parameters: dict[str, str] | None = None)

Bases: _BaseModelCompat

The in-memory dataset type used by MockStorageClass.
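
A minimal usage sketch (the helper below is hypothetical and uses only the fields documented on this page; it is not part of the class):

    from lsst.pipe.base.tests.mocks import MockDataset

    def describe_mock(mock: MockDataset) -> None:
        # Hypothetical helper: inspect a MockDataset produced by a mocked task run.
        print(mock.storage_class)            # storage class name recorded for this dataset
        print(mock.output_connection_name)   # task connection that produced it, if any
        if mock.converted_from is not None:
            print("converted from", mock.converted_from.storage_class)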

Attributes Summary

dataset_type

model_fields

storage_class

Methods Summary

construct([_fields_set])

Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data.

copy(*[, include, exclude, update, deep])

Duplicate a model, optionally choosing which fields to include, exclude, and change.

dict(*[, include, exclude, by_alias, ...])

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.

from_orm(obj)

json(*[, include, exclude, by_alias, ...])

Generate a JSON representation of the model; the include and exclude arguments behave as in dict().

make_derived(**kwargs)

Return a new MockDataset that represents applying some storage class operation to this one.

model_construct([_fields_set])

model_dump_json(*[, indent, include, ...])

model_rebuild(*[, force, raise_errors, ...])

model_validate(obj, *[, strict, ...])

parse_file(path, *[, content_type, ...])

parse_obj(obj)

parse_raw(b, *[, content_type, encoding, ...])

schema([by_alias, ref_template])

schema_json(*[, by_alias, ref_template])

update_forward_refs(**localns)

Try to update ForwardRefs on fields based on this Model, globalns and localns.

validate(value)

Attributes Documentation

dataset_type
model_fields = {
    'converted_from': ModelField(name='converted_from', type=Optional[MockDataset], required=False, default=None),
    'output_connection_name': ModelField(name='output_connection_name', type=Optional[str], required=False, default=None),
    'parameters': ModelField(name='parameters', type=Optional[Mapping[str, str]], required=False, default=None),
    'parent': ModelField(name='parent', type=Optional[MockDataset], required=False, default=None),
    'quantum': ModelField(name='quantum', type=Optional[MockDatasetQuantum], required=False, default=None),
    'ref': ModelField(name='ref', type=SerializedDatasetRef, required=True)
}
storage_class

Methods Documentation

classmethod construct(_fields_set: SetStr | None = None, **values: Any) → Model

Creates a new model, setting __dict__ and __fields_set__ from trusted or pre-validated data. Default values are respected, but no other validation is performed. Behaves as if Config.extra = 'allow' was set, since it adds all passed values.
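
For illustration, a sketch of the validation-skipping behaviour using a plain example model (MockDataset inherits this method from its pydantic base; the Point model is not part of this package):

    import pydantic

    class Point(pydantic.BaseModel):
        x: int
        y: int = 0

    p1 = Point(x="3")            # normal construction validates and coerces x to int
    p2 = Point.construct(x="3")  # construct() trusts the data: x stays a str
    print(type(p1.x), type(p2.x))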

copy(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, update: DictStrAny | None = None, deep: bool = False) → Model

Duplicate a model, optionally choosing which fields to include, exclude, and change.

Parameters:
  • include – fields to include in the new model

  • exclude – fields to exclude from the new model; this takes precedence over include

  • update – values to change/add in the new model. Note: the data is not validated before creating the new model; you should trust this data.

  • deep – set to True to make a deep copy of the model.

Returns:

new model instance
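
A sketch of the update behaviour with a plain example model (the same method MockDataset inherits from pydantic); note the caveat above that updated values are not re-validated:

    import pydantic

    class Point(pydantic.BaseModel):
        x: int
        y: int = 0

    p = Point(x=1, y=2)
    q = p.copy(update={"y": 5}, deep=True)  # independent duplicate with y replaced
    print(p.y, q.y)                         # 2 5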

dict(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False) → DictStrAny

Generate a dictionary representation of the model, optionally specifying which fields to include or exclude.
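
A short sketch of the filtering keywords with an example model (behaviour inherited from pydantic; the Point model is only illustrative):

    import pydantic
    from typing import Optional

    class Point(pydantic.BaseModel):
        x: int
        y: Optional[int] = None

    p = Point(x=1)
    print(p.dict())                   # {'x': 1, 'y': None}
    print(p.dict(exclude_none=True))  # {'x': 1}
    print(p.dict(include={"x"}))      # {'x': 1}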

classmethod from_orm(obj: Any) → Model

json(*, include: AbstractSetIntStr | MappingIntStrAny | None = None, exclude: AbstractSetIntStr | MappingIntStrAny | None = None, by_alias: bool = False, skip_defaults: bool | None = None, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, encoder: Callable[[Any], Any] | None = None, models_as_dict: bool = True, **dumps_kwargs: Any) → str

Generate a JSON representation of the model; the include and exclude arguments behave as in dict().

encoder is an optional function to supply as default to json.dumps(); other arguments are passed through to json.dumps().
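
A sketch of a JSON round trip with an example model (the same pydantic behaviour MockDataset inherits):

    import pydantic

    class Point(pydantic.BaseModel):
        x: int
        y: int = 0

    p = Point(x=1)
    s = p.json(exclude_defaults=True)  # only non-default fields are serialized
    restored = Point.parse_raw(s)      # round-trip back to a model instance
    assert restored == p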

make_derived(**kwargs: Any) → MockDataset

Return a new MockDataset that represents applying some storage class operation to this one.

Keyword arguments are fields of MockDataset or SerializedDatasetType to override in the result.
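
A hedged sketch of typical use (the helper and the storage class name below are hypothetical; only the make_derived call and the documented dataset_type attribute come from this page):

    from lsst.pipe.base.tests.mocks import MockDataset

    def as_dict_view(mock: MockDataset) -> MockDataset:
        # Hypothetical helper: derive a view of ``mock`` whose dataset type
        # records a different storage class name. "StructuredDataDict" is an
        # illustrative value, assumed here to override a SerializedDatasetType field.
        return mock.make_derived(storageClass="StructuredDataDict")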

classmethod model_construct(_fields_set: set[str] | None = None, **values: Any) → Self
model_dump_json(*, indent: int | None = None, include: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, exclude: set[int] | set[str] | dict[int, Any] | dict[str, Any] | None = None, by_alias: bool = False, exclude_unset: bool = False, exclude_defaults: bool = False, exclude_none: bool = False, round_trip: bool = False, warnings: bool = True) → str
classmethod model_rebuild(*, force: bool = False, raise_errors: bool = True, _parent_namespace_depth: int = 2, _types_namespace: dict[str, Any] | None = None) → bool | None
classmethod model_validate(obj: Any, *, strict: bool | None = None, from_attributes: bool | None = None, context: dict[str, Any] | None = None) → Self
classmethod parse_file(path: str | Path, *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
classmethod parse_obj(obj: Any) → Model
classmethod parse_raw(b: str | bytes, *, content_type: str = None, encoding: str = 'utf8', proto: Protocol = None, allow_pickle: bool = False) → Model
classmethod schema(by_alias: bool = True, ref_template: str = '#/definitions/{model}') → DictStrAny
classmethod schema_json(*, by_alias: bool = True, ref_template: str = '#/definitions/{model}', **dumps_kwargs: Any) → str
classmethod update_forward_refs(**localns: Any) → None

Try to update ForwardRefs on fields based on this Model, globalns and localns.
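
A minimal sketch: MockDataset has self-referential fields (converted_from, parent), so its forward references must be resolvable, and the call simply re-evaluates them against the current namespaces:

    from lsst.pipe.base.tests.mocks import MockDataset

    # Re-evaluate any unresolved forward references (e.g. the self-referential
    # Optional[MockDataset] fields) against globals and the given locals.
    MockDataset.update_forward_refs()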

classmethod validate(value: Any) → Model