Serialization#

lsst.images.serialization Package#

The OutputArchive and InputArchive classes, which abstract over different file formats, and various related utilities.

These archive interfaces are designed with two specific implementations in mind:

  • FITS augmented with a JSON block in a special BINTABLE HDU (see the fits module for details), inspired by the now-defunct ASDF-in-FITS concept.

  • ASDF (just hypothetical for now).

The base classes make some concessions to both FITS and ASDF in order to make the representations in those formats conform to their respective expectations.

For ASDF, this is simple: we use ASDF schemas whenever possible to represent primitive types, from units and times to multidimensional arrays. The archive interfaces use Pydantic, which maps to JSON rather than YAML, but by encoding YAML tag information in the JSON Schema (which Pydantic lets us customize), an ASDF archive implementation should be able to have Pydantic dump to a plain Python container tree and then convert that to tagged YAML by walking the tree alongside its schema.
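The schema-walking conversion described above can be sketched with plain dicts; the `yamlTag` schema key and the tagged-node wrapper are illustrative assumptions, not part of this package's API:

```python
# Hypothetical sketch: Pydantic can attach custom keys to the JSON Schema it
# emits; here we assume a "yamlTag" key carries the ASDF tag.  An ASDF
# archive implementation could walk the dumped tree alongside its schema and
# wrap tagged nodes for YAML emission.

def to_tagged_tree(node, schema):
    """Recursively pair tree nodes with any yamlTag found in the schema."""
    tag = schema.get("yamlTag")  # assumed custom schema key
    if isinstance(node, dict):
        props = schema.get("properties", {})
        node = {k: to_tagged_tree(v, props.get(k, {})) for k, v in node.items()}
    elif isinstance(node, list):
        node = [to_tagged_tree(v, schema.get("items", {})) for v in node]
    return {"_tag": tag, "value": node} if tag else node

# Example: a scalar quantity dumped by Pydantic, with its (abridged) schema.
schema = {
    "yamlTag": "tag:stsci.edu:asdf/unit/quantity-1.1.0",
    "properties": {"value": {}, "unit": {}},
}
tree = {"value": 2.5, "unit": "Jy"}
tagged = to_tagged_tree(tree, schema)
```

A real implementation would emit `tagged` through a YAML library that maps the `_tag` wrapper onto YAML node tags; this sketch only shows the tree walk.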

For FITS, the challenge is primarily to populate standard FITS header cards when writing, despite the fact that FITS headers are generally too limiting to be our preferred way of round-tripping any information. To do this, the archive interfaces accept update_header and strip_header callback arguments that are only called by FITS implementations.
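The callback pattern above might look like the following; the callback names come from the text, but the `write_with_callbacks` function and the use of a plain dict as a stand-in FITS header are assumptions for illustration:

```python
# Hypothetical sketch of the header-callback pattern.  A plain dict stands
# in for a FITS header; a real FITS OutputArchive would pass an actual
# header object, and non-FITS implementations would never invoke callbacks.

def no_header_updates(header):
    """Do not make any modifications to the given FITS header."""
    # Mirrors the no-op helper listed in the Functions section below.

def add_wcs_cards(header):
    # Assumed user-provided callback: populate standard FITS cards on write.
    header["CTYPE1"] = "RA---TAN"
    header["CTYPE2"] = "DEC--TAN"

def write_with_callbacks(update_header=no_header_updates):
    # A FITS implementation calls update_header on the header it is about
    # to write, letting callers add standard cards without making headers
    # the primary round-trip representation.
    header = {}
    update_header(header)
    return header
```

A `strip_header` callback would run in the opposite direction on read, removing cards that were only written for FITS compatibility.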

An implementation that writes HDF5 while embedding JSON should also be possible with these interfaces, but is not something we’ve designed around. A more natural HDF5 implementation might be possible by translating the JSON tree into a binary HDF5 hierarchy as well, but this would be considerably more effort at best.

Functions#

is_integer(t)

Test whether a NumberType corresponds to an integer type.

no_header_updates(header)

Do not make any modifications to the given FITS header.
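A plausible shape for `is_integer`, assuming `NumberType` enumerates dtype-like names; the member names and the name-prefix test here are illustrative stand-ins, not the package's actual definitions:

```python
import enum

class NumberType(enum.Enum):
    # Assumed members; the real enumeration in lsst.images.serialization
    # is not shown on this page.
    uint8 = "uint8"
    int32 = "int32"
    float64 = "float64"

def is_integer(t: NumberType) -> bool:
    """Test whether a NumberType corresponds to an integer type."""
    # Integer types are those whose dtype-style names start with "int"
    # or "uint" (assumed naming convention).
    return t.value.startswith(("int", "uint"))
```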

Classes#

ArchiveReadError

Exception raised when the contents of an archive cannot be read.

ArchiveTree(*, metadata, ...)

An intermediate base class of pydantic.BaseModel that should serve as the base for all objects that may be written to archives as top-level tree models.

ArrayReferenceModel(*, source, shape, datatype)

Model for the subset of the ASDF 'ndarray' schema, in the case where the array data is stored elsewhere.

ArrayReferenceQuantityModel(*, value, unit)

Model for a subset of the ASDF 'quantity' schema for external arrays.

ButlerInfo(*, dataset, provenance)

Information about a butler dataset.

ColumnDefinitionModel(*, name, datatype[, ...])

A model that describes a column in a table.

InlineArrayModel(*, data, datatype)

Model for the subset of the ASDF 'ndarray' schema, in the case where the array data is stored inline.

InlineArrayQuantityModel(*, value, unit)

Model for a subset of the ASDF 'quantity' schema for inline arrays.

InputArchive()

Abstract interface for reading from a file format.

NestedOutputArchive(root, parent)

A proxy output archive that joins a root path into all names before delegating back to its parent archive.

NumberType(value[, names, module, qualname, ...])

Enumeration of array value types supported by the library.

OpaqueArchiveMetadata(*args, **kwargs)

Interface for opaque archive metadata.

OutputArchive()

Abstract interface for writing to a file format.

QuantityModel(*, value, unit)

Model for a subset of the ASDF 'quantity' schema for scalars.

TableCellReferenceModel(*, source, column, row)

A model that acts as a pointer to data in a table cell.

TableReferenceModel(*, source, columns)

Placeholder for an ASDF-like model for referencing binary tabular data.

TimeModel(*, value, scale[, format])

Model for a subset of the ASDF 'time' schema.

UnsupportedTableError

Exception raised if a table object has column types or structure that are not supported by this library.

Variables#

IntegerType

Type alias.

MetadataValue

Type alias.

SignedIntegerType

Type alias.

UnsignedIntegerType

Type alias.