ChainedDatastore

class lsst.daf.butler.datastores.chainedDatastore.ChainedDatastore(config, registry=None)

Bases: lsst.daf.butler.Datastore

Chained datastores allowing reads from and writes to multiple datastores.

A ChainedDatastore is configured with multiple datastore configurations. A put() is always sent to each datastore. A get() operation is sent to each datastore in turn and the first datastore to return a valid dataset is used.

Parameters:
config : DatastoreConfig or str

Configuration. This configuration must include a datastores field as a sequence of datastore configurations. The order in this sequence indicates the order to use for read operations.
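A hedged sketch of what such a configuration might look like. The child datastore class paths and the `root` key here are illustrative assumptions, not copied from the shipped defaults:

```yaml
datastore:
  cls: lsst.daf.butler.datastores.chainedDatastore.ChainedDatastore
  # The "datastores" key (the containerKey) holds a sequence of child
  # datastore configurations; read order follows this sequence.
  datastores:
    - datastore:
        cls: lsst.daf.butler.datastores.inMemoryDatastore.InMemoryDatastore
    - datastore:
        cls: lsst.daf.butler.datastores.posixDatastore.PosixDatastore
        root: <butlerRoot>
```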

Attributes:
config : DatastoreConfig

Configuration used to create Datastore.

storageClassFactory : StorageClassFactory

Factory for creating storage class instances from name.

name : str

Label associated with this Datastore.

Attributes Summary

containerKey Key to specify where child datastores are configured.
defaultConfigFile Path to configuration defaults.

Methods Summary

exists(ref) Check if the dataset exists in one of the datastores.
get(ref[, parameters]) Load an InMemoryDataset from the store.
getUri(ref[, predict]) URI to the Dataset.
ingest(path, ref[, formatter, transfer]) Add an on-disk file with the given DatasetRef to the store, possibly transferring it.
put(inMemoryDataset, ref) Write a InMemoryDataset with a given DatasetRef to each datastore.
remove(ref) Indicate to the Datastore that a Dataset can be removed.
setConfigRoot(root, config, full) Set any filesystem-dependent config options for child Datastores to be appropriate for a new empty repository with the given root.
transfer(inputDatastore, ref) Retrieve a Dataset from an input Datastore, and store the result in this Datastore.

Attributes Documentation

containerKey = 'datastores'

Key to specify where child datastores are configured.

defaultConfigFile = 'datastores/chainedDatastore.yaml'

Path to configuration defaults. Relative to $DAF_BUTLER_DIR/config or absolute path. Can be None if no defaults specified.

Methods Documentation

exists(ref)

Check if the dataset exists in one of the datastores.

Parameters:
ref : DatasetRef

Reference to the required dataset.

Returns:
exists : bool

True if the entity exists in one of the child datastores.

get(ref, parameters=None)

Load an InMemoryDataset from the store.

The dataset is returned from the first datastore that has the dataset.

Parameters:
ref : DatasetRef

Reference to the required Dataset.

parameters : dict

StorageClass-specific parameters that specify, for example, a slice of the Dataset to be loaded.

Returns:
inMemoryDataset : object

Requested Dataset or slice thereof as an InMemoryDataset.

Raises:
FileNotFoundError

Requested dataset cannot be retrieved from any child datastore.

TypeError

Return value from formatter has unexpected type.

ValueError

Formatter failed to process the dataset.

getUri(ref, predict=False)

URI to the Dataset.

The returned URI comes from the first datastore in the list that has the dataset, with preference given to permanent datastores over ephemeral ones. If no datastore has the dataset and prediction is allowed, the predicted URI for the first datastore in the list is returned.

Parameters:
ref : DatasetRef

Reference to the required Dataset.

predict : bool

If True, allow URIs to be returned for datasets that have not yet been written.

Returns:
uri : str

URI string pointing to the Dataset within the datastore. If the Dataset does not exist in the datastore, and if predict is True, the URI will be a prediction and will include a URI fragment “#predicted”.

Raises:
FileNotFoundError

A URI has been requested for a dataset that does not exist and guessing is not allowed.

Notes

If the datastore does not have entities that relate well to the concept of a URI, the returned URI string will be descriptive. The returned URI is not guaranteed to be obtainable.

ingest(path, ref, formatter=None, transfer=None)

Add an on-disk file with the given DatasetRef to the store, possibly transferring it.

This method is forwarded to each of the chained datastores, trapping cases where a datastore has not implemented file ingest and ignoring them.

Parameters:
path : str

File path. Treated as relative to the repository root of each child datastore if not absolute.

ref : DatasetRef

Reference to the associated Dataset.

formatter : Formatter (optional)

Formatter that should be used to retrieve the Dataset. If not provided, the formatter will be constructed according to Datastore configuration.

transfer : str (optional)

If not None, must be one of ‘move’, ‘copy’, ‘hardlink’, or ‘symlink’ indicating how to transfer the file. The new filename and location will be determined via template substitution, as with put. If the file is outside the datastore root, it must be transferred somehow.

Raises:
NotImplementedError

If none of the chained datastores implements ingest, or if a transfer mode of None is specified.

Notes

If an absolute path is given and “move” mode is specified, then we tell the child datastores to use “copy” mode and unlink the original file at the end. If a relative path is given then it is assumed the file is already inside the child datastore.

A transfer mode of None implies that the file is already within each of the (relevant) child datastores.

put(inMemoryDataset, ref)

Write an InMemoryDataset with a given DatasetRef to each datastore.

The put() to child datastores can fail with DatasetTypeNotSupportedError. The put() for this datastore will be deemed to have succeeded so long as at least one child datastore accepted the inMemoryDataset.

Parameters:
inMemoryDataset : object

The Dataset to store.

ref : DatasetRef

Reference to the associated Dataset.

Raises:
TypeError

Supplied object and storage class are inconsistent.

DatasetTypeNotSupportedError

All datastores reported DatasetTypeNotSupportedError.

remove(ref)

Indicate to the Datastore that a Dataset can be removed.

The dataset will be removed from each datastore. The dataset is not required to exist in every child datastore.

Parameters:
ref : DatasetRef

Reference to the required Dataset.

Raises:
FileNotFoundError

Attempt to remove a dataset that does not exist. Raised if none of the child datastores removed the dataset.
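A minimal sketch of that rule, using plain dicts as stand-ins for child datastores (an assumed simplification):

```python
def chained_remove(children, ref):
    # Remove from every child that has the dataset; fail only when
    # no child held it at all.
    n_removed = 0
    for child in children:
        if ref in child:
            del child[ref]
            n_removed += 1
    if n_removed == 0:
        raise FileNotFoundError(f"{ref!r} not present in any child datastore")
```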

classmethod setConfigRoot(root, config, full)

Set any filesystem-dependent config options for child Datastores to be appropriate for a new empty repository with the given root.

Parameters:
root : str

Filesystem path to the root of the data repository.

config : Config

A Config to update. Only the subset understood by this component will be updated. Will not expand defaults.

full : Config

A complete config with all defaults expanded that can be converted to a DatastoreConfig. Read-only and will not be modified by this method. Repository-specific options that should not be obtained from defaults when Butler instances are constructed should be copied from full to config.

transfer(inputDatastore, ref)

Retrieve a Dataset from an input Datastore, and store the result in this Datastore.

Parameters:
inputDatastore : Datastore

The external Datastore from which to retrieve the Dataset.

ref : DatasetRef

Reference to the required Dataset in the input data store.

Returns:
results : list

List containing the return value from the put() to each child datastore.