S3Datastore¶
- 
class lsst.daf.butler.datastores.s3Datastore.S3Datastore(config: Union[DatastoreConfig, str], bridgeManager: DatastoreRegistryBridgeManager, butlerRoot: str = None)¶
- Bases: lsst.daf.butler.datastores.fileLikeDatastore.FileLikeDatastore
- Basic S3 Object Storage backed Datastore.
- Parameters: - config : DatastoreConfig or str
- Configuration. A string should refer to the name of the config file. 
- bridgeManager : DatastoreRegistryBridgeManager
- Object that manages the interface between Registry and datastores.
- butlerRoot : str, optional
- New datastore root to use to override the configuration value. 
 - Raises: - ValueError
- If root location does not exist and create is False in the configuration.
 - Notes - S3Datastore supports non-link transfer modes for file-based ingest: "move", "copy", and None (no transfer).
 - Attributes Summary
- bridge : Object that manages the interface between this Datastore and the Registry (DatastoreRegistryBridge).
- containerKey
- defaultConfigFile : Path to configuration defaults.
- isEphemeral
- names : Names associated with this datastore returned as a list.
 - Methods Summary
- addStoredItemInfo(refs, infos) : Record internal storage information associated with one or more datasets.
- emptyTrash(ignore_errors) : Remove all datasets from the trash.
- exists(ref) : Check if the dataset exists in the datastore.
- export(refs, *, directory, transfer) : Export datasets for transfer to another data repository.
- fromConfig(config, bridgeManager, butlerRoot) : Create datastore from type specified in config file.
- get(ref, parameters) : Load an InMemoryDataset from the store.
- getLookupKeys() : Return all the lookup keys relevant to this datastore.
- getStoredItemsInfo(ref) : Retrieve information associated with files stored in this Datastore for this dataset ref.
- getURI(ref, predict) : URI to the Dataset.
- getURIs(ref, predict) : Return URIs associated with dataset.
- ingest(*datasets, transfer) : Ingest one or more files into the datastore.
- makeTableSpec()
- put(inMemoryDataset, ref) : Write an InMemoryDataset with a given DatasetRef to the store.
- remove(ref) : Indicate to the Datastore that a dataset can be removed.
- removeStoredItemInfo(ref) : Remove information about the file associated with this dataset.
- setConfigRoot(root, config, full, overwrite) : Set any filesystem-dependent config options for this Datastore to be appropriate for a new empty repository with the given root.
- transaction() : Context manager supporting Datastore transactions.
- transfer(inputDatastore, ref) : Retrieve a dataset from an input Datastore, and store the result in this Datastore.
- trash(ref, ignore_errors) : Indicate to the datastore that a dataset can be removed.
- validateConfiguration(entities, logFailures) : Validate some of the configuration for this datastore.
- validateKey(lookupKey, entity) : Validate a specific look up key with supplied entity.
 - Attributes Documentation
 - 
bridge¶
- Object that manages the interface between this Datastore and the Registry (DatastoreRegistryBridge).
 - 
containerKey= None¶
 - 
defaultConfigFile= 'datastores/s3Datastore.yaml'¶
- Path to configuration defaults. Accessed within the config resource or relative to a search path. Can be None if no defaults are specified.
 - 
isEphemeral= False¶
 - 
names¶
- Names associated with this datastore returned as a list. - Can be different from name for a chaining datastore.
 - Methods Documentation - 
addStoredItemInfo(refs: Iterable[lsst.daf.butler.core.datasets.ref.DatasetRef], infos: Iterable[lsst.daf.butler.core.storedFileInfo.StoredFileInfo]) → None¶
- Record internal storage information associated with one or more datasets. - Parameters: - refs : sequence of DatasetRef
- The datasets that have been stored. 
- infos : sequence of StoredDatastoreItemInfo
- Metadata associated with the stored datasets. 
 
 - 
emptyTrash(ignore_errors: bool = True) → None¶
- Remove all datasets from the trash. - Parameters: - ignore_errors : bool, optional
- If True, errors encountered while removing datasets from the trash are ignored.
 - 
exists(ref: lsst.daf.butler.core.datasets.ref.DatasetRef) → bool¶
- Check if the dataset exists in the datastore. - Parameters: - ref : DatasetRef
- Reference to the required dataset. 
 - Returns: - exists : bool
- True if the entity exists in the Datastore.
 - 
export(refs: Iterable[lsst.daf.butler.core.datasets.ref.DatasetRef], *, directory: Union[lsst.daf.butler.core._butlerUri.ButlerURI, str, None] = None, transfer: Optional[str] = 'auto') → Iterable[lsst.daf.butler.core.repoTransfers.FileDataset]¶
- Export datasets for transfer to another data repository. - Parameters: - refs : iterable of DatasetRef
- Dataset references to be exported. 
- directory : str, optional
- Path to a directory that should contain files corresponding to output datasets. Ignored if transfer is None.
- transfer : str, optional
- Mode that should be used to move datasets out of the repository. Valid options are the same as those of the transfer argument to ingest, and datastores may similarly signal that a transfer mode is not supported by raising NotImplementedError.
 - Returns: - datasets : iterable of FileDataset
- Structs containing information about the exported datasets, in the same order as refs.
 - Raises: - NotImplementedError
- Raised if the given transfer mode is not supported. 
 
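The export contract above (one record per ref, in input order, with unsupported transfer modes signalled via NotImplementedError) can be sketched as follows. FileDataset here is a simplified stand-in, and the mode set and path layout are assumptions for illustration, not the real lsst.daf.butler implementation:

```python
from dataclasses import dataclass
from typing import Iterable, List, Optional

# Modes assumed supported for this sketch; a real datastore advertises its own set.
SUPPORTED_MODES = {"auto", "move", "copy", None}

@dataclass
class FileDataset:
    """Simplified stand-in for the real FileDataset struct."""
    path: str
    ref: str

def export_datasets(refs: Iterable[str], *, directory: Optional[str] = None,
                    transfer: Optional[str] = "auto") -> List[FileDataset]:
    # Unsupported transfer modes raise NotImplementedError, as documented above.
    if transfer not in SUPPORTED_MODES:
        raise NotImplementedError(f"Transfer mode {transfer!r} is not supported")
    # One struct per ref, returned in the same order as the input refs.
    return [FileDataset(path=f"{directory or '.'}/{ref}.fits", ref=ref)
            for ref in refs]

exported = export_datasets(["calexp_1", "calexp_2"], directory="/tmp/export")
```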
 - 
static fromConfig(config: Config, bridgeManager: DatastoreRegistryBridgeManager, butlerRoot: Optional[str] = None) → 'Datastore'¶
- Create datastore from type specified in config file. - Parameters: - config : Config
- Configuration instance. 
- bridgeManager : DatastoreRegistryBridgeManager
- Object that manages the interface between Registry and datastores.
- butlerRoot : str, optional
- Butler root directory. 
 
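The factory pattern behind fromConfig, where the configuration names the concrete datastore class to instantiate, can be sketched with stdlib importlib. The "cls" key and the helper name are illustrative assumptions, not the real config schema:

```python
from importlib import import_module

def datastore_from_config(config: dict):
    """Import and instantiate the class named by config["cls"] (sketch)."""
    module_name, _, class_name = config["cls"].rpartition(".")
    cls = getattr(import_module(module_name), class_name)
    return cls()

# Dispatch to a stdlib class so the sketch stays self-contained.
obj = datastore_from_config({"cls": "collections.OrderedDict"})
```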
 - 
get(ref: lsst.daf.butler.core.datasets.ref.DatasetRef, parameters: Optional[Mapping[str, Any]] = None) → Any¶
- Load an InMemoryDataset from the store. - Parameters: - ref : DatasetRef
- Reference to the required Dataset. 
- parameters : dict
- StorageClass-specific parameters that specify, for example, a slice of the dataset to be loaded.
 - Returns: - inMemoryDataset : object
- Requested dataset or slice thereof as an InMemoryDataset. 
 - Raises: - FileNotFoundError
- Requested dataset can not be retrieved. 
- TypeError
- Return value from formatter has unexpected type. 
- ValueError
- Formatter failed to process the dataset. 
 
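The role of the parameters mapping in get, selecting a subset of a dataset at read time, can be sketched in plain Python. The "slice" parameter name is invented for this illustration; real parameter names are StorageClass-specific:

```python
def apply_parameters(dataset, parameters=None):
    """Return the dataset, or the subset selected by the parameters (sketch)."""
    if not parameters:
        return dataset
    result = dataset
    # "slice" is a hypothetical parameter name for this sketch.
    if "slice" in parameters:
        result = result[parameters["slice"]]
    return result

full = list(range(10))
subset = apply_parameters(full, {"slice": slice(2, 5)})
```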
 - 
getLookupKeys() → Set[LookupKey]¶
- Return all the lookup keys relevant to this datastore. - Returns: - keys : set of LookupKey
- The keys stored internally for looking up information based on DatasetType name or StorageClass.
 
 - 
getStoredItemsInfo(ref: DatasetIdRef) → List[StoredFileInfo]¶
- Retrieve information associated with files stored in this Datastore for this dataset ref. - Parameters: - ref : DatasetRef
- The dataset that is to be queried. 
 - Returns: - items : list[StoredDatastoreItemInfo]
- Stored information about the files and associated formatters associated with this dataset. Only one file will be returned if the dataset has not been disassembled. Can return an empty list if no matching datasets can be found. 
 
 - 
getURI(ref: lsst.daf.butler.core.datasets.ref.DatasetRef, predict: bool = False) → lsst.daf.butler.core._butlerUri.ButlerURI¶
- URI to the Dataset. - Parameters: - ref : DatasetRef
- Reference to the required Dataset.
- predict : bool, optional
- If True, a predicted URI may be returned for a dataset the datastore does not know about.
- Returns: - uri : str
- URI pointing to the dataset within the datastore. If the dataset does not exist in the datastore, and if predict is True, the URI will be a prediction and will include a URI fragment “#predicted”. If the datastore does not have entities that relate well to the concept of a URI the returned URI will be descriptive. The returned URI is not guaranteed to be obtainable.
 - Raises: - FileNotFoundError
- Raised if a URI has been requested for a dataset that does not exist and guessing is not allowed. 
- RuntimeError
- Raised if a request is made for a single URI but multiple URIs are associated with this dataset. 
 - Notes - When a predicted URI is requested an attempt will be made to form a reasonable URI based on file templates and the expected formatter. 
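The “#predicted” convention described above is plain URI-fragment handling, sketched here with stdlib urllib.parse (helper names are illustrative):

```python
from urllib.parse import urlparse

def predicted_uri(base: str) -> str:
    """Mark a guessed location with the "#predicted" fragment (sketch)."""
    return base + "#predicted"

def is_predicted(uri: str) -> bool:
    # The fragment flags a URI that is a prediction, not a guaranteed location.
    return urlparse(uri).fragment == "predicted"

guess = predicted_uri("s3://bucket/repo/calexp/v1.fits")
```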
 - 
getURIs(ref: lsst.daf.butler.core.datasets.ref.DatasetRef, predict: bool = False) → Tuple[Optional[lsst.daf.butler.core._butlerUri.ButlerURI], Dict[str, lsst.daf.butler.core._butlerUri.ButlerURI]]¶
- Return URIs associated with dataset. - Parameters: - ref : DatasetRef
- Reference to the required dataset. 
- predict : bool, optional
- If the datastore does not know about the dataset, should it return a predicted URI or not? 
 - Returns: - uris : tuple of (ButlerURI or None, dict of [str, ButlerURI])
- The URI to the primary artifact, or None if the dataset has no primary artifact, and a mapping from component name to the URI of that component artifact.
 - 
ingest(*datasets, transfer: Optional[str] = None) → None¶
- Ingest one or more files into the datastore. - Parameters: - datasets : FileDataset
- Each positional argument is a struct containing information about a file to be ingested, including its path (either absolute or relative to the datastore root, if applicable), a complete DatasetRef (with dataset_id not None), and optionally a formatter class or its fully-qualified string name. If a formatter is not provided, the one the datastore would use for put on that dataset is assumed.
- transfer : str, optional
- How (and whether) the dataset should be added to the datastore. If None (default), the file must already be in a location appropriate for the datastore (e.g. within its root directory), and will not be modified. Other choices include “move”, “copy”, “link”, “symlink”, “relsymlink”, and “hardlink”. “link” is a special transfer mode that will first try to make a hardlink and if that fails a symlink will be used instead. “relsymlink” creates a relative symlink rather than using an absolute path. Most datastores do not support all transfer modes. “auto” is a special option that will let the data store choose the most natural option for itself.
 - Raises: - NotImplementedError
- Raised if the datastore does not support the given transfer mode (including the case where ingest is not supported at all). 
- DatasetTypeNotSupportedError
- Raised if one or more files to be ingested have a dataset type that is not supported by the datastore. 
- FileNotFoundError
- Raised if one of the given files does not exist. 
- FileExistsError
- Raised if transfer is not None but the (internal) location the file would be moved to is already occupied.
 - Notes - Subclasses should implement _prepIngest and _finishIngest instead of implementing ingest directly. Datastores that hold and delegate to child datastores may want to call those methods as well. - Subclasses are encouraged to document their supported transfer modes in their class documentation. 
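The “link” transfer mode described above (hardlink first, symlink as fallback) is a small amount of stdlib os logic; a minimal sketch, with the helper name as an assumption:

```python
import os
import tempfile

def link_transfer(src: str, dst: str) -> str:
    """Try a hardlink first; fall back to a symlink if it fails (sketch)."""
    try:
        os.link(src, dst)
        return "hardlink"
    except OSError:
        # Hardlinking can fail e.g. across filesystems; use a symlink instead.
        os.symlink(src, dst)
        return "symlink"

with tempfile.TemporaryDirectory() as root:
    src = os.path.join(root, "data.fits")
    with open(src, "w") as f:
        f.write("payload")
    mode = link_transfer(src, os.path.join(root, "ingested.fits"))
```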
 - 
classmethod makeTableSpec() → lsst.daf.butler.core.ddl.TableSpec¶
 - 
put(inMemoryDataset: Any, ref: lsst.daf.butler.core.datasets.ref.DatasetRef) → None¶
- Write an InMemoryDataset with a given DatasetRef to the store. - Parameters: - inMemoryDataset : object
- The dataset to store. 
- ref : DatasetRef
- Reference to the associated Dataset. 
 - Raises: - TypeError
- Supplied object and storage class are inconsistent. 
- DatasetTypeNotSupportedError
- The associated DatasetType is not handled by this datastore.
 - Notes - If the datastore is configured to reject certain dataset types it is possible that the put will fail and raise a DatasetTypeNotSupportedError. The main use case for this is to allow ChainedDatastore to put to multiple datastores without requiring that every datastore accepts the dataset.
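The chaining behaviour in the Notes, where a parent datastore tolerates DatasetTypeNotSupportedError from children that reject a dataset type, can be sketched with stand-in classes (all names here are illustrative, not the real ChainedDatastore API):

```python
class DatasetTypeNotSupportedError(RuntimeError):
    """Stand-in for the real exception class."""

class ChildStore:
    """Toy child datastore that only accepts certain dataset types."""
    def __init__(self, accepts):
        self.accepts = set(accepts)
        self.stored = {}
    def put(self, obj, dataset_type):
        if dataset_type not in self.accepts:
            raise DatasetTypeNotSupportedError(dataset_type)
        self.stored[dataset_type] = obj

def chained_put(children, obj, dataset_type):
    # Try every child; only fail if no child accepted the dataset.
    npersisted = 0
    for child in children:
        try:
            child.put(obj, dataset_type)
            npersisted += 1
        except DatasetTypeNotSupportedError:
            continue
    if npersisted == 0:
        raise DatasetTypeNotSupportedError(dataset_type)
    return npersisted

stores = [ChildStore({"calexp"}), ChildStore({"raw", "calexp"})]
n = chained_put(stores, b"pixels", "calexp")
```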
 - 
remove(ref: DatasetRef) → None¶
- Indicate to the Datastore that a dataset can be removed. - Warning - This method deletes the artifact associated with this dataset and can not be reversed. - Parameters: - ref : DatasetRef
- Reference to the required Dataset. 
 - Raises: - FileNotFoundError
- Attempt to remove a dataset that does not exist. 
 - Notes - This method is used for immediate removal of a dataset and is generally reserved for internal testing of datastore APIs. It is implemented by calling trash() and then immediately calling emptyTrash(). This call is meant to be immediate so errors encountered during removal are not ignored.
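The two-phase removal in the Notes (remove() is trash() followed immediately by emptyTrash()) can be sketched with a dict-backed stand-in datastore; the class and attribute names are assumptions for illustration:

```python
class SketchStore:
    """Toy datastore demonstrating trash-then-empty removal."""
    def __init__(self):
        self.artifacts = {}
        self.trashed = set()
    def trash(self, ref):
        if ref not in self.artifacts:
            raise FileNotFoundError(ref)
        self.trashed.add(ref)
    def emptyTrash(self):
        # Only artifacts previously marked as trashed are deleted.
        for ref in self.trashed:
            del self.artifacts[ref]
        self.trashed.clear()
    def remove(self, ref):
        # Immediate removal: trash then empty, so errors surface right away.
        self.trash(ref)
        self.emptyTrash()

store = SketchStore()
store.artifacts["calexp"] = b"pixels"
store.remove("calexp")
```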
 - 
removeStoredItemInfo(ref: DatasetIdRef) → None¶
- Remove information about the file associated with this dataset. - Parameters: - ref : DatasetRef
- The dataset that has been removed. 
 
 - 
classmethod setConfigRoot(root: str, config: lsst.daf.butler.core.config.Config, full: lsst.daf.butler.core.config.Config, overwrite: bool = True) → None¶
- Set any filesystem-dependent config options for this Datastore to be appropriate for a new empty repository with the given root. - Parameters: - root : str
- URI to the root of the data repository. 
- config : Config
- A Config to update. Only the subset understood by this component will be updated. Will not expand defaults.
- full : Config
- A complete config with all defaults expanded that can be converted to a DatastoreConfig. Read-only and will not be modified by this method. Repository-specific options that should not be obtained from defaults when Butler instances are constructed should be copied from full to config.
- overwrite : bool, optional
- If False, do not modify a value in config if the value already exists. Default is always to overwrite with the provided root.
 - Notes - If a keyword is explicitly defined in the supplied config it will not be overridden by this method if overwrite is False. This allows explicit values set in external configs to be retained.
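The overwrite semantics above reduce to a simple guard, sketched here on a plain dict config (the "root" key and helper name are illustrative assumptions):

```python
def set_config_root(root: str, config: dict, overwrite: bool = True) -> None:
    """Write the root into the config unless an existing value must be kept."""
    if overwrite or "root" not in config:
        config["root"] = root

cfg = {"root": "s3://old-bucket/repo"}
set_config_root("s3://new-bucket/repo", cfg, overwrite=False)
kept = cfg["root"]                       # existing value retained
set_config_root("s3://new-bucket/repo", cfg)
```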
 - 
transaction() → Iterator[lsst.daf.butler.core.datastore.DatastoreTransaction]¶
- Context manager supporting Datastore transactions. - Transactions can be nested, and are to be used in combination with Registry.transaction.
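The transaction pattern can be sketched as a context manager that runs registered undo callbacks in reverse order when the block raises. This is a simplified stand-in, not the real DatastoreTransaction API:

```python
from contextlib import contextmanager

class Transaction:
    """Toy transaction collecting undo callbacks."""
    def __init__(self):
        self._undo = []
    def register_undo(self, func):
        self._undo.append(func)

@contextmanager
def transaction():
    txn = Transaction()
    try:
        yield txn
    except Exception:
        # Roll back in reverse registration order, then re-raise.
        for undo in reversed(txn._undo):
            undo()
        raise

log = []
try:
    with transaction() as txn:
        log.append("put")
        txn.register_undo(lambda: log.append("rolled back"))
        raise RuntimeError("failure")
except RuntimeError:
    pass
```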
 - 
transfer(inputDatastore: Datastore, ref: DatasetRef) → None¶
- Retrieve a dataset from an input Datastore, and store the result in this Datastore. - Parameters: - inputDatastore : Datastore
- The external Datastore from which to retrieve the Dataset.
- ref : DatasetRef
- Reference to the required dataset in the input data store. 
 
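At its core, transfer is a get from the input datastore followed by a put into this one, sketched here with dict-backed stand-ins (not the real Datastore API):

```python
class DictStore:
    """Toy datastore backed by a dict, for illustrating transfer()."""
    def __init__(self, contents=None):
        self.contents = dict(contents or {})
    def get(self, ref):
        return self.contents[ref]
    def put(self, obj, ref):
        self.contents[ref] = obj
    def transfer(self, input_store, ref):
        # Retrieve from the input datastore and store the result here.
        self.put(input_store.get(ref), ref)

src = DictStore({"calexp": b"pixels"})
dst = DictStore()
dst.transfer(src, "calexp")
```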
 - 
trash(ref: lsst.daf.butler.core.datasets.ref.DatasetRef, ignore_errors: bool = True) → None¶
- Indicate to the datastore that a dataset can be removed. - Parameters: - ref : DatasetRef
- Reference to the required Dataset.
- ignore_errors : bool, optional
- Indicate that errors should be ignored.
- Raises: - FileNotFoundError
- Attempt to remove a dataset that does not exist. 
 
 - 
validateConfiguration(entities: Iterable[Union[lsst.daf.butler.core.datasets.ref.DatasetRef, lsst.daf.butler.core.datasets.type.DatasetType, lsst.daf.butler.core.storageClass.StorageClass]], logFailures: bool = False) → None¶
- Validate some of the configuration for this datastore. - Parameters: - entities : iterable of DatasetRef, DatasetType, or StorageClass
- Entities to test against this configuration.
- logFailures : bool, optional
- If True, output a log message for every validation error detected.
- Raises: - DatastoreValidationError
- Raised if there is a validation problem with a configuration. All the problems are reported in a single exception. 
 - Notes - This method checks that all the supplied entities have valid file templates and also have formatters defined. 
 - 
validateKey(lookupKey: LookupKey, entity: Union[DatasetRef, DatasetType, StorageClass]) → None¶
- Validate a specific look up key with supplied entity. - Parameters: - lookupKey : LookupKey
- Key to use to retrieve information from the datastore configuration. 
- entity : DatasetRef, DatasetType, or StorageClass
- Entity to compare with configuration retrieved using the specified lookup key. 
 - Raises: - DatastoreValidationError
- Raised if there is a problem with the combination of entity and lookup key. 
 - Notes - Bypasses the normal selection priorities by allowing a key that would normally not be selected to be validated. 
 