lsst.daf.butler¶
Using the Butler¶
This module provides an abstracted data access interface, known as the Butler. It can be used to read and write data without having to know the details of file formats or locations.
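The quickest way to see the Butler in action is through its command-line interface, documented below; a minimal sketch (the repository path demo_repo is a placeholder):

```shell
# Create an empty Gen3 repository (writes butler.yaml into demo_repo).
butler create demo_repo
# Inspect the resulting, fully-expanded configuration.
butler config-dump demo_repo
```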
Design and Development¶
lsst.daf.butler is developed at https://github.com/lsst/daf_butler.
You can find Jira issues for this module under the daf_butler component.
Command Line Scripts¶
butler¶
butler [OPTIONS] COMMAND [ARGS]...
Options
--log-level <LEVEL|COMPONENT=LEVEL>
    The logging level. Supported levels are [CRITICAL|ERROR|WARNING|INFO|DEBUG].
config-dump¶
Dump either a subset or full Butler configuration to standard output.
REPO is the URI or path to an existing data repository root or configuration file.
butler config-dump [OPTIONS] REPO
Options
-s, --subset <subset>
    Subset of the configuration to report. This can be any key in the hierarchy, such as '.datastore.root', where the leading '.' specifies the delimiter for the hierarchy.
-p, --searchpath <TEXT ...>
    Additional search paths to use for configuration overrides.
--file <outfile>
    Print the (possibly-expanded) configuration for a repository to a file, or to stdout by default.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
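As a hedged illustration of the subset option described above (the repository path demo_repo is a placeholder):

```shell
# Report only the datastore branch of the configuration; the leading
# '.' in '.datastore' selects '.' as the hierarchy delimiter.
butler config-dump -s .datastore demo_repo
```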
config-validate¶
Validate the configuration files for a Gen3 Butler repository.
REPO is the URI or path to an existing data repository root or configuration file.
butler config-validate [OPTIONS] REPO
Options
-q, --quiet
    Do not report individual failures.
-d, --dataset-type <dataset_type>
    Specific DatasetType(s) to validate.
-i, --ignore <TEXT ...>
    DatasetType(s) to ignore during validation.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
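A sketch of a validation run, assuming a hypothetical dataset type name "raw" and a placeholder repository path:

```shell
# Validate the repository configuration quietly, skipping the
# (hypothetical) dataset type "raw".
butler config-validate -q -i raw demo_repo
```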
convert¶
Convert a Butler gen 2 repository into a gen 3 repository.
REPO is the URI or path to the gen3 repository. Will be created if it does not already exist.
butler convert [OPTIONS] REPO
Options
--gen2root <gen2root>
    Required. Root path of the gen 2 repo to be converted.
--skymap-name <skymap_name>
    Name of the new gen3 skymap (e.g. 'discrete/ci_hsc').
--skymap-config <skymap_config>
    Path to a skymap config file defining the new gen3 skymap.
--calibs <calibs>
    Path to the gen 2 calibration repo. It can be absolute or relative to gen2root.
--reruns <TEXT ...>
    List of gen 2 reruns to convert.
-t, --transfer <transfer>
    Mode to use to transfer files into the new repository.
    Options: auto|link|symlink|hardlink|copy|move|relsymlink
-C, --config-file <config_file>
    Path to a ConvertRepoConfig override to be included after the Instrument config overrides are applied.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
create¶
Create an empty Gen3 Butler repository.
REPO is the URI or path to the new repository. Will be created if it does not exist.
butler create [OPTIONS] REPO
Options
--seed-config <seed_config>
    Path to an existing YAML config file to apply (on top of defaults).
--standalone
    Include all defaults in the config file in the repo, insulating the repo from changes in package defaults.
--override
    Allow values in the supplied config to override all repo settings.
-f, --outfile <outfile>
    Name of the output file to receive the repository configuration. Default is to write butler.yaml into the specified repo.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
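A hedged sketch of creating a repository with a seed config; the file names here are placeholders:

```shell
# Create a repository seeded from a local YAML config, writing the
# expanded configuration to a separate file instead of butler.yaml.
butler create --seed-config my_seed.yaml -f expanded.yaml demo_repo
```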
define-visits¶
Define visits from exposures in the butler registry.
REPO is the URI or path to the gen3 repository. Will be created if it does not already exist.
butler define-visits [OPTIONS] REPO
Options
-C, --config-file <config_file>
    Path to a pex_config override to be included after the Instrument config overrides are applied.
--collections <TEXT ...>
    The collections to be searched (in order) when reading datasets.
-i, --instrument <instrument>
    Required. The name or fully-qualified class name of an instrument.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
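A minimal sketch of invoking this subcommand; both the repository path and the instrument class name are placeholders, not real identifiers:

```shell
# Group exposures into visits for a hypothetical instrument class.
butler define-visits -i lsst.obs.example.ExampleInstrument demo_repo
```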
import¶
Import data into a butler repository.
REPO is the URI or path to the new repository. Will be created if it does not exist.
DIRECTORY is the folder containing dataset files.
butler import [OPTIONS] REPO DIRECTORY
Options
-t, --transfer <transfer>
    The external data transfer mode.
    Options: auto|link|symlink|hardlink|copy|move|relsymlink
--output-run <output_run>
    Required. The name of the run that datasets should be output to.
--export-file <export_file>
    Name of the file that contains database information associated with the exported datasets. If this is not an absolute path, does not exist in the current working directory, and --dir is provided, it is assumed to be in that directory. Defaults to "export.yaml".
-s, --skip-dimensions <TEXT ...>
    Dimensions that should be skipped during import.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
DIRECTORY
    Required argument.
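A hedged example of an import; the repository path, run name, and export directory are placeholders (export.yaml is the documented default export file name):

```shell
# Import previously exported datasets, copying the files and recording
# them in a hypothetical output run.
butler import -t copy --output-run imports/run1 demo_repo exported_data
```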
ingest-raws¶
Ingest raw frames from a directory into the butler registry.
REPO is the URI or path to the gen3 repository. Will be created if it does not already exist.
LOCATIONS specifies files to ingest and/or locations to search for files.
butler ingest-raws [OPTIONS] REPO LOCATIONS ...
Options
--regex <regex>
    Regex string used to find files in directories listed in LOCATIONS. Searches for FITS files by default.
-c, --config <TEXT=TEXT>
    Config override, as a key-value pair.
-C, --config-file <config_file>
    Path to a pex_config override to be included after the Instrument config overrides are applied.
--output-run <output_run>
    The name of the run that datasets should be output to.
-t, --transfer <transfer>
    The external data transfer mode.
    Options: auto|link|symlink|hardlink|copy|move|relsymlink
--ingest-task <ingest_task>
    The fully-qualified class name of the ingest task to use.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
LOCATIONS
    Required argument(s).
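A sketch of a raw ingest, assuming a placeholder repository path and a placeholder directory of raw frames:

```shell
# Ingest FITS files found under raw_data/ (the default regex matches
# FITS files), linking rather than copying them into the datastore.
butler ingest-raws -t link demo_repo raw_data
```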
prune-collection¶
Remove a collection and possibly prune datasets within it.
REPO is the URI or path to an existing data repository root or configuration file.
butler prune-collection [OPTIONS] REPO
Options
--collection <collection>
    Name of the collection to remove. If this is a TAGGED or CHAINED collection, datasets within the collection are not modified unless --unstore is passed. If this is a RUN collection, --purge and --unstore must be passed, and all datasets in it are fully removed from the data repository.
--purge
    Permit RUN collections to be removed, fully removing datasets within them. Requires --unstore as an added precaution against accidental deletion. Must not be passed if the collection is not a RUN.
--unstore
    Remove all datasets in the collection from all datastores in which they appear.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
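A hedged sketch of removing a RUN collection; the collection name and repository path are placeholders:

```shell
# Fully remove a hypothetical RUN collection and its datasets.
# --purge requires --unstore as a safeguard against accidental deletion.
butler prune-collection --collection ingest/run1 --purge --unstore demo_repo
```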
query-collections¶
Get the collections whose names match an expression.
REPO is the URI or path to an existing data repository root or configuration file.
GLOB is one or more glob-style expressions that fully or partially identify the collections to return.
butler query-collections [OPTIONS] REPO [GLOB] ...
Options
--collection-type <collection_type>
    If provided, only list collections of this type.
    Options: CHAINED|RUN|TAGGED
--flatten-chains, --no-flatten-chains
    Recursively get the child collections of matching CHAINED collections. Default is --no-flatten-chains.
--include-chains, --no-include-chains
    For --include-chains, return records for matching CHAINED collections. For --no-include-chains, do not return records for CHAINED collections. Default is the opposite of --flatten-chains: include either CHAINED collections or their children, but not both.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
GLOB
    Optional argument(s).
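A sketch combining a glob expression with the collection-type filter; the collection name pattern is a placeholder:

```shell
# List only RUN collections whose names start with "ingest/".
butler query-collections --collection-type RUN demo_repo "ingest/*"
```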
query-dataset-types¶
Get the dataset types in a repository.
REPO is the URI or path to an existing data repository root or configuration file.
GLOB is one or more glob-style expressions that fully or partially identify the dataset types to return.
butler query-dataset-types [OPTIONS] REPO [GLOB] ...
Options
-v, --verbose
    Include dataset type name, dimensions, and storage class in the output.
--components, --no-components
    For --components, apply all expression patterns to component dataset type names as well. For --no-components, never apply patterns to components. Default (where neither is specified) is to apply patterns to components only if their parent datasets were not matched by the expression. Fully-specified component datasets (str or DatasetType instances) are always included.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
GLOB
    Optional argument(s).
register-instrument¶
Add an instrument to the data repository.
REPO is the URI or path to the gen3 repository. Will be created if it does not already exist.
INSTRUMENT is the fully-qualified name of an Instrument subclass.
butler register-instrument [OPTIONS] REPO INSTRUMENT ...
Arguments
REPO
    Required argument.
INSTRUMENT
    Required argument(s).
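A minimal sketch; the instrument class name below is a hypothetical placeholder, not a real obs package:

```shell
# Register a hypothetical instrument by its fully-qualified class name.
butler register-instrument demo_repo lsst.obs.example.ExampleInstrument
```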
write-curated-calibrations¶
Add an instrument’s curated calibrations to the data repository.
REPO is the URI or path to the gen3 repository. Will be created if it does not already exist.
butler write-curated-calibrations [OPTIONS] REPO
Options
-i, --instrument <instrument>
    Required. The name or fully-qualified class name of an instrument.
--output-run <output_run>
    The name of the run that datasets should be output to.
-@, --options-file <options_file>
    URI to a YAML file containing overrides of command-line options. The YAML should be organized as a hierarchy, with subcommand names at the top level and options for that subcommand below.
Arguments
REPO
    Required argument.
Specifying Command-Line Options from a File¶
Many of the butler subcommands can read default command-line options from an external text file, using the -@ (or --options-file) option.
A YAML file can be supplied with new default values for any of the options, grouped by subcommand:

config-dump:
  subset: .datastore
ingest-raws:
  config: keyword=value

For example, if the above file is called defaults.yaml, then calling
butler config-dump -@defaults.yaml myrepo
would only report the datastore section of the config.
Note
Options explicitly given on the command-line always take precedence over those specified from an external options file.
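The precedence rule above can be sketched as follows, assuming the same hypothetical defaults.yaml and a placeholder repository:

```shell
# defaults.yaml sets subset to .datastore, but the explicit -s flag
# on the command line wins, so the registry section is reported instead.
butler config-dump -@defaults.yaml -s .registry myrepo
```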
The Dimensions System¶
Python API reference¶
lsst.daf.butler Package¶
Functions¶
addDimensionForeignKey (tableSpec, dimension, …) |
Add a field and possibly a foreign key to a table specification that references the table for the given Dimension. |
Classes¶
AmbiguousDatasetError |
Exception raised when a DatasetRef is not resolved (has no ID or run), but the requested operation requires one of them. |
Butler (config, str, None] = None, *, butler, …) |
Main entry point for the data access system. |
ButlerConfig ([other, searchPaths]) |
Contains the configuration for a Butler |
ButlerURI |
Convenience wrapper around URI parsers. |
ButlerValidationError |
There is a problem with the Butler configuration. |
CollectionSearch (items, …) |
An ordered search path of collections and dataset type restrictions. |
CollectionType |
Enumeration used to label different types of collections. |
CompositesConfig ([other, validate, …]) |
|
CompositesMap (config, ButlerConfig, …) |
Determine whether a specific datasetType or StorageClass should be disassembled. |
Config ([other]) |
Implements a datatype that is used by Butler for configuration parameters. |
ConfigSubset ([other, validate, …]) |
Config representing a subset of a more general configuration. |
Constraints (config, str]], *, universe) |
Determine whether a DatasetRef , DatasetType , or StorageClass is allowed to be handled. |
ConstraintsConfig ([other]) |
Configuration information for Constraints |
ConstraintsValidationError |
Exception thrown when a constraints list has mutually exclusive definitions. |
DataCoordinate |
An immutable data ID dictionary that guarantees that its key-value pairs identify at least all required dimensions in a DimensionGraph . |
DataCoordinateIterable |
An abstract base class for homogeneous iterables of data IDs. |
DataCoordinateSequence (dataIds, graph, *, …) |
A DataCoordinateIterable implementation that supports the full collections.abc.Sequence interface. |
DataCoordinateSet (dataIds, graph, *, …) |
A DataCoordinateIterable implementation that adds some set-like functionality, and is backed by a true set-like object. |
DatabaseTimespanRepresentation |
An interface that encapsulates how timespans are represented in a database engine. |
DatasetComponent (name, storageClass, component) |
Component of a dataset and associated information. |
DatasetRef |
Reference to a Dataset in a Registry . |
DatasetType (name, dimensions, …) |
A named category of Datasets that defines how they are organized, related, and stored. |
DatasetTypeNotSupportedError |
A DatasetType is not handled by this routine. |
DatasetTypeRestriction (names, ellipsis]) |
An immutable set-like object that represents a restriction on the dataset types to search for within a collection. |
Datastore (config, str], bridgeManager, …) |
Datastore interface. |
DatastoreConfig ([other, validate, …]) |
|
DatastoreValidationError |
There is a problem with the Datastore configuration. |
DeferredDatasetHandle (butler, ref, parameters) |
Proxy class that provides deferred loading of a dataset from a butler. |
Dimension (name, *, related, uniqueKeys, **kwargs) |
A named data-organization concept that can be used as a key in a data ID. |
DimensionConfig ([other, validate, …]) |
Configuration that defines a DimensionUniverse . |
DimensionElement (name, *, related, metadata, …) |
A named data-organization concept that defines a label and/or metadata in the dimensions system. |
DimensionGraph |
An immutable, dependency-complete collection of dimensions. |
DimensionPacker (fixed, dimensions) |
An abstract base class for bidirectional mappings between a DataCoordinate and a packed integer ID. |
DimensionRecord (**kwargs) |
Base class for the Python representation of database records for a DimensionElement . |
DimensionUniverse |
A parent class that represents a complete, self-consistent set of dimensions and their relationships. |
FileDataset (path, refs, …) |
A struct that represents a dataset exported to a file. |
FileDescriptor (location, storageClass, …) |
Describes a particular file. |
FileTemplate (template) |
Format a path template into a fully expanded path. |
FileTemplateValidationError |
Exception thrown when a file template is not consistent with the associated DatasetType . |
FileTemplates (config, str], default, *, universe) |
Collection of FileTemplate templates. |
FileTemplatesConfig ([other]) |
Configuration information for FileTemplates |
Formatter (fileDescriptor, dataId, …) |
Interface for reading and writing Datasets with a particular StorageClass . |
FormatterFactory () |
Factory for Formatter instances. |
Location (datastoreRootUri, str], path) |
Identifies a location within the Datastore . |
LocationFactory (datastoreRoot, str]) |
Factory for Location instances. |
LookupKey (name, dimensions, …) |
Representation of a key that can be used to look up information based on dataset type name, storage class name, or dimensions. |
MappingFactory (refType) |
Register the mapping of some key to a python type and retrieve instances. |
NamedKeyDict (*args) |
A dictionary wrapper that requires keys to have a .name attribute, and permits lookups using either key objects or their names. |
NamedKeyMapping |
An abstract base class for custom mappings whose keys are objects with a str name attribute, for which lookups on the name as well as the object are permitted. |
NamedValueSet (elements) |
A custom mutable set class that requires elements to have a .name attribute, which can then be used as keys in dict -like lookup. |
Quantum (*, taskName, taskClass, dataId, run, …) |
A discrete unit of work that may depend on one or more datasets and produces one or more datasets. |
Registry (database, universe, *, attributes, …) |
Registry interface. |
SimpleQuery () |
A struct that combines SQLAlchemy objects representing SELECT, FROM, and WHERE clauses. |
SkyPixDimension (name, pixelization) |
A special Dimension subclass for hierarchical pixelizations of the sky. |
StorageClass (name, pytype, str, …) |
Class describing how a label maps to a particular Python type. |
StorageClassConfig ([other, validate, …]) |
|
StorageClassDelegate (storageClass) |
Class to delegate the handling of components and parameters for the python type associated with a particular StorageClass . |
StorageClassFactory (config, str, None] = None) |
Factory for StorageClass instances. |
StoredDatastoreItemInfo |
Internal information associated with a stored dataset in a Datastore . |
StoredFileInfo (formatter, …) |
Datastore-private metadata associated with a file stored in a Datastore. |
Timespan |
A 2-element named tuple for time intervals. |
ValidationError |
Some sort of validation error has occurred. |
YamlRepoExportBackend (stream) |
A repository export implementation that saves to a YAML file. |
YamlRepoImportBackend (stream, registry) |
A repository import implementation that reads from a YAML file. |
Class Inheritance Diagram¶
lsst.daf.butler.registry Package¶
Classes¶
CollectionSearch (items, …) |
An ordered search path of collections and dataset type restrictions. |
CollectionType |
Enumeration used to label different types of collections. |
ConflictingDefinitionError |
Exception raised when trying to insert a database record when a conflicting record already exists. |
DatasetTypeRestriction (names, ellipsis]) |
An immutable set-like object that represents a restriction on the dataset types to search for within a collection. |
DbAuth (path, envVar, authList, str]]] = None) |
Retrieves authentication information for database connections. |
DbAuthError |
A problem has occurred retrieving database authentication information. |
DbAuthPermissionsError |
Credentials file has incorrect permissions. |
InconsistentDataIdError |
Exception raised when a data ID contains contradictory key-value pairs, according to dimension relationships. |
MissingCollectionError |
Exception raised when an operation attempts to use a collection that does not exist. |
OrphanedRecordError |
Exception raised when trying to remove or modify a database record that is still being used in some other table. |
Registry (database, universe, *, attributes, …) |
Registry interface. |
RegistryConfig ([other, validate, …]) |
Class Inheritance Diagram¶
lsst.daf.butler.registry.interfaces Package¶
Classes¶
ButlerAttributeExistsError |
Exception raised when trying to update an existing attribute without specifying the force option. |
ButlerAttributeManager |
An interface for managing butler attributes in a Registry . |
ChainedCollectionRecord (key, name) |
A subclass of CollectionRecord that adds the list of child collections in a CHAINED collection. |
CollectionManager |
An interface for managing the collections (including runs) in a Registry . |
CollectionRecord (key, name, type) |
A struct used to represent a collection in internal Registry APIs. |
Database (*, origin, connection, namespace) |
An abstract interface that represents a particular database engine’s representation of a single schema/namespace/database. |
DatabaseConflictError |
Exception raised when database content (row values or schema entities) is inconsistent with what this client expects. |
DatasetRecordStorage (datasetType) |
An interface that manages the records associated with a particular DatasetType . |
DatasetRecordStorageManager |
An interface that manages the tables that describe datasets. |
DatastoreRegistryBridge (datastoreName) |
An abstract base class that defines the interface that a Datastore uses to communicate with a Registry . |
DatastoreRegistryBridgeManager (*, opaque, …) |
An abstract base class that defines the interface between Registry and Datastore when a new Datastore is constructed. |
DimensionRecordStorage |
An abstract base class that represents a way of storing the records associated with a single DimensionElement . |
DimensionRecordStorageManager (*, universe) |
An interface for managing the dimension records in a Registry . |
FakeDatasetRef |
A fake DatasetRef that can be used internally by butler where only the dataset ID is available. |
MissingCollectionError |
Exception raised when an operation attempts to use a collection that does not exist. |
OpaqueTableStorage (name) |
An interface that manages the records associated with a particular opaque table in a Registry . |
OpaqueTableStorageManager |
An interface that manages the opaque tables in a Registry . |
ReadOnlyDatabaseError |
Exception raised when a write operation is called on a read-only Database . |
RunRecord (key, name, type) |
A subclass of CollectionRecord that adds execution information and an interface for updating it. |
SchemaAlreadyDefinedError |
Exception raised when trying to initialize database schema when some tables already exist. |
StaticTablesContext (db) |
Helper class used to declare the static schema for a registry layer in a database. |
VersionTuple |
Class representing a version number. |
VersionedExtension |
Interface for extension classes with versions. |
Class Inheritance Diagram¶
lsst.daf.butler.registry.queries Package¶
Classes¶
ChainedDatasetQueryResults (chain) |
A DatasetQueryResults implementation that simply chains together other results objects, each for a different parent dataset type. |
DataCoordinateQueryResults (db, query, *, …) |
An enhanced implementation of DataCoordinateIterable that represents data IDs retrieved from a database query. |
DatasetQueryResults |
An interface for objects that represent the results of queries for datasets. |
ParentDatasetQueryResults (db, query, *, …) |
An object that represents results from a query for datasets with a single parent DatasetType . |
Query (*, graph, whereRegion, managers) |
An abstract base class for queries that return some combination of DatasetRef and DataCoordinate objects. |
QueryBuilder (summary, managers) |
A builder for potentially complex queries that join tables based on dimension relationships. |
QuerySummary (requested, *, dataId, …) |
A struct that holds and categorizes the dimensions involved in a query. |
RegistryManagers (collections, datasets, …) |
Struct used to pass around the manager objects that back a Registry and are used internally by the query system. |
Class Inheritance Diagram¶
lsst.daf.butler.registry.wildcards Module¶
Classes¶
CategorizedWildcard (strings, patterns, …) |
The results of preprocessing a wildcard expression to separate match patterns from strings. |
CollectionQuery (search, ellipsis], patterns, …) |
An unordered query for collections and dataset type restrictions. |
CollectionSearch (items, …) |
An ordered search path of collections and dataset type restrictions. |
DatasetTypeRestriction (names, ellipsis]) |
An immutable set-like object that represents a restriction on the dataset types to search for within a collection. |
Class Inheritance Diagram¶
Example datastores¶
lsst.daf.butler.datastores.chainedDatastore Module¶
Classes¶
ChainedDatastore (config, str], …) |
Chained Datastores to allow read and writes from multiple datastores. |
Class Inheritance Diagram¶
lsst.daf.butler.datastores.inMemoryDatastore Module¶
Classes¶
StoredMemoryItemInfo (timestamp, …) |
Internal InMemoryDatastore Metadata associated with a stored DatasetRef. |
InMemoryDatastore (config, str], …) |
Basic Datastore for writing to an in-memory cache. |
Class Inheritance Diagram¶
lsst.daf.butler.datastores.posixDatastore Module¶
Classes¶
PosixDatastore (config, str], bridgeManager, …) |
Basic POSIX filesystem backed Datastore. |
Class Inheritance Diagram¶
lsst.daf.butler.datastores.s3Datastore Module¶
Classes¶
S3Datastore (config, str], bridgeManager, …) |
Basic S3 Object Storage backed Datastore. |
Class Inheritance Diagram¶
lsst.daf.butler.datastores.webdavDatastore Module¶
Classes¶
WebdavDatastore (config, str], bridgeManager, …) |
Basic Webdav Storage backed Datastore. |
Class Inheritance Diagram¶
Example formatters¶
lsst.daf.butler.formatters.file Module¶
Classes¶
FileFormatter (fileDescriptor, dataId, …) |
Interface for reading and writing files on a POSIX file system. |
Class Inheritance Diagram¶
lsst.daf.butler.formatters.json Module¶
Classes¶
JsonFormatter (fileDescriptor, dataId, …) |
Interface for reading and writing Python objects to and from JSON files. |
Class Inheritance Diagram¶
lsst.daf.butler.formatters.matplotlib Module¶
Classes¶
MatplotlibFormatter (fileDescriptor, dataId, …) |
Interface for writing matplotlib figures. |
Class Inheritance Diagram¶
lsst.daf.butler.formatters.parquet Module¶
Classes¶
ParquetFormatter (fileDescriptor, dataId, …) |
Interface for reading and writing Pandas DataFrames to and from Parquet files. |
Class Inheritance Diagram¶
lsst.daf.butler.formatters.pickle Module¶
Classes¶
PickleFormatter (fileDescriptor, dataId, …) |
Interface for reading and writing Python objects to and from pickle files. |
Class Inheritance Diagram¶
lsst.daf.butler.formatters.yaml Module¶
Classes¶
YamlFormatter (fileDescriptor, dataId, …) |
Interface for reading and writing Python objects to and from YAML files. |
Class Inheritance Diagram¶
Database backends¶
lsst.daf.butler.registry.databases.sqlite Module¶
Classes¶
SqliteDatabase (*, connection, origin, …) |
An implementation of the Database interface for SQLite3. |
Class Inheritance Diagram¶
lsst.daf.butler.registry.databases.postgresql Module¶
Classes¶
PostgresqlDatabase (*, connection, origin, …) |
An implementation of the Database interface for PostgreSQL. |
Class Inheritance Diagram¶
Support API¶
lsst.daf.butler.core.utils Module¶
Functions¶
allSlots (self) |
Return combined __slots__ for all classes in an object's MRO. |
getClassOf (typeOrName, str]) |
Given the type name or a type, return the python type. |
getFullTypeName (cls) |
Return full type name of the supplied entity. |
getInstanceOf (typeOrName, str], *args, **kwargs) |
Given the type name or a type, instantiate an object of that type. |
immutable (cls) |
A class decorator that simulates a simple form of immutability for the decorated class. |
iterable (a) |
Make input iterable. |
safeMakeDir (directory) |
Make a directory in a manner that avoids race conditions. |
stripIfNotNone (s) |
Strip leading and trailing whitespace if the given object is not None. |
transactional (func) |
Decorator that wraps a method and makes it transactional. |
Class Inheritance Diagram¶
lsst.daf.butler.core.repoRelocation Module¶
Functions¶
replaceRoot (configRoot, butlerRoot, str, None]) |
Update a configuration root with the butler root location. |
Variables¶
BUTLER_ROOT_TAG |
The special string to be used in configuration files to indicate that the butler root location should be used. |
Test utilities¶
lsst.daf.butler.tests Package¶
Functions¶
addDatasetType (butler, name, dimensions, …) |
Add a new dataset type to a repository. |
expandUniqueId (butler, partialId) |
Return a complete data ID matching some criterion. |
makeTestCollection (repo) |
Create a read/write Butler to a fresh collection. |
makeTestRepo (root, dataIds, *[, config]) |
Create an empty repository with dummy data IDs. |
registerMetricsExample (butler) |
Modify a repository to support reading and writing MetricsExample objects. |
Classes¶
BadNoWriteFormatter (fileDescriptor, dataId, …) |
A formatter that always fails without writing anything. |
BadWriteFormatter (fileDescriptor, dataId, …) |
A formatter that never works but does leave a file behind. |
CliCmdTestBase |
A test case base that is used to verify that click command functions import and call their respective script functions correctly. |
CliLogTestBase |
Tests log initialization, reset, and setting log levels. |
DatasetTestHelper |
Helper methods for Datasets |
DatastoreTestHelper |
Helper methods for Datastore tests |
DummyRegistry () |
Dummy Registry, for Datastore test purposes. |
ListDelegate (storageClass) |
Parameter handler for list parameters |
MetricsDelegate (storageClass) |
Parameter handler for parameters using Metrics |
MetricsExample ([summary, output, data]) |
Smorgasbord of information that might be the result of some processing. |
MultiDetectorFormatter (fileDescriptor, …) |