Job

class lsst.verify.Job(measurements=None, metrics=None, specs=None, meta=None)

Bases: JsonSerializationMixin

Container for Measurements, Blobs, and Metadata associated with a pipeline run.

Parameters:
measurements : MeasurementSet or list of Measurements, optional

Measurements to report in the Job.

metrics : list of Metrics or a MetricSet, optional

Optional list of Metrics, or a MetricSet.

specs : SpecificationSet or list of Specifications, optional

Optional specification information.

meta : dict, optional

Optional dictionary of metadata key-value entries.
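
As a rough sketch of constructing a Job directly (the metric name, measured value, and metadata key below are illustrative assumptions, not part of this API's definitions):

import astropy.units as u
from lsst.verify import Job, Measurement

# 'validate_drp.PA1' is a hypothetical example metric name; substitute a
# metric defined in your metrics package.
pa1_measurement = Measurement('validate_drp.PA1', 13.5 * u.mmag)

# Measurements and metadata can be passed directly to the constructor.
job = Job(measurements=[pa1_measurement], meta={'filter_name': 'r'})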

Attributes Summary

json

Job data as a JSON-serializable dict.

measurements

Measurements associated with the pipeline verification job (MeasurementSet).

meta

Metadata mapping (Metadata).

metrics

Metrics associated with the pipeline verification job (MetricSet).

specs

Specifications associated with the pipeline verification job (SpecificationSet).

Methods Summary

deserialize([measurements, blobs, metrics, ...])

Deserialize a Verification Framework Job from a JSON serialization.

dispatch([api_user, api_password, api_url])

POST the job to SQUASH, LSST Data Management's metric dashboard.

jsonify_dict(d)

Recursively build JSON-renderable objects on all values in a dict.

load_metrics_package([package_name_or_path, ...])

Create a Job with metrics and specifications pre-loaded from a Verification Framework metrics package, such as verify_metrics.

reload_metrics_package([...])

Load a metrics package and add metric and specification definitions to the Job, as well as the collected measurements.

report([name, spec_tags, metric_tags])

Create a verification report that lists the pass/fail status of measurements against specifications in this job.

write(filename)

Write a JSON serialization to the filesystem.

write_json(filepath)

Write JSON to a file.

Attributes Documentation

json

Job data as a JSON-serializable dict.

measurements

Measurements associated with the pipeline verification job (MeasurementSet).

meta

Metadata mapping (Metadata).

metrics

Metrics associated with the pipeline verification job (MetricSet).

specs

Specifications associated with the pipeline verification job (SpecificationSet).
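
As an illustrative sketch of inspecting these attributes (assuming the MeasurementSet and Metadata containers support the usual mapping operations):

# Iterate over measurements; MeasurementSet maps metric names to
# Measurement objects.
for metric_name, measurement in job.measurements.items():
    print(metric_name, measurement.quantity)

# Metadata entries can be set with dict-style assignment.
job.meta['camera'] = 'HSC'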

Methods Documentation

classmethod deserialize(measurements=None, blobs=None, metrics=None, specs=None, meta=None)

Deserialize a Verification Framework Job from a JSON serialization.

Parameters:
measurements : list, optional

List of serialized Measurement objects.

blobs : list, optional

List of serialized Blob objects.

metrics : list, optional

List of serialized Metric objects.

specs : list, optional

List of serialized specification objects.

meta : dict, optional

Dictionary of key-value metadata entries.

Returns:
job : Job

Job instance built from serialized data.

Examples

Together, Job.json and Job.deserialize allow a verification job to be serialized and later re-instantiated.

>>> import json
>>> job = Job()
>>> json_str = json.dumps(job.json)
>>> json_obj = json.loads(json_str)
>>> new_job = Job.deserialize(**json_obj)

dispatch(api_user=None, api_password=None, api_url='https://squash-restful-api.lsst.codes', **kwargs)

POST the job to SQUASH, LSST Data Management’s metric dashboard.

Parameters:
api_url : str, optional

Root URL of the SQUASH API server.

api_user : str, optional

API username.

api_password : str, optional

API password.

**kwargs : optional

Additional keyword arguments passed to lsst.verify.squash.post.

Returns:
output : response

The response from the POST request to the SQuaSH API.
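
A minimal sketch of dispatching a job, assuming valid SQuaSH credentials (the username and password shown are placeholders):

# Placeholder credentials; supply real SQuaSH API credentials in practice.
response = job.dispatch(
    api_user='squash-user',
    api_password='squash-password',
    api_url='https://squash-restful-api.lsst.codes',
)
print(response.status_code)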

static jsonify_dict(d)

Recursively build JSON-renderable objects on all values in a dict.

Parameters:
d : dict

Dictionary to convert into a JSON-serializable object. Values are recursively JSON-ified.

Returns:
json_dict : dict

Dictionary that can be serialized to JSON.

Examples

Subclasses can use this method to prepare output in their json-method implementation. For example:

def json(self):
    return JsonSerializationMixin.jsonify_dict({
        'value': self.value,
    })

classmethod load_metrics_package(package_name_or_path='verify_metrics', subset=None, measurements=None, meta=None)

Create a Job with metrics and specifications pre-loaded from a Verification Framework metrics package, such as verify_metrics.

Parameters:
package_name_or_path : str, optional

Name of an EUPS package that hosts metric and specification definition YAML files or the file path to a metrics package. 'verify_metrics' is the default package, and is where metrics and specifications are defined for most LSST Science Pipelines packages.

subset : str, optional

If set, only metrics and specifications for this package are loaded. For example, if subset='validate_drp', only validate_drp metrics are loaded. This argument is equivalent to the MetricSet.subset method. Default is None.

measurements : MeasurementSet or list of Measurements, optional

Measurements to report in the Job.

meta : dict, optional

Optional dictionary of metadata key-value entries to include in the Job.

Returns:
job : Job

Job instance.
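
For example, a Job pre-loaded with only the validate_drp definitions from the default verify_metrics package (the subset name is illustrative):

from lsst.verify import Job

# Load metric and specification definitions for a single package from the
# default verify_metrics EUPS package.
job = Job.load_metrics_package(subset='validate_drp')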

reload_metrics_package(package_name_or_path='verify_metrics', subset=None)

Load a metrics package and add metric and specification definitions to the Job, as well as the collected measurements.

Parameters:
package_name_or_path : str, optional

Name of an EUPS package that hosts metric and specification definition YAML files or the file path to a metrics package. 'verify_metrics' is the default package, and is where metrics and specifications are defined for most packages.

subset : str, optional

If set, only metrics and specifications for this package are loaded. For example, if subset='validate_drp', only validate_drp metrics are included in the MetricSet. This argument is equivalent to the MetricSet.subset method. Default is None.

Notes

This method is useful for loading metric and specification definitions into a job that was created without this information. In addition to being added to Job.metrics, metrics are also attached to Job.measurements items. This ensures that measurement values are normalized into the units of the metric definition when a Job is serialized.
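
A minimal sketch, assuming an existing job whose measurements were collected without metric definitions attached (the subset name is illustrative):

# Attach metric and specification definitions so that measurement values
# are normalized to the metric units when the job is serialized.
job.reload_metrics_package(subset='validate_drp')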

report(name=None, spec_tags=None, metric_tags=None)

Create a verification report that lists the pass/fail status of measurements against specifications in this job.

In a Jupyter notebook, this report can be shown as an inline table.

Parameters:
name : str or lsst.verify.Name, optional

A package or metric name to subset specifications by. When set, only measurement and specification combinations belonging to that package or metric are included in the report.

spec_tags : sequence of str, optional

A set of specification tag strings. When given, only specifications that have all the given tags are included in the report. For example, spec_tags=['LPM-17', 'minimum'].

metric_tags : sequence of str, optional

A set of metric tag strings. When given, only specifications belonging to metrics that possess all given tags are included in the report. For example, metric_tags=['LPM-17', 'photometry'] selects specifications that have both the 'LPM-17' and 'photometry' tags.

Returns:
report : lsst.verify.Report

Report instance. In a Jupyter notebook, you can view the report by calling Report.show.

Notes

This method uses the lsst.verify.SpecificationSet.report API to create the lsst.verify.Report, automatically inserting the Job's measurements and metadata for filtering specification tests.

In a Jupyter notebook environment, use the lsst.verify.Report.show method to view an interactive HTML table.

import lsst.verify

job = lsst.verify.Job()
# ... add measurements, metrics, and specifications to the job ...
report = job.report()
report.show()

write(filename)

Write a JSON serialization to the filesystem.

Parameters:
filename : str

Name of the JSON file (including directories). This name should be unique among all task executions in a pipeline. The recommended extension is '.verify.json'. This convention is used by post-processing tools to discover verification framework outputs.
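
For example (the filename is illustrative; it follows the recommended '.verify.json' convention):

# Write the job's JSON serialization to disk.
job.write('my_task.verify.json')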

write_json(filepath)

Write JSON to a file.

Parameters:
filepath : str

Destination file name for JSON output.