Job¶
class lsst.verify.Job(measurements=None, metrics=None, specs=None, meta=None)[source]¶

Bases: lsst.verify.jsonmixin.JsonSerializationMixin

Container for Measurements, Blobs, and Metadata associated with a pipeline run.

Parameters:

measurements : MeasurementSet or list of Measurements, optional
    Measurements to report in the Job.
metrics : list of Metrics or a MetricSet, optional
specs : SpecificationSet or list of Specifications, optional
    Optional specification information.
meta : dict, optional
    Optional dictionary of metadata key-value entries.
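For example, a Job can be assembled directly from measurements made during a run. The snippet below is a minimal sketch; the metric name ('validate_drp.PA1'), the measured value, and the metadata key are placeholders:

import astropy.units as u
from lsst.verify import Job, Measurement

# A hypothetical measurement of a photometric repeatability metric
# (metric name and value are placeholders).
pa1 = Measurement('validate_drp.PA1', 5.0 * u.mmag)

# Bundle the measurement and run metadata into a Job.
job = Job(measurements=[pa1], meta={'filter_name': 'r'})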
Attributes Summary

json
    Job data as a JSON-serializable dict.
measurements
    Measurements associated with the pipeline verification job (MeasurementSet).
meta
    Metadata mapping (Metadata).
metrics
    Metrics associated with the pipeline verification job (MetricSet).
specs
    Specifications associated with the pipeline verification job (SpecificationSet).

Methods Summary

deserialize([measurements, blobs, metrics, ...])
    Deserialize a Verification Framework Job from a JSON serialization.
dispatch([api_user, api_password, api_url])
    POST the job to SQUASH, LSST Data Management’s metric dashboard.
jsonify_dict(d)
    Recursively build JSON-renderable objects on all values in a dict.
load_metrics_package([package_name_or_path, ...])
    Create a Job with metrics and specifications pre-loaded from a Verification Framework metrics package, such as verify_metrics.
reload_metrics_package([...])
    Load a metrics package and add metric and specification definitions to the Job, as well as to the collected measurements.
report([name, spec_tags, metric_tags])
    Create a verification report that lists the pass/fail status of measurements against specifications in this job.
write(filename)
    Write a JSON serialization to the filesystem.
write_json(filepath)
    Write JSON to a file.

Attributes Documentation
measurements¶

Measurements associated with the pipeline verification job (MeasurementSet).
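Measurements can be added to an existing job through this set. A sketch follows; MeasurementSet is assumed here to provide an insert method for adding a single measurement, and the metric name and value are placeholders:

import astropy.units as u
from lsst.verify import Job, Measurement

job = Job.load_metrics_package()
# Insert a new measurement into the job's MeasurementSet
# ('validate_drp.PA1' and the value are placeholders).
job.measurements.insert(Measurement('validate_drp.PA1', 5.0 * u.mmag))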
specs¶

Specifications associated with the pipeline verification job (SpecificationSet).
Methods Documentation
classmethod deserialize(measurements=None, blobs=None, metrics=None, specs=None, meta=None)[source]¶

Deserialize a Verification Framework Job from a JSON serialization.

Parameters:

measurements : list, optional
    List of serialized Measurement objects.
blobs : list, optional
    List of serialized Blob objects.
metrics : list, optional
    List of serialized Metric objects.
specs : list, optional
    List of serialized specification objects.
meta : dict, optional
    Dictionary of key-value metadata entries.

Returns:

job : Job
    Job instance built from serialized data.

Examples

Together, Job.json and Job.deserialize allow a verification job to be serialized and later re-instantiated.

>>> import json
>>> from lsst.verify import Job
>>> job = Job()
>>> json_str = json.dumps(job.json)
>>> json_obj = json.loads(json_str)
>>> new_job = Job.deserialize(**json_obj)
dispatch(api_user=None, api_password=None, api_url='https://squash-restful-api.lsst.codes', **kwargs)[source]¶

POST the job to SQUASH, LSST Data Management’s metric dashboard.

Parameters:

api_url : str, optional
    Root URL of the SQUASH API server.
api_user : str, optional
    API username.
api_password : str, optional
    API password.
**kwargs : optional
    Additional keyword arguments passed to lsst.verify.squash.post.
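A minimal sketch of dispatching a job; the credentials shown are placeholders for ones issued by the SQUASH service:

from lsst.verify import Job

job = Job()
# ... add measurements and metadata before dispatching ...

# Placeholder credentials; real values come from the SQUASH service.
job.dispatch(api_user='squash-bot', api_password='XXXXXXXX')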
jsonify_dict(d)¶

Recursively build JSON-renderable objects on all values in a dict.

Parameters:

d : dict
    Dictionary to convert into a JSON-serializable object. Values are recursively JSON-ified.

Returns:

json_dict : dict
    Dictionary that can be serialized to JSON.

Examples

Subclasses can use this method to prepare output in their json-method implementation. For example:

def json(self):
    return JsonSerializationMixin.jsonify_dict({
        'value': self.value,
    })
classmethod load_metrics_package(package_name_or_path='verify_metrics', subset=None, measurements=None, meta=None)[source]¶

Create a Job with metrics and specifications pre-loaded from a Verification Framework metrics package, such as verify_metrics.

Parameters:

package_name_or_path : str, optional
    Name of an EUPS package that hosts metric and specification definition YAML files, or the file path to a metrics package. 'verify_metrics' is the default package, and is where metrics and specifications are defined for most LSST Science Pipelines packages.
subset : str, optional
    If set, only metrics and specifications for this package are loaded. For example, if subset='validate_drp', only validate_drp metrics are loaded. This argument is equivalent to the MetricSet.subset method. Default is None.
measurements : MeasurementSet or list of Measurements, optional
    Measurements to report in the Job.
meta : dict, optional
    Optional dictionary of metadata key-value entries to include in the Job.

Returns:

job : Job
    Job instance.
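For example, to load only the validate_drp metrics and specifications from the default verify_metrics package and report a measurement against them (the measurement below is a placeholder):

import astropy.units as u
from lsst.verify import Job, Measurement

# Load only the validate_drp subset of the default verify_metrics package.
job = Job.load_metrics_package(
    subset='validate_drp',
    # Placeholder measurement for illustration.
    measurements=[Measurement('validate_drp.PA1', 5.0 * u.mmag)],
)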
reload_metrics_package(package_name_or_path='verify_metrics', subset=None)[source]¶

Load a metrics package and add metric and specification definitions to the Job, as well as to the collected measurements.

Parameters:

package_name_or_path : str, optional
    Name of an EUPS package that hosts metric and specification definition YAML files, or the file path to a metrics package. 'verify_metrics' is the default package, and is where metrics and specifications are defined for most packages.
subset : str, optional
    If set, only metrics and specifications for this package are loaded. For example, if subset='validate_drp', only validate_drp metrics are included in the MetricSet. This argument is equivalent to the MetricSet.subset method. Default is None.

Notes

This method is useful for loading metric and specification definitions into a job that was created without this information. In addition to being added to Job.metrics, metrics are also attached to Job.measurements items. This ensures that measurement values are normalized into the units of the metric definition when a Job is serialized.
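A minimal sketch of that workflow; the metric name and value are placeholders:

import astropy.units as u
from lsst.verify import Job, Measurement

# A Job created from measurements alone, without metric definitions.
job = Job(measurements=[Measurement('validate_drp.PA1', 5.0 * u.mmag)])

# Attach metric and specification definitions from the default package;
# the measurements then carry their metric definitions so values can be
# normalized to the metric's units when the job is serialized.
job.reload_metrics_package('verify_metrics')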
report(name=None, spec_tags=None, metric_tags=None)[source]¶

Create a verification report that lists the pass/fail status of measurements against specifications in this job.

In a Jupyter notebook, this report can be shown as an inline table.

Parameters:

name : str or lsst.verify.Name, optional
    A package or metric name to subset specifications by. When set, only measurement and specification combinations belonging to that package or metric are included in the report.
spec_tags : sequence of str, optional
    A set of specification tag strings. When given, only specifications that have all the given tags are included in the report. For example, spec_tags=['LPM-17', 'minimum'].
metric_tags : sequence of str, optional
    A set of metric tag strings. When given, only specifications belonging to metrics that possess all the given tags are included in the report. For example, metric_tags=['LPM-17', 'photometry'] selects specifications that have both the 'LPM-17' and 'photometry' tags.

Returns:

report : lsst.verify.Report
    Report instance. In a Jupyter notebook, you can view the report by calling Report.show.

Notes

This method uses the lsst.verify.SpecificationSet.report API to create the lsst.verify.Report, automatically inserting the Job's measurements and metadata for filtering specification tests.

In a Jupyter notebook environment, use the lsst.verify.Report.show method to view an interactive HTML table.

import lsst.verify

job = lsst.verify.Job()
# ...
report = job.report()
report.show()
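The report can also be restricted with the tag arguments described above. A sketch, reusing the 'minimum' tag from the spec_tags example:

import lsst.verify

job = lsst.verify.Job.load_metrics_package()
# ... add measurements ...

# Only specifications tagged 'minimum' are included in this report.
report = job.report(spec_tags=['minimum'])
report.show()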
write(filename)[source]¶

Write a JSON serialization to the filesystem.

Parameters:

filename : str
    Name of the JSON file (including directories). This name should be unique among all task executions in a pipeline. The recommended extension is '.verify.json'. This convention is used by post-processing tools to discover verification framework outputs.
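For example, following the recommended naming convention (the task name in the file name is a placeholder):

from lsst.verify import Job

job = Job()
# ... add measurements ...

# 'processCcd' is a placeholder task identifier; the '.verify.json'
# extension follows the recommended convention.
job.write('processCcd.verify.json')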