dispatch_verify.py
Upload LSST Science Pipelines Verification Job datasets to the SQUASH dashboard.
Job JSON files can be created by lsst.verify.Job.write or lsst.verify.output_quantities. A Job dataset consists of metric measurements, associated blobs, and pipeline execution metadata. Individual LSST Science Pipelines tasks typically write separate JSON datasets. This command can collect and combine multiple Job JSON datasets into a single Job upload.
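For example, a minimal sketch of an upload that merges several per-task Job files (the file names and SQUASH URL are placeholders, not real endpoints):

$ dispatch_verify.py --url https://squash.example.org/api --user jdoe \
      task_a.verify.json task_b.verify.json task_c.verify.json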
Configuration
dispatch_verify.py is configurable from both the command line and environment variables. See the argument documentation for environment variable equivalents. Command line settings override environment variable configuration.
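As a sketch (the URL and username are placeholders), the SQUASH connection can be configured through the environment variables documented under SQUASH API arguments below, and any of them can be overridden on the command line for a single run:

$ export SQUASH_URL=https://squash.example.org/api
$ export SQUASH_USER=jdoe
$ # --url overrides $SQUASH_URL for this invocation only
$ dispatch_verify.py --url https://squash-sandbox.example.org/api job.verify.json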
Metadata and environment
dispatch_verify.py can enrich Verification Job metadata with information from the environment. Currently dispatch_verify.py supports the Jenkins CI and the LSST Data Facility (LDF) execution environments.
In the Jenkins CI execution environment (--env=jenkins) the following environment variables are consumed:

- BUILD_ID: ID in the CI system
- BUILD_URL: CI page with information about the build
- PRODUCT: the name of the product built, e.g. ‘validate_drp’
- dataset: the name of the dataset processed, e.g. ‘validation_data_cfht’
- label: the name of the platform where it runs
- refs: the branches run by Jenkins, e.g. ‘tickets/DM-12345 main’
If --lsstsw is used, additional Git branch information is included with Science Pipelines package metadata.
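A sketch of a Jenkins invocation, assuming the CI job exports the variables above and that lsstsw is checked out at ./lsstsw (all values, paths, and file names are placeholders):

$ export PRODUCT=validate_drp
$ export dataset=validation_data_cfht
$ export refs='tickets/DM-12345 main'
$ dispatch_verify.py --env=jenkins --lsstsw ./lsstsw validate_drp.verify.json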
In the LSST Data Facility execution environment (--env=ldf) the following environment variables are consumed:

- DATASET: the name of the dataset processed, e.g. ‘HSC RC2’
- DATASET_REPO_URL: a reference URL with information about the dataset
- RUN_ID: ID of the run in the LDF environment
- RUN_ID_URL: a reference URL with information about the run
- VERSION_TAG: the version tag of the LSST software used, e.g. ‘w_2018_18’
Note: currently it is not possible to gather Science Pipelines package metadata in the LDF environment, so if --env=ldf is used, --ignore-lsstsw is also used by default in this environment.
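A sketch of an LDF invocation, assuming the run environment exports the variables above (the values and file pattern are placeholders):

$ export DATASET='HSC RC2'
$ export VERSION_TAG=w_2018_18
$ dispatch_verify.py --env=ldf hsc_rc2/*.verify.json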
usage: dispatch_verify.py [-h] [--test] [--write PATH] [--show]
[--ignore-blobs] [--env {jenkins,ldf}]
[--lsstsw PATH] [--package-repos [PATH [PATH ...]]]
[--ignore-lsstsw] [--url URL] [--user USER]
[--password PASSWORD] [--date-created DATE_CREATED]
json [json ...]
More information is available at https://pipelines.lsst.io.
positional arguments

- json
  Verification job JSON file, or files. When multiple JSON files are present, their measurements, blobs, and metadata are merged.
optional arguments

- -h, --help
  show this help message and exit
- --test
  Run this command without uploading to the SQUASH service. The JSON payload is printed to standard out.
- --write <path>
  Write the merged and enriched Job JSON dataset to the given path.
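For a local dry run, --test and --write can be combined to inspect and save the assembled payload without uploading (file names are placeholders):

$ dispatch_verify.py --test --write merged_job.json task_a.verify.json task_b.verify.json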
- --show
  Print the assembled Job JSON to standard output.
- --ignore-blobs
  Ignore data blobs even if they are available in the verification job.
Environment arguments

- --env {jenkins,ldf}
  Name of the environment where the verification job is being run. In some environments dispatch_verify.py will gather additional metadata automatically:
  - jenkins: the Jenkins CI (https://ci.lsst.codes) environment
  - ldf: the LSST Data Facility environment
  Equivalent to the $VERIFY_ENV environment variable.
- --lsstsw <path>
  lsstsw directory path. If available, Stack package versions are read from lsstsw. Equivalent to the $LSSTSW environment variable. Disabled with --ignore-lsstsw.
- --package-repos <path>
  Paths to additional Stack package Git repositories. These packages are tracked in Job metadata, like lsstsw-based packages.
- --ignore-lsstsw
  Ignore lsstsw metadata even if it is available (for example, the $LSSTSW variable is set).
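A sketch showing lsstsw-based package metadata plus additional repositories tracked in the Job metadata (the paths and file name are placeholders):

$ dispatch_verify.py --lsstsw ~/lsstsw --package-repos ~/repos/validate_drp ~/repos/verify job.verify.json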
SQUASH API arguments

- --url <url>
  Root URL of the SQUASH API. Equivalent to the $SQUASH_URL environment variable.
- --user <user>
  Username for SQUASH API. Equivalent to the $SQUASH_USER environment variable.
- --password <password>
  Password for SQUASH API. Equivalent to the $SQUASH_PASSWORD environment variable. If neither the option nor the environment variable is set, you will be prompted.
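As a sketch, leaving the password out of both the command line and the environment keeps it out of shell history; the script then prompts for it interactively (username and file name are placeholders):

$ unset SQUASH_PASSWORD
$ dispatch_verify.py --user jdoe job.verify.json   # prompts for the SQUASH password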
- --date-created <date_created>
  ISO8601-formatted datetime in UTC for the Job creation date, e.g. 2021-06-30T22:28:25Z. If not provided, the current datetime is used.
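For example, to record the time the pipeline run actually completed rather than the upload time (the timestamp and file name are placeholders):

$ dispatch_verify.py --date-created 2021-06-30T22:28:25Z job.verify.json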