# Characterization Metric Report
Starting with Summer 2015, administrative (“cycle”) releases are accompanied by a measurements report characterizing current performance. The metrics included in these reports are expected to grow in number and sophistication with subsequent releases. This brief report describes the measurements of interest that were carried out for the Summer 2015 cycle. In Winter 2016 we focused on building the infrastructure to calculate and monitor these metrics automatically, so that we can provide more extensive updates in the future; as a result, there are no changes from last cycle’s values.
## Summary of Photometric Repeatability Measurements
Submitted by Jim Bosch
This dataset is a selection of i-band Hyper Suprime-Cam (HSC) engineering data taken in the SDSS Stripe 82 region. It consists of 30 s exposures, so it is roughly comparable in depth to projected LSST data. Our current calibration approach has many limitations relative to what we ultimately plan to implement for LSST:
- No relative calibration is currently being run.
- Only limited corrections for chromatic effects are applied.
- No allowance is made for zeropoint variations on scales smaller than a CCD.
- The sample selection is much simpler than that proposed by the SRD.
A Jupyter notebook to compute the metrics can be found at https://github.com/lsst/afw/blob/tickets/DM-3896/examples/repeatability.ipynb.
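For illustration, here is a minimal sketch of how a PA1-style repeatability statistic can be estimated from repeated magnitude measurements of matched sources: per-source residuals about the mean magnitude are pooled, and their interquartile range is converted to an equivalent Gaussian sigma. The function name, the synthetic input data, and the IQR-based estimator are assumptions made for this example and are not necessarily the procedure implemented in the notebook linked above.

```python
import numpy as np

def photometric_repeatability_mmag(mags_by_source):
    """Estimate a PA1-style repeatability value in mmag.

    ``mags_by_source`` is a list of 1-D arrays, one per matched source,
    each holding that source's PSF magnitudes from repeated visits.
    This sketch pools the residuals about each source's mean magnitude
    and converts their interquartile range to an equivalent Gaussian
    sigma, a common robust scatter estimator.
    """
    residuals = []
    for mags in mags_by_source:
        mags = np.asarray(mags, dtype=float)
        if mags.size < 2:
            continue  # need at least two visits to measure repeatability
        residuals.append(mags - mags.mean())
    residuals = np.concatenate(residuals)
    q25, q75 = np.percentile(residuals, [25, 75])
    sigma_mag = (q75 - q25) / 1.349  # IQR -> sigma for a Gaussian
    return 1000.0 * sigma_mag  # magnitudes -> millimagnitudes

# Synthetic check: 200 sources, 10 visits each, 10 mmag of true scatter.
rng = np.random.default_rng(42)
fake = [20.0 + rng.normal(0.0, 0.010, size=10) for _ in range(200)]
print(f"PA1-style repeatability: {photometric_repeatability_mmag(fake):.1f} mmag")
```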
| Metric Characterized | Metric Ref | Target | Measured Value | Measurement Method |
| --- | --- | --- | --- | --- |
| Photometric repeatability (procCalRep) | DLP-307 | \(\leq 13\) mmag | 10.6 mmag | DM-3338 (i band) |
| Photometric repeatability (PA1gri) | DLP-315 | \(\leq 13\) mmag | 10.6 mmag | DM-3338 |
| Photometric repeatability (PA1uzy) | DLP-316 | \(\leq 13\) mmag | 10.6 mmag | DM-3338 (i band) |
## Summary of Algorithmic Performance Measurements
Submitted by John Swinbank
The i-band HSC engineering data described above were used where possible, and the same caveats apply. Consult the tickets in the Measurement Method column for more details.
| Metric Characterized | Metric Ref | Target | Measured Value | Measurement Method |
| --- | --- | --- | --- | --- |
| Residual PSF Ellipticity Correlations (TE1) | DLP-290 | \(\leq 5\times 10^{-3}\) | \(6\times 10^{-5}\) | DM-3040 |
| Residual PSF Ellipticity Correlations (TE2) | DLP-290 | \(\leq 5\times 10^{-3}\) | \(2\times 10^{-5}\) | DM-3047 |
| Relative Astrometry (AM1) | DLP-310 | \(< 60\) mas | 12.49 mas | DM-3057 |
| Relative Astrometry (AM2) | DLP-311 | \(< 60\) mas | 12.19 mas | DM-3064 |
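For illustration, the sketch below captures the flavor of an AM1-style calculation: for pairs of matched sources whose mean separation is close to the fiducial 5 arcmin scale, the per-visit separations are computed and the scatter about each pair's mean separation is collected; the median per-pair RMS is reported in milliarcseconds. The function, its signature, the small-angle approximation, and the pair-selection tolerance are assumptions made for this example; consult DM-3057 and DM-3064 for the calculations actually used.

```python
import itertools
import numpy as np

def relative_astrometry_mas(ra_by_visit, dec_by_visit, d_arcmin=5.0, tol_arcmin=1.0):
    """Estimate an AM1-style relative-astrometry statistic in milliarcseconds.

    ``ra_by_visit`` and ``dec_by_visit`` are arrays of shape
    (n_visits, n_sources), in degrees, for sources matched across visits.
    """
    ra = np.radians(np.asarray(ra_by_visit, dtype=float))
    dec = np.radians(np.asarray(dec_by_visit, dtype=float))
    per_pair_rms = []
    for i, j in itertools.combinations(range(ra.shape[1]), 2):
        # Small-angle separation per visit, with the cos(dec) correction.
        dra = (ra[:, i] - ra[:, j]) * np.cos(0.5 * (dec[:, i] + dec[:, j]))
        ddec = dec[:, i] - dec[:, j]
        sep_arcmin = np.degrees(np.hypot(dra, ddec)) * 60.0
        if abs(sep_arcmin.mean() - d_arcmin) > tol_arcmin:
            continue  # keep only pairs near the fiducial separation
        # Scatter of this pair's separation across visits, in mas.
        per_pair_rms.append(np.std(sep_arcmin) * 60.0 * 1000.0)
    return float(np.median(per_pair_rms))

# Synthetic check: two sources ~5 arcmin apart, 20 visits,
# with ~10 mas of per-visit positional jitter on each source.
rng = np.random.default_rng(0)
jitter_deg = 10.0 / 3.6e6  # 10 mas expressed in degrees
ra0 = np.array([150.0, 150.0]) + rng.normal(0.0, jitter_deg, size=(20, 2))
dec0 = np.array([2.0, 2.0 + 5.0 / 60.0]) + rng.normal(0.0, jitter_deg, size=(20, 2))
print(f"AM1-style scatter: {relative_astrometry_mas(ra0, dec0):.1f} mas")
```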
## Summary of Computational Performance Measurements
Submitted by John Swinbank and Simon Krughoff
At this point in Construction, the computational performance measurements are a combination of precursor data processing and extrapolation from R&D assumptions.
DECam/HITS data were used for the OTT1 estimate and for the diffim and single-frame measurement portions of the Alert Production Computational Budget, in combination with data from the 3rd Data Challenge.
For the Data Release Production (DRP) Computational Budget, we used DECam/HITS data to estimate diffim performance, HSC-I data for coadd assembly, coadd measurement, and forced measurement, estimates from the FDR for multifit, and data from the 3rd Data Challenge for SDQA. Calculations for the DRP computational budget used this IPython notebook.
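To illustrate the style of extrapolation involved, a minimal sketch follows that converts a measured per-visit compute cost on precursor data into a sustained TFLOPS requirement, assuming the processing must keep pace with a night's worth of visits in real time. Every input value here (core-hours per visit, visits per night, night length, sustained GFLOPS per core) is a placeholder assumption and not one of the numbers used in the budget calculations referenced above.

```python
def sustained_tflops(core_hours_per_visit, visits_per_night,
                     night_seconds=8.0 * 3600.0, gflops_per_core=10.0):
    """Back-of-the-envelope sustained compute requirement in TFLOPS.

    All arguments are placeholders: measured core-hours per visit on a
    precursor dataset, assumed visits per night, length of the observing
    night, and an assumed sustained GFLOPS per core for the reference
    hardware.  The actual budget calculations live in the notebook and
    tickets referenced above.
    """
    core_seconds_per_night = core_hours_per_visit * 3600.0 * visits_per_night
    cores_needed = core_seconds_per_night / night_seconds  # keep up in real time
    return cores_needed * gflops_per_core / 1000.0  # GFLOPS -> TFLOPS

# Placeholder example: 50 core-hours per visit, 1000 visits per night.
print(f"~{sustained_tflops(50.0, 1000):.0f} TFLOPS sustained")
```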
| Metric Characterized | Metric Ref | Target | Measured Value | Measurement Method |
| --- | --- | --- | --- | --- |
| OTT1 | DLP-328 | \(\leq 240\) sec | 200-250 sec | DM-3724 |
| AP Computational Budget | DLP-329 | \(\leq 231\) TFLOPS | 34-39 TFLOPS | DM-3267 |
| DRP Computational Budget | DLP-314 | \(\leq 645\) TFLOPS | 318 TFLOPS | DM-3083 |