Class SmoothedSpline¶
Defined in File Spline.h
Inheritance Relationships¶
Base Type¶
public lsst::afw::math::detail::Spline
(Class Spline)
Class Documentation¶
class SmoothedSpline : public lsst::afw::math::detail::Spline¶

Public Functions¶
SmoothedSpline(std::vector<double> const &x, std::vector<double> const &y, std::vector<double> const &dy, double s, double *chisq = NULL, std::vector<double> *errs = NULL)¶

Cubic spline data smoother.
Algorithm 642, Collected Algorithms from ACM. The algorithm appeared in ACM Trans. Math. Software, vol. 12, no. 2, June 1986, p. 150.
Translated from Fortran by a combination of f2c and RHL.
Latest revision: 15 August 1985. Author: M. F. Hutchinson, CSIRO Division of Mathematics and Statistics, P.O. Box 1965, Canberra, ACT 2601, Australia.
- Note
y, c: spline coefficients (output). y is an array of length n; c is an (n-1) by 3 matrix. The value of the spline approximation at t is s(t) = c[2][i]*d^3 + c[1][i]*d^2 + c[0][i]*d + y[i], where x[i] <= t < x[i+1] and d = t - x[i]. A sketch of this evaluation follows.
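For illustration only, here is a minimal C++ sketch of that evaluation, assuming hypothetical knot, value, and coefficient arrays x, y, and c laid out as described in the note above (these arrays are internal to the fit and are not necessarily exposed through the class interface):

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // Evaluate s(t) = c[2][i]*d^3 + c[1][i]*d^2 + c[0][i]*d + y[i]
    // on the interval with x[i] <= t < x[i+1], where d = t - x[i].
    double evaluatePiecewiseCubic(std::vector<double> const &x,               // knots, length n
                                  std::vector<double> const &y,               // values at the knots, length n
                                  std::vector<std::vector<double>> const &c,  // 3 rows, each of length n - 1
                                  double t) {
        // Locate i such that x[i] <= t < x[i+1]; values of t outside
        // [x[0], x[n-1]] are clamped to the first/last interval.
        auto it = std::upper_bound(x.begin(), x.end(), t);
        std::size_t i = (it == x.begin()) ? 0 : static_cast<std::size_t>(it - x.begin() - 1);
        if (i > x.size() - 2) {
            i = x.size() - 2;
        }
        double const d = t - x[i];
        return ((c[2][i] * d + c[1][i]) * d + c[0][i]) * d + y[i];  // Horner form of the cubic
    }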
- Note
var: error variance. If var is negative (i.e. unknown) then the smoothing parameter is determined by minimizing the generalized cross validation and an estimate of the error variance is returned. If var is non-negative (i.e. known) then the smoothing parameter is determined to minimize an estimate, which depends on var, of the true mean square error. In particular, if var is zero, then an interpolating natural cubic spline is calculated. Set var to 1 if absolute standard deviations have been provided in dy (see above).
- Note
Additional information on the fit is available in the stat array. On normal exit the values are assigned as follows:
stat[0] = smoothing parameter (= rho/(rho + 1))
stat[1] = estimate of the number of degrees of freedom of the residual sum of squares; this reduces to the usual value of n-2 when a least squares regression line is calculated
stat[2] = generalized cross validation
stat[3] = mean square residual
stat[4] = estimate of the true mean square error at the data points
stat[5] = estimate of the error variance; chi^2/nu in the case of linear regression
- Note
If stat[0]==0 (rho==0) an interpolating natural cubic spline has been calculated; if stat[0]==1 (rho==infinite) a least squares regression line has been calculated.
- Note
Returns stat[4], an estimate of the true rms error
- Note
precision/hardware - double (originally VAX double)
- Note
The number of arithmetic operations required by the subroutine is proportional to n. The subroutine uses an algorithm developed by M.F. Hutchinson and F.R. de Hoog, ‘Smoothing Noisy Data with Spline Functions’, Numer. Math. 47, p. 99 (1985).
- Parameters
[in] x: array of length n containing the abscissae of the n data points (x(i), f(i)), i = 0..n-1. x must be ordered so that x(i) < x(i+1).
[in] y: vector of length >= 3 containing the ordinates (or function values) of the data points.
[in] dy: vector of standard deviations of y, the error associated with each data point; each dy[] must be positive.
[in] s: desired chisq.
[out] chisq: final chisq (if non-NULL).
[out] errs: error estimates (if non-NULL). You’ll need to delete it.
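A rough usage sketch, not part of the original documentation, is given below. It assumes the include path shown and that the base Spline class provides an interpolate(xs, ys) evaluation method; consult Spline.h for the actual interface.

    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    #include "lsst/afw/math/detail/Spline.h"  // assumed include path

    int main() {
        // Noisy samples of sin(x) with constant 1-sigma errors of 0.1.
        std::size_t const n = 50;
        std::vector<double> x(n), y(n), dy(n, 0.1);
        for (std::size_t i = 0; i != n; ++i) {
            x[i] = 0.2 * i;
            y[i] = std::sin(x[i]) + 0.1 * (static_cast<int>(i % 3) - 1);  // crude stand-in for noise
        }

        // Target a chi^2 of roughly n, i.e. residuals consistent with the quoted dy.
        double chisq = 0.0;
        lsst::afw::math::detail::SmoothedSpline spline(x, y, dy, static_cast<double>(n), &chisq);
        std::printf("achieved chi^2 = %g\n", chisq);

        // Evaluate the smoothed curve at a few points (assumed interpolate() signature).
        std::vector<double> xFine{0.5, 1.5, 2.5}, yFine;
        spline.interpolate(xFine, yFine);
        for (std::size_t i = 0; i != xFine.size(); ++i) {
            std::printf("s(%g) = %g\n", xFine[i], yFine[i]);
        }
        return 0;
    }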