16.1.2 Linear Chi Squared Minimization
For least squares minimization we assumed that one of the variables ($y$) contained the error that accounted for the deviation of the data from the model we want to fit. This error was not quantified by the measurement; furthermore, we gave each error term equal weight in the total error to be minimized.
What if we had a measurement of the uncertainty of each of our measurements? Let's characterize these uncertainties using the standard deviation of each measurement: $\sigma_i$.
We now want to weight the contribution that each error value gives to the total error by the uncertainties $\sigma_i$. Ideally we want the model to fit within the uncertainties of the data points (or at least within the fraction of the data points given by the confidence level of the uncertainty). This means that we want to prioritize minimizing the error given by points with low uncertainty, or conversely, to suppress the points with high uncertainty. To accomplish this we will minimize the $\chi^2$ value of the data:

$$\chi^2 = \sum_{i=1}^{N} \frac{e_i^2}{\sigma_i^2}$$

where each error value $e_i$ is weighted by dividing it by the uncertainty $\sigma_i$. Note that if all of the $\sigma_i$ were constant, we would be back to least squares (the constant multiplicative factor $1/\sigma^2$ drops out in the minimization).
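As a minimal sketch of this weighting (the function name and data here are illustrative, not from the text), dividing each residual by its uncertainty before squaring means a point with small $\sigma_i$ dominates the total:

```python
import numpy as np

def chi_squared(y_data, y_model, sigma):
    """Sum of squared residuals, each residual divided by its uncertainty."""
    r = (np.asarray(y_data) - np.asarray(y_model)) / np.asarray(sigma)
    return float(np.sum(r**2))

# Two points with equal-ish residuals but very different uncertainties:
# the low-sigma point contributes (1/0.1)^2 = 100, the other (2/1)^2 = 4.
chi2 = chi_squared([1.0, 2.0], [0.0, 0.0], [0.1, 1.0])  # -> 104.0
```

If both uncertainties were equal, this would reduce to ordinary least squares up to a constant factor.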
With 1 Independent Variable
Returning to our scenario with two variables $x$ and $y$, modeled by the linear functional relation:

$$y(x) = a + bx$$

with a data set of $N$ measured $x_i$ and $y_i$ values, with $\sigma_i$ as the uncertainty of the values for $y_i$, $\chi^2$ can now be written as:

$$\chi^2 = \sum_{i=1}^{N} \frac{(y_i - a - b x_i)^2}{\sigma_i^2}$$
Minimizing $\chi^2$ with respect to $a$ and $b$ will yield:

$$a = \frac{\langle x^2\rangle\langle y\rangle - \langle x\rangle\langle xy\rangle}{\langle x^2\rangle - \langle x\rangle^2}, \qquad b = \frac{\langle xy\rangle - \langle x\rangle\langle y\rangle}{\langle x^2\rangle - \langle x\rangle^2}$$

where the uncertainty-weighted expectation value of a quantity $q$ is

$$\langle q\rangle = \frac{\sum_{i=1}^{N} q_i/\sigma_i^2}{\sum_{i=1}^{N} 1/\sigma_i^2}$$
Note that in practice the normalization factors $\sum_i 1/\sigma_i^2$ in the expectation values cancel out of these ratios, so they need not be computed explicitly.
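The closed-form solution above can be sketched as follows, assuming the linear model $y = a + bx$ and uncertainty-weighted expectation values (the function name and test data are illustrative):

```python
import numpy as np

def chi2_linear_fit(x, y, sigma):
    """Weighted (chi-squared) linear fit y = a + b*x.

    Uses uncertainty-weighted expectation values
    <q> = sum(q_i / sigma_i^2) / sum(1 / sigma_i^2).
    """
    x, y, sigma = map(np.asarray, (x, y, sigma))
    w = 1.0 / sigma**2

    def expect(q):
        # Weighted expectation value; the normalization sum(w)
        # cancels in the ratios below but is kept for clarity.
        return np.sum(w * q) / np.sum(w)

    xm, ym = expect(x), expect(y)
    x2m, xym = expect(x * x), expect(x * y)
    denom = x2m - xm**2
    a = (x2m * ym - xm * xym) / denom
    b = (xym - xm * ym) / denom
    return a, b

# Data lying exactly on y = 1 + 2x should be recovered
# regardless of the (unequal) per-point uncertainties.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x
sigma = np.array([0.1, 0.2, 0.1, 0.3])
a, b = chi2_linear_fit(x, y, sigma)
```

Because the data are exactly linear here, the fit returns $a = 1$, $b = 2$ up to floating-point error; with noisy data the low-$\sigma_i$ points would pull the line toward themselves.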