% ========================================================================
% Description: this is a least squares example which fits 3 points with a
% linear least squares fit and plots the fit along with the original data,
% with a note on extending to higher-order (polynomial) fits.
% This is designed to go with the least squares handout.
%
% Author : C. Alex Simpkins
%
% Date : Oct 29, 2006
%
% ========================================================================
x_data=[1; 3; 2];
y_data=[2; 4; 3.5];
plot(x_data,y_data,'*');
xlabel('x');
ylabel('y');
title('Demo of linear least squares fitting of data');
%we want to fit the line y = a(1) + a(2)*x, where the a's are constant
%coefficients. So if we think of this as a linear algebra problem, we
%could state it as:
%
% 2 = a(1) + a(2)*(1)
% 4 = a(1) + a(2)*(3)
% 3.5= a(1) + a(2)*(2)
%
% or equivalently, in matrix form (imagine each column of brackets below
% as one large bracket):
%
% [ 2   ]   [ 1  1 ]
% [ 4   ] = [ 1  3 ] * [ a(1) ]
% [ 3.5 ]   [ 1  2 ]   [ a(2) ]
%
% define b = [2; 4; 3.5]; A = [1 1; 1 3; 1 2]; then solve with a = A\b
%
%this helps if you want to make an arbitrarily large fit (i.e. if you had
%81 rows, you would not type in 1 eighty-one times; instead use the ones
%function, ones(81,1), to generate the column of ones). The other column
%of A is your x data as a column vector, and b is your y data as a column
%vector.
tempA1 = ones(3,1);
%build the matrix of 2 columns: a column of ones corresponding to the
%constant coefficient a(1), and a column of the x values of each x,y data
%pair corresponding to the coefficient a(2)...
A = [ tempA1 x_data ];
%the y data as an n row by 1 column vector...
b = y_data;
%we could write x = A\b; if that matches the derivation better, but to
%save the extra step a = x (and to avoid re-using the name x, which could
%be confused with our x_data), we'll just write...
%compute the least squares fit...
a = A\b;
%now make a predicted y from our least squares line fit...
yp = a(1) + a(2)*x_data;
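%as a quick check of the fit quality (an added sketch, not part of the
%original handout), we can look at the residuals between the data and the
%fitted line...
r = y_data - yp;
SSE = sum(r.^2);   %sum of squared errors, the quantity least squares minimizes
%for a line fit with an intercept term, the residuals always sum to zero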
%keeping the data plot, we add our least squares fit line to the plot...
hold on;
plot(x_data,yp,'r');
%and label which information is which with a legend...
legend('Raw data','Linear Least Squares fit');
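%as a cross-check (an added aside, not from the original handout),
%MATLAB's built-in polyfit returns the same coefficients, just ordered
%with the highest power first...
p = polyfit(x_data, y_data, 1);
%p(1) is the slope (our a(2)) and p(2) is the intercept (our a(1))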
%note that if we wanted to do a higher order fit, we just need to add more
%columns to A: the x_data squared, then cubed, and so on from left to
%right, corresponding to fitting y = a(1) + a(2)*x + a(3)*x^2, and so
%on...
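%as a concrete sketch of that extension (added here; the variable names
%are illustrative, not from the handout), a quadratic fit
%y = a(1) + a(2)*x + a(3)*x^2 is built the same way with one extra
%column...
A2 = [ ones(3,1) x_data x_data.^2 ];
a2 = A2\y_data;
xs = linspace(min(x_data), max(x_data), 50);   %fine grid for a smooth curve
plot(xs, a2(1) + a2(2)*xs + a2(3)*xs.^2, 'g');
legend('Raw data','Linear Least Squares fit','Quadratic fit');
%with 3 points and 3 coefficients the quadratic passes exactly through the
%data, so here the "fit" interpolates rather than approximates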