Homework 6 FAQ

Q1: Should I leave the printed copies of homework 6 and 7 in your box at the CogSci building? Otherwise, let me know how I should get them to you.

Q2: For homework 6, it says to compute the norm-based prediction by calculating the predicted velocity using the equation
velocity_pred=k(1)+k(2)*en2
I entered that equation after I typed all the code, and MATLAB gave me an extremely long list of numbers. Should I copy and paste all the numbers into the Word document and print it out to turn in (there are probably more than 200 numbers)? If not, what should I turn in?

Q3: I can get a contour map to show up, but I cannot get more than a little red "L" shape to show up for the gradient descent plot.
I've tried plotting lots of different combinations of variables, and I think the solution lies in the k_save values...

Q4: Can I write on 6 pages of the filler printer pages from the UCSD laser printers instead of 3 blank pages front and back? They're the ones with the UCSD email screen names on them. I've collected them over the quarter because I didn't want to throw them all away.

Q5: I plotted it wrong, then. I did this:
contour(J)
hold on
plot(k_save(:,1),k_save(:,2))

I had the : in the wrong place. Why is the colon on the x variable? That
is what I am now trying to figure out.

Q6: Can you tell me if these numbers are in the ballpark?

Err_n=
       3.8003e+003

C=
       0.9930

Q7: What results do I make a table of?  The actual and predicted error?
Norm of the error?  Correlation?  There are 1000 data points.  It takes up
15 pages even with 8pt font.  I get the feeling that that's not what
you're looking for...

Q8: How do I know if there's a linear relationship between distance and
velocity?  Do I need to do more computations?  What numbers previously
computed am I looking at?  I don't think there were any... unless you mean
the 3d graph.  In that case, it doesn't look linear...

Q9: Is this enough info: Linear least squares does not account for
nonlinear equations. It has limitations on what parameters it can
fit, whereas conjugate gradient descent doesn't.

Q10: Another technique: Nelder-Mead simplex - the simplex compares points and
moves in the direction of the minimum. How can you apply it? Let a simplex
run through the data points and find the minimum?


A1: I put a box in CSB 115 next to the printer; please turn both homeworks in there if you have not turned them in to Alex directly.


A2: Compute the norm-based prediction error by calculating a predicted velocity: 
velocity_pred=k(1)+k(2)*en2 

Then the norm of the error between predicted and actual: 
Err_n = norm( velocity - velocity_pred )

Now look at the correlation between the squared distance and the velocity magnitude. 
C = corr( en2, velocity) 

and list the results in a table you create in Word or a similar program.

So you next need to compute Err_n, then C, and those two values are what you should put in the table. velocity_pred is just the predicted y; we don't have an error until we compute Err_n.
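Since the assignment is in MATLAB, here is a hedged Python sketch of the same three steps. The en2 and velocity values below are made-up stand-ins for the homework data, and the k values are assumed, not the assignment's answer:

```python
import math

# Hypothetical stand-ins for the homework vectors (the real en2 and
# velocity come from the assignment data).
en2      = [1.0, 2.0, 3.0, 4.0, 5.0]    # squared distance
velocity = [5.1, 9.9, 15.2, 19.8, 25.1]
k = [0.0, 5.0]                           # assumed fitted parameters

# velocity_pred = k(1) + k(2)*en2
velocity_pred = [k[0] + k[1] * x for x in en2]

# Err_n = norm(velocity - velocity_pred)
Err_n = math.sqrt(sum((v - p) ** 2 for v, p in zip(velocity, velocity_pred)))

# C = corr(en2, velocity): Pearson correlation between x data and y data
def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sxy / (sx * sy)

C = corr(en2, velocity)
print(Err_n, C)  # two scalars -- these are what go in the table
```

Note that the long list of numbers is velocity_pred; the norm and the correlation collapse everything down to the two scalars the table asks for.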


A3: It should look like an L. Notice that k_save only has 3 rows, i.e. the algorithm only computed 3 guesses before converging to the answer, so when you plot it you get 3 points. The answer is roughly k(1)=0 and k(2)=5. Note in the code that we started at k=0 (that's our initial guess).
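For intuition, here is a rough Python sketch of what the descent loop does with k_save. The data, step size, and iteration count are invented for illustration, not the assignment's values:

```python
# Hypothetical data on a perfect line velocity = 5*en2, so the descent
# should head toward k(1)=0, k(2)=5.
en2      = [1.0, 2.0, 3.0, 4.0]
velocity = [5.0, 10.0, 15.0, 20.0]

k = [0.0, 0.0]        # initial guess, as in the assignment code
step = 0.01           # illustrative step size
k_save = [list(k)]    # one row appended per iteration

for _ in range(500):
    # gradient of J(k) = sum (v - k(1) - k(2)*x)^2
    g1 = sum(-2 * (v - k[0] - k[1] * x) for x, v in zip(en2, velocity))
    g2 = sum(-2 * x * (v - k[0] - k[1] * x) for x, v in zip(en2, velocity))
    k = [k[0] - step * g1, k[1] - step * g2]
    k_save.append(list(k))

# Each row of k_save is one guess; plotting the first column against the
# second (MATLAB: plot(k_save(:,1), k_save(:,2))) traces the descent path
# over the contour plot.
print(k_save[-1])
```

With only a few rows in k_save, that traced path is just a few points, which is why the plot looks like a short "L" rather than a long curve.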


A4: yep.


A5: Two things -

First, look at the code: the iter dimension is where we want the colon, because iterations are like observations, and the other dimension indexes the first and second parameters, k(1) and k(2). Second, we could put observations or variables on either dimension; we just typically use the first for observations and the second for variables. But this code parallels the numerical methods book, so to avoid confusing the class I kept Bewley's notation in this assignment.
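A small Python illustration of the same indexing convention, with made-up k_save values:

```python
# Hypothetical k_save: one row per iteration, one column per parameter,
# mirroring the convention in the assignment's code.
k_save = [[0.0, 0.0],   # initial guess
          [0.1, 3.2],   # after iteration 1 (made-up numbers)
          [0.0, 5.0]]   # converged

# MATLAB's k_save(:,1): the colon spans the iteration (observation) rows.
k1_path = [row[0] for row in k_save]   # k_save(:,1)
k2_path = [row[1] for row in k_save]   # k_save(:,2)
print(k1_path, k2_path)  # the x and y coordinates of the descent path
```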


A6: The C is right; the other is not - use the variables k(1) and k(2) instead of the p's.


A7: Look at the question carefully - it asks for the norm-based error and the correlation.


A8: The error - is it small? The correlation is between the x data and the y data (not the fit); look correlation up for some interpretation and info (Wikipedia perhaps, or MathWorld).


A9: Roughly, yes - look up linear least squares in our lectures; we discussed the limitations of that method.
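For a concrete picture of what linear least squares can do in one shot: when the model is linear in the parameters, the fit is a closed-form solve of the normal equations; a model that is nonlinear in its parameters (e.g. v = k1*exp(k2*x)) has no such closed form, which is where iterative methods come in. A Python sketch with hypothetical data:

```python
# Hypothetical noiseless data on the line v = 0 + 5*x.
en2      = [1.0, 2.0, 3.0, 4.0]
velocity = [5.0, 10.0, 15.0, 20.0]

n   = len(en2)
sx  = sum(en2)
sy  = sum(velocity)
sxx = sum(x * x for x in en2)
sxy = sum(x * v for x, v in zip(en2, velocity))

# Normal equations for v = k1 + k2*x: because the model is linear in
# k1 and k2, the least-squares solution is a direct 2x2 solve.
det = n * sxx - sx * sx
k1 = (sy * sxx - sx * sxy) / det
k2 = (n * sxy - sx * sy) / det
print(k1, k2)  # 0.0 and 5.0 for this noiseless line
```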


A10: "Application" means how you use it in MATLAB (commands, etc.).
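In MATLAB the relevant command is fminsearch, which implements the Nelder-Mead simplex; e.g. fminsearch(@(k) norm(velocity - k(1) - k(2)*en2), [0 0]). As a sketch of the idea, here is a stripped-down reflect-and-shrink simplex in Python applied to the line-fit cost. The data are hypothetical, and this omits the expansion and contraction steps of the full algorithm:

```python
# Hypothetical data: velocity = 5*en2, so the cost minimum is at k = (0, 5).
en2      = [-2.0, -1.0, 1.0, 2.0]
velocity = [-10.0, -5.0, 5.0, 10.0]

def J(k):  # sum-of-squares cost of the line fit v = k(1) + k(2)*x
    return sum((v - k[0] - k[1] * x) ** 2 for x, v in zip(en2, velocity))

# Initial simplex: 3 points for 2 parameters, around the guess (0, 0).
simplex = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]]

for _ in range(400):
    simplex.sort(key=J)            # best point first, worst last
    best, mid, worst = simplex
    cx = [(best[0] + mid[0]) / 2, (best[1] + mid[1]) / 2]   # centroid
    refl = [2 * cx[0] - worst[0], 2 * cx[1] - worst[1]]     # reflect worst
    if J(refl) < J(worst):
        simplex[2] = refl          # keep the improved point
    else:                          # otherwise shrink toward the best point
        simplex[1] = [(best[0] + mid[0]) / 2, (best[1] + mid[1]) / 2]
        simplex[2] = [(best[0] + worst[0]) / 2, (best[1] + worst[1]) / 2]

simplex.sort(key=J)
print(simplex[0])  # close to (0, 5)
```

Note the simplex moves through parameter space (the k values), not through the data points; the data only enter through the cost function it is minimizing.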

