Thread: Math problem
Old 07-03-2007, 12:59 PM  
Rhesus
Confirmed User
 
Join Date: Aug 2004
Posts: 2,009
Math problem

One can fit a straight line y = ax + b through scattered dots (x,y) using least-squares linear regression: one minimises the sum of the squared errors between the fitted line and the actual dots(/measurements).
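
To make concrete what I mean, here's a rough sketch of the plain (unweighted) case in Python (my own naming, just for illustration):

# Ordinary least squares fit of y = a*x + b (unweighted reference sketch).
def fit_line(xs, ys):
    n = len(xs)
    sx = sum(xs)
    sy = sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# e.g. fit_line([0, 1, 2], [1, 3, 5]) gives a = 2.0, b = 1.0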

Now I want to weight each of these errors by its distance to the most recent x(/measurement), giving the most weight to the most recent point.

Say we observe time on the x axis [I'll use brackets since subscripts aren't an option here]:

c = the number of 'dots' we want to calculate a line from

y[t] = the observation at x[t], where t = -(c-1), ..., -3, -2, -1, 0

weight w[t] = 0.9^|x[0] - x[t]| (so with x spaced one unit apart, the most recent point gets weight 1, the one before it 0.9, the one before that 0.81, and so on)

What formulas should I use to calculate a and b in y = ax + b?
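
Here is how far I got myself: a sketch of the weighted version in Python (my own attempt and naming, quite possibly wrong, hence the question). It just plugs the weights into the normal equations:

# Weighted least squares fit of y = a*x + b (sketch only).
# Assumes xs is ordered oldest to newest, so xs[-1] corresponds to
# x[0] above (the most recent measurement).
def fit_line_weighted(xs, ys, decay=0.9):
    x_recent = xs[-1]
    ws = [decay ** abs(x_recent - x) for x in xs]   # w[t] = 0.9^|x[0] - x[t]|
    sw   = sum(ws)
    swx  = sum(w * x for w, x in zip(ws, xs))
    swy  = sum(w * y for w, y in zip(ws, ys))
    swxx = sum(w * x * x for w, x in zip(ws, xs))
    swxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    a = (sw * swxy - swx * swy) / (sw * swxx - swx * swx)
    b = (swy - a * swx) / sw
    return a, b

With all weights equal to 1 this should reduce to the plain least-squares formulas above, but I'd like to see the derivation for a and b spelled out.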

For someone a little more proficient in math than I am, it shouldn't be too hard.

$25 by epass to the first to correctly post and derive these formulas.

Thanks in advance!