Thread: Math problem
Old 07-03-2007, 01:22 PM  
noone1
So Fucking Banned
 
Join Date: Apr 2007
Posts: 111
Quote:
Originally Posted by Rhesus
One can fit a straight line y = ax + b through scattered points (x, y) using least-squares linear regression: one minimises the sum of the squared errors between the fitted line and the actual points (measurements).

Now I want to weight each of these errors by its distance from the most recent x (measurement), giving the most weight to the most recent point.

Say we observe time on the x axis [I'll use brackets since subscripts aren't an option here]:

c = the number of 'dots' we want to calculate a line from

y[t] = the observation at x[t], where t = -(c-1), ..., -2, -1, 0

weight w[t] = 0.9^|x[0] - x[t]|

What formulas should I use to calculate a and b in y = ax + b?

For someone a little more proficient in math than I am, it shouldn't be too hard.

$25 by epass to the first person to post and correctly derive these formulas.

Thanks in advance!
Ugh. Google it. No one does this shit by hand. Grab a TI graphing calculator or a computer program.
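
If you want the closed form for a program anyway: minimising the weighted sum of squared errors, sum over t of w[t]*(y[t] - a*x[t] - b)^2, and setting the derivatives with respect to a and b to zero gives two linear equations whose solution is

a = (Sw*Swxy - Swx*Swy) / (Sw*Swxx - Swx^2)
b = (Swy - a*Swx) / Sw

where Sw = sum of w[t], Swx = sum of w[t]*x[t], Swy = sum of w[t]*y[t], Swxx = sum of w[t]*x[t]^2, and Swxy = sum of w[t]*x[t]*y[t]. Here's a minimal Python sketch of that (the function name and the sample numbers are just made up for illustration, and it assumes the points are ordered oldest to newest so the last x is the x[0] of the question):

def weighted_line_fit(xs, ys, decay=0.9):
    """Fit y = a*x + b, weighting each point by decay ** |x_latest - x|."""
    x_latest = xs[-1]  # most recent observation gets weight 1
    ws = [decay ** abs(x_latest - x) for x in xs]

    Sw   = sum(ws)                                        # sum of w
    Swx  = sum(w * x for w, x in zip(ws, xs))             # sum of w*x
    Swy  = sum(w * y for w, y in zip(ws, ys))             # sum of w*y
    Swxx = sum(w * x * x for w, x in zip(ws, xs))         # sum of w*x^2
    Swxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))  # sum of w*x*y

    a = (Sw * Swxy - Swx * Swy) / (Sw * Swxx - Swx ** 2)  # slope
    b = (Swy - a * Swx) / Sw                              # intercept
    return a, b

# made-up example: five observations at x = -4 ... 0
a, b = weighted_line_fit([-4, -3, -2, -1, 0], [1.0, 1.5, 1.9, 2.6, 3.1])
print(a, b)

Any spreadsheet or stats package with weighted regression will spit out the same a and b.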