Re: [math-fun] least-squares derivation ?
After additional Googling, it would appear that the appropriate search terms are "total least squares" (as opposed to "ordinary least squares") and "orthogonal distance regression" for handling (x,y) data symmetrically.

There is also this discussion, where we subtract off the centroid of all the (x,y) points and then find the principal axis, which this author calls "perpendicular regression":

http://www.mathpages.com/home/kmath110.htm

*** I assume that the "principal axis" is parallel to the principal axis of inertia, which I assume is the axis of *minimum* moment of inertia. ***

At 09:28 AM 10/4/2018, you wrote:
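[Editor's sketch.] The perpendicular ("total least squares") regression described above has a closed form in two dimensions: the line passes through the centroid, and its slope minimizes the sum of squared perpendicular distances, which is exactly the axis of minimum moment of inertia of the point set. The data below are made up for illustration, and the formula assumes the cross term sxy is nonzero:

```python
# Perpendicular (total least squares) regression: subtract the centroid,
# then take the principal axis (axis of minimum moment of inertia).
# Illustrative sketch with made-up data.
import math

def perpendicular_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    # Slope minimizing the sum of squared *perpendicular* distances
    # (standard closed form; assumes sxy != 0).
    m = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    b = my - m * mx  # the line always passes through the centroid
    return m, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.8, 4.1]
m, b = perpendicular_fit(xs, ys)
m2, b2 = perpendicular_fit(ys, xs)
# Unlike ordinary least squares, swapping x and y now really does give
# m2 = 1/m and b2 = -b/m (up to rounding).
print(m, 1.0 / m2)
```

One can verify the symmetry algebraically: swapping x and y exchanges sxx and syy, and the product of the two slopes collapses to (D² − (syy−sxx)²)/(4·sxy²) = 1, where D is the square root above.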
Standard least squares measures the *vertical* distance from each point to the line and minimizes the sum of the squares. Interchanging x and y is equivalent to measuring the *horizontal* distance instead. You can instead measure the perpendicular distance, which is invariant under interchanging x and y (or under any other isometry, for that matter), but the solution is correspondingly hairier.
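[Editor's sketch.] To make the three distance choices concrete: for a candidate line y = m*x + b and a point (x, y), the vertical residual is y − (m*x + b), the horizontal residual is x − (y − b)/m, and the perpendicular residual is (y − m*x − b)/sqrt(1 + m²). A minimal check with made-up data and an arbitrary candidate line:

```python
# Compare the three sum-of-squares objectives for one candidate line.
# Data and line are illustrative only.

def objectives(points, m, b):
    v = sum((y - (m * x + b)) ** 2 for x, y in points)          # vertical
    h = sum((x - (y - b) / m) ** 2 for x, y in points)          # horizontal
    p = sum((y - m * x - b) ** 2 for x, y in points) / (1 + m ** 2)  # perpendicular
    return v, h, p

pts = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.2), (3.0, 2.8), (4.0, 4.1)]
v, h, p = objectives(pts, 1.0, 0.0)
# For a fixed line, h = v / m**2 and p = v / (1 + m**2), so the three
# objectives generally rank different lines differently.
print(v, h, p)
```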
On Thu, Oct 4, 2018 at 9:24 AM Henry Baker <hbaker1@pipeline.com> wrote:
In common discussions of least squares, the parameters (m,b) are estimated for the equation y = m*x+b from data points [x1,y1], [x2,y2], [x3,y3], etc.
For example, in Wikipedia (where m=beta2 and b=beta1):
https://en.wikipedia.org/wiki/Linear_least_squares#Example
So far, so good.
Now, if I merely exchange x and y, then my equation is x = m'*y+b', where we should have m' = 1/m and b' = -b/m. (Let's ignore the case where the best m=0.)
However, if I then estimate (m',b') using the same least squares method, I don't get (1/m, -b/m)!
So either I'm doing something wrong, or perhaps there is a more symmetric least squares method that treats x and y symmetrically?
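[Editor's sketch.] The asymmetry is easy to confirm numerically. With the closed-form OLS slope m = sxy/sxx, the swapped fit gives m' = sxy/syy, and 1/m' equals m only when the correlation is perfect (r² = 1). A pure-Python check with made-up data:

```python
# Numeric check of the asymmetry: OLS on (x,y) versus OLS on (y,x).
# Illustrative data chosen so the points do NOT lie exactly on one line.

def ols(xs, ys):
    """Ordinary least squares fit ys ~ m*xs + b (closed form)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    m = sxy / sxx
    b = my - m * mx
    return m, b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 0.9, 2.2, 2.8, 4.1]

m, b = ols(xs, ys)    # fit y = m*x + b
mp, bp = ols(ys, xs)  # fit x = m'*y + b'

# mp = sxy/syy while 1/m = sxx/sxy; these agree only when r**2 == 1,
# so in general (mp, bp) != (1/m, -b/m).
print(m, 1.0 / mp)
```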
Look up Singular Value Decomposition and Principal Component Analysis. Linear Algebra can be fun.

-tom
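[Editor's sketch.] The SVD/PCA suggestion amounts to this: center the data, stack it into an n×2 matrix, and take the first right singular vector as the direction of the best-fit (perpendicular-regression) line. A minimal sketch with made-up data; requires numpy:

```python
# SVD view of perpendicular regression: the line's direction is the
# first right singular vector (first principal component) of the
# centered data matrix. Illustrative data only.
import numpy as np

xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = np.array([0.1, 0.9, 2.2, 2.8, 4.1])

A = np.column_stack([xs - xs.mean(), ys - ys.mean()])
_, _, Vt = np.linalg.svd(A, full_matrices=False)
vx, vy = Vt[0]   # principal-axis direction (sign is arbitrary)
m = vy / vx      # slope is sign-invariant; assumes the line is not vertical
b = ys.mean() - m * xs.mean()
print(m, b)
```

This gives the same line as the closed-form total-least-squares slope, since both pick out the axis of minimum moment of inertia through the centroid.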
_______________________________________________
math-fun mailing list
math-fun@mailman.xmission.com
https://mailman.xmission.com/cgi-bin/mailman/listinfo/math-fun
--
http://cube20.org/ -- http://golly.sf.net/
participants (2)
- Henry Baker
- Tomas Rokicki