T.R | Title | User | Personal Name | Date | Lines |
---|---|---|---|---|---|
574.1 | Exact or approximate? | SQM::HALLYB | Free the quarks! | Tue Sep 09 1986 16:30 | 0 |
574.2 | | NULL::TORNHEIM | | Tue Sep 09 1986 16:52 | 2 |
| preferably exact, but approximate would still be useful
|
574.3 | Cubic splines maybe ? Are you curve fitting ? | EAGLE1::BEST | R D Best, Systems architecture, I/O | Wed Oct 29 1986 18:22 | 26 |
| I am only guessing at what you are trying to do, but if you are trying
to curve fit, you might be better off using cubic splines a la
Sedgewick's 'Algorithms' book. I think it's chapter 4.
He discusses why high order polynomials are not generally good for
curve fitting if you want smooth curves. Cubic splines are a piecewise
smooth fitting to consecutive points within the point set. They limit
the nasty functional fluctuations that can arise between fit points
when the generating function is a high order polynomial.
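To make the cubic-spline suggestion concrete, here is a minimal sketch of a natural cubic spline (second derivative zero at both ends). This is a generic textbook formulation, not code from Sedgewick's book; the function name and the dense linear solve are my own choices for brevity (a real implementation would use a tridiagonal solver).

```python
import numpy as np

def natural_cubic_spline(xs, ys):
    """Return an evaluator for the natural cubic spline through (xs, ys).

    Solves the standard tridiagonal system for the second derivatives
    M_i at the knots, with 'natural' end conditions M_0 = M_n = 0.
    """
    xs = np.asarray(xs, float); ys = np.asarray(ys, float)
    n = len(xs) - 1
    h = np.diff(xs)                               # knot spacings
    A = np.zeros((n + 1, n + 1)); rhs = np.zeros(n + 1)
    A[0, 0] = A[n, n] = 1.0                       # natural ends: M0 = Mn = 0
    for i in range(1, n):
        A[i, i - 1] = h[i - 1]
        A[i, i] = 2.0 * (h[i - 1] + h[i])
        A[i, i + 1] = h[i]
        rhs[i] = 6.0 * ((ys[i + 1] - ys[i]) / h[i]
                        - (ys[i] - ys[i - 1]) / h[i - 1])
    M = np.linalg.solve(A, rhs)                   # second derivatives at knots

    def s(x):
        # locate the interval [xs[i], xs[i+1]] containing x
        i = int(np.clip(np.searchsorted(xs, x) - 1, 0, n - 1))
        t0, t1 = xs[i + 1] - x, x - xs[i]
        return ((M[i] * t0**3 + M[i + 1] * t1**3) / (6 * h[i])
                + (ys[i] / h[i] - M[i] * h[i] / 6) * t0
                + (ys[i + 1] / h[i] - M[i + 1] * h[i] / 6) * t1)
    return s
```

Each piece is a cubic on its own interval, and the pieces join with continuous first and second derivatives, which is what keeps the fit smooth without the wild oscillations of a single high-order polynomial.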
If you can live with approximations, or if you have some idea of what
kind of function is generating the points to be fitted,
least squares can get a best-fit set of coefficients, but you choose
the order of the polynomial.
What you might do is run successive least squares for increasing
values of the polynomial order (call it n) starting at n=2 and
keep going until the minimum error (over the ensemble of points)
falls below an acceptable value (say 0.5 %).
I'm sure that there are other much better ways to do this that others
who are more adept at numerical analysis can suggest.
If you are looking for storage compaction or rapid reconstruction
of the points, then other algorithms are probably appropriate.
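The degree-escalation loop sketched above could look like this. It is only an illustration of the idea, not a tuned routine: the function name, the 0.5% default threshold, and the cap on the degree are my own, and `numpy.polyfit` stands in for whatever least-squares routine you actually have.

```python
import numpy as np

def fit_increasing_degree(x, y, rel_tol=0.005, max_degree=10):
    """Raise the polynomial degree from 2 upward until the worst-case
    residual, relative to the data's magnitude, drops below rel_tol."""
    x = np.asarray(x, float); y = np.asarray(y, float)
    scale = np.abs(y).max()
    for degree in range(2, max_degree + 1):
        coeffs = np.polyfit(x, y, degree)          # least-squares fit
        err = np.abs(np.polyval(coeffs, x) - y).max() / scale
        if err < rel_tol:
            return degree, coeffs
    return max_degree, coeffs                      # give up at max_degree
```

Note the cap on the degree: as mentioned above, pushing the order ever higher eventually trades residual error for oscillation between the fit points.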
/R Best
|
574.4 | Interpolating polynomial | ENGINE::ROTH | | Thu Oct 30 1986 06:26 | 37 |
| You don't indicate your problem, so I can't suggest other strategies.
However, you can easily fit a polynomial to an arbitrary set of points
via a Lagrange interpolating polynomial. Alternatively, you can
simply postulate that such a polynomial exists and use undetermined
coefficients. Both methods have theoretical interest, but are numerically
ill conditioned.
An example of Lagrange interpolation should make the method clear.
Given 4 points, find a cubic that interpolates them.
The polynomial would be:
         (x-x2)(x-x3)(x-x4)         (x-x1)(x-x3)(x-x4)
y(x) = y1--------------------- + y2--------------------- + y3 and y4 terms.
         (x1-x2)(x1-x3)(x1-x4)     (x2-x1)(x2-x3)(x2-x4)
The idea is to have the fraction multiplying each yk vanish at all the
other points (on account of the numerator), but be equal to 1 at xk
exactly. For n points you get an n-1 degree polynomial by expanding
the products and collecting the terms. I had one of those 1-liner
APL programs that did this in college.
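In the same spirit as that APL one-liner, here is a short evaluator for the Lagrange form (a straightforward transcription of the construction above; the function name is mine):

```python
def lagrange_eval(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x.

    Each basis fraction is 1 at its own knot xk and 0 at every other
    knot, exactly as in the cubic example above."""
    total = 0.0
    for k, (xk, yk) in enumerate(zip(xs, ys)):
        term = yk
        for j, xj in enumerate(xs):
            if j != k:
                term *= (x - xj) / (xk - xj)
        total += term
    return total
```

This evaluates the polynomial directly at a point without ever expanding the products into coefficients, which sidesteps some of the numerical trouble until you actually need the coefficients.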
Another way is to say you have
y(x) = a0*x^n + a1*x^(n-1) + ... + an
If you plug in your n+1 data points you can solve for the a's by
simultaneous equations. If fewer than n+1 of the resulting equations
are linearly independent, then you need less than an n'th degree
polynomial. For your problem, try the interpolation first.
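The undetermined-coefficients route is just a Vandermonde system. A minimal sketch (the function name is mine; `numpy.vander` builds rows in the same decreasing-power order as the a0*x^n + ... + an form above):

```python
import numpy as np

def undetermined_coefficients(xs, ys):
    """Solve the Vandermonde system V a = y for the coefficients
    [a0, ..., an] of y(x) = a0*x^n + ... + an (decreasing powers)."""
    V = np.vander(np.asarray(xs, float))   # row i: [xi^n, ..., xi, 1]
    return np.linalg.solve(V, np.asarray(ys, float))
```

Bear in mind the ill-conditioning mentioned earlier: for more than a handful of points, the Vandermonde matrix becomes nearly singular and the solved coefficients lose accuracy quickly.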
Random aside on bandlimited sampling and data transmission: it's just
like an infinite-degree Lagrange interpolating polynomial; you want
the modem's eye to open just at the point where all the other terms
are having their zero crossings...
- Jim
|
574.5 | need working least squares algorithm | CADSE::GINZBURG | | Wed Apr 13 1988 12:30 | 5 |
| Does anyone have a piece of code that does successive least squares
until the error falls below a given value?
Any help is greatly appreciated.
Olga G.
|
574.6 | Least squares approximation | ELWD2::CHINNASWAMY | | Sun Apr 24 1988 12:11 | 4 |
|
Look into Dan McCracken's book on Linear Methods...
|