Date:         Fri, 30 Jun 2000 09:43:28 -0700
Sender:       "SAS(r) Discussion" <SAS-L@LISTSERV.UGA.EDU>
From:         "David L. Cassell" <Cassell.David@EPAMAIL.EPA.GOV>
Subject:      Re: question on variance
Content-type: text/plain; charset=us-ascii
> I don't believe that it is useful to use a Taylor series expansion. A
> Taylor series is intended to approximate a complex function with a
> polynomial. The Taylor series expansion of x^2 is only useful if you go
> to the second-order term, in which case the Taylor series expansion of
> x^2 is just x^2 itself - and so it is for any other polynomial (terms
> above the second-order term equal zero because the higher-order
> derivatives equal zero).
Actually, the Taylor series is a legitimate way to go. You're
misunderstanding the issue here. You don't want to do a Taylor
series expansion of x^2 as a function of x. You want to expand f(X)
about its mean in order to approximate Var[f(X)], which is not at
all the same thing.
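To spell out the first-order step (this is the usual "delta method"
sketch, writing mu = E[X] and sigma^2 = Var(X)):

   f(X) =~ f(mu) + f'(mu)*(X - mu)

so that

   Var[f(X)] =~ [f'(mu)]^2 * sigma^2 .

For f(x) = x^2 this gives Var(X^2) =~ 4*mu^2*sigma^2, which is a
genuine approximation, not the trivial identity you get by expanding
the polynomial x^2 by itself.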
> However, I still don't know the answer to the problem, but I am more
> than happy to assume that x is a normally distributed random variable.
You don't even need the normality assumption. The Taylor series
expansion is the same regardless of the underlying distribution.
You may get a better approximation under certain distributional
assumptions, but you do not need to assume normality to use the
expansion to estimate the variance of f(X). This technique has been
used for decades in industrial applications, and usually goes by the
name "propagation of error" in texts. Go ahead, do the Taylor
series as was shown yesterday. And remember, the general form
works for any function f(x), not just f(x) = x^2.
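If you want to convince yourself numerically, here is a minimal SAS
sketch of the idea (the data set name and seed are arbitrary, not
anything from the thread): simulate X ~ N(mu, sigma^2), take
f(x) = x^2, and compare the sample variance of f(X) with the
delta-method value 4*mu^2*sigma^2.

   /* Simulate X ~ N(10, 2**2) and compute f(X) = X**2 */
   data sim;
      mu = 10; sigma = 2;
      do i = 1 to 100000;
         x  = mu + sigma*rannor(20000630);  /* seed is arbitrary */
         fx = x**2;
         output;
      end;
      keep x fx;
   run;

   /* Sample variance of f(X); compare with 4*mu**2*sigma**2 = 1600 */
   proc means data=sim n mean var;
      var fx;
   run;

For these parameters the delta-method approximation is 1600; under
normality the exact value is 4*mu^2*sigma^2 + 2*sigma^4 = 1632, so the
first-order expansion gets you quite close, and the expansion itself
never used the normality.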
David Cassell, OAO Cassell.David@epa.gov
Senior computing specialist