The standard deviation of X, denoted SD(X), is given by

$SD(X)=\sqrt{Var(X)}$

Find SD(aX + b) if X has variance $\sigma^2$.

Answered by LanceJ 3 years ago

$$SD(aX+b)$$

$$=\sqrt{Var(aX+b)}$$

$$=\sqrt{Var(aX)}$$

(shifting by the constant $b$ does not change the variance)

$$=\sqrt{a^2Var(X)}$$

$$=\sqrt{a^2\sigma^2}$$

$$=|a|\sigma$$

Note the absolute value: $\sqrt{a^2}=|a|$, since $a$ may be negative while a standard deviation is always nonnegative.
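A quick numerical sanity check of the identity $SD(aX+b)=|a|\,SD(X)$, sketched in Python with the standard library. The sample data and the constants $a$, $b$ are arbitrary choices for illustration; $a$ is negative on purpose to show why the absolute value is needed:

```python
import math
import statistics

# Arbitrary sample standing in for observations of X
data = [1, 2, 3, 4]
a, b = -3, 5  # a is negative to show why |a| matters

sd_x = statistics.pstdev(data)                          # SD(X)
sd_axb = statistics.pstdev([a * x + b for x in data])   # SD(aX + b)

# SD(aX + b) equals |a| * SD(X), not a * SD(X)
assert math.isclose(sd_axb, abs(a) * sd_x)
print(sd_axb, abs(a) * sd_x)
```

The identity holds exactly for any finite sample, not just in expectation, because multiplying every data point by $a$ scales every deviation from the mean by $|a|$, and adding $b$ shifts the mean without changing any deviation.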

