Virtual Laboratories > Expected Value
The main purpose of this section is a discussion of expected value vectors and covariance matrices for random vectors. These topics are particularly important in multivariate statistical models and the multivariate normal distribution. This section requires some prerequisite knowledge of linear algebra at the undergraduate level.
We will let R^(m×n) denote the space of all m × n matrices of real numbers. In particular, we will identify R^n with R^(n×1), so that an ordered n-tuple can also be thought of as an n × 1 column vector. The transpose of a matrix A is denoted A^T.
Suppose that X is an m × n matrix of real-valued random variables, whose i, jth entry is denoted Xij. Equivalently, X can be thought of as a random m × n matrix. It is natural to define the expected value E(X) to be the m × n matrix whose i, jth entry is E(Xij), the expected value of Xij.
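The entrywise definition can be illustrated by simulation; the distribution below (independent exponential entries with mean (i + 1)(j + 1)) is a hypothetical example, not part of the text.

```python
import numpy as np

# Hypothetical example: X is a random 2 x 3 matrix whose entry X_ij is an
# exponential variable with mean (i + 1)*(j + 1), entries independent.
# E(X) is then the 2 x 3 matrix of those means, and the entrywise
# definition is approximated by averaging simulated copies of X.
rng = np.random.default_rng(0)
means = np.array([[(i + 1) * (j + 1) for j in range(3)]
                  for i in range(2)], dtype=float)

copies = rng.exponential(scale=means, size=(200_000, 2, 3))  # 200000 copies of X
EX_empirical = copies.mean(axis=0)                           # entrywise sample mean
```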
Many of the basic properties of expected value of random variables have analogues for expected value of random vectors, with matrix operation replacing the ordinary ones.
1. Show that E(X + Y) = E(X) + E(Y) if X and Y are random m × n matrices.
2. Show that E(AX) = A E(X) if A is a non-random m × n matrix and X is a random n × k matrix.
3. Show that E(XY) = E(X)E(Y) if X is a random m × n matrix, Y is a random n × k matrix, and X and Y are independent.
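A Monte Carlo sanity check of Exercises 1–3 can be sketched as follows; the matrix shapes and distributions are arbitrary choices, not part of the exercises.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
X = rng.uniform(0, 1, size=(n, 2, 2))   # random 2 x 2 matrices, E(X_ij) = 1/2
Y = rng.normal(3, 1, size=(n, 2, 2))    # independent of X, E(Y_ij) = 3

# Exercise 1: E(X + Y) = E(X) + E(Y).
lhs1 = (X + Y).mean(axis=0)
rhs1 = X.mean(axis=0) + Y.mean(axis=0)

# Exercise 2: E(AX) = A E(X) for a non-random A (A @ broadcasts over copies).
A = np.array([[1.0, 2.0], [0.0, -1.0]])
lhs2 = (A @ X).mean(axis=0)
rhs2 = A @ X.mean(axis=0)

# Exercise 3: E(XY) = E(X) E(Y) since X and Y are independent; the product
# of the sample means only approximates the mean of the sample products.
lhs3 = (X @ Y).mean(axis=0)
rhs3 = X.mean(axis=0) @ Y.mean(axis=0)
```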
Suppose now that X is a random vector in R^m and Y is a random vector in R^n. The covariance matrix of X and Y is the m × n matrix cov(X, Y) whose i, jth entry is cov(Xi, Yj), the covariance of Xi and Yj.
4. Show that cov(X, Y) = E{[X − E(X)][Y − E(Y)]^T}.
5. Show that cov(X, Y) = E(XY^T) − E(X)E(Y)^T.
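The two formulas in Exercises 4 and 5 can be compared numerically; the correlated vectors below, built from shared Gaussian noise, are a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
Z = rng.normal(size=(n, 3))                # shared noise
X = Z[:, :2] + rng.normal(size=(n, 2))     # random vector in R^2
Y = 2.0 * Z + rng.normal(size=(n, 3))      # random vector in R^3

EX, EY = X.mean(axis=0), Y.mean(axis=0)

# Exercise 4: cov(X, Y) = E{[X - E(X)][Y - E(Y)]^T}
cov_centered = (X - EX).T @ (Y - EY) / n

# Exercise 5: cov(X, Y) = E(X Y^T) - E(X) E(Y)^T
cov_raw = X.T @ Y / n - np.outer(EX, EY)

# For this construction cov(X_i, Y_j) = 2 when i = j and 0 otherwise.
```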
6. Show that cov(Y, X) = cov(X, Y)^T.
7. Show that cov(X, Y) = 0 if each element of X is uncorrelated with each element of Y (in particular, if X and Y are independent).
8. Show that cov(X + Y, Z) = cov(X, Z) + cov(Y, Z) if X and Y are random vectors in R^m and Z is a random vector in R^n.
9. Show that cov(X, Y + Z) = cov(X, Y) + cov(X, Z) if X is a random vector in R^m and Y, Z are random vectors in R^n.
10. Show that cov(AX, Y) = A cov(X, Y) if X is a random vector in R^m, Y is a random vector in R^n, and A is a non-random k × m matrix.
11. Show that cov(X, AY) = cov(X, Y)A^T if X is a random vector in R^m, Y is a random vector in R^n, and A is a non-random k × n matrix.
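Exercises 10 and 11 (linearity of cov in each argument) can be sketched with an empirical covariance; X, Y, A, and B below are hypothetical.

```python
import numpy as np

def emp_cov(X, Y):
    """Empirical cov(X, Y): mean outer product of centered sample rows."""
    return (X - X.mean(axis=0)).T @ (Y - Y.mean(axis=0)) / len(X)

rng = np.random.default_rng(3)
n = 50_000
X = rng.normal(size=(n, 2))                                    # vectors in R^2
Y = X @ np.array([[1.0, 0.5, 0.0],
                  [0.0, 1.0, 2.0]]) + rng.normal(size=(n, 3))  # correlated, in R^3

A = np.array([[1.0, -1.0], [2.0, 0.0], [0.0, 3.0]])   # non-random 3 x 2
B = np.array([[1.0, 0.0, -2.0], [0.0, 4.0, 1.0]])     # non-random 2 x 3

# Exercise 10: cov(AX, Y) = A cov(X, Y); rows of X @ A.T are the samples of AX.
lhs10 = emp_cov(X @ A.T, Y)
rhs10 = A @ emp_cov(X, Y)

# Exercise 11: cov(X, BY) = cov(X, Y) B^T.
lhs11 = emp_cov(X, Y @ B.T)
rhs11 = emp_cov(X, Y) @ B.T
```

Both identities hold exactly for the empirical covariance, since it is bilinear in its arguments just like the true covariance.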
Suppose now that X = (X1, X2, ..., Xn) is a random vector in R^n. The covariance matrix of X with itself is called the variance-covariance matrix of X:
VC(X) = cov(X, X).
12. Show that VC(X) is a symmetric n × n matrix with var(X1), var(X2), ..., var(Xn) on the diagonal.
13. Show that VC(X + Y) = VC(X) + cov(X, Y) + cov(Y, X) + VC(Y) if X and Y are random vectors in R^n.
14. Show that VC(AX) = A VC(X) A^T if X is a random vector in R^n and A is a non-random m × n matrix.
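Exercises 12 and 14 in a short numerical sketch; the mean vector, VC matrix, and A below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
mu = [0.0, 1.0, -1.0]
Sigma = [[2.0, 1.0, 0.0],
         [1.0, 3.0, 1.0],
         [0.0, 1.0, 1.0]]
X = rng.multivariate_normal(mu, Sigma, size=n)

VC = np.cov(X, rowvar=False, bias=True)   # empirical VC(X), a 3 x 3 matrix

# Exercise 12: VC(X) is symmetric with the variances on the diagonal.
# Exercise 14: VC(AX) = A VC(X) A^T for a non-random m x n matrix A.
A = np.array([[1.0, 0.0, 2.0],
              [0.0, -1.0, 1.0]])          # 2 x 3, so AX lives in R^2
VC_AX = np.cov(X @ A.T, rowvar=False, bias=True)
```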
If a is in R^n, note that a^T X is a linear combination of the coordinates of X:
a^T X = a1X1 + a2X2 + ··· + anXn.
15. Show that var(a^T X) = a^T VC(X) a if X is a random vector in R^n and a is in R^n. Conclude that VC(X) is either positive semi-definite or positive definite.
In particular, the eigenvalues and the determinant of VC(X) are nonnegative.
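The quadratic form identity in Exercise 15 in miniature; the distribution and the vector a are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.multivariate_normal([0.0, 0.0],
                            [[1.0, 0.8],
                             [0.8, 2.0]], size=100_000)
a = np.array([2.0, -1.0])

VC = np.cov(X, rowvar=False, bias=True)   # empirical VC(X)
var_lin = (X @ a).var()                   # var(a^T X) from the samples
quad = a @ VC @ a                         # a^T VC(X) a, necessarily >= 0
```

The two numbers agree exactly (up to rounding), since the sample variance of a^T X is the same quadratic form in the empirical VC matrix.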
16. Show that VC(X) is positive semi-definite (but not positive definite) if and only if there exist a1, a2, ..., an, c in R, with a1, a2, ..., an not all 0, such that
a1X1 + a2X2 + ··· + anXn = c (with probability 1).
Thus, if VC(X) is positive semi-definite but not positive definite, then one of the coordinates of X can be written as an affine transformation of the other coordinates (and hence can usually be eliminated in the underlying model). By contrast, if VC(X) is positive definite, this cannot happen; VC(X) has positive eigenvalues and determinant, and is invertible.
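The degenerate case can be seen numerically: if one coordinate is an affine function of the others, the empirical VC matrix is singular (a zero eigenvalue and zero determinant). The construction below is a hypothetical example.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
X3 = 2.0 * X1 - X2 + 5.0              # affine in the other coordinates
X = np.column_stack([X1, X2, X3])

VC = np.cov(X, rowvar=False, bias=True)
eigenvalues = np.linalg.eigvalsh(VC)   # sorted ascending; smallest is ~0
```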
17. Suppose that (X, Y) has density function f(x, y) = x + y for 0 < x < 1, 0 < y < 1. Find
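This copy does not preserve which quantities the exercise asks for; assuming the usual targets are the means, variances, and covariance, an exact symbolic computation can be sketched as:

```python
import sympy as sp

x, y = sp.symbols('x y')
f = x + y                        # density on 0 < x < 1, 0 < y < 1

def moment(g):
    """E[g(X, Y)] = integral of g * f over the unit square."""
    return sp.integrate(sp.integrate(g * f, (y, 0, 1)), (x, 0, 1))

EX, EY = moment(x), moment(y)    # both equal 7/12 by symmetry
var_X = moment(x**2) - EX**2     # 11/144
cov_XY = moment(x * y) - EX * EY # -1/144
```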
18. Suppose that (X, Y) has density function f(x, y) = 2(x + y) for 0 < x < y < 1. Find
19. Suppose that (X, Y) has density function f(x, y) = 6x^2 y for 0 < x < 1, 0 < y < 1. Find
20. Suppose that (X, Y) has density function f(x, y) = 15x^2 y for 0 < x < y < 1. Find
21. Suppose that (X, Y, Z) is uniformly distributed on the region {(x, y, z): 0 < x < y < z < 1}. Find
22. Suppose that X is uniformly distributed on (0, 1), and that given X, Y is uniformly distributed on (0, X). Find
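A simulation sketch for this last exercise (the quantities to find are not shown in this copy; E(Y) and cov(X, Y) are assumed as targets). Conditioning gives E(Y) = E(X/2) = 1/4 and cov(X, Y) = E(X^2/2) − E(X)E(X/2) = 1/6 − 1/8 = 1/24.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 1_000_000
X = rng.uniform(0.0, 1.0, size=n)
Y = rng.uniform(0.0, X)                 # given X, Y is uniform on (0, X)

EY = Y.mean()                           # E(Y) = E(E(Y | X)) = E(X/2) = 1/4
cov_XY = np.cov(X, Y, bias=True)[0, 1]  # exact value is 1/24
```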