10.1.3 Multiple Random Processes

We often need to study more than one random process. For example, when investing in the stock market, you consider several different stocks and you are interested in how they are related. In particular, you might want to know whether two stocks are positively or negatively correlated. A useful tool in these situations is the pair of cross-correlation and cross-covariance functions.
For two random processes $\{X(t), t \in J\}$ and $\{Y(t), t \in J\}$:
the cross-correlation function $R_{XY}(t_1,t_2)$ is defined by
$$R_{XY}(t_1,t_2)=E[X(t_1)Y(t_2)], \quad \textrm{for } t_1,t_2 \in J;$$
the cross-covariance function $C_{XY}(t_1,t_2)$ is defined by
$$C_{XY}(t_1,t_2)=\textrm{Cov}\big(X(t_1),Y(t_2)\big)=R_{XY}(t_1,t_2)-\mu_X(t_1)\mu_Y(t_2), \quad \textrm{for } t_1,t_2 \in J.$$
To get an idea about these concepts, suppose that $X(t)$ is the price of oil (per gallon) and $Y(t)$ is the price of gasoline (per gallon) at time $t$. Since gasoline is produced from oil, as oil prices increase, gasoline prices tend to increase, too. Thus, we conclude that $X(t)$ and $Y(t)$ should be positively correlated (at least for the same $t$, i.e., $C_{XY}(t,t)>0$).
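As a rough illustration of estimating a cross-covariance from data, the sketch below uses a made-up toy model (not real price data): at a fixed time $t$, the gasoline price is taken to be a scaled oil price plus independent noise, so the empirical $C_{XY}(t,t)$ should come out positive.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000  # independent realizations of the two prices at a fixed time t

# Toy model (an assumption for illustration only): gasoline price Y(t)
# is a scaled oil price X(t) plus refining costs/noise, so the two
# should be positively correlated at the same time t.
X_t = rng.normal(3.0, 0.5, n)               # oil price per gallon at time t
Y_t = 0.8 * X_t + rng.normal(1.0, 0.2, n)   # gasoline price per gallon at time t

R_hat = np.mean(X_t * Y_t)               # sample estimate of R_XY(t, t)
C_hat = R_hat - X_t.mean() * Y_t.mean()  # sample estimate of C_XY(t, t)

print(C_hat)  # theory for this toy model: 0.8 * Var(X(t)) = 0.8 * 0.25 = 0.2
```

Here the positive sign of the estimated cross-covariance matches the intuition in the oil/gasoline discussion above.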

Example
Let $A$, $B$, and $C$ be independent normal $N(1,1)$ random variables. Let $\{X(t), t \in [0,\infty)\}$ be defined as
$$X(t)=A+Bt, \quad \textrm{for all } t \in [0,\infty).$$
Also, let $\{Y(t), t \in [0,\infty)\}$ be defined as
$$Y(t)=A+Ct, \quad \textrm{for all } t \in [0,\infty).$$
Find $R_{XY}(t_1,t_2)$ and $C_{XY}(t_1,t_2)$, for $t_1,t_2 \in [0,\infty)$.
  • Solution
    • First, note that
      $$\mu_X(t)=E[X(t)]=E[A]+E[B]t=1+t, \quad \textrm{for all } t \in [0,\infty).$$
      Similarly,
      $$\mu_Y(t)=E[Y(t)]=E[A]+E[C]t=1+t, \quad \textrm{for all } t \in [0,\infty).$$
      To find $R_{XY}(t_1,t_2)$ for $t_1,t_2 \in [0,\infty)$, we write
      $$\begin{aligned}
      R_{XY}(t_1,t_2)&=E[X(t_1)Y(t_2)]\\
      &=E[(A+Bt_1)(A+Ct_2)]\\
      &=E[A^2+ACt_2+BAt_1+BCt_1t_2]\\
      &=E[A^2]+E[AC]t_2+E[BA]t_1+E[BC]t_1t_2\\
      &=E[A^2]+E[A]E[C]t_2+E[B]E[A]t_1+E[B]E[C]t_1t_2 \quad (\textrm{by independence})\\
      &=2+t_1+t_2+t_1t_2,
      \end{aligned}$$
      where we used $E[A^2]=\textrm{Var}(A)+(E[A])^2=1+1=2$.
      To find $C_{XY}(t_1,t_2)$ for $t_1,t_2 \in [0,\infty)$, we write
      $$\begin{aligned}
      C_{XY}(t_1,t_2)&=R_{XY}(t_1,t_2)-\mu_X(t_1)\mu_Y(t_2)\\
      &=(2+t_1+t_2+t_1t_2)-(1+t_1)(1+t_2)\\
      &=1.
      \end{aligned}$$
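The closed-form answers above can be checked with a quick Monte Carlo simulation. The sketch below uses NumPy; the sample size and the particular time points $t_1=2$, $t_2=3$ are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # number of Monte Carlo samples

# A, B, C are independent N(1, 1) random variables, as in the example.
A = rng.normal(1, 1, n)
B = rng.normal(1, 1, n)
C = rng.normal(1, 1, n)

t1, t2 = 2.0, 3.0   # arbitrary time points chosen for illustration
X_t1 = A + B * t1   # X(t1) = A + B*t1
Y_t2 = A + C * t2   # Y(t2) = A + C*t2

R_hat = np.mean(X_t1 * Y_t2)               # estimates R_XY(t1, t2)
C_hat = R_hat - X_t1.mean() * Y_t2.mean()  # estimates C_XY(t1, t2)

print(R_hat)  # theory: 2 + t1 + t2 + t1*t2 = 13
print(C_hat)  # theory: 1
```

The simulated values should land close to the theoretical $R_{XY}(2,3)=13$ and $C_{XY}(2,3)=1$, up to Monte Carlo error.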


Independent Random Processes:

We have seen independence for random variables. In particular, remember that random variables $X_1, X_2, \dots, X_n$ are independent if, for all $(x_1,x_2,\dots,x_n) \in \mathbb{R}^n$, we have
$$F_{X_1,X_2,\dots,X_n}(x_1,x_2,\dots,x_n)=F_{X_1}(x_1)F_{X_2}(x_2)\cdots F_{X_n}(x_n).$$
Now, note that a random process is a collection of random variables. Thus, we can define the concept of independence for random processes, too. In particular, if for two random processes $X(t)$ and $Y(t)$, the random variables $X(t_i)$ are independent of the random variables $Y(t_j)$, we say that the two random processes are independent. More precisely, we have the following definition:
Two random processes $\{X(t), t \in J\}$ and $\{Y(t), t \in J\}$ are said to be independent if, for all
$$t_1,t_2,\dots,t_m \in J \quad \textrm{and} \quad t'_1,t'_2,\dots,t'_n \in J,$$
the set of random variables
$$X(t_1), X(t_2), \dots, X(t_m)$$
is independent of the set of random variables
$$Y(t'_1), Y(t'_2), \dots, Y(t'_n).$$
The above definition implies that for all real numbers $x_1,x_2,\dots,x_m$ and $y_1,y_2,\dots,y_n$, we have
$$F_{X(t_1),\dots,X(t_m),Y(t'_1),\dots,Y(t'_n)}(x_1,\dots,x_m,y_1,\dots,y_n)=F_{X(t_1),\dots,X(t_m)}(x_1,\dots,x_m) \cdot F_{Y(t'_1),\dots,Y(t'_n)}(y_1,\dots,y_n).$$
The above equation might seem complicated; however, in many real-life applications we can often argue that two random processes are independent by looking at the problem structure. For example, in engineering we can reasonably assume that the thermal noise processes in two separate systems are independent. Note that if two random processes $X(t)$ and $Y(t)$ are independent, then their cross-covariance function, $C_{XY}(t_1,t_2)$, for all $t_1$ and $t_2$ is given by
$$C_{XY}(t_1,t_2)=\textrm{Cov}\big(X(t_1),Y(t_2)\big)=0 \quad (\textrm{since } X(t_1) \textrm{ and } Y(t_2) \textrm{ are independent}).$$
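A short simulation illustrates this vanishing cross-covariance. The sketch below builds two processes from disjoint sets of independent random variables (the linear forms $X(t)=A+Bt$ and $Y(t)=C+Dt$ are an illustrative choice, not from the text), so the processes are independent and the estimated $C_{XY}(t_1,t_2)$ should be near zero.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000  # number of Monte Carlo samples

# X(t) = A + B*t and Y(t) = C + D*t with A, B, C, D independent:
# the two processes share no randomness, hence are independent.
A, B = rng.normal(1, 1, n), rng.normal(1, 1, n)
C, D = rng.normal(1, 1, n), rng.normal(1, 1, n)

t1, t2 = 1.5, 4.0   # arbitrary time points
X_t1 = A + B * t1
Y_t2 = C + D * t2

C_hat = np.mean(X_t1 * Y_t2) - X_t1.mean() * Y_t2.mean()
print(C_hat)  # theory: C_XY(t1, t2) = 0 for independent processes
```

Up to sampling error, the estimate hovers around zero, in contrast with the value $C_{XY}(t_1,t_2)=1$ found in the earlier example, where $X(t)$ and $Y(t)$ shared the random variable $A$.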

