In real-life applications, we are often interested in multiple observations of random values over a period of time. For example, suppose that you are observing the stock price of a company over the next few months. In particular, let S(t) be the stock price at time t∈[0,∞). Here, we assume t=0 refers to current time. Figure 10.1 shows a possible outcome of this random experiment from time t=0 to time t=1.
Figure 10.1 - A possible realization of values of a stock observed as a function of time. Here, S(t) is an example of a random process.
Note that at any fixed time t1∈[0,∞), S(t1) is a random variable. Based on your knowledge of finance and the historical data, you might be able to provide a PDF for S(t1). If you choose another time t2∈[0,∞), you obtain another random variable S(t2) that could potentially have a different PDF. When we consider the values of S(t) for t∈[0,∞) collectively, we say S(t) is a random process or a stochastic process. We may show this process by
{S(t),t∈[0,∞)}.
Therefore, a random process is a collection of random variables usually indexed by time (or sometimes by space).
The process S(t) mentioned here is an example of a continuous-time random process. In general, when we have a random process X(t) where t can take real values in an interval on the real line, then X(t) is a continuous-time random process. Here are a few more examples of continuous-time random processes:
− Let N(t) be the number of customers who have visited a bank from t=9 (when the bank opens at 9:00 am) until time t, on a given day, for t∈[9,16]. Here, we measure t in hours, but t can take any real value between 9 and 16. We assume that N(9)=0, and N(t)∈{0,1,2,...} for all t∈[9,16]. Note that for any time t1, the random variable N(t1) is a discrete random variable. Thus, N(t) is a discrete-valued random process. However, since t can take any real value between 9 and 16, N(t) is a continuous-time random process.
− Let W(t) be the thermal noise voltage generated across a resistor in an electric circuit at time t, for t∈[0,∞). Here, W(t) can take real values.
− Let T(t) be the temperature in New York City at time t∈[0,∞). We can assume here that t is measured in hours and t=0 refers to the time we start measuring the temperature.
In all of these examples, we are dealing with an uncountable number of random variables. For example, for any given t1∈[9,16], N(t1) is a random variable. Thus, the random process N(t) consists of an uncountable number of random variables. A random process can be defined on the entire real line, i.e., t∈(−∞,∞). In fact, it is sometimes convenient to assume that the process starts at t=−∞ even if we are interested in X(t) only on a finite interval. For example, we can assume that the T(t) defined above is a random process defined for all t∈R although we get to observe only a finite portion of it.
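To make the counting-process example concrete, here is a small simulation sketch. The text does not specify an arrival model, so the code assumes, purely for illustration, that customers arrive according to a Poisson process with rate 10 per hour (i.e., exponential inter-arrival times):

```python
import random

random.seed(0)

def simulate_counting_process(rate, t_open=9.0, t_close=16.0):
    """Simulate arrival times of a counting process N(t) on [t_open, t_close].

    Inter-arrival gaps are Exponential(rate). The Poisson arrival model is an
    illustrative assumption, not part of the text.
    """
    arrivals = []
    t = t_open
    while True:
        t += random.expovariate(rate)  # time until next arrival
        if t > t_close:
            break
        arrivals.append(t)
    return arrivals

def N(t, arrivals):
    """Value of the counting process at time t: number of arrivals up to t."""
    return sum(1 for a in arrivals if a <= t)

arrivals = simulate_counting_process(rate=10.0)
print(N(9.0, arrivals), N(12.0, arrivals), N(16.0, arrivals))
```

Each run of this program is one realization of the experiment, so each run produces one sample path of N(t): a nondecreasing, integer-valued step function starting at N(9)=0.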
On the other hand, you can have a discrete-time random process. A discrete-time random process is a process
{X(t),t∈J},
where J is a countable set. Since J is countable, we can write J={t1,t2,⋯}. We usually define X(tn)=X(n) or X(tn)=Xn, for n=1,2,⋯ (the index values n could be from any countable set such as N or Z). Therefore, a discrete-time random process is just a sequence of random variables. For this reason, discrete-time random processes are sometimes referred to as random sequences. We can denote such a discrete-time process as
{X(n),n=0,1,2,…} or {Xn,n=0,1,2,…}.
Or, if the process is defined for all integers, then we may show the process by
{X(n),n∈Z} or {Xn,n∈Z}.
Here is an example of a discrete-time random process. Suppose that we are observing customers who visit a bank starting at a given time. Let Xn, for n∈N, be the amount of time the nth customer spends at the bank. This process consists of a countable number of random variables
X1,X2,X3,...
Thus, we say that the process {Xn, n=1,2,3,⋯} is a discrete-time random process. Discrete-time processes are sometimes obtained from continuous-time processes by discretizing time (sampling at specific times). For example, if you only record the temperature in New York City once a day (let's say at noon), then you can define a process
X1 = T(12)    (temperature at noon on day 1, t = 12)
X2 = T(36)    (temperature at noon on day 2, t = 12 + 24)
X3 = T(60)    (temperature at noon on day 3, t = 12 + 24 + 24)
⋮
And, in general, Xn=T(tn) where tn=24(n−1)+12 for n∈N. Here, Xn is a discrete-time random process. Figure 10.2 shows a possible realization of this random process.
Figure 10.2 - Possible realization of the random process {Xn,n=1,2,3,⋯} where Xn shows the temperature in New York City at noon on day n.
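A minimal sketch of this discretization, using an invented toy temperature model T(t) (a daily sinusoid plus Gaussian noise; the model itself is an assumption for illustration, not from the text):

```python
import math
import random

random.seed(1)

def T(t):
    """Toy continuous-time temperature model (illustrative assumption):
    a daily sinusoid around 15 degrees plus Gaussian noise, t in hours."""
    return 15.0 + 10.0 * math.sin(2 * math.pi * (t - 12.0) / 24.0) \
        + random.gauss(0.0, 3.0)

# Discretize: X_n = T(t_n) with t_n = 24(n-1) + 12, i.e., noon of day n.
X = [T(24 * (n - 1) + 12) for n in range(1, 8)]
print([round(x, 1) for x in X])
```

The list X is one realization of the discrete-time process {Xn, n=1,2,⋯} obtained by sampling the continuous-time process T(t) once per day.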
A continuous-time random process is a random process {X(t),t∈J}, where J is an interval on the real line such as [−1,1], [0,∞), (−∞,∞), etc.
A discrete-time random process (or a random sequence) is a random process {X(n)=Xn,n∈J}, where J is a countable set such as N or Z.
Random Processes as Random Functions:
Consider a random process {X(t), t∈J}. This random process results from a random experiment, e.g., observing the stock prices of a company over a period of time. Remember that any random experiment is defined on a sample space S. After observing the values of X(t), we obtain a function of time such as the one shown in Figure 10.1. The function shown in this figure is just one of the many possible outcomes of this random experiment. We call each of these possible functions of X(t) a sample function or sample path. It is also called a realization of X(t).
From this point of view, a random process can be thought of as a random function of time. You are familiar with the concept of functions. The difference here is that {X(t),t∈J} will be equal to one of many possible sample functions after we are done with our random experiment. In engineering applications, random processes are often referred to as random signals.
A random process is a random function of time.
Example
You have 1000 dollars to put in an account with interest rate R, compounded annually. That is, if Xn is the value of the account at year n, then
Xn = 1000(1+R)^n, for n = 0, 1, 2, ⋯.
The value of R is a random variable that is determined when you put the money in the bank, but it does not change after that. In particular, assume that R∼Uniform(0.04,0.05).
Find all possible sample functions for the random process {Xn,n=0,1,2,...}.
Find the expected value of your account at year three. That is, find E[X3].
Here, the randomness in Xn comes from the random variable R. As soon as you know R, you know the entire sequence Xn for n=0,1,2,⋯. In particular, if R=r, then
Xn = 1000(1+r)^n, for all n∈{0,1,2,⋯}.
Thus, here sample functions are of the form f(n) = 1000(1+r)^n, n = 0, 1, 2, ⋯, where r∈[0.04,0.05]. For any r∈[0.04,0.05], you obtain a sample function for the random process Xn.
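For part 2, since X3 = 1000(1+R)^3 with R∼Uniform(0.04,0.05), E[X3] = 1000·E[(1+R)^3] can be evaluated by integrating against the uniform density, which works out to roughly $1141.19 under the stated model. A short numerical sanity check (closed-form integral and a Monte Carlo estimate; the code is an illustration, not part of the text):

```python
import random

random.seed(42)

# Closed form: E[X_3] = 1000 * E[(1+R)^3]
#   = (1000 / 0.01) * [(1+r)^4 / 4] evaluated from r = 0.04 to r = 0.05,
# since R is uniform on [0.04, 0.05] with density 1/0.01.
exact = (1000 / 0.01) * ((1.05 ** 4 - 1.04 ** 4) / 4)

# Monte Carlo estimate of the same expectation
n = 200_000
mc = sum(1000 * (1 + random.uniform(0.04, 0.05)) ** 3 for _ in range(n)) / n

print(round(exact, 2), round(mc, 2))
```

Both approaches agree closely, which is a useful check on the integration.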
Example
Consider the random process {X(t) = A + Bt, t∈[0,∞)}, where A and B are independent normal N(1,1) random variables. Define Y = X(1) and Z = X(2). Find the sample functions of X(t), the PDF of Y, and E[YZ].
Here, we note that the randomness in X(t) comes from the two random variables A and B. The random variable A can take any real value a∈R. The random variable B can also take any real value b∈R. As soon as we know the values of A and B, the entire process X(t) is known. In particular, if A=a and B=b, then
X(t)=a+bt, for all t∈[0,∞).
Thus, here, sample functions are of the form f(t)=a+bt, t≥0, where a,b∈R. For any a,b∈R you obtain a sample function for the random process X(t).
We have
Y=X(1)=A+B.
Since A and B are independent N(1,1) random variables, Y=A+B is also normal with
E[Y] = E[A+B] = E[A] + E[B] = 1 + 1 = 2,
Var(Y) = Var(A+B)
       = Var(A) + Var(B)    (since A and B are independent)
       = 1 + 1 = 2.
Thus, we conclude that Y∼N(2,2):
fY(y) = (1/√(4π)) e^(−(y−2)²/4).
We have
E[YZ] = E[(A+B)(A+2B)]
      = E[A² + 3AB + 2B²]
      = E[A²] + 3E[AB] + 2E[B²]
      = 2 + 3E[A]E[B] + 2·2    (since A and B are independent, and E[A²] = E[B²] = Var(A) + (E[A])² = 1 + 1 = 2)
      = 2 + 3 + 4 = 9.
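A quick Monte Carlo sanity check of the three computed quantities, E[Y]=2, Var(Y)=2, and E[YZ]=9 (the simulation code is illustrative only):

```python
import random

random.seed(7)

n = 200_000
ys, yzs = [], []
for _ in range(n):
    a = random.gauss(1.0, 1.0)  # A ~ N(1,1): mean 1, standard deviation 1
    b = random.gauss(1.0, 1.0)  # B ~ N(1,1), independent of A
    y = a + b                   # Y = X(1) = A + B
    z = a + 2 * b               # Z = X(2) = A + 2B
    ys.append(y)
    yzs.append(y * z)

mean_y = sum(ys) / n
var_y = sum((v - mean_y) ** 2 for v in ys) / n
mean_yz = sum(yzs) / n
print(round(mean_y, 2), round(var_y, 2), round(mean_yz, 2))
```

The sample mean and variance of Y should land near 2, and the sample mean of YZ near 9, consistent with the derivation above.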
The random processes in the above examples were relatively simple in the sense that the randomness in the process originated from one or two random variables. We will see more complicated examples later on.