CLASSIFICATION OF RANDOM PROCESSES
IMPORTANT DEFINITIONS AND RESULTS

1. Differentiate Random Variables and Random Processes.

   Random Variable: a function of the possible outcomes of an experiment, i.e., X(s). Each outcome is mapped into a number x.

   Random Process: a function of the possible outcomes of an experiment and also of time, i.e., X(s, t). Each outcome is mapped into a waveform, which is a function of time t.

2. Define first order stationary process.

   A random process is said to be first order stationary if the first order density function satisfies
       f_X(x1; t1) = f_X(x1; t1 + c) for every c.
   If E[X(t)] = constant, then {X(t)} is first order stationary.

3. Define second order stationary process.

   A random process is said to be second order stationary if the second order density function satisfies
       f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + c, t2 + c) for every c.

4. Define strongly stationary process (SSS).

   A random process is strongly stationary (strict-sense stationary) if, for every n and every c,
       f_X(x1, x2, ..., xn; t1, t2, ..., tn) = f_X(x1, x2, ..., xn; t1 + c, t2 + c, ..., tn + c).

5. Define auto correlation function.

       R_XX(t1, t2) = E{X(t1) X(t2)}
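The autocorrelation definition above can be checked by averaging over an ensemble of sample functions. A minimal Monte Carlo sketch, assuming the standard random-phase sinusoid X(t) = A cos(wt + Θ), Θ uniform on (0, 2π), which is an illustrative example not taken from this guide; its exact autocorrelation is (A²/2) cos(w(t2 − t1)), so it depends only on t2 − t1:

```python
import numpy as np

# Assumed illustrative process (not from the guide): X(t) = A*cos(w*t + Theta),
# Theta ~ Uniform(0, 2*pi).  Exact result: R_XX(t1, t2) = (A**2/2)*cos(w*(t2 - t1)).
rng = np.random.default_rng(0)
A, w = 2.0, 3.0
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)  # one Theta per ensemble member

def X(t):
    """Ensemble of sample-function values at time t."""
    return A * np.cos(w * t + theta)

t1, t2 = 0.4, 1.1
R_est = np.mean(X(t1) * X(t2))                # R_XX(t1, t2) = E{X(t1) X(t2)}
R_exact = (A**2 / 2) * np.cos(w * (t2 - t1))
print(round(float(R_est), 2), round(float(R_exact), 2))
```

The estimate agrees with the exact value, and E[X(t)] is approximately 0 for every t, consistent with first order stationarity.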
VICTORY GUIDE

6. Define mean square value.

       R_XX(t, t) = E[X²(t)]

7. Define wide-sense stationary (WSS) process.

   A random process {X(t)} is wide-sense stationary if
       E{X(t)} = constant, and
       E[X(t1) X(t1 + τ)] = R_XX(τ) depends only on τ, where τ = t2 − t1.

8. Compare Random Variables and Random Processes.

                                    Random Variables              Random Processes
   1. Distribution function         F(x) = P(X ≤ x)               F(x; t) = P{X(t) ≤ x}
      (one dimensional)
   2. Distribution function         F(x, y) = P(X ≤ x, Y ≤ y)     F(x1, x2; t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}
      (two dimensional)
   3. Density function              f(x) = dF(x)/dx               f(x; t) = ∂F(x; t)/∂x
   4. Density function              f(x, y) = ∂²F(x, y)/∂x∂y      f(x1, x2; t1, t2) = ∂²F(x1, x2; t1, t2)/∂x1∂x2
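The first row of the comparison table can be illustrated numerically: F(x; t) = P{X(t) ≤ x} is simply the fraction of ensemble members whose sample value at time t does not exceed x. A sketch under the assumption X(t) = √t · Z with Z ~ N(0, 1), a hypothetical process chosen so that the exact answer Φ(x/√t) is known in closed form:

```python
import math
import random

# Assumed process (not from the guide): X(t) = sqrt(t) * Z, Z ~ N(0, 1),
# so the first order distribution is F(x; t) = Phi(x / sqrt(t)).
random.seed(1)
N = 100_000
z = [random.gauss(0.0, 1.0) for _ in range(N)]

def F_emp(x, t):
    """Empirical F(x; t): fraction of ensemble members with X(t) <= x."""
    s = math.sqrt(t)
    return sum(1 for zi in z if s * zi <= x) / N

def Phi(u):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(u / math.sqrt(2.0)))

x, t = 0.5, 2.0
print(round(F_emp(x, t), 3), round(Phi(x / math.sqrt(t)), 3))
```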
9. Define Time Average.

       X̄_T = (1/2T) ∫[−T, T] X(t) dt

10. Define Ensemble Average.

       Ensemble average = E[X(t)]

11. Define Mean Ergodic.

    A random process {X(t)} is mean ergodic if
        lim (T→∞) (1/2T) ∫[−T, T] X(t) dt = μ.

12. Define Mean Ergodic theorem.

    {X(t)} is mean ergodic if and only if
        lim (T→∞) Var(X̄_T) = 0.

13. Define Correlation Ergodic.

    A stationary process {X(t)} is correlation ergodic if
        lim (T→∞) Ȳ_T = E[Y(t)], where Y(t) = X(t) X(t + τ).

14. Define Markov process.

    A random process {X(t)} is said to be Markovian if
        P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, ..., X(t_0) = x_0]
            = P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n],
    where t_0 ≤ t_1 ≤ t_2 ≤ ... ≤ t_n ≤ t_{n+1}.

15. Define Markov chain.

    If P{X_n = a_n | X_{n−1} = a_{n−1}, X_{n−2} = a_{n−2}, ..., X_0 = a_0}
        = P{X_n = a_n | X_{n−1} = a_{n−1}} for all n,
    then the process {X_n}, n = 0, 1, 2, ..., is called a Markov chain.
    Here a_1, a_2, ..., a_n are called the states of the Markov chain.

16. Define One-Step Transition Probability.

        P{X_n = a_j | X_{n−1} = a_i} = P_ij(n − 1, n)

17. Define n-Step Transition Probability.

        P{X_n = a_j | X_0 = a_i} = P_ij(n)
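Definitions 16 and 17 can be made concrete with a small chain: collecting the one-step probabilities into a matrix P, the n-step transition probabilities are the entries of the matrix power P^n (the Chapman-Kolmogorov relation). The two-state matrix below is an assumed example, not from the guide:

```python
import numpy as np

# Hypothetical two-state chain (states 0 and 1); the numbers are assumed.
# Entry P[i, j] = P{X_n = j | X_{n-1} = i} is the one-step transition probability.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# n-step transition probabilities come from the matrix power P^n:
# Pn[i, j] = P{X_n = j | X_0 = i}.
n = 3
Pn = np.linalg.matrix_power(P, n)
print(Pn.round(4))
assert np.allclose(Pn.sum(axis=1), 1.0)  # each row is still a probability vector
```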
29. Give the Mean and Variance of the Poisson Process.

        Mean = λt,  Variance = λt

30. Define Auto Correlation of the Poisson Process.

        R_XX(t1, t2) = λ² t1 t2 + λ min(t1, t2)

31. Define Auto Covariance of the Poisson Process.

        C_XX(t1, t2) = λ min(t1, t2)

32. Define Correlation Coefficient of the Poisson Process.

        ρ_XX(t1, t2) = √(t1/t2), if t1 ≤ t2

33. Write the properties of the Poisson Process.

    PROPERTY 1: The Poisson process is a Markov process.

    PROPERTY 2 (ADDITIVE PROPERTY): The sum of two independent Poisson processes is a Poisson process.

    PROPERTY 3: The difference of two independent Poisson processes is not a Poisson process.

    PROPERTY 4 (INTER-ARRIVAL TIME): The interval between two successive occurrences of a Poisson process with parameter λ has an exponential distribution with mean 1/λ.

    PROPERTY 5: Let N be a Poisson process with parameter λ, and let N_i, i = 1, 2, ..., m, be m processes obtained from N by splitting N according to probabilities p_i, i = 1, 2, ..., m, where p_1 + p_2 + ... + p_m = 1. Then N_i, i = 1, 2, ..., m, are independent Poisson processes with rate λ p_i.
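Result 29 and Property 4 fit together: generating arrivals from exponential inter-arrival gaps with mean 1/λ produces a count N(t) whose sample mean and sample variance are both close to λt. A minimal simulation sketch (an assumed illustration using only the standard library, not a method given in this guide):

```python
import random

# Build a Poisson process of rate lam from exponential inter-arrival times
# (Property 4) and check that N(t) has mean = lam*t and variance = lam*t.
random.seed(42)
lam, t, runs = 2.0, 5.0, 20_000

def count_events(lam, t):
    """N(t): number of arrivals in [0, t] with Exp(lam) inter-arrival gaps."""
    arrival, n = 0.0, 0
    while True:
        arrival += random.expovariate(lam)  # mean gap = 1/lam
        if arrival > t:
            return n
        n += 1

counts = [count_events(lam, t) for _ in range(runs)]
mean = sum(counts) / runs
var = sum((c - mean) ** 2 for c in counts) / runs
print(round(mean, 2), round(var, 2))  # both should be close to lam * t = 10
```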