An experiment whose outcome cannot be predicted is called a random experiment. The set of all outcomes of a random experiment is called a sample space and is denoted by S. A sample space is discrete if it has finitely many or a countably infinite number of elements. A sample space is continuous if it contains an uncountable number of elements. Any subset of a sample space is called an event. The empty set ∅ is called an impossible event and the whole space is called a sure event.
i. Random variable: A real valued function defined over the sample space of a random experiment is called a random variable associated with that random experiment. That is, the values of a random variable correspond to the outcomes of the random experiment.
Example: In the case of tossing three coins, the outcomes can be described as getting 0 heads, 1 head, 2 heads or 3 heads. Let us consider a variable X which takes the values 0, 1, 2, 3. The values of X correspond to the outcomes 0 heads, 1 head, 2 heads and 3 heads. Then X can be considered a random variable associated with the random experiment of tossing three coins.
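A small Python sketch makes this mapping concrete (the function name X and the use of itertools are illustrative choices, not part of the definition): it enumerates the eight outcomes of tossing three coins and attaches to each outcome the value of X, the number of heads.

```python
from itertools import product

# Sample space of tossing three coins: all sequences of 'H'/'T' of length 3.
sample_space = list(product("HT", repeat=3))

# The random variable X maps each outcome to the number of heads it contains.
def X(outcome):
    return outcome.count("H")

for outcome in sample_space:
    print(outcome, "->", X(outcome))   # e.g. ('H', 'T', 'H') -> 2
```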
ii. Discrete and Continuous Random Variables: A random variable may be discrete or continuous. A random variable is said to be discrete if it assumes only specified values in an interval. For example, when X takes only the values 1, 2, 3, 4, 5, 6, it is a discrete random variable.
iii. A random variable is said to be continuous if it can assume any value in a given interval. When X takes any value in a given interval (a, b), it is a continuous variable in that interval.
iv. Distribution (probability distribution, probability function, density function): Let X be a random variable assuming the values x1, x2, x3, …, and let x stand for any one of x1, x2, x3, …. Then the probability that the random variable X takes the value x is called the probability function of X and is denoted by f(x) or P(x). Therefore P(x) = P(X = x), where X is the random variable and x stands for a value of X, is the probability function of X.
When X takes the values x1, x2, … we have the corresponding probabilities P(x1), P(x2), … such that P(x1) + P(x2) + ⋯ = 1,
i.e. ∑ P(x) = 1 and P(x) ≥ 0 for every x.
Example: In tossing two coins, the random variable X representing the number of heads takes the values x = 0, 1, 2.
P(no head) = 1/4, therefore P(X = 0) = 1/4
P(one head) = 2/4, therefore P(X = 1) = 2/4
P(two heads) = 1/4, therefore P(X = 2) = 1/4
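These probabilities can be verified by direct enumeration. The following Python sketch (an illustration only; the helper name P is not part of the text) lists the four equally likely outcomes of tossing two coins, computes P(X = x) for x = 0, 1, 2, and confirms that the probabilities add up to 1.

```python
from itertools import product
from fractions import Fraction

# Sample space of tossing two coins; each of the 4 outcomes is equally likely.
sample_space = list(product("HT", repeat=2))

# Probability function P(x) = P(X = x), where X counts the heads.
def P(x):
    favourable = [o for o in sample_space if o.count("H") == x]
    return Fraction(len(favourable), len(sample_space))

print(P(0), P(1), P(2))          # 1/4 1/2 1/4
print(P(0) + P(1) + P(2) == 1)   # True: the probabilities sum to 1
```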
v. Properties of Discrete Probability Distribution:
Let P(x) be the probability function; then
- P(x) ≥ 0 for all values of x
- ∑ P(x) = 1
vi. Expectation of X (expected value of X): Let the random variable X assume the values x1, x2, … with corresponding probabilities P(x1), P(x2), …. Then the expected value of the random variable X, denoted by E(X), is given by the formula
E(X) = x1 P(x1) + x2 P(x2) + ⋯, i.e. E(X) = ∑ x P(x)
Example: When a die is thrown, the random variable X (the number shown on the die) takes the values 1, 2, 3, 4, 5, 6 with corresponding probabilities 1/6, 1/6, 1/6, 1/6, 1/6, 1/6. Then
E(X) = (1 × 1/6) + (2 × 1/6) + (3 × 1/6) + (4 × 1/6) + (5 × 1/6) + (6 × 1/6) = 21/6 = 3.5
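The same computation can be written as a short Python sketch (illustrative only; exact fractions are used so that the result 21/6 = 7/2 appears without rounding):

```python
from fractions import Fraction

# Values of X (the number shown by the die) and their probabilities.
values = [1, 2, 3, 4, 5, 6]
probs = [Fraction(1, 6)] * 6

# E(X) = sum of x * P(x) over all values.
expectation = sum(x * p for x, p in zip(values, probs))
print(expectation)   # 7/2, i.e. 21/6 = 3.5
```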
Variance of X: The variance of a random variable X, whose expectation is denoted by E(X), is defined as
V(X) = E(X²) − [E(X)]²
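For the die example above, E(X²) = 91/6, so V(X) = 91/6 − (7/2)² = 35/12. A brief Python sketch of this calculation (illustrative, reusing the exact fractions from the expectation example):

```python
from fractions import Fraction

values = [1, 2, 3, 4, 5, 6]
probs = [Fraction(1, 6)] * 6

# E(X) and E(X^2) computed from the probability function.
e_x = sum(x * p for x, p in zip(values, probs))        # 7/2
e_x2 = sum(x**2 * p for x, p in zip(values, probs))    # 91/6

# V(X) = E(X^2) - [E(X)]^2
variance = e_x2 - e_x**2
print(variance)   # 35/12
```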
vii. Discrete Distribution Function: Let X be a random variable and x be any value of it. Then P[X ≤ x], denoted by F(x), is called the distribution function of X.
viii. Properties of distribution function:
- F(x) is a non-decreasing function of x
- 0 ≤ F(x) ≤ 1
- F(−∞) = 0 and F(∞) = 1.
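These properties can be checked numerically for the two-coin example. In the Python sketch below (the dictionary P and function F are illustrative names), F(x) sums the probabilities of all values not exceeding x; the printed values increase from 0 up to 1.

```python
from fractions import Fraction

# Probability function of the number of heads in two coin tosses.
P = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# Distribution function F(x) = P[X <= x].
def F(x):
    return sum(p for value, p in P.items() if value <= x)

print(F(-1), F(0), F(1), F(2), F(10))   # 0 1/4 3/4 1 1  (non-decreasing, tends to 1)
```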