# Probability

Probability theory originated in the study of games of chance such as cards, coin tossing, and dice, but in modern times probability is of great importance in decision making.  According to the classical theory, probability is the ratio of the number of favorable cases to the total number of equally likely cases.  The empirical (relative frequency) approach bases probability on past experience and present conditions. According to the subjective approach, the probability of an event is assigned by an individual on the basis of the evidence available to them. For example: what is the likelihood of getting a raise given previous data?

Or: what is the likelihood of getting tails when flipping a fair coin?
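The coin question can be answered both ways. A quick sketch contrasting the classical answer with a relative-frequency estimate from simulated flips (the flip count and seed are arbitrary choices for illustration):

```python
import random

# Classical probability of tails on a fair coin: 1 favorable case
# out of 2 equally likely cases.
classical_p = 1 / 2

# Empirical (relative frequency) estimate: flip a simulated coin
# many times and count the proportion of tails.
random.seed(42)  # fixed seed so the run is reproducible
flips = [random.choice(["heads", "tails"]) for _ in range(100_000)]
empirical_p = flips.count("tails") / len(flips)

print(f"classical: {classical_p}")
print(f"empirical: {empirical_p:.4f}")  # close to 0.5 for a large sample
```

As the number of flips grows, the relative frequency settles near the classical value of 0.5.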

Some basic concepts:

• Experiment: A trial conducted to obtain some statistical information is called an experiment (e.g., flipping a coin).
• Event: In an experiment, an outcome is called an event (e.g., getting heads when flipping a coin).
• Exhaustive events: The set of all possible outcomes is called the exhaustive set of events (e.g., heads and tails together exhaust the outcomes of a coin flip).
• Equally likely events: Events are said to be equally likely if each has the same chance of occurring.  In other words, events are equally likely when no one event occurs more often than the others (e.g., getting heads or tails on a fair coin flip).
• Mutually exclusive events: Two events are said to be mutually exclusive when they cannot occur simultaneously in a single trial (e.g., you can’t get heads and tails at the same time when flipping a coin).
• Complementary event: When events A and B are mutually exclusive and exhaustive, A is called the complementary event of B, and B is called the complementary event of A (e.g., the complement of getting heads is getting tails when flipping a coin).
• Simple and compound events: When we consider the probability of the occurrence or non-occurrence of a single event, it is said to be a simple event (e.g., getting heads when flipping a coin).  In the case of a compound event, we consider the probability of the joint occurrence of two or more events (e.g., tossing two dice).
• Dependent events: When the occurrence of one event affects the probability of the occurrence of another event, the events are said to be dependent (e.g., drawing a jack from a deck and then, without replacement, drawing a second jack).
• Independent events: When the occurrence of one event does not affect the probability of the occurrence of another event, the events are said to be independent (e.g., drawing a jack from one deck is independent of drawing a jack from a second deck).
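The card-drawing examples above can be made concrete with a few lines of arithmetic (a minimal sketch assuming a standard 52-card deck with 4 jacks):

```python
# Dependent events: draw two cards from ONE deck without replacement.
# P(first jack) = 4/52; P(second jack | first jack) = 3/51, because
# one jack and one card are gone after the first draw.
p_dependent = (4 / 52) * (3 / 51)

# Independent events: draw one card from each of TWO separate decks.
# The first draw does not change the composition of the second deck.
p_independent = (4 / 52) * (4 / 52)

print(f"two jacks, one deck (dependent):    {p_dependent:.5f}")
print(f"two jacks, two decks (independent): {p_independent:.5f}")
```

The dependent probability is slightly smaller, because removing a jack on the first draw makes a second jack less likely.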

Laws/Theorems:

1. Additive Law of Probability: Given two events A and B, the probability of their union is

P(A∪B) = P(A) + P(B) – P(A∩B)

If A and B are mutually exclusive events, then P(A∩B) = 0 and the union reduces to

P(A∪B) = P(A) + P(B)

Given three events A, B, and C, their union is

P(A∪B∪C) = P(A) + P(B) + P(C) – P(A∩B) – P(B∩C) – P(A∩C) + P(A∩B∩C)

If A, B, and C are mutually exclusive events, then all intersections equal zero and their union is

P(A∪B∪C) = P(A) + P(B) + P(C)
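The two-event form of the additive law can be checked by brute-force enumeration over a small sample space. The events below are assumptions chosen for illustration:

```python
from itertools import product

# Brute-force check of the additive law on two fair dice:
#   A = "first die shows 6", B = "total is at least 10".
space = list(product(range(1, 7), repeat=2))  # 36 equally likely outcomes
A = {o for o in space if o[0] == 6}
B = {o for o in space if sum(o) >= 10}

def p(event):
    """Classical probability: favorable outcomes / total outcomes."""
    return len(event) / len(space)

lhs = p(A | B)                   # P(A∪B) counted directly
rhs = p(A) + p(B) - p(A & B)     # additive law
print(lhs, rhs)                  # both equal 9/36 = 0.25
```

Here A and B are not mutually exclusive (they share (6,4), (6,5), and (6,6)), so the subtracted intersection term is essential.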

2. Multiplicative Law of Probability: In the case of two dependent events, A and B, the Multiplicative Law of Probability states that the probability of their intersection is equal to the probability of one event multiplied by the conditional probability of the second event. Mathematically, the Multiplicative Law of Probability is denoted:

P(A∩B) = P(A)P(B|A) = P(B)P(A|B)

In the case of two independent events, A and B, the conditional probabilities of the events equal the probability of the events themselves. The Multiplicative Law of Probability is denoted:

P(A∩B) = P(A)P(B)
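The dependent-event form P(A∩B) = P(A)P(B|A) can be verified by enumeration. The toy deck below (4 jacks and 8 other cards) is an assumption chosen to keep the enumeration small:

```python
from itertools import permutations

# Verify P(A∩B) = P(A)·P(B|A) for ordered draws of two cards
# without replacement from a 12-card toy deck.
deck = ["J"] * 4 + ["x"] * 8
draws = list(permutations(range(len(deck)), 2))  # all ordered position pairs

# A = first card is a jack; A∩B = both cards are jacks.
p_A = sum(deck[i] == "J" for i, _ in draws) / len(draws)
p_A_and_B = sum(deck[i] == "J" and deck[j] == "J" for i, j in draws) / len(draws)

# By direct counting, P(B|A) = 3 remaining jacks / 11 remaining cards.
p_B_given_A = 3 / 11

print(abs(p_A_and_B - p_A * p_B_given_A) < 1e-12)  # multiplicative law holds
```

Enumerating all 132 ordered pairs gives P(A) = 1/3 and P(A∩B) = 1/11, matching (1/3)·(3/11) exactly.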

3. Law of Total Probability: In the case of a discrete probability distribution, if the events Bi, for i = 1, 2, …, n, form a partition of the probability space, then for any event A

P(A) = P(A∩B1) + P(A∩B2) + … + P(A∩Bn)
4. Bayes’ Theorem of Probability: Bayes’ Theorem (Bayes’ Law) uses the probability of A given B to find the probability of B given A. Writing the definition of the conditional probability of A given B and of B given A, and solving each for the intersection of A and B, yields

P(A|B)P(B) = P(A∩B) = P(B|A)P(A)

Rearranging this relationship yields

P(A|B) = (P(B|A)P(A)) / P(B)

Applying the Law of Total Probability, we obtain:

P(A) = P(A∩B) + P(A∩Bᶜ) = P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ)

and

P(B|A) = (P(A|B)P(B)) / (P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ))
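This last formula is easy to evaluate numerically. A sketch using a hypothetical diagnostic-test scenario, where all of the input probabilities are assumptions chosen for illustration:

```python
# Bayes' theorem with the total-probability denominator.
# B = "has the condition", A = "test is positive".
p_B = 0.01            # P(B): assumed prevalence of the condition
p_Bc = 1 - p_B        # P(Bᶜ): complement of B
p_A_given_B = 0.95    # P(A|B): assumed true-positive rate
p_A_given_Bc = 0.05   # P(A|Bᶜ): assumed false-positive rate

# Law of Total Probability: P(A) = P(A|B)P(B) + P(A|Bᶜ)P(Bᶜ)
p_A = p_A_given_B * p_B + p_A_given_Bc * p_Bc

# Bayes' theorem: P(B|A) = P(A|B)P(B) / P(A)
p_B_given_A = p_A_given_B * p_B / p_A

print(f"P(A)   = {p_A:.4f}")
print(f"P(B|A) = {p_B_given_A:.4f}")  # about 0.16 despite the 0.95 rate
```

Because the condition is rare, most positive results come from the large Bᶜ group, so P(B|A) is far below P(A|B).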