Common Assumptions in Statistics

Quantitative Results

In inferential statistics, researchers typically assess certain assumptions before analysis. Depending on the statistical analysis, the assumptions may differ. Three of the most common assumptions in statistics are normality, linearity, and equality of variance.

Normality assumes that the continuous variables used in the analysis follow a normal distribution. A normal distribution is symmetric around its center (i.e., the mean) and follows the familiar 'bell-shaped' curve. Researchers typically assess normality when examining mean differences (e.g., t-tests, ANOVAs/MANOVAs) and in prediction analyses (e.g., linear regression). Normality can be examined using methods such as the Kolmogorov-Smirnov (KS) test and inspection of skew and kurtosis. The KS test compares the sample distribution to a normal distribution; if p < .05, the assumption of normality is violated. For skew and kurtosis, absolute values below 2.0 and 7.0, respectively, are generally taken as consistent with normality; if the observed values exceed these limits, the assumption of normality is violated.
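As a rough sketch of these checks, the snippet below runs a KS test and computes skew and kurtosis with SciPy on simulated data (the variable names and cutoff logic are illustrative, not a standard API). Note that estimating the mean and SD from the sample, as here, makes the standard KS p-value somewhat lenient; a Lilliefors correction is the stricter alternative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
scores = rng.normal(loc=50, scale=10, size=200)  # hypothetical continuous variable

# KS test against a normal distribution with the sample's own mean and SD
stat, p = stats.kstest(scores, "norm", args=(scores.mean(), scores.std(ddof=1)))
normal_by_ks = p >= 0.05  # p < .05 would indicate a violation of normality

# Skew and kurtosis against the cutoffs cited above (|skew| < 2.0, |kurtosis| < 7.0);
# scipy reports excess kurtosis, which is 0 for a perfectly normal distribution
skew = stats.skew(scores)
kurt = stats.kurtosis(scores)
normal_by_shape = abs(skew) < 2.0 and abs(kurt) < 7.0

print(normal_by_ks, normal_by_shape)
```

With normally generated data, both checks should come back clean; real data often fails one but not the other, which is why both are reported.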

Linearity means a straight-line relationship between the independent and dependent variables. If this assumption is violated, predictions may become inaccurate. Researchers assess linearity in Pearson correlation and regression analyses, typically by examining scatter plots of the variables.
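A small sketch of why the scatter plot matters: Pearson's r captures a straight-line relationship well but can badly understate a curved one. The data below are simulated for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=100)

# A roughly linear relationship: Pearson's r is close to 1
y_linear = 2.0 * x + rng.normal(scale=1.0, size=x.size)
r_linear, _ = stats.pearsonr(x, y_linear)

# A strong but curved (quadratic) relationship: Pearson's r is near 0,
# which is exactly what a scatter plot would reveal at a glance
y_curved = (x - 5.0) ** 2 + rng.normal(scale=1.0, size=x.size)
r_curved, _ = stats.pearsonr(x, y_curved)

print(round(r_linear, 2), round(r_curved, 2))
```

A near-zero r on the curved data does not mean "no relationship"; it means the relationship is not linear, so Pearson correlation and ordinary linear regression are the wrong tools for it.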

Equality of variance means that the variance of the dependent variable is similar across groups. It's typically examined in t-tests and ANOVAs. Levene's test assesses it for each continuous dependent variable; if p < .05, the assumption of equal variance is violated.
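The check can be sketched with SciPy's `levene` function on simulated group scores (group names and SDs here are hypothetical). Two groups drawn with the same spread should pass; a group with a much larger spread should trigger a violation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Two hypothetical groups with similar spread (SD = 10 in both)
group_a = rng.normal(50, 10, size=60)
group_b = rng.normal(55, 10, size=60)
stat_eq, p_eq = stats.levene(group_a, group_b)

# A group with a much larger spread (SD = 30) versus group_a
group_c = rng.normal(52, 30, size=60)
stat_uneq, p_uneq = stats.levene(group_a, group_c)

# p < .05 indicates the equal-variance assumption is violated
print("a vs b violated:", p_eq < 0.05)
print("a vs c violated:", p_uneq < 0.05)
```

Note that Levene's test, unlike some alternatives, compares absolute deviations from group centers, which makes it reasonably robust when the normality assumption is itself shaky.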