In statistical analysis, every statistical test rests on certain assumptions. When these assumptions are violated, the conclusions of the research may no longer be valid. Therefore, all research, whether it is a journal article, thesis, or dissertation, should verify that these assumptions hold.
The following are the data assumptions found in statistical research:
Unidimensionality: When a researcher measures a construct from several measured variables, the assumption of unidimensionality requires that the measured items fall under a single dimension; in other words, the items chosen for the construct measure the same underlying thing. The following methods are used to assess unidimensionality:
Cronbach’s alpha: Cronbach’s alpha is the most common method of assessing whether a set of items forms a consistent, unidimensional scale. The value of Cronbach’s alpha should be greater than .7.
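As a rough sketch, Cronbach’s alpha can be computed directly from an item-score matrix; the 5-item, 6-respondent scores below are purely illustrative:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-item Likert scale answered by 6 respondents (illustrative data).
scores = np.array([
    [4, 5, 4, 4, 5],
    [3, 3, 3, 4, 3],
    [5, 5, 4, 5, 5],
    [2, 2, 3, 2, 2],
    [4, 4, 4, 3, 4],
    [3, 4, 3, 3, 3],
])
alpha = cronbach_alpha(scores)  # a value above .7 supports internal consistency
```

Because the illustrative respondents answer all items consistently, alpha comes out well above the .7 cutoff.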
Factor analysis: In factor analysis, items are taken to belong to one dimension when their factor loadings are higher than .3; under the Kaiser criterion, a factor is retained when its eigenvalue is greater than 1.
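The eigenvalue-greater-than-1 rule can be illustrated with the correlation matrix of simulated items driven by a single latent factor (the data-generating setup here is an assumption for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical data: 4 items driven mostly by one latent factor plus noise.
latent = rng.normal(size=(200, 1))
items = latent @ np.ones((1, 4)) + 0.5 * rng.normal(size=(200, 4))

corr = np.corrcoef(items, rowvar=False)       # item correlation matrix
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # sorted, largest first
# Kaiser criterion: retain factors whose eigenvalue exceeds 1.
n_factors = int((eigenvalues > 1).sum())
```

With one dominant latent factor, only the first eigenvalue exceeds 1, so the criterion correctly points to a unidimensional scale.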
Assumptions of normality: Most parametric tests assume normality. Normality means that the variable follows a normal distribution, a symmetric, bell-shaped curve (the standard normal distribution has mean 0 and standard deviation 1). To test the assumption of normality, the following measures and tests are applied:
Skewness: To test the assumption of normal distribution, skewness should fall within the range ±1.
Kurtosis: To test whether or not the data are normally distributed, the kurtosis value should fall within the range ±3; some researchers use the stricter range of ±2.
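Both statistics are available in scipy; on a simulated normal sample they should fall comfortably inside the cutoffs above (note that scipy reports excess kurtosis, which is 0 for a normal distribution):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
sample = rng.normal(loc=0, scale=1, size=1000)  # illustrative normal sample

skewness = stats.skew(sample)  # should fall within +/-1
kurt = stats.kurtosis(sample)  # excess kurtosis, should fall within +/-3
normal_enough = abs(skewness) <= 1 and abs(kurt) <= 3
```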
Shapiro-Wilk’s W test: This is the test most researchers use to assess normality. The Shapiro-Wilk test should be non-significant (p > .05) to meet the assumption of normality; a significant result indicates a departure from normality.
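A short sketch with scipy, contrasting a simulated normal sample with a clearly skewed one (both samples are synthetic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
normal_sample = rng.normal(size=200)       # should be consistent with normality
skewed_sample = rng.exponential(size=200)  # should be flagged as non-normal

w_norm, p_norm = stats.shapiro(normal_sample)
w_skew, p_skew = stats.shapiro(skewed_sample)
# A NON-significant p (> .05) is consistent with normality;
# a significant p (< .05) indicates a departure from it.
```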
Kolmogorov-Smirnov D test: In the case of a large sample, most researchers use the K-S test to assess normality; again, a non-significant result supports the assumption.
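A hedged scipy sketch on a large synthetic sample; note that the classical K-S test assumes the reference mean and standard deviation are specified in advance (when they are estimated from the same data, the Lilliefors correction is strictly more appropriate):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
large_sample = rng.normal(loc=10, scale=2, size=5000)

# Compare against a normal distribution with the hypothesised mean and sd.
d_stat, p_value = stats.kstest(large_sample, 'norm', args=(10, 2))

# A clearly non-normal sample is rejected decisively.
skewed = rng.exponential(size=5000)
d_bad, p_bad = stats.kstest(skewed, 'norm', args=(1, 1))
```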
Graphical methods for testing normality:
Histogram: A histogram gives a rough idea of whether or not the data follow the assumption of normality.
Q-Q plot: Most researchers use the Q-Q plot to test the assumption of normality. In this method, the observed quantiles are plotted against the quantiles expected under a normal distribution. If the points deviate substantially from a straight line, the data are not normally distributed; if they fall close to the line, the data can be taken as normally distributed.
Box plot test of normality assumption: The box plot is used to check whether outliers are present in the data. Outliers and skewness both indicate a violation of the assumption of normality.
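The box plot’s outlier rule can be reproduced numerically with the standard 1.5 × IQR whisker fences (the data vector is illustrative, with one deliberately extreme value):

```python
import numpy as np

data = np.array([12, 14, 14, 15, 15, 16, 16, 17, 18, 19, 45])  # 45 is suspect

q1, q3 = np.percentile(data, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr  # box-plot whisker fences
outliers = data[(data < lower) | (data > upper)]
```

Here only the value 45 falls outside the fences, matching what the box plot would display as an outlier point.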
Multivariate normality: Mardia’s statistics, which are based on multivariate skewness and kurtosis, test the assumption of multivariate normality.
Assumptions of homogeneity of variance:
Levene’s test: To test the assumption of homogeneity of variance, Levene’s test is used. Levene’s test assesses whether the variance of the dependent variable is the same across the groups defined by the independent variable.
Brown & Forsythe’s test: This is a more recently developed modification of Levene’s test that centres the data on group medians rather than means. The Brown-Forsythe test is more robust than Levene’s test when groups are unequal in size.
Bartlett’s test: This test is another alternative to Levene’s test for the assumption of homogeneity of variance; note, however, that it is sensitive to departures from normality.
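All three tests are available in scipy. In scipy.stats.levene, center='mean' gives the classic Levene test and center='median' gives the Brown-Forsythe variant. The groups below are simulated so that one has a clearly larger spread:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical dependent-variable scores in three groups; group C is more dispersed.
group_a = rng.normal(50, 5, size=40)
group_b = rng.normal(52, 5, size=40)
group_c = rng.normal(51, 15, size=40)

lev_stat, lev_p = stats.levene(group_a, group_b, group_c, center='mean')
bf_stat, bf_p = stats.levene(group_a, group_b, group_c, center='median')  # Brown-Forsythe
bart_stat, bart_p = stats.bartlett(group_a, group_b, group_c)
# Significant p-values (< .05) here indicate the variances are NOT homogeneous.
```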
Homogeneity of variance-covariance matrices assumption:
Box’s M test is used to test the multivariate assumption of homogeneity of variance-covariance matrices. A non-significant value of Box’s M test indicates that the groups’ covariance matrices do not differ from each other, i.e., the assumption is met.
Independence assumption: Many test statistics assume that the sample observations are independent of each other. To test the assumption of independence (i.e., the absence of serial correlation in the residuals), the Durbin-Watson (D-W) test is used. The D-W statistic should lie between 1.5 and 2.5 for independence of observations; values near 2 indicate no autocorrelation.
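The D-W statistic is simple enough to compute by hand as a sketch; for serially independent residuals it should land near 2:

```python
import numpy as np

def durbin_watson(residuals: np.ndarray) -> float:
    """D-W statistic: sum of squared successive differences over sum of squares."""
    diffs = np.diff(residuals)
    return float((diffs ** 2).sum() / (residuals ** 2).sum())

rng = np.random.default_rng(5)
independent_resid = rng.normal(size=500)  # residuals with no serial correlation
dw = durbin_watson(independent_resid)     # expected to fall between 1.5 and 2.5
```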
Randomness: Most statistical procedures assume that the sample observations are random. The runs test is used to test the assumption of randomness.
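A minimal sketch of the Wald-Wolfowitz runs test about the median, using the large-sample normal approximation; a strongly trending sequence produces very few runs and is flagged as non-random:

```python
import numpy as np

def runs_test_z(x: np.ndarray) -> float:
    """Wald-Wolfowitz runs test about the median; returns the z statistic."""
    med = np.median(x)
    signs = x[x != med] > med  # above/below median, ties with the median dropped
    n1 = int(signs.sum())
    n2 = int((~signs).sum())
    runs = 1 + int((signs[1:] != signs[:-1]).sum())
    mean = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mean) / var ** 0.5

rng = np.random.default_rng(11)
random_seq = rng.normal(size=300)
z = runs_test_z(random_seq)               # |z| < 1.96 is consistent with randomness

trend_seq = np.arange(300.0)              # monotone trend: only 2 runs
z_trend = runs_test_z(trend_seq)          # far below -1.96: randomness rejected
```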
Equality of means: Hotelling’s T-square test is the multivariate test for the equality-of-means assumption. A non-significant value of Hotelling’s T-square indicates that the group mean vectors do not differ.
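A minimal two-sample Hotelling’s T-square sketch, using the standard F transformation of the statistic; both groups are synthetic, and in the second comparison the mean vectors are shifted apart on purpose:

```python
import numpy as np
from scipy import stats

def hotelling_t2(x: np.ndarray, y: np.ndarray):
    """Two-sample Hotelling's T^2 test; returns (T2, F statistic, p-value)."""
    n1, p = x.shape
    n2 = y.shape[0]
    diff = x.mean(axis=0) - y.mean(axis=0)
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
                (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)
    f = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2   # exact F transformation
    p_value = stats.f.sf(f, p, n1 + n2 - p - 1)
    return t2, f, p_value

rng = np.random.default_rng(9)
group1 = rng.normal(0, 1, size=(50, 3))
group2 = rng.normal(0, 1, size=(50, 3))  # same mean vector as group1
group3 = rng.normal(1, 1, size=(50, 3))  # mean vector shifted by 1 on each variable

t2_same, f_same, p_same = hotelling_t2(group1, group2)  # expect non-significant
t2_diff, f_diff, p_diff = hotelling_t2(group1, group3)  # expect significant
```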
Multicollinearity: Multicollinearity means that the independent (predictor) variables are highly correlated with one another. To test the assumption of no multicollinearity, the variance inflation factor (VIF) and condition indices are used. A VIF greater than 4 indicates a multicollinearity problem. In SEM analysis, condition indices are used to test for multicollinearity; a condition index greater than 15 indicates a multicollinearity problem.
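The VIF can be computed from first principles as 1/(1 − R²), where R² comes from regressing each predictor on all the others; the near-collinear predictor below is constructed purely for illustration:

```python
import numpy as np

def vif(X: np.ndarray) -> np.ndarray:
    """Variance inflation factor for each column of the predictor matrix X."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(n), others])    # regress on others + intercept
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - A @ beta
        r2 = 1 - resid.var() / y.var()
        out.append(1 / (1 - r2))                     # VIF_j = 1 / (1 - R_j^2)
    return np.array(out)

rng = np.random.default_rng(2)
x1 = rng.normal(size=200)
x2 = rng.normal(size=200)
x3 = x1 + 0.1 * rng.normal(size=200)   # nearly collinear with x1
vifs = vif(np.column_stack([x1, x2, x3]))
# The collinear pair (x1, x3) has VIF well above 4; the independent x2 does not.
```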
Statistics Solutions can assist with your quantitative analysis by helping you develop your methodology and results chapters. The services that we offer include:
Quantitative Results Section (Descriptive Statistics, Bivariate and Multivariate Analyses, Structural Equation Modeling, Path analysis, HLM, Cluster Analysis)
*Please call 877-437-8622 to request a quote based on the specifics of your research, or email Info@StatisticsSolutions.com.