While devising your data analysis plan or conducting your analysis, you may have had a reviewer ask whether you have considered a “hierarchical regression” or a “hierarchical linear model.” At a glance, the two terms may seem to refer to the same kind of analysis. However, hierarchical linear modeling and hierarchical regression are actually two very different analyses, used with different types of data and to answer different types of questions. So, what is the difference between the two?


Hierarchical linear modeling, also referred to as “multilevel modeling,” falls under the family of analyses known as “mixed-effects modeling” (or, more simply, “mixed models”). This type of analysis is most commonly used when the cases in the data have a nested structure. Say, for example, you collected data from students who come from a few different classrooms; your data then consist of students nested within classrooms. Students from the same classroom share some common variance associated with being in that classroom, so their scores cannot be treated as truly independent of one another. Because a conventional multiple linear regression assumes that all cases are independent, a different kind of analysis is required for nested data. Hierarchical linear modeling allows you to model nested data more appropriately than a regular multiple linear regression does.
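One quick way to see why nested cases violate the independence assumption is to estimate the intraclass correlation (ICC), the share of total variance attributable to classroom membership. The sketch below simulates hypothetical students nested in classrooms (the classroom counts and variance values are invented for illustration) and computes the ICC from one-way ANOVA mean squares; in practice you would compute this from your own data, and a nontrivial ICC is a signal that a multilevel model is warranted.

```python
import random

random.seed(0)

# Simulate scores for students nested in classrooms: each classroom
# contributes a shared effect, so students in the same room are correlated.
# All sizes and variances below are hypothetical.
n_classrooms, n_students = 10, 30
classroom_effects = [random.gauss(0, 5) for _ in range(n_classrooms)]
scores = [[70 + eff + random.gauss(0, 3) for _ in range(n_students)]
          for eff in classroom_effects]

# One-way ANOVA pieces: between- and within-classroom mean squares.
grand_mean = sum(sum(c) for c in scores) / (n_classrooms * n_students)
class_means = [sum(c) / n_students for c in scores]
ms_between = (n_students
              * sum((m - grand_mean) ** 2 for m in class_means)
              / (n_classrooms - 1))
ms_within = (sum((x - m) ** 2 for c, m in zip(scores, class_means) for x in c)
             / (n_classrooms * (n_students - 1)))

# Intraclass correlation: proportion of variance due to classroom membership.
icc = (ms_between - ms_within) / (ms_between + (n_students - 1) * ms_within)
print(f"ICC = {icc:.2f}")
```

With a classroom effect this large relative to student-level noise, the ICC comes out well above zero, which is exactly the situation in which ordinary regression understates standard errors and a hierarchical linear model is the better choice.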

Hierarchical regression, on the other hand, deals with how predictor (independent) variables are selected and entered into the model. Specifically, hierarchical regression refers to the process of adding or removing predictor variables from the regression model in steps. For instance, say you wanted to predict college achievement (your dependent variable) from high school GPA (your independent variable) while controlling for demographic factors (i.e., covariates). You might enter the demographic factors into the model in the first step, and then enter high school GPA in the second step. This lets you see the predictive power that high school GPA adds to the model above and beyond the demographic factors. Forward, backward, and stepwise regression are closely related techniques in which predictors are added to or removed from the model automatically, in steps, based on statistical algorithms rather than theory. These automated approaches can be useful if you have a very large number of potential predictor variables and want to determine statistically which variables have the most predictive power.
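The two-step procedure described above can be sketched with ordinary least squares: fit the covariates-only model first, then add the focal predictor and examine the change in R². The data below are simulated purely for illustration (the effect sizes are invented), and the models are fit with a minimal least-squares helper rather than a full statistics package.

```python
import numpy as np

rng = np.random.default_rng(42)

def r_squared(X, y):
    """Fit OLS via least squares and return the model's R-squared."""
    X = np.column_stack([np.ones(len(y)), X])          # add intercept
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Hypothetical data: two demographic covariates plus high school GPA,
# where GPA genuinely contributes to college achievement.
n = 200
demographics = rng.normal(size=(n, 2))
hs_gpa = rng.normal(3.0, 0.4, size=n)
college_achievement = (0.3 * demographics[:, 0] + 1.2 * hs_gpa
                       + rng.normal(0, 0.5, size=n))

# Step 1: covariates only.  Step 2: covariates plus the focal predictor.
r2_step1 = r_squared(demographics, college_achievement)
r2_step2 = r_squared(np.column_stack([demographics, hs_gpa]),
                     college_achievement)
print(f"R2 step 1 = {r2_step1:.3f}, step 2 = {r2_step2:.3f}, "
      f"delta R2 = {r2_step2 - r2_step1:.3f}")
```

The ΔR² between the steps is the quantity a hierarchical regression reports: the variance in the outcome explained by high school GPA over and above the demographic covariates (in practice you would pair it with an F test of the R² change).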

In a nutshell, hierarchical linear modeling is used when you have nested data; hierarchical regression is used to add or remove variables from your model in multiple steps. Knowing the difference between these two seemingly similar terms can help you determine the most appropriate analysis for your study.