In the process of devising your data analysis plan or conducting your analysis, you may have had a reviewer ask you if you have considered conducting a “hierarchical regression” or a “hierarchical linear model”. At a glance, it may seem like these two terms refer to the same kind of analysis. However, “hierarchical linear modeling” and “hierarchical regression” refer to two distinct types of analyses used with different data types and to answer different questions. So, what is the difference between the two?
Researchers use hierarchical linear modeling, also known as “multi-level modeling,” for nested data structures. Say, for example, you are collecting data from students. The students in your study might come from several different classrooms, so your data consist of students nested within classrooms.
Students from the same classroom share common variance, so you cannot treat those cases as independent. Since conventional multiple linear regression assumes all cases are independent, researchers need a different analysis for nested data. Hierarchical linear modeling allows you to model nested data more appropriately than a regular multiple linear regression.
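As a sketch of what this looks like in practice, the random-intercept model below uses Python’s statsmodels library. The data, variable names, and effect sizes here are entirely hypothetical, made up purely for illustration:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)

# Simulate 5 classrooms of 30 students each. Each classroom gets its own
# random intercept, so students within a classroom are not independent.
n_classrooms, n_students = 5, 30
classroom = np.repeat(np.arange(n_classrooms), n_students)
classroom_effect = rng.normal(0, 2, n_classrooms)[classroom]
study_hours = rng.uniform(0, 10, n_classrooms * n_students)
score = (70 + 1.5 * study_hours + classroom_effect
         + rng.normal(0, 3, n_classrooms * n_students))

df = pd.DataFrame({"score": score,
                   "study_hours": study_hours,
                   "classroom": classroom})

# A random intercept for classroom absorbs the shared within-classroom
# variance that an ordinary regression would ignore.
model = smf.mixedlm("score ~ study_hours", data=df, groups=df["classroom"])
result = model.fit()
print(result.summary())
```

Fitting the same data with ordinary least squares would ignore the classroom-level variance and understate the standard errors, which is precisely why nested data call for this kind of model.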
Hierarchical regression, on the other hand, concerns how predictor (independent) variables are selected and entered into the model. Specifically, hierarchical regression refers to the process of adding or removing predictor variables from the regression model in steps. For example, suppose you want to predict college achievement from high school GPA while controlling for demographic factors. You might enter the demographic factors in the first step of your analysis and high school GPA in the second.
This would let you see the predictive power that high school GPA adds to your model above and beyond the demographic factors. Hierarchical regression includes forward, backward, and stepwise regression, where statistical algorithms add or remove predictors from the model in steps. These forms of hierarchical regression are useful if you have a very large number of potential predictor variables and want to determine (statistically) which variables have the most predictive power.
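The two-step example above can be sketched as a pair of ordinary regressions whose R-squared values are compared. Again, the data and variable names (age, income, high school GPA) are hypothetical stand-ins:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200

# Simulated demographics and high school GPA, with college GPA depending
# on all three plus noise (hypothetical effect sizes).
age = rng.normal(20, 2, n)
income = rng.normal(50, 10, n)
hs_gpa = rng.normal(3.0, 0.5, n)
college_gpa = (0.5 + 0.02 * age + 0.005 * income + 0.6 * hs_gpa
               + rng.normal(0, 0.3, n))

df = pd.DataFrame({"age": age, "income": income,
                   "hs_gpa": hs_gpa, "college_gpa": college_gpa})

# Step 1: demographic controls only.
step1 = smf.ols("college_gpa ~ age + income", data=df).fit()

# Step 2: add high school GPA to the same model.
step2 = smf.ols("college_gpa ~ age + income + hs_gpa", data=df).fit()

# The change in R-squared is the variance in college GPA that high
# school GPA explains above and beyond the demographic factors.
r2_change = step2.rsquared - step1.rsquared
print(f"Step 1 R^2:  {step1.rsquared:.3f}")
print(f"Step 2 R^2:  {step2.rsquared:.3f}")
print(f"R^2 change:  {r2_change:.3f}")
```

An F-test on the R-squared change (or `step2.compare_f_test(step1)` in statsmodels) would tell you whether the improvement from adding the step-2 predictor is statistically significant.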
In short, use hierarchical linear modeling for nested data and hierarchical regression to add or remove variables from your model in multiple steps. Knowing the difference between these two seemingly similar terms can help you determine the most appropriate analysis for your study.