What is Linear Regression?

Linear regression is a basic and commonly used type of predictive analysis. The overall idea of regression is to examine two things: (1) does a set of predictor variables do a good job of predicting an outcome (dependent) variable? (2) which variables in particular are significant predictors of the outcome variable, and in what way (indicated by the magnitude and sign of the beta estimates) do they impact the outcome variable? These regression estimates are used to explain the relationship between one dependent variable and one or more independent variables. The simplest form of the regression equation, with one dependent and one independent variable, is defined by the formula y = c + b*x, where y = estimated dependent variable score, c = constant, b = regression coefficient, and x = score on the independent variable.
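
As a quick illustration, the sketch below (assuming Python with NumPy, and using made-up example values) estimates the constant c and the regression coefficient b for a single predictor by ordinary least squares:

```python
# A minimal sketch of the single-predictor case y = c + b*x, using
# illustrative (made-up) data; np.polyfit performs the least-squares fit.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # scores on the independent variable
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])   # observed dependent variable scores

b, c = np.polyfit(x, y, deg=1)            # returns the slope first, then the intercept
print(f"estimated equation: y = {c:.2f} + {b:.2f}*x")
```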

Naming the Variables.  There are many names for a regression’s dependent variable.  It may be called an outcome variable, criterion variable, endogenous variable, or regressand.  The independent variables can be called exogenous variables, predictor variables, or regressors.

Three major uses for regression analysis are (1) determining the strength of predictors, (2) forecasting an effect, and (3) trend forecasting.

First, regression might be used to identify the strength of the effect that the independent variable(s) have on the dependent variable.  Typical questions are: What is the strength of the relationship between dose and effect, between sales and marketing spending, or between age and income?

Second, it can be used to forecast effects or the impact of changes.  That is, the regression analysis helps us to understand how much the dependent variable changes with a change in one or more independent variables.  A typical question is, “How much additional sales income do I get for each additional $1000 spent on marketing?”

Third, regression analysis predicts trends and future values.  The regression analysis can be used to obtain point estimates.  A typical question is, “What will the price of gold be in six months?”
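
To make these three uses concrete, here is a hypothetical sketch (assuming Python with NumPy and statsmodels, and using simulated example data) in which a single fitted regression answers all three kinds of question:

```python
# A hypothetical sketch: one fitted regression used for (1) predictor strength,
# (2) the effect of a change, and (3) a point prediction. Data are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
marketing = rng.uniform(10, 100, size=50)             # marketing spending (in $1000s)
sales = 50 + 3.5 * marketing + rng.normal(0, 20, 50)  # simulated sales income

X = sm.add_constant(marketing)        # adds the constant term c
model = sm.OLS(sales, X).fit()

# (1) Strength of the predictor: coefficients, t-statistics, p-values, R-squared
print(model.summary())

# (2) Effect of a change: the slope estimates the additional sales income
#     expected for each additional $1000 of marketing spending
print("extra sales per extra $1000 of marketing:", model.params[1])

# (3) Point estimate for a new value of the independent variable
#     (the leading 1.0 corresponds to the constant term)
print("predicted sales at $120k of marketing:", model.predict([[1.0, 120.0]]))
```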

There are several types of regression analysis available to researchers; a short code sketch following the list illustrates how the dependent variable’s level of measurement guides the choice among them.

  • Simple linear regression
    1 dependent variable (interval or ratio), 1 independent variable (interval or ratio or dichotomous)

  • Multiple linear regression
    1 dependent variable (interval or ratio), 2+ independent variables (interval or ratio or dichotomous)

  • Logistic regression
    1 dependent variable (dichotomous), 2+ independent variable(s) (interval or ratio or dichotomous)

  • Ordinal regression
    1 dependent variable (ordinal), 1+ independent variable(s) (nominal or dichotomous)

  • Multinomial regression
    1 dependent variable (nominal), 1+ independent variable(s) (interval or ratio or dichotomous)
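
The sketch below (a rough illustration assuming Python with NumPy and statsmodels, using simulated data; ordinal regression is omitted for brevity) shows how the measurement level of the dependent variable maps to a model choice:

```python
# A rough sketch of matching the model to the dependent variable's level of
# measurement: OLS for an interval outcome, Logit for a dichotomous outcome,
# MNLogit for a nominal outcome. All data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
X = sm.add_constant(rng.normal(size=(100, 2)))    # two interval-scaled predictors

y_interval = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=2.0, size=100)
y_dichotomous = (y_interval > np.median(y_interval)).astype(int)
y_nominal = rng.integers(0, 3, size=100)          # three unordered categories

linear = sm.OLS(y_interval, X).fit()          # simple/multiple linear regression
logistic = sm.Logit(y_dichotomous, X).fit()   # logistic regression
multinomial = sm.MNLogit(y_nominal, X).fit()  # multinomial regression

print(linear.params, logistic.params, multinomial.params, sep="\n\n")
```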

When selecting the model for the analysis, an important consideration is model fitting.  Adding independent variables to a linear regression model will always increase (or at least never decrease) the explained variance of the model, typically expressed as R².  However, adding too many variables can overfit the model, which reduces its generalizability.  Occam’s razor describes the problem well: a simple model is usually preferable to a more complex one.  Statistically, if a model includes a large number of variables, some of them will appear statistically significant due to chance alone.
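
As a quick demonstration of this point, the sketch below (assuming Python with NumPy and statsmodels, with simulated data) refits the same outcome while adding purely irrelevant predictors; R² never goes down, but adjusted R², which penalizes extra variables, can:

```python
# Simulated demonstration: adding pure-noise predictors never lowers R-squared,
# while adjusted R-squared penalizes the extra, irrelevant variables.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 60
x_real = rng.normal(size=(n, 1))                # the one predictor that matters
y = 2.0 * x_real[:, 0] + rng.normal(size=n)     # outcome driven only by x_real
all_noise = rng.normal(size=(n, 20))            # pool of irrelevant predictors

for k_noise in (0, 5, 20):
    X = sm.add_constant(np.hstack([x_real, all_noise[:, :k_noise]]))
    res = sm.OLS(y, X).fit()
    print(f"{k_noise:2d} noise predictors: R2 = {res.rsquared:.3f}, "
          f"adjusted R2 = {res.rsquared_adj:.3f}")
```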

