# Regression

A **regression** assesses whether predictor variables account for variability in a dependent variable. This page will describe regression analysis examples, regression assumptions, the evaluation of the R-square (coefficient of determination), the F-test, the interpretation of the beta coefficient(s), and the regression equation.


**Questions answered by a regression analysis:**

Do age and gender predict gun regulation attitudes?

Do the five facets of mindfulness influence peace of mind scores?

**Regression Example:**

There are numerous applications of regression. For example, suppose a school has two types of reading programs (a traditional program and a novel program) and would like to examine whether program type influences (predicts) test scores.

**Assumptions:**

First, this analysis is very sensitive to outliers. The researcher should first standardize the scores and check whether any value falls more than 3.29 standard deviations from the mean (i.e., |z| > 3.29); if so, consider deleting that score.
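The outlier screen above can be sketched in a few lines of Python. The scores below are made up for illustration; the last value is deliberately extreme.

```python
# Hypothetical scores; a minimal sketch of the |z| > 3.29 outlier screen.
scores = [12, 15, 14, 13, 16, 15, 14, 13, 12, 16,
          15, 14, 13, 15, 14, 16, 12, 13, 15, 200]

n = len(scores)
mean = sum(scores) / n
sd = (sum((x - mean) ** 2 for x in scores) / (n - 1)) ** 0.5  # sample SD

# Flag any score whose standardized value exceeds 3.29 in absolute value.
outliers = [x for x in scores if abs((x - mean) / sd) > 3.29]
```

Note that with very small samples a single extreme value inflates the standard deviation so much that no score can reach |z| > 3.29, which is one reason the cutoff is usually applied to reasonably sized data sets.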

Second, the main assumptions of regression are linearity and constant variance. To assess these assumptions, the researcher should plot the standardized residuals versus the predicted values. If the plot shows random scatter (homoscedasticity), the assumptions are met. However, if there is a curvilinear shape (e.g., a U-shape), linearity is not met; if the scatter has a cone shape, the constant variance assumption is not met.
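The quantities that go into that diagnostic plot can be computed directly. This is a minimal sketch with made-up data; in practice the standardized residuals would be plotted against the predicted values with a graphing library.

```python
# Made-up illustrative data for a one-predictor regression.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2, 8.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
      / sum((xi - mx) ** 2 for xi in x))          # slope
b0 = my - b1 * mx                                 # intercept

pred = [b0 + b1 * xi for xi in x]                 # predicted values
resid = [yi - p for yi, p in zip(y, pred)]        # raw residuals

# Standardize the residuals; plotting std_resid against pred should show
# random scatter if linearity and constant variance hold.
sd = (sum(r ** 2 for r in resid) / (n - 2)) ** 0.5
std_resid = [r / sd for r in resid]
```

A U-shape in that plot would suggest adding a curvilinear term; a cone shape would suggest a remedy for non-constant variance.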

**F-test**

When the regression is conducted, an *F*-value, along with the significance level of that *F*-value, is part of the output. If the *F*-value is statistically significant (typically *p* < .05), this signifies that the model using the predictors did a good job of predicting the outcome variable and that there is a significant relationship between the set of predictors and the dependent variable.
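For a one-predictor regression, the *F*-value is the regression mean square divided by the residual mean square. A minimal sketch with made-up data:

```python
# Made-up illustrative data for a one-predictor regression.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2, 8.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

b1 = sxy / sxx                # slope
b0 = my - b1 * mx             # intercept
pred = [b0 + b1 * xi for xi in x]

ssr = sum((p - my) ** 2 for p in pred)               # regression sum of squares
sse = sum((yi - p) ** 2 for yi, p in zip(y, pred))   # residual sum of squares

f = (ssr / 1) / (sse / (n - 2))  # df_model = 1, df_error = n - 2
# The p-value for f comes from the F(1, n - 2) distribution; statistical
# software reports it alongside the F-value.
```

Here the nearly perfect linear trend in the made-up data yields a very large *F*-value, so the model would be judged significant.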

**Evaluation of the R-Square**

When the regression is conducted, an R^{2} statistic (coefficient of determination) is presented. This value is the multivariate equivalent of the bivariate correlation coefficient. The R^{2} can be interpreted as the proportion of the variability in the outcome variable that is accounted for by the predictor variable(s).
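With a single predictor, R^{2} is simply the squared Pearson correlation between the predictor and the outcome. A minimal sketch with made-up data:

```python
# Made-up illustrative data: R-squared for one predictor equals the squared
# Pearson correlation between x and y.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2, 8.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
syy = sum((yi - my) ** 2 for yi in y)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

r2 = sxy ** 2 / (sxx * syy)  # coefficient of determination
```

For these made-up values r2 is about .995, i.e., the predictor accounts for roughly 99.5% of the variability in the outcome.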

**Evaluation of the Adjusted R-Square**

The adjusted R^{2} corrects the R^{2} for the number of predictors relative to the sample size; it estimates the R^{2} the researcher would obtain if this model were used on a new data set.
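The standard adjustment penalizes R^{2} for the number of predictors (k) relative to the sample size (n); the example values passed in below are made up for illustration.

```python
def adjusted_r2(r2, n, k):
    """Adjusted R-squared: penalizes R-squared for model complexity.

    r2: unadjusted R-squared, n: sample size, k: number of predictors.
    """
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Illustration with assumed values: R-squared of .50 from 100 cases and
# 5 predictors shrinks slightly after adjustment.
adj = adjusted_r2(0.50, 100, 5)
```

The adjusted value is always at or below the unadjusted R^{2}, and the gap grows as predictors are added without a compensating gain in fit.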

**Beta Coefficients**

After the evaluation of the *F*-value and R^{2}, it is important to evaluate the regression beta coefficients: unstandardized and standardized. The beta coefficients can be negative or positive, and each has a *t*-value and an associated significance level. Think of the regression beta coefficient as the slope of a line: the *t*-value and its significance assess the extent to which the slope differs significantly from zero, i.e., from a line lying flat along the X-axis. If the beta coefficient is not statistically significant (i.e., the *t*-value is not significant), no statistical significance can be interpreted from that predictor. If the beta coefficient is significant, examine the sign of the beta. If the regression beta coefficient is positive, the interpretation is that for every 1-unit increase in the predictor variable, the dependent variable will increase by the unstandardized beta coefficient value. For example, if the beta coefficient is .80 and statistically significant, then for each 1-unit increase in the predictor variable, the outcome variable will increase by .80 units.
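For a one-predictor regression, the *t*-value for the slope is the coefficient divided by its standard error. A minimal sketch with made-up data:

```python
# Made-up illustrative data for a one-predictor regression.
x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 4.2, 4.8, 6.1, 6.9, 8.2, 8.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

b1 = sxy / sxx                          # unstandardized beta (slope)
b0 = my - b1 * mx                       # intercept
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se_b1 = (sse / (n - 2) / sxx) ** 0.5    # standard error of the slope

t = b1 / se_b1
# A large |t| (judged against the t distribution with n - 2 df) indicates
# the slope differs significantly from a flat, zero-slope line.
```

Because the made-up data track a straight line closely, the slope's standard error is tiny and the *t*-value is very large.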

**Equation**

Once the beta coefficient is determined, a regression equation can be written. Using the example and beta coefficient above, the equation can be written as follows:

Y = .80X + c, where Y is the outcome variable, X is the predictor variable, .80 is the beta coefficient, and c is a constant (the intercept).
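The fitted equation can then be used to generate predictions. In this sketch the slope .80 comes from the example above, while the intercept value of 2.0 is an assumed placeholder for illustration.

```python
def predict(x, b=0.80, c=2.0):
    """Predicted outcome from the fitted equation Y = bX + c.

    b is the example beta coefficient (.80); c = 2.0 is an assumed
    intercept, not a value from the text.
    """
    return b * x + c
```

For instance, with these values a predictor score of 5 yields a predicted outcome of 6.0, and each additional unit of X adds .80 to the prediction.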


Related Pages:

Linear Regression

Multiple Linear Regression

Logistic Regression

Ordinal Regression