**Chapter Summary**

The relationship between two measured variables is called a **correlation**. Correlations indicate that some of the variation in one variable is shared with the variation in another. These relationships can be **positive**, meaning that as one variable increases, the other also increases, or **negative**, meaning that as one variable increases, the other decreases. Correlations range from –1.0 to 1.0, in which –1.0 indicates a perfect negative relationship between variables, 1.0 indicates a perfect positive relationship, and 0.0 indicates no relationship. The further a correlation's value is from 0.0, the stronger the relationship.

A correlation of 0.0 suggests that there is no relationship between the variables. However, correlation measures only linear association: to the extent that the relationship between two variables is **curvilinear**, we may end up with a 0.0 correlation even though the variables are in fact strongly related.

Anytime researchers find a significant correlation, there are three possible interpretations: (1) variable A causes variable B, (2) variable B causes variable A, or (3) a third variable C causes both A and B. Alternatively, the researchers may have committed a Type I error. It may be tempting to discuss a correlation in terms of causality, but without experimentation, no causal claim about the relationship between the variables is warranted.

Correlation is a statistic often used in psychology for non-experimental studies. Researchers also use correlation to determine the reliability of their measures, correlating responses at one time with responses at another. Twin studies, where random assignment is impossible (i.e., you cannot assign someone to be a twin), often rely on correlational methods. In addition, more advanced statistical techniques such as path analysis and factor analysis rely upon correlation.

Regression is a statistical approach that, like correlation, considers the relationship between variables, but unlike correlation it designates independent (predictor) and dependent (outcome) variables. **Multiple regression** generally uses continuous independent variables to predict a continuous dependent variable. When researchers instead want to predict a **dichotomous** or categorical outcome, they use a specialized form of regression known as **logistic regression**. All regression analyses assume a lack of **collinearity** among the **predictor variables**; that is, the independent variables should not be strongly correlated with one another.

**Additional Online Resources**

Correlation activity from the University of Idaho: http://www.webpages.uidaho.edu/psyc320/lessons/lesson02/lesson2-1_activity.htm

“Correlation still isn’t causation.” By Scott O. Lilienfeld: http://www.psychologicalscience.org/index.php/publications/observer/2006/february-06/correlation-still-isnt-causation.html

Third variable issues: http://researchnews.osu.edu/archive/nitelite.htm

Correlation humour: http://xkcd.com/552/

**Flashcards**

Test your knowledge of the keywords and definitions in the chapter.

## Interactive Quiz for Chapter 13

**Instructions:** For each question, click on the radio button beside your answer. When you have completed the entire quiz, click the “Submit my answers” button at the bottom of the page to receive your results.