From Wikiversity

Correlation (co-relation) refers to the degree of relationship (or dependency) between two variables.

Linear correlation refers to straight-line relationships between two variables.

A correlation can range between -1 (perfect negative relationship) and +1 (perfect positive relationship), with 0 indicating no straight-line relationship.

The earliest known use of the term correlation dates to the late 19th century[1].


The degree of linear relationship between two variables can be represented by a Venn diagram: perfectly overlapping circles indicate a correlation of 1, and non-overlapping circles a correlation of 0.

When we ask questions such as "Is X related to Y?", "Does X predict Y?", and "Does X account for Y?", we are interested in measuring and better understanding the relationship between two variables.

Correlation measures the extent to which variables:

  1. covary
  2. depend on one another
  3. predict one another

The extent of correlation between two variables is, by convention, denoted r, and the correlation between variable X and variable Y is indicated by r_XY.

Correlations are standardised to vary between -1 and +1, with 0 representing no relationship, -1 a perfect negative relationship, and +1 a perfect positive relationship.
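This standardised coefficient can be computed directly from the definition of Pearson's product-moment correlation: the covariance of the two variables divided by the product of their standard deviations. A minimal sketch (the helper name pearson_r is our own):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    # Covariance (numerator) and product of standard deviations (denominator),
    # with the common 1/n factors cancelled.
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    var_x = sum((a - mean_x) ** 2 for a in x)
    var_y = sum((b - mean_y) ** 2 for b in y)
    return cov / sqrt(var_x * var_y)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))   # perfect positive: 1.0
print(pearson_r([1, 2, 3, 4], [8, 6, 4, 2]))   # perfect negative: -1.0
```

Because the covariance is divided by both standard deviations, the result is unit-free and always falls in the interval [-1, +1].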

A variety of bivariate correlational statistics are available; the choice depends on the variables' level of measurement.

Correlational analyses should be accompanied by appropriate bivariate graphs, such as scatterplots.

The world is made of covariation

Bees and flowers tend to co-occur.

Responses which vary can be measured as a variable (i.e., responses are distributed across a range).

Responses to two or more variables may covary; these variables share some variation. When the value of one variable is high, the value of the other variable tends to be high (positive correlation) or low (negative correlation).

If you look around, you may notice that the world is made of covariation! e.g.,

  • pollen count is positively correlated with bee activity
  • rainfall is positively correlated with amount of vegetation
  • hours of study is positively correlated with test performance
  • number of fire trucks attending a fire is correlated with cost of repairs for the fire[2]
  • siblings' IQs are positively correlated with each other
  • perceived air temperature is negatively correlated with amount of clothing worn

The more you look, the more you'll see that there are many predictable patterns of co-occurrence between phenomena (i.e., things tend to occur together).



The independent variable (IV; the predictor) is placed on the X axis and the dependent variable (DV) is placed on the Y axis. Each case is plotted according to its X and Y values.

Example scatterplot: r = .76

Visual inspection of scatterplots is essential


It is unwise to rely solely on the correlation coefficient to characterise the relationship between variables without also examining a visualisation of the data, such as a scatterplot.

For example, the linear (straight-line) correlation in each of these four scatterplots is .82, yet what the data indicate about the relationship between the variables is very different in each case.

Four sets of data with the same correlation of 0.816
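These four datasets are Anscombe's classic quartet. As a quick numerical check (a sketch reusing a hand-rolled Pearson formula), computing r for each dataset gives the same value to three decimal places, even though the scatterplots look nothing alike:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mean_x) ** 2 for a in x) *
                      sum((b - mean_y) ** 2 for b in y))

# Anscombe's quartet: four (x, y) datasets with near-identical summary statistics.
x123 = [10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5]
y1 = [8.04, 6.95, 7.58, 8.81, 8.33, 9.96, 7.24, 4.26, 10.84, 4.82, 5.68]
y2 = [9.14, 8.14, 8.74, 8.77, 9.26, 8.10, 6.13, 3.10, 9.13, 7.26, 4.74]
y3 = [7.46, 6.77, 12.74, 7.11, 7.81, 8.84, 6.08, 5.39, 8.15, 6.42, 5.73]
x4 = [8, 8, 8, 8, 8, 8, 8, 19, 8, 8, 8]
y4 = [6.58, 5.76, 7.71, 8.84, 8.47, 7.04, 5.25, 12.50, 5.56, 7.91, 6.89]

for x, y in [(x123, y1), (x123, y2), (x123, y3), (x4, y4)]:
    print(round(pearson_r(x, y), 3))   # 0.816 for every dataset
```

The identical coefficients are exactly why visual inspection matters: the summary statistic cannot distinguish a clean linear trend from a curve, an outlier, or a leverage point.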


Scatterplot showing homoscedasticity.
Scatterplot showing heteroscedasticity.

If the data are normally distributed, then scatterplots should be homoscedastic (even spread about the line of best fit).

If data are not normally distributed (e.g., skewed), then the bivariate distribution may be heteroscedastic (uneven spread about the line of best fit). This violates the assumption of homoscedasticity for correlation.
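One way to see heteroscedasticity numerically (a simulated sketch; the data are invented for illustration) is to generate y values whose noise grows with x, then compare the spread of the residuals at low versus high x:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Simulate y = x + noise whose spread grows with x (heteroscedastic data).
xs = [i / 10 for i in range(1, 201)]
ys = [x + random.gauss(0, 0.5 * x) for x in xs]

# Residuals about the true line y = x.
residuals = [y - x for x, y in zip(xs, ys)]

half = len(xs) // 2
low_spread = statistics.pstdev(residuals[:half])    # spread at small x
high_spread = statistics.pstdev(residuals[half:])   # spread at large x
print(low_spread < high_spread)  # uneven spread: larger at high x
```

In a homoscedastic dataset the two spreads would be roughly equal; here the residual spread at high x is clearly larger, which is what the fan-shaped scatterplot shows visually.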

For more information, see Homoscedasticity and Heteroscedasticity (Wikipedia).

Correlation does not equal causation


Correlation does not prove causation, although it may be consistent with causation. A relationship between two variables may instead be caused by a third variable.

For more information, see spurious correlations and Correlation does not imply causation (Wikipedia).

Range restriction

Pearson/Spearman correlation coefficients between X and Y are shown when the two variables' ranges are unrestricted, and when the range of X is restricted to the interval (0,1).

For more information, see the effect of range restrictions (Howell, 2009) and restricted range (Lane, n.d.).

For a practical tutorial, see outliers and restricted range.
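The attenuating effect of range restriction can be illustrated with simulated data (a sketch; the numbers are invented): correlate the full sample, then only the cases whose x falls within a narrow band.

```python
import random
from math import sqrt

random.seed(1)  # reproducible illustration

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length sequences."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    return cov / sqrt(sum((a - mean_x) ** 2 for a in x) *
                      sum((b - mean_y) ** 2 for b in y))

# Linear relationship plus noise over the full range of x.
xs = [random.uniform(0, 10) for _ in range(500)]
ys = [x + random.gauss(0, 2) for x in xs]
r_full = pearson_r(xs, ys)

# Restrict the sample to a narrow band of x (e.g., 4 to 6).
pairs = [(x, y) for x, y in zip(xs, ys) if 4 <= x <= 6]
r_restricted = pearson_r(*zip(*pairs))

print(r_full > r_restricted)  # restriction attenuates the correlation
```

Within the restricted band, the variance of x shrinks while the noise in y stays the same, so the signal-to-noise ratio (and hence r) drops, even though the underlying relationship is unchanged.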

Coefficient of determination


When a correlation coefficient (r) is squared (r²), this gives the coefficient of determination, which is the proportion of variance shared between the two variables (multiply by 100 to express it as a percentage).
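For example, for the earlier scatterplot with r = .76, a small arithmetic sketch:

```python
r = 0.76                    # e.g., the correlation from the earlier scatterplot
r_squared = r ** 2          # coefficient of determination
print(round(r_squared, 3))              # 0.578
print(round(r_squared * 100, 1))        # 57.8 (% of variance shared)
```

So a correlation of .76, which sounds strong, still means that only about 58% of the variance is shared; the remaining 42% is unexplained.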

Interactive activity


Correlation guess

Test yourself with a pre-quiz to see what you already know: Introductory quiz
