Intelligence Quotient

Completion status: this resource is a stub, so not much has been done yet.
Figure: The IQs of a large enough population can be modeled with a normal distribution.

An intelligence quotient (IQ) is a total score derived from several standardized tests designed to assess human intelligence. The abbreviation "IQ" was coined by the psychologist William Stern from the German term Intelligenzquotient, his name for a scoring method for intelligence tests that he advocated in a 1912 book while at the University of Breslau. Historically, IQ was a score obtained by dividing a person's mental age, as measured by an intelligence test, by the person's chronological age, both expressed in years and months; the resulting quotient was then multiplied by 100 to yield the IQ score. In modern tests, the median raw score of the norming sample is defined as IQ 100, and each standard deviation (SD) above or below the median corresponds to 15 IQ points more or less, although this was not always so historically. By this definition, approximately two-thirds of the population scores between IQ 85 and IQ 115.[1] About 2.5 percent of the population scores above 130, and 2.5 percent below 70.
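
Because the modern deviation IQ is defined directly on a normal distribution with mean 100 and SD 15, the proportions quoted above can be checked with a short calculation. The following is a minimal sketch in Python; the function names are illustrative rather than part of any testing instrument, and the normal CDF is built from the standard library's erf:

    from math import erf, sqrt

    def ratio_iq(mental_age_months, chronological_age_months):
        # Historical "ratio IQ": mental age divided by chronological age, times 100.
        return mental_age_months / chronological_age_months * 100

    def normal_cdf(x, mean=100.0, sd=15.0):
        # P(X <= x) for a normal distribution; defaults match the IQ norm.
        return 0.5 * (1 + erf((x - mean) / (sd * sqrt(2))))

    # A 10-year-old performing at a typical 12-year-old's level (ages in months):
    print(ratio_iq(12 * 12, 10 * 12))        # 120.0

    # Share of the population within one SD of the mean (IQ 85 to 115):
    print(normal_cdf(115) - normal_cdf(85))  # ~0.683, i.e. about two-thirds

    # Tail shares above 130 and below 70 (two SDs out):
    print(1 - normal_cdf(130))               # ~0.023
    print(normal_cdf(70))                    # ~0.023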

Scores from intelligence tests are estimates of intelligence. Unlike, for example, distance and mass, intelligence cannot be measured concretely, given the abstract nature of the concept. IQ scores have been shown to be associated with factors such as morbidity and mortality, parental social status and, to a substantial degree, biological parental IQ. Although the heritability of IQ has been investigated for nearly a century, the significance of heritability estimates and the mechanisms of inheritance are still debated.

IQ scores are used for educational placement, assessment of intellectual disability, and evaluation of job applicants. Even when students improve their scores on standardized tests, however, they do not always improve their underlying cognitive abilities, such as memory, attention, and processing speed. In research contexts, IQ scores have been studied as predictors of job performance and income. They are also used to study the distribution of psychometric intelligence in populations and its correlations with other variables. Raw scores on IQ tests for many populations have been rising at an average rate equivalent to three IQ points per decade since the early 20th century, a phenomenon called the Flynn effect; a simple renorming calculation is sketched below. Investigation of the different patterns of increase in subtest scores can also inform current research on human intelligence.
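
The Flynn effect is a statement about test norms rather than individuals: if raw performance rises by about three IQ points per decade, the same raw performance earns a lower IQ when judged against newer norms. The sketch below illustrates this with a hypothetical renormed_iq helper, assuming a constant rate; that is a simplification, since observed gains vary by country, era, and subtest:

    # Illustrative only: assumes a constant Flynn-effect rate, which is a
    # simplification of the varying gains reported in the literature.
    FLYNN_POINTS_PER_DECADE = 3.0

    def renormed_iq(old_iq, norm_year, test_year):
        # IQ implied by the same raw performance when judged against norms
        # from norm_year instead of the original norms from test_year.
        decades = (norm_year - test_year) / 10
        return old_iq - FLYNN_POINTS_PER_DECADE * decades

    # A raw performance scored as IQ 100 against 1950 norms would score
    # roughly 79 against 2020 norms under this constant-rate assumption:
    print(renormed_iq(100, norm_year=2020, test_year=1950))  # 79.0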

References

  1. "Bell Curve of IQ Scores".