124 is about average.
But how are your spelling, grammar, and communication skills?
2007-03-16 16:17:25
·
answer #1
·
answered by GeneL 7
·
0⤊
1⤋
Have your IQ tested professionally. The online tests are a poor substitute.
100-120 is considered average (regardless of age) on the scales used by the Wechsler and Stanford-Binet tests (the most commonly used tests).
Given your early acceptance into college and your reasonably high GPA, your results from the online test are probably low.
124 would be a good score for your age but your accomplishments suggest your IQ would be higher.
2007-03-18 23:04:45
·
answer #2
·
answered by ophelliaz 4
·
0⤊
0⤋
IQ tests are scored so that the average score is always 100. Online tests, though, tend to give higher scores.
IQ tests don't have anything to do with age (after a certain point; Piaget, who studied intelligence in children, would say you have to reach the formal operational stage first to perform at your best) and don't really measure intelligence, but rather your ability to take an IQ test.
A score of 124 means you are very good at taking IQ tests :-)
2007-03-16 23:02:14
·
answer #3
·
answered by kathy 4
·
0⤊
0⤋
Your IQ has NOTHING to do with your age. If you took this test at age 5, you would have gotten the same score. People have the same aptitude their entire lives, unless you do specific exercises, and even that would only raise your IQ one or two points. An online test giving you this score may just be a lucky fluke, but it is a good score. The average score is 100. My IQ is somewhere around 126 (I don't know exactly; I took that test a long time ago). Good luck with college and way to go, smarty!
2007-03-16 16:25:32
·
answer #4
·
answered by Squeegee Beckingheim :-) 5
·
0⤊
0⤋
You are in the group with 6.7 percent of the population. Your score is in the 120-129 range, which is considered superior. This means your score was better than about 94% of the population tested. You are capable of great things; don't waste it. I feel your 3.3 average should be higher based on your ability.
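For what it's worth, the percentages in this answer can be checked against a normal curve, assuming the common mean-100, standard-deviation-15 scale (the answer doesn't say which scale the test used):

```python
from statistics import NormalDist

# Assumes the common mean-100, SD-15 IQ scale.
iq = NormalDist(mu=100, sigma=15)

percentile_124 = iq.cdf(124)               # fraction scoring below 124
share_120_129 = iq.cdf(130) - iq.cdf(120)  # fraction in the 120-129 band

print(f"124 beats about {percentile_124:.1%} of the population")
print(f"about {share_120_129:.1%} fall in the 120-129 range")
```

This gives roughly 94.5% and 6.8%, close to the figures quoted.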
2007-03-16 16:56:45
·
answer #5
·
answered by BJ 1
·
0⤊
0⤋
I don't understand. If that's the case, he's considered "superior intelligence"? 50% of people only have an IQ of 90-109. I hate to break it to you, but 124 is a pretty high IQ.
2016-12-18 15:44:30
·
answer #6
·
answered by hirschfeld 4
·
0⤊
0⤋
It's pretty good. It puts you at the 95th percentile of the population at large. No one test gives a really good result; only the average of a number of tests over time is really believable.
2007-03-16 16:24:08
·
answer #7
·
answered by Anonymous
·
0⤊
0⤋
Online tests are not accurate at all. Your true score may be ±30 points.
2007-03-16 16:16:47
·
answer #8
·
answered by Anonymous
·
0⤊
0⤋
124 is above average. Go to highiqsociety and you can be a member!
2007-03-16 16:17:46
·
answer #9
·
answered by SpeedDemon 2
·
0⤊
0⤋
my IQ is 150
An intelligence quotient or IQ is a score derived from one of several different standardized tests attempting to measure intelligence. IQ tests are generally designed and used because they are found to be predictive of later intellectual achievement, such as educational achievement. IQ also correlates with job performance and income, although in all cases other factors explain most of the variance. Recent work has demonstrated links between IQ and health.
History
In 1905 the French psychologist Alfred Binet published the first modern intelligence test, the Binet-Simon intelligence scale. His principal goal was to identify students who needed special help in coping with the school curriculum. Along with his collaborator Theodore Simon, Binet published revisions of his intelligence scale in 1908 and 1911, the last appearing just before his untimely death.
In 1912, the abbreviation "I.Q." for "intelligence quotient," a translation of the German Intelligenz-Quotient, was coined by the German psychologist William Stern. A further refinement of the Binet-Simon scale was published in 1916 by Lewis M. Terman, from Stanford University, who incorporated Stern's proposal that an individual's intelligence level be measured as an intelligence quotient (I.Q.). Terman's test, which he named the Stanford-Binet Intelligence Scale, formed the basis for one of the modern intelligence tests still commonly used today.
Originally, IQ was calculated with the formula IQ = 100 × (mental age / chronological age). A 10-year-old who scored as high as the average 13-year-old, for example, would have an IQ of 130 (100 × 13/10).
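A minimal sketch of that ratio formula, using the 10-year-old example from the text:

```python
def ratio_iq(mental_age, chronological_age):
    """Original Binet-style ratio IQ: 100 * mental age / chronological age."""
    return 100 * mental_age / chronological_age

# A 10-year-old who scores like the average 13-year-old:
print(ratio_iq(13, 10))  # 130.0
```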
In 1939 David Wechsler published the first intelligence test explicitly designed for an adult population, the Wechsler Adult Intelligence Scale, or WAIS. After publication of the WAIS, Wechsler extended his scale downward to create the Wechsler Intelligence Scale for Children, or WISC, which is still in common usage. The Wechsler scales contained separate subscores for verbal and performance IQ, making them less dependent on overall verbal ability than early versions of the Stanford-Binet scale, and the WAIS was the first intelligence scale to base scores on a standardized normal distribution rather than an age-based quotient.
Because age-based quotients only worked for children, they were replaced by a projection of the measured rank onto a Gaussian bell curve with a center value (average IQ) of 100 and a standard deviation of 15 or, occasionally, 16. Thus the modern IQ is a mathematical transformation of a raw score (based on the rank of that score in a normalization sample; see quantile, percentile, percentile rank), which is the primary result of an IQ test. To differentiate the two scores, modern scores are sometimes referred to as "deviation IQ", while the age-specific scores are referred to as "ratio IQ". While the two methodologies yield similar results near the middle of the bell curve, the older ratio IQs yielded far higher scores for the intellectually gifted: Marilyn vos Savant appeared in the Guinness Book of World Records for obtaining a ratio IQ of 228. Such stratospheric numbers are not possible on the deviation IQ, because a perfectly Gaussian curve defines the highest possible IQ within the United States (population 300 million) as between five and six standard deviations above the U.S. mean of 100. In addition, IQ tests like the Wechsler were not intended to reliably discriminate much beyond IQ 130, and simply do not contain enough exceptionally difficult items for one to score in the "stratosphere".
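The deviation-IQ transformation described above can be sketched as follows, assuming the mean-100, SD-15 convention; the function name is illustrative, not taken from any actual test's scoring procedure:

```python
from statistics import NormalDist

# Deviation IQ: project a percentile rank from the normalization
# sample onto a Gaussian with mean 100 and standard deviation 15.
scale = NormalDist(mu=100, sigma=15)

def deviation_iq(percentile_rank):
    """Map a rank in (0, 1) to an IQ score on the mean-100, SD-15 scale."""
    return scale.inv_cdf(percentile_rank)

print(round(deviation_iq(0.50)))  # median of the sample -> 100
print(round(deviation_iq(0.98)))  # 98th percentile -> 131
```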
Since the publication of the WAIS, almost all intelligence scales have adopted the normal distribution method of scoring. The use of the normal distribution scoring method makes the term "intelligence quotient" an inaccurate description of the intelligence measurement, but I.Q. still enjoys colloquial usage, and is used to describe all of the intelligence scales currently in use.
How an IQ test works
IQ tests come in many forms; some tests use a single type of item or question, while others use several different subtests. Most tests yield both an overall score and individual subtest scores.
A typical IQ test requires the test subject to solve a fair number of problems in a set time under supervision. Most IQ tests include items from various domains, such as short-term memory, verbal knowledge, spatial visualization, and perceptual speed. Some tests have a total time limit, others have a time limit for each group of problems, and there are a few untimed, unsupervised tests, typically geared to measuring high intelligence.
To set the scale for an IQ test, a representative sample of the population for which the test is intended is gathered. IQ tests are calibrated in such a way as to yield a normal distribution, or "bell curve."
Each IQ test, however, is designed and valid only for a certain IQ range. Because so few people score in the extreme ranges, IQ tests usually cannot accurately measure very low and very high IQs.
Different IQ tests use standard deviations with different numbers of points. Thus, when an IQ score is stated, the standard deviation used should also be stated. A result of 124 on a test with a 24-point standard deviation corresponds to a score of 115 on a test with a 15-point standard deviation.[1]
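That conversion is just a rescaling of standard-deviation units around the shared mean of 100; a hypothetical helper (the function name is mine, not from the cited source):

```python
def convert_iq(score, sd_from, sd_to, mean=100):
    """Re-express an IQ score from one test's SD scale on another's."""
    z = (score - mean) / sd_from  # standard deviations above the mean
    return mean + z * sd_to

print(convert_iq(124, sd_from=24, sd_to=15))  # 115.0, as in the example
```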
Where an individual has subtest scores that do not correlate with each other, there is good reason to look for a learning disability or another cause for the lack of correlation. Subtests are chosen for inclusion because they have shown the ability to predict later difficulties in learning.
On average, IQ scores are stable over a person's lifetime. The mean for ages 17 and 18 was correlated r=.86 with the mean for ages 5, 6 and 7, r=.96 with the mean for ages 11, 12 and 13. Nevertheless, IQ scores do change over time. In the same study the average change between age 12 and age 17 was 7.1 IQ points; some individuals changed as much as 18 points.[2]
The average IQ scores for many populations were rising rapidly during the 20th century, a phenomenon called the Flynn effect. It is not known whether these changes in scores reflect real changes in intellectual abilities (if not, this raises questions about what IQ tests actually measure).
2007-03-16 16:12:56
·
answer #10
·
answered by Anonymous
·
0⤊
1⤋