
Note: "10 IQ points" is not the answer

2007-01-18 07:31:54 · 13 answers · asked by Anonymous in Social Science Psychology

13 answers

There's not as big a difference as you might think. The higher up in the IQ range you get, the less difference an additional point makes, rather like the concept of diminishing marginal utility. A 10-point increase for someone with an IQ of 80 would mean significantly more than it would for someone with an IQ of 130.
It used to be the case that 140 was the beginning of the genius range. Most scales no longer use that label; instead, they class anything over 130 as superior intelligence. So there would be a little difference, but both individuals would be very intelligent, and the difference in intelligence would likely be hard to detect by mere observation.

2007-01-18 07:38:54 · answer #1 · answered by theeconomicsguy 5 · 0 1

Okay. There's a lot of misinformation here. But that's okay. We can work through it.

To determine what the differences between people with an IQ score of 140 and 130 are, you have to first ask yourself what those scores actually mean. Which - as you probably know - is not an undisputed subject. But let's get down to basics: at the very minimum level, the questions on IQ tests usually have to do with mathematics, certain linguistic abilities, and general kinds of knowledge. Thus we can quite concretely say that those ten points of difference minimally mean that one person currently has a slightly better computing, linguistic, and/or general knowledge ability.

This difference in ability may partly be due to innate talent, or it may be more due to learned skills. Or it may be due to external influences. We know that all these things can affect the outcomes of such tests. Likewise, all we can say is that at least one of those abilities is better, not all of them or even any particular one. So if you want to be definite about the difference, you have very little to be definite about.

Even less so if you start to extrapolate this information out into completely different areas involving anything but very basic skills. Given the kinds of questions that appear on IQ tests, it wouldn't be too hard to develop a computer that could generate truly amazing scores, yet I don't think anyone would actually credit such a device with any kind of real intelligence on that basis. And perhaps this is one of the weaknesses in all such tests - it's difficult even to define what you're talking about, much less measure it in any definitive way!

There are also relative considerations. Even assuming the complete validity of an IQ test, to someone of IQ 100 (which is to say, most people), the difference between 130 and 140 is probably not even noticeable... those other two guys are just 'really smart'. Likewise, all the intelligence in the world is of little use without material to operate on and the inclination to do something. So in many arenas education and motivation are of far more significance than any native intelligence.

So while there probably IS a difference and it IS measurable, its significance will vary greatly.

2007-01-18 08:34:36 · answer #2 · answered by Doctor Why 7 · 1 0

About 1%

2007-01-18 08:00:06 · answer #3 · answered by Anonymous · 0 0

Ok, to give you a correct answer in as few lines as possible:

Not that much.



The higher the IQ gets, the less an increase in points reflects greater ability. Someone with an IQ of 100 is far smarter than someone with an IQ of 90, whereas someone with an IQ of 150 is of almost the same intelligence as someone with an IQ of 140.

This is caused by the 'ceiling' effect on tests; that is, scores get distorted as they approach the maximum number of points on a test. Say the maximum score is 40/40: once test-takers get close to that (say 35/40), fewer and fewer people answer each remaining question correctly, so each additional correct answer produces an outsized jump in the IQ score.

This is what causes those high-range differences that essentially mean nothing. The raw scores (say, out of 40 points) are almost identical, yet the difference in IQs appears large.


Here's an example from the official norming of the Raven's APM.

25/36 = IQ 114
27/36 = IQ 118

Now...

34/36 = IQ 147
36/36 = IQ 157 ...a big difference for only two more questions correct.
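The nonlinearity shows up directly in those norming figures. A quick sketch, taking the quoted numbers as given (unverified):

```python
# IQ gained per extra raw point, using the Raven's APM norms
# quoted above (figures as stated in the answer, not verified).
norms = {25: 114, 27: 118, 34: 147, 36: 157}  # raw score -> IQ

mid_gain = (norms[27] - norms[25]) / (27 - 25)  # middle of the scale
top_gain = (norms[36] - norms[34]) / (36 - 34)  # near the ceiling

print(f"IQ per raw point, mid-scale:    {mid_gain}")  # 2.0
print(f"IQ per raw point, near ceiling: {top_gain}")  # 5.0
```

Near the ceiling, each extra correct answer is worth 2.5 times as many IQ points as it is mid-scale.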

2007-01-18 13:40:29 · answer #4 · answered by Anonymous · 0 0

The difference between an IQ of 130 and an IQ of 140 is greater than the difference between 100 and 110,
and the difference between 140 and 150 is greater than that between 130 and 140.
This is a feature of the bell curve: although IQ scores are normally distributed, equal score intervals correspond to much larger differences in rarity near the tails than in the center of the distribution.
Statistically, the mean IQ is 100 and the standard deviation is 15. An IQ of 130 gives a z of 2.00, which falls at the 97.72nd percentile; an IQ of 140 gives a z of 2.667, at the 99.62nd percentile.
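Those z-scores and percentiles can be checked with Python's standard library, assuming the usual mean-100, SD-15 norming:

```python
from statistics import NormalDist

MEAN, SD = 100, 15          # IQ is conventionally normed to mean 100, SD 15
iq = NormalDist(MEAN, SD)

for score in (130, 140):
    z = (score - MEAN) / SD          # standard deviations above the mean
    pct = iq.cdf(score) * 100        # percentile rank
    print(f"IQ {score}: z = {z:.2f}, percentile = {pct:.2f}")
# IQ 130: z = 2.00, percentile = 97.72
# IQ 140: z = 2.67, percentile = 99.62
```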
It's kinda like the expanding universe in the big bang theory! Kinda.
Disregard those answers suggesting there is not much difference between IQs of 130 and 140; again, the differences are greater in the tails of the bell curve.
Hope this helps, gl

Below is confirmation of what I put forth in my answer: that there are considerably greater differences at the tails of the distribution than nearer the middle. This link will take you to the website:
http://members.tripod.com/~gleigh/normal.htm

The Relation of the Normal Distribution and the Standard Deviation
In any normal distribution there is a fixed relationship between the distance from the mean, measured in standard deviations, and the proportion of cases falling within that range. Here is that relationship in tabular form:

Spread                        Proportion of cases
+ or - 1 standard deviation   68.26%
+ or - 2 standard deviations  95.44%
+ or - 3 standard deviations  99.74%



A child described as being at the 99th percentile is thus somewhere beyond about 2.33 standard deviations above the mean. A child above the 99.9th percentile is beyond about 3.09 standard deviations above the mean.

The numerical difference between 99 and 99.9 is relatively small, BUT the difference in ability and behavioural characteristics of two children on opposite borders of this range is very great.
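The percentile-to-standard-deviation conversion behind those figures can be sketched with Python's standard library:

```python
from statistics import NormalDist

std = NormalDist()  # standard normal distribution: mean 0, SD 1

# How many standard deviations above the mean each percentile sits.
for pct in (99.0, 99.9):
    z = std.inv_cdf(pct / 100)   # inverse CDF: percentile -> z-score
    print(f"{pct}th percentile -> z = {z:.2f}")
# 99.0th percentile -> z = 2.33
# 99.9th percentile -> z = 3.09
```

So the 0.9-percentile gap between 99 and 99.9 spans roughly three-quarters of a standard deviation, which is why it corresponds to such a large difference in ability.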

2007-01-18 07:47:50 · answer #5 · answered by James O only logical answer D 4 · 0 1

The difference in IQ points at the level you are asking about would not result in any substantive difference in normal day-to-day living.

People with IQs between 130 and 140 have the intellectual potential to work successfully in any field they desire.

However, I would say that a person with an IQ of 140 would have a better chance of making a significant and long-lasting contribution in their field of expertise.

At the highest levels of human accomplishment, I believe that every IQ point can make a difference.

2007-01-18 07:56:41 · answer #6 · answered by Anonymous · 1 0

It depends on the age. I had a 130 I.Q. when I was in kindergarten, and now I have a 140. Am I much smarter now than I was then? Of course: I know more and am better capable of solving problems.

I think, granted they are the same age, there is no real difference.

2007-01-18 07:35:49 · answer #7 · answered by ♥Princess♥ 4 · 0 0

er it's a big diff man, the 140 guy is like smart as hell man, he can see things the 130 can't and er the 140 have more mental problems

2007-01-18 07:35:21 · answer #8 · answered by MiKe Drazen 4 · 1 1

That's not too big a difference, so probably not very much; maybe they can solve things a bit faster?

2007-01-18 07:34:24 · answer #9 · answered by radiancia 6 · 0 0

The person with the 140 I.Q. may know more than the person with the 130 I.Q., but that's all it means. I.Q. scores don't measure intelligence, only what you know; they don't measure how you would apply that knowledge.

2007-01-18 07:40:15 · answer #10 · answered by AL IS ON VACATION AND HAS NO PIC 5 · 0 3
