First we should know how a thought arises. Scientists have not said much about the functions of the energy in our body, but spiritual masters have said a lot about it. The energy, which is the fundamental particle, keeps rotating in our body. There are millions of them in our body. A spreading wave arises due to this whirling motion, and because of this, magnetism is formed in our body. This is how we are able to hear, see, smell, feel & taste. This is bio-magnetism. The magnetism that is formed outside is universal magnetism. The spreading wave that passes through the brain is what we call thought. The energy has the power to record anything that we experience, so the waves are recorded, and they are at different frequencies. Everything is recorded and stored according to the different moods or situations. Thoughts arise at beta, alpha, theta & delta frequencies. Hope this information will help you.
2007-09-28 15:42:41 · answer #1 · answered by lalachi 4
I'm not smart enough to know this offhand, but this should help:
Any answer to this question should be taken with several grains of salt.
Digital computers and brains don't work the same way. For one thing, every memory location in a computer is created equal. You can move stuff from one location to another without losing any information. In the brain, on the other hand, certain cells specialize in certain jobs. While there is considerable plasticity (the ability to change what some part of the brain does, enabling the brain to recover from injury), there's nothing like the uniformity seen in a computer.
Secondly, processing and memory are completely separated in a computer; not so in the brain.
Finally, data in computers is digital, and not really susceptible to "noise". In the brain, there are continuous voltages.
With those caveats, let's look at numbers. The brain contains 10^11 neurons -- in other words, 100 giganeurons. Each one has synapses connecting it to up to 1000 other neurons. Many researchers believe that memories are stored as patterns of synapse strengths. If we suppose that the strength of each synapse can take on any of 256 values, then each synapse corresponds to a byte of memory. This gives a total of (very roughly) 100 terabytes for the brain.
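A back-of-the-envelope sketch of that arithmetic in Python, treating the answer's figures as assumptions rather than measurements:

```python
# Back-of-the-envelope version of the estimate above. The neuron count,
# synapses-per-neuron count, and 256 strength levels are the answer's
# assumptions, not measured facts.
neurons = 10**11                   # "100 giganeurons"
synapses_per_neuron = 1000
bytes_per_synapse = 1              # 256 levels = 8 bits = 1 byte

total_bytes = neurons * synapses_per_neuron * bytes_per_synapse
print(f"{total_bytes:.0e} bytes")               # 1e+14 bytes
print(f"{total_bytes / 10**12:.0f} terabytes")  # 100 terabytes
```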
For more info, see the book "Mind and Brain: Readings from Scientific American".
2007-09-28 16:52:22 · answer #2 · answered by Gogo 1
It's not an electronic device that can tell you in GBs or MBs.
It's not a sample problem for mathematics.
But to analyze your brain and its memory, just read a book -- a complete book -- and then try to recall the words.
How many words you can recall is the measure of your brain's memory. Hope this is a stupid answer, but the question is also like that!
2007-09-30 00:07:29 · answer #3 · answered by Myth 4
There is no exact memory capacity on record for the brain, but there are reports of people being shown 100 photographs for one minute each and still remembering them when asked later.
2007-09-28 16:26:15 · answer #4 · answered by Anonymous
The ability to retain and recall information about past events. That's the most succinct way I can describe it. It's a very complex ability. The human brain is still very much an enigma.
2007-09-28 15:09:44 · answer #5 · answered by Lady Geologist 7
Considering that the human brain works in analog and reacts to analog signals, the use of "bytes", which break down into "bits" (digital values of 1 and 0, or on and off), is irrelevant and will not work in this case.
2016-04-06 06:11:43 · answer #6 · answered by Anonymous
Today it is commonplace to compare the human brain to a computer, and the human mind to a program running on that computer. Once seen as just a poetic metaphor, this viewpoint is now supported by most philosophers of human consciousness and most researchers in artificial intelligence. If we take this view literally, then just as we can ask how many megabytes of RAM a PC has, we should be able to ask how many megabytes (or gigabytes, or terabytes, or whatever) of memory the human brain has.
Several approximations to this number have already appeared in the literature based on "hardware" considerations (though in the case of the human brain perhaps the term "wetware" is more appropriate). One estimate of 10^20 bits is actually an early estimate (by Von Neumann in The Computer and the Brain) of all the neural impulses conducted in the brain during a lifetime. This number is almost certainly larger than the true answer. Another method is to estimate the total number of synapses, and then presume that each synapse can hold a few bits. Estimates of the number of synapses have been made in the range from 10^13 to 10^15, with corresponding estimates of memory capacity.
A fundamental problem with these approaches is that they rely on rather poor estimates of the raw hardware in the system. The brain is highly redundant and not well understood: the mere fact that a great mass of synapses exists does not imply that they are in fact all contributing to memory capacity. This makes the work of Thomas K. Landauer very interesting, for he has entirely avoided this hardware guessing game by measuring the actual functional capacity of human memory directly (See "How Much Do People Remember? Some Estimates of the Quantity of Learned Information in Long-term Memory", in Cognitive Science 10, 477-493, 1986).
Landauer works at Bell Communications Research--closely affiliated with Bell Labs where the modern study of information theory was begun by C. E. Shannon to analyze the information carrying capacity of telephone lines (a subject of great interest to a telephone company). Landauer naturally used these tools by viewing human memory as a novel "telephone line" that carries information from the past to the future. The capacity of this "telephone line" can be determined by measuring the information that goes in and the information that comes out, and then applying the great power of modern information theory.
Landauer reviewed and quantitatively analyzed experiments by himself and others in which people were asked to read text, look at pictures, and hear words, short passages of music, sentences, and nonsense syllables. After delays ranging from minutes to days the subjects were tested to determine how much they had retained. The tests were quite sensitive--they did not merely ask "What do you remember?" but often used true/false or multiple choice questions, in which even a vague memory of the material would allow selection of the correct choice. Often, the differential abilities of a group that had been exposed to the material and another group that had not been exposed to the material were used. The difference in the scores between the two groups was used to estimate the amount actually remembered (to control for the number of correct answers an intelligent human could guess without ever having seen the material). Because experiments by many different experimenters were summarized and analyzed, the results of the analysis are fairly robust; they are insensitive to fine details or specific conditions of one or another experiment. Finally, the amount remembered was divided by the time allotted to memorization to determine the number of bits remembered per second.
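As a rough illustration only -- not Landauer's actual analysis -- here is the flavor of that bookkeeping in Python: treat each two-alternative test item as a binary channel, subtract the control group's score to correct for guessing, and divide by study time. Every number below is hypothetical.

```python
import math

def binary_entropy(p):
    # Entropy in bits of a binary outcome with probability p.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bits_per_item(accuracy):
    # Information transmitted per two-alternative question,
    # viewed as a binary channel: 1 - H2(accuracy).
    return 1.0 - binary_entropy(accuracy)

# Hypothetical scores: exposed group 75% correct,
# unexposed control group 50% correct (pure guessing).
exposed, control = 0.75, 0.50
retained = bits_per_item(exposed) - bits_per_item(control)
print(f"~{retained:.2f} bits retained per test item")   # ~0.19

# Dividing by the study time allotted per item gives a rate,
# which is the final step described above.
study_seconds_per_item = 0.1                            # hypothetical
print(f"~{retained / study_seconds_per_item:.1f} bits per second")  # ~1.9
```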
The remarkable result of this work was that human beings remembered very nearly two bits per second under all the experimental conditions. Visual, verbal, musical, or whatever--two bits per second. Continued over a lifetime, this rate of memorization would produce somewhat over 10^9 bits, or a few hundred megabytes.
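The lifetime extrapolation is easy to reproduce; the lifespan and waking-hours figures below are assumptions chosen for illustration:

```python
# Extrapolating ~2 bits/second over a lifetime. The 70-year lifespan
# and 16 waking hours per day are illustrative assumptions.
bits_per_second = 2
years, waking_hours_per_day = 70, 16
waking_seconds = years * 365 * waking_hours_per_day * 3600   # ~1.5e9 s

lifetime_bits = bits_per_second * waking_seconds
print(f"{lifetime_bits:.1e} bits")              # 2.9e+09 bits
print(f"{lifetime_bits / 8 / 10**6:.0f} MB")    # ~368 MB
# "Somewhat over 10^9 bits, or a few hundred megabytes."
```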
While this estimate is probably only accurate to within an order of magnitude, Landauer says "We need answers at this level of accuracy to think about such questions as: What sort of storage and retrieval capacities will computers need to mimic human performance? What sort of physical unit should we expect to constitute the elements of information storage in the brain: molecular parts, synaptic junctions, whole cells, or cell-circuits? What kinds of coding and storage methods are reasonable to postulate for the neural support of human capabilities? In modeling or mimicking human intelligence, what size of memory and what efficiencies of use should we imagine we are copying? How much would a robot need to know to match a person?"
What is interesting about Landauer's estimate is its small size. Perhaps more interesting is the trend--from Von Neumann's early and very high estimate, to the high estimates based on rough synapse counts, to a better supported and more modest estimate based on information theoretic considerations. While Landauer doesn't measure everything (he did not measure, for example, the bit rate in learning to ride a bicycle, nor does his estimate even consider the size of "working memory") his estimate of memory capacity suggests that the capabilities of the human brain are more approachable than we had thought. While this might come as a blow to our egos, it suggests that we could build a device with the skills and abilities of a human being with little more hardware than we now have--if only we knew the correct way to organize that hardware.
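To make that trend concrete, here is a sketch converting each estimate mentioned above to bytes. The 2-bits-per-synapse conversion and the ~2 x 10^9-bit Landauer lifetime figure are assumptions for the comparison:

```python
# The estimates discussed in this answer, converted to bytes.
estimates_in_bits = {
    "Von Neumann, lifetime impulses": 10**20,
    "Synapse count, low end (1e13 synapses x 2 bits)": 2 * 10**13,
    "Synapse count, high end (1e15 synapses x 2 bits)": 2 * 10**15,
    "Landauer, functional measurement": 2 * 10**9,
}
for name, bits in estimates_in_bits.items():
    print(f"{name}: {bits / 8:.1e} bytes")
# Von Neumann:        1.2e+19 bytes
# Synapse, low end:   2.5e+12 bytes (a few terabytes)
# Synapse, high end:  2.5e+14 bytes (hundreds of terabytes)
# Landauer:           2.5e+08 bytes (a few hundred megabytes)
```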
2007-09-28 21:59:02 · answer #7 · answered by Anonymous
Human memory is categorically better than a computer's, but not in terms of megabyte-like capacity; it's more like a kilobyte or so. However, it categorically involves both qualitative and quantitative storage.
2007-09-28 20:46:50 · answer #8 · answered by Anonymous
You cannot compare the mind's memory to a computer's.
But if we do compare, then the answer is that we can remember up to 1000 GB of data!
2007-09-28 18:21:45 · answer #9 · answered by D.A.M. 2
The brain is, by far, the most complex and mysterious organ in the human body. Composed of over 100 billion cells called neurons (sensory neurons and motor neurons), this amazing structure is the center from which all of our skills of higher reasoning originate -- creativity, learning, imagination, planning, and, perhaps most notable of all, our sense of identity. But how exactly does the physical brain make the transcendental leap to that much more esoteric concept, one's sense of self? How does the brain work -- and is it separate from what we might call the mind? These questions are still hotly debated in scientific, philosophic, religious, and cultural circles the world over, and the answers to them may well never be fully understood. Yet with the advent of modern neuroscience and psychology, much has come to be understood about the human brain.
A technical definition of the human brain begins, most simply, with the manner in which it is assembled. Each of its 100 billion neurons connects to 10,000 others, forging a grand total of somewhere between 100 and 1000 trillion connections strung together by 90 million meters of neural fibers. Yet all of this neural density weighs between three and four pounds, and is set inside a cranium no more than 1 1/2 liters in volume, and the cortex (the brain's rippled gray-matter surface, the center of all higher thought processes) is no greater in thickness and surface area than a formal dinner napkin.
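A quick sanity check on those figures (a sketch; the per-neuron connection count is the answer's round number, not a measurement):

```python
# Sanity-checking the connection count quoted above.
neurons = 100 * 10**9              # 100 billion neurons
connections_per_neuron = 10_000

total = neurons * connections_per_neuron
print(f"{total / 10**12:,.0f} trillion connections")   # 1,000 trillion
# This is the top of the quoted 100-1000 trillion range; the low end
# corresponds to roughly 1,000 connections per neuron.
```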
If peeled apart, that cortical "dinner napkin" would reveal six distinct layers, each containing millions of interconnected neurons. Neurons on all layers communicate with each other through electrical impulses sent from the nucleus of each cell, down axons and across to dendrites of the surrounding neurons. This, in turn, allows the brain as a whole to communicate with the body it controls. Through neurons, the brain is able to receive information from numerous sensory receptors throughout the body, decide which of these sensory stimuli deserve attention, and send commands to initiate or inhibit various responses. Interestingly, the brain has far more capacity to respond to stimuli than it does to receive those stimuli in the first place; there are 10 times the number of feedback connections as there are "bottom-up" or sensory-input connections. It is, perhaps, this favoring of response over input that allows humans their remarkable skills at adapting to new, unfamiliar situations -- their ability to interpret and innovate.
2007-09-30 19:11:08 · answer #10 · answered by veerabhadrasarma m 7