A CPU cache is a cache used by the central processing unit of a computer to reduce the average time to access memory. The cache is a smaller, faster memory which stores copies of the data from the most frequently used main memory locations. As long as most memory accesses are to cached memory locations, the average latency of memory accesses will be closer to the cache latency than to the latency of main memory.
Read more at:
http://en.wikipedia.org/wiki/CPU_cache
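The "average latency" idea in this answer can be made concrete with the standard average-memory-access-time formula. A rough sketch in Python; the latency figures are illustrative assumptions, not measurements:

```python
# Average memory access time (AMAT):
#   AMAT = hit_rate * cache_latency + (1 - hit_rate) * main_memory_latency
# The latency figures below are illustrative assumptions.
cache_latency_ns = 1.0
memory_latency_ns = 100.0

def amat(hit_rate):
    return hit_rate * cache_latency_ns + (1 - hit_rate) * memory_latency_ns

# With 95% of accesses hitting the cache, the average is much closer
# to the cache latency than to main-memory latency, as the answer says.
print(amat(0.95))  # about 5.95 ns
```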
2006-10-15 23:39:35 · answer #1 · answered by Tarun 3 · 0⤊ 0⤋
Pronounced cash, a special high-speed storage mechanism. It can be either a reserved section of main memory or an independent high-speed storage device. Two types of caching are commonly used in personal computers: memory caching and disk caching.
A memory cache, sometimes called a cache store or RAM cache, is a portion of memory made of high-speed static RAM (SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory. Memory caching is effective because most programs access the same data or instructions over and over. By keeping as much of this information as possible in SRAM, the computer avoids accessing the slower DRAM.
Some memory caches are built into the architecture of microprocessors. The Intel 80486 microprocessor, for example, contains an 8K memory cache, and the Pentium has a 16K cache. Such internal caches are often called Level 1 (L1) caches. Most modern PCs also come with external cache memory, called Level 2 (L2) caches. These caches sit between the CPU and the DRAM. Like L1 caches, L2 caches are composed of SRAM but they are much larger.
Disk caching works under the same principle as memory caching, but instead of using high-speed SRAM, a disk cache uses conventional main memory. The most recently accessed data from the disk (as well as adjacent sectors) is stored in a memory buffer. When a program needs to access data from the disk, it first checks the disk cache to see if the data is there. Disk caching can dramatically improve the performance of applications, because accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a hard disk.
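The check-the-cache-first logic described above can be sketched in a few lines of Python. This is only an illustration of the idea; a real disk cache lives in the operating system and works on sectors, and `slow_read` here is a stand-in for an actual disk read:

```python
# Minimal sketch of a read cache held in ordinary memory: check the
# cache first, and fall back to the slow "disk" read only on a miss.
disk = {"a.txt": "hello", "b.txt": "world"}  # stands in for the disk
cache = {}

def slow_read(name):
    return disk[name]  # imagine this costs milliseconds

def cached_read(name):
    if name in cache:           # cache hit: served from fast memory
        return cache[name]
    data = slow_read(name)      # cache miss: go to the disk
    cache[name] = data          # keep a copy for next time
    return data

cached_read("a.txt")  # miss, fills the cache
cached_read("a.txt")  # hit, served from memory
```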
When data is found in the cache, it is called a cache hit, and the effectiveness of a cache is judged by its hit rate. Many cache systems use a technique known as smart caching, in which the system can recognize certain types of frequently used data. The strategies for determining which information should be kept in the cache constitute some of the more interesting problems in computer science.
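Hit rate is simply hits divided by total accesses, measured over a trace of accesses. A sketch using one common replacement policy, least-recently-used (LRU); the trace and capacity are illustrative assumptions:

```python
# Hit rate = hits / total accesses over an access trace, using an
# LRU cache of fixed capacity. Trace and capacity are illustrative.
from collections import OrderedDict

def hit_rate(trace, capacity):
    cache = OrderedDict()      # least recently used entry at the front
    hits = 0
    for addr in trace:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # refresh on a hit
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the LRU entry
            cache[addr] = True
    return hits / len(trace)

# A loop that reuses the same few addresses hits almost every time.
print(hit_rate([1, 2, 3] * 10, capacity=4))  # 0.9
```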
2006-10-16 07:47:16 · answer #2 · answered by Udaya B 1 · 0⤊ 0⤋
Computers operate at very high speeds. Current CPUs operate at speeds of
400 million cycles per second or more. What this means is that every 2.5
nanoseconds, the computer can execute a complete processing loop. On modern
computers, this usually means executing one or two instructions. This is
the speed you will see advertised on a computer--a Pentium III 400.
The problem is that while the computer can operate at this speed, it has to
get the program and data to execute from somewhere.
What happens is that the program and data are loaded from the hard drive
into RAM. From RAM they are loaded into cache RAM, and from there they are
executed by the CPU.
Hard drives are very slow compared to the CPU. RAM is much faster than a
hard drive, but still 4-5 times slower than your CPU. Also, RAM is erased
if the power goes off. Cache RAM is extremely fast--it is capable of
delivering data at or near the speed of the CPU.
Cache RAM and normal RAM are very similar in the way they work. Cache is
just extremely fast, and expensive.
That is why there is so very little cache RAM available--it is expensive.
In order to reduce the cost of computers, hard drives are used to store huge
amounts of data because they are so cheap--some drives cost less than a
penny for a megabyte of storage.
RAM is much more expensive--about a dollar for a megabyte of storage. This
is over 100 times more expensive than a hard drive.
Cache RAM is a lot more expensive than regular RAM--about $15-20 per
megabyte of storage.
In order to reduce the cost of computers, engineers have designed
controllers that load data and instructions from the hard drive into RAM
when they may be needed; when they are no longer needed, something else is
loaded in their place. Then, as the computer runs, whatever is needed at
that moment is loaded into cache. When the controller does a good job of
predicting what is needed, the computer operates at close to its full
speed. When it does not, things slow down while the CPU waits for data to
be loaded from the hard drive into RAM, and then into the cache, before it
can continue.
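The predict-and-load behavior this answer describes can be sketched as block prefetching: on a miss, the controller stages not just the requested item but its neighbours, betting they will be needed next. All names and sizes here are illustrative, not how any particular controller is built:

```python
# Sketch of the controller idea above: on a miss, load a whole block
# of adjacent addresses, predicting that nearby data is needed next.
BLOCK = 4
loaded = set()     # what the controller has staged in fast memory
slow_fetches = 0   # how often we had to go to the slow level

def access(addr):
    global slow_fetches
    if addr not in loaded:
        slow_fetches += 1
        base = (addr // BLOCK) * BLOCK
        loaded.update(range(base, base + BLOCK))  # stage the block
    return addr

for a in range(8):   # a sequential scan: the easy case to predict
    access(a)
print(slow_fetches)  # 2 slow fetches cover 8 sequential accesses
```

When the access pattern is predictable, as here, a few slow fetches serve many accesses; when it is not, the CPU waits, which is the slowdown the answer describes.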
Now, to answer your question: cache memory is where the computer gets the
program and data it needs to execute. If the cache is slower than your CPU,
your computer will be slow. But if it is faster, your computer won't speed
up. So you want to make sure the cache is fast enough for your CPU, but
paying for cache faster than that is a waste of money.
Second, the amount of cache memory also affects the speed of your computer.
In general, the more cache, the faster your computer will go. Most
computers have a fairly small limit on the amount of cache RAM possible.
Generally, make sure your computer has as much cache RAM as it can handle.
2006-10-15 23:41:48 · answer #3 · answered by debasis 3 · 0⤊ 0⤋
CACHE MEMORY - Generally a small chunk of fast memory that sits between either 1) a smaller, faster chunk of memory and a bigger, slower chunk of memory, or 2) a microprocessor and a bigger, slower chunk of memory. The purpose of cache memory is to provide a bridge from something that's comparatively very fast to something that's comparatively slow. Most microprocessors have built-in cache memory that holds some of the information from main memory. When the processor needs the information it takes it from the speedy cache instead of the slower main memory. Cache memory GREATLY increases the speed of a computer by storing data that is most often accessed.
2006-10-18 06:51:05 · answer #4 · answered by Arvindh S 2 · 0⤊ 0⤋
If you read the prior answers, you'll see what CPU internal cache memory is: the fastest memory for the CPU to access.
Operating systems also have another type of cache, which is mostly used to keep copies of frequently accessed files instead of rereading them continuously from disk.
But I just have a feeling you were probably enquiring about software cache. The main purpose of a software cache is to keep a local copy of a file instead of downloading it over and over.
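A software cache of the kind this answer describes can be sketched in a few lines. `fetch_url` is a stand-in for a real network download, and the URL is purely illustrative:

```python
# Sketch of a software (download) cache: keep a local copy keyed by
# URL so repeated requests do not re-download the file.
downloads = 0

def fetch_url(url):
    global downloads
    downloads += 1            # count how often we really "download"
    return "contents of " + url

local_copies = {}

def get(url):
    if url not in local_copies:       # only download on first use
        local_copies[url] = fetch_url(url)
    return local_copies[url]

get("http://example.com/page")
get("http://example.com/page")   # served from the local copy
print(downloads)  # 1
```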
2006-10-19 16:13:28 · answer #5 · answered by juliepelletier 7 · 0⤊ 0⤋
It is where data is stored for a fraction of a second during processing.
Suppose you want to add 2+2+2+2+2. We all know it's 10; but how?
First 2+2=4,
then 4+...
wait,
where does this 4 come from? We are only adding 2+2+2+2+...
Yes, this 4 is stored somewhere in your mind, and then you add 2 to it:
4+2=6
6+2=8
8+2=10
After four additions you get the exact result, 10.
In between, you store each intermediate result in your mind for a fraction of a second; after getting the final result, you simply forget the intermediate calculations.
The same work is done by cache memory.
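The running total in this analogy is exactly the role a small, fast store plays: it holds each intermediate value briefly and is overwritten once the final result is out. Sketched in code:

```python
# The running total plays the role of the fast intermediate store
# in the 2+2+2+2+2 example above.
total = 0                     # the "cache": one intermediate value
for term in [2, 2, 2, 2, 2]:
    total = total + term      # runs through 2, 4, 6, 8, 10
print(total)  # 10
```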
2006-10-15 23:48:56 · answer #6 · answered by manu_bedil 2 · 0⤊ 0⤋
Cache memory is the smallest memory in the hierarchy; it helps increase processing speed and decrease processing time.
2006-10-15 23:44:29 · answer #7 · answered by sethu p 1 · 0⤊ 0⤋
Cache memory is memory storage that is much faster than RAM. It sits between normal RAM and the CPU. Data is loaded from the hard drive into RAM; from RAM it is loaded into cache RAM, and from there it is executed by the CPU.
2006-10-15 23:58:29 · answer #8 · answered by Ravi G 2 · 0⤊ 0⤋
Cache memory means cache memory
2006-10-15 23:40:53 · answer #9 · answered by Suresh Kumar 3 · 0⤊ 1⤋
It's memory not built into the motherboard; it is installed using external chips, meaning outside the circuits on the board, not soldered in, etc.
2006-10-15 23:41:29 · answer #10 · answered by Anonymous · 0⤊ 0⤋