A microprocessor is a type of processor, or CPU. There is one inside the computer you typed that question on. Microprocessors are by far the most common kind of processor people talk about, so most people just say "processor" or "CPU". The processor is an integrated circuit that handles the basic computational tasks a computer performs. Computers used to be room-sized machines, but over time those circuits were shrunk until they could be put on a wafer of silicon that fits on a quarter.
2007-02-05 13:51:18
·
answer #1
·
answered by Anonymous
·
0⤊
0⤋
Definition from Wikipedia:
A microprocessor (sometimes abbreviated µP) is a programmable digital electronic component that incorporates the functions of a central processing unit (CPU) on a single semiconducting integrated circuit (IC). The first microprocessors became possible by shrinking the CPU's word size down to 4 bits, so that the transistors of its logic circuits would fit onto a single part. One or more microprocessors typically serve as the CPU in a computer system, embedded system, or handheld device.
Microprocessors made possible the advent of the microcomputer in the mid-1970s. Before this period, electronic CPUs were typically made from bulky discrete switching devices (and later small-scale integrated circuits), each containing the equivalent of only a few transistors. By integrating the processor onto one or a very few large-scale integrated circuit packages (containing the equivalent of thousands or millions of transistors), the cost of processing power was greatly reduced. Since the mid-1970s, the microprocessor has become the most prevalent implementation of the CPU, almost completely replacing all other forms. See History of computing hardware for pre-electronic and early electronic computers.
The evolution of microprocessors has followed Moore's Law, which suggests that the number of components on an integrated circuit, at minimum cost per component, doubles roughly every 24 months. This has generally held true since the early 1970s. From their humble beginnings as calculator chips, the continued increase in power has led to the dominance of microprocessors over every other form of computer; every system from the largest mainframes to the smallest handheld computers now uses a microprocessor at its core.
2007-02-05 21:51:18
·
answer #2
·
answered by julie 2
·
0⤊
0⤋
[Duplicate of the Wikipedia excerpt in answer #2.]
2007-02-05 21:49:16
·
answer #3
·
answered by ♥Roberta. 5
·
1⤊
0⤋
The processor of a computer is like the brain of a human. A microprocessor is just a processor built as a single small chip; it does the same job in a smaller form.
2007-02-05 21:50:12
·
answer #4
·
answered by F B 3
·
0⤊
1⤋
Micro = tiny, and processor = something that processes.
Therefore a microprocessor is a tiny electronic device that can perform arithmetic and logical operations.
2007-02-09 01:46:33
·
answer #5
·
answered by Arindam C 1
·
0⤊
0⤋
A Microprocessor is a computer chip, that, when installed with Microsoft Windows, enables hackers to steal private information, install up to 114,000 viruses, and frustrate users enough to get Kubuntu.
2007-02-05 21:52:20
·
answer #6
·
answered by Anonymous
·
0⤊
1⤋
It is the brain in a computer. It does simple things like 1 + 0 = 1, and programs build up complex patterns of those simple instructions to produce what is sent to the monitor. There is one on every motherboard, or else the computer can't work. The "micro" part comes from it being a small chip; processors used to be huge, separate parts of the computer.
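Those simple operations really do compose into arithmetic. Here is a minimal Python sketch (illustrative only, not how any particular chip is wired) that builds 4-bit addition out of nothing but the bit-level AND, OR, and XOR operations a processor provides:

```python
# Build multi-bit addition out of single-bit logic operations,
# the way a CPU's arithmetic unit does in hardware.

def full_adder(a, b, carry_in):
    """Add two bits plus a carry using only AND (&), OR (|), XOR (^)."""
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_4bit(x, y):
    """Ripple-carry addition of two 4-bit numbers, one bit at a time."""
    result, carry = 0, 0
    for i in range(4):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result  # wraps around at 16, like real 4-bit hardware

print(add_4bit(5, 9))   # 14
print(add_4bit(9, 9))   # 2  (18 mod 16: the carry out of bit 3 is lost)
```

Chaining more full adders gives 8-, 16-, 32-bit addition; everything else the chip does is layered on top of gates like these.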
2007-02-05 21:52:09
·
answer #7
·
answered by Kristofer 4
·
0⤊
0⤋
It is a small processor. hahahaha. No, a microprocessor is what people often refer to as a CPU.
2007-02-05 21:49:47
·
answer #8
·
answered by Kokopelli 6
·
0⤊
1⤋
It is an integrated circuit that contains a complete computer central processing unit on one chip.
A central processing unit is the part of a computer that performs logic functions and arithmetic, and "executes" programs.
An integrated circuit is a complete circuit built entirely on one chip, composed of transistors patterned on a single wafer of silicon.
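To illustrate what "executes programs" means: a CPU repeats a fetch-decode-execute cycle over a list of instructions. This Python sketch runs a hypothetical three-instruction machine (the opcodes LOAD, ADD, and JNZ are invented for illustration, not taken from any real chip):

```python
# Toy fetch-decode-execute loop. Each instruction is an
# (opcode, operand) pair; 'acc' is the accumulator register
# and 'pc' is the program counter.

def run(program):
    acc, pc = 0, 0
    while pc < len(program):
        op, arg = program[pc]           # fetch the next instruction
        pc += 1
        if op == "LOAD":                # decode + execute
            acc = arg                   # put a value in the accumulator
        elif op == "ADD":
            acc += arg                  # add to the accumulator
        elif op == "JNZ" and acc != 0:
            pc = arg                    # jump back if accumulator non-zero
    return acc

# Count down from 3 to 0: LOAD 3, then loop ADD -1 / JNZ until zero.
print(run([("LOAD", 3), ("ADD", -1), ("JNZ", 1)]))  # 0
```

A real microprocessor does exactly this, except the program lives in memory, the instructions are binary patterns, and the loop is wired in silicon.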
2007-02-05 21:49:33
·
answer #9
·
answered by walter_b_marvin 5
·
1⤊
1⤋