
2006-09-28 09:49:17 · 4 answers · asked by Anonymous in Computers & Internet Computer Networking

4 answers

I think the other answerers misread the question: it's not megabytes per second, and it's not millions of instructions per second. Mpps = millions of packets per second, so 1 Mpps is 1,000,000 packets per second. The packets could be 64 bytes, 1500 bytes, or anywhere in between, so to get bits per second, multiply the packet size in bytes by 8 bits per byte and then by the packet rate. For example, 576-byte packets at 1 Mpps work out to 4.608 Gbps.
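That conversion can be sketched in a few lines of Python. The function name and the 576-byte example packet size are just illustrations, not anything standard:

```python
def mpps_to_gbps(packet_bytes: float, mpps: float = 1.0) -> float:
    """Convert a packet rate in Mpps to throughput in Gbps
    for a given packet size in bytes."""
    bits_per_packet = packet_bytes * 8           # 8 bits per byte
    packets_per_sec = mpps * 1_000_000           # Mpps -> packets/second
    return bits_per_packet * packets_per_sec / 1e9  # bits/second -> Gbps

# 576-byte packets at 1 Mpps:
print(mpps_to_gbps(576))   # 4.608 (Gbps)
```

Note this counts only the bytes in each packet; on a real wire there is also per-frame overhead (preamble, inter-frame gap), so actual link utilization at a given Mpps is somewhat higher.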

2006-09-28 14:09:20 · answer #1 · answered by networkmaster 5 · 0 0

1 Mpps

2006-09-28 16:51:06 · answer #2 · answered by Anonymous · 0 0

Million instructions per second
Critics of the term refer to it as "Meaningless Indication of Processor Speed" or "Meaningless Information on Performance for Salespeople" or "Meaningless Integer Performance Spec". In Linux and UNIX circles MIPS are often referred to as bogoMIPS. MIPS are certainly not comparable between CPU architectures.

The floating-point arithmetic equivalent of MIPS is FLOPS, to which the same cautions apply.

In the 1970s, minicomputer performance was compared using VAX MIPS, where computers were measured on a task and their performance rated against the VAX 11/780 that was marketed as a "1 MIPS" machine. (The measure was also known as the "VAX Unit of Performance" or VUP. Though orthographically incorrect, the "S" in "VUPs" is sometimes written in upper case.) This was chosen because the 11/780 was roughly equivalent in performance to an IBM System/370 model 158-3, which was commonly accepted in the computing industry as running at 1 MIPS.

Most 8-bit and early 16-bit microprocessors have a performance measured in kIPS (thousand instructions per second), which equals 0.001 MIPS. The first general purpose microprocessor, the Intel i8080, ran at 640 kIPS. The Intel i8086 microprocessor, the first 16-bit microprocessor in the line of processors made by Intel and used in IBM PCs, ran at 800 kIPS. Early 32-bit PCs (386) ran at about 3 MIPS.

zMIPS refers to the MIPS measure used internally by IBM to rate its mainframe servers (zSeries and System z9). Analyst firm Isham Research has lately coined the term kMIPS (kilo-million instructions per second) to measure the processor speeds in IBM's largest servers.

2006-09-28 16:52:50 · answer #3 · answered by DanE 7 · 0 0

It's 1 megabyte per second, which in practice is more like 100K per second, so it would take about 1 minute to download 10 megabytes.

2006-09-28 16:51:31 · answer #4 · answered by Anonymous · 0 0
