I've been in the computer biz for many years. I started with Teletype terminals on a CDC mainframe, and with the Apple ][ and TRS-80 on the micro side. It seemed like minicomputers, at the time, were just better microcomputers. There was always a chasm between mainframe and micro, and it seemed to grow over the years. But in the last 5 years, the micro has moved into the server domain, with up to hundreds of CPUs, lots of memory, fault tolerance, etc.
So, currently, is there a simple definition that separates a mainframe from a well-designed "high-end" server (or supercomputer)?
PS: I'm asking from a usability perspective, not really a hardware one.
2007-01-06 · 3 answers · asked by flyddw in Computers & Internet ➔ Other - Computers