The reason I want to connect two Linux boxes together via InfiniBand cards is to "superscale" the processors. I want my main Linux box to offload 'latent' processor queues to the second. If possible (and before I buy the InfiniBand PCI-E cards), I would set it up as follows: PC #1: AMD X2, Mandriva Linux. This is the main PC. When its work queues are too long, it should offload to PC #2: AMD Socket 754, minimalist Linux with a kernel compiled specifically for the task. Both have available PCI-E slots. This "p2p network" is only for superscalar experiments with the Linux kernel. I am not very knowledgeable about hardware and networking. I need help.
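[Editor's note: stock Linux has no mechanism that transparently migrates running processes to another box, so in practice the "work queue is too long" decision is made in user space before a job starts. A minimal sketch of that check, assuming a hypothetical threshold of one runnable task per core (the factor and the `ncpus=2` default for the dual-core AMD X2 are illustrative assumptions, not measured values):]

```python
import os

def queue_too_long(load1, ncpus, factor=1.0):
    """Decide whether the run queue is 'too long': the 1-minute load
    average exceeds the core count times a tunable factor.  A factor
    of 1.0 is an assumed starting point, not a tuned value."""
    return load1 > ncpus * factor

def should_offload(ncpus=2):
    """Check this box's current load (PC #1 is a dual-core AMD X2,
    hence ncpus=2) and report whether the next job should be sent to
    PC #2 instead of being run locally."""
    load1, _, _ = os.getloadavg()
    return queue_too_long(load1, ncpus)
```

A dispatcher script would call `should_offload()` before launching each job and, when it returns `True`, start the job on PC #2 over the link (e.g. via `ssh`) instead of locally.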
Another use (if I can even do it) is to have a "processor share" PC on a direct link with two other PCs (via two PCI-E slots). This PC would do nothing but share out its CPU cycles for a demanding task I have. (With another PC sitting around, why not cluster and get the most out of what I am doing? I want to cluster two PCs without a switch.)
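[Editor's note: InfiniBand HCAs can be cabled host-to-host with no switch, and the IP-over-InfiniBand (IPoIB) driver then exposes each card as an ordinary network interface. A sketch of the switchless two-node setup, assuming the `ib_ipoib` kernel module, the interface name `ib0`, and made-up private addresses:]

```shell
# On PC #1: load the IPoIB driver and assign a static address
modprobe ib_ipoib
ifconfig ib0 192.168.10.1 netmask 255.255.255.0 up

# On PC #2: same, with the peer address
modprobe ib_ipoib
ifconfig ib0 192.168.10.2 netmask 255.255.255.0 up

# From PC #1, verify the direct link
ping -c 3 192.168.10.2
```

Note that an InfiniBand fabric needs a subnet manager even with only two nodes, so one of the hosts must also run `opensm` for the link to come up.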
2007-06-02 09:21:45 · 2 answers · asked by jarrod d in Computer Networking