Distributed computing uses a network of many computers, each handling a portion of an overall task, to reach a result much more quickly than a single computer could. Beyond the extra computing power, distributed computing also lets many users connect and interact openly. Different forms of distributed computing permit different degrees of openness, and greater openness is generally regarded as a benefit.
The part of the Internet most people know best, the World Wide Web, is also the most recognizable public use of distributed computing. Everything you do while browsing the web relies on many different computers, each assigned a specific role within the system.
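As a purely illustrative sketch of that split-the-work idea (not any particular system), the Python snippet below divides a job into portions and crunches them in parallel; it uses the processor cores of one machine, whereas a real distributed system would ship the portions to other computers over a network.

from multiprocessing import Pool

def crunch(chunk):
    # stand-in for the real work, e.g. analysing one block of data
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    # split the overall task into portions
    chunks = [data[i:i + 100_000] for i in range(0, len(data), 100_000)]
    with Pool() as pool:
        partial = pool.map(crunch, chunks)  # each worker crunches one portion in parallel
    print(sum(partial))                     # combine the portions into the final result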
2006-09-04 20:24:32 · answer #1 · answered by finalmoksha
Distributed computing is a method in which the power of many connected computers is used effectively (in both time and space) to carry out a specified job (program).
The job is split into manageable parts and distributed to computers that have the resources and time to execute them. The computations therefore run in parallel, and the partial results are gathered and combined into the final output.
Grid computing is becoming a specialised field within this approach, where the work is scheduled and optimised to take account of the differing capabilities of the individual computer systems.
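Roughly, that split/distribute/gather pattern looks like the following Python sketch; the local worker processes stand in for separate computers, and process_part and the sample data are made-up placeholders.

from concurrent.futures import ProcessPoolExecutor, as_completed

def process_part(part_id, numbers):
    # placeholder computation for one manageable part of the job
    return part_id, sum(numbers)

if __name__ == "__main__":
    # split the given job into manageable parts
    parts = {i: list(range(i * 1000, (i + 1) * 1000)) for i in range(8)}
    results = {}
    # hand the parts to whichever workers have resources and time
    with ProcessPoolExecutor(max_workers=4) as executor:
        futures = [executor.submit(process_part, pid, nums) for pid, nums in parts.items()]
        for future in as_completed(futures):  # gather results as each part finishes
            pid, total = future.result()
            results[pid] = total
    print(sum(results.values()))              # show the combined output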
2006-09-04 20:30:31 · answer #2 · answered by natanan_56
The ambition of distributed computing enthusiasts is to provide number-crunching power for anybody who needs it, at an affordable price, that is, much more cheaply than buying your own supercomputer, for jobs that require huge amounts of computing grunt: image processing, protein folding simulation, SETI data processing, turbulent flow, ray tracing. You would buy computing power the way you buy electricity or telecoms services.
It is provided by using the spare processing capacity of home computers connected to the Net. I download something like the setiathome programme; every couple of days, when the machine is online, it uploads processed data and downloads new raw data. The rest of the time, when the computer is turned on but I'm not using it, it crunches the data, and the moment I press a key it gives me its undivided attention. All it costs me is a few cents' worth of electricity, and it costs the user far less than supercomputer time. So for negligible cost and effort, you can help solve scientific and medical problems.
It works because of parallel processing: breaking a big number-crunching problem into bite-sized bits, handing the bits out to thousands of small computers, and processing them simultaneously can be done at least as fast as doing the same job sequentially on an expensive supercomputer. It sounds complicated, but the only difficult part is writing the software and setting it up. Participating is easy and doesn't require expert knowledge.
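For a feel of what such a client does, here is a very rough Python sketch of that download/crunch/upload loop; fetch_work_unit, upload_result and machine_is_idle are invented placeholders, not the real SETI@home/BOINC protocol.

import time

def fetch_work_unit():
    # placeholder: a real client downloads a block of raw data from the project's server
    return list(range(10_000))

def upload_result(result):
    # placeholder: a real client sends the processed data back over the Net
    print("uploading result:", result)

def machine_is_idle():
    # placeholder: a real client watches keyboard/mouse activity and steps
    # aside the moment the owner starts using the computer
    return True

for _ in range(3):                          # a real client keeps looping indefinitely
    work = fetch_work_unit()
    if machine_is_idle():
        result = sum(x * x for x in work)   # stand-in for the actual number crunching
        upload_result(result)
    time.sleep(1)                           # then check back for more work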
2006-09-04 20:51:47 · answer #3 · answered by zee_prime
The distribution of the sample mean tends to a normal distribution as n tends to ∞: mean ~ N(μ, σ^2/n), so its standard deviation is 2.872/√n. For n = 64 it is almost normal, and for n = 100 it is even closer to normal.
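Assuming the 2.872 here is the population standard deviation σ (it is not stated in this thread), the standard deviation of the mean works out to 2.872/√64 = 2.872/8 ≈ 0.359 for n = 64, and 2.872/√100 ≈ 0.287 for n = 100.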
2016-03-17 08:18:40 · answer #4 · answered by Anonymous
I support the above answers.
2006-09-04 20:29:19 · answer #5 · answered by KPR IT Solutions