In parallel computing, multiple processors communicate with each other through shared memory. In distributed computing, multiple processors are connected by a communication network. Let us take a brief look at both of them.
Parallel Computing
In this computing system, multiple processors perform tasks simultaneously. Moreover, memory is shared, which provides concurrency and saves time and money.
In parallel computing, we break a problem down into smaller parts and solve them concurrently. The following are the advantages of parallel systems:
Advantages of Parallel Computing
- Firstly, it saves time and money. As resources work simultaneously, execution time is reduced.
- Secondly, larger problems that are impractical to solve with serial computing can be solved more easily in parallel.
- Thirdly, it can take advantage of non-local resources when local ones are finite.
- It makes good use of potential computing power.
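As a concrete illustration of breaking a problem down and solving the parts concurrently, here is a minimal Python sketch using the standard-library `concurrent.futures` module. The numbers and chunk size are arbitrary; for CPU-bound work a `ProcessPoolExecutor` would sidestep Python's GIL, but a thread pool keeps the sketch simple and safe to run anywhere.

```python
# Break a large problem into parts and solve the parts concurrently.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """Solve one piece of the larger problem: sum a slice of numbers."""
    return sum(chunk)

numbers = list(range(1, 101))                            # the "large problem": sum 1..100
chunks = [numbers[i:i + 25] for i in range(0, 100, 25)]  # break it down into 4 parts

with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(partial_sum, chunks))       # solve the parts concurrently

total = sum(partials)                                    # combine the partial results
print(total)                                             # → 5050
```

The same split/solve/combine shape applies whatever the actual work is; only the `partial_sum` function changes.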
Types of Parallelism
- Bit-level parallelism
- Instruction-level parallelism
- Task parallelism
- Data-level parallelism
Bit-level parallelism comes from increasing the processor's word size. A larger word size reduces the number of instructions needed to operate on large operands; for example, an 8-bit processor needs several instructions to add two 64-bit integers, while a 64-bit processor needs only one.
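The effect of word size on instruction count can be sketched with a small simulation. The `add_steps` helper below is hypothetical, written only for this illustration: it adds two numbers one machine word at a time, so the step count stands in for the instruction count.

```python
# Simulate adding two N-bit numbers on a machine with word size `word_bits`:
# each loop iteration is one word-sized add with carry, i.e. one "instruction".
def add_steps(a, b, word_bits):
    """Return (sum, number of word-sized add steps needed)."""
    mask = (1 << word_bits) - 1
    result, carry, shift, steps = 0, 0, 0, 0
    while a or b or carry:
        s = (a & mask) + (b & mask) + carry  # add one word of each operand
        result |= (s & mask) << shift        # place that word in the result
        carry = s >> word_bits               # carry into the next word
        a >>= word_bits
        b >>= word_bits
        shift += word_bits
        steps += 1
    return result, steps

x, y = 2**60, 2**60 + 7
_, steps8 = add_steps(x, y, 8)    # 8-bit words: 8 add steps
_, steps64 = add_steps(x, y, 64)  # 64-bit words: 1 add step
print(steps8, steps64)            # → 8 1
```

Doubling the word size halves the step count, which is exactly the gain bit-level parallelism describes.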
Why Do We Need Parallel Computing?
- To manage large data
- For dynamic simulation and modeling
- To save time and money
- For efficient utilization of resources
Distributed Computing
This system can run on various operating systems and can use various communication protocols. Moreover, it can run on hardware provided by many different vendors. A distributed system can consist of any number of possible configurations, such as mainframes and personal computers.
The distributed data storage is done in following manner:
- Data Replication
- Data Fragmentation
In data replication, data is redundantly stored at two or more sites, so the system maintains multiple copies of the data. This increases the availability of data at different sites.
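A toy sketch of data replication is given below; the `ReplicatedStore` class and the site names are invented purely for illustration. Every write is copied to all sites, so a read still succeeds even when one site is down.

```python
# Minimal in-memory model of replicated storage across several sites.
class ReplicatedStore:
    def __init__(self, site_names):
        self.sites = {name: {} for name in site_names}  # one store per site
        self.down = set()                               # sites currently offline

    def write(self, key, value):
        for store in self.sites.values():   # redundantly store at all sites
            store[key] = value

    def read(self, key):
        for name, store in self.sites.items():
            if name not in self.down and key in store:
                return store[key]           # any live replica can answer
        raise KeyError(key)

db = ReplicatedStore(["site_a", "site_b", "site_c"])
db.write("user:1", "Alice")
db.down.add("site_a")                       # one site fails...
print(db.read("user:1"))                    # → Alice (still available)
```

The cost of this availability is that every write must reach every copy, which is why real systems trade off replication degree against write overhead.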
In data fragmentation, data is divided into smaller parts, and each part can be stored at a different site. Fragmentation is possible in two ways: horizontal fragmentation (splitting by rows) and vertical fragmentation (splitting by columns).
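The two fragmentation strategies can be sketched on a small in-memory "table" of row-dicts; the data and field names below are illustrative only.

```python
# A tiny "table" to fragment: each dict is one row.
employees = [
    {"id": 1, "name": "Ada",   "dept": "HW", "salary": 90},
    {"id": 2, "name": "Alan",  "dept": "SW", "salary": 80},
    {"id": 3, "name": "Grace", "dept": "SW", "salary": 85},
]

# Horizontal fragmentation: split by rows, e.g. one fragment per department.
horizontal = {
    dept: [row for row in employees if row["dept"] == dept]
    for dept in {row["dept"] for row in employees}
}

# Vertical fragmentation: split by columns. Each fragment keeps the key
# ("id") so the original rows can be reconstructed by joining on it.
vertical_public  = [{"id": r["id"], "name": r["name"]}     for r in employees]
vertical_private = [{"id": r["id"], "salary": r["salary"]} for r in employees]

print(len(horizontal["SW"]))   # → 2
```

Keeping the key in every vertical fragment is what lets the fragments be recombined without losing rows.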
Difference
| Parallel Computing | Distributed Computing |
| --- | --- |
| Operations are performed simultaneously. | System components are located at different sites. |
| Runs on a single computer. | Runs on multiple computers. |
| Improves system performance. | Improves system scalability. |
Conclusion
In conclusion, we have learnt that in parallel computing, multiple processors communicate with each other through shared memory. It saves time and money: as resources work simultaneously, execution time is reduced, and complex problems become easier to solve.
Types of parallelism include bit-level, instruction-level, data-level and task parallelism.
Moreover, in distributed computing, multiple processors are connected by a communication network. Distributed data storage is done using data replication and data fragmentation, and this approach improves system scalability.
We have learnt some of its advantages and its significance. Lastly, we have seen the difference between parallel computing and distributed computing.