The Differences Between Serial and Parallel Processing


Computers are inherently serial: a single processor executes one instruction at a time. Many applications, however, benefit from doing several things at once, so the parallel computer was invented -- in essence, a set of serial computers working together. Parallel computers can make some jobs run much faster, but not every problem speeds up automatically; each one must be broken into parallel parts by a highly trained, and very expensive, parallel programmer.

The Sequential CPU

  • The Central Processing Unit (CPU) fetches instructions from main memory and executes them one at a time. After finishing one instruction, it fetches the next and continues in sequence. It can do anything the programmer can describe as a sequence of instructions the computer understands, but it is inherently serial.
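The fetch-and-execute loop described above can be sketched with a toy interpreter. The instruction names and encoding here are illustrative, not any real CPU's:

```python
# Minimal sketch of the fetch-execute cycle for a toy instruction
# set: each instruction is a (opcode, operand) pair, executed
# strictly one at a time.
def run(program):
    acc = 0          # accumulator register
    pc = 0           # program counter
    while pc < len(program):
        opcode, operand = program[pc]   # fetch the next instruction
        pc += 1
        if opcode == "LOAD":            # execute it, then loop
            acc = operand
        elif opcode == "ADD":
            acc += operand
        elif opcode == "HALT":
            break
    return acc

# Three instructions executed serially: load 2, add 3, stop.
print(run([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))  # prints 5
```

No matter how the program is written, this loop touches exactly one instruction per iteration, which is what makes the processor serial.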

Parallel Advantages

  • Serial processing is like using a laundromat that has only one washer and dryer: if you have a lot of laundry, it will take a long time. With enough machines you can do dozens of loads in about the same time as one. However, it is not always possible to break problems into parts that can be run simultaneously. If you are summing or searching through a million numbers, you can form 1,000 groups of 1,000 numbers each and process the groups simultaneously on 1,000 processors, finishing roughly 1,000 times faster than on one processor. Other tasks, like a single division of two numbers, cannot be split up and must be performed sequentially.
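The summing example above can be sketched as a split-then-combine routine. A thread pool is used here only to keep the sketch simple; for a CPU-bound sum, a real speedup would require separate processes or cores:

```python
# Split the numbers into groups, sum each group concurrently,
# then combine the partial results with one small sequential step.
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(numbers, groups=1000):
    size = max(1, len(numbers) // groups)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    with ThreadPoolExecutor() as pool:
        partials = list(pool.map(sum, chunks))  # groups processed independently
    return sum(partials)                        # combine the partial sums

data = list(range(1_000_000))
print(parallel_sum(data) == sum(data))  # prints True
```

The decomposition works because addition is associative: the groups can be summed in any order, independently, and only the final combine step is serial.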

Parallel Problems

  • The first disadvantage of parallel computing is cost. Good serial computers start at about $1,000. Parallel computers start at over $1 million and go up -- way up. Software and trained programmers for parallel computers are also more expensive. Even if a problem can be broken up into parts that can be run at the same time, it can be difficult to coordinate all the parts.

Amdahl's Law

  • Another problem is due to Amdahl's law. Every parallel program has two parts: housekeeping and problem processing. Housekeeping is the serial part -- coordinating the processors and combining their results -- while problem processing is the actual computation that can be spread across processors. Amdahl's law states that the overall speedup is limited by the serial part: adding processors shrinks the parallel part's running time, but the housekeeping time remains, so it takes up a growing share of the total and each additional processor contributes less. The implication of Amdahl's law is that it is impractical to increase the number of parallel processors beyond a certain point.
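Amdahl's law can be stated as a short formula. Assuming a fixed serial fraction s of the work (the housekeeping) and n processors, the speedup is 1 / (s + (1 - s) / n):

```python
# Amdahl's law: speedup with n processors when a fraction s of the
# work is serial (housekeeping) and the rest runs perfectly in parallel.
def speedup(serial_fraction, processors):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / processors)

# With 10% housekeeping, the speedup can never exceed 10x,
# no matter how many processors are added.
print(round(speedup(0.10, 10), 2))    # prints 5.26
print(round(speedup(0.10, 1000), 2))  # prints 9.91
```

Going from 10 processors to 1,000 here buys less than a doubling of speed, which is why adding processors past a certain point stops paying off.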
