CBCS Mathematical Physics Lab: Computer Architecture and Organization

Dear Readers, computers are seen everywhere around us, in all spheres of life. Be it the field of education and research, travel and tourism, weather forecasting, social networking, e-commerce or any other, computers have now become an indispensable part of our lives. Before entering into the details of the CBCS Mathematical Physics Lab course, I want to say a few words about the fundamentals of the computer.


The manner in which computers have revolutionised our lives, because of their accuracy and speed in performing a job, is truly remarkable. Today no organization can function without a computer. In fact, various organizations are trying to become paper-free owing to the benefits of computers. But the computers of today have evolved over the years from a simple calculating device to the portable high-speed computers that we see today.

In the first lecture of the CBCS Mathematical Physics Lab, I have started with a very tiny hypothetical computer through which I have tried to introduce most of the terms used in computer organization and architecture, and the working principle of a computer. I have also tried to explain the concept of execution of a program in this computer. In this course material, the term computer always means a digital computer; these concepts do not apply to analog computers.

It also contains some historical information regarding the evolution of the first-generation computers and the changes in technology that led to the computers of the current generation.

Computer technology has made incredible improvement in the past half century. In the early part of computer evolution, there were no stored-program computers, the computational power was low and, on top of that, computers were very large in size.

Today, a personal computer has more computational power, more main memory and more disk storage, is smaller in size, and is available at an affordable cost.

This rapid rate of improvement has come both from advances in the technology used to build computers and from innovation in computer design. In this course we will mainly deal with the innovation in computer design.

The task that the computer designer handles is a complex one: Determine what attributes are important for a new machine, then design a machine to maximize performance while staying within cost constraints. This task has many aspects, including instruction set design, functional organization, logic design, and implementation.

While considering the task of computer design, the terms computer organization and computer architecture both come into the picture.

It is difficult to give a precise definition of the terms Computer Organization and Computer Architecture. But while describing a computer system we come across these terms, and in the literature computer scientists try to make a distinction between the two.

Computer architecture refers to those parameters of a computer system that are visible to a programmer or those parameters that have a direct impact on the logical execution of a program. Examples of architectural attributes include the instruction set, the number of bits used to represent different data types, I/O mechanisms, and techniques for addressing memory.

Computer organization refers to the operational units and their interconnections that realize the architectural specifications. Examples of organizational attributes include those hardware details transparent to the programmer, such as control signals, interfaces between the computer and peripherals, and the memory technology used.

At first we will touch upon all these factors and finally come up with the concept of how these attributes contribute to building a complete computer system.

CBCS Mathematical Physics Lab: Evolution of Computers

The growth of the computer industry started with the need for performing fast calculations. The manual method of computing was slow and prone to errors, so attempts were made to develop faster calculating devices. The journey that started from the first calculating device, i.e. the abacus, has led us today to extremely high-speed calculating devices. Let us first have a look at some early calculating devices and then explore the various generations of computers.

Abacus

The abacus was developed by the Mesopotamians in around 3000 BC. An abacus consists of beads on movable rods divided into two parts. Addition and multiplication of numbers was done by using the place value of the digits of the numbers and the position of the beads on the abacus.

The Chinese further improved the abacus so that calculations could be done more easily. Even today the abacus is considered an apt tool for young children to learn calculation. In an abacus, each rod represents one place value. From right to left, the first rod represents the ones place, the second rod the tens place, the third rod the hundreds place, and so on. The starting position of the top beads (each representing the value of five) is always towards the top wall of the abacus, while the lower beads (each representing the value of one) are always pushed towards the lower wall as a starting position.
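To make this place-value reading concrete, here is a minimal sketch in Python. It assumes each rod's state is recorded as a pair (upper beads pushed towards the beam, lower beads pushed towards the beam); the function name abacus_value and this encoding are illustrative choices of mine, not part of any standard.

```python
# Read the number shown on an abacus, assuming each rod is given as a pair
# (upper_beads, lower_beads) of beads pushed towards the middle beam, with
# rods listed from the leftmost (highest place value) to the rightmost (ones).
def abacus_value(rods):
    value = 0
    for upper, lower in rods:
        # each upper bead counts as 5, each lower bead as 1
        value = value * 10 + (5 * upper + lower)
    return value

# 407 on a three-rod abacus: four lower beads on the hundreds rod, nothing on
# the tens rod, one upper bead (5) and two lower beads on the ones rod.
print(abacus_value([(0, 4), (0, 0), (1, 2)]))      # prints 407
```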

Napier’s Logs and Bones

The idea of logarithms was developed by John Napier. In 1617 he devised a set of numbering rods known as Napier's Bones, through which both multiplication and division could be performed. These were numbered rods which could perform multiplication of any number by a number in the range 2-9. There are 10 bones corresponding to the digits 0-9, and there is also a special eleventh bone that is used to represent the multiplier. By placing the bone corresponding to the multiplier on the left side and the bones corresponding to the digits of the multiplicand on the right, the product of the two numbers can be easily obtained.
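The diagonal addition one performs while reading the bones can be imitated in a few lines of Python. The sketch below is only an illustration of that procedure, restricted to a single-digit multiplier as described above; the function napier_multiply is a made-up name, not a standard routine.

```python
def napier_multiply(multiplicand: int, multiplier_digit: int) -> int:
    """Multiply by a single digit (2-9) the way Napier's bones are read."""
    digits = [int(d) for d in str(multiplicand)]
    cells = [multiplier_digit * d for d in digits]   # one two-digit cell per bone
    tens = [c // 10 for c in cells]
    units = [c % 10 for c in cells]

    n = len(digits)
    result, carry = 0, 0
    # Add along the diagonals, right to left: the units of one bone line up
    # with the tens of the bone to its right.
    for p in range(n + 1):
        u = units[n - 1 - p] if p < n else 0
        t = tens[n - p] if p >= 1 else 0
        s = u + t + carry
        result += (s % 10) * 10 ** p
        carry = s // 10
    return result

print(napier_multiply(46, 7))   # 322, the same as 46 * 7
```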


Pascaline

Blaise Pascal, a French mathematician, invented an adding machine in 1642 that was made up of gears and was used for adding numbers quickly. This machine, also called the Pascaline, was capable of addition and subtraction along with carry-transfer capability. It worked on the principle of a clockwork mechanism. It consisted of various numbered toothed wheels having unique position values. Addition and subtraction were performed by controlled rotation of these wheels.


Leibnitz’s Calculator


In 1673 Gottfried Leibnitz, a German mathematician, extended the capabilities of the adding machine invented by Pascal to perform multiplication and division as well. Multiplication was done through repeated addition of numbers using stepped cylinders, each with nine teeth of varying lengths.


Jacquard’s Loom

In order to make the cotton weaving process automatic, Joseph Jacquard devised punched cards and used them to control looms in 1801. The entire operation was under a program’s control. Through this historic invention, the concept of storing and retrieving information started.

Difference Engine and Analytical Engine

Charles Babbage, an English mathematician, developed a machine called the Difference Engine in 1822 which could calculate various mathematical functions, evaluate polynomials by the method of finite differences and, theoretically, could also solve differential equations.

Thereafter, in 1833, he designed the Analytical Engine, which later on proved to be the basis of the modern computer. This machine could perform all four arithmetic operations as well as comparison. It included the concepts of a central processor, memory storage and input-output devices, and even the stored information could be modified. Although the Analytical Engine was never built at that time, Babbage established the basic principles on which today’s modern computers work.

Both these great inventions earned him the title of ‘Father of Modern Computers’.

Mark 1

In 1944 Prof. Howard Aiken, in collaboration with IBM, constructed an electromechanical computer named Mark 1 which could multiply two 10-digit numbers in 5 seconds. This machine was based on the concept of Babbage’s Analytical Engine and was the first operational general-purpose computer which could execute pre-programmed instructions automatically without any human intervention.


In 1945, Dr. John Von Neumann proposed the concept of a stored-program computer. As per this concept, the program and the data could be stored in the same memory unit. The basic architecture of the Von Neumann computer is shown in the figure below.

According to the Von Neumann architecture, the processor executes instructions stored in the memory of the computer. Since there is only one communication channel between the processor and the memory, the processor can fetch either data or an instruction at a time. That means, at any point of time, either a data item or an instruction can be picked (fetched) from the storage unit for execution by the processor. Hence execution takes place in a sequential manner. This limitation of the Von Neumann computer is known as the Von Neumann bottleneck.

EDVAC (Electronic Discrete Variable Automatic Computer), completed in 1952, was one of the earliest stored-program computers. After the invention of the first electronic computer, ENIAC (Electronic Numerical Integrator and Computer), in 1946, computer technology improved tremendously and at a very fast pace.
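The stored-program idea and the single fetch channel can be illustrated with a toy machine in Python. The opcodes, instruction format and memory layout below are purely hypothetical, invented only to show that instructions and data live in the same memory and that the processor picks them up one at a time; it is a sketch, not a model of any real processor.

```python
# A toy stored-program machine: one memory holds both instructions and data,
# and the processor fetches one item per step over the single channel.
memory = [
    ("LOAD", 6),     # 0: ACC <- memory[6]
    ("ADD", 7),      # 1: ACC <- ACC + memory[7]
    ("STORE", 8),    # 2: memory[8] <- ACC
    ("HALT", None),  # 3: stop
    0, 0,            # 4-5: unused
    25, 17,          # 6-7: data operands
    0,               # 8: result goes here
]

acc, pc = 0, 0                       # accumulator and program counter
while True:
    opcode, addr = memory[pc]        # fetch an instruction from the shared memory
    pc += 1
    if opcode == "LOAD":             # execute: data accesses use the same memory
        acc = memory[addr]
    elif opcode == "ADD":
        acc += memory[addr]
    elif opcode == "STORE":
        memory[addr] = acc
    elif opcode == "HALT":
        break

print(memory[8])                     # prints 42 (= 25 + 17)
```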

