
Computer Engineering Concepts


1.1 History

Since the computer is, at heart, a computing device, the history of the computer begins with the history of computing devices. Computing itself has two main aspects: the methods used for calculations, and the tools that facilitate those calculations. The following timeline lists some of the major advancements in the history of computing, covering both methods and tools. In the timeline, the work of George Boole is an example of a method, and the work of Blaise Pascal is an example of a tool.

300 AD       The abacus speeds up arithmetic computation.

1617           Napier's bones speed up large multiplications.

1642           Blaise Pascal’s machine performs simple arithmetic using linked dials.

1673           Leibniz’s calculator uses gears to speed up multiplication and division.

1822           Charles Babbage’s mechanical system calculates mathematical tables.

1854           George Boole introduces the mathematics of two states with logic.

1936           Claude Shannon finds the link between logic and electric circuits.

1937           George Stibitz creates the binary adder.

1941           Konrad Zuse designs and builds the first programmable, fully automatic computing machine.

1945           John von Neumann outlines the elements of a computer.

1946           J. P. Eckert and J. Mauchly unveil the first electronic digital computer, the ENIAC.

1947           J. Bardeen, W. Shockley, and W. Brattain invent the transistor.

1954           IBM introduces the first mass-produced computer.

1955           Bell introduces the first computer that uses transistors.

1971           Intel introduces the microprocessor.

1975           Bill Gates and Paul Allen create the first PC software.

1977           Apple Computer enters the PC market.

1981           IBM enters the PC market.

1980s         Local Area Networks (LANs) are introduced.

1990s         The Internet grows exponentially.
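The thread running from Boole's two-state mathematics, through Shannon's link between logic and circuits, to Stibitz's binary adder can be illustrated in a few lines of code. The sketch below is not from the original text; it is an illustrative example that uses Python's bitwise operators to stand in for logic gates, showing how pure two-state logic is enough to add binary numbers.

```python
# Illustrative sketch: Boole's two-state mathematics, applied to
# circuits as Shannon showed, suffices to add binary numbers --
# the idea behind Stibitz's binary adder.

def half_adder(a, b):
    """Add two one-bit values using only logic operations."""
    total = a ^ b          # XOR gate gives the sum bit
    carry = a & b          # AND gate gives the carry bit
    return total, carry

def full_adder(a, b, carry_in):
    """Chain two half adders to handle an incoming carry."""
    s1, c1 = half_adder(a, b)
    total, c2 = half_adder(s1, carry_in)
    return total, c1 | c2  # OR gate combines the two possible carries

# Adding the binary numbers 11 (3) and 01 (1), bit by bit:
sum0, carry0 = full_adder(1, 1, 0)       # low bit:  1 + 1 = 0, carry 1
sum1, carry1 = full_adder(1, 0, carry0)  # high bit: 1 + 0 + 1 = 0, carry 1
# Result bits (carry1, sum1, sum0) = (1, 0, 0), i.e. binary 100 = 4.
```

Chaining full adders in this way, one per bit position, is exactly how hardware adders are built from logic gates.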



Fig 1.1.  An IBM personal computer of the 1980s.


Over the years the uses and popularity of computers have increased to the point where they are now found in most homes and businesses, and carried around in our pockets in the form of smartphones. With this continued increase in use, interest in the study of computing has also grown. Today the study of computing is divided into two main areas: hardware and software. The study of hardware deals with the tangible aspect of computing (things that can be touched); it focuses on the physical components of a computer. Software, on the other hand, is the study of the intangible aspect of computing; it focuses on the instructions given to the computer.


This division of computing into hardware and software is similar to our learning environment, which is made up of tangible objects (like chairs, desks, and markers) and intangible aspects (like knowledge, ideas, and concepts). The division, for the most part, allows the two branches to develop independently of one another. The area of study that focuses mainly on software is commonly referred to as computer science, and the area that focuses on hardware is commonly referred to as computer engineering. It should be noted, however, that the two areas are connected: rather than being treated as independent, they should be thought of as complementing each other. A good understanding of software requires knowledge of hardware, and similarly, a good understanding of hardware requires knowledge of software. For a thorough understanding of computers, a good knowledge of both areas is essential.

As computer use continues to increase in our society, so do the applications of computer technology in different areas. This trend is a permanent one. Our world has moved into the digital age, where the preferred means of representing and transferring information for all types of media is the digital signal. This trend is seen in television, movies, telephony, publishing, and many other areas. As the digital trend continues, it creates a need for workers who are knowledgeable about the technology at all levels: the hardware level, the software level, and the user level. The need for workers in this field may rise and fall in the short term, but over the long term it is increasing. This trend therefore holds significant employment opportunities for those with technical knowledge and an understanding of computers.








GlobalEduTech Solutions
