Computers, as we all know, were invented by Charles Babbage. In the old days computers were very big: a single machine could take up an entire large room and consume power on the order of a hundred kilowatts. But with the gradual development of electronics and semiconductor technology, the miniaturisation of computers began. Today a computer can be as small as a credit card and consume electricity on the order of one watt or less, if we exclude the power consumption of the display unit.
Babbage's Analytical Engine
Tremendous development has taken place in computer processing power, which seems to double roughly every one and a half to two years. In the 1960s Gordon Moore, who went on to co-found Intel, noticed while preparing a paper that his data showed the number of components on a chip, and with it the processing power of a computer, doubling roughly every two years. This observation came to be known as Moore's law, and it gave companies a road map for planning future generations of processors. Along with processing power, the memory and storage of computers have kept pace: today even a smartphone has 2 GB or more of Random Access Memory (RAM), which even supercomputers did not have a few decades ago, and 1 TB of storage has become common.
The discussion above was about hardware. What about software? From simple programs in the beginning to complex operating systems such as Unix, the Macintosh system software and MS-DOS, computer software has come a long way. Early programmers wrote code using the assembly language mnemonics supplied with the computer's processor, which meant that a programmer had to write separate code for every machine that used a different processor. Assembly language programming was also tedious. This problem was solved by the development of high-level programming languages such as Ada, Fortran, BASIC, Simula, Smalltalk and C, which were closer to natural language. For each of these languages a compiler was written specifically for each processor; the compiler's job is to translate the high-level source code into machine-executable binary code. This made programmers' lives somewhat easier and also made it possible to write complex software.
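To make that portability concrete, here is a minimal sketch assuming a C++ toolchain; the file name and the cross-compiler command in the comments are only illustrative and will differ from system to system.

// portable_hello.cpp -- the same high-level source can be translated into
// machine code for very different processors; only the compiler changes.
//
// Illustrative build commands (toolchain names are assumptions, not fixed):
//   g++ portable_hello.cpp -o hello_x86                      (native build)
//   aarch64-linux-gnu-g++ portable_hello.cpp -o hello_arm    (ARM cross-build)
#include <iostream>

int main() {
    // Written once, in a language close to natural language, instead of
    // separate assembly mnemonics for each processor.
    std::cout << "The same source, many processors." << std::endl;
    return 0;
}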
With the increase in power provided by the hardware, thanks to the exponential growth of processing power, memory and storage, software became more and more complex. To manage this complexity a new programming paradigm came into existence: Object Oriented Programming (OOP). By grouping data and the functions that operate on it into objects, OOP made writing large software easier.
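As a small illustrative sketch of the idea (the class and its names are invented for this example, not taken from any particular system), an object-oriented program bundles data together with the operations allowed on it:

#include <iostream>
#include <stdexcept>

// A tiny example of the object-oriented style: the data (the balance) and
// the operations permitted on it (deposit, withdraw) live together in one
// class, so the rest of the program cannot corrupt the balance directly.
class BankAccount {
public:
    explicit BankAccount(double openingBalance) : balance(openingBalance) {}

    void deposit(double amount) { balance += amount; }

    void withdraw(double amount) {
        if (amount > balance)
            throw std::runtime_error("insufficient funds");
        balance -= amount;
    }

    double getBalance() const { return balance; }

private:
    double balance;  // hidden from the outside world (encapsulation)
};

int main() {
    BankAccount account(100.0);
    account.deposit(50.0);
    account.withdraw(30.0);
    std::cout << "Balance: " << account.getBalance() << std::endl;  // prints 120
    return 0;
}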
As for the users, computers were once considered something for geeks and big businesses, of no use to ordinary people. Early computers required users to remember and type in commands to make the machine do tasks for them. However, when Steve Jobs, co-founder of Apple Computer, visited Xerox's Palo Alto Research Center (PARC) and saw an early example of a graphical user interface, he was so inspired by it that he had it designed into the Macintosh operating system. The Graphical User Interface (GUI) changed the world of computers: it made computing possible for ordinary people who had never used computers before.
Macintosh GUI
Thanks to the GUI, all a user had to do was click a button to open a file or click another menu item to save it. The ordinary user was spared the difficult task of remembering commands. With the GUI, users could teach themselves to use a computer rather than depending on anyone else. The GUI also made the computer screen attractive compared with the green-screen terminals of earlier machines, and computing turned from purely logical work into something intuitive and creative. The following decades witnessed stupendous growth in computer graphics and animation, along with the growth of the Internet, which changed the world.