Generations of Computer History

First, second, third and fourth generations: what are they?

First Generation

The first generation used vacuum tubes for circuitry and took up an entire room. One such machine was Colossus, which could process 5,000 characters a second but produced a great deal of heat. Bletchley Park, where it operated, is now home to The National Museum of Computing. Input and output were handled on paper tape and a printer. It required around 2,500 volts to run, and the machines were dismantled after the Second World War.



Second generation

By 1948, the invention of the transistor had greatly changed the computer's development. The transistor replaced the large, cumbersome vacuum tube in televisions, radios and computers, and the size of electronic machinery has been shrinking ever since. The transistor was at work in the computer by 1956. Coupled with early advances in magnetic-core memory, transistors led to second generation computers that were smaller, faster, more reliable and more energy-efficient than their predecessors.

The first large-scale machines to take advantage of transistor technology were early supercomputers: Stretch by IBM and LARC by Sperry-Rand. These computers, both developed for atomic energy laboratories, could handle an enormous amount of data, a capability much in demand by atomic scientists. The machines were costly and tended to be too powerful for the business sector's computing needs, which limited their appeal. Second generation computers replaced machine language with assembly language, allowing abbreviated programming codes to replace long, difficult binary codes.
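To illustrate the step from machine language to assembly language described above, here is a minimal sketch of a toy assembler in Python. The instruction set (LOAD/ADD/STORE, 4-bit opcode plus 4-bit operand) is entirely hypothetical and does not model any real second-generation machine; it only shows the idea of replacing hand-written bit patterns with mnemonics.

```python
# Toy single-instruction assembler: symbolic mnemonics in,
# raw machine words out. The opcode table is a made-up example.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011}

def assemble(line):
    """Translate 'MNEMONIC operand' into one 8-bit machine word:
    high nibble = opcode, low nibble = operand address."""
    mnemonic, operand = line.split()
    return (OPCODES[mnemonic] << 4) | int(operand)

# A programmer writes readable symbolic code ...
program = ["LOAD 5", "ADD 6", "STORE 7"]
words = [assemble(line) for line in program]

# ... instead of memorising the binary a first-generation coder wrote by hand.
print([f"{w:08b}" for w in words])  # ['00010101', '00100110', '00110111']
```

The assembler is just a lookup table plus bit packing, which is why assemblers appeared so quickly once the idea of symbolic coding took hold.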

Third generation

This era saw several innovations across computer technology: integrated circuits (ICs), semiconductor memories, microprogramming, various forms of parallel processing, and the introduction of operating systems and time-sharing. Integrated circuits progressed gradually: small-scale integration (SSI) circuits (about 10 devices per chip) evolved into medium-scale integration (MSI) circuits (about 100 devices per chip), and multi-layered printed circuits were also developed. Parallelism became the trend of the time, with abundant use of multiple functional units, overlapped CPU and I/O operations, and internal parallelism in both the instruction and data streams. Functional parallelism was first embodied in the CDC 6600, which contained 10 simultaneously operating functional units and 32 independent memory banks. This machine, designed by Seymour Cray, achieved about 1 million floating-point operations per second (1 MFLOPS). Five years later Cray developed the CDC 7600, the first vector processor, which boasted a speed of 10 MFLOPS. The contemporary IBM 360/91 was about twice as fast as the CDC 6600, while the IBM 360/195 was comparable to the CDC 7600. On the language side, this era witnessed the development of CPL, the Combined Programming Language (1963). CPL had many difficult features, so to simplify it Martin Richards developed BCPL, the Basic Combined Programming Language (1967). In 1970 Ken Thompson produced a further simplification, which he called B.

Fourth generation

This generation saw the development of large-scale integration, or LSI (1,000 devices per chip), and very large-scale integration, or VLSI (10,000 devices per chip). These developments enabled an entire processor to fit onto a single chip; in fact, for simple systems, the entire computer, with processor, main memory and I/O controllers, could fit on a single chip. Core memories were replaced by semiconductor memories, and high-speed vector machines such as the Cray-1, Cray X-MP and Cyber 205 dominated the scene. A variety of parallel architectures were developed too, but they remained mostly experimental. As for programming languages, high-level languages such as FP (functional programming) and PROLOG (programming in logic) appeared. These languages were based on a declarative programming style, in which the programmer can leave many details to the compiler or runtime system; languages like PASCAL and C, by contrast, used an imperative style. Two other conspicuous developments of this era were the C programming language and the UNIX operating system. Dennis Ritchie, the author of C, and Ken Thompson together used C to rewrite UNIX for the DEC PDP-11, and this C-based UNIX was then widely used on many computers. Another noteworthy event was the publication in 1982 of a report by Peter D. Lax, sponsored by the US Department of Defense and the National Science Foundation. The Lax report, as it was called, emphasized the need for initiatives and coordinated national attention in the arena of high-performance computing in the US. The immediate response was the establishment of the NSF Supercomputing Centers, including the San Diego Supercomputing Center, the National Center for Supercomputing Applications, the Pittsburgh Supercomputing Center, the John von Neumann Center and the Cornell Theory Center.
These institutes have been instrumental in providing supercomputer time to students, training them, and helping to develop software packages.
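The imperative/declarative split mentioned above can be sketched in a few lines. Python stands in for both sides here (for PASCAL/C-style code on one hand and FP-style code on the other); the function names are illustrative, not from any of the languages discussed.

```python
# Two styles of summing squares: imperative vs. declarative-flavoured.
from functools import reduce

def sum_squares_imperative(xs):
    # Imperative: the programmer spells out every step and state change,
    # as one would in PASCAL or C.
    total = 0
    for x in xs:
        total += x * x
    return total

def sum_squares_declarative(xs):
    # Declarative-flavoured: state *what* the result is (a reduction over
    # squared values) and leave the looping to the runtime, in the spirit
    # of FP-style languages.
    return reduce(lambda acc, x: acc + x * x, xs, 0)

print(sum_squares_imperative([1, 2, 3]))   # 14
print(sum_squares_declarative([1, 2, 3]))  # 14
```

Both compute the same result; the difference is whether the control flow is written out by the programmer or left to the language's evaluation machinery, which is exactly the detail-delegation the text describes.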