6 Gens of Computers!
An information leaflet on how the computer evolved
The First Gen
On the better side... wait, there is no good side!!!
In short, the first gen computers were simply bad. VERY VERY VERY BAD.
The first programmable electronic computer, Colossus, was designed in 1943 by Tommy Flowers. It was used to crack German ciphers.
Tommy Flowers
Website: http://www.historylearningsite.co.uk/tommy_flowers.htm
The Second Gen
The transition from the first generation to the second generation of computers was not sudden. There was huge development in technology, designs and programming languages. Transistor technology formed the basis of the electronic switches, and switching times came down to around 0.3 microseconds.
In the field of programming languages, there were smart introductions like FORTRAN (1956), ALGOL (1958) and COBOL (1959). The second generation also witnessed the development of two supercomputers, i.e. the most powerful machines of their day. These two were the Livermore Atomic Research Computer (LARC) and the IBM 7030. These machines overlapped memory operations with processor operations and had a primitive form of parallel processing. Some of the important commercial machines of this era were the IBM 704, 709 and 7094. The latter introduced I/O processing.
The second gen lasted from 1954 to 1962.
The Third Gen
A third generation computer is one built with small-scale integration (SSI) integrated circuits, designed after the mid-1960s.
Third generation computers use semiconductor memories in addition to, and later instead of, ferrite core memory. The two main types of semiconductor memory are Read-Only Memory (ROM) and read-and-write memories called random-access memory (RAM).
A technique called microprogramming became widespread; it simplified the design of CPUs and increased their flexibility. It also made possible the development of operating systems as software.
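To make the idea concrete, here is a toy sketch (all names invented, not any real CPU's microcode) of how microprogramming works: each machine instruction is defined as a short list of micro-operations stored in a table, so changing or extending the instruction set means editing the table rather than rewiring hardware.

```python
# Hypothetical microcode table: machine instruction -> micro-operations.
microcode = {
    "ADD": ["fetch_operands", "alu_add", "write_back"],
    "LOAD": ["compute_address", "read_memory", "write_back"],
}

def run(instruction):
    """Return the sequence of micro-operations a real CPU would execute."""
    trace = []
    for micro_op in microcode[instruction]:
        trace.append(micro_op)  # a hardware control unit would perform this step
    return trace

print(run("ADD"))  # ['fetch_operands', 'alu_add', 'write_back']
```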
A variety of techniques for improving processing efficiency were invented, such as pipelining (overlapping the stages of several instructions so parts of them are processed at the same time) and multiprocessing (execution of multiple programs at the same time).
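The benefit of pipelining can be put in numbers with a simple (idealised) cycle count: with S stages, N instructions finish in N + S - 1 cycles instead of N * S, because every stage works on a different instruction at once.

```python
def sequential_cycles(n_instructions: int, n_stages: int) -> int:
    """Cycles needed when each instruction runs all stages alone."""
    return n_instructions * n_stages

def pipelined_cycles(n_instructions: int, n_stages: int) -> int:
    """Cycles needed when each stage works on a different instruction."""
    return n_instructions + n_stages - 1

# 100 instructions on an idealised 5-stage pipeline:
print(sequential_cycles(100, 5))  # 500 cycles without pipelining
print(pipelined_cycles(100, 5))   # 104 cycles with pipelining
```

Real pipelines lose some of this speedup to hazards and stalls, but the basic arithmetic explains why the technique mattered.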
As the execution of a program requires that program to be in memory, running several programs at the same time requires that all of them be in memory simultaneously. Thus the development of techniques for concurrent processing was matched by the development of memory management techniques such as dynamic memory allocation, virtual memory, and paging.
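Paging is the key trick that lets several programs share memory safely. A minimal sketch (with an invented page table, not any real OS's layout) of how a virtual address is split into a page number and an offset, then mapped to a physical frame:

```python
PAGE_SIZE = 4096  # 4 KiB pages, a common choice

# Hypothetical page table: virtual page number -> physical frame number
page_table = {0: 7, 1: 3, 2: 9}

def translate(virtual_address: int) -> int:
    """Map a virtual address to a physical address via the page table."""
    page = virtual_address // PAGE_SIZE   # which virtual page
    offset = virtual_address % PAGE_SIZE  # position inside the page
    frame = page_table[page]              # look up the physical frame
    return frame * PAGE_SIZE + offset

print(translate(4100))  # page 1, offset 4 -> frame 3 -> 12292
```

Because each program gets its own page table, two programs can use the same virtual addresses without clashing in physical memory.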
The ILLIAC IV is an example of a third generation computer.
The CTSS (Compatible Time-Sharing System) was developed at MIT in the early 1960s and had a considerable influence on the design of subsequent time-sharing operating systems.
The Fourth Gen
In this generation there were developments of large-scale integration, or LSI (1,000 devices per chip), and very large-scale integration, or VLSI (10,000 devices per chip). These developments allowed an entire processor to fit onto a single chip; in fact, for simple systems, the entire computer (processor, main memory and I/O controllers) could fit on a single chip.
Core memories were now replaced by semiconductor memories, and high-speed vector processors dominated the scene. A few such machines were the Cray-1, Cray X-MP and Cyber 205.
As far as programming languages are concerned, there was development of high-level languages like FP (functional programming) and PROLOG (programming in logic). These languages were based on a declarative programming style, where the programmer can leave many details to the system. Alternatively, languages like Pascal and C used an imperative style.
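The difference between the two styles can be shown with the same computation written both ways (here sketched in Python rather than FP or Pascal): the functional version states *what* is wanted, the imperative one spells out *how* to get it step by step.

```python
def sum_even_squares_functional(numbers):
    # Declarative flavour: describe the result as a composition of operations.
    return sum(n * n for n in numbers if n % 2 == 0)

def sum_even_squares_imperative(numbers):
    # Imperative flavour: manage the loop and the accumulator explicitly.
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

data = [1, 2, 3, 4, 5, 6]
print(sum_even_squares_functional(data))  # 56 (4 + 16 + 36)
print(sum_even_squares_imperative(data))  # 56
```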
Another event worth mentioning was the publication in 1982 of a report by Peter D. Lax, sponsored by the US Department of Defense and the National Science Foundation. The Lax report, as it was called, emphasized the need for initiatives and coordinated national attention in the arena of high-performance computing in the US.
P.S. For the full version please check the video description for the links
The Fifth Gen
The Fifth Generation Computer Systems (FGCS) project was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create computers using massively parallel computing. It was the result of a massive government/industry research project in Japan during the 1980s. It aimed to achieve supercomputer-like performance and to provide a platform for future developments in artificial intelligence. There was also an unrelated Russian project named the fifth-generation computer (see Kronos (computer)).
The Sixth Gen
Of all the changes that have taken place in the field of computer technology, some were sudden whereas others took time. In the current period, the transition from one generation to another is less clear-cut, because most changes are gradual advancements of an already established system. The present generation of computer technology is closely tied to parallel computing, and growth has been noticed in this area both on the hardware side and in the better understanding of how to develop algorithms that make full use of massively parallel architectures.
Though vector systems are still in use, it is often thought that the future will be dominated by parallel systems. However, there are several machines that combine parallel and vector designs. Fujitsu is planning to build a system with more than 200 vector processors. Another goal of this sixth generation is to reach a teraflop, i.e. a trillion (10^12) arithmetic operations per second, which could be achieved by building a system with more than a thousand processors. Currently, processors are constructed with a combination of RISC, pipelining and parallel processing.
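A quick back-of-envelope check of that teraflop goal: splitting 10^12 operations per second across roughly a thousand processors means each one only needs to sustain about a gigaflop, which is why the thousand-processor route looked feasible.

```python
TERAFLOP = 10**12          # one trillion floating-point operations per second
processors = 1000          # the scale of machine mentioned above

per_processor = TERAFLOP / processors
print(per_processor)       # 1000000000.0, i.e. one gigaflop per processor
```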
Networking technology is spreading rapidly, and one of the most notable developments of the sixth generation is the huge growth of WANs. For regional networks, T1 is the standard, and the national "backbone" uses T3 to interconnect the regional networks. Finally, the rapid advancement and high level of awareness regarding computer technology owes a great deal to two laws. Just like the Lax report of 1982, the High Performance Computing Act of 1991 and the Information Infrastructure and Technology Act of 1992 have strengthened and ensured high-performance computing. The former established the High Performance Computing and Communications Program (HPCCP), and the latter reinforced the need to make leading-edge technologies available to students from kindergarten up to graduate level.
Apple Worldwide Developers Conference
Apple's renowned developer community will come together at WWDC to learn about the future of iOS and OS X.
WWDC features more than 100 technical sessions, over 1,000 Apple engineers, hands-on labs, and the Apple Design Awards.
Developers can apply for tickets to attend WWDC and millions worldwide will be able to watch sessions streamed live.
WWDC Scholarships are available to students and members of participating STEM organizations around the world.
Monday, Jun 8, 2015, 09:00 PM
San Francisco, CA, United States
Tim Berners-Lee
Website: http://www.w3.org/People/Berners-Lee/
Twitter: @timberners_lee
Alan Turing
Website: http://www.turing.org.uk/
Bill Gates
Website: http://www.gatesnotes.com/
Facebook: https://www.facebook.com/BillGates
Twitter: @BillGates