Tim Berners-Lee

Tim Berners-Lee is a British computer scientist and the inventor of the World Wide Web, the system of hypertext links that made the Internet accessible to mass audiences. He wrote the original Web software himself in 1990 and made it available on the Internet in 1991. He joined MIT's Laboratory for Computer Science in 1994 and remains a leading authority on Internet issues. His 1999 book Weaving the Web described the Web's birth and growth. In 2001 he became a Fellow of the Royal Society, and in 2003 Queen Elizabeth II announced that he would be made a Knight Commander of the Order of the British Empire; he received the knighthood in 2004.

In 2004, Berners-Lee was also awarded the first Millennium Technology Prize, a Finnish award for excellence that carries a cash prize of one million euros.


Alan Turing

Alan Turing was born on 23 June 1912 in London. His father was in the Indian Civil Service, and Turing's parents lived in India until his father's retirement in 1926; Turing and his brother stayed with friends and relatives in England. Turing studied mathematics at Cambridge University, and subsequently taught there, working in the burgeoning field of quantum mechanics. It was at Cambridge that he developed the proof that automatic computation cannot solve all mathematical problems. The abstract machine he devised for this proof, now known as the Turing machine, is considered the basis for the modern theory of computation.
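The idea of the Turing machine can be sketched in a few lines of code. This is a minimal illustration, not Turing's original formulation, and the "flip" rule table is invented for the example: the machine reads a tape of symbols, follows a rule table, and halts.

```python
# Minimal Turing machine simulator (illustrative sketch).
# A machine is a rule table: (state, symbol) -> (new_symbol, move, new_state).

def run_turing_machine(rules, tape, state="start", steps=100):
    """Run until the machine enters the 'halt' state or runs out of steps."""
    tape = list(tape)
    pos = 0
    for _ in range(steps):
        if state == "halt":
            break
        if pos == len(tape):          # extend the tape with blanks as needed
            tape.append("_")
        symbol = tape[pos]
        new_symbol, move, state = rules[(state, symbol)]
        tape[pos] = new_symbol
        pos += 1 if move == "R" else -1
    return "".join(tape)

# A made-up machine that inverts every bit, then halts at the blank symbol.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip, "1011_"))  # -> 0100_
```

Even this tiny machine shows the key insight: a single general-purpose device can carry out any computation, given the right rule table.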

The 5 Generations Of Computers

The First Gen

The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
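To give a feel for what machine language looks like, here is a sketch using a made-up 8-bit instruction format (the opcodes and layout are invented for illustration, not taken from any real first-generation machine): the top four bits name the operation and the bottom four hold an operand, much as one row of holes on a punched card encoded one instruction.

```python
# Decode a tiny, invented 8-bit "machine language" (illustration only).
# Format: top 4 bits = opcode, bottom 4 bits = operand.

OPCODES = {0b0001: "LOAD", 0b0010: "ADD", 0b0011: "STORE", 0b1111: "HALT"}

def decode(word):
    """Translate one 8-bit instruction word into a readable mnemonic."""
    opcode = word >> 4           # top 4 bits
    operand = word & 0b1111      # bottom 4 bits
    return f"{OPCODES[opcode]} {operand}"

program = [0b0001_0101, 0b0010_0011, 0b0011_0110, 0b1111_0000]
for word in program:
    print(f"{word:08b}  ->  {decode(word)}")
```

Programmers of first generation machines had to write the raw bit patterns on the left themselves; mnemonics like the ones on the right only arrived with assemblers.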

The Second Gen

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

The Third Gen

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on chips of silicon, a semiconductor material, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.
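The way an operating system lets many applications run "at one time" can be sketched as a round-robin scheduler. This toy example (the program names are invented) gives each program one turn at a time, so their work appears interleaved:

```python
# A toy round-robin scheduler: a sketch of time-sharing, the trick an
# operating system uses to run many programs at once on one processor.
from collections import deque

def app(name, steps):
    """A pretend application, written as a generator of work steps."""
    for i in range(steps):
        yield f"{name} step {i}"

def run_os(apps):
    ready = deque(apps)                 # the OS keeps a queue of runnable programs
    log = []
    while ready:
        current = ready.popleft()
        try:
            log.append(next(current))   # give this program one time slice
            ready.append(current)       # then send it to the back of the queue
        except StopIteration:
            pass                        # program finished; drop it from the queue
    return log

print(run_os([app("editor", 2), app("printer", 2)]))
```

The output interleaves the two programs' steps, which is exactly the illusion of simultaneity that third-generation operating systems created.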

The Fourth Gen

The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.

The Fifth Gen

Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality.
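The general idea of parallel processing, splitting one job across several processors working at the same time, can be sketched as follows. This is a generic illustration, not an example from any actual AI system; the prime-checking task is just a stand-in for heavy work:

```python
# A sketch of parallel processing: the same task split across several
# worker processes (here, checking a batch of numbers for primality).
from concurrent.futures import ProcessPoolExecutor

def is_prime(n):
    """Trial-division primality test (the 'heavy work' for each worker)."""
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

if __name__ == "__main__":
    numbers = [4, 7, 9, 11]
    # Each number can be checked by a different processor core at once.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(is_prime, numbers))
    print(dict(zip(numbers, results)))
```

The results are the same as checking the numbers one by one; the gain is that independent pieces of work finish sooner when several processors share them.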


Drives

A drive is a device that stores and reads data on a medium that is not easily removed, such as a hard disk. In Microsoft Windows, the available drives are listed in My Computer.


Monitors

Alternatively referred to as a video display terminal (VDT) or video display unit (VDU), a monitor comprises a display screen for video images and the casing that holds it. In its most common usage, monitor refers only to devices that contain no electronic equipment other than what is essentially needed to display and adjust the characteristics of an image.


The CPU

Alternatively referred to as the brain of the computer, the processor, central processor, or microprocessor, the CPU (short for Central Processing Unit) was first developed at Intel with the help of Ted Hoff in the early 1970s. The CPU is responsible for handling all instructions it receives from hardware and software running on the computer.
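How a CPU "handles instructions" can be sketched as the fetch-decode-execute cycle. The toy processor below is an invented illustration with just one register and three made-up instructions, not a model of any real chip:

```python
# A toy CPU: the fetch-decode-execute cycle in miniature (illustration only).

def run_cpu(program):
    acc = 0                       # accumulator: the CPU's single register
    pc = 0                        # program counter: which instruction is next
    while pc < len(program):
        op, arg = program[pc]     # fetch and decode the next instruction
        pc += 1
        if op == "LOAD":          # execute it...
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "HALT":
            break
    return acc

# LOAD 2, ADD 3, HALT  ->  the accumulator ends at 5
print(run_cpu([("LOAD", 2), ("ADD", 3), ("HALT", 0)]))  # -> 5
```

A real CPU does exactly this, billions of times per second, with far more instructions and registers.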

RAM Vs ROM

There is one major difference between a ROM and a RAM chip. A ROM chip is non-volatile storage and does not require a constant source of power to retain information stored on it. When power is lost or turned off, a ROM chip will keep the information stored on it. In contrast, a RAM chip is volatile and requires a constant source of power to retain information. When power is lost or turned off, a RAM chip will lose the information stored on it.
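The volatile/non-volatile difference can be illustrated with a short sketch. The `MemoryChip` class and its contents are invented for the example; real chips obviously don't work in Python:

```python
# A sketch of volatile (RAM) versus non-volatile (ROM) storage.

class MemoryChip:
    def __init__(self, contents, volatile):
        self.volatile = volatile
        self.contents = dict(contents)

    def power_off_on(self):
        """Simulate a power cycle: volatile memory loses everything."""
        if self.volatile:
            self.contents.clear()

ram = MemoryChip({"draft": "unsaved work"}, volatile=True)
rom = MemoryChip({"firmware": "boot code"}, volatile=False)

ram.power_off_on()
rom.power_off_on()
print(ram.contents)   # the RAM chip lost its data
print(rom.contents)   # the ROM chip kept its data
```

This is why a computer can boot at all: the start-up instructions sit in non-volatile memory and survive being switched off.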


The Motherboard

Alternatively referred to as the mainboard, mobo, backplane board, base board, main circuit board, planar board, system board, or (on Apple computers) the logic board, the motherboard is a printed circuit board that is the foundation of a computer. It allocates power to the CPU, RAM, and all other computer hardware components. Most importantly, the motherboard allows hardware components to communicate with one another.


Input/Output

Alternatively referred to as I/O, input/output describes any software or hardware device that is designed to send or receive data to and from a computer. For example, a computer mouse is only an input device: it can send data to the computer but doesn't receive any back. A computer monitor is an output device: it can display information but doesn't send data back to the computer. A good example of a combined I/O device is a modem, which connects to the Internet and both sends and receives information.

Touch Screens

A touchscreen is a monitor or other flat surface with a sensitive panel directly on the screen that registers the touch of a finger as an input. Instead of being touch-sensitive, some touchscreens use beams across the screen to create a grid that senses the presence of a finger.
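The beam-grid idea can be sketched in code. In this hypothetical example, the screen reports which horizontal and vertical beams a finger has interrupted, and the crossing point of the blocked beams gives the touch position:

```python
# Sketch of a beam-grid touchscreen: light beams cross the screen in a
# grid, a finger blocks some of them, and the intersection of the blocked
# column beams and row beams locates the touch.

def locate_touch(blocked_cols, blocked_rows):
    """Return the (x, y) position where the blocked beams intersect."""
    if not blocked_cols or not blocked_rows:
        return None                                   # no touch detected
    x = sum(blocked_cols) / len(blocked_cols)         # centre of blocked columns
    y = sum(blocked_rows) / len(blocked_rows)         # centre of blocked rows
    return (x, y)

# A fingertip wide enough to block columns 4 and 5, and row 7:
print(locate_touch([4, 5], [7]))  # -> (4.5, 7.0)
```

Averaging the blocked beams also shows why a wide fingertip still produces a single, sensible touch point.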

Raspberry Pi

First released in 2012, the Raspberry Pi is a single-board computer created by the Raspberry Pi Foundation in the United Kingdom. It was designed to be a small, cheap computer that could be used in schools to teach computer science to children. Since its introduction, almost four million Raspberry Pis have been sold.

These machines use a 700 MHz ARM processor, and models come equipped with either 256MB or 512MB of RAM. Depending on the model, they have between one and four USB 2.0 ports. Raspberry Pis use an SD or MicroSD card for non-volatile storage, and use HDMI for digital audio and video output. The Model A has no on-board network interface, while the Model B includes an Ethernet port; both support wireless networking via a USB adapter.


3D Printer

Created by Charles W. Hull in 1984, the 3D printer is a sophisticated printing device that builds a physical object from a digital design, using materials such as metal alloys, polymers, or plastics.

An object's design typically begins in a computer aided design (CAD) software system, where its blueprint is created. The blueprint is then sent from the CAD system to the printer in a file format known as STL (stereolithography), which is widely used by CAD systems to describe 3D objects. The printer then reads the blueprint in cross-sections and begins the process of recreating the object just as it appears in the computer aided design. One example of a 3D printer is the FlashForge.
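An STL file is surprisingly simple: it is just a list of triangles, each with a surface normal and three corner points. The sketch below writes a single-triangle ASCII STL file from Python (real CAD tools usually export a binary variant of the same format; the `triangle_to_stl` helper is invented for this example):

```python
# Build a minimal ASCII STL description of one triangle (illustration).
# An STL file lists triangular facets, each with a normal and 3 vertices.

def triangle_to_stl(name, normal, vertices):
    lines = [f"solid {name}",
             f"  facet normal {normal[0]} {normal[1]} {normal[2]}",
             "    outer loop"]
    for x, y, z in vertices:
        lines.append(f"      vertex {x} {y} {z}")
    lines += ["    endloop",
              "  endfacet",
              f"endsolid {name}"]
    return "\n".join(lines)

# One triangle lying flat in the z = 0 plane, facing upward:
stl = triangle_to_stl("demo", (0, 0, 1), [(0, 0, 0), (1, 0, 0), (0, 1, 0)])
print(stl)
```

A real model is simply thousands of these facets, which is why the printer can rebuild the object one thin cross-section at a time.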


Evaluation by Finn O'Sullivan

You have put in a lot of information and added photos, but you could have put them in more relevant places, e.g. putting the first-generation-of-computers picture next to the first-generation text, and not putting the picture for a computer generation before you've finished writing about it. Also, you could have made the buttons look nicer.

But overall it was very factual and interesting.