Sir Tim Berners-Lee
Tim Berners-Lee was born on 8 June 1955.
He is the inventor of the World Wide Web and director of the World Wide Web Consortium (W3C).
As a child he was fascinated by trains, and tinkering with a model railway led to his interest in electronics. He went to the University of Oxford, where he received a degree in physics.
In 2009, Berners-Lee worked with the UK government to make government data more open and accessible on the web.
He received a knighthood from Queen Elizabeth II in 2004.
The First Generation
First Generation (1940-1956) Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.
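The idea of a machine language, where the computer executes one primitive numeric instruction at a time, can be sketched with a toy interpreter. The instruction set below is entirely made up for illustration and is far simpler than any real first-generation machine.

```python
# Toy "machine language" interpreter (hypothetical instruction set).
# Each instruction is a numeric opcode plus an operand, mirroring how
# early computers executed one low-level operation at a time.
# Opcodes: 1 = LOAD value into accumulator, 2 = ADD value, 3 = HALT.

def run(program):
    acc = 0            # single accumulator register
    pc = 0             # program counter
    while True:
        opcode, operand = program[pc]
        if opcode == 1:        # LOAD
            acc = operand
        elif opcode == 2:      # ADD
            acc += operand
        elif opcode == 3:      # HALT
            return acc
        pc += 1

# A "program" written as raw numbers: LOAD 2, ADD 3, HALT
result = run([(1, 2), (2, 3), (3, 0)])
print(result)  # 5
```

A programmer on a first-generation machine worked at this level directly, with no symbolic names at all, which is why machine language is called the lowest-level programming language.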
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer in the United States; the first unit was delivered to the U.S. Census Bureau in 1951.
The Second Generation
Second Generation (1956-1963) Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. The first computers of this generation were developed for the atomic energy industry.
The Fourth Generation
Fourth Generation (1971-Present) Microprocessors brought in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What filled an entire room in the first generation could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer, from the central processing unit and memory to input/output controls, on a single chip.
In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh.
The Fifth Generation
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.
DID YOU KNOW...?
An integrated circuit is a small electronic device made out of semiconductor material. The first integrated circuits were developed independently in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor.
Alan Mathison Turing, OBE (23 June 1912 – 7 June 1954), was a pioneering British computer scientist, mathematician, logician, cryptanalyst, philosopher, mathematical biologist, and marathon and ultra-distance runner. He was highly influential in the development of computer science, providing a formalisation of the concepts of "algorithm" and "computation" with the Turing machine, which can be considered a model of a general-purpose computer. Turing is widely considered to be the father of theoretical computer science and artificial intelligence.
He died of cyanide poisoning at the age of 41; an inquest ruled his death a suicide. In 2009 the Prime Minister, Gordon Brown, made an official public apology for the way Turing had been treated.
What is a Monitor?
The monitor is the piece of computer hardware that displays the video and graphics information generated by the computer through the video card.
Monitors are very similar to televisions but usually display information at a much higher resolution.
Monitors are display devices external to the computer case that connect via a cable to a port on the video card or motherboard. Even though the monitor sits outside the main computer housing, it is an essential part of the complete system.
Monitors come in two major types: LCD and CRT. CRT monitors look much like old-fashioned televisions and are very deep in size. LCD monitors are much thinner, use less energy, and provide better image quality.
LCD monitors have largely replaced CRT monitors thanks to their higher image quality, smaller footprint on the desk, and falling prices.
Most monitors are in a widescreen format and range in size from 17" to 24" or more. This size is a diagonal measurement from one corner of the screen to the other.
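Because the quoted size is a diagonal, the actual width and height of a widescreen (16:9) monitor follow from Pythagoras' theorem. A small sketch of that arithmetic, with a 24" monitor as a worked example:

```python
import math

# For a 16:9 widescreen monitor, the advertised size is the diagonal.
# Width and height follow from Pythagoras: w**2 + h**2 == d**2, with w:h = 16:9.
def width_height(diagonal_inches, aspect_w=16, aspect_h=9):
    unit = diagonal_inches / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

w, h = width_height(24)            # a 24" widescreen monitor
print(round(w, 1), round(h, 1))    # roughly 20.9 x 11.8 inches
```

So a "24-inch" widescreen monitor is only about 21 inches wide, which is worth knowing when checking whether one will fit on a desk.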
A hard drive is nothing more than a magnetised storage area. Your Operating System is stored on your hard drive, as well as all your software programmes, like Microsoft Word. The hard drive is actually a few circular disks stacked one on top of the other. A small arm with a read/write head moves over the disks, writing data to these circular platters and reading it back. When you save a file or create a new folder, think of these circular platters being written to and you'll have a basic idea of just what your hard drive is.
A hard drive is given a letter of the alphabet for convenience's sake, and in most computers this will be the letter C. That's why the hard drive is popularly known as the C drive. There are other drives on your computer, such as CD/DVD and USB drives, and each is assigned its own letter.
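The drive letter is simply the first part of any Windows file path. Python's standard library can pull it out of a path, which makes the idea concrete (the path below is a made-up example):

```python
from pathlib import PureWindowsPath

# On Windows, the drive letter is the first component of every file path.
# PureWindowsPath parses Windows-style paths on any operating system.
p = PureWindowsPath("C:/Users/alice/letter.docx")  # hypothetical path
print(p.drive)  # C:
```

Every file you save ends up under one of these letters, with C normally being the hard drive that holds the operating system.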
Alternatively referred to as the brain of the computer, the processor, or the microprocessor, the CPU (short for Central Processing Unit) is responsible for handling all instructions it receives from hardware and software running on the computer. The first single-chip CPU was developed at Intel, with the help of Ted Hoff, in the early 1970s.
Note: Many new computer users mistakenly refer to their computer case, and sometimes their monitor, as the CPU. It is proper to refer to these as the "computer" or "monitor", not the CPU.
RAM and ROM
A computer uses two types of storage: a main store consisting of ROM and RAM, and backing stores, which can be internal (eg a hard disk) or external (eg a CD or USB flash drive).
Main store (or computer memory) is divided into Read Only Memory (ROM) and Random Access Memory (RAM).
ROM is memory that cannot be changed by a program or user. ROM retains its memory even after the computer is turned off. For example, ROM stores the instructions for the computer to start up when it is turned on again.
RAM is a fast temporary type of memory in which programs, applications and data are stored. Here are some examples of what's stored in RAM:
- the operating system
- the graphical user interface (GUI)
If a computer loses power, all data stored in its RAM is lost.
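The difference between volatile RAM and a persistent backing store can be sketched in a few lines of Python: a variable lives only in RAM, while a file written to disk survives after the program's copy in memory is gone. The file name here is a temporary one created just for the demonstration.

```python
import os
import tempfile

# A Python variable is held in RAM only; a file lives on the backing store.
data_in_ram = "unsaved work"           # exists in RAM, nowhere else

fd, path = tempfile.mkstemp()          # create a file on disk
with os.fdopen(fd, "w") as f:
    f.write("saved work")              # written to the backing store

del data_in_ram                        # like a power cut: the RAM copy is gone

with open(path) as f:                  # the disk copy is still there
    recovered = f.read()
print(recovered)  # saved work
os.remove(path)                        # tidy up the temporary file
```

This is exactly why you are told to save your work: clicking Save copies the document from RAM onto the hard drive, where a power cut cannot touch it.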
Alternatively referred to as the mb, mainboard, mobo, backplane board, base board, main circuit board, planar board, system board, or, on Apple computers, the logic board, the motherboard is a printed circuit board that forms the foundation of a computer, located at the bottom of the computer case. It allocates power to the CPU, RAM, and all other computer hardware components. Most importantly, the motherboard allows hardware components to communicate with one another.
(Pictured: the ASUS P5AD2-E motherboard, with each major component labelled.)
What is a Raspberry Pi?
The Raspberry Pi is a low cost, credit-card sized device that plugs into a computer monitor or TV, and uses a standard keyboard and mouse. It is a capable little device that enables people of all ages to explore computing, and to learn how to program in languages like Scratch and Python. It’s capable of doing everything you’d expect a desktop computer to do, from browsing the internet and playing high-definition video, to making spreadsheets, word-processing, and playing games.
What’s more, the Raspberry Pi has the ability to interact with the outside world, and has been used in a wide array of digital maker projects, from music machines and parent detectors to weather stations and tweeting birdhouses with infra-red cameras. We want to see the Raspberry Pi being used by kids all over the world to learn to program and understand how computers work.
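On a real Raspberry Pi, a library such as gpiozero supplies readings from sensors wired to the GPIO pins. The decision logic of a project like the "parent detector" can be sketched in plain Python, though, with no hardware needed; the sensor values and threshold below are made-up illustrative numbers.

```python
# Pure-Python sketch of the decision logic a "parent detector" might use.
# The readings and threshold are hypothetical; on a real Raspberry Pi the
# readings would come from a motion sensor attached to the GPIO pins.

MOTION_THRESHOLD = 0.5   # hypothetical sensitivity level

def should_trigger(reading):
    """Return True when a motion reading (0.0 to 1.0) crosses the threshold."""
    return reading >= MOTION_THRESHOLD

readings = [0.1, 0.2, 0.8, 0.3]         # simulated sensor samples
alerts = [should_trigger(r) for r in readings]
print(alerts)  # [False, False, True, False]
```

In a real project, a True result might switch on the camera or play a sound; the point is that a few lines of beginner Python are enough to connect the Pi's sensors to an action.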
3D printing or additive manufacturing is a process of making three dimensional solid objects from a digital file. The creation of a 3D printed object is achieved using additive processes. In an additive process an object is created by laying down successive layers of material until the entire object is created. Each of these layers can be seen as a thinly sliced horizontal cross-section of the eventual object.
It all starts with making a virtual design of the object you want to create. This virtual design is made in a CAD (Computer Aided Design) file using a 3D modeling program (for the creation of a totally new object) or with the use of a 3D scanner (to copy an existing object). The scanner makes a 3D digital copy of the object and puts it into a 3D modeling program.
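The slicing step described above is simple arithmetic: the slicer divides the model's height into horizontal layers of a fixed thickness. A minimal sketch, using a typical hobbyist layer height as an example value:

```python
import math

# Sketch of the slicing arithmetic: the object's height divided by the
# layer height gives the number of horizontal cross-sections the printer
# will lay down, rounding up so the top of the object is covered.
def layer_count(object_height_mm, layer_height_mm):
    return math.ceil(object_height_mm / layer_height_mm)

print(layer_count(50.0, 0.25))  # a 50 mm tall object at 0.25 mm layers -> 200
```

Thinner layers give a smoother surface but multiply the print time, which is why choosing the layer height is one of the main trade-offs when preparing a 3D print.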