Computer History.

Integrated Circuits.

Transistors were miniaturized and placed on silicon chips, called integrated circuits, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Transistor Computers.

Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. Second-generation computers were smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

Microprocessors.

The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use them.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Artificial Intelligence.

Fifth generation computing devices, based on artificial intelligence, are still in development, though some applications, such as voice recognition, are already in use today.