Random-access memory (RAM) is a type of computer data storage. A RAM device makes it possible to access data in any order, which makes it very fast to find a specific piece of information. Certain other types of storage are not random-access. For example, a hard disk drive and a CD read and write data in a largely predetermined order: the mechanical design of these devices means that data is accessed consecutively. As a result, the time it takes to find a specific piece of information can vary greatly depending on where it is located on the disk.
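The difference between the two access patterns can be sketched with a toy model (my own illustration, not a benchmark): a random-access store reaches any address in one step, while a sequential store must pass over every preceding position.

```python
# Toy model of the two access patterns described above.
# The "steps" counter stands in for access time.

def random_access(store, address):
    # One step regardless of position, like RAM.
    return store[address], 1

def sequential_access(store, address):
    # Must step past every preceding item, like a tape or a
    # spinning disk seeking along a track; cost grows with position.
    steps = 0
    for i, value in enumerate(store):
        steps += 1
        if i == address:
            return value, steps

store = list(range(1000))
print(random_access(store, 900))      # (900, 1)
print(sequential_access(store, 900))  # (900, 901)
```

The point of the sketch is only that sequential cost depends on where the data sits, while random-access cost does not.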

RAM devices are used in computer systems as the main memory. RAM is considered volatile memory, which means that the stored information is lost when there is no power. RAM is therefore used by the central processing unit (CPU) while a computer is running to hold information that must be accessed very quickly, but it does not store any information permanently.

Present-day RAM devices use integrated circuits to store information. This is a relatively expensive form of storage, and the cost per unit of storage (dollars per gigabyte) is much higher than for devices like a hard drive. However, access times for RAM are so much shorter that the speed outweighs the cost. A computer therefore uses a certain amount of RAM for fast-access, temporary storage of information and a much larger amount of slower, permanent mass storage such as a hard disk drive. For example, a typical computer system may have 2-8 GB (gigabytes) of RAM, while the storage capacity of the hard disk drive can be several hundred GB or even a terabyte (TB).
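The cost comparison is simple arithmetic. The prices below are made-up round numbers chosen only to illustrate the calculation, not figures from the text:

```python
# Illustrative cost-per-gigabyte comparison with assumed prices.
ram_price, ram_gb = 40.0, 8       # assume $40 for an 8 GB RAM module
hdd_price, hdd_gb = 50.0, 1000    # assume $50 for a 1 TB hard drive

ram_cost_per_gb = ram_price / ram_gb   # 5.0 dollars per GB
hdd_cost_per_gb = hdd_price / hdd_gb   # 0.05 dollars per GB

# With these assumed prices, RAM costs roughly 100x more per gigabyte.
print(ram_cost_per_gb / hdd_cost_per_gb)
```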

A Bit of History

The earliest forms of RAM go back to the first computers of the 1940s and 1950s. Magnetic-core memory relied on an array of magnetizable rings. Data could be stored by magnetizing each ring individually, and each ring was wired separately, which resulted in fairly large installations. A single ring could store a single bit of data, with the direction of magnetization indicating 0 or 1.

Technological advances resulted in smaller devices that could store more information but relied on the same principle. The memory unit in the photograph below is about 10 x 10 cm and could store 1,024 bits. That capacity is tiny by today's standards, but it was state-of-the-art in the 1960s.

The real breakthrough for computer memory came in the 1970s with the invention of solid-state memory in integrated circuits. Solid-state memory uses very small transistors, making it possible to store far more information in a very small area. However, this increase in memory density came at the cost of volatility: a constant power supply is needed to maintain the state of each transistor. Today's RAM still relies on this same principle.

Types of RAM

Several types of RAM are in use today. Dynamic RAM (DRAM) is by far the most widely used. It stores each bit of data using a transistor and capacitor pair; combined, they form a single memory cell. The capacitor holds a low or a high charge, representing a 0 or a 1, respectively. Static RAM (SRAM) uses four to six transistors wired together as a flip-flop to store a single bit of data; the circuit's two stable states represent 0 and 1. The term static refers to the fact that an SRAM cell maintains its state without having to be refreshed on a regular basis. Dynamic RAM, on the other hand, must be refreshed periodically to maintain the small electric charge on each capacitor. Both types of RAM, however, are volatile in the sense that they lose their information when the power supply is removed.
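The refresh requirement of DRAM can be sketched in a few lines. This is a deliberate simplification of my own, not a model of real hardware: the charge on the cell's capacitor leaks away a little at each time step, and a refresh reads the bit and writes it back at full strength before it decays past the read threshold.

```python
# Minimal sketch of a DRAM cell (a simplification, not real hardware):
# the stored charge leaks over time and must be refreshed to keep the bit.

class DRAMCell:
    THRESHOLD = 0.5  # charge above this level reads as 1

    def __init__(self):
        self.charge = 0.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def leak(self, steps=1):
        # The capacitor loses some charge every time step.
        for _ in range(steps):
            self.charge *= 0.8

    def refresh(self):
        # Read the current value, then write it back at full strength.
        self.write(self.read())

    def read(self):
        return 1 if self.charge > self.THRESHOLD else 0

cell = DRAMCell()
cell.write(1)
cell.leak(2)        # charge decays to 0.64, still reads as 1
cell.refresh()      # restored to full charge
cell.leak(5)        # charge decays to about 0.33 with no refresh
print(cell.read())  # 0 -- the bit has been lost
```

An SRAM cell, by contrast, would hold its state in the feedback of the flip-flop and need no such refresh, which is what "static" means here.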