An introduction to binary
Binary is the most basic language a computer understands: everything it stores and processes ultimately comes down to a series of 1s and 0s. Early computers read these values from punched cards, where a hole in the card meant a 1 and no hole meant a 0. Each 1 or 0 is called a bit, and bits are commonly grouped eight at a time. Each position in an 8-bit group has a place value: starting from the rightmost bit, the values are 1, 2, 4, 8, 16, 32, 64, and 128, with each place worth double the one before it. To find the value of a binary number, you add up the place values of every position that holds a 1. For example, 00001001 has 1s in the places worth 8 and 1, so it equals 9.
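The add-up-the-place-values method above can be sketched in Python. This is a minimal illustration (the function name is my own choice, not anything standard):

```python
# Convert a binary string to its decimal value by adding the place
# values (1, 2, 4, 8, 16, 32, 64, 128 from right to left) wherever
# there is a 1.
def binary_to_decimal(bits):
    total = 0
    place_value = 1  # the rightmost bit is worth 1
    for bit in reversed(bits):
        if bit == "1":
            total += place_value
        place_value *= 2  # each place is double the one before it
    return total

print(binary_to_decimal("00001001"))  # 8 + 1 = 9
```

Python's built-in `int("00001001", 2)` does the same conversion, but writing it out shows where the place values come in.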
Bits build up into larger units: 4 bits make 1 nibble, 8 bits make 1 byte, 1024 bytes make 1 kilobyte, 1024 kilobytes make 1 megabyte, 1024 megabytes make 1 gigabyte, and 1024 gigabytes make 1 terabyte. A common mistake is to assume the units always go up by 1000, but from bytes onward they go up by 1024, which looks like an awkward number until you notice that it is 2 to the power of 10, a natural step for a system built on 1s and 0s.