What is a Bit


In this age of jargon, one has to know a great many words and acronyms in every field we come across. With specialized fields of knowledge emerging every day, new technical terms keep cropping up. 'Bit' and 'byte' are terms you often come across when dealing with anything in the electronics or computing field.

What Are a Bit and a Byte?
A bit is a contraction of 'binary digit'. It is the smallest possible unit of information in digital computing. Computers do not use decimal numbers to store data; all data is stored as binary numbers, because computers are built on binary digital logic. Every bit can take only one of two values, 0 or 1. For computers and for digital communication, a bit is the smallest amount of information that can be stored in binary form. In digital telecommunication too, voltage levels are converted into binary data, or bits.

The origin of the term 'binary digit', or 'bit', is attributed to John Tukey, a scientist at Bell Laboratories, who first used it in 1947. The term has been in use in the world of computers ever since. A byte is a string of 8 bits put together, and is therefore a bigger unit of information than a bit. The term 'byte' was coined by Dr. Werner Buchholz, a computer scientist working at IBM, in 1956.

Just as the decimal number system is based on ten digits (0 through 9), the binary number system has just two digits, 0 and 1. All the data that a computer processes is in the form of 0s and 1s. In digital communication, these bits are represented by two distinct voltage levels. If you have watched the science fiction movie 'The Matrix', you'll remember how Neo, the protagonist, eventually sees the whole Matrix as an imaginary digital world made up of 0s and 1s. Setting the fiction aside, that is essentially how computers see data: as a stream of 0s and 1s, or a stream of bits.
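
To make this a little more concrete, here is a small Python sketch (the value 42 is just an arbitrary example) showing how an everyday decimal number looks when written out as the 0s and 1s a computer actually stores:

```python
# An ordinary decimal number...
number = 42

# ...and its binary form. bin() returns a string prefixed with '0b'.
print(bin(number))            # 0b101010

# The same value padded out to a full 8 bits, i.e. one byte.
print(format(number, '08b'))  # 00101010
```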

A computer converts all data into bits and bytes through alphanumeric-to-binary and decimal-to-binary conversion. So, for the computer, letters and numbers are all represented as bits. In other words, a bit is a letter of the computer's language, while a byte is a word made up of 8 letters! Speaking the machine language, or digital language, means speaking in bits and bytes. Interestingly, a four-bit binary word is called a 'nibble', because it is half a byte!
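
As a simple illustration of that alphanumeric-to-binary idea, the following Python sketch (the letter 'A' is just an example) shows how one character becomes one byte of 8 bits, and how a nibble is half of that byte:

```python
# One character is stored as one byte (8 bits), via its character code.
letter = 'A'
code = ord(letter)            # 'A' has character code 65

byte = format(code, '08b')    # 65 written as a full byte: '01000001'
print(letter, '->', byte)

# A nibble is half a byte: 4 bits.
high_nibble, low_nibble = byte[:4], byte[4:]
print('nibbles:', high_nibble, low_nibble)   # '0100' and '0001'
```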

Let us see various instances where the terms 'bit' and 'byte' are used. You must have come across them when checking the capacity of data storage devices or the bandwidth of your Internet connection. The capacity of a computer hard disk is usually given in gigabytes (abbreviated GB). A gigabyte is a billion bytes, or eight billion bits. Data transfer rates, on the other hand, are usually quoted in bits per second. The Internet is an ocean of bits and bytes.
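
The arithmetic behind these units is plain multiplication; this short Python sketch (using the decimal definition of a gigabyte, i.e. one billion bytes) spells it out:

```python
# Decimal (SI) definition: 1 gigabyte = 1,000,000,000 bytes.
BITS_PER_BYTE = 8
BYTES_PER_GB = 1_000_000_000

gigabytes = 1
total_bytes = gigabytes * BYTES_PER_GB
total_bits = total_bytes * BITS_PER_BYTE

print(total_bytes)   # 1000000000 bytes
print(total_bits)    # 8000000000 bits
```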

Computer processors commonly come in two kinds: 32-bit and 64-bit. This denotes the amount of data the chip can process, or read, at a time. Internet bandwidth, on the other hand, is measured in kilobits per second ('kbps') or megabits per second ('Mbps'), that is, in bits rather than bytes.
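
Because bandwidth is quoted in bits per second while file sizes are quoted in bytes, dividing by 8 converts between the two. A short Python sketch (assuming an illustrative 100 Mbps connection) shows the relationship:

```python
# Convert an advertised speed in megabits per second (Mbps)
# into megabytes per second (MB/s): divide by 8 bits per byte.
mbps = 100                 # illustrative connection speed
mb_per_second = mbps / 8   # roughly 12.5 MB of data per second

print(mb_per_second)       # 12.5
```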

The twenty-first century is the age of information technology, so 'bit' and 'byte' are terms you will only hear more often in the years to come.
