The bit is a basic unit of information in computing and digital communications. A bit can have only one of two values, and may therefore be physically implemented with a two-state device. These values are most commonly represented as either a 0 or 1. The term bit is a portmanteau of binary digit. In information theory, the bit is equivalent to the unit shannon, named after Claude Shannon.
The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute (near/far, night/day, cat/dog, ...). The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program. The length of a binary number may be referred to as its bit-length.
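The bit-length mentioned above can be computed directly in many languages; a minimal Python sketch using the built-in `int.bit_length` method:

```python
# bit_length() returns the number of bits needed to represent an
# integer in binary, excluding the sign and any leading zeros.
n = 13                  # binary: 1101
print(bin(n))           # '0b1101'
print(n.bit_length())   # 4
```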
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability, or the information that is gained when the value of such a variable becomes known. In quantum computing, a quantum bit or qubit is a quantum system that can exist in superposition of two classical (i.e., non-quantum) bit values.
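The information-theoretic definition can be made concrete with Shannon's entropy formula, H = −Σ p·log₂(p); a small Python sketch (the helper name `entropy_bits` is chosen here for illustration):

```python
import math

def entropy_bits(probs):
    """Shannon entropy, in bits (shannons), of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A binary variable that is 0 or 1 with equal probability
# carries exactly one bit of information.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased variable carries less than one bit.
print(round(entropy_bits([0.9, 0.1]), 3))   # 0.469
```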
The symbol for bit, as a unit of information, is either simply bit (recommended by the IEC 80000-13:2008 standard) or lowercase b (recommended by the IEEE 1541-2002 standard). A group of eight bits is commonly called one byte, but historically the size of the byte is not strictly defined.
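The grouping of eight bits into one byte can be illustrated with a short Python sketch that packs a sequence of bits (most significant bit first) into a single integer value:

```python
# Eight bits, most significant bit first, packed into one byte.
bits = [0, 1, 0, 0, 0, 0, 0, 1]   # binary 01000001

byte = 0
for b in bits:
    byte = (byte << 1) | b        # shift left, then append the next bit

print(byte)        # 65
print(chr(byte))   # 'A' (65 is the ASCII code for 'A')
```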