A bit (contraction of binary digit) is the basic unit of information in computing and telecommunications; it is the amount of information stored by a digital device or other physical system that exists in one of two possible distinct states. These may be the two stable states of a flip-flop, two positions of an electrical switch, two distinct voltage or current levels allowed by a circuit, two distinct levels of light intensity, two directions of magnetization or polarization, etc.
In computing, a bit can also be defined as a variable or computed quantity that can have only two possible values. These two values are often interpreted as binary digits and are usually denoted by the Arabic numerals 0 and 1. The two values can also be interpreted as logical values (true/false, yes/no), algebraic signs (+/−), activation states (on/off), or any other two-valued attribute. In several popular programming languages, the numeric value 0 is equivalent (or convertible) to logical false, and 1 to true. The correspondence between these values and the physical states of the underlying storage or device is a matter of convention, and different assignments may be used even within the same device or program.
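As a minimal illustration (using Python purely as an example language), the 0/1 and false/true interpretations of a bit are interchangeable by convention:

```python
# In Python, the numeric values 0 and 1 convert to the logical values
# False and True, and the logical values convert back to 0 and 1.
assert bool(0) is False and bool(1) is True
assert int(False) == 0 and int(True) == 1

# The same two states can carry any other two-valued interpretation,
# chosen purely by convention:
bit = 1
state = "on" if bit else "off"   # activation states
sign = +1 if bit else -1         # algebraic signs
print(state, sign)               # -> on 1
```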
In information theory, one bit is typically defined as the uncertainty of a binary random variable that is 0 or 1 with equal probability,[1] or the information that is gained when the value of such a variable becomes known.[2]
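To make this concrete, applying Shannon's entropy formula to a binary variable that takes each value with probability 1/2 yields exactly one bit:

\[
H(X) = -\sum_{x \in \{0,1\}} P(x)\,\log_2 P(x)
     = -\left(\tfrac{1}{2}\log_2 \tfrac{1}{2} + \tfrac{1}{2}\log_2 \tfrac{1}{2}\right)
     = 1 \text{ bit.}
\]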