# Bit

**Bit**, the basic unit of information in information theory and computing: a binary digit, either 0 or 1 in the arithmetical sense, 'false' or 'true' in the Boolean sense, black (dark) or white (light) as a Color in Chess, etc.

# Quote

Claude Shannon in *A Mathematical Theory of Communication*, 1948 [2]:

The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.

# Aggregations

Aggregations of bits are used to encode numbers (integers or floating-point values), characters, codes, and sets. Four bits are called a Nibble with 16 states, written as one hexadecimal digit {'0'..'9', 'A'..'F'}. A group of eight bits, two Nibbles or one Byte, with 256 states (e.g. unsigned numbers 0..255), is most often the smallest addressable unit in computer architectures. Bitboards are set-wise bit aggregations which cover all 64 squares of a Chessboard.

# Bitwise Arithmetic

Bitwise addition and subtraction modulo 2 of bit aggregations, free of carries and overflows, are both performed by bitwise exclusive or:

| a | b | a xor b |
|---|---|---------|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 0 |

# External Links

- Bit from Wikipedia
- Bit (disambiguation) from Wikipedia
- Nat (information) from Wikipedia
- Ban (information) from Wikipedia
- Qubit from Wikipedia
- shannon (unit) from Wikipedia
- Bits by Lawrence J. Krakauer

# References

1. ↑ Bits by Lawrence J. Krakauer
2. ↑ Claude Shannon (**1948**). *A Mathematical Theory of Communication*, pdf reprint