#### 0

0 (zero) is a number, and the numerical digit used to represent that number in numerals. It fulfills a central role in mathematics as the additive identity of the integers, real numbers, and many other algebraic structures. As a digit, 0 is used as a placeholder in place value systems. Names for the number 0 in English include zero, nought (UK), naught (US) (/nɔːt/), nil, or—in contexts where at least one adjacent digit distinguishes it from the letter “O”—oh or o (/oʊ/). Informal or slang terms for zero include zilch and zip. Ought and aught (/ɔːt/), as well as cipher, have also been used historically.

0 is the integer immediately preceding 1. It is neither positive nor negative, and it is an even number because it is divisible by 2 with no remainder. Many definitions include 0 as a natural number, in which case it is the only natural number that is not positive. Zero is the number that quantifies a count or an amount of null size; in most cultures, 0 was identified before the idea of negative quantities (i.e., quantities less than zero) was accepted.
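The properties above can be checked directly. The following sketch (not from the article; the variable names are illustrative) demonstrates that 0 is even and that it acts as the additive identity of the integers:

```python
# 0 is even: dividing by 2 leaves no remainder.
assert 0 % 2 == 0

# 0 is the additive identity: adding it leaves any integer unchanged.
for n in [-3, -1, 0, 7, 42]:
    assert n + 0 == n and 0 + n == n

print("0 is even and is the additive identity of the integers")
```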

As a value or a number, zero is not the same as the digit zero, which is used in numeral systems with positional notation. Successive positions of digits have higher weights, so the digit zero is used inside a numeral to skip a position and give appropriate weights to the preceding and following digits. A zero digit is not always necessary in a positional number system. In some instances, a leading zero may be used to distinguish a number (e.g., 02 versus 2).
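The placeholder role of the digit zero can be made concrete. In this sketch (illustrative, with an assumed helper name `positional_value`), the zero in the digit sequence [2, 0, 5] skips the tens position so that the 2 keeps its weight of 100; dropping the zero shifts every preceding digit's weight:

```python
def positional_value(digits, base=10):
    """Combine a sequence of digits into its value in the given base."""
    value = 0
    for d in digits:
        # Each step shifts earlier digits one position left (one weight higher).
        value = value * base + d
    return value

print(positional_value([2, 0, 5]))  # 205: the zero holds the tens place
print(positional_value([2, 5]))     # 25: without the placeholder, weights shift
```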
