A hypothetical disease the existence of which has been deduced from the observation that unused programs or features will often stop working after sufficient time has passed, even if “nothing has changed”. The theory explains that bits decay as if they were radioactive. As time passes, the contents of a file or the code in a program will become increasingly garbled.
People with a physics background tend to prefer the variant “bit decay” for the analogy with particle decay.
There actually are physical processes that produce such effects (alpha particles generated by trace radionuclides in ceramic chip packages, for example, can change the contents of a computer memory unpredictably, and various kinds of subtle media failures can corrupt files in mass storage), but they are quite rare (and computers are built with error detection circuitry to compensate for them). The notion long favoured among hackers that cosmic rays are among the causes of such events turns out to be a myth.
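The error-detection circuitry mentioned above is built on error-correcting codes. As a minimal sketch of the principle, here is a Hamming(7,4) code in Python — a simplified relative of the SECDED codes used in ECC memory; the function names and bit layout are illustrative, not from any particular hardware:

```python
def hamming74_encode(nibble):
    """Encode a 4-bit value as 7 bits: parity at positions 1, 2, 4."""
    d1, d2, d3, d4 = [(nibble >> i) & 1 for i in range(4)]
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p4 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(code):
    """Return (decoded nibble, syndrome); syndrome 0 means no flip,
    otherwise it is the 1-based position of the single flipped bit."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s4
    if syndrome:
        c[syndrome - 1] ^= 1   # repair the decayed bit
    nibble = c[2] | (c[4] << 1) | (c[5] << 2) | (c[6] << 3)
    return nibble, syndrome
```

Flipping any single bit of an encoded word — whether by alpha particle or thought experiment — yields a nonzero syndrome that points at the damaged position, so the original value can be recovered.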
Bit rot is the notional cause of software rot.
See also computron, quantum bogodynamics.
bit-slice. adjective (computing) (of central processing units) able to be built up in sections to form complete central processing units with various word lengths.

bit-slice architecture. A technique for constructing a processor from modules, each of which processes one bit-field or “slice” of an operand. Bit-slice processors usually consist of an ALU of 1, 2, 4 […]
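The slicing idea can be sketched in software: a narrow adder “slice” with a carry in and carry out, cascaded to build an adder of a larger word length. This is an illustrative Python model, not the design of any particular bit-slice part (classic examples, such as the AMD Am2901, were 4-bit slices):

```python
def slice_add(a, b, carry_in, width=4):
    """One 4-bit adder slice: returns (sum slice, carry out)."""
    total = a + b + carry_in
    return total & ((1 << width) - 1), total >> width

def sliced_add(a, b, slices=4, width=4):
    """Chain four 4-bit slices into a 16-bit adder via the carry line."""
    mask = (1 << width) - 1
    result, carry = 0, 0
    for i in range(slices):
        s, carry = slice_add((a >> (i * width)) & mask,
                             (b >> (i * width)) & mask, carry)
        result |= s << (i * width)
    return result, carry      # 16-bit sum plus final carry out
```

Doubling the word length is then just a matter of chaining more slices, which is exactly the appeal of the technique in hardware.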
brace (def 3). noun the handle or stock of a tool into which a drilling bit is fixed
bit gauge. a device for stopping a bit when it has reached a desired depth.
bit stream or bitstream (bĭt’strēm’). noun (computing) a simple contiguous sequence of binary digits transmitted continuously over a communications path; a sequence of data in binary form; the transmission of binary digits as a simple, unstructured sequence of bits.
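The “simple, unstructured sequence of bits” can be modelled directly. A small Python sketch that serialises bytes into a bit stream and reassembles them on the receiving end — the MSB-first bit order is an assumption here, not part of the definition:

```python
def to_bitstream(data):
    """Yield the bits of each byte, most significant bit first."""
    for byte in data:
        for i in range(7, -1, -1):
            yield (byte >> i) & 1

def from_bitstream(bits):
    """Reassemble bytes from a bit stream (ignores a trailing partial byte)."""
    out, acc, n = bytearray(), 0, 0
    for bit in bits:
        acc = (acc << 1) | bit
        n += 1
        if n == 8:
            out.append(acc)
            acc, n = 0, 0
    return bytes(out)
```

Note that the stream itself carries no framing: the receiver must already agree with the sender on bit order and word size, which is what “unstructured” means in the definition above.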