Date:     Sat, 14 Aug 1993 11:38:58 -0700
Reply-To: Jeff Prothero
Sender:   Lojban list
From:     Jeff Prothero
Subject:  bits and binits
To:       Erik Rauch

| Another issue on which I've been having a think.
| I agree with F. Baube[tm] that computer implementation shouldn't be part
| of the definition of a bit; it is, after all, a binary digit (etymologically
| too!), and was first used in communication theory. I propose we call it a
| truth-quantum: jetka'u.

Please note that the information-theoretic concept of a 'bit' is quite
unrelated to the computer-science concept of a 'bit'.  The last time I took
a class that needed to discuss both, we used 'binit' for 'binary digit' and
'bit' for the information-theoretic term.  (If it had been a CSci class,
maybe CSci would have gotten 'bit' *grin*.)

An information-theoretic 'bit' is just enough information to settle a
yes-no question which has an a priori 50-50 probability of going either
way.  A computer-science 'bit' is, in essence, any system with two stable
states.

The above being logically but not pragmatically sufficient to elucidate
the difference, let me add some examples:

A blank "100 megabyte" hard disk has approximately one billion zero binits
on it, but it holds zero bits of information, since you know a priori that
all the binits are zero and learn nothing by examining any individual one
of them.

Physics establishes a deep, fundamental connection between the entropy and
the information content of a system, and hence a quantitative relationship
between bits and energy.  Bits have a meaning embedded in the nature of our
universe.  Binits are, by comparison, a rather arbitrary human convention.
(Yes, one can argue that the ability to embed binary arithmetic is a
property of our universe, which is why we expect any other technological
civilization to have discovered binary as well.  But that is a very
different point.)

One binit is always sufficient to hold one bit of information, but one
binit actually holds a full bit only in the limit of perfect compression
and the like; in general, each binit will hold less than one bit.
(Precisely, a binit's information content is -log2(p), where p is the a
priori probability of finding the binit in its current state, if I'm not
phasing out here.)

Every system and communication channel has a precise theoretical bit
content or bit rate; the number of binits needed to represent it, by
contrast, depends on the algorithms and conventions used to encode it.

Confusing bits and binits is, in essence, as great an error as confusing
a name with the object named...

Jeff
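
To make the -log2(p) point concrete, here is a minimal sketch in Python;
the function names are purely illustrative and not from the original post.
A 50-50 binit carries exactly one bit, a biased binit carries less, and an
all-zeros binit, like the blank disk above, carries none.

    # Minimal illustrative sketch; the names are assumptions, not an
    # existing library's API.
    import math

    def info_content_bits(p):
        # Bits of information gained by observing an outcome whose a
        # priori probability was p: -log2(p).
        return -math.log2(p)

    def binary_entropy(p):
        # Average bits of information per binit from a source that emits
        # a 1 with probability p and a 0 with probability 1 - p.
        if p in (0.0, 1.0):
            return 0.0  # outcome is certain; each binit carries nothing
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    print(info_content_bits(0.5))  # 1.0   -> a 50-50 binit settles exactly one bit
    print(binary_entropy(0.5))     # 1.0   -> only here does one binit hold one bit
    print(binary_entropy(0.9))     # ~0.47 -> a biased binit holds less than one bit
    print(binary_entropy(0.0))     # 0.0   -> the blank disk: many binits, zero bits

Here binary_entropy is just the expected value of info_content_bits over
the two possible states, which is why it peaks at one bit per binit only
when p = 0.5.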