Byte definition

4 definitions found

From WordNet (r) 2.0 [wn]:

  byte
       n : a sequence of 8 bits (enough to represent one character of
           alphanumeric data) processed as a single unit of
           information
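
A quick check of the "enough to represent one character" claim: 8 bits
give 2^8 = 256 distinct values, comfortably more than the 128 codes of
the ASCII character set. A minimal C sketch, using an arbitrary sample
character:

    #include <stdio.h>

    int main(void) {
        /* 8 bits yield 1 << 8 = 256 distinct values (0..255), */
        /* comfortably covering the 128 ASCII character codes. */
        unsigned char c = 'A';            /* one character in one byte */
        printf("values per 8-bit byte: %d\n", 1 << 8);   /* 256 */
        printf("'A' is stored as: %u\n", (unsigned)c);   /* 65  */
        return 0;
    }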

From Jargon File (4.3.1, 29 Jun 2001) [jargon]:

  byte /bi:t/ n. [techspeak] A unit of memory or data equal to the
     amount used to represent one character; on modern architectures this is
     usually 8 bits, but may be 9 on 36-bit machines. Some older
     architectures used `byte' for quantities of 6 or 7 bits, and the PDP-10
     supported `bytes' that were actually bitfields of 1 to 36 bits! These
     usages are now obsolete, and even 9-bit bytes have become rare in the
     general trend toward power-of-2 word sizes.
  
     Historical note: The term was coined by Werner Buchholz in 1956 during
     the early design phase for the IBM Stretch computer; originally it was
     described as 1 to 6 bits (typical I/O equipment of the period used 6-bit
     chunks of information). The move to an 8-bit byte happened in late 1956,
     and this size was later adopted and promulgated as a standard by the
     System/360. The word was coined by mutating the word `bite' so it would
     not be accidentally misspelled as {bit}. See also {nybble}.
  
From The Free On-line Dictionary of Computing (27 SEP 03) [foldoc]:

  Byte
       
           A popular computing magazine.
       
          {Home (http://www.byte.com)}.
       
          (1997-03-27)
       
       

From The Free On-line Dictionary of Computing (27 SEP 03) [foldoc]:

  byte
       
           /bi:t/ (B) A component in the machine {data hierarchy}
          usually larger than a {bit} and smaller than a {word}; now
          most often eight bits and the smallest addressable unit of
          storage.  A byte typically holds one {character}.
       
          A byte may be 9 bits on 36-bit computers.  Some older
          architectures used "byte" for quantities of 6 or 7 bits, and
          the PDP-10 and IBM 7030 supported "bytes" that were actually
          {bit-fields} of 1 to 36 (or 64) bits!  These usages are now
          obsolete, and even 9-bit bytes have become rare in the general
          trend toward power-of-2 word sizes.
       
          The term was coined by Werner Buchholz in 1956 during the
          early design phase for the {IBM} {Stretch} computer.  It was a
          mutation of the word "bite" intended to avoid confusion with
          "bit".  In 1962 he described it as "a group of bits used to
          encode a character, or the number of bits transmitted in
          parallel to and from input-output units".  The move to an
          8-bit byte happened in late 1956, and this size was later
          adopted and promulgated as a standard by the {System/360}
          {operating system} (announced April 1964).
       
          James S. Jones adds:
       
          I am sure I read in a mid-1970's brochure by IBM that outlined
          the history of computers that BYTE was an acronym that stood
          for "Bit asYnchronous Transmission E__?__" which related to
          width of the bus between the Stretch CPU and its CRT-memory
          (prior to Core).
       
          Terry Carr says:
       
          In the early days IBM taught that a series of bits transferred
          together (like so many yoked oxen) formed a Binary Yoked
          Transfer Element (BYTE).
       
          [True origin?  First 8-bit byte architecture?]
       
          See also {nibble}, {octet}.
       
          [{Jargon File}]
       
          (2003-09-21)
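
Two things touched on in the entry above can be illustrated directly in
code: the byte as the smallest addressable unit of storage, and the
nibble mentioned under "See also", which is half a byte (4 bits). A
minimal C sketch, assuming an ordinary 8-bit-byte platform (the sample
value 0xA5 is arbitrary):

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* The byte as the smallest addressable unit: sizeof(char) is 1 */
        /* by definition, and CHAR_BIT reports how many bits that byte  */
        /* holds (8 on virtually all current hardware).                 */
        printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);
        printf("sizeof(char): %zu\n", sizeof(char));      /* always 1 */

        /* A nibble is half a byte: split 0xA5 into its 4-bit halves. */
        unsigned char b = 0xA5;
        unsigned high = (b >> 4) & 0x0Fu;    /* 0xA */
        unsigned low  = b & 0x0Fu;           /* 0x5 */
        printf("high nibble: %X, low nibble: %X\n", high, low);
        return 0;
    }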
       
       