Binary math - what's the word for this?

Discussion in 'The Lounge - Off Topic' started by supernova, Jan 22, 2009.

  1. supernova

    supernova Gigabyte Poster

    1,422
    21
    80
    I know what MSB and LSB are!

But if you look at a binary sequence (right to left), what do you call the first '1'?
I believe there is a word (or words) used to describe this, and I can't remember or find it.

    Any bright sparks know?

I would really like to use the correct terminology in a paper I am writing.
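For reference, the first '1' scanning from the right is often described as the lowest set bit (locating it is sometimes called "find first set", as in the POSIX `ffs()` function). A quick sketch of locating it, using the classic two's-complement trick:

```python
def lowest_set_bit_index(x: int) -> int:
    """LSB-0 index of the first '1' bit, scanning right to left."""
    if x == 0:
        raise ValueError("no set bits in 0")
    # x & -x isolates the lowest set bit (two's-complement identity);
    # bit_length() - 1 converts that single-bit value to its position.
    return (x & -x).bit_length() - 1

print(lowest_set_bit_index(0b0110110))  # 1
print(lowest_set_bit_index(0b1000))     # 3
```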
     
    Certifications: Loads
    WIP: Lots
  2. dwhyte85

    dwhyte85 Nibble Poster

    54
    1
    39
I keep thinking [binary] order of magnitude, although I have something beginning with Q stuck in my brain. I think binary order of magnitude defines whether the first (far left) bit makes it a negative or positive binary value in two's complement?? Don't take that as fact, it's probably way off - it's been 4 years since I did any real math with binary. I guess it could be called signed too?
     
    Certifications: Bsc. Comp Sci, MCP, MCTS, MCSA, CCENT, MBCS
    WIP: ICND 2, CEH and converting MCSA to MCITP: Enterprise Administrator
  3. BosonMichael
    Honorary Member Highly Decorated Member Award 500 Likes Award

    BosonMichael Yottabyte Poster

    19,183
    500
    414
    In my binary explanations, I usually used terms like "bit position" and "bit weight" to describe them.
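The "bit position" and "bit weight" terms can be illustrated with a short sketch (assuming LSB-0 numbering, where each position carries a weight of 2 to that power):

```python
def bit_weights(value: int):
    """(position, bit, weight) for each bit position, LSB-0 numbering."""
    return [(p, (value >> p) & 1, 1 << p) for p in range(value.bit_length())]

for position, bit, weight in bit_weights(0b101101):
    print(f"position {position}: bit={bit}, weight={weight}")
```

Summing `bit * weight` over every position recovers the original value, which is exactly what the weights mean.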
     
    Certifications: CISSP, MCSE+I, MCSE: Security, MCSE: Messaging, MCDST, MCDBA, MCTS, OCP, CCNP, CCDP, CCNA Security, CCNA Voice, CNE, SCSA, Security+, Linux+, Server+, Network+, A+
    WIP: Just about everything!
  4. supernova

    supernova Gigabyte Poster

    1,422
    21
    80
Yeah, I think I'll just use "the first binary '1' bit" for now.

It's just that in the back of my mind I thought there might be proper terminology, as it's something you often look at whilst coding/designing software and systems.

    thanks guys
     
    Certifications: Loads
    WIP: Lots
  5. dmarsh
    Honorary Member 500 Likes Award

    dmarsh Petabyte Poster

    4,305
    503
    259
    I'm only aware of Most Significant Bit (MSB) and Least Significant Bit (LSB).

Little or big endian generally refers to whether it's left or right byte order in memory.

Modern computers seem to have standardised on the MSB being written on the left; in other words, the "bit endianness" is little-endian (low bit first).

Of course, you are correct that MSB is normally used to signify order of magnitude, not position - just in case it is a novel architecture.

If it's a signed number, then it's often called the sign bit.

If it's just binary data with no numeric meaning, then I guess neither MSB nor sign bit is strictly accurate, but I expect the MSB term probably still gets used.

I've never seen binary math done with the MSB on the right in any text, so I would have thought the MSB term would suffice.

Maybe put in a few diagrams at the beginning if you want to make it really clear.

Wikipedia has some good stuff :-

    http://en.wikipedia.org/wiki/Bit_numbering

    Maybe you just need to state 'assuming LSB-0 architecture this would be MSB' etc...
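The LSB-0 versus MSB-0 numbering distinction above is just a relabelling of the same bits. A small sketch of converting between the two conventions for a fixed-width word (function name is my own, for illustration):

```python
def msb0_to_lsb0(msb0_index: int, width: int) -> int:
    """Convert an MSB-0 bit number to its LSB-0 equivalent for a word of the given width."""
    return width - 1 - msb0_index

# In an 8-bit word, MSB-0 bit 0 is the most significant bit,
# which LSB-0 numbering calls bit 7.
print(msb0_to_lsb0(0, 8))  # 7
print(msb0_to_lsb0(7, 8))  # 0
```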
     
  6. supernova

    supernova Gigabyte Poster

    1,422
    21
    80

    01111111

    MSB = 0

    111011011

    MSB = 1

    01111111

    LSB = 1

    0110110

    LSB = 0

Big-endian and little-endian reverse this: MSB becomes rightmost and LSB leftmost.

When we talk about the MSB, we often refer to big-endian systems, i.e. MSB = leftmost.

However, MSB is positional - it refers to the highest-valued bit position, not that bit's actual state.
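The MSB/LSB readings in the examples above can be checked mechanically, treating the leftmost character of each bit string as the MSB and the rightmost as the LSB (a sketch, nothing more):

```python
def msb_lsb(bits: str):
    """Leftmost character = MSB, rightmost character = LSB, as written."""
    return bits[0], bits[-1]

for bits in ["01111111", "111011011", "0110110"]:
    msb, lsb = msb_lsb(bits)
    print(f"{bits}: MSB = {msb}, LSB = {lsb}")
```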

    I need to explain

    000111
    010000
    001110
    111111
    000000


Doing my head in... I should just forget it; it may pop into my head later...

I am sure I have written about this before... may get my logic/digital design books out later

    Thanks

    Andrew
     
    Certifications: Loads
    WIP: Lots
  7. dmarsh
    Honorary Member 500 Likes Award

    dmarsh Petabyte Poster

    4,305
    503
    259
Well, people's definitions vary, but the most significant bit is the bit position which carries the highest significance or magnitude.

It's not necessarily related to its current state.

Hence an LSB-0 architecture means an architecture where the machine's registers treat the bit in position zero as having the lowest magnitude, i.e. it means zero or one in numeric terms.

    Both LSB-0 and MSB-0 architectures seem to assume MSB on left and just change the bit position numbering so they effectively dictate the meaning of both the numbering and the MSB term.
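The LSB-0 numbering described above can be sketched as a simple indexing function, where position 0 is the least significant bit:

```python
def bit_at(value: int, position: int) -> int:
    """Bit at a given position under LSB-0 numbering (position 0 has magnitude 2**0)."""
    return (value >> position) & 1

print(bit_at(0b100, 0))  # 0 - least significant bit
print(bit_at(0b100, 2))  # 1 - the set bit, at position 2
```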

Let me know if you find what you were looking for, I'd be interested to know!

    Bit numbering conventions :-
    http://zytrax.com/tech/protocols/ip...D5D30EAB=0&bcsi_scan_filename=ip-classes.html

    Most people seem to use 'Power of 2' or 'LSB-0', otherwise they have a preface/appendix with a diagram and numbering or refer to a processor architecture.

Good doc on binary here for those that always seem to be asking!
    http://www.xcprod.com/titan/XCSB-DOC/binary.html
     
  8. supernova

    supernova Gigabyte Poster

    1,422
    21
    80
That's why it's not the correct term I am looking for.
     
    Certifications: Loads
    WIP: Lots
  9. dmarsh
    Honorary Member 500 Likes Award

    dmarsh Petabyte Poster

    4,305
    503
    259
What's wrong with using the term 'highest set bit' after you have defined the bit endianness?
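The suggested 'highest set bit' is easy to compute; a sketch using Python's built-in `int.bit_length()`, which counts bits up to and including the highest set one:

```python
def highest_set_bit_index(x: int) -> int:
    """LSB-0 index of the highest set bit."""
    if x == 0:
        raise ValueError("no set bits in 0")
    # bit_length() gives the position of the highest set bit plus one.
    return x.bit_length() - 1

print(highest_set_bit_index(0b000111))  # 2
print(highest_set_bit_index(0b010000))  # 4
```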
     
