

???
01/28/08 13:38


 
#150008 - int - number of bits unknown
Responding to: ???'s previous message
Neil Kurzman said:
And an int times an int = int with a 16 X 16 multiply


That is assuming that int is a 16-bit signed integer.

The 'C' standard does not explicitly define the number of bits in any of its integral data types; that is always a compiler- and platform-dependent issue.

An integer (int) may have any width from the 16-bit minimum the standard guarantees right up to 64 bits.

The same applies to constant suffixes such as 'U', 'UL', and 'L': the long type is usually a 32-bit integer, but it might also be a 64-bit one, again depending on the platform and the compiler.
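A concrete illustration of why the suffix matters; a minimal sketch, assuming a target where int is 16 bits and long is 32 bits:

long big1 = 20000 * 2;    // both constants are int, so the multiply
                          // runs at int width: on a 16-bit-int target
                          // it overflows
long big2 = 20000L * 2;   // the 'L' suffix forces the multiply to be
                          // done at long width: 40000 on any platform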

So the best solution is to define fixed-width data types explicitly for the current compiler/platform and do the typecasting with those.

A sample:
typedef unsigned char  uint8_t;   // assumes an 8-bit char (true on virtually every platform)
typedef signed char    int8_t;    // 'signed char' may not be accepted by some pre-ANSI compilers
typedef unsigned short uint16_t;  // assumes a 16-bit short
// and so forth ...
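
Once such typedefs are in place (or C99's <stdint.h> is available, as mentioned further down the thread), a cast makes the width of a multiply explicit instead of leaving it to the unknown size of int. A minimal sketch, assuming a 16-bit int and a 32-bit long; the function name mul16x16 is just for illustration:

#include <stdint.h>   // or use hand-rolled typedefs like the sample above

uint32_t mul16x16(uint16_t a, uint16_t b)
{
    // Without the cast, a * b is evaluated at int width; on a target
    // where int is only 16 bits the product wraps around.  Casting
    // one operand first forces the whole multiply up to 32 bits.
    return (uint32_t)a * b;   // mul16x16(50000u, 3u) == 150000 everywhere
}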







List of 28 messages in thread
The art of typecasting
   I'd like to disagree
   two comments
      Two comments on comments
   The Art of correct Constants
      int - number of bits unknown
         Corrected Art of correct Constants
         More assumptions
         stdint.h
            Or, in the absence of stdint.h...
      Const vs #define
         Opps!
            which may be a reason to prefer #define over const
               enum
                  Varies
                     Know Thy Stuff - enum
                        You are right
   Know Thy Stuff
      And...
         I'll pass(cal) on that one :)
   Everything in C defaults to int, which is signed
   Never overlook lint
      before Steve says it...
         Pascal
            strong typing
               why Ada never took off
                  Wirth-less
               Why ?
