??? 01/28/08 14:23
#150012 - Corrected Art of correct Constants
Responding to: ???'s previous message
Esko Ilola said:

    Neil Kurzman said:
        And an int times an int = int, with a 16 x 16 multiply.
    That is assuming that int is a 16-bit signed integer.

True - so the correct rule is:

    int times an int = int, with an int multiply
    (long)int times an int = long, with a long multiply
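A minimal sketch of the difference, assuming a compiler where int is 16 bits (as in typical 8051 toolchains); the variable names and values are only illustrative:

#include <stdio.h>

int main(void)
{
    int a = 300, b = 300;

    /* int * int is performed as an int multiply: with a 16-bit int
       the true product 90000 overflows before the assignment can
       widen it (signed overflow is undefined behaviour; on most
       targets it simply wraps to 24464).                          */
    long narrow = a * b;

    /* Casting one operand to long promotes the other as well, so
       the multiply itself is done in long and 90000 survives.     */
    long wide = (long)a * b;

    printf("narrow = %ld, wide = %ld\n", narrow, wide);
    return 0;
}

On a host where int is already 32 bits, both lines print 90000 - which is exactly why code like this works on the PC and then breaks when moved to a 16-bit-int target.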