@cmuratori: @taradinoc Because you may be doing it in an assertion. I.e., "assert that this enum is in range when I get it". That's extremely common.
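(A minimal sketch of the assertion pattern being described; the enum and names are invented for illustration, since the actual code isn't shown in the thread.)

    #include <assert.h>

    // Hypothetical enum; "Entity_Count" marks one past the last valid value.
    enum entity_type
    {
        Entity_Null,
        Entity_Player,
        Entity_Wall,

        Entity_Count,
    };

    void UpdateEntity(entity_type Type)
    {
        // "Assert that this enum is in range when I get it" - the value
        // may have come from a file, the network, or stale memory.
        assert(Type < Entity_Count);
        // ... use Type ...
    }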
@cmuratori: @taradinoc Now all of those would have to be cast to integer types first.
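(Presumably "all of those" means the range checks: to keep the compiler quiet, each comparison would need an explicit cast. A hypothetical before/after:)

    #include <assert.h>

    enum flag_value { Flag_Min = 0, Flag_Max = 63 }; // illustrative

    void Check(flag_value Value)
    {
        // Without the cast, clang warns the comparison is always true:
        //     assert(Value < 64);
        // What the warning pushes you toward instead:
        assert((int)Value < 64);
    }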
@taradinoc: @cmuratori The compiler can choose the size of the enum to match the values, so that comparison may not be meaningful.
@cmuratori: @taradinoc It _can_, but it _did not_. The largest value allowed for functioning of the software in this case was 64.
@cmuratori: @taradinoc So it warned on "enum < 64", even though it was not using some magical hardware 6-bit type for future operations.
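(A guess at the shape of the code, with invented values: the largest enumerator is 63, so the standard limits the enum's value range to what fits in 6 bits, and clang's -Wtautological-constant-out-of-range-compare treats a comparison against 64 as always true.)

    enum opcode
    {
        Op_First = 0,
        // ...
        Op_Last = 63, // largest value the software allows
    };

    bool OpcodeIsValid(opcode Op)
    {
        // clang flags this as a tautological comparison - every value
        // representable in the enum's 6-bit range is already < 64 -
        // even though the check is a deliberate defensive test and no
        // 6-bit hardware type is ever actually used.
        return (Op < 64);
    }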
@taradinoc: @cmuratori For example, GCC with -fstrict-enums may do optimizations that eliminate your tests, by assuming your enums are always in range.
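(A sketch of how that could bite, compiling with g++ -fstrict-enums; the enum is illustrative.)

    enum direction
    {
        Dir_North, // 0
        Dir_South, // 1
        Dir_East,  // 2
        Dir_West,  // 3
    };

    bool DirectionIsValid(direction Dir)
    {
        // Under -fstrict-enums, g++ may assume Dir only ever holds a
        // value in the enum's range [0, 3], so this defensive check
        // can be folded to "true" and compiled away - exactly the
        // test a debug build was relying on.
        return (Dir >= Dir_North && Dir <= Dir_West);
    }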
@cmuratori: @taradinoc That is fine in an optimized build but not in a debug build, and if the compiler did that with no option to turn it off...
@taradinoc: @cmuratori It is (now) optional in GCC, but either behavior is conformant, so clang is justified in warning about it...
@cmuratori: @taradinoc Justifying counterproductive behavior by pointing to the spec is one of the big reasons GCC sucks. Compilers shouldn't do it.
@taradinoc: @cmuratori If you just want constants for a type that's otherwise int, why not use #define? ISTM this optimization is what makes enum useful.
@cmuratori: @taradinoc #defines cannot be scoped, nor can they be automatically defined as monotonically increasing by 1. They're a poor substitute.
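(The two properties being contrasted, side by side; names invented.)

    // #define: every value written out by hand, and the names leak
    // into every scope that includes the header - no scoping, no
    // automatic increment.
    #define OP_LOAD  0
    #define OP_STORE 1
    #define OP_ADD   2

    // enum: values increase by 1 automatically, and the names can be
    // scoped, e.g. inside a namespace or class (C++).
    namespace vm
    {
        enum op
        {
            Op_Load,  // 0
            Op_Store, // 1
            Op_Add,   // 2
        };
    }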