I made the mistake, going in the other direction (tis-interpreter to GCC), of forgetting GCC doesn't have a duty to warn.
Because the behavior is changed: a call that necessarily must fail falsely succeeds.
-
-
The change of behavior is exactly the reason for optimizations.
-
All optimization takes place under the "as if" rule. Optimizations that violate it are not valid. Changing observable behavior is a bug.
-
Sure. Everything boils down to the question of what "observable behavior" is.
-
The amount of memory used is not part of observable behavior, in the same way that execution time is not.
-
I submit this example to you: https://twitter.com/spun_off/status/731563481007325187
-
There are two separate questions here. 1) Objects larger than PTRDIFF_MAX. That's what you discuss mostly. Complex question.
-
2) Optimizations of malloc <= PTRDIFF_MAX. AIUI
@RichFelker's POV is that this optimization is wrong no matter what the size. I disagree. -
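The disputed optimization can be sketched like this (`try_alloc` is a name invented for illustration; whether the fold actually fires depends on the compiler and optimization level):

```c
#include <stdlib.h>

/* The pointer never escapes, so under the "as if" rule a compiler
 * may delete the malloc/free pair and fold this function to
 * "return 1" -- even for sizes where the unoptimized call would
 * have to fail at runtime. Whether that fold is valid for every
 * size is exactly the point of dispute in the thread. */
int try_alloc(size_t n) {
    void *p = malloc(n);
    if (p == NULL)
        return 0;
    free(p);
    return 1;
}
```

After the fold, `try_alloc(SIZE_MAX)` reports success even though no allocation of that size could ever be satisfied.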
I have not seen Rich say that (though I did see you and JF Bastien argue as if he had)
New conversation -
-
-
Aside from SIZE_MAX/2 there are all sorts of other reasons it might have to fail, like rlimits.
-
Right, that was my next question :-) You are saying that no malloc could be optimized away because it changes the memory footprint?
-
What about stack? Suppose I have a tail recursion which (unoptimized) overflows the stack. Can it be optimized?
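The stack analogue can be sketched as follows (`count_up` is a hypothetical function; the behavior described assumes GCC/Clang-style tail-call elimination at higher optimization levels):

```c
/* Unoptimized, each call pushes a new stack frame, so a large
 * enough n overflows the stack; with tail-call elimination the
 * recursion runs in constant stack space. If stack footprint,
 * like heap footprint, is not "observable behavior", the
 * optimization is permitted even though it turns a crashing
 * program into a terminating one. */
long count_up(long n, long acc) {
    if (n == 0)
        return acc;
    return count_up(n - 1, acc + 1);  /* tail call */
}
```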
End of conversation