Evil proposed compiler optimization: in __attribute__((pure)) functions, optimize malloc(n) to 0.
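A minimal sketch of the kind of function in question (the name is hypothetical; whether the folding would actually be legal is the whole argument below):

    #include <stdlib.h>

    /* A function the programmer has marked pure even though it calls
       malloc.  The "evil" optimization would treat the call as having
       no observable effect and fold its result to a null pointer. */
    __attribute__((pure))
    void *make_buffer(size_t n)
    {
        return malloc(n);    /* proposed transformation: return 0; */
    }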
It's not under the program's control. Aside from necessarily failing after somewhere on the order of 2^(CHAR_BIT*sizeof(void*)) objects are in existence, it's up to the implementation if/when malloc fails. There is no context where it "can't fail".
-
The program is under no obligation to attempt to create so many objects simultaneously. Even if it did so, it's under no obligation to call that __attr__((const)) function in such a state.
-
I think you misread. The point was that under extreme conditions the implementation is forced to make malloc fail, but it's never forced to make it succeed. Failure is always an option.
-
In particular on any implementation where the invoking user can control resource limits, it's always possible for any particular call to malloc to fail.
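For example, on POSIX-ish systems the user (or the process itself) can shrink the address-space limit so that even a modest allocation fails; a sketch, assuming setrlimit(RLIMIT_AS) is honored and the sizes chosen here are purely illustrative:

    #include <stdio.h>
    #include <stdlib.h>
    #include <sys/resource.h>

    int main(void)
    {
        /* Clamp the address-space limit so the next request cannot be
           satisfied.  The exact threshold is implementation-dependent;
           1 MiB is just an example value. */
        struct rlimit rl = { 1 << 20, 1 << 20 };
        if (setrlimit(RLIMIT_AS, &rl) != 0)
            perror("setrlimit");

        void *p = malloc(64 * 1024 * 1024);  /* 64 MiB: expected to fail */
        printf("malloc returned %p\n", p);   /* typically (nil) here */
        free(p);
        return 0;
    }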
-
That isn't true and you know it isn't true. A classic linked-list malloc that never returns memory to the OS will always succeed if you've just freed a block of the size you're trying to allocate.
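Roughly the scenario being described: a free-list allocator keeps freed blocks on a private list and hands them back to matching requests, so a malloc(n) right after a free() of an n-byte block never has to ask the OS for memory. A toy sketch (not any real malloc; alignment and splitting are ignored):

    #define _DEFAULT_SOURCE
    #include <stddef.h>
    #include <unistd.h>   /* sbrk, as in the classic textbook malloc */

    /* Toy free-list allocator: freed blocks go onto a singly linked
       list and are reused by the next request of the same size. */
    struct block { size_t size; struct block *next; };
    static struct block *free_list;

    static void *toy_malloc(size_t n)
    {
        for (struct block **bp = &free_list; *bp; bp = &(*bp)->next) {
            if ((*bp)->size == n) {          /* exact-fit reuse */
                struct block *b = *bp;
                *bp = b->next;
                return b + 1;
            }
        }
        struct block *b = sbrk(sizeof *b + n);   /* otherwise grow the heap */
        if (b == (void *)-1)
            return 0;
        b->size = n;
        return b + 1;
    }

    static void toy_free(void *p)
    {
        if (!p) return;
        struct block *b = (struct block *)p - 1;
        b->next = free_list;     /* memory is never returned to the OS */
        free_list = b;
    }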
-
The nature of the malloc implementation isn't under the application's control. On an implementation with shared libs it can change just by an upgrade. Failure is always potentially reachable.
-
It's under the user's control. In particular, it's not under the compiler's control at the time it's making decisions about how to mess with the programmer's __attr__((const)) function. I think the only avenue of argument here is that malloc/free have a forbidden "effect."
-
If it's under the user's control, then user input changes the result of the function => nonpure.
-
That ship already sailed. nexttoward() is __attr__((const)) in my headers yet it's exposed as a weak symbol => can be overridden by the user via LD_PRELOAD and similar => nonpure, according to you. :)
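The LD_PRELOAD point, concretely: because the libm symbol is weak, a strong definition in a preloaded shared object wins, so the "const" function's result changes under the programmer's feet. A sketch of such an interposer (the file name and build line are hypothetical):

    /* evil_nexttoward.c -- build with:
     *   cc -shared -fPIC -o evil_nexttoward.so evil_nexttoward.c
     * run with:
     *   LD_PRELOAD=./evil_nexttoward.so ./some_program
     * Because nexttoward is a weak symbol in libm, this strong
     * definition takes precedence. */
    #include <math.h>

    double nexttoward(double x, long double y)
    {
        (void)y;
        return x;   /* deliberately wrong: just return the first argument */
    }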
- 6 more replies