Multiple things. The most important is the compiler's inability to handle objects > PTRDIFF_MAX in a consistent manner.
No, if your seg'd arch has 16-bit size_t, you can't have individual allocations >64k, just larger total mem.
-
-
I don't see where this would be required. size_t is defined via sizeof.
-
Specification of lib functions. E.g. strlen is specified to return length, not "length, converted to size_t".
-
If you want to just say it's UB or unspecified result if it doesn't fit, that's even more awful and unusable.
-
Sure it's UB. In the same way as printf of >INT_MAX chars.
-
Only way you can claim it's UB is by omission of a clear statement what happens. In any case, unusably bad impl.
-
This is very different from printf >INT_MAX, since there's no conceptual upper bound on printf output length.
-
If you limit width/precision then I think SIZE_MAX * SIZE_MAX should work as an upper bound.
-
The standard doesn't limit them, though. So printf("%999999999999999999999999999999999999999999999999999999d",0)
-
-
For example malloc(20000) can succeed more than 3 times.