@cmuratori hey, have you explained the difference between int *foo and int &foo in a function parameter on stream?
@cmuratori @nothings @jwatte @thorduragust There is only a _semantic_ difference as far as I know - the actual codegen does not change.
@cmuratori @nothings @jwatte @thorduragust But in practical terms the semantics are what usually matter to me as a coder :-)
@cmuratori @nothings @jwatte @thorduragust Totally agreed that voodoo about whether it improves code *speed* is not that interesting.
New conversation
@cmuratori @nothings @jwatte @thorduragust the promise of non-null references in C++ is bullshit, trivial to break by mistake
@JimKjellin @cmuratori @jwatte @thorduragust By spec, a well-formed program has no NULL references. But a well-formed program has no NULL
New conversation
@cmuratori @nothings @thorduragust Turns out, if you follow the intended semantics, the programming language helps you.
@jwatte @nothings @thorduragust No it doesn't? Either you had a pointer at some point, which means you *'d it to turn it into a reference,
New conversation
@cmuratori @nothings @jwatte @thorduragust I have seen (shit)code like this: int& v = *pV; if (!&v) { ...; } The check can be optimized away, since forming the reference from a null pointer is already UB ...