What's your theory for why programmers reach for printf-debugging more frequently than other modes (like step-debugging)?
-
I'd say they also often have the misconception that what they're trying to fix is too small a thing to bring out the big guns.
-
Agreed. I used to have inertia around using the debugger, possibly because I thought the config/setup was a hassle. But by the time I'd logged everything out and run the process 10x with various outputs, it was way slower overall.
End of conversation
New conversation
fflush(stdout); // make the buffered output actually appear


-
It's also the moment when most people learn about output buffering.
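A minimal C sketch of that lesson (the checkpoint message and the deliberate crash are illustrative): output written just before a crash can vanish inside stdout's buffer unless it is flushed.

#include <stdio.h>

int main(void) {
    printf("reached checkpoint A");  /* no newline: the text sits in stdout's buffer */
    fflush(stdout);                  /* without this, the crash below eats the message */
    *(volatile int *)0 = 1;          /* deliberate segfault standing in for the real bug */
    return 0;
}

When output is redirected to a file or pipe, stdout is fully buffered, so even a trailing newline wouldn't save the message; only the flush does.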
-
I think we can make more sophisticated debugging faster and as reliable.
-
It's also a pattern that's applicable in virtually every system you'll ever encounter, whereas debuggers all work slightly differently and you have to learn things anew every single time.
End of conversation
New conversation
stderr to the rescue!
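Why it rescues: the C standard requires that stderr not be fully buffered, so diagnostics written there appear immediately. A small sketch, reusing the same illustrative crash as above:

#include <stdio.h>

int main(void) {
    fprintf(stderr, "reached checkpoint A\n");  /* stderr is at most line-buffered: already out */
    *(volatile int *)0 = 1;                     /* deliberate crash: the message still survives */
    return 0;
}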
-
Old habits die hard, and new things are scary to learn.
-
Except for that one time early in my career when I discovered printf is non-reentrant.
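What that discovery tends to look like: printf is not on POSIX's async-signal-safe list, so a signal handler that re-enters printf mid-print can corrupt the stream or deadlock. A POSIX-flavored sketch of the usual workaround (the message text is an assumption):

#include <signal.h>
#include <unistd.h>

static void handler(int sig) {
    /* calling printf here could re-enter stdio; write() is async-signal-safe */
    static const char msg[] = "caught SIGINT\n";
    (void)sig;
    (void)write(STDERR_FILENO, msg, sizeof msg - 1);
}

int main(void) {
    signal(SIGINT, handler);
    for (;;)
        pause();  /* sleep until a signal arrives */
}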
-
It's also available in every programming environment. I've even used pixel-coloring for debugging in cases where string printing wasn't available (e.g., certain DOS video modes).
-
Stated differently, it's effectively the highest abstraction for program debugging, making it easy to support everywhere and easy to transfer as knowledge across environments.
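For the curious, a sketch of the pixel-coloring trick mentioned above, assuming VGA mode 13h under real-mode DOS (320x200, 256 colors, framebuffer at segment 0xA000); the checkpoint layout and color values are illustrative, and it compiles only with a 16-bit DOS compiler such as Turbo C:

/* paint one pixel per checkpoint along the top scanline as a debug beacon */
static void debug_pixel(int checkpoint, unsigned char color) {
    unsigned char far *vga = (unsigned char far *)0xA0000000L;  /* segment A000, offset 0 */
    vga[checkpoint] = color;
}

/* e.g. debug_pixel(0, 4): the top-left pixel turns red once execution reaches it */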
End of conversation