I've heard via a Turing award winner that there was a time when (most?) people did analysis via wall clock time. Is that wrong?
His claim is that he was ridiculed at conferences when he started using asymptotic analysis, but I can't find evidence either way
New conversation
That doesn't refute the claim -- it says nothing about what people talked about at CS conferences or what was being published.
As expected from stackexchange, the first answer cites two examples which are explicitly called out as invalid answers in the question.
Last time I looked into this I came away thinking big-O was far older than computer science itself.
End of conversation
New conversation
One motivation for the shift to asymptotic analysis: Hartmanis & Stearns '65 and the speed-up theorem. Thinking of the chapter 1 (end)notes in http://theory.cs.princeton.edu/complexity/book.pdf and the theorem's corollaries.
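For context, a sketch of the linear speed-up theorem that tweet alludes to (my informal statement, not quoted from Hartmanis & Stearns or from the Arora-Barak notes): any constant factor in running time can be traded away by changing the machine, which is one argument for stating running times asymptotically rather than in wall-clock terms.

% Linear speed-up theorem, stated informally for multitape Turing machines.
% If a language L is decidable in time f(n), then for every constant
% epsilon > 0 it is also decidable in time epsilon*f(n) + n + 2 on another
% machine, so constant factors reflect the machine model, not the problem.
\[
  L \in \mathrm{DTIME}\bigl(f(n)\bigr)
  \;\Longrightarrow\;
  L \in \mathrm{DTIME}\bigl(\varepsilon \cdot f(n) + n + 2\bigr)
  \qquad \mbox{for every constant } \varepsilon > 0.
\]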
1913: "Where O(1/n) stands for a function which never exceeds a constant multiple of 1/n." https://www.jstor.org/stable/1988602?seq=1#page_scan_tab_contents
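For comparison with that 1913 wording, here is the usual modern definition of the notation (my paraphrase in symbols, not a quote from the paper): f is O(g) when, beyond some point, f never exceeds a fixed constant multiple of g.

% Modern definition of big-O: the same idea as the 1913 quote, with the
% constants made explicit.
\[
  f(n) = O\bigl(g(n)\bigr)
  \;\iff\;
  \exists\, C > 0,\ n_0 :\quad
  |f(n)| \le C\,|g(n)| \ \mbox{ for all } n \ge n_0.
\]
% The quoted usage is the case g(n) = 1/n: |f(n)| <= C/n for all large n.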
In Germany we view CS as a branch of maths, so we couldn't even really ask the question like that.