"In the respective words of four experienced modellers, the code is “deeply riddled” with bugs, “a fairly arbitrary Heath Robinson machine”, has “huge blocks of code – bad practice” and is “quite possibly the worst production code I have ever seen”."https://twitter.com/DavidDavisMP/status/1259421989648904197 …
The proper way to do this is to do a lot of runs and report the average and standard deviation. (Which they tell you to do, but really the code should do it itself.) But with the transmissibility data being garbage, the output has to be garbage anyway.
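Here is a minimal sketch of that ensemble approach, assuming a toy stand-in model; `toy_epidemic` is hypothetical and not the ICL code, but the pattern is the same: run the model under many seeds and summarise the spread of the headline output.

```python
import random
import statistics

def toy_epidemic(seed: int, pop: int = 1000, contacts: int = 10, p: float = 0.2) -> int:
    """Crude branching-process stand-in for a stochastic epidemic model.

    Returns the total number ever infected in a closed population.
    (Hypothetical toy model, not the ICL simulation.)
    """
    rng = random.Random(seed)
    susceptible, infected, total = pop - 1, 1, 1
    while infected and susceptible:
        new = 0
        for _ in range(infected * contacts):
            if rng.random() < p * susceptible / pop:
                new += 1
        new = min(new, susceptible)
        susceptible -= new
        total += new
        infected = new
    return total

# Run the model under many different seeds and summarise the spread.
runs = [toy_epidemic(seed) for seed in range(100)]
print(f"mean total infections: {statistics.mean(runs):.1f}")
print(f"standard deviation:    {statistics.stdev(runs):.1f}")
```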
The proper way to do this is to have a deterministic output for a given pseudo-random seed and inputs. Sure, run it with different seeds, but do you genuinely not understand that ICL's position is that it's non-deterministic for the same seed, or why that's a problem?
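For contrast, a minimal sketch of what seed-level determinism means, again with a hypothetical stand-in (`run_once`) rather than the real code: the same seed and inputs reproduce the output exactly, and varying the seed is how you explore the stochastic spread.

```python
import random

def run_once(seed: int, n_draws: int = 5) -> list:
    """Stand-in for a single run of a stochastic model: the seed fully
    determines every random draw, and therefore the output."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(n_draws)]

assert run_once(42) == run_once(42)   # same seed and inputs -> identical output
assert run_once(42) != run_once(43)   # different seed -> a different, equally valid run
```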
-
The extra non-determinism is only with the multithreaded version, which means it comes from OS scheduling. It's just calls to the random number generator happening in a different order, not some strange Heisenbug that renders the whole output questionable.
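A minimal sketch of where that scheduling-dependent ordering comes from, using Python threads as a stand-in for the OpenMP C++ of the real code: two workers share one seeded generator, so the seed fixes the stream of random numbers but not which worker consumes which draw.

```python
import random
import threading

SEED = 1234

def run(n_threads: int, draws_per_thread: int = 50_000) -> list:
    """Each worker sums its own draws from a shared, seeded generator."""
    rng = random.Random(SEED)
    lock = threading.Lock()
    totals = [0.0] * n_threads

    def worker(i: int) -> None:
        acc = 0.0
        for _ in range(draws_per_thread):
            with lock:               # draws never interleave mid-call, but which
                acc += rng.random()  # thread gets the next draw is up to the scheduler
        totals[i] = acc

    threads = [threading.Thread(target=worker, args=(i,)) for i in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return totals

print(run(1))   # one thread: identical on every run
print(run(2))   # two threads: per-thread totals can change from run to run,
                # because the scheduler decides which thread gets which draw;
                # the underlying stream of random numbers is still the same
```

Running this a few times, the single-thread result stays fixed while the two-thread per-thread totals move around, which is exactly the distinction being drawn: the randomness is the same, only its assignment across threads depends on the scheduler.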