If I had to place a bet on this being the worst fire season ever in California, or not, my chips are shifting in the wrong direction. Drought monitor from today: 26% of the state is in exceptional drought, up from 16% last week. https://droughtmonitor.unl.edu/data/jpg/20210525/20210525_ca_text.jpg
But that is only true at moderate speeds; if the speed is high enough, you can infer the outcome regardless of the random component (because of the sigmoid relationship). Is this the case for wildfires?
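A minimal sketch of that saturation argument, assuming the outcome probability is a logistic function of speed plus a random shock (all names and numbers here are illustrative, not from the thread):

```python
import numpy as np

def sigmoid(z):
    """Logistic function: maps any real input to (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
noise = rng.normal(0.0, 1.0, size=10_000)  # random component (e.g., weather)

for speed in [0.0, 2.0, 10.0]:
    p = sigmoid(speed + noise)  # outcome probability for each random draw
    print(f"speed={speed:4.1f}  mean p={p.mean():.3f}  std p={p.std():.3f}")

# At speed=0 the noise dominates (p swings around 0.5); at speed=10 the
# sigmoid has saturated, so p is ~1 regardless of the random component,
# i.e., the outcome is inferable from speed alone.
```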
I guess we may not know if this is a "first ever" situation.
Probably depends on what kind of metric we use to define "worse outcomes" (in both cases). If it's area burned, then probably not--that has largely to do w/ random weather. If, instead, the metric is something like "average fire severity/intensity," then probably yes.
And metrics like lives lost/homes burned are even more dictated by random elements (though usually time-invariant ones, like whether fires cluster near high-risk/dense populations). Think Oakland Hills in '91: only ~1,500 acres but 25 deaths & >3,000 structures burned.