And yet, no mention of this analysis made it into the paper until (from my understanding) an FOI request? I can think of a single sentence to add to the paper to explain this finding, but they didn't do it
Replying to @GidMK
Here are the deaths per year in the relevant age groups in Sweden between 2010-2020. Can be explored here https://www.statistikdatabasen.scb.se/pxweb/sv/ssd/START__BE__BE0101__BE0101I/DodaHandelseK/ I don't see how presenting the numbers and then explaining that they are likely noise helps anyone? pic.twitter.com/5morhzWvNQ
Replying to @JacobGudiol @GidMK
Data presenting causes of death in different age groups according to death certificates for the first 6 months was also published by SCB in November. There were no deaths <20 years where covid-19 was the suspected cause
Replying to @JacobGudiol
I'm confused here, because clearly the researchers knew it was an issue; otherwise, why investigate?
Replying to @GidMK @JacobGudiol
Let me put it this way - this was clearly something that the authors knew about, investigated, and considered to be an issue that potentially derailed the narrative. And yet, not addressed at all in the paper, except lumped in with other numbers. That is a bad look
Replying to @GidMK @JacobGudiol
It is potentially entirely a fluke. It is also possible that excess mortality in this age group was influenced by limited hospital capacity, or that under-reporting of COVID in children was a problem. There is some uncertainty there
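To see why "entirely a fluke" is plausible here, a minimal sketch (my own illustration, not from the thread): with a small number of expected deaths per year in a young age group, ordinary Poisson noise produces year-to-year swings that can look like excess mortality. The 60-per-year baseline below is an assumed, purely illustrative figure.

```python
import random

def poisson_draw(lam):
    # Poisson sample: count exponential inter-arrival times within one unit of time
    t, k = 0.0, 0
    while True:
        t += random.expovariate(lam)
        if t > 1.0:
            return k
        k += 1

random.seed(1)
baseline = 60  # assumed illustrative annual deaths in the age group, NOT a real figure
years = [poisson_draw(baseline) for _ in range(11)]  # "2010-2020", identical true rate
print(years)
# Spread on the order of sqrt(baseline) per year is routine even with no change in risk,
# so one year sitting above the others need not indicate any real excess.
```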
Replying to @GidMK @JacobGudiol
It's also worrying that what the email itself describes is still basically p-hacking - only reporting the analyses that agree with your argument is literally the definition of p-hacking
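A quick simulation (my own illustration, not from the thread) of why reporting only the agreeable analysis is a problem: if you run two analyses of null data and report whichever looks better, your false-positive rate roughly doubles relative to the nominal 5%.

```python
import random, math

def p_value(z):
    # two-sided p-value for a standard-normal test statistic
    return math.erfc(abs(z) / math.sqrt(2))

random.seed(0)
trials, alpha = 100_000, 0.05
honest_hits = selective_hits = 0
for _ in range(trials):
    # two analyses of data with NO real effect
    p1 = p_value(random.gauss(0, 1))
    p2 = p_value(random.gauss(0, 1))
    honest_hits += p1 < alpha              # pre-registered single analysis
    selective_hits += min(p1, p2) < alpha  # report whichever agrees with you

print(round(honest_hits / trials, 3))     # ~0.05, the nominal rate
print(round(selective_hits / trials, 3))  # ~0.10, i.e. 1 - 0.95**2
```

The inflation grows with every extra unreported analysis, which is why disclosing all of them (even the null ones) matters.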
Replying to @GidMK
Both analyses agree with the argument. We know that now. There is no real excess death among school kids in Sweden. It is just that the other analysis required a lot more explaining in order for it not to be used in a misleading way, as it is now by Science magazine etc
Replying to @JacobGudiol @GidMK
Say you did some sort of a subanalysis of your IFR data and first found something that looked weird, not fitting with your general conclusion.
Replying to @JacobGudiol @GidMK
Would you not investigate that further before publishing it? And if you found out that it was just randomness, not even some systematic bias in some of the data or similar, would you still report it? Just to tell people "I also found this but it was nothing"
Yes absolutely. That's one of the reasons we had so many supplementary materials in our most recent IFR paper - we did so many analyses!