For 30+ years, particle physicists' predictions for where breakthrough discoveries are waiting have been wrong. False promises based on such flawed predictions erode trust in physics - and science in general. Particle physicists should fix this problem. https://www.nytimes.com/2019/01/23/opinion/particle-physics-large-hadron-collider.html
Replying to @skdh
1) I recently visited a "big" high energy accelerator (don't want to reveal which, since I actually admire the people working there). I was taken aback by the shoddy data preservation practices. It is astonishing that data from experiments that cost billions of dollars are never logged
Replying to @AnimaAnandkumar @skdh
In terms of data archiving, I am not aware of a single lumiblock lost so far. Where we struggle is moving the SW to process it 20 yrs on, along with technology changes. We can always process past data with past SW, but maintaining every branch of tens of millions of lines of code costs.
Replying to @SaltyBurger @skdh
I did not want to name the lab, so practices can differ from what you have seen. But even at @CERN I thought data is preprocessed before storing. The problem is that this processing is not invertible. And scientifically it is not the optimal thing to do
We write raw data (i.e. unprocessed bytestream) from the detector directly onto tape for permanent storage. No loss there.
I see. In this other lab, that was not the practice. They are using some old software code to transform raw data before storing it. The problem is that no one understands what exactly this code does or why it is the correct transformation to use. But they keep using it for legacy reasons
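A minimal sketch of the invertibility point made above. The thread does not say what the lab's transformation is, so the zero-suppression step and all numbers below are hypothetical illustrations: once two distinct raw readouts map to the same reduced record, no amount of later cleverness can recover the original from the stored copy.

```python
# Hypothetical illustration: why storing only preprocessed data is lossy.
# A common data-reduction step is zero-suppression: drop detector channels
# whose ADC count falls below a noise threshold. That transform is not
# invertible - distinct raw readouts can map to the same suppressed output.

def zero_suppress(raw, threshold=3):
    """Keep only (channel, count) pairs at or above the threshold."""
    return {ch: n for ch, n in raw.items() if n >= threshold}

raw_a = {0: 7, 1: 2, 2: 0, 3: 5}   # sub-threshold noise in channels 1 and 2
raw_b = {0: 7, 1: 1, 2: 1, 3: 5}   # different noise, same suppressed result

assert raw_a != raw_b
assert zero_suppress(raw_a) == zero_suppress(raw_b) == {0: 7, 3: 5}
# The reduced record cannot distinguish the two raw events, so a future
# re-analysis that needs the low-amplitude channels must start from the
# archived raw bytestream - which is exactly what writing raw data to
# tape, as described above, preserves.
```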