is there an efficient way, in python, to read many (10^5) small files into memory? sequential open() calls -> major I/O bottleneck, even w/ an SSD
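One common first move (a minimal sketch, not from the thread; read_one, load_all, and the worker count are illustrative) is to overlap the per-file latency with a thread pool, since CPython releases the GIL during file I/O:

from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def read_one(path: Path) -> bytes:
    # read_bytes() does the open/read/close round trip in one call
    return path.read_bytes()

def load_all(paths: list[Path], workers: int = 32) -> dict[Path, bytes]:
    # many in-flight requests keep the SSD's command queue full
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(zip(paths, pool.map(read_one, paths)))

pool.map preserves input order, so zipping the results back onto paths is safe.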
-
guh, I'd dumb. 'sequential' was inaccurate: I have 24 parallel processes; each calls open(), performs its task on the binary input, and repeats.
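A sketch of the setup as described, plus one tweak that is my assumption rather than something stated in the thread: hand each worker a whole batch of paths instead of one file per dispatch, so pool overhead is paid per batch. process() is a stand-in for the unnamed task:

import multiprocessing as mp

def process(data: bytes) -> int:
    return len(data)  # placeholder for the real work on the binary input

def run_batch(paths: list[str]) -> list[int]:
    results = []
    for p in paths:
        with open(p, "rb") as f:
            results.append(process(f.read()))
    return results

def run_all(paths: list[str], nproc: int = 24) -> list[int]:
    step = max(1, len(paths) // nproc)
    batches = [paths[i:i + step] for i in range(0, len(paths), step)]
    with mp.Pool(nproc) as pool:  # call under `if __name__ == "__main__":`
        return [r for batch in pool.map(run_batch, batches) for r in batch]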
-
jc, *I'm*. anyway, the hope was that there might be a way to mass-access the drive data rather than doing repeated, possibly-colliding reads.
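One way to get that mass-access effect, sketched under the assumption that the files are written once and read many times (pack and load are made-up names): concatenate everything into a single blob plus an offset index, then mmap the blob and slice it, so one big sequential read replaces 10^5 seeky opens:

import json
import mmap

def pack(paths: list[str], blob: str, index: str) -> None:
    offsets, pos = {}, 0
    with open(blob, "wb") as out:
        for p in paths:
            with open(p, "rb") as f:
                data = f.read()
            out.write(data)
            offsets[p] = (pos, len(data))
            pos += len(data)
    with open(index, "w") as f:
        json.dump(offsets, f)

def load(blob: str, index: str) -> dict[str, bytes]:
    with open(index) as f:
        offsets = json.load(f)
    with open(blob, "rb") as f:
        mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    # the mapping stays valid after the file object closes
    return {p: bytes(mm[off:off + n]) for p, (off, n) in offsets.items()}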