My data was too big for Google Sheets, so I installed and learned Python. But now it's too big for my computer; Python won't run it. Off to buy a new computer, I guess?
Don't load the whole thing into memory at once. Go through it line by line and do whatever analysis you need. Or use a library with streaming/out-of-core statistics capabilities. Worst case, you can split the file into chunks, process each one, and recombine the results.
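To illustrate the line-by-line idea: Python's built-in `csv` module reads one row at a time from an open file, so only the current row lives in memory. A minimal sketch (using an in-memory string here as a stand-in for your giant file; in practice you'd pass a filename to `open()`):

```python
import csv
import io

# Stand-in for your giant file; with a real file you'd use:
#   with open("big.csv", newline="") as f: ...
data = io.StringIO("value\n1\n2\n3\n4\n")

# Stream one row at a time -- only the current row is in memory.
total = 0.0
count = 0
for row in csv.DictReader(data):
    total += float(row["value"])
    count += 1

mean = total / count
print(mean)  # 2.5
```

If you'd rather work in bigger pieces, pandas can do the chunking for you: `pd.read_csv(path, chunksize=100_000)` yields DataFrames of 100k rows each instead of loading the whole file.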
How do I split it up or go line by line without loading the whole thing in? I have a giant CSV file, and I'd need to get it into Python in order to even chunk it up, right?
If you're doing certain things, like linear regression, you can process one data point at a time and still get the same result (see link). But for most models that wouldn't work, at least not exactly.
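For example, simple least-squares regression only needs a handful of running sums, which you can update one point at a time and never hold the data in memory. A minimal sketch (the `update` helper is just for illustration):

```python
# Running sums for simple least-squares fit y = a + b*x,
# updated one data point at a time.
n = 0
sx = sy = sxx = sxy = 0.0

def update(x, y):
    """Fold one (x, y) point into the running sums."""
    global n, sx, sy, sxx, sxy
    n += 1
    sx += x
    sy += y
    sxx += x * x
    sxy += x * y

# Feed points one by one (here y = 2x + 1 exactly).
for x in range(5):
    update(x, 2 * x + 1)

# Closed-form least-squares estimates from the sums alone.
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
a = (sy - b * sx) / n                          # intercept
print(a, b)  # 1.0 2.0
```

This gives exactly the same coefficients as fitting on the full dataset at once, which is why regression is one of the lucky cases; most models need the whole dataset (or at least repeated passes over it).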
So this use of "in situ" (Latin for "in place," i.e., where it was already situated) is not standard terminology in data science and programming.
However, Lysander's advice can be clarified and refined: process the data into a form (format, file type) that is amenable …