my data was too big for google sheets, so i installed and learned python. but now it's too big for my computer, python won't run it. off to buy a new computer i guess?
Write a function that reads, say, 100 lines at a time, processes them, then drops them from memory and keeps only a running summary. That way you never hold too much in memory at once.
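A minimal sketch of that idea, assuming a plain text file with one number per line (the function name and file layout are just for illustration):

```python
def summarize_in_chunks(path, chunk_size=100):
    """Read chunk_size lines at a time; keep only running totals."""
    count, total = 0, 0.0
    with open(path) as f:
        while True:
            lines = [f.readline() for _ in range(chunk_size)]
            lines = [ln for ln in lines if ln]  # readline() returns "" at EOF
            if not lines:
                break
            for ln in lines:
                total += float(ln)
                count += 1
            # the chunk goes out of scope here, so memory stays bounded
    return {"count": count, "mean": total / count if count else None}
```

Only the two running numbers survive each pass, so the file can be arbitrarily large.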
At a very low level there's the concept of a "buffer": a chunk of memory you can read an arbitrary part of a file into, without loading the whole file. Building on that, most filetypes can be partially loaded, or split into smaller files for processing.
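To make the buffer idea concrete, here's a sketch that reads a file in fixed-size chunks, so memory use stays around one buffer's worth no matter how big the file is (the function and chunk size are illustrative):

```python
CHUNK = 64 * 1024  # 64 KiB buffer

def count_bytes(path):
    """Read a file in fixed-size chunks; only CHUNK bytes live in memory at once."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):  # read() at EOF returns b"", ending the loop
            total += len(chunk)
    return total
```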
You should use a stream, i.e. process the data as it flows through instead of loading it all up front.
Maybe you should use awk instead of python; it streams input line by line by default, which will save you from writing a lot of code.
Are you using the default csv module in python?
I'm pretty sure it already reads one row at a time. If you process each row as it comes, with a for loop or a generator expression, that's very likely to use far less memory.
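Something like this, assuming a CSV with a header row; the function name and the numeric column are made up for the example:

```python
import csv

def max_in_column(path, col):
    """Stream a CSV one row at a time; only the running max is kept."""
    best = None
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        for row in reader:  # reader yields one dict per row, lazily
            value = float(row[col])
            if best is None or value > best:
                best = value
    return best
```

csv.reader / csv.DictReader never load the whole file; they pull rows from the file object as you iterate.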
Haven't tested, but I think if you do "with open(file) as f: for line in f:" then the file object is treated as an iterable, so the whole file never gets loaded; it just goes through one line at a time.