Efficiently editing large input file based on simple lookup with python dataframes
**Question:** I have a very large txt file (currently 6 GB, 50 m rows) with a structure like this:

**id amount batch transaction sequence**

    a2asd 12.6 123456 12394891237124 0
    bs9dj 0.6  123456 12394891237124 1
    etc…

I read the file like this:

    inputFileDf = pd.read_csv(filename, header=None, …
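The question is cut off before the lookup logic is shown, so the sketch below only illustrates one common pattern for this situation: reading a whitespace-delimited file of this shape in chunks (so a 6 GB file never has to fit in memory at once) and applying a vectorised lookup per chunk. The `lookup` dict, its meaning (a per-batch amount multiplier), and the in-memory sample data are assumptions for illustration, not part of the original question.

```python
import io
import pandas as pd

# Hypothetical lookup table: batch -> amount multiplier. This is an
# assumption; the original question is truncated before describing it.
lookup = {123456: 2.0}

# Small in-memory stand-in for the 6 GB file from the question;
# on the real file, pass the filename to read_csv instead.
raw = io.StringIO(
    "a2asd 12.6 123456 12394891237124 0\n"
    "bs9dj 0.6 123456 12394891237124 1\n"
)

cols = ["id", "amount", "batch", "transaction", "sequence"]
edited_chunks = []
# chunksize bounds memory use: each iteration yields a DataFrame of
# at most this many rows instead of loading all 50 m rows at once.
for chunk in pd.read_csv(raw, sep=" ", header=None, names=cols,
                         chunksize=100_000):
    # Vectorised lookup with Series.map is far faster than a row-wise
    # apply; batches absent from the lookup keep a multiplier of 1.0.
    chunk["amount"] = chunk["amount"] * chunk["batch"].map(lookup).fillna(1.0)
    edited_chunks.append(chunk)

result = pd.concat(edited_chunks, ignore_index=True)
```

In practice you would usually write each edited chunk straight back out with `chunk.to_csv(out, mode="a")` rather than concatenating, so peak memory stays at one chunk.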