
memory error when skipping rows #8681

Closed
@nkulki

Description


I have a file with over 100 million rows. When I do
pd.read_csv(filename, skiprows=100000000, iterator=True)
Python crashes with a memory error. I have 32 GB of memory and Python eats up all of it!
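
Not part of the original report, but a minimal workaround sketch: on recent pandas versions, skiprows also accepts a callable that is evaluated one row index at a time, so no giant list or set of row numbers is ever built in memory. The file path, N_SKIP, and the chunk size below are placeholder values.

```python
import pandas as pd

N_SKIP = 100_000_000  # number of leading rows to skip; adjust for your file

# A callable skiprows is invoked per row index, so pandas never
# materializes a 100M-element container the way an explicit list would.
reader = pd.read_csv(
    "big.csv",                      # placeholder path
    skiprows=lambda i: i < N_SKIP,  # lazily skip the first N_SKIP rows
    iterator=True,
)

chunk = reader.get_chunk(1000)  # read the next 1000 rows after the skip
```

Note the trade-off: the callable is invoked for every row index, so this avoids the memory spike at the cost of some per-row overhead while scanning past the skipped region.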


Labels

    IO CSV (read_csv, to_csv), Performance (memory or execution speed)
