QST: How to convert large queryset into dataframes in optimized way? #55371

Open
@hitesh-scanova

Description

Research

  • I have searched the [pandas] tag on StackOverflow for similar questions.

  • I have asked my usage-related question on StackOverflow.

Link to question on StackOverflow

https://stackoverflow.com/questions/77222590/how-to-convert-large-queryset-into-dataframes-in-optimized-way

Question about pandas

Converting a Django queryset of more than 5 lakh (500,000) records into a DataFrame runs into an OutOfMemory error:
query_data = MyObjects.objects.filter().values()
df = pd.DataFrame.from_records(query_data)
query_data returns around 6 lakh (600,000) rows, but pandas gets stuck trying to process that much data at once. How can we optimize this or prevent the error?
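One common way to reduce peak memory is to stream the queryset and build the DataFrame in fixed-size chunks rather than materializing all rows at once. A minimal sketch is below; it assumes the rows arrive as an iterable of dicts (with Django you could pass `MyObjects.objects.values().iterator(chunk_size=...)` as `records`, and restricting `.values()` to only the columns you need helps further). The chunk size and the plain-dict stand-in rows here are illustrative, not from the original report.

```python
from itertools import islice

import pandas as pd


def records_to_dataframe(records, chunk_size=50_000):
    """Build a DataFrame from an iterable of dict rows, chunk_size rows at a time.

    Only one chunk of raw dicts is held in memory at any moment; each chunk
    is converted to a DataFrame (a denser representation) before the next
    chunk is read.
    """
    it = iter(records)
    frames = []
    while True:
        chunk = list(islice(it, chunk_size))
        if not chunk:
            break
        frames.append(pd.DataFrame.from_records(chunk))
    return pd.concat(frames, ignore_index=True) if frames else pd.DataFrame()


# Generator of plain dicts standing in for queryset rows (hypothetical data):
rows = ({"id": i, "value": i * 2} for i in range(100_000))
df = records_to_dataframe(rows, chunk_size=10_000)
```

The final `pd.concat` still holds the full DataFrame, but the per-row Python dicts are freed chunk by chunk instead of all being alive simultaneously; downcasting numeric dtypes per chunk (e.g. with `pd.to_numeric(..., downcast=...)`) can shrink the result further.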

Metadata

    Labels

    Needs Triage (Issue that has not been reviewed by a pandas team member), Usage Question
