
Fetching large data for reports is slow #6543

Open
@omkar-tenkale

Description


I'm trying to fetch more than 100,000 rows for report generation.
I have tried the following approaches (both are sketched in code below, after the query):

  1. Recursive calls with a limit of 1,000 to page through all rows
  2. Using query.each() and pushing each row into an array

Both approaches are slow.

The Cloud Code query looks like this (class and field names are placeholders):

const query = new Parse.Query("SomeClass");
query.equalTo("SOMEKEY", "SOMEVALUE");
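
For reference, this is roughly what the two approaches look like. This is a minimal sketch: "SomeClass", "SOMEKEY", and "SOMEVALUE" stand in for the real schema, the report logic is omitted, and the recursion is shown as a simple loop.

```js
// Approach 1: fetch in pages of 1,000 using limit/skip (shown as a loop
// rather than literal recursion; placeholder class and field names).
async function fetchAllBySkip() {
  const rows = [];
  let skip = 0;
  while (true) {
    const query = new Parse.Query("SomeClass");
    query.equalTo("SOMEKEY", "SOMEVALUE");
    query.limit(1000);
    query.skip(skip);
    const batch = await query.find();
    rows.push(...batch);
    if (batch.length < 1000) break; // last page reached
    skip += 1000;
  }
  return rows;
}

// Approach 2: let query.each() iterate over the results and push each row into an array.
async function fetchAllByEach() {
  const rows = [];
  const query = new Parse.Query("SomeClass");
  query.equalTo("SOMEKEY", "SOMEVALUE");
  await query.each((row) => {
    rows.push(row);
  });
  return rows;
}
```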

This is for a total of 1,500,000 rows. In Cloud Code:

With the recursive-call approach, there is roughly 1 second of waiting time per 1,000 rows.

With the query.each() approach, the query takes roughly 60 seconds of HTTP waiting time to process.

I assume this is the fastest Parse can work.

Would I see some improvement by indexing the MongoDB objectId column?
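
To make the objectId idea concrete, this is the kind of objectId-based paging I have in mind instead of skip. It is a minimal sketch assuming ascending objectId order, with the same placeholder names as above.

```js
// Page by objectId instead of skip: each iteration resumes after the last
// objectId of the previous page (sketch; placeholder class and field names).
async function fetchAllByObjectId() {
  const rows = [];
  let lastId = null;
  while (true) {
    const query = new Parse.Query("SomeClass");
    query.equalTo("SOMEKEY", "SOMEVALUE");
    query.ascending("objectId");
    query.limit(1000);
    if (lastId) {
      query.greaterThan("objectId", lastId);
    }
    const batch = await query.find();
    if (batch.length === 0) break; // nothing left to fetch
    rows.push(...batch);
    lastId = batch[batch.length - 1].id;
  }
  return rows;
}
```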

Is there a better approach for processing such a large amount of data?

One alternative I can think of is a Parse aggregate query, but that approach doesn't respect Parse authentication, security, ACLs, etc., and doesn't work with the existing query logic (a rough sketch is below).
Would it show any performance improvement and be worth a try?
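
For completeness, this is roughly the aggregate alternative I mean, shown here just counting the matching rows. It is a minimal sketch: aggregate queries run with the master key, and the stage names shown are the unprefixed form from the Parse JS guide (newer Parse Server versions also accept the standard $-prefixed MongoDB stage names).

```js
// Aggregate alternative (sketch): count the rows matching SOMEKEY = SOMEVALUE.
// Aggregates bypass ACLs/CLPs, which is exactly the concern mentioned above.
async function countMatchingRows() {
  const query = new Parse.Query("SomeClass");
  const pipeline = [
    { match: { SOMEKEY: "SOMEVALUE" } },
    { group: { objectId: null, total: { $sum: 1 } } },
  ];
  const results = await query.aggregate(pipeline);
  return results.length > 0 ? results[0].total : 0;
}
```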
