Description
I'm trying to fetch more than 100,000 rows for report generation.
I tried these approaches (both sketched below):
- Recursive calls with a 1,000-row limit per request to fetch all rows
- Using query.each() and pushing each row into an array
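For reference, this is roughly what I mean by the two approaches ("SomeClass", "SOMEKEY", and "SOMEVALUE" are placeholders for my actual class and filter; I've written the recursive fetch as a loop for brevity and assumed skip-based paging):

```js
// Approach 1: fetch in 1,000-row pages using limit/skip.
// skip gets slower as it grows, because MongoDB still has to
// scan past all the skipped documents on every request.
async function fetchAllWithSkip() {
  const results = [];
  let skip = 0;
  while (true) {
    const query = new Parse.Query("SomeClass"); // placeholder class
    query.equalTo("SOMEKEY", "SOMEVALUE");
    query.limit(1000);
    query.skip(skip);
    const batch = await query.find();
    results.push(...batch);
    if (batch.length < 1000) break; // last page reached
    skip += 1000;
  }
  return results;
}

// Approach 2: let query.each() walk the class and push every row.
async function fetchAllWithEach() {
  const results = [];
  const query = new Parse.Query("SomeClass");
  query.equalTo("SOMEKEY", "SOMEVALUE");
  await query.each((row) => {
    results.push(row);
  });
  return results;
}
```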
Both approaches are slow.
The cloud code query looks like this ("SomeClass" stands in for my actual class):

```js
const query = new Parse.Query("SomeClass");
query.equalTo("SOMEKEY", "SOMEVALUE");
```
The class contains about 1,500,000 rows in total.
In cloud code:
- With the recursive-call approach, there is roughly 1 second of waiting time per 1,000 rows.
- With the query.each() approach, the query takes roughly 60 seconds to process (HTTP waiting time).

I assume this is the fastest Parse can go.
Can I see some improvement by indexing the MongoDB objectId column?
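For context, this is the kind of traversal I imagine an objectId index would help with: keyset-style paging that starts each batch after the last objectId seen instead of using skip (a sketch; as far as I understand, query.each() already does something similar internally):

```js
// Page through the class ordered by objectId; each batch starts
// strictly after the last id of the previous batch, so the database
// can seek via the _id index rather than scanning past skipped rows.
async function fetchAllByObjectId() {
  const results = [];
  let lastId = null;
  while (true) {
    const query = new Parse.Query("SomeClass"); // placeholder class
    query.equalTo("SOMEKEY", "SOMEVALUE");
    query.ascending("objectId");
    query.limit(1000);
    if (lastId) {
      query.greaterThan("objectId", lastId);
    }
    const batch = await query.find();
    if (batch.length === 0) break;
    results.push(...batch);
    lastId = batch[batch.length - 1].id; // resume point for next page
  }
  return results;
}
```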
Is there a better approach for processing such a large dataset?
One alternative I can think of is Parse aggregate, but that approach doesn't respect Parse authentication, security, ACLs, etc., and doesn't work with my existing query logic.
Would it show any performance improvement and be worth a try?
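For comparison, this is roughly the aggregate shape I have in mind (a sketch; it runs with the master key, which is exactly why ACLs and user auth are not applied, and the stage syntax varies by Parse Server version: newer releases expect MongoDB-style $-prefixed stages, while older ones used bare stage names like match):

```js
// Aggregate alternative: filter server-side in one pipeline.
// Note: aggregate bypasses ACLs/CLPs because it requires the master key.
async function fetchAllViaAggregate() {
  const query = new Parse.Query("SomeClass"); // placeholder class
  return query.aggregate([
    { $match: { SOMEKEY: "SOMEVALUE" } },
  ]);
}
```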