Issue Description
On Parse Server backed by Aurora PostgreSQL:
When saving an array containing a 0 or a decimal value (e.g. 1.1) to a field of type Array, those numeric values are converted to strings somewhere along the way. Any other integer is stored correctly. For example, the stored array becomes ["0", "1.1", 2, 3, 4, 5] when [0, 1.1, 2, 3, 4, 5] is expected. I have tried several ways to force the values to stay numeric (such as parseInt(0)), but nothing works. The save is done using .set (for example: record.set('array', [0, 1.1, 2, 3, 4, 5])).
Steps to reproduce
- Create an array containing a 0 or a decimal value, e.g. [0, 1.1, 2, 3, 4, 5]
- Set that array on an object, e.g. record.set('array', [0, 1.1, 2, 3, 4, 5]);
- Save the object, e.g. record.save();
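The steps above can be sketched with the Parse JS SDK. This is a hypothetical reproduction script, not taken from the report: the class name `TestClass` and the helper `reproduce` are placeholders, and actually running the save requires an initialized SDK client pointed at a parse-server instance backed by Aurora PostgreSQL.

```javascript
// Hypothetical reproduction sketch. 'TestClass' and the field name 'array'
// are placeholders; a configured parse-server is required to run the save.
async function reproduce() {
  const Parse = require('parse/node'); // assumes the 'parse' npm package
  // Parse.initialize(appId, jsKey) and Parse.serverURL are assumed configured.
  const record = new Parse.Object('TestClass');
  record.set('array', [0, 1.1, 2, 3, 4, 5]);
  await record.save();
  const fetched = await new Parse.Query('TestClass').get(record.id);
  return fetched.get('array'); // reported outcome: ["0", "1.1", 2, 3, 4, 5]
}

// Sanity check: the input contains only JS numbers before the save,
// so any string coercion happens on the server/database side.
const input = [0, 1.1, 2, 3, 4, 5];
const onlyNumbers = input.every((v) => typeof v === 'number');
console.log(onlyNumbers); // prints true
```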
Expected Results
The record's 'array' field contains [0, 1.1, 2, 3, 4, 5]
Actual Outcome
The 'array' field contains ["0", "1.1", 2, 3, 4, 5]
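As a possible client-side workaround (not part of the original report, and it does not fix the underlying storage issue), stringified numeric entries could be coerced back to numbers after fetching. The helper name `normalizeNumbers` is made up for illustration:

```javascript
// Coerce numeric strings in a fetched array back to numbers, leaving
// everything else untouched. Hypothetical workaround, not a fix.
function normalizeNumbers(arr) {
  return arr.map((v) =>
    typeof v === 'string' && v.trim() !== '' && !Number.isNaN(Number(v))
      ? Number(v)
      : v
  );
}

const fetched = ['0', '1.1', 2, 3, 4, 5]; // mirrors the actual outcome above
const normalized = normalizeNumbers(fetched);
console.log(normalized); // [ 0, 1.1, 2, 3, 4, 5 ]
```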
Environment Setup
Server
- parse-server version (Be specific! Don't say 'latest'.) : 2.8.2
- Localhost or remote server? (AWS, Heroku, Azure, Digital Ocean, etc): AWS/Localhost
Database
- MongoDB version: N/A (using Aurora PostgreSQL 10.4)
- Localhost or remote server? (AWS, mLab, ObjectRocket, Digital Ocean, etc): AWS
Logs/Trace
N/A