Description
For the same netCDF file used in #821, value scaling seems to be wrongly applied when computing float64 surface temperature values from the (signed) short variable `analysed_sst`:
```
short analysed_sst(time=1, lat=3600, lon=7200);
  :_FillValue = -32768S; // short
  :units = "kelvin";
  :scale_factor = 0.01f; // float
  :add_offset = 273.15f; // float
  :long_name = "analysed sea surface temperature";
  :valid_min = -300S; // short
  :valid_max = 4500S; // short
  :standard_name = "sea_water_temperature";
  :depth = "20 cm";
  :source = "ATSR<1,2>-ESACCI-L3U-v1.0, AATSR-ESACCI-L3U-v1.0, AVHRR<12,14,15,16,17,18>_G-ESACCI-L2P-v1.0, AVHRRMTA-ESACCI-L2P-v1.0";
  :comment = "SST analysis produced for ESA SST CCI project using the OSTIA system in reanalysis mode.";
  :_ChunkSizes = 1, 1196, 2393; // int
```
The decoded values range roughly from -50 to 600 Kelvin instead of the expected 270 to 310 Kelvin. The problem seems to arise from the signed short raw values in the netCDF file being misinterpreted during decoding.
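For reference, here is a minimal sketch of the decoding that CF conventions prescribe for this variable. The attribute values are taken from the header above; the raw sample array is made up for illustration. Applied to signed shorts within `valid_min`/`valid_max`, it yields physically plausible sea surface temperatures:

```python
import numpy as np

# Attribute values copied from the variable header above.
scale_factor = np.float32(0.01)
add_offset = np.float32(273.15)
fill_value = np.int16(-32768)

# Hypothetical raw signed-short values as stored in the file,
# mostly within valid_min/valid_max, plus one fill value.
raw = np.array([-300, 0, 4500, -32768], dtype=np.int16)

# Mask the fill value, then apply scale and offset in float64.
masked = np.ma.masked_equal(raw, fill_value)
decoded = masked.astype(np.float64) * scale_factor + add_offset

print(decoded)  # approx. [270.15, 273.15, 318.15, --], i.e. plausible Kelvin SST
```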
Here is a notebook that explains the issue in more detail: https://github.com/CCI-Tools/sandbox/blob/4c7a98a4efd1ba55152d2799b499cb27027c2b45/notebooks/norman/xarray-sst-issues.ipynb