Int64 with null value mangles large-ish integers #30268

Closed
@craigdsthompson

Description

Code Sample, a copy-pastable example if possible

import pandas as pd
import numpy as np

x = 9999999999999999
y = 123123123123123123
z = 10000000000000543
s = pd.Series([1, 2, 3, x, y, z], dtype="Int64")
s[3] == x  # True
s
s2 = pd.Series([1, 2, 3, x, y, z, np.nan], dtype="Int64")
s2[3] == x  # False
s2
np.iinfo(np.int64).max

With interpreter output:

>>> import pandas as pd
>>> import numpy as np
>>> 
>>> x = 9999999999999999
>>> y = 123123123123123123
>>> z = 10000000000000543
>>> s = pd.Series([1, 2, 3, x, y, z], dtype="Int64")
>>> s[3] == x  # True
True
>>> s
0                     1
1                     2
2                     3
3      9999999999999999
4    123123123123123123
5     10000000000000543
dtype: Int64
>>> s2 = pd.Series([1, 2, 3, x, y, z, np.nan], dtype="Int64")
>>> s2[3] == x  # False
False
>>> s2
0                     1
1                     2
2                     3
3     10000000000000000
4    123123123123123120
5     10000000000000544
6                   NaN
dtype: Int64
>>> np.iinfo(np.int64).max
9223372036854775807

Problem description

It seems that the presence of np.nan values in a column being cast to Int64 causes some non-null values to be mangled. This happens with large-ish values that are still well below np.iinfo(np.int64).max.

Given that Int64 is the "Nullable integer" data type, null values should be allowed, and their presence should certainly not silently change the values of other elements in the Series.
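
For what it's worth, the mangled values are consistent with the input list being coerced through a float64 intermediate (because of the np.nan) before the cast to Int64: float64 has a 53-bit mantissa, so integers above 2**53 are not all exactly representable. A minimal sketch, assuming that intermediate float conversion is indeed the cause (the assumption is mine, not confirmed here):

import numpy as np

x = 9999999999999999  # > 2**53, so not exactly representable as float64

# Suspected intermediate step: int -> float64 -> int64 round trip
# loses precision for integers above 2**53.
as_float = np.float64(x)
print(as_float)            # 1e+16
print(np.int64(as_float))  # 10000000000000000, matching the mangled value above
print(2**53)               # 9007199254740992, limit for exact integer representation in float64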

Expected Output

>>> import pandas as pd
>>> import numpy as np
>>> 
>>> x = 9999999999999999
>>> y = 123123123123123123
>>> z = 10000000000000543
>>> s = pd.Series([1, 2, 3, x, y, z], dtype="Int64")
>>> s[3] == x  # True
True
>>> s
0                     1
1                     2
2                     3
3      9999999999999999
4    123123123123123123
5     10000000000000543
dtype: Int64
>>> s2 = pd.Series([1, 2, 3, x, y, z, np.nan], dtype="Int64")
>>> s2[3] == x  # True (was False above)
True
>>> s2
0                     1
1                     2
2                     3
3      9999999999999999
4    123123123123123123
5     10000000000000543
6                   NaN
dtype: Int64
>>> np.iinfo(np.int64).max
9223372036854775807

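As a possible workaround sketch (not an officially documented pattern, and it assumes the float64 intermediate is the culprit), the nullable array can be built directly from an int64 values array plus a boolean mask via pd.arrays.IntegerArray (available since pandas 0.24), so no float conversion is involved:

import numpy as np
import pandas as pd

x = 9999999999999999
y = 123123123123123123
z = 10000000000000543

# Build the Int64 array from int64 values + boolean mask directly;
# the value in the masked slot is a dummy 0.
values = np.array([1, 2, 3, x, y, z, 0], dtype="int64")
mask = np.array([False, False, False, False, False, False, True])
s2 = pd.Series(pd.arrays.IntegerArray(values, mask))

print(s2[3] == x)  # True: large values are preserved
print(s2)          # last element displays as missing
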
Output of pd.show_versions()

INSTALLED VERSIONS

commit : None
python : 3.7.1.final.0
python-bits : 64
OS : Darwin
OS-release : 19.0.0
machine : x86_64
processor : i386
byteorder : little
LC_ALL : None
LANG : en_CA.UTF-8
LOCALE : en_CA.UTF-8

pandas : 0.25.3
numpy : 1.17.4
pytz : 2019.3
dateutil : 2.8.1
pip : 10.0.1
setuptools : 39.0.1
Cython : None
pytest : None
hypothesis : None
sphinx : None
blosc : None
feather : None
xlsxwriter : None
lxml.etree : None
html5lib : None
pymysql : None
psycopg2 : None
jinja2 : None
IPython : None
pandas_datareader: None
bs4 : None
bottleneck : None
fastparquet : None
gcsfs : None
lxml.etree : None
matplotlib : None
numexpr : None
odfpy : None
openpyxl : None
pandas_gbq : None
pyarrow : None
pytables : None
s3fs : None
scipy : None
sqlalchemy : None
tables : None
xarray : None
xlrd : None
xlwt : None
xlsxwriter : None

Labels

Bug
ExtensionArray (Extending pandas with custom dtypes or arrays)
NA - MaskedArrays (Related to pd.NA and nullable extension arrays)