
Assigning datetime array to column fails with OutOfBoundsDatetime when having NaT and other unit as [ns] #7492

Closed
Description

@jorisvandenbossche

Assigning an array of datetime64[ns] values that includes a NaT just works:

In [85]: a = np.array([1, 'nat'], dtype='datetime64[ns]')

In [86]: pd.Series(a)
Out[86]: 
0   1970-01-01 00:00:00.000000001
1                             NaT
dtype: datetime64[ns]

In [88]: df = pd.Series(a).to_frame()

In [89]: df['new'] = a

But with an array of another date unit, converting it to a Series still works, while assigning it directly to a column no longer does, raising an OutOfBoundsDatetime error:

In [90]: a = np.array([1, 'nat'], dtype='datetime64[s]')

In [91]: pd.Series(a)
Out[91]: 
0   1970-01-01 00:00:01
1                   NaT
dtype: datetime64[ns]

In [92]: df['new'] = a
Traceback (most recent call last):
...
  File "tslib.pyx", line 1720, in pandas.tslib.cast_to_nanoseconds (pandas\tslib.c:27435)

  File "tslib.pyx", line 1023, in pandas.tslib._check_dts_bounds (pandas\tslib.c:18102)

OutOfBoundsDatetime: Out of bounds nanosecond timestamp: 292277026596-12-03 08:29:52
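A likely explanation (my guess from the traceback, not verified against the tslib source): NaT is stored as the int64 minimum regardless of the datetime64 unit, and cast_to_nanoseconds seems to treat that sentinel as an ordinary seconds value, which then fails the nanosecond bounds check:

import numpy as np

a = np.array([1, 'nat'], dtype='datetime64[s]')

# NaT is represented by the int64 minimum, whatever the datetime64 unit:
print(a.view('int64'))        # [1, -9223372036854775808]
print(np.iinfo('int64').min)  # -9223372036854775808

# Interpreted as a count of seconds instead of as the NaT sentinel, that
# value is on the order of 292 billion years from the epoch, far outside
# the ~584-year span that datetime64[ns] can represent, which matches the
# year 292277026596 in the OutOfBoundsDatetime message.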

If you first convert it to a Series, it does work. The same is true if no NaT is present:

In [93]: df['new'] = pd.Series(a)

In [94]: a = np.array([1, 2], dtype='datetime64[s]')

In [95]: df['new'] = a
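As a workaround until this is fixed, explicitly casting the array to datetime64[ns] (or going through a Series, as above) seems to avoid the error. A minimal sketch, assuming a df built as in the examples above:

import numpy as np
import pandas as pd

df = pd.Series(np.array([1, 'nat'], dtype='datetime64[ns]')).to_frame()
a = np.array([1, 'nat'], dtype='datetime64[s]')

# NumPy's astype preserves NaT when converting between datetime64 units,
# so casting to [ns] first sidesteps the failing cast_to_nanoseconds path:
df['new'] = a.astype('datetime64[ns]')

# Equivalently, let pandas do the conversion via a Series first:
df['new2'] = pd.Series(a)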

Metadata


    Labels

    Bug · Datetime (Datetime data dtype) · Missing-data (np.nan, pd.NaT, pd.NA, dropna, isnull, interpolate)
