
BUG: to_timedelta drops decimals from input if precision is greater than nanoseconds #36738

Closed
@Gvxy2

Description

  • I have checked that this issue has not already been reported.

  • I have confirmed this bug exists on the latest version of pandas.

  • (optional) I have confirmed this bug exists on the master branch of pandas.


I am trying to convert a time string to a timedelta. When the fractional-seconds part has more digits than nanosecond precision allows, the entire fractional part is silently reset to zero.

import pandas as pd
d = {'Time':['8:53:08.26','8:53:08.71800000001', '8:53:09.729']}
df = pd.DataFrame(data=d)
pd.to_timedelta(df["Time"])

The above piece of code gives:

0   0 days 08:53:08.260000
1          0 days 08:53:08
2   0 days 08:53:09.729000
Name: Time, dtype: timedelta64[ns]

As can be seen, all the decimals of the second entry are lost.

Problem description

With the present behavior, converting an array of values that is sorted in ascending order returns incorrect values that are no longer sorted, as shown below.
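
To make the sorting problem concrete (this check is only an illustration, not part of the original report), the converted series is no longer monotonically increasing even though the input strings are in ascending order:

import pandas as pd
d = {'Time': ['8:53:08.26', '8:53:08.71800000001', '8:53:09.729']}
df = pd.DataFrame(data=d)
# The fractional part of the second entry is dropped, so
# 08:53:08.000000 now sorts before 08:53:08.260000.
pd.to_timedelta(df["Time"]).is_monotonic_increasing  # False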

Expected Output

0   0 days 08:53:08.260000
1   0 days 08:53:08.718000
2   0 days 08:53:09.729000
Name: Time, dtype: timedelta64[ns]
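
A possible workaround (my own sketch, not a fix in pandas) is to trim each fractional-seconds part to at most nine digits, i.e. nanosecond precision, before converting, since timedelta64[ns] cannot represent the extra digits anyway:

import pandas as pd
d = {'Time': ['8:53:08.26', '8:53:08.71800000001', '8:53:09.729']}
df = pd.DataFrame(data=d)
# Keep at most 9 fractional digits so to_timedelta does not
# discard the whole fractional part.
trimmed = df["Time"].str.replace(r'(\.\d{9})\d+', r'\1', regex=True)
pd.to_timedelta(trimmed)

With this, the second value converts to 0 days 08:53:08.718000000 instead of 0 days 08:53:08.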

Output of pd.show_versions()

INSTALLED VERSIONS

commit : f2ca0a2
python : 3.8.5.final.0
python-bits : 64
OS : Windows
OS-release : 7
Version : 6.1.7601
machine : AMD64
processor : Intel64 Family 6 Model 78 Stepping 3, GenuineIntel
byteorder : little
LC_ALL : None
LANG : None
LOCALE : es_ES.cp1252

pandas : 1.1.1
numpy : 1.19.1
pytz : 2020.1
dateutil : 2.8.1
pip : 20.2.2
setuptools : 49.6.0.post20200925
