
I'm trying to use pandas together with SQLAlchemy. My problem is matching the time values produced by pandas with what SQLite accepts.

I have an Excel file with 130 columns; some columns contain time values, or pandas gives NaT for them. I don't want to address each column by name, so I first filter out the time columns with:

u = df.select_dtypes(include=['datetime64[ns]'])

then replace the NaT values with:

df[u.columns] = df[u.columns].where(df.notnull(), pd.Timestamp('1900-01-01'))

That works, so far. Then I tried:

df[u.columns] = df[u.columns].where(df.notnull(), pd.Timestamp('1900-01-01').to_pydatetime())

but nothing changes; the result type is still Timestamp instead of Python datetime.

I also tried to change it later, after generating a Python dict, but couldn't get it to work.

Does anybody have advice?
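For illustration, here is a minimal sketch of the direction I'm exploring (the `start` column and the sample values are made up). Since a `datetime64[ns]` column always hands back `pd.Timestamp` objects regardless of what is assigned into it, one option is to fill NaT first and then convert each datetime column to `object` dtype, so the elements can be plain `datetime.datetime`:

```python
import datetime
import pandas as pd

# hypothetical stand-in for the Excel import
df = pd.DataFrame({
    "start": pd.to_datetime(["2021-12-20", None]),
    "name": ["a", "b"],
})

# fill NaT while the columns are still datetime64 ...
u = df.select_dtypes(include=["datetime64[ns]"])
df[u.columns] = df[u.columns].fillna(pd.Timestamp("1900-01-01"))

# ... then convert each datetime column to object dtype so the
# elements become plain datetime.datetime, not pd.Timestamp
for col in u.columns:
    df[col] = pd.Series(df[col].dt.to_pydatetime(), index=df.index, dtype=object)

# df["start"] now has dtype object, elements of type datetime.datetime
```

The explicit `dtype=object` in the `pd.Series` constructor matters: without it, pandas would infer the datetime values right back into `datetime64[ns]`.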

    if you have a `'datetime64[ns]'` Series (df column), you can't change the dtype of individual elements (e.g. to Python datetime). Why would you do that in the first place? – FObersteiner Dec 20 '21 at 15:12
  • I'm trying to push the data to an SQLite database via the Python SQLAlchemy API, for easier handling under the hood of Flask (in the past I used the sqlite3 Python API directly). SQLAlchemy does not understand datetime64[ns]. –  Dec 20 '21 at 17:45
  • So actually, you want the whole column to be Python datetime? – FObersteiner Dec 20 '21 at 18:09
  • Ok, so columns, not only individual elements as your code snippet suggests. – FObersteiner Dec 20 '21 at 18:26
  • related: [Storing pure python datetime.datetime in pandas DataFrame](https://stackoverflow.com/q/39278042/10197418) – FObersteiner Dec 21 '21 at 07:32
  • After some tests, the only way I see is ```df[u.columns] = df[u.columns].where(df.notnull(), pd.Timestamp('1900-01-01'))``` to fill NaT, and then changing the pandas Timestamp after to_dict conversion with to_pydatetime().date(). –  Dec 21 '21 at 09:04
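A minimal sketch of the workaround described in the last comment (the `start` column and sample values are made up): fill NaT while the columns are still `datetime64[ns]`, convert the frame to records with `to_dict`, then turn each `pd.Timestamp` into a plain `datetime.date` afterwards:

```python
import pandas as pd

# hypothetical stand-in for the Excel import
df = pd.DataFrame({"start": pd.to_datetime(["2021-12-20", None])})

# fill NaT in the datetime columns with a sentinel date
u = df.select_dtypes(include=["datetime64[ns]"])
df[u.columns] = df[u.columns].where(df[u.columns].notnull(),
                                    pd.Timestamp("1900-01-01"))

# convert to a list of dicts, then replace each pd.Timestamp
# with a plain datetime.date that sqlite/sqlalchemy can handle
records = df.to_dict(orient="records")
for rec in records:
    for col in u.columns:
        rec[col] = rec[col].to_pydatetime().date()
```

This keeps the pandas side untouched and only does the type conversion at the boundary, right before the rows are handed to the database layer.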

0 Answers