I have a pandas DataFrame of [1127618 rows x 64 columns] and tried to save it into MySQL with the command below:
engine = create_engine('mysql+mysqlconnector://user:password@127.0.0.1/joohun_test', echo=False)
df.to_sql(name='tst_dr3_201801', con=engine, if_exists='replace', index=False)
When I execute this command, it runs forever and never seems to finish. If I reduce the DataFrame to [10000 rows × 64 columns], it does get saved into the MySQL database, but then I run into a different problem with the final form of the data stored in MySQL. As you can see in the columns "smaster_uuttype" and "user", whitespace has been inserted between the characters.
MariaDB [joohun_test]> select serialno, uuttype, smaster_uuttype,failingtestname,cpptimeid, user, year, month from tst_dr3_sample limit 10;
+-------------+--------------------+--------------------------------------+-----------------+-----------+------------------+------+-------+
| serialno | uuttype | smaster_uuttype | failingtestname | cpptimeid | user | year | month |
+-------------+--------------------+--------------------------------------+-----------------+-----------+------------------+------+-------+
| ABCDEFGH | ABCD-ABC-2500ABCD= | D E F G - H I J - 2 5 0 0 A B C D = | | NULL | d u n g l e | 2018 | 1 |
However, looking at the same row of the pandas DataFrame, there are no spaces between the characters:
serialno uuttype smaster_uuttype failingtestname cpptimeid user year month
0 ABCDEFGH ABCD-ABC-2500ABCD= DEFG-HIJ-2500ABCD= None dungle 2018 1
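One guess I had is that the interleaved spaces are actually NUL bytes from UTF-16-encoded source data being stored as 8-bit text. Here is a minimal sketch (with a made-up one-column sample, not my real data) of stripping `\x00` from the string columns before the write, though I'm not sure this is the root cause:

```python
import pandas as pd

# Hypothetical sample reproducing the symptom: a string carrying
# embedded NUL bytes, as happens when UTF-16 data is decoded as
# a single-byte encoding.
df = pd.DataFrame({'user': ['d\x00u\x00n\x00g\x00l\x00e\x00']})

# Strip NUL bytes from every object (string) column before to_sql.
for col in df.select_dtypes(include='object').columns:
    df[col] = df[col].str.replace('\x00', '', regex=False)

print(df['user'].iloc[0])  # → dungle
```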
There are two things I'd like to know:
- Is there a way to save the data into MySQL the same way the strings are stored in pandas, without spaces between the characters?
- Is there a way to save a large DataFrame into MySQL without dissecting it into small frames?
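For the second point, one approach I'm considering is letting `to_sql` batch the inserts via its `chunksize` parameter rather than splitting the DataFrame myself. A sketch of the mechanics, using an in-memory SQLite engine as a stand-in so it runs anywhere (I'd swap in my `mysql+mysqlconnector` URL and real DataFrame):

```python
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for the MySQL engine in this sketch.
engine = create_engine('sqlite://')

# Dummy frame standing in for the real [1127618 x 64] DataFrame.
df = pd.DataFrame({'serialno': ['ABCDEFGH'] * 25000,
                   'year': [2018] * 25000})

# chunksize batches the INSERTs (10,000 rows per round trip)
# instead of writing all rows at once.
df.to_sql(name='tst_dr3_201801', con=engine,
          if_exists='replace', index=False, chunksize=10000)

count = pd.read_sql('SELECT COUNT(*) AS n FROM tst_dr3_201801', engine)
print(count['n'].iloc[0])  # → 25000
```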