
I have previously used pandas to read smaller tables (< 1 million rows). I attempted to read a very large table and my Python script fails. I tried to catch an exception but got no message. I then stripped the script down to the basic query and got an error, but again with no message:

  Message=
  Source=R:\0400_GIM\Working\0010_Python_Scripts_PRODVM\ACQ_Pivot\TestGeophysDetails.py
  StackTrace:
  File "R:\0400_GIM\Working\0010_Python_Scripts_PRODVM\ACQ_Pivot\TestGeophysDetails.py", line 60, in <module>
    dfHBC = pd.read_sql_query('SELECT * FROM [ACQ_RH_EXP].[dbo].[GEOPHYSDETAILS]', acqEngine)

How do I determine if the error is a timeout or something else? Thanks in advance.
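For reference, here is a minimal sketch of the kind of error handling I would add to narrow this down. The connection string below is a placeholder (the real acqEngine is built elsewhere in the script), and the exception classes are just the ones I would expect from pandas and SQLAlchemy. From what I have read, if the process dies without ever reaching an except block, that points to a crash inside the driver or the OS killing the process for memory rather than anything a Python handler can catch.

    import traceback

    import pandas as pd
    from sqlalchemy import create_engine, exc

    # Placeholder connection string -- the real acqEngine is created elsewhere.
    acqEngine = create_engine(
        "mssql+pyodbc://user:pass@ACQSERVER/ACQ_RH_EXP?driver=ODBC+Driver+17+for+SQL+Server"
    )

    try:
        dfHBC = pd.read_sql_query(
            "SELECT * FROM [ACQ_RH_EXP].[dbo].[GEOPHYSDETAILS]", acqEngine
        )
    except MemoryError:
        # The full result set (plus parsing buffers) did not fit in RAM.
        print("MemoryError while building the DataFrame")
        raise
    except exc.OperationalError as e:
        # Timeouts and other driver-level failures surface here; e.orig holds
        # the underlying DBAPI error (e.g. the pyodbc timeout message).
        print("Database error:", repr(e.orig))
        raise
    except Exception:
        # Anything else: dump the full traceback rather than a bare failure.
        traceback.print_exc()
        raise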

  • https://stackoverflow.com/questions/18107953/how-to-create-a-large-pandas-dataframe-from-an-sql-query-without-running-out-of I hope this answers your question. – Ahmad Farhan Feb 12 '20 at 04:42
  • Thanks Ahmad, that works for reading the data. I would still like to know why I am not receiving an error or timeout on the original query. – Quentin Feb 13 '20 at 01:31
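For anyone landing here later, the technique in the linked answer boils down to passing chunksize to read_sql_query so pandas streams the result set instead of materialising it all at once. A rough sketch, assuming the same acqEngine as in the question and an arbitrary chunk size:

    import pandas as pd

    query = "SELECT * FROM [ACQ_RH_EXP].[dbo].[GEOPHYSDETAILS]"

    # acqEngine is the SQLAlchemy engine from the question.
    # With chunksize, read_sql_query returns an iterator of DataFrames instead
    # of one huge DataFrame, so peak memory is bounded by the chunk size.
    frames = []
    for chunk in pd.read_sql_query(query, acqEngine, chunksize=50_000):
        frames.append(chunk)  # or aggregate/process each chunk here instead

    # Note: concatenating everything still needs RAM for the whole table;
    # if even that fails, process or summarise each chunk and discard it.
    dfHBC = pd.concat(frames, ignore_index=True)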
