I have a Python script that pulls NBA game data for each day of the season. It has a for loop that runs from day 1 to the current day. For each day, the script:
- Opens the schedule webpage for that day and scrapes all game info with Beautiful Soup
- Stores the results in a pandas DataFrame
- Appends that data to a BigQuery table
The DataFrame is never more than 15 rows (usually fewer), yet the script randomly hangs while inserting into the BigQuery table with the to_gbq function. Sometimes it hangs on the first day, sometimes on the third, sometimes later, but it has never come close to running for every day before hanging.
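For context, here is a simplified sketch of the loop structure (the scraping helper is stubbed out with fake rows, and the names are placeholders, not my real code):

```python
import pandas as pd

def scrape_games_for_day(day):
    # The real script fetches that day's schedule page and parses the
    # game rows with Beautiful Soup; stubbed here with one fake row so
    # the data shape is clear. (Never more than ~15 rows per day.)
    return [{"day": day, "home_team": "AAA", "away_team": "BBB"}]

def run(first_day, last_day):
    total_rows = 0
    for day in range(first_day, last_day + 1):
        game_df = pd.DataFrame(scrape_games_for_day(day))
        total_rows += len(game_df)
        # ...then the script appends game_df to BigQuery with to_gbq,
        # which is the call that intermittently hangs
    return total_rows
```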
This is the code that it gets stuck on:
game_df.to_gbq(
    "nba_data.games",
    if_exists="append",
    project_id="my_test_project",
    credentials=credentials,
)
Any idea why this would happen?