I am scraping a table from a website and want to create a pandas DataFrame out of it. My question: what is the best method to achieve this in terms of efficiency / best practice?
What I have done is, while scraping, append items to several lists, one per column. Once I'm done parsing the table from the website, I create the DataFrame and assign each list to a column name. See below:
zip_df = pd.DataFrame(index=zip_codes)
zip_df['Latitude'] = latitudes
zip_df['Longitude'] = longitudes
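For completeness, here is a runnable sketch of that pattern; the list values are made-up placeholders standing in for the scraped data:

```python
import pandas as pd

# Hypothetical stand-ins for the values collected during scraping
zip_codes = ["10001", "94105", "60601"]
latitudes = [40.7506, 37.7898, 41.8853]
longitudes = [-73.9972, -122.3942, -87.6217]

# Build all columns in a single constructor call rather than
# assigning them one at a time after construction
zip_df = pd.DataFrame(
    {"Latitude": latitudes, "Longitude": longitudes},
    index=zip_codes,
)
```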
There seem to be many different ways to approach this (e.g. Python pandas: fill a dataframe row by row). Is the way I am doing it the most logical, or are there better approaches?