I'm currently working on a web scraper; my main goal is to collect data and put it into a DataFrame. I am having some issues trying to append some data.
This is the code:
import re
import time
from difflib import SequenceMatcher
import numpy as np
import pandas as pd
from selenium import webdriver
from selenium.webdriver import Chrome
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.expected_conditions import \
    presence_of_element_located
from selenium.webdriver.support.ui import WebDriverWait
import functions_database as fd
import page_objects as po # Internal code
import scrapers as scr # Internal code
df_read_amz = pd.read_csv(
    '/home/daniel/amazon-project-scrapers/amz_scraper.edited.csv')
amazon_data = list(df_read_amz.ASIN)
offers = []
for asin in amazon_data:
    amz_offers = scr.get_offers(asin)
    print('this is amz_offers', amz_offers)
    dic_amz_offers = offers.append(amz_offers)
    print('this is list dictionary', dic_amz_offers)
df_amz_offers = pd.DataFrame(dic_amz_offers)
print(df_amz_offers)
Notice that:
scr.get_offers(asin) is an internal function developed by a third party; its output is the following:
this is amz_offers [{'seller_name': 'Under Moments', 'seller_url': 'https://www.amazon.com/gp/aag/main/ref=olp_merch_name_1/137-8432761-0781850?ie=UTF8&asin=B00I2ZFMO4&isAmazonFulfilled=0&seller=A2EDR7YR1BSPTD', 'condition': 'New', 'delivery': 'FBM', 'shipping': 0.0, 'price': 9.78, 'number_of_ratings': 179481, 'asin': 'B00I2ZFMO4', 'parent_asin': None}]
I am trying to make a DataFrame out of that output, and the most solid idea I have is to append it to a list:
dic_amz_offers = offers.append(amz_offers)
print('this is list dictionary:', dic_amz_offers)
But this happens: this is list dictionary: None
df_amz_offers = pd.DataFrame(dic_amz_offers)
print(df_amz_offers)
As a consequence, I also get an empty DataFrame:
Empty DataFrame
Columns: []
Index: []
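For reference, here is a minimal, self-contained sketch of the same pattern with Selenium and the internal modules stripped out; the hard-coded sample_offer is just an abridged copy of the dictionary printed above, standing in for the real scr.get_offers call. It reproduces exactly the behaviour I describe:

import pandas as pd

# abridged stand-in for what scr.get_offers(asin) prints above
sample_offer = [{'seller_name': 'Under Moments', 'condition': 'New',
                 'delivery': 'FBM', 'shipping': 0.0, 'price': 9.78,
                 'number_of_ratings': 179481, 'asin': 'B00I2ZFMO4',
                 'parent_asin': None}]

offers = []
dic_amz_offers = offers.append(sample_offer)
print(dic_amz_offers)   # None, same as in my script
print(offers)           # the list itself did grow: [[{...}]]

df_amz_offers = pd.DataFrame(dic_amz_offers)
print(df_amz_offers)    # Empty DataFrame, same as above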
How do I get the DataFrame I need? Thank you :)