
I have two Excel files: one contains data and the other is a template. I am trying to copy data from the first file into the template. The column names differ, but I have already mapped them using a dictionary.

import pandas as pd

pf = pd.read_csv(excelFile)      # source data
pf2 = pd.read_excel(template)    # template

mapping = {'Store Name': 'Sell-to Customer No.',
       'P.O. Number:': 'External Document No.',
       'PO Date & Time': 'Order Date',
       'Item #': 'Item No.',
       'Quantity': 'Quantity',
       'Carrier Name': 'E-Ship Agent Service',
       'Shipping Account Number': 'Ship-to Code',
       'Ship To Name': 'Ship-to Name',
       'Ship to Address': 'Ship-to Address',
       'Ship To Address 2': 'Ship-to Address 2',
       'Ship To City': 'Ship-to City',
       'Ship To State': 'Ship-to State',
       'Ship To Zip': 'Ship-to ZIP Code',
       'Ship To Phone': 'Ship-to Contact',
       'Wholesale Price': 'Your Reference'}

for i in mapping:
    if mapping[i] in pf2.columns:
        pf2[mapping[i]] = pf[i]
        # print(pf[i])            # this gives me all rows
        # print(pf2[mapping[i]])  # this gives me at most 4 rows

What am I doing wrong?
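As an aside, here is a minimal sketch (with made-up data, not my actual files) of applying such a dictionary in one step via DataFrame.rename instead of copying column by column:

```python
import pandas as pd

# Hypothetical mini-frame standing in for pf; the mapping mirrors the one above.
pf = pd.DataFrame({'Quantity': [1, 2], 'Item #': ['A', 'B']})
mapping = {'Item #': 'Item No.', 'Quantity': 'Quantity'}

# rename relabels the source columns in a single call
renamed = pf.rename(columns=mapping)
print(list(renamed.columns))  # ['Quantity', 'Item No.']
```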

UPDATE: Example output for the first iteration. From pf2[mapping[i]] I get:

0   16.8
1   26.4
2   14.4
3   20.0

and from pf[i] I get:

0   16.8
1   26.4
2   14.4
3   20.0
4   28.0
5   16.0
6   26.4
...
35  26.3
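A minimal synthetic example (made-up numbers) that reproduces the behavior: assigning a longer Series into a frame whose index already has only 4 rows keeps just the values whose index labels match, rather than appending the rest:

```python
import pandas as pd

# Source column is longer than the target frame
source = pd.Series([16.8, 26.4, 14.4, 20.0, 28.0, 16.0])
target = pd.DataFrame(index=range(4))  # template with 4 pre-existing rows

# Assignment aligns on the index: only labels 0-3 exist in target,
# so the extra source rows are dropped, not appended
target['Your Reference'] = source
print(len(target))  # 4
```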
  • Could you add small sample of your data? – Dani Mesejo Oct 25 '18 at 00:37
  • Actually, I just fixed it. The problem was that in my template file, rows 0, 1, 2, and 3 already had data for a specific column, which is why it only assigned values to those rows. When I removed them, it worked perfectly. Thanks. – paul Oct 25 '18 at 00:50
  • @DanielMesejo Just out of curiosity, do you know how to stop pandas from adding the very first row that enumerates the rows? – paul Oct 25 '18 at 00:54
  • You could simply add a boolean marker for the first time; also see https://stackoverflow.com/questions/20637439/skip-rows-during-csv-import-pandas – Dani Mesejo Oct 25 '18 at 00:56
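For reference, a minimal sketch of the skiprows approach from the linked answer, using a synthetic in-memory CSV rather than the real template file:

```python
import io
import pandas as pd

csv = "col\n1\n2\n3\n4\n5\n6\n"
# skiprows=range(1, 5) keeps the header (line 0) but drops data lines 1-4
df = pd.read_csv(io.StringIO(csv), skiprows=range(1, 5))
print(df['col'].tolist())  # [5, 6]
```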

0 Answers