I'm writing a script to quickly organize some data I receive. It all comes in as individual CSVs; the script compiles them and labels each column with its time stamp.

When I run it in Spyder it works and creates the file. I then tried to turn it into a function so I could run it from the command line, but nothing happens. I can still run it as a function in Spyder, where I call featable() directly. I also tried that in PCP, but that didn't work either. What am I doing wrong?
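In Spyder the call that works looks roughly like this (the arguments are the same ones I passed at the command prompt in the error message below):

# after running the script below in Spyder, so that featable is defined
featable('E:/UA Data/Clario_Assay_Data/FEA_6_16/PLATE20',
         'E:/UA Data/Clario_Assay_Data/FEA_6_16/PLATE20/11-32_127_PLATE ID 20_RUN NUMBER 1.CSV',
         'test')

The full script is below.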
# Import modules
import pandas as pd
import os
import sys

print('Starting')


def featable(workspace, basefile, filename):
    print('Function starting')
    # future arguments
    path = workspace
    primeraw = basefile
    newfilename = filename

    # Creating base for new file
    prime1 = pd.read_csv(primeraw, usecols=[i for i in range(3)])
    prime = pd.DataFrame(prime1)
    print("Base file started")

    # Appending raw data to prime
    files = os.listdir(path)
    for f in files:
        os.chdir(path)
        # get time stamp from the file name, e.g. "11-32..." -> "11:32"
        time = str(f[0:5]).replace("-", ":")
        tempfile = pd.read_csv(f, usecols=[i for i in range(4)])
        # append raw data column to prime
        df = pd.DataFrame(tempfile[' Raw Data (365-15/450-20)'])
        prime[time] = df
        print("Column added")

    # Creating new file
    prime.to_csv(r"%s/%s.csv" % (path, newfilename), index=False, header=True)
    print("Done")


if __name__ == "__main__":
    [workspace, basefile, filename] = sys.argv[1:]
    featable(workspace, basefile, filename)
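For reference, this is how I understand the sys.argv part is supposed to behave: it is a list of the whitespace-separated (or quoted) tokens from the command line, with sys.argv[0] being the script path, so the unpacking above needs exactly three more values after the script name. A minimal standalone sketch of that assumption (argcheck.py is a made-up name, not part of my script):

# argcheck.py -- hypothetical standalone sketch, only to inspect sys.argv
import sys

# sys.argv[0] is the path of the script itself; each quoted or
# whitespace-separated token after it becomes one further list element
print(sys.argv)
print(len(sys.argv))  # the unpacking in CTF.py needs len(sys.argv) == 4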
Error message:
(arcgispro-py3-clone) E:\UA Data>CTF.py('E:/UA Data/Clario_Assay_Data/FEA_6_16/PLATE20','E:/UA Data/Clario_Assay_Data/FEA_6_16/PLATE20/11-32_127_PLATE ID 20_RUN NUMBER 1.CSV','test')
Starting
Traceback (most recent call last):
File "E:\UA Data\CTF.py", line 49, in <module>
[workspace, basefile, filename] = sys.argv[1:]
ValueError: too many values to unpack
(arcgispro-py3-clone) E:\UA Data>