Hoping someone can help.
The Python script (a.py) runs fine and inserts into MySQL (mydatabase/table1) when executed from the terminal, but nothing is inserted or updated in mydatabase/table1 when the same script runs as a cron job.
a.py parses all *.htm files in the documents directory and inserts each resulting lst_url into the lst_url field of mydatabase/table1.
a.py is located in the documents directory.
The cron job is * * * * * documents/a.py
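For reference, a crontab entry normally specifies the interpreter and an absolute path to the script; a sketch of what that would look like (the interpreter and home paths here are placeholders, not taken from my setup):

```shell
# paths are placeholders; redirecting output to a log file helps debugging
* * * * * /usr/bin/python3 /home/toh/documents/a.py >> /tmp/a.log 2>&1
```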
Thanks in advance.
Toh
a.py
-------------------------------------------------
import glob
import re

import pymysql
from bs4 import BeautifulSoup

conn = pymysql.connect(host='127.0.0.1', port=3306, user='root',
                       passwd='XXXXXXXX', db='mysql', charset='utf8')
cur = conn.cursor()
cur.execute("USE mydatabase")
try:
    for page in glob.glob('documents/*.htm'):
        with open(page) as html_file:
            soup = BeautifulSoup(html_file, 'lxml')
        for item in soup.find_all('div', class_='listing_info'):
            # pull the listing URL out of the first matching anchor
            lst_url = item.find('a', href=re.compile(r'[/]([a-z]|[A-Z])\w+')).attrs['href']
            sql = "INSERT INTO mydatabase.table1 (lst_url) VALUES (%s)"
            cur.execute(sql, (lst_url,))
    conn.commit()
finally:
    conn.close()
    print("Done")
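One difference between the two environments worth noting: cron runs jobs from the user's home directory rather than from the script's directory, so the relative glob 'documents/*.htm' may resolve to a different location under cron than it does in the terminal. A minimal sketch of resolving the pattern relative to the script file itself (the helper name is mine, not from a.py):

```python
import os


def htm_pattern(script_path):
    """Return an absolute glob pattern for .htm files in the script's own directory."""
    base_dir = os.path.dirname(os.path.abspath(script_path))
    return os.path.join(base_dir, '*.htm')


# inside a.py this would be called as glob.glob(htm_pattern(__file__)),
# which resolves the same way regardless of cron's working directory
print(htm_pattern('documents/a.py'))
```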