I am trying to read from and write to the same file. Currently the data in 2289newsML.txt
exists as normal sentences, but I want to append to the file so it also stores tokenized versions of the same sentences.
I used the code below, but even though it prints the tokenized sentences, it doesn't write them to the file.
from pathlib import Path
from nltk.tokenize import word_tokenize
news_folder = Path("file\\path\\")
news_file = (news_folder / "2289newsML.txt")
f = open(news_file, 'r+')
data = f.readlines()
for line in data:
    words = word_tokenize(line)
    print(words)
    f.writelines(words)
f.close
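For reference, the intended read-then-append flow can be sketched as below. This is a minimal sketch, not the original code: the `append_tokenized` helper and the space-joining of tokens are illustrative choices (note that `writelines` inserts no separators between items, and `f.close` without parentheses never actually calls the method, so buffered output may never reach disk).

```python
from pathlib import Path

def append_tokenized(path, tokenize):
    """Append a tokenized copy of every line to the same file."""
    path = Path(path)
    # Read everything up front, so we never read and write through one handle.
    lines = path.read_text().splitlines()
    # "a" opens for appending; the with-block guarantees close() and a flush.
    with path.open("a") as f:
        for line in lines:
            tokens = tokenize(line)
            # writelines() would add no separators, so join tokens explicitly.
            f.write(" ".join(tokens) + "\n")

# With NLTK (assumes nltk and its "punkt" tokenizer data are installed):
# from nltk.tokenize import word_tokenize
# append_tokenized("2289newsML.txt", word_tokenize)
```

Any tokenizer callable can be passed in, so the file-handling fix can be tested independently of NLTK.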
Any help would be appreciated.
Thanks :)