
I ran a Python script and wrote its log file using:

nohup python my_script.py >> log.txt

However, the Chinese characters do not show up correctly in log.txt, so I thought maybe it is the >> redirection in Linux that doesn't support characters encoded in UTF-8.


In my script I used print to show the UTF-8 characters, and it works well in the Python shell. So I want to know how I can write UTF-8 characters to the log file correctly. Thanks.
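
For reference, what seems to differ between the interactive shell and the redirect is the encoding Python chooses for stdout. A small check, assuming Python 2 (as in the answer below) and an illustrative file name check_encoding.py:

import sys

# In an interactive shell this usually prints the terminal's encoding,
# e.g. UTF-8. Under a redirect such as >> log.txt it prints None, and
# Python 2 then falls back to ASCII for unicode strings, which cannot
# represent Chinese characters.
print sys.stdout.encoding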

  • What is the file's encoding? Show it using the command: set fileencoding – hello.co May 24 '13 at 05:55
    See http://stackoverflow.com/questions/5530708/can-i-redirect-unicode-output-from-the-consol-directly-into-a-file and http://stackoverflow.com/questions/8016236/python-unicode-handling-differences-between-print-and-sys-stdout-write – torek May 24 '13 at 06:11

1 Answer


I've found the solution. Just add one line at the head of the Python script so that it can contain UTF-8 string literals:

# -*- coding: UTF-8 -*-

For example, a simple script named utf8.py:

# -*- coding: UTF-8 -*-

if __name__ == '__main__':
    s = u'中文'
    # Encode explicitly, so the bytes written to stdout are UTF-8 even
    # when stdout is redirected to a file.
    print s.encode('utf-8')

Then redirect the printed output to a text file:

[zfz@server tmp]$ python utf8.py >> utf8.txt
[zfz@server tmp]$ cat utf8.txt 
中文

The Chinese characters are written to the text file correctly.
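
For completeness, one can also avoid relying on print and shell redirection by writing the log with the standard logging module, whose FileHandler accepts an encoding. The following is only a sketch; the file name utf8_logging.py and the log format are made up for illustration:

# -*- coding: UTF-8 -*-
# utf8_logging.py - illustrative sketch
import logging

if __name__ == '__main__':
    # FileHandler encodes unicode messages as UTF-8 itself, so the
    # result does not depend on what stdout is attached to.
    handler = logging.FileHandler('log.txt', encoding='utf-8')
    handler.setFormatter(logging.Formatter('%(asctime)s %(message)s'))

    logger = logging.getLogger('my_script')
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    logger.info(u'中文')

With this the script can still be started with nohup python my_script.py, and log.txt ends up UTF-8 encoded without any explicit encode() calls. Another option (Python 2.6+) is to export PYTHONIOENCODING=utf-8 before running the original nohup ... >> log.txt command, which makes print of unicode strings use UTF-8 even when stdout is redirected.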
