
I have a single-node Hadoop setup hosted locally; the namenode and datanode run on the same machine.

I'm trying to create a file using the pywebhdfs Python library:

    self.hdfs = PyWebHdfsClient(host='192.168.231.130', port='9870', user_name='kush',
                                base_uri_pattern="http://192.168.231.130:9870/webhdfs/v1/", timeout=1)
    if not self.hdfs.exists_file_dir(path):
        self.hdfs.make_dir(path)
    # this call is the one that fails:
    self.hdfs.create_file("{}/results_{}.csv".format(path, name),
                          'word,negative,neutral,positive,compound\n')

The exists_file_dir and make_dir calls work correctly, but create_file keeps throwing an error. The exception I get is this:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='kush', port=9864): Max retries exceeded with url: /webhdfs/v1/user/kush/data/results_4104.csv?op=CREATE&user.name=kush&namenoderpcaddress=192.168.231.130:9000&createflag=&createparent=true&overwrite=false (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x000001F8C3FB1C40>: Failed to establish a new connection: [WinError 10061] No connection could be made because the target machine actively refused it'))

I have already allowed ports 9000, 9870 and 9864 through the firewall. Thanks in advance; help will be greatly appreciated.

Kush Singh

1 Answer


Apparently I just needed to add 'kush' to /etc/hosts. One complication was that my hostname and machine name were the same, so I changed the machine name and then added the entry to /etc/hosts on both Windows and Linux, and voilà!
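For anyone hitting the same error: a WebHDFS CREATE is a two-step operation. The namenode answers with a redirect to a datanode, and the client must be able to resolve the hostname in that redirect ('kush' in the traceback above). A minimal sketch to check resolution from the client machine, using only the standard library (the hostname is taken from the error message; adjust for your setup):

```python
import socket

def resolves(hostname):
    """Return the IP this machine resolves `hostname` to, or None if it cannot."""
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

# 'kush' is the datanode hostname from the redirect in the error above
if resolves("kush") is None:
    print("'kush' does not resolve here -- add it to /etc/hosts "
          "(C:\\Windows\\System32\\drivers\\etc\\hosts on Windows)")
```

If this prints the warning, the client cannot reach the datanode no matter what the firewall allows, which matches the ConnectionError in the question.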
