
We have installed NiFi (the HDF service) on our existing HDP cluster. The installation was successful and I'm able to run NiFi without any errors, but I am facing an issue with the PutHDFS processor:

2018-06-20 12:00:14,246 WARN [StandardProcessScheduler Thread-6] org.apache.hadoop.conf.Configuration /tmp/core-site.xml:an attempt to override final parameter: fs.defaultFS;  Ignoring.
2018-06-20 12:00:14,248 ERROR [StandardProcessScheduler Thread-6] o.apache.nifi.processors.hadoop.PutHDFS PutHDFS[id=11a40827-0164-1000-ffff-ffffb07a04d9] HDFS Configuration error - java.net.ConnectException: Connection refused: {}
java.net.ConnectException: Connection refused
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
        at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.checkHdfsUriForTimeout(AbstractHadoopProcessor.java:345)
        at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.resetHDFSResources(AbstractHadoopProcessor.java:260)
        at org.apache.nifi.processors.hadoop.AbstractHadoopProcessor.abstractOnScheduled(AbstractHadoopProcessor.java:205)
        at sun.reflect.GeneratedMethodAccessor728.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137)
        at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125)
        at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70)
        at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47)
        at org.apache.nifi.controller.StandardProcessorNode$1.call(StandardProcessorNode.java:1334)
        at org.apache.nifi.controller.StandardProcessorNode$1.call(StandardProcessorNode.java:1330)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
2018-06-20 12:00:14,248 ERROR [StandardProcessScheduler Thread-6] org.apache.nifi.engine.FlowEngine A flow controller task execution stopped abnormally
java.util.concurrent.ExecutionException: java.lang.reflect.InvocationTargetException
        at java.util.concurrent.FutureTask.report(FutureTask.java:122)
        at java.util.concurrent.FutureTask.get(FutureTask.java:192)
        at org.apache.nifi.engine.FlowEngine.afterExecute(FlowEngine.java:100)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1150)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.reflect.InvocationTargetException: null

I have attached screenshots as well:

[Screenshot: PutHDFS]

[Screenshot: Configuration]

Abhinav
  • Can you verify that the `hdfs-site.xml` and `core-site.xml` you have provided for the `Hadoop Configuration Resources` property are correct? – Sivaprasanna Sethuraman Jun 20 '18 at 14:29
  • I'm pretty sure they're correct. Can you tell me what in particular I should check? The cluster is running fine with the default properties provided by Ambari itself. – Abhinav Jun 20 '18 at 14:30
  • Can you show the configuration screen of the processor? – Bryan Bende Jun 20 '18 at 14:35
  • @BryanBende I have attached the picture. You may find it suspicious that I haven't provided the paths to hdfs-site.xml and core-site.xml in the configuration; that is because I have copied those files to /usr/hdf/current/nifi/lib (which means I don't need to provide the paths explicitly). – Abhinav Jun 20 '18 at 14:40
  • @Abhinav While it is true that copying them into lib puts them on NiFi's global classpath, it does not guarantee that they are picked up before any default ones bundled in the Hadoop client JARs. I would recommend setting the property in the processor. – Bryan Bende Jun 20 '18 at 14:48
  • @BryanBende As you suggested, I have edited the property and restarted the service. Still, the error is the same. I don't know why it's showing a Hadoop configuration error, or why it would want to override the fs.defaultFS property. – Abhinav Jun 20 '18 at 14:50
  • @Abhinav What is in your core-site.xml for fs.defaultFS? The error means it is building a URI from the value of fs.defaultFS and then making a socket connection to that URI, and it is getting connection refused. – Bryan Bende Jun 20 '18 at 15:05
  • @BryanBende The value is hdfs://x.x.com:8020 – Abhinav Jun 20 '18 at 15:07
  • @BryanBende I understand that it can't reach the URI, but why not? And what should I do to make it work? – Abhinav Jun 20 '18 at 15:10
  • Can't really say since I don't have access to your environment... you could try eliminating NiFi from the picture by installing a Hadoop client on the server where NiFi is running and then issuing command-line operations like `hadoop fs -ls /`; if that doesn't work, then it is some networking issue in your environment. – Bryan Bende Jun 20 '18 at 15:22
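The troubleshooting steps suggested in the comments can be sketched as a quick check run from the NiFi host. This is a minimal sketch, assuming a bash shell (it relies on bash's `/dev/tcp` feature); the host `x.x.com` and port `8020` come from the fs.defaultFS value mentioned above, and `check_port` is a hypothetical helper, not part of any Hadoop or NiFi tooling.

```shell
#!/usr/bin/env bash
# Hypothetical helper: reports "open" if a TCP connection to host:port
# succeeds, "refused" otherwise (uses bash's /dev/tcp pseudo-device).
check_port() {
  local host="$1" port="$2"
  if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
    echo "open"
  else
    echo "refused"
  fi
}

# 1. Does the NameNode host from fs.defaultFS resolve at all?
getent hosts x.x.com

# 2. Is the NameNode RPC port reachable from this machine?
check_port x.x.com 8020

# 3. With a Hadoop client installed, take NiFi out of the picture entirely:
# hadoop fs -ls /
```

If step 2 prints "refused", the problem is in the network or the NameNode itself, not in NiFi's configuration.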

1 Answer


As @BryanBende said, you are getting a java.net.ConnectException: Connection refused, so it seems to be related to the network or security configuration of the Hadoop machine and not to NiFi itself. Try to connect with a Hadoop client first.

Information about this kind of error:

Hadoop cluster setup - java.net.ConnectException: Connection refused

https://wiki.apache.org/hadoop/ConnectionRefused
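For reference, the entry that PutHDFS ends up connecting to is the fs.defaultFS property in core-site.xml. A sketch of what it typically looks like, using the hdfs://x.x.com:8020 value given in the comments; the log line "attempt to override final parameter: fs.defaultFS" indicates the property is marked final, so a later definition (such as one in /tmp/core-site.xml) is ignored rather than applied:

```xml
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://x.x.com:8020</value>
  <!-- When marked final, later config files cannot override this value -->
  <final>true</final>
</property>
```

That override warning is harmless by itself; the connection refused error means the URI built from this value could not be reached.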

Óscar Andreu
  • But it also states that there is a Hadoop configuration error. All my services are up and running, so how come the services fail only for NiFi? – Abhinav Jul 09 '18 at 07:11
  • Is a Hadoop client on the NiFi machine able to connect to Hadoop? I saw you are using netgear.slave.com; is the NiFi machine resolving this name to the private IP or to the public IP? In the latter case you can have routing problems. So from my point of view, the first thing to try is to check whether there is connectivity between both machines and how, then check whether the port is open (you said it is) and whether the NiFi machine is allowed to access it. – Óscar Andreu Jul 09 '18 at 18:05