
I am trying to execute a Pig statement that shows me the data in a .txt file, and I am running in MapReduce mode, but I am getting an error. Can somebody please help me resolve this?

[root@master ~]# pig -x mapreduce
    17/04/19 17:42:34 INFO pig.ExecTypeProvider: Trying ExecType : LOCAL
    17/04/19 17:42:34 INFO pig.ExecTypeProvider: Trying ExecType : MAPREDUCE
    17/04/19 17:42:34 INFO pig.ExecTypeProvider: Picked MAPREDUCE as the ExecType
    2017-04-19 17:42:34,853 [main] INFO  org.apache.pig.Main - Apache Pig version 0.16.0 (r1746530) compiled Jun 01 2016, 23:10:49
    2017-04-19 17:42:34,853 [main] INFO  org.apache.pig.Main - Logging error messages to: /root/pig_1492603954851.log
    2017-04-19 17:42:34,907 [main] INFO  org.apache.pig.impl.util.Utils - Default bootup file /root/.pigbootup not found
    2017-04-19 17:42:36,060 [main] INFO  org.apache.pig.backend.hadoop.executionengine.HExecutionEngine - Connecting to hadoop file system at: hdfs://localhost
    2017-04-19 17:42:37,130 [main] INFO  org.apache.pig.PigServer - Pig Script ID for the session: PIG-default-f60d05c3-9fee-4624-9aa8-07f1584e6165
    2017-04-19 17:42:37,130 [main] WARN  org.apache.pig.PigServer - ATS is disabled since yarn.timeline-service.enabled set to false
    grunt> dump b;
    2017-04-19 17:42:41,135 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: org.apache.hadoop.security.AccessControlException: Permission denied: user=root, access=EXECUTE, inode="/tmp/temp1549818457":dead:supergroup:drwx------
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
        at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1720)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1704)
        at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkTraverse(FSDirectory.java:1692)
        at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3894)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:983)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)

    2017-04-19 17:42:41,136 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b
    Details at logfile: /root/pig_1492603954851.log
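
For reference, the alias b must have been defined earlier in the session (that part of the transcript is not shown). Based on the input path "/temp" that appears in the error quoted in the comments below, the definition was presumably something like this sketch, where the storage function and schema are assumptions:

    grunt> -- the input path /temp comes from the error in the comments; the schema is assumed
    grunt> b = LOAD '/temp' USING PigStorage(',') AS (f1:chararray, f2:chararray);
    grunt> dump b;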
  • Can you check http://stackoverflow.com/questions/7194069/apache-pig-permissions-issue – Deepan Ram Apr 19 '17 at 12:20
  • When I changed the /tmp directory permissions to be accessible to everyone, it gave me this error: Input(s): Failed to read data from "/temp" Output(s): Failed to produce result in "hdfs://localhost/tmp/temp1691370991/tmp-1112412323" Counters: Total records written : 0 Total bytes written : 0 Spillable Memory Manager spill count : 0 Total bags proactively spilled: 0 Total records proactively spilled: 0 Job DAG: null org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias b – The Man Apr 19 '17 at 15:20
  • Check whether you have proper access to read files from the folder. If not, then grant access to the HDFS folder as well. – Deepan Ram Apr 19 '17 at 15:39
  • I have permissions to read and show files; I can launch a MapReduce wordcount program too, but the Pig job is not working, and I don't know why. – The Man Apr 19 '17 at 15:46
  • Can you please update your question with the new error that you are facing after you made the modifications to pig.temp.dir? – Deepan Ram Apr 19 '17 at 15:58
  • http://stackoverflow.com/questions/43500078/pig-gives-me-this-error-when-i-tried-dump-the-data – The Man Apr 19 '17 at 16:02
  • Can you try: hadoop fs -chmod -R 777 /tmp/* – Deepan Ram Apr 19 '17 at 16:09
  • Yup, I tried that earlier; I got the error in the link I gave here when I used the chmod command. – The Man Apr 19 '17 at 16:11
  • When I used chmod again, it said the NameNode is in safe mode (see the note after these comments). I tried stopping and restarting, but it didn't help. Plus, Pig stopped working after a restart and now gives me this error: Cannot locate pig-core-h2.jar. do 'ant -Dhadoopversion=23 jar', and try again – The Man Apr 19 '17 at 16:46
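
A note on the safe-mode message in the last comment: the NameNode stays in safe mode until enough block replicas have been reported, and HDFS rejects metadata changes (including chmod) in the meantime. It normally leaves safe mode on its own; if it does not, it can be checked and forced out manually:

    # check safe-mode status, then force the NameNode out of it if needed
    hdfs dfsadmin -safemode get
    hdfs dfsadmin -safemode leave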

2 Answers


You can try this (the property must be passed with -D, placed before the other arguments; <temp_location_hdfs> is a placeholder for a scratch directory on HDFS):

    pig -Dpig.temp.dir=<temp_location_hdfs> -x mapreduce

The <temp_location_hdfs> directory should have either 775 or 777 permissions.

Then you can try:

    hadoop fs -chmod -R 777 /tmp/*
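
As a concrete sketch of the whole workaround (the path /user/root/pig_temp is a placeholder, not from the original answer):

    # create a scratch directory on HDFS and open up its permissions (path is a placeholder)
    hadoop fs -mkdir -p /user/root/pig_temp
    hadoop fs -chmod 777 /user/root/pig_temp

    # start Pig with its intermediate output redirected there
    pig -Dpig.temp.dir=/user/root/pig_temp -x mapreduce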


It seems you do not have proper permissions on the directory pointed to by the pig.temp.dir setting, hence this issue. By default, Pig writes its intermediate results to /tmp on HDFS. Override it by using -Dpig.temp.dir.
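
As a sketch of that override (the directory is a placeholder; PIG_OPTS is read by the pig launcher script and passed to the JVM):

    # point pig.temp.dir at a directory the current user can write to
    export PIG_OPTS="-Dpig.temp.dir=/user/root/pig_temp"
    pig -x mapreduce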
