My input file is below. I am trying to dump the loaded data in a relation. I am using Pig 0.12.

a,t1,1000,100
a,t1,2000,200
b,t2,1000,200
b,t2,5000,100

I started Pig in HDFS (MapReduce) mode by entering pig, then loaded the data:

myinput = LOAD 'file' AS(a1:chararray,a2:chararray,amt:int,rate:int);

describe and illustrate work fine, but as soon as I run

dump myinput;

I get the error message below.

ERROR org.apache.hadoop.ipc.RPC - FailoverProxy: Failing this Call: submitJob for error   (RemoteException): org.apache.hadoop.ipc.RemoteException:  org.apache.hadoop.security.AccessControlException: User 'myid' cannot perform operation SUBMIT_JOB on queue default.
Please run "hadoop queue -showacls" command to find the queues you have access to .
    at org.apache.hadoop.mapred.ACLsManager.checkAccess(ACLsManager.java:179)
    at org.apache.hadoop.mapred.ACLsManager.checkAccess(ACLsManager.java:136)
    at org.apache.hadoop.mapred.ACLsManager.checkAccess(ACLsManager.java:113)
    at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:4541)
    at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:993)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1326)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1322)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1127)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1320)



ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias myinput

Is this an access issue, some kind of privilege problem? Can someone help me?

  • Can you describe your setup? It seems to be some issue related to the Hadoop queue. – Rajen Raiyarela Jul 15 '14 at 13:37
  • MY_QUEUE=demo. I just tried with pig -DMapred.local.job.queue.name=demo and then it worked. Yes, it's a queue issue. – Surender Raja Jul 15 '14 at 13:43
  • Try % hadoop queue -showacls — it should show which queues you're allowed to submit to. – Rajen Raiyarela Jul 15 '14 at 13:46
  • For people who found this post when looking for [ERROR 1066: Unable to open iterator for alias](http://stackoverflow.com/questions/34495085/error-1066-unable-to-open-iterator-for-alias-in-pig-generic-solution) here is a [generic solution](http://stackoverflow.com/a/34495086/983722). – Dennis Jaheruddin Dec 28 '15 at 14:40
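Summing up the comments: the dump itself was fine; the MapReduce job it spawned was rejected because user 'myid' lacked SUBMIT_JOB permission on the default queue. A sketch of the fix, with two caveats: the queue name demo comes from the comments above, and the comment used the property Mapred.local.job.queue.name, whereas on stock Hadoop 1.x the property is mapred.job.queue.name — check which one your cluster honors.

# List the queues this user is allowed to submit jobs to
hadoop queue -showacls

# Start Pig so that its MapReduce jobs are submitted to an allowed queue
pig -Dmapred.job.queue.name=demo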

2 Answers


If you don't specify a load function such as PigStorage('\t'), Pig reads data with tab (\t) as the default column separator.

In your data, the column separator is a comma (,), so try this:

myinput = LOAD 'file' USING PigStorage(',') AS (a1:chararray,a2:chararray,amt:int,rate:int);

Hope that works.
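To see why the delimiter matters: with the default tab separator, PigStorage finds no tab in a comma-separated line, so the entire line lands in the first field and the remaining columns come out null. A rough Python sketch of that split-and-cast behavior (pig_storage_load is a hypothetical helper written for illustration, not a real Pig API):

```python
def pig_storage_load(lines, delimiter, types):
    """Mimic PigStorage: split each line on the delimiter, then cast each
    field to its schema type; a failed or missing cast yields null (None)."""
    rows = []
    for line in lines:
        fields = line.split(delimiter)
        row = []
        for i, caster in enumerate(types):
            if i < len(fields):
                try:
                    row.append(caster(fields[i]))
                except ValueError:
                    row.append(None)  # cast failure becomes null
            else:
                row.append(None)      # missing trailing fields become null
        rows.append(row)
    return rows

lines = ["a,t1,1000,100", "a,t1,2000,200"]
schema = [str, str, int, int]  # a1:chararray, a2:chararray, amt:int, rate:int

# Default tab delimiter: the whole line lands in the first field.
print(pig_storage_load(lines, "\t", schema)[0])  # ['a,t1,1000,100', None, None, None]

# Explicit comma delimiter: fields split and cast as intended.
print(pig_storage_load(lines, ",", schema)[0])   # ['a', 't1', 1000, 100]
```

This is also why describe succeeds (it only reports the declared schema) while the actual field values would be wrong without PigStorage(',').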

Rengasamy

You should specify your input data's separator, in your case a comma. Try this code:

myinput = LOAD 'file' USING PigStorage(',') AS (a1:chararray,a2:chararray,amt:int,rate:int);