
An error occurred when executing select * from xxx in Hive:

Failed with exception java.io.IOException:java.io.IOException: No LZO codec found, cannot run.

Troubleshooting done:

Checked that hadoop-lzo.jar is present in $HADOOP_HOME/share/hadoop/common on all Hadoop nodes:

# ls $HADOOP_HOME/share/hadoop/common/ | grep lzo
hadoop-lzo-0.4.20.jar
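
The jar check covers only the Java side; hadoop-lzo also loads a native library (libgplcompression) at runtime. As an additional sanity check, one could look for it under the native library directory (the exact path and file name vary by installation):

# Look for the hadoop-lzo native library; absence here would point to a
# native-library problem rather than a configuration one (path varies by install).
ls $HADOOP_HOME/lib/native/ | grep -i gplcompression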

Checked that the LZO codecs are configured in $HADOOP_HOME/etc/hadoop/core-site.xml on all Hadoop nodes:

<configuration>
    ...
    <property>
        <name>io.compression.codecs</name>
        <value>
            org.apache.hadoop.io.compress.GzipCodec,
            org.apache.hadoop.io.compress.DefaultCodec,
            org.apache.hadoop.io.compress.BZip2Codec,
            org.apache.hadoop.io.compress.SnappyCodec,
            com.hadoop.compression.lzo.LzoCodec,
            com.hadoop.compression.lzo.LzopCodec
        </value>
    </property>
    <property>
        <name>io.compression.codec.lzo.class</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property>
</configuration>

I also tested reading an LZO file in MapReduce, and it works correctly, so I think hadoop-lzo is configured properly, but it does not work in Hive.
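
As a quick way to exercise the same codec configuration from the command line, hadoop fs -text resolves a file's decompressor through io.compression.codecs, so a readable dump of an .lzo file confirms the client configuration is being picked up (the path below is a placeholder):

# Decompress an LZO file using the codecs from the effective core-site.xml;
# readable output means the LZO codec was found by the Hadoop client.
hadoop fs -text /path/to/sample.lzo | head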

Steven
  • Have you checked that the user running the Hive processes has r-x permission on the LZO files? – Koushik Roy May 17 '21 at 00:15
  • @KoushikRoy I checked: the LZO files did not have 'x' permission, so I added it, but the result was the same. Uploading the files to HDFS, submitting the MapReduce job, and running the Hive process all use the same user. – Steven May 17 '21 at 02:14
  • This error was due to Hive reading the wrong core-site.xml. These nodes were installed with an older version of Hadoop and also with HBase, and the older Hadoop's core-site.xml was symlinked into the HBase conf dir. As a result, the Hive script read the core-site.xml under $HBASE_HOME/conf/core-site.xml, which has no LZO compression configuration (a way to verify this is sketched after these comments). – Steven May 17 '21 at 18:20
  • I see, glad it worked :) – Koushik Roy May 18 '21 at 03:11
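
Given the root cause in the comments above, a minimal way to verify which configuration Hive actually loaded is to print the effective codec list from Hive and inspect the suspect symlink (paths assume the layout described in the question):

# Print the codec list Hive actually sees; if the LZO entries are missing,
# Hive loaded a core-site.xml without them.
hive -e "set io.compression.codecs;"

# Check whether the HBase conf dir points at an older Hadoop's core-site.xml.
ls -l $HBASE_HOME/conf/core-site.xml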
