
We have a Hadoop cluster, version 2.6.5 (Hortonworks), managed through the Ambari GUI.

We configured log4j to use RollingFileAppender with MaxBackupIndex set to 10 backups.

After restarting the Hive service, we saw the following strange behavior.

Under /var/log/hive we can see the following logs (example):

-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.23-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.23-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.23-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24-20180804
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24-20180804-20180805
-rw-r--r-- 1 hive hadoop       0 Aug  6 03:40 hiveserver2.log.24-20180805
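As a side note, the zero-byte, date-suffixed leftovers are easy to isolate with `find`. The sketch below recreates a few names from the listing above in a scratch directory purely for illustration; against the real /var/log/hive you would run just the `find` line:

```shell
# Recreate a few of the file names from the listing in a scratch directory.
dir=$(mktemp -d)
touch "$dir/hiveserver2.log.24"                    # normal rotated backup
touch "$dir/hiveserver2.log.24-20180804"           # unexpected date-suffixed file
touch "$dir/hiveserver2.log.24-20180804-20180805"  # doubly-suffixed file

# Zero-byte backups whose numeric index is followed by a date suffix:
stale=$(find "$dir" -maxdepth 1 -size 0 -name 'hiveserver2.log.*-*' -printf '%f\n' | sort)
echo "$stale"

rm -rf "$dir"
```

Once the cause is understood, appending `-delete` to the `find` would clean the leftovers up (`-printf` and `-delete` are GNU find options, which is what Linux HDP nodes ship).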

We don't understand why the logs ended up with a date suffix such as "-20180804",

because this isn't what we defined in hive-log4j.

Example of the hive-log4j configuration from Ambari:

# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements.  See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership.  The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License.  You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# Define some default values that can be overridden by system properties
hive.log.threshold=ALL
hive.root.logger=INFO,DRFA
hive.log.dir=${java.io.tmpdir}/${user.name}
hive.log.file=hive.log
# Define the root logger to the system property "hadoop.root.logger".
log4j.rootLogger=${hive.root.logger}, EventCounter
# Logging Threshold
log4j.threshold=${hive.log.threshold}
#
# Daily Rolling File Appender
#
# Use the PidDailyerRollingFileAppend class instead if you want to use separate log files
# for different CLI session.
#
# log4j.appender.DRFA=org.apache.hadoop.hive.ql.log.PidDailyRollingFileAppender
#log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
# Rollver at midnight
log4j.appender.DRFA=org.apache.log4j.RollingFileAppender
log4j.appender.DRFA.MaxBackupIndex=10
log4j.appender.DRFA.MaxFileSize=100MB
#log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
# 30-day backup
#log4j.appender.DRFA.MaxBackupIndex=30
log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
# Pattern format: Date LogLevel LoggerName LogMessage
#log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
# Debugging Pattern format
log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p [%t]: %c{2} (%F:%M(%L)) - %m%n
#
# console
# Add "console" to rootlogger above if you want to use this
#
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} [%t]: %p %c{2}: %m%n
log4j.appender.console.encoding=UTF-8
#custom logging levels
#log4j.logger.xxx=DEBUG
#
# Event Counter Appender
# Sends counts of logging messages at different severity levels to Hadoop Metrics.
#
log4j.appender.EventCounter=org.apache.hadoop.hive.shims.HiveEventCounter
log4j.category.DataNucleus=ERROR,DRFA
log4j.category.Datastore=ERROR,DRFA
log4j.category.Datastore.Schema=ERROR,DRFA
log4j.category.JPOX.Datastore=ERROR,DRFA
log4j.category.JPOX.Plugin=ERROR,DRFA
log4j.category.JPOX.MetaData=ERROR,DRFA
log4j.category.JPOX.Query=ERROR,DRFA
log4j.category.JPOX.General=ERROR,DRFA
log4j.category.JPOX.Enhancer=ERROR,DRFA
# Silence useless ZK logs
log4j.logger.org.apache.zookeeper.server.NIOServerCnxn=WARN,DRFA
log4j.logger.org.apache.zookeeper.ClientCnxnSocketNIO=WARN,DRFA

Please advise: what could be the reason that we get log names such as

hiveserver2.log.23-20180804-20180805

instead of the correct form:

hiveserver2.log.23
King David
  • You're looking at the wrong file given that `hive.log.dir=${java.io.tmpdir}/${user.name}` is not /var/log/hive – OneCricketeer Aug 06 '18 at 12:17
  • Why do you think that java.io.tmpdir doesn't resolve to /var/log? Second, after we change hive-log4j, we see the logs change inside /var/log/hive; I mean we set the max backup to 10 and the logs did indeed stop at 10 backups – King David Aug 06 '18 at 12:38
  • Java temp directory is commonly /tmp https://stackoverflow.com/a/1924576/2308683 – OneCricketeer Aug 06 '18 at 12:55
  • But under /tmp there is no file related to Hive, and when I look under /var/log/hive we have exactly the files defined in the log4j config, such as hiveserver2.log or hivemetastore.log, etc. Anyway, I'm focused on the problem with the log name syntax, like hiveserver2.log.24-20180804-20180805; looking at hive-log4j, I really don't understand where this wrong syntax comes from – King David Aug 06 '18 at 12:59
  • I'm not sure. Other than Hive is restarting and unable to detect previous day files, so it appends the current date and rolls whatever files it sees. I'm not understanding the file name either given that none of the settings have yyyyMMdd formatting – OneCricketeer Aug 06 '18 at 13:27
  • The yyyy-MM-dd setting is disabled in hive-log4j (see #log4j.appender.DRFA.DatePattern=.yyyy-MM-dd), and in any case, when it was enabled in the past it started with "." and not with "-", so what is going on here is very strange – King David Aug 06 '18 at 13:31
  • Yes, it's disabled, so I'm asking what's generating `20180804`? – OneCricketeer Aug 06 '18 at 13:32
  • Let us [continue this discussion in chat](https://chat.stackoverflow.com/rooms/177496/discussion-between-king-david-and-cricket-007). – King David Aug 06 '18 at 13:35
  • Your listing shows **empty** files. My guess is you are looking at `stderr / stdout` redirections, in case some dependency vomits accidentally something there (e.g. Kerberos debug traces), and these have nothing to do with Log4J. – Samson Scharfrichter Aug 06 '18 at 18:24

0 Answers