When log4j.properties is selected as the logging configuration file, the Flink program does not print any logs.
Cause

Conflicting logging dependency jars: the log4j2 jars on the classpath shadow the log4j 1.x setup that log4j.properties configures.
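A quick way to confirm the conflict is to list the logging jars sitting in Flink's lib directory. This is a sketch; the helper name is ours, and the directory argument would typically be `$FLINK_HOME/lib`:

```shell
#!/bin/sh
# Diagnostic sketch: list logging-related jars in a directory.
# If log4j2 jars (log4j-core-2.x, log4j-slf4j-impl-2.x) show up while the
# configuration file is a log4j 1.x properties file, you have the conflict.
list_logging_jars() {
  ls "$1" | grep -E 'log4j|slf4j' || true
}

# Typical usage: list_logging_jars "$FLINK_HOME/lib"
```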
Solution

Remove the log4j2 dependency jars from the lib directory, namely:
log4j-1.2-api-2.12.1.jar
log4j-api-2.12.1.jar
log4j-core-2.12.1.jar
log4j-slf4j-impl-2.12.1.jar
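The removal can be scripted. The sketch below moves the four jars aside instead of deleting them; the function name and the backup subdirectory are our choices, not part of Flink:

```shell
#!/bin/sh
# Sketch: move the conflicting log4j2 jars out of a Flink lib directory
# so the log4j 1.x configuration takes effect again.
remove_log4j2_jars() {
  lib_dir=$1
  mkdir -p "$lib_dir/removed-log4j2"
  for jar in log4j-1.2-api-2.12.1.jar log4j-api-2.12.1.jar \
             log4j-core-2.12.1.jar log4j-slf4j-impl-2.12.1.jar; do
    # Move rather than delete, so the change is easy to roll back.
    if [ -f "$lib_dir/$jar" ]; then
      mv "$lib_dir/$jar" "$lib_dir/removed-log4j2/"
    fi
  done
}

# Typical usage: remove_log4j2_jars "$FLINK_HOME/lib"
```

Restart the Flink cluster (or resubmit the job) after the jars are removed so the new classpath takes effect.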
log4j.properties
# This affects logging for both user code and Flink
log4j.rootLogger=INFO, infoFile

# The following lines keep the log level of common libraries/connectors on
# log level INFO. The root logger does not override this. You have to manually
# change the log levels here.
log4j.logger.akka=INFO
log4j.logger.org.apache.kafka=INFO
log4j.logger.org.apache.hadoop=INFO
log4j.logger.org.apache.zookeeper=INFO

# Log all infos in the given file
log4j.appender.infoFile=org.apache.log4j.RollingFileAppender
log4j.appender.infoFile.File=${log.file}
log4j.appender.infoFile.layout=org.apache.log4j.PatternLayout
log4j.appender.infoFile.layout.ConversionPattern=%d{yyyy/MM/dd HH:mm:ss,SSS} %p %C.%M(%L) | %m%n
log4j.appender.infoFile.append=true
log4j.appender.infoFile.MaxFileSize=32MB
log4j.appender.infoFile.MaxBackupIndex=128

# Suppress the irrelevant (wrong) warnings from the Netty channel handler
log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, infoFile

# We only log the kafka appender logs to File to avoid deadlocks
log4j.logger.cloudera.shaded.org.apache.kafka=INFO, infoFile
log4j.additivity.cloudera.shaded.org.apache.kafka=false
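Note that `${log.file}` is normally substituted by Flink's startup scripts, which pass the log path to the JVM; if this configuration is used outside those scripts the placeholder stays unresolved and nothing is written. For a standalone test you can point the appender at a concrete path instead (the path below is only an example):

```
log4j.appender.infoFile.File=/tmp/flink-job.log
```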