How can I filter out INFO messages from Spark's log output?
Asked by a forum user, 2022-04-23 20:27
1 answer
Answered 2022-04-11 23:07
Following a method I found online, I configured log4j.properties like this:
# Set everything to be logged to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Settings to quiet third party logs that are too verbose
log4j.logger.org.eclipse.jetty=WARN
log4j.logger.org.eclipse.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
With this configuration, spark-shell prints only WARN (and higher) messages, which keeps the output clean.
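As an aside: in newer Spark releases (1.4 and later; the transcript below uses 1.3.0, where this is not available), the log level can also be changed at runtime from inside spark-shell, without editing log4j.properties. A minimal REPL fragment, assuming the shell's predefined `sc` SparkContext:

```scala
// Inside spark-shell (Spark 1.4+): suppress INFO messages for this session only.
// Valid levels include ALL, DEBUG, INFO, WARN, ERROR, FATAL, OFF.
sc.setLogLevel("WARN")
```

This overrides the log4j configuration for the current session, which is handy for quick experiments; the properties-file approach above is still the right choice for a persistent default.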
lin@lin-spark:/opt/data01/spark-1.3.0-bin-2.6.0-cdh5.4.0$ bin/spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on classpath
16/05/21 10:56:52 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable