Description
- Since apache/logging-log4j2 2.23.0, the onMismatch attribute of MarkerFilter is no longer available. This was originally discovered in HIVE-28417 (Bump Log4j2 to 2.24.1 to facilitate compilation of GraalVM Native Image, hive#5375). It can be reproduced on both 2.23.0 and 2.23.1, and shows up as a log entry similar to: ERROR MarkerFilter contains an invalid element or attribute "onMismatch".
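- For context, the sketch below strips Hive's test configuration (shown in full under Reproduction) down to the part that involves the MarkerFilter. The configuration name and logger name are made up for illustration, and I have not run this standalone file outside the Hive build, so treat it as an untested sketch of the failing setup rather than a confirmed minimal reproducer.
# Untested minimal sketch: same MarkerFilter wiring as Hive's hive-log4j2.properties,
# with a made-up configuration name and logger (com.example.demo).
status = INFO
name = MarkerFilterRepro
appenders = console
appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{ISO8601} %5p [%t] %c{2}: %m%n
loggers = demo
logger.demo.name = com.example.demo
logger.demo.level = OFF
# The filter type, marker, onMatch, and onMisMatch values below mirror the Hive configuration under Reproduction.
logger.demo.filter.marker.type = MarkerFilter
logger.demo.filter.marker.marker = FULL_PLAN
logger.demo.filter.marker.onMatch = DENY
logger.demo.filter.marker.onMisMatch = NEUTRAL
rootLogger.level = INFO
rootLogger.appenderRefs = console
rootLogger.appenderRef.console.ref = console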
Configuration
Version: 2.23.0
Operating system: Ubuntu 22.04.4 on WSL
JDK: Temurin 8 (8.0.422-tem via SDKMAN!)
Logs
-------------------------------------------------------------------------------
Test set: org.apache.hive.hplsql.TestHplsqlLocal
-------------------------------------------------------------------------------
Tests run: 86, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 3.359 s <<< FAILURE! - in org.apache.hive.hplsql.TestHplsqlLocal
org.apache.hive.hplsql.TestHplsqlLocal.testException Time elapsed: 0.676 s <<< FAILURE!
org.junit.ComparisonFailure:
expected:<[]Ln:2 PRINT
Correct
L...> but was:<[2024-08-02T03:55:09.225Z main ERROR MarkerFilter contains an invalid element or attribute "onMismatch"
]Ln:2 PRINT
Correct
L...>
at org.junit.Assert.assertEquals(Assert.java:117)
at org.junit.Assert.assertEquals(Assert.java:146)
at org.apache.hive.hplsql.TestHplsqlLocal.run(TestHplsqlLocal.java:482)
at org.apache.hive.hplsql.TestHplsqlLocal.testException(TestHplsqlLocal.java:229)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at org.junit.runner.JUnitCore.run(JUnitCore.java:115)
at org.junit.vintage.engine.execution.RunnerExecutor.execute(RunnerExecutor.java:43)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:150)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:173)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:485)
at org.junit.vintage.engine.VintageTestEngine.executeAllChildren(VintageTestEngine.java:82)
at org.junit.vintage.engine.VintageTestEngine.execute(VintageTestEngine.java:73)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:248)
at org.junit.platform.launcher.core.DefaultLauncher.lambda$execute$5(DefaultLauncher.java:211)
at org.junit.platform.launcher.core.DefaultLauncher.withInterceptedStreams(DefaultLauncher.java:226)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:199)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:132)
at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invokeAllTests(JUnitPlatformProvider.java:154)
at org.apache.maven.surefire.junitplatform.JUnitPlatformProvider.invoke(JUnitPlatformProvider.java:123)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:377)
at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:138)
at org.apache.maven.surefire.booter.ForkedBooter.run(ForkedBooter.java:465)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:451)
Reproduction
- I must admit that I am a Log4j2 newbie, so it is hard for me to come up with a smaller reproducible unit test right away.
- I put together a small test at https://github.com/linghengqian/hive/pull/1 ; the steps to run it are documented in that PR and repeated below.
git clone git@github.com:linghengqian/hive.git -b test-v2230-log4j
cd ./hive/
sdk install java 8.0.422-tem
sdk use java 8.0.422-tem
sdk install maven
mvn clean install -DskipTests
mvn test -Dtest=TestHplsqlLocal
- The activated configuration file is located at /hive/data/conf/hive-log4j2.properties.
status = INFO
name = HiveLog4j2Test
packages = org.apache.hadoop.hive.ql.log
property.hive.log.level = DEBUG
property.hive.root.logger = DRFA
property.hive.log.dir = ${sys:test.tmp.dir}/log
property.hive.log.file = hive.log
property.hive.test.console.log.level = INFO
appenders = console, DRFA
appender.console.type = Console
appender.console.name = console
appender.console.target = SYSTEM_ERR
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = %d{ISO8601} %5p [%t] %c{2}: %m%n
appender.DRFA.type = RollingRandomAccessFile
appender.DRFA.name = DRFA
appender.DRFA.fileName = ${sys:hive.log.dir}/${sys:hive.log.file}
appender.DRFA.filePattern = ${sys:hive.log.dir}/${sys:hive.log.file}.%d{yyyy-MM-dd}
appender.DRFA.layout.type = PatternLayout
appender.DRFA.layout.pattern = %d{ISO8601} %5p [%t] %c{2}: %m%n
appender.DRFA.policies.type = Policies
appender.DRFA.policies.time.type = TimeBasedTriggeringPolicy
appender.DRFA.policies.time.interval = 1
appender.DRFA.policies.time.modulate = true
appender.DRFA.strategy.type = DefaultRolloverStrategy
appender.DRFA.strategy.max = 30
loggers = HadoopIPC, HadoopSecurity, Hdfs, HdfsServer, HadoopMetrics2, Mortbay, Yarn, YarnServer, Tez, HadoopConf, Zookeeper, ServerCnxn, NIOServerCnxn, ClientCnxn, ClientCnxnSocket, ClientCnxnSocketNIO, DataNucleus, Datastore, JPOX, Operator, Serde2Lazy, ObjectStore, CalcitePlanner, AmazonAws, ApacheHttp, Thrift, Jetty, BlockStateChange, swo, CBORuleLogger
logger.HadoopIPC.name = org.apache.hadoop.ipc
logger.HadoopIPC.level = WARN
logger.HadoopSecurity.name = org.apache.hadoop.security
logger.HadoopSecurity.level = INFO
logger.Hdfs.name = org.apache.hadoop.hdfs
logger.Hdfs.level = INFO
logger.HdfsServer.name = org.apache.hadoop.hdfs.server
logger.HdfsServer.level = WARN
logger.HadoopMetrics2.name = org.apache.hadoop.metrics2
logger.HadoopMetrics2.level = INFO
logger.Mortbay.name = org.mortbay
logger.Mortbay.level = INFO
logger.Yarn.name = org.apache.hadoop.yarn
logger.Yarn.level = INFO
logger.YarnServer.name = org.apache.hadoop.yarn.server
logger.YarnServer.level = WARN
logger.Tez.name = org.apache.tez
logger.Tez.level = INFO
logger.HadoopConf.name = org.apache.hadoop.conf.Configuration
logger.HadoopConf.level = ERROR
logger.Zookeeper.name = org.apache.zookeeper
logger.Zookeeper.level = INFO
logger.ServerCnxn.name = org.apache.zookeeper.server.ServerCnxn
logger.ServerCnxn.level = WARN
logger.NIOServerCnxn.name = org.apache.zookeeper.server.NIOServerCnxn
logger.NIOServerCnxn.level = WARN
logger.ClientCnxn.name = org.apache.zookeeper.ClientCnxn
logger.ClientCnxn.level = WARN
logger.ClientCnxnSocket.name = org.apache.zookeeper.ClientCnxnSocket
logger.ClientCnxnSocket.level = WARN
logger.ClientCnxnSocketNIO.name = org.apache.zookeeper.ClientCnxnSocketNIO
logger.ClientCnxnSocketNIO.level = WARN
logger.DataNucleus.name = DataNucleus
logger.DataNucleus.level = ERROR
logger.Datastore.name = Datastore
logger.Datastore.level = ERROR
logger.JPOX.name = JPOX
logger.JPOX.level = ERROR
logger.Operator.name = org.apache.hadoop.hive.ql.exec.Operator
logger.Operator.level = INFO
logger.Serde2Lazy.name = org.apache.hadoop.hive.serde2.lazy
logger.Serde2Lazy.level = INFO
logger.ObjectStore.name = org.apache.hadoop.hive.metastore.ObjectStore
logger.ObjectStore.level = INFO
logger.CalcitePlanner.name = org.apache.calcite.plan.RelOptPlanner
logger.CalcitePlanner.level = INFO
logger.CBORuleLogger.name = org.apache.hadoop.hive.ql.optimizer.calcite.RuleEventLogger
logger.CBORuleLogger.level = OFF
logger.CBORuleLogger.filter.marker.type = MarkerFilter
logger.CBORuleLogger.filter.marker.marker = FULL_PLAN
logger.CBORuleLogger.filter.marker.onMatch = DENY
logger.CBORuleLogger.filter.marker.onMisMatch = NEUTRAL
logger.AmazonAws.name=com.amazonaws
logger.AmazonAws.level = INFO
logger.ApacheHttp.name=org.apache.http
logger.ApacheHttp.level = INFO
logger.Thrift.name = org.apache.thrift
logger.Thrift.level = WARN
logger.Jetty.name = org.eclipse.jetty
logger.Jetty.level = WARN
logger.BlockStateChange.name = BlockStateChange
logger.BlockStateChange.level = WARN
rootLogger.level = ${sys:hive.log.level}
rootLogger.appenderRefs = root, console
rootLogger.appenderRef.root.ref = ${sys:hive.root.logger}
rootLogger.appenderRef.console.ref = console
rootLogger.appenderRef.console.level = ${sys:hive.test.console.log.level}
logger.swo.name = org.apache.hadoop.hive.ql.optimizer.SharedWorkOptimizer
logger.swo.level = DEBUG