NullPointerException on JDK 17 involving MODULE$ in scala-library 2.12 #13023


Closed

xiongbo-sjtu opened this issue Aug 8, 2024 · 8 comments

@xiongbo-sjtu

xiongbo-sjtu commented Aug 8, 2024

Reproduction steps

Over the past few months, we have observed transient failures in which a NullPointerException (NPE) is thrown by methods invoked through "$.MODULE$" fields in scala-library when running our short-lived Spark apps on multiple AWS EMR 7 clusters with the Java 17 runtime. The failures are neither cluster-specific nor instance-type-specific. Notably, re-running the same app on the same cluster (with the same worker nodes) after the initial NPE always succeeds, so we are unable to reproduce the issue at will. To mitigate it, we had to disable C2 and run only C1 with no profiling overhead, via -XX:+TieredCompilation -XX:TieredStopAtLevel=1.
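
For reference, this is roughly how we apply the mitigation (a sketch: the main class and jar names are placeholders, while spark.driver.extraJavaOptions and spark.executor.extraJavaOptions are Spark's standard settings for passing extra JVM flags):

# Placeholder main class and application jar; the two extraJavaOptions
# confs carry the mitigation flags to the driver and executor JVMs.
$ spark-submit \
    --conf "spark.driver.extraJavaOptions=-XX:+TieredCompilation -XX:TieredStopAtLevel=1" \
    --conf "spark.executor.extraJavaOptions=-XX:+TieredCompilation -XX:TieredStopAtLevel=1" \
    --class com.example.MyApp \
    my-app.jar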

Scala version: Scala version 2.12.17 (OpenJDK 64-Bit Server VM, Java 17.0.12)

$ spark-shell
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.5.0-amzn-0
      /_/

Using Scala version 2.12.17 (OpenJDK 64-Bit Server VM, Java 17.0.12)

Problem

Various methods invoked through "$.MODULE$" fields in scala-library throw NullPointerException at random (Scala 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12). The failures are observed on worker nodes with high CPU utilization (p90 utilization >90%), but we are not sure whether the hot CPU plays a role. Our hypothesis is that the NPE is caused by a JIT compiler optimization that incorrectly removes the assignment of a variable from some code path, similar to this bug. We would not expect the JIT to change a static final field from non-null to null within the same JVM process, though.

Here's the question: are those "$.MODULE$" fields implemented as static final fields in Scala 2.12? If not, do we expect them to be static final in Scala 2.13 after addressing scala/scala-dev#537?

24/07/17 22:29:55 java.lang.NullPointerException: Cannot invoke "scala.collection.immutable.List$.empty()" because "scala.collection.immutable.List$.MODULE$" is null
    at scala.util.matching.Regex$.scala$util$matching$Regex$$extractGroupsFromMatcher(Regex.scala:789) ~[scala-library-2.12.17.jar:?]
    at scala.util.matching.Regex.unapplySeq(Regex.scala:282) ~[scala-library-2.12.17.jar:?]
    at org.apache.spark.storage.BlockId$.apply(BlockId.scala:231) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
...
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:102) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
...
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at java.lang.Thread.run(Thread.java:840) [?:?]
24/05/20 15:05:01  java.lang.NullPointerException: Cannot invoke "scala.collection.mutable.HashTable$.defaultLoadFactor()" because "scala.collection.mutable.HashTable$.MODULE$" is null
    at scala.collection.mutable.HashTable.$init$(HashTable.scala:45) ~[scala-library-2.12.17.jar:?]
    at scala.collection.mutable.HashMap.<init>(HashMap.scala:45) ~[scala-library-2.12.17.jar:?]
    at scala.collection.mutable.HashMap.<init>(HashMap.scala:60) ~[scala-library-2.12.17.jar:?]
    at org.apache.spark.serializer.GenericAvroSerializer.<init>(GenericAvroSerializer.scala:52) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.serializer.KryoSerializer.registerAvro$1(KryoSerializer.scala:166) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
...
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:632) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
    at java.lang.Thread.run(Thread.java:840) [?:?]
24/04/06 14:55:15 java.lang.NullPointerException: Cannot invoke "scala.math.package$.min(int, int)" because "scala.math.package$.MODULE$" is null
    at org.apache.spark.util.ByteBufferInputStream.read(ByteBufferInputStream.scala:48) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2897) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.refill(ObjectInputStream.java:3150) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:3231) ~[?:?]
    at java.io.DataInputStream.readInt(DataInputStream.java:381) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.readInt(ObjectInputStream.java:3436) ~[?:?]
    at java.io.ObjectInputStream.readInt(ObjectInputStream.java:1128) ~[?:?]
    at org.apache.spark.util.SerializableBuffer.$anonfun$readObject$1(SerializableBuffer.scala:33) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.SparkErrorUtils.tryOrIOException(SparkErrorUtils.scala:35) ~[spark-common-utils_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.SparkErrorUtils.tryOrIOException$(SparkErrorUtils.scala:33) ~[spark-common-utils_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:95) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.SerializableBuffer.readObject(SerializableBuffer.scala:32) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1100) ~[?:?]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2423) ~[?:?]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
    at java.io.ObjectInputStream$FieldValues.<init>(ObjectInputStream.java:2606) ~[?:?]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2457) ~[?:?]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$2(NettyRpcEnv.scala:299) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) ~[scala-library-2.12.17.jar:?]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:352) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$1(NettyRpcEnv.scala:298) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) [scala-library-2.12.17.jar:?]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:298) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:646) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:697) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:689) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
...
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.96.Final.jar:4.1.96.Final]
...
    at org.apache.spark.network.crypto.TransportCipher$DecryptionHandler.channelRead(TransportCipher.java:192) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
...
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at java.lang.Thread.run(Thread.java:840) [?:?]
24/04/02 14:20:14 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.lang.NullPointerException: Cannot invoke "scala.concurrent.duration.Duration$.apply(long, java.util.concurrent.TimeUnit)" because "scala.concurrent.duration.Duration$.MODULE$" is null
    at scala.concurrent.duration.package$DurationLong$.durationIn$extension(package.scala:60) ~[scala-library-2.12.17.jar:?]
    at scala.concurrent.duration.package$DurationLong.durationIn(package.scala:60) ~[scala-library-2.12.17.jar:?]
    at scala.concurrent.duration.DurationConversions.seconds(DurationConversions.scala:37) ~[scala-library-2.12.17.jar:?]
    at scala.concurrent.duration.DurationConversions.seconds$(DurationConversions.scala:37) ~[scala-library-2.12.17.jar:?]
    at scala.concurrent.duration.package$DurationLong.seconds(package.scala:59) ~[scala-library-2.12.17.jar:?]
    at org.apache.spark.rpc.RpcTimeout$.apply(RpcTimeout.scala:131) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.RpcUtils$.askRpcTimeout(RpcUtils.scala:41) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.RpcEndpointRef.<init>(RpcEndpointRef.scala:33) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcEndpointRef.<init>(NettyRpcEnv.scala:533) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:640) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:697) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:689) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.AbstractAuthRpcHandler.receive(AbstractAuthRpcHandler.java:66) ~[spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:111) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:140) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:102) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at org.apache.spark.network.crypto.TransportCipher$DecryptionHandler.channelRead(TransportCipher.java:192) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at java.lang.Thread.run(Thread.java:840) [?:?]
24/02/23 06:09:26 ERROR Utils: Exception encountered
java.lang.NullPointerException: Cannot invoke "scala.math.package$.min(int, int)" because "scala.math.package$.MODULE$" is null
    at org.apache.spark.util.ByteBufferInputStream.read(ByteBufferInputStream.scala:48) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at java.io.ObjectInputStream$PeekInputStream.read(ObjectInputStream.java:2897) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.refill(ObjectInputStream.java:3150) ~[?:?]
    at java.io.ObjectInputStream$BlockDataInputStream.read(ObjectInputStream.java:3318) ~[?:?]
    at java.io.ObjectInputStream.read(ObjectInputStream.java:1022) ~[?:?]
    at java.nio.channels.Channels$ReadableByteChannelImpl.read(Channels.java:387) ~[?:?]
    at org.apache.spark.util.SerializableBuffer.$anonfun$readObject$1(SerializableBuffer.scala:38) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.SparkErrorUtils.tryOrIOException(SparkErrorUtils.scala:35) ~[spark-common-utils_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.SparkErrorUtils.tryOrIOException$(SparkErrorUtils.scala:33) ~[spark-common-utils_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:95) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.util.SerializableBuffer.readObject(SerializableBuffer.scala:32) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
    at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
    at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
    at java.lang.reflect.Method.invoke(Method.java:568) ~[?:?]
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1100) ~[?:?]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2423) ~[?:?]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
    at java.io.ObjectInputStream$FieldValues.<init>(ObjectInputStream.java:2606) ~[?:?]
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2457) ~[?:?]
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2257) ~[?:?]
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1733) ~[?:?]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:509) ~[?:?]
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:467) ~[?:?]
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$2(NettyRpcEnv.scala:299) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) ~[scala-library-2.12.17.jar:?]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:352) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcEnv.$anonfun$deserialize$1(NettyRpcEnv.scala:298) ~[spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62) [scala-library-2.12.17.jar:?]
    at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:298) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:646) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:697) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:689) [spark-core_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:111) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:140) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286) [netty-handler-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:442) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) [netty-codec-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:102) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at org.apache.spark.network.crypto.TransportCipher$DecryptionHandler.channelRead(TransportCipher.java:192) [spark-network-common_2.12-3.5.0-amzn-0.jar:3.5.0-amzn-0]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) [netty-transport-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) [netty-common-4.1.96.Final.jar:4.1.96.Final]
    at java.lang.Thread.run(Thread.java:840) [?:?]
@SethTisue
Member

SethTisue commented Aug 8, 2024

are those "$.MODULE$" methods implemented as static final methods in Scala 2.12?

seems like something you could check, in both 2.12 and 2.13, with javap?

@SethTisue
Member

Note that MODULE$ is a field, not a method.

@xiongbo-sjtu xiongbo-sjtu changed the title NullPointerException thrown by "$.MODULE$" methods in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) NullPointerException thrown by "$.MODULE$" fields in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) Aug 8, 2024
@xiongbo-sjtu
Author

Seth, thank you for your fast response!

Note that MODULE$ is a field, not a method.

Updated.

seems like something you could check, in both 2.12 and 2.13, with javap?

Unfortunately javap doesn't show me anything about the MODULE$ field.

$ javap -classpath /usr/lib/spark/jars/scala-library-2.12.17.jar scala.collection.immutable.List | grep empty
  public static <A> scala.collection.immutable.List<A> empty();

$ javap -classpath /usr/lib/spark/jars/scala-library-2.12.17.jar scala.collection.immutable.List | grep -i MODULE
<no output>

$ javap -classpath /usr/lib/spark/jars/scala-library-2.12.17.jar scala.math.package | grep 'min(int, int)'
  public static int min(int, int);

$ javap -classpath /usr/lib/spark/jars/scala-library-2.12.17.jar scala.math.package | grep -i MODULE
<no output>

@som-snytt

MODULE$ is assigned in the class initializer in 2.13, but in the instance initializer in 2.12. That is why the field is not final in 2.12; the instance is, however, created under the clinit lock for safe publication.

➜  snips scala -J--enable-preview -J--add-exports -Jjdk.jdeps/com.sun.tools.javap=ALL-UNNAMED -nobootcp
Welcome to Scala 2.12.19 (OpenJDK 64-Bit Server VM, Java 17.0.10).
Type in expressions for evaluation. Or try :help.

scala> object X { val x = 42 }
defined object X

scala> :javap -private -verbose X$

{
  public static $line3.$read$$iw$$iw$X$ MODULE$;
    descriptor: L$line3/$read$$iw$$iw$X$;
    flags: (0x0009) ACC_PUBLIC, ACC_STATIC

  private final int x;
    descriptor: I
    flags: (0x0012) ACC_PRIVATE, ACC_FINAL

  public static {};
    descriptor: ()V
    flags: (0x0009) ACC_PUBLIC, ACC_STATIC
    Code:
      stack=1, locals=0, args_size=0
         0: new           #2                  // class $line3/$read$$iw$$iw$X$
         3: invokespecial #22                 // Method "<init>":()V
         6: return

  public $line3.$read$$iw$$iw$X$();
    descriptor: ()V
    flags: (0x0001) ACC_PUBLIC
    Code:
      stack=2, locals=1, args_size=1
         0: aload_0
         1: invokespecial #27                 // Method java/lang/Object."<init>":()V
         4: aload_0
         5: putstatic     #29                 // Field MODULE$:L$line3/$read$$iw$$iw$X$;
         8: aload_0
         9: bipush        42
        11: putfield      #25                 // Field x:I
        14: return
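
In Java terms, the bytecode above corresponds roughly to the following hand-written approximation (not decompiler output; the 2.13 shape is inferred from the class-initializer difference noted above):

// Approximate Java rendering of the Scala 2.12 object X from the javap
// output above: MODULE$ is a plain static field, assigned inside the
// instance constructor, which is invoked from the static initializer.
public final class X$ {
    public static X$ MODULE$;          // not final in 2.12
    private final int x;

    static {
        new X$();                      // runs under the class-init lock
    }

    public X$() {
        MODULE$ = this;                // published before x is assigned
        this.x = 42;
    }
}

// In 2.13, by contrast, the compiled shape is roughly:
//
//     public static final X$ MODULE$;
//     static { MODULE$ = new X$(); }  // assigned in <clinit>, field is final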

@som-snytt

Note that the companion for List is List$.

@xiongbo-sjtu
Author

som-snytt, thanks for confirming the different behavior between Scala 2.12 and 2.13.

Note that the companion for List is List$.

Also, thanks for your input! I'm able to confirm that the MODULE$ fields are static but not static final:

$ javap -p -classpath /usr/lib/spark/jars/scala-library-2.12.17.jar scala.math.package$ | grep -i MODULE
  public static scala.math.package$ MODULE$;

$ javap -p -classpath /usr/lib/spark/jars/scala-library-2.12.17.jar scala.collection.immutable.List$ | grep -i MODULE
  public static scala.collection.immutable.List$ MODULE$;

Given that MODULE$ is assigned in the instance initializer in 2.12 (and is therefore not final), this might explain how the JIT could end up observing the field as null even after it had been non-null.
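
To make the concern concrete, here is a minimal, hypothetical Java analogue of the 2.12 shape (a sketch only, not a reproducer):

// Mirrors the javap output earlier in the thread: MODULE$ is a plain
// (non-final) static, so it gets none of the JMM's final-field freeze
// semantics. A JIT bug around such fields could surface as a null read,
// which a 'static final' assigned in <clinit> would help rule out.
final class Holder$ {
    static Holder$ MODULE$;               // non-final, as scalac 2.12 emits it
    static { new Holder$(); }
    private Holder$() { MODULE$ = this; }
}

final class Caller {
    static int use() {
        // Reading Holder$.MODULE$ triggers class initialization, which on a
        // correct JVM guarantees the field is assigned before this read
        // completes; the NPEs above suggest that guarantee was violated.
        return Holder$.MODULE$.hashCode();
    }
}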

@xiongbo-sjtu xiongbo-sjtu changed the title NullPointerException thrown by "$.MODULE$" fields in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) NullPointerException thrown by methods of a "$.MODULE$" field in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) Aug 8, 2024
@xiongbo-sjtu xiongbo-sjtu changed the title NullPointerException thrown by methods of a "$.MODULE$" field in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) NullPointerException thrown by methods of "$.MODULE$" in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) Aug 8, 2024
@SethTisue
Member

SethTisue commented Aug 8, 2024

Closing, since the problem presumably doesn't exist on 2.13 and we don't keep 2.12-only tickets open. But discussion can continue.

@SethTisue SethTisue closed this as not planned Aug 8, 2024
@SethTisue SethTisue changed the title NullPointerException thrown by methods of "$.MODULE$" in scala-library (Scala version 2.12, OpenJDK 64-Bit Server VM, Java 17.0.12) NullPointerException on JDK 17 involving MODULE$ in scala-library 2.12 Aug 8, 2024
@xiongbo-sjtu
Author

For further discussion, please follow the JDK bug report below:

https://bugs.openjdk.org/browse/JDK-8338379
