19/01/15 11:55:42 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on gha-data-hdp-dn2085.gh.sankuai.com:58151 (size: 35.6 KB, free: 366.3 MB)
19/01/15 11:55:42 INFO BlockManagerMasterEndpoint: add broadcast_0_piece0 location: BlockManagerId(1, gha-data-hdp-dn2085.gh.sankuai.com, 58151, None)
19/01/15 11:55:43 INFO HeartbeatReceiver: Checking executors providing.
19/01/15 11:55:46 INFO YarnSchedulerBackend$YarnDriverEndpoint:
requestedTotalExecutors = 0
numExistingExecutors = 2
numPendingExecutors = 0
executorsPendingToRemove = 0
executorsPendingLossReason = 0
executorsActive = 2
19/01/15 11:55:47 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, gha-data-hdp-dn2085.gh.sankuai.com, executor 1): java.lang.ExceptionInInitializerError
    at org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$$anon$1.write(GeoMesaSparkKryoRegistrator.scala:36)
    at org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$$anon$1.write(GeoMesaSparkKryoRegistrator.scala:32)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:628)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:366)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:307)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:628)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:393)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.spark.SparkException: Exception thrown in awaitResult:
    at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
    at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
    at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
    at org.apache.spark.util.RpcUtils$.makeDriverRef(RpcUtils.scala:32)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$.org$apache$spark$geomesa$GeoMesaSparkKryoRegistratorEndpoint$$EndpointRef$lzycompute(GeoMesaSparkKryoRegistratorEndpoint.scala:32)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$.org$apache$spark$geomesa$GeoMesaSparkKryoRegistratorEndpoint$$EndpointRef(GeoMesaSparkKryoRegistratorEndpoint.scala:32)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$KryoMessage$class.ask(GeoMesaSparkKryoRegistratorEndpoint.scala:76)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$KryoGetTypesMessage.ask(GeoMesaSparkKryoRegistratorEndpoint.scala:90)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$ExecutorKryoClient$.getTypes(GeoMesaSparkKryoRegistratorEndpoint.scala:108)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$$anonfun$init$1.apply(GeoMesaSparkKryoRegistratorEndpoint.scala:57)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$$anonfun$init$1.apply(GeoMesaSparkKryoRegistratorEndpoint.scala:44)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.geomesa.GeoMesaSparkKryoRegistratorEndpoint$.init(GeoMesaSparkKryoRegistratorEndpoint.scala:43)
    at org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$.<init>(GeoMesaSparkKryoRegistrator.scala:65)
    at org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$.<clinit>(GeoMesaSparkKryoRegistrator.scala)
    ... 11 more
Caused by: org.apache.spark.rpc.: Cannot find endpoint: spark://kryo-schema@`someip`:45441
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$asyncSetupEndpointRefByURI$1.apply(NettyRpcEnv.scala:141)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$asyncSetupEndpointRefByURI$1.apply(NettyRpcEnv.scala:137)
    at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:253)
    at scala.concurrent.Future$$anonfun$flatMap$1.apply(Future.scala:251)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at org.spark_project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at scala.concurrent.Promise$class.complete(Promise.scala:55)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
    at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:326)
    at scala.concurrent.Future$$anonfun$recover$1.apply(Future.scala:326)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at org.spark_project.guava.util.concurrent.MoreExecutors$SameThreadExecutorService.execute(MoreExecutors.java:293)
    at scala.concurrent.impl.ExecutionContextImpl$$anon$1.execute(ExecutionContextImpl.scala:136)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at scala.concurrent.Promise$class.complete(Promise.scala:55)
    at scala.concurrent.impl.Promise$DefaultPromise.complete(Promise.scala:153)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
    at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:63)
    at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:78)
    at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
    at scala.concurrent.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:55)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at scala.concurrent.BatchingExecutor$Batch.run(BatchingExecutor.scala:54)
    at scala.concurrent.Future$InternalCallbackExecutor$.unbatchedExecute(Future.scala:601)
    at scala.concurrent.BatchingExecutor$class.execute(BatchingExecutor.scala:106)
    at scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:599)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at scala.concurrent.Promise$class.trySuccess(Promise.scala:94)
    at scala.concurrent.impl.Promise$DefaultPromise.trySuccess(Promise.scala:153)
    at org.apache.spark.rpc.netty.NettyRpcEnv.org$apache$spark$rpc$netty$NettyRpcEnv$$onSuccess$1(NettyRpcEnv.scala:216)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$3.apply(NettyRpcEnv.scala:232)
    at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$3.apply(NettyRpcEnv.scala:232)
    at org.apache.spark.rpc.netty.RpcOutboxMessage.onSuccess(Outbox.scala:82)
    at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:194)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
    at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    ... 1 more
19/01/15 11:55:47 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 1, gha-data-hdp-dn2085.gh.sankuai.com, executor 1, partition 0, RACK_LOCAL, 5131 bytes)
19/01/15 11:55:47 WARN TaskSetManager: Lost task 0.1 in stage 0.0 (TID 1, gha-data-hdp-dn2085.gh.sankuai.com, executor 1): java.lang.NoClassDefFoundError: Could not initialize class org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$
    at org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$$anon$1.write(GeoMesaSparkKryoRegistrator.scala:36)
    at org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$$anon$1.write(GeoMesaSparkKryoRegistrator.scala:32)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:628)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:366)
    at com.esotericsoftware.kryo.serializers.DefaultArraySerializers$ObjectArraySerializer.write(DefaultArraySerializers.java:307)
    at com.esotericsoftware.kryo.Kryo.writeClassAndObject(Kryo.java:628)
    at org.apache.spark.serializer.KryoSerializerInstance.serialize(KryoSerializer.scala:315)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:393)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
19/01/15 11:55:47 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 2, gha-data-hdp-dn2085.gh.sankuai.com, executor 2, partition 0, RACK_LOCAL, 5131 bytes)
19/01/15 11:55:47 INFO BlockManagerInfo: Added broadcast_1_piece0 in memory on gha-data-hdp-dn2085.gh.sankuai.com:49216 (size: 8.7 KB, free: 366.3 MB)
19/01/15 11:55:47 INFO BlockManagerMasterEndpoint: add broadcast_1_piece0 location: BlockManagerId(2, gha-data-hdp-dn2085.gh.sankuai.com, 49216, None)
19/01/15 11:55:48 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on gha-data-hdp-dn2085.gh.sankuai.com:49216 (size: 35.6 KB, free: 366.3 MB)
19/01/15 11:55:48 INFO BlockManagerMasterEndpoint: add broadcast_0_piece0 location: BlockManagerId(2, gha-data-hdp-dn2085.gh.sankuai.com, 49216, None)
19/01/15 11:55:51 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 2) on gha-data-hdp-dn2085.gh.sankuai.com, executor 2: java.lang.ExceptionInInitializerError (null) [duplicate 1]
19/01/15 11:55:51 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 3, gha-data-hdp-dn2085.gh.sankuai.com, executor 1, partition 0, RACK_LOCAL, 5131 bytes)
19/01/15 11:55:51 INFO TaskSetManager: Lost task 0.3 in stage 0.0 (TID 3) on gha-data-hdp-dn2085.gh.sankuai.com, executor 1: java.lang.NoClassDefFoundError (Could not initialize class org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator$) [duplicate 1]
I am getting the exceptions above. The root cause appears to be that the executor cannot find the driver's RPC endpoint `spark://kryo-schema@`someip`:45441` while initializing `GeoMesaSparkKryoRegistrator`, after which every retry fails with `NoClassDefFoundError` for that class. How can I resolve it?
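For context, my understanding is that the registrator named in the stack trace is normally wired up through Spark's Kryo configuration at submit time, roughly like the sketch below. The jar name and application class are hypothetical placeholders; only the two `--conf` keys and the registrator class come from standard Spark/GeoMesa usage.

```shell
# Sketch of a spark-submit invocation with the GeoMesa Kryo registrator
# configured on the driver and executors. Placeholder names:
# com.example.MyGeoMesaJob and my-geomesa-job.jar are hypothetical.
spark-submit \
  --master yarn \
  --conf spark.serializer=org.apache.spark.serializer.KryoSerializer \
  --conf spark.kryo.registrator=org.locationtech.geomesa.spark.GeoMesaSparkKryoRegistrator \
  --class com.example.MyGeoMesaJob \
  my-geomesa-job.jar
```

If this is not how the registrator should be configured for my setup, or if the `kryo-schema` endpoint needs the driver to initialize GeoMesa classes first, I would appreciate a pointer.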